This page describes the various unit tests, some of which are more time-consuming than others. For this reason, only the basic and booleans suites are performed during the Travis build process. The test suites and tests are:
- basic : suite for underlying functionality tests
  - example : test that ExampleSet can construct and retrieve example data.
  - alt : test that the alternate() function works.
  - altex : test ExampleSet::ALTERNATE shuffling on examples.
  - stride : test ExampleSet::STRIDE shuffling.
  - altex4 : test ExampleSet::ALTERNATE with 4 modulator levels.
  - testmse : test the mean squared error summed over the outputs of a zero-parameter net.
  - loadmnist : test that the MNIST data sets can be loaded, and confirm that the MSE is low once training completes. This test is described in this section.
- basictrain : test training of backprop nets
  - trainparams : test that we can train the identity function.
  - trainparams2 : as trainparams, but with more examples and no cross-validation; it aims to be identical to an existing program written using Angort.
  - addition : train a plain backprop network to perform addition.
  - additionmod : train a UESMANN network to perform addition and scaled addition: at h=0 the generated function will be y = a + b, while at h=1 it becomes y = 0.3(a + b).
  - trainmnist : train a plain backpropagation network to recognise MNIST digits using a low number of iterations; we aim for a success rate of at least 85%.
- booleans : test training of a boolean modulatory pairing (XOR/AND) in all 3 modulatory network types: the network should modulate from XOR to AND as the modulator moves from 0 to 1 (a sketch of how such a pairing is constructed follows this list).
  - obxorand : output blending
  - hinxorand : h-as-input
  - uesmann : UESMANN
- saveload : test that saving and loading the different network types leaves the parameters of the network unchanged. This is done by training a network on a single silly example, so it essentially has random parameters, then saving, then loading into a new network and comparing.
  - saveloadplain : plain backprop
  - saveloadob : output blending
  - saveloadhin : h-as-input
  - saveloadues : UESMANN
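As a taste of how a modulatory pairing like XOR/AND is set up, here is a minimal sketch using the ExampleSet calls shown later on this page. It is not taken from the test source, and the ExampleSet constructor arguments (example count, inputs, outputs, modulator levels) are assumptions:

```cpp
// Sketch of an XOR (h=0) / AND (h=1) example set; the constructor
// arguments are assumptions, not taken from the test source.
ExampleSet e(8,2,1,2);
int xorOut[4] = {0,1,1,0}, andOut[4] = {0,0,0,1};
for(int i=0;i<4;i++){
    double *ins = e.getInputs(i);       // XOR examples at modulator 0
    ins[0] = i&1; ins[1] = (i>>1)&1;
    *e.getOutputs(i) = xorOut[i];
    e.setH(i,0);
    ins = e.getInputs(i+4);             // AND examples on the same inputs, modulator 1
    ins[0] = i&1; ins[1] = (i>>1)&1;
    *e.getOutputs(i+4) = andOut[i];
    e.setH(i+4,1);
}
```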
Example code
Some of the tests are described below in more detail with commented source code. These should give you some idea of how to use the system.
basictrain/addition
This test constructs an ExampleSet consisting of 1000 examples of pairs of random numbers as input with their sums as output. It then builds a BPNet - a plain multilayer perceptron, trainable by backpropagation with no modulation. This is done by calling NetFactory::makeNet() with the NetType::PLAIN argument.
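The construction itself is elided from the excerpt below; a minimal sketch, in which the ExampleSet constructor arguments and the hidden node count are assumptions rather than values taken from the test, might look like:

```cpp
// Sketch only: the constructor arguments and hidden node count are assumptions.
ExampleSet e(1000,2,1,1);  // 1000 examples, 2 inputs, 1 output, 1 modulator level
Net *net = NetFactory::makeNet(NetType::PLAIN,e,2);  // plain backprop net
```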
A Net::SGDParams structure is set up with suitable training parameters for stochastic gradient descent, and the network is trained using this structure. The result is the mean squared error over all outputs:

$$\mathrm{MSE} = \frac{1}{NM}\sum_{e=1}^{N}\sum_{o=1}^{M}\left(y_{eo}-d_{eo}\right)^2$$

where $N$ is the number of examples, $M$ is the number of outputs, $y_{eo}$ is the network's output $o$ for example $e$, and $d_{eo}$ is the desired output for the same example.
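To make the formula concrete, here is a small hypothetical helper (not part of the library) that computes the same quantity over flat, example-major arrays of actual and desired outputs:

```cpp
// Hypothetical helper illustrating the MSE formula above; not part of the library.
// y and d each hold N*M values, laid out example-major.
double meanSquaredError(const double *y,const double *d,int N,int M){
    double sum = 0;
    for(int e=0;e<N;e++){
        for(int o=0;o<M;o++){
            double diff = y[e*M+o] - d[e*M+o];
            sum += diff*diff;
        }
    }
    return sum/(double)(N*M);
}
```

The data-generation part of the test follows: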
drand48_data rd;
srand48_r(10,&rd);                  // seeded reentrant RNG, so the test is repeatable
for(int i=0;i<1000;i++){
    double *ins = e.getInputs(i);   // pointers into this example's input and output data
    double *out = e.getOutputs(i);
    double a,b;
    drand48_r(&rd,&a);a*=0.5;       // two random numbers in [0,0.5)
    drand48_r(&rd,&b);b*=0.5;
    ins[0] = a;
    ins[1] = b;
    *out = a+b;                     // the desired output is their sum
}
params.crossValidation(e,       // example set from which cross-validation examples are drawn
        0.5,                    // cross-validation settings (see the Net::SGDParams documentation)
        1000,
        10,
        false)
    .storeBest()                // keep the best network found during validation
    .setSeed(0);                // fixed seed, so the test is repeatable
// ... the training call producing "mse" is elided here ...
printf("%f\n",mse);
BOOST_REQUIRE(mse<0.03);        // training must reach a low mean squared error
// run a grid of input pairs through the trained network and
// require each output to be close to the true sum
for(double a=0;a<0.5;a+=0.02){
    for(double b=0;b<0.5;b+=0.02){
        double runIns[2];
        runIns[0]=a;
        runIns[1]=b;
        double out = *(net->run(runIns));   // run() returns a pointer to the outputs
        double diff = fabs(out-(a+b));
        BOOST_REQUIRE(diff<0.05);
    }
}
delete net;
}
basictrain/additionmod
This test is similar to basictrain/addition, but builds an ExampleSet consisting of 2000 examples. Even-numbered examples have y = a + b, while odd-numbered examples have y = 0.3(a + b). The modulator on each example is set to 0 for even and 1 for odd, so the examples should train a modulated network to transition from the former function to the latter as the modulator goes from 0 to 1.
drand48_data rd;
srand48_r(10,&rd);
int idx=0;
for(int i=0;i<1000;i++){                // each iteration creates a pair of examples
    double a,b;
    drand48_r(&rd,&a);a*=0.5;
    drand48_r(&rd,&b);b*=0.5;
    // even-numbered example: the plain sum, with the modulator at 0
    double *ins = e.getInputs(idx);
    double *out = e.getOutputs(idx);
    ins[0] = a;
    ins[1] = b;
    *out = a+b;
    e.setH(idx,0);
    idx++;
    // odd-numbered example: the scaled sum of the same pair, with the modulator at 1
    ins = e.getInputs(idx);
    out = e.getOutputs(idx);
    ins[0] = a;
    ins[1] = b;
    *out = (a+b)*0.3;
    e.setH(idx,1);
    idx++;
}
params.crossValidation(e,       // as in the addition test, but note the final argument
        0.5,
        1000,
        10,
        true)
    .storeBest()
    .setSeed(0);
// ... the training call producing "mse" is elided here ...
printf("%f\n",mse);
BOOST_REQUIRE(mse<0.03);
for(double a=0.1;a<0.4;a+=0.02){
    for(double b=0.1;b<0.4;b+=0.02){
        double runIns[2];
        runIns[0]=a;
        runIns[1]=b;
        // with the modulator at 0 the network should compute a+b
        // (the calls setting the modulator are elided in the original excerpt;
        // setH() is assumed here)
        net->setH(0);
        double out = *(net->run(runIns));
        double diff = fabs(out-(a+b));
        printf("%f+%f=%f (%f)\n",a,b,out,diff);
        BOOST_REQUIRE(diff<0.07);
        // with the modulator at 1 it should compute 0.3(a+b)
        net->setH(1);
        out = *(net->run(runIns));
        diff = fabs(out-(a+b)*0.3);
        printf("%f+%f=%f (%f)\n",a,b,out,diff);
        BOOST_REQUIRE(diff<0.07);
    }
}
delete net;
}
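Once such a network is trained, the blend between the two functions can be inspected by sweeping the modulator. A minimal sketch, again assuming the setH() call used above:

```cpp
// Sweep the modulator to watch the output move from a+b towards 0.3(a+b).
// Assumes the same trained net and the setH()/run() calls used above.
double ins[2] = {0.2,0.3};
for(double h=0;h<=1.0;h+=0.25){
    net->setH(h);
    printf("h=%.2f -> %f\n",h,*(net->run(ins)));
}
```

Note that the network is only trained at h=0 and h=1; the behaviour at intermediate values is whatever the network interpolates between the two functions.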
basictrain/trainmnist
This test loads the MNIST handwritten digits dataset and trains an ordinary unmodulated network to recognise them. It makes use of the MNIST class and the special MNIST constructor for ExampleSet, and runs another test set through the network to see how well it performs.
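The excerpt below elides the conversion of the loaded data into example sets; assuming the MNIST constructor for ExampleSet mentioned above takes the loaded MNIST object directly, it would look roughly like:

```cpp
// Assumed form of the special MNIST constructor for ExampleSet.
ExampleSet e(m);            // training examples: pixel values in, digit labels out
ExampleSet testSet(mtest);  // the 10k-image test set, built the same way
```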
MNIST m(
    "../testdata/train-labels-idx1-ubyte",
    "../testdata/train-images-idx3-ubyte");
params.crossValidation(e,0.5,1000,10,true)
    .storeBest()
    .setSeed(10);
// ... the training call producing "mse" is elided here ...
BOOST_REQUIRE(mse<0.03);
// load the separate test set
MNIST mtest(
    "../testdata/t10k-labels-idx1-ubyte",
    "../testdata/t10k-images-idx3-ubyte");
int correct=0;
for(int i=0;i<testSet.getCount();i++){
    double *ins = testSet.getInputs(i);
    double *o = n->run(ins);    // the run call is elided in the original excerpt
    // getHighest() is a helper in the test source returning the index of the largest value
    int correctLabel = getHighest(testSet.getOutputs(i),testSet.getOutputCount());
    int netLabel = getHighest(o,testSet.getOutputCount());
    if(correctLabel==netLabel)correct++;
}
double ratio = ((double)correct)/(double)testSet.getCount();
printf("MSE=%f, correct=%d/%d=%f\n",mse,correct,testSet.getCount(),ratio);
BOOST_REQUIRE(ratio>0.85);
delete n;
}