Excerpts from the UESNet implementation (discontiguous fragments; elided code is marked "..."):

UESNet(int nlayers, const int *layerCounts) :
    BPNet(nlayers, layerCounts), ...

virtual void setH(double h) { ...

virtual double getH() const { ...

// output-layer error term for the logistic sigmoid, inside calcError()
errors[ol][i] = o*(1-o)*(o-out[i]);

// propagate the error terms back through the layers
for(int j=0;j<layerSizes[l];j++){
    for(int i=0;i<layerSizes[l+1];i++)
        ...

// the UESMANN modulation factor, h+1
double hfactor = modulator+1.0;
for(int k=0;k<layerSizes[i-1];k++){
    ...

// inside trainBatch(): loop over the examples in the batch
for(int nn=0;nn<num;nn++){
    int exampleIndex = nn+start;
    ...
    for(int j=0;j<layerSizes[l-1];j++)
        ...

int ol = numLayers-1;
double e = (o-outs[i]);
...
double hfactor = modulator+1.0;
double factor = 1.0/(double)num;    // average over the batch

// apply the modulated weight update
for(int j=0;j<layerSizes[l-1];j++){
    double wdelta = eta*getavggradw(l,i,j)*factor*hfactor;
    getw(l,i,j) -= wdelta;
    ...

return totalError*factor;
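The recurring detail in these fragments is the factor hfactor = modulator + 1.0 applied to every weight update. Below is a minimal standalone sketch of just that modulated update rule, assuming the average gradients have already been accumulated by back-propagation; it is not the library code, and Layer and applyBatchUpdate are illustrative names.

#include <cstddef>
#include <vector>

struct Layer {
    std::vector<double> w;        // flattened weight matrix for one layer
    std::vector<double> gradAvg;  // average gradient per weight, from back-propagation
};

// Apply one (mini-)batch update to a single layer, scaling the step by h+1.
void applyBatchUpdate(Layer &layer, double eta, double modulator, int num) {
    double hfactor = modulator + 1.0;                // UESMANN modulation factor
    double factor = 1.0 / static_cast<double>(num);  // average over the batch
    for (std::size_t i = 0; i < layer.w.size(); ++i) {
        double wdelta = eta * layer.gradAvg[i] * factor * hfactor;
        layer.w[i] -= wdelta;                        // gradient descent step
    }
}

At h=0 the factor is 1, giving the ordinary gradient step; at h=1 the step is doubled.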
virtual void setInputs(double *d)
Set the inputs to the network before running or training.
double ** gradAvgsBiases
average gradient for each bias (built during training)
int numLayers
number of layers, including input and output
double ** biases
array of biases, stored as a rectangular array of [layer][node]
void calcError(double *in, double *out)
calculate the error term for each node, given an input and the desired output (fills the errors array)
double & getavggradw(int tolayer, int toneuron, int fromneuron) const
get the value of the gradient for a given weight
The "basic" back-propagation network using a logistic sigmoid, as described by Rumelhart, Hinton and Williams (and many others). This class is used by output blending and h-as-input networks.
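To make the error term in the excerpts concrete: the logistic sigmoid y = 1/(1+e^-x) has derivative y(1-y), which is where errors[ol][i] = o*(1-o)*(o-out[i]) comes from. A minimal sketch, not the library code (sigmoid and outputDelta are illustrative names):

#include <cmath>

// Logistic sigmoid activation.
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Error (delta) term for an output node with activation o and target t:
// the sigmoid derivative is o*(1-o), so delta = o*(1-o)*(o-t).
double outputDelta(double o, double t) { return o * (1.0 - o) * (o - t); }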
UESNet(int nlayers, const int *layerCounts)
The constructor is mostly identical to the BPNet constructor.
virtual void update()
Run a single update of the network.
double * getOutputs(int example)
Get a pointer to the outputs for a given example, for reading or writing.
double & getw(int tolayer, int toneuron, int fromneuron) const
get the value of a weight.
virtual void setH(double h)
Set the modulator level for subsequent runs and training of this network.
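A hypothetical usage sketch of setH() together with setInputs() and update(); the layer counts and the comments marking where the outputs would be read back are assumptions, not taken from this listing:

const int counts[] = {2, 2, 1};   // assumed layer sizes: 2 inputs, 2 hidden, 1 output
UESNet net(3, counts);

double in[2] = {1.0, 0.0};
net.setInputs(in);                // load the inputs
net.setH(0.0);                    // run the network at modulator level 0
net.update();
// ... read the outputs for h=0 here ...
net.setH(1.0);                    // same weights, modulator level 1
net.update();
// ... read the outputs for h=1 here ...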
double ** outputs
outputs of each layer: one array of doubles for each layer
double ** errors
the error for each node, calculated by calcError()
NetType type
type of the network, used for load/save
The UESMANN network, which is itself based on the BPNet code as it has the same architecture as the p...
double getH(int example) const
Get the h (modulator) for a given example.
double ** gradAvgsWeights
average gradient for each weight (built during training)
virtual double getH() const
get the modulator level
int largestLayerSize
number of nodes in largest layer
double * getInputs(int example)
Get a pointer to the inputs for a given example, for reading or writing.
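A hypothetical sketch of writing one example through the per-example accessors listed here, assuming they belong to the example set (how the set called examples is constructed is not shown in this listing):

double *ins  = examples.getInputs(0);    // pointer to example 0's inputs, writable
double *outs = examples.getOutputs(0);   // pointer to example 0's outputs, writable
ins[0] = 1.0; ins[1] = 0.0;              // write the inputs in place
outs[0] = 1.0;                           // write the target output
double h = examples.getH(0);             // read the modulator value for example 0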
virtual double trainBatch(ExampleSet &ex, int start, int num, double eta)
Train the network on a batch (or mini-batch, or a single example), returning the error averaged over the examples used.
A set of example data. Each datum consists of a hormone (i.e. modulator) value, inputs and outputs...
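A hypothetical mini-batch training loop built on trainBatch(), assuming a network net and an example set examples filled as in the earlier sketch; exampleCount and the hyperparameters are illustrative:

const int exampleCount = 32;          // assumed number of examples in the set
const int batchSize = 8;
const double eta = 0.1;               // learning rate
for (int epoch = 0; epoch < 1000; ++epoch) {
    for (int start = 0; start + batchSize <= exampleCount; start += batchSize) {
        double err = net.trainBatch(examples, start, batchSize, eta);
        // err is the error averaged over the batch's examples
    }
}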
int * layerSizes
array of layer sizes