This header provides functionality for sample-by-sample stochastic gradient descent and gradient computation with a neural net.
Classes
  class NnetUpdater
Namespaces
  kaldi
  kaldi::nnet2
Functions

void FormatNnetInput (const Nnet &nnet, const std::vector< NnetExample > &data, Matrix< BaseFloat > *mat)
    Takes the input to the nnet for a minibatch of examples and formats it as a single matrix.

double DoBackprop (const Nnet &nnet, const std::vector< NnetExample > &examples, Nnet *nnet_to_update, double *tot_accuracy=NULL)
    This function computes the objective function and either updates the model or adds to the parameter gradients (see the usage sketch after this list).

double DoBackprop (const Nnet &nnet, const std::vector< NnetExample > &examples, Matrix< BaseFloat > *examples_formatted, Nnet *nnet_to_update, double *tot_accuracy=NULL)
    This version of DoBackprop lets you call FormatNnetInput separately and provide its result to DoBackprop; this can be useful when using GPUs, because the call to FormatNnetInput can run in a different thread from the one that uses the GPU.

BaseFloat TotalNnetTrainingWeight (const std::vector< NnetExample > &egs)
    Returns the total weight summed over all the examples...

double ComputeNnetObjf (const Nnet &nnet, const std::vector< NnetExample > &examples, double *tot_accuracy=NULL)
    Computes the objective function over a minibatch.

double ComputeNnetObjf (const Nnet &nnet, const std::vector< NnetExample > &examples, int32 minibatch_size, double *tot_accuracy=NULL)
    This version of ComputeNnetObjf breaks the examples up into multiple minibatches to do the computation.

double ComputeNnetGradient (const Nnet &nnet, const std::vector< NnetExample > &examples, int32 batch_size, Nnet *gradient)
    ComputeNnetGradient is mostly used to compute gradients on validation sets; it divides the examples into batches and calls DoBackprop() on each (see the second sketch at the end of this page).
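The following is a minimal sketch, not taken from the header itself, of how the training-path functions above are typically combined for one SGD step on a minibatch. The include paths, the helper-function names (TrainOneMinibatch, TrainOneMinibatchPreformatted), and the convention that passing the model as its own nnet_to_update performs an in-place update are assumptions based on the signatures listed above, not guarantees from this page.

// Sketch of a single SGD step on one minibatch of NnetExamples.
#include <vector>
#include "matrix/kaldi-matrix.h"  // assumed include path for Matrix<BaseFloat>
#include "nnet2/nnet-nnet.h"      // assumed include path for class Nnet
#include "nnet2/nnet-example.h"   // assumed include path for NnetExample
#include "nnet2/nnet-update.h"

namespace kaldi {
namespace nnet2 {

// Hypothetical helper: runs backprop on one minibatch and returns the
// objective divided by the total training weight (a per-frame average).
double TrainOneMinibatch(const std::vector<NnetExample> &minibatch,
                         Nnet *nnet) {
  double tot_accuracy = 0.0;
  // Assumption: passing the model itself as nnet_to_update makes DoBackprop
  // update it (SGD); passing a separate, zeroed Nnet would instead accumulate
  // a parameter gradient into that object.
  double tot_objf = DoBackprop(*nnet, minibatch, nnet, &tot_accuracy);
  // TotalNnetTrainingWeight() sums the per-example weights, which serves as
  // the denominator when reporting a per-frame objective.
  BaseFloat tot_weight = TotalNnetTrainingWeight(minibatch);
  return tot_objf / tot_weight;
}

// Variant using the two-stage interface: FormatNnetInput() could run in a
// separate thread, and its output is handed to the overload of DoBackprop()
// that accepts the pre-formatted matrix.
double TrainOneMinibatchPreformatted(const std::vector<NnetExample> &minibatch,
                                     Nnet *nnet) {
  Matrix<BaseFloat> formatted;
  FormatNnetInput(*nnet, minibatch, &formatted);
  double tot_accuracy = 0.0;
  double tot_objf = DoBackprop(*nnet, minibatch, &formatted, nnet,
                               &tot_accuracy);
  return tot_objf / TotalNnetTrainingWeight(minibatch);
}

}  // namespace nnet2
}  // namespace kaldi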
This header provides functionality for sample-by-sample stochastic gradient descent and gradient computation with a neural net.
See also nnet-compute.h, which provides the same functionality but for whole utterances.
Definition in file nnet-update.h.
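Below is a second minimal sketch covering the diagnostic path: computing the objective over a held-out set and accumulating a validation-set gradient without touching the model being trained. The include paths, the helper name ValidationDiagnostics, and the setup of the gradient Nnet (copying the model and zeroing it, e.g. with Nnet::SetZero(true)) are assumptions, not something this page states.

// Sketch of validation-set diagnostics using ComputeNnetObjf and
// ComputeNnetGradient.
#include <vector>
#include "nnet2/nnet-nnet.h"      // assumed include path for class Nnet
#include "nnet2/nnet-example.h"   // assumed include path for NnetExample
#include "nnet2/nnet-update.h"

namespace kaldi {
namespace nnet2 {

// Hypothetical helper: returns the per-frame validation objective and fills
// *gradient with the gradient of the objective on the validation examples.
double ValidationDiagnostics(const Nnet &nnet,
                             const std::vector<NnetExample> &valid_egs,
                             int32 minibatch_size,
                             Nnet *gradient) {
  // Objective (and optionally accuracy) over the whole validation set,
  // processed internally in minibatches of the given size.
  double tot_accuracy = 0.0;
  double tot_objf = ComputeNnetObjf(nnet, valid_egs, minibatch_size,
                                    &tot_accuracy);

  // ComputeNnetGradient() divides the examples into batches and calls
  // DoBackprop() on each, accumulating into *gradient.  Assumption: *gradient
  // is a copy of the model that has been zeroed and marked as a gradient
  // before this call.
  ComputeNnetGradient(nnet, valid_egs, minibatch_size, gradient);

  // Per-frame average, using the summed example weights as the denominator.
  return tot_objf / TotalNnetTrainingWeight(valid_egs);
}

}  // namespace nnet2
}  // namespace kaldi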