[Insight-developers] Neural Networks -- Ordering of Initialize functions?

kent williams norman-k-williams at uiowa.edu
Mon Aug 27 13:33:14 EDT 2007


In order to use an itk::OneHiddenLayerBackPropagationNeuralNetwork, you have
to explicitly initialize it.

For example (from NNetClassifierTest1.cxx):

OneHiddenLayerBackPropagationNeuralNetworkType::Pointer
OneHiddenLayerNet = OneHiddenLayerBackPropagationNeuralNetworkType::New();

OneHiddenLayerNet->SetNumOfInputNodes(num_input_nodes);
OneHiddenLayerNet->SetNumOfFirstHiddenNodes(num_hidden_nodes);
OneHiddenLayerNet->SetNumOfOutputNodes(num_output_nodes);

OneHiddenLayerNet->Initialize();
OneHiddenLayerNet->InitializeWeights();
OneHiddenLayerNet->SetLearningRate(0.001);

The problem I encountered today was that the ORDER of Initialize() and
InitializeWeights() was transposed, and the result was a neural network
that divides by zero during training.

Is the crucial order of initialization documented anywhere?  If it IS
crucial, why is the user allowed to do things in the wrong order?  Why
doesn't Initialize() initialize the weights itself?

By the way, if you turn on numeric exceptions you also get exceptions
elsewhere during training, in the Sigmoid function.

To catch the exceptions, I resorted to this bit of voodoo I found via a
Google search:

#include <xmmintrin.h>

// Unmask overflow, invalid-operation, and divide-by-zero in the SSE
// MXCSR register so they raise SIGFPE instead of silently producing
// Inf/NaN.  x86/SSE-specific, not portable.
void EnableNumericExceptions()
{
  _mm_setcsr( _MM_MASK_MASK &
              ~(_MM_MASK_OVERFLOW | _MM_MASK_INVALID | _MM_MASK_DIV_ZERO) );
}

Is there any portable way to turn on numeric exceptions?  Should ITK be
tested with numeric exceptions turned on?
