Loss turns to NaN at the first training step in my TensorFlow CNN
1. Network:
3 hidden layers (2 convolutional layers + 1 fully connected hidden layer) + readout layer.
2. The 3 hidden layers:
a) Weights:
w = tf.Variable(tf.truncated_normal(wt, stddev=0.1), name='weights')
b) Bias:
b = tf.Variable(tf.fill([w.get_shape().as_list()[-1]], 0.9), name='biases')
c) Activation:
ReLU
d) Dropout:
0.6
**Loss still turns to NaN with dropout set to 0.0.
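For reference, the hidden-layer setup described above can be sketched in NumPy roughly as follows (shapes and inputs here are hypothetical, not from the question; `tf.truncated_normal` is mimicked by redrawing samples outside two standard deviations):

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_normal(shape, stddev=0.1):
    # Redraw any sample beyond 2*stddev, mimicking tf.truncated_normal.
    x = rng.normal(0.0, stddev, size=shape)
    while True:
        mask = np.abs(x) > 2 * stddev
        if not mask.any():
            return x
        x[mask] = rng.normal(0.0, stddev, size=mask.sum())

# Hypothetical layer sizes: 4 inputs -> 3 hidden units.
w = truncated_normal((4, 3))
b = np.full(3, 0.9)  # constant bias init, as in the question

def hidden_layer(x, keep_prob=0.6):
    h = np.maximum(x @ w + b, 0.0)             # ReLU activation
    keep = rng.random(h.shape) < keep_prob     # dropout mask
    return np.where(keep, h / keep_prob, 0.0)  # inverted dropout scaling

out = hidden_layer(rng.normal(size=(2, 4)))
print(out.shape)  # (2, 3)
```

Nothing in this layer by itself produces NaN: ReLU and dropout keep activations finite and non-negative, which points the search toward the loss function instead.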
3. Readout layer:
softmax
4. Loss function:
tf.reduce_mean(-tf.reduce_sum(_labels * tf.log(_logits), reduction_indices=[1]))
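This hand-rolled cross-entropy is the likely NaN source: if any softmax output underflows to exactly 0, `log(0)` is `-inf`, and a `0 * (-inf)` term poisons the sum with NaN. A minimal NumPy sketch (hypothetical values, not from the question) demonstrates the failure mode:

```python
import numpy as np

probs = np.array([1.0, 0.0])   # softmax output with one class at exactly 0
labels = np.array([1.0, 0.0])  # one-hot label for the first class

with np.errstate(divide='ignore', invalid='ignore'):
    # 1*log(1) = 0, but 0*log(0) = 0*(-inf) = nan, so the sum is nan.
    ce = -np.sum(labels * np.log(probs))

print(ce)  # nan
```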
5. Optimizer:
tf.train.AdamOptimizer
learning_rate: 0.0005
**Loss still turns to NaN even if learning_rate = 0.
Since we don't have your entire source code, it's hard to see the problem. However, you may try using 'tf.nn.softmax_cross_entropy_with_logits' in your cost function, passing the unnormalized logits (not the softmax output). For example:
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=predictions, labels=labels))
You can find an entire code example using 'tf.nn.softmax_cross_entropy_with_logits' at https://github.com/nlintz/tensorflow-tutorials.
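For intuition on why this fixes the NaN, here is a rough NumPy sketch of the log-sum-exp trick that a fused softmax-cross-entropy op uses internally: it computes log-softmax directly from the logits, so no probability ever hits exactly 0 before the log. (This is an illustrative re-implementation, not TensorFlow's actual code.)

```python
import numpy as np

def stable_cross_entropy(logits, labels):
    # Subtract the max logit so exp() cannot overflow.
    z = logits - logits.max(axis=-1, keepdims=True)
    # log-softmax via log-sum-exp, computed without an explicit softmax.
    log_softmax = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    return -(labels * log_softmax).sum(axis=-1)

logits = np.array([[1000.0, 0.0]])  # extreme logits that break a naive softmax+log
labels = np.array([[1.0, 0.0]])     # one-hot label

loss = stable_cross_entropy(logits, labels)
print(loss)  # finite (~0), no NaN even for extreme logits
```

By contrast, feeding these logits through an explicit softmax followed by `tf.log`, as in the question's loss, would produce a 0 probability and hence NaN.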