This is a known bug. The correct way to use batch_normalization in TensorFlow looks like the following snippet:

with tf.name_scope("train"):
    optimizer = tf.train.GradientDescentOptimizer(learning_rate)
    # batch_normalization registers its moving-average update ops in the
    # UPDATE_OPS collection; they must run together with the training op,
    # otherwise the moving mean/variance are never updated.
    extra_update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    with tf.control_dependencies(extra_update_ops):
        training_op = optimizer.minimize(loss)

The instructor used the wrong API by mistake; the code will be updated shortly.