Why can `activation(bn)` be called directly like this? Where does this `activation` function come from — doesn't it belong to tf?
See the code:
```python
def conv_wrapper(inputs,
                 name,
                 is_training,
                 output_channel = 32,
                 kernel_size = (3, 3),
                 activation = tf.nn.relu,
                 padding = 'same'):
    """wrapper of tf.layers.conv2d"""
    # without bn: conv -> activation
    # with batch normalization: conv -> bn -> activation
    with tf.name_scope(name):
        conv2d = tf.layers.conv2d(inputs,
                                  output_channel,
                                  kernel_size,
                                  padding = padding,
                                  activation = None,
                                  name = name + '/conv2d')
        bn = tf.layers.batch_normalization(conv2d,
                                           training = is_training)
        return activation(bn)
```
The `activation` here is indeed a TensorFlow activation function; it is simply passed in from outside as a parameter. The default is `tf.nn.relu`.
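In other words, this works because Python functions are first-class values: `activation` is whatever callable the caller passed in (or the default `tf.nn.relu`), and `activation(bn)` simply calls it. A minimal sketch without TensorFlow, using hypothetical `relu` and `identity` stand-ins, shows the same pattern:

```python
def relu(x):
    # simplified stand-in for tf.nn.relu: max(0, v) elementwise
    return [max(0.0, v) for v in x]

def identity(x):
    # passing a different callable swaps out the activation
    return x

def conv_wrapper_sketch(values, activation=relu):
    # stands in for conv -> bn -> activation; here "bn" is just the input
    bn = values
    return activation(bn)

print(conv_wrapper_sketch([-1.0, 2.0]))            # default relu clips negatives
print(conv_wrapper_sketch([-1.0, 2.0], identity))  # identity leaves values as-is
```

This is exactly why `conv_wrapper` can accept `activation = tf.nn.relu` as a default argument and call it later: the parameter holds a reference to the function itself, not its result.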