model_cifar.py source code


Project: deep_separation_contraction · Author: edouardoyallon
def net(x, n_layer_per_block, n_classes, phase_train, alpha, number_channel, scope='deep_net'):
  with tf.variable_scope(scope):
    # The same channel width is used in every stage of the network.
    n1 = number_channel
    n2 = number_channel
    n3 = number_channel
    n4 = number_channel

    # Initial 3x3 convolution followed by batch norm and ReLU.
    y = conv2d(x, 3, n1, 3, 1, 'SAME', False, phase_train, scope='conv_init')
    y = batch_norm(y, n1, phase_train, scope='bn_init')
    y = tf.nn.relu(y, name='relu_init')

    # Three groups of n_layer_per_block blocks each; the boolean flag
    # differs between group_1 and the later groups.
    y = group(y, n1, n2, n_layer_per_block, False, alpha, phase_train, scope='group_1')
    y = group(y, n2, n3, n_layer_per_block, True, alpha, phase_train, scope='group_2')
    y = group(y, n3, n4, n_layer_per_block, True, alpha, phase_train, scope='group_3')

    # Global 8x8 average pooling, then drop the singleton spatial dimensions.
    y = tf.nn.avg_pool(y, [1, 8, 8, 1], [1, 1, 1, 1], 'VALID', name='avg_pool')
    y = tf.squeeze(y, axis=[1, 2])

    # Final fully connected classification layer.
    w = tf.get_variable('DW', [n4, n_classes],
                        initializer=tf.uniform_unit_scaling_initializer(factor=1.0))
    tf.add_to_collection('weights', w)
    bias = tf.get_variable('bias', [n_classes], initializer=tf.constant_initializer(0.0))
    y = tf.nn.xw_plus_b(y, w, bias)
  return y
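The fixed `[1, 8, 8, 1]` average-pool window lines up with the spatial arithmetic of the network. A minimal sketch of that arithmetic, assuming CIFAR's 32x32 inputs and that the `True` flag passed to `group_2` and `group_3` enables stride-2 downsampling (both assumptions; `group` is defined elsewhere in the file):

```python
# Sketch of the shape arithmetic behind the 8x8 average pool.
# Assumptions (not confirmed by this snippet): 32x32 CIFAR inputs, and the
# boolean group flag halves the spatial resolution when True.

def spatial_size(input_size, downsample_flags):
    """Halve the spatial resolution once for each group that downsamples."""
    size = input_size
    for downsamples in downsample_flags:
        if downsamples:
            size //= 2
    return size

# group_1 keeps the resolution; group_2 and group_3 each halve it:
# 32 -> 32 -> 16 -> 8, matching the [1, 8, 8, 1] pooling window.
print(spatial_size(32, [False, True, True]))  # 8
```

With an 8x8 feature map, the `'VALID'` 8x8 pool produces a 1x1 spatial output, which is why `tf.squeeze` on axes 1 and 2 yields a `[batch, n4]` tensor ready for the final fully connected layer.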