def activation(self, features, scope=None):
    """PReLU activation: max(0, x) + alpha * min(0, x), with a learned alpha."""
    with tf.variable_scope(scope, 'PReLU', initializer=self.initializer):
        # One alpha per feature dimension (all dims except the batch axis).
        alpha = tf.get_variable('alpha', features.get_shape().as_list()[1:])
        pos = tf.nn.relu(features)
        # (x - |x|) / 2 equals min(x, 0), so this scales only the negative part.
        neg = alpha * (features - tf.abs(features)) * 0.5
        return pos + neg
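The `neg` term relies on the identity `(x - |x|) / 2 == min(x, 0)`. A minimal NumPy sketch of the same PReLU formula (the function name `prelu` and the fixed `alpha` are illustrative, not from the original class):

```python
import numpy as np

def prelu(x, alpha):
    # Positive part passes through unchanged.
    pos = np.maximum(x, 0.0)
    # (x - |x|) / 2 is min(x, 0); scaling it by alpha gives the PReLU slope
    # on the negative side.
    neg = alpha * (x - np.abs(x)) * 0.5
    return pos + neg

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x, alpha=0.25))  # negative inputs scaled by 0.25
```

With `alpha=0.25`, the negative inputs `-2.0` and `-0.5` come out as `-0.5` and `-0.125`, while non-negative inputs are unchanged.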
Source code from a3_entity_network.py