neural_fingerprints.py source code

python

Project: neural_fingerprints_tf    Author: fllinares
def output_embedding_layer(self, node_emb, scope):
    # Note: written against the TensorFlow 1.x API (tf.contrib.layers, tf.segment_sum).

    # Path to hyperparameters and configuration settings for the fingerprint output layers
    prefix = 'model/fingerprint_output_layers'

    with tf.variable_scope(scope, reuse=not self.is_training):
        # Compute the node-level activations with a single fully connected layer
        node_fp = tf.contrib.layers.fully_connected(inputs=node_emb,
                                                    num_outputs=self.getitem('config', 'num_outputs', prefix),
                                                    activation_fn=self.string_to_tf_act(self.getitem('config', 'activation_fn', prefix)),
                                                    weights_initializer=self.weights_initializer_fp_out,
                                                    weights_regularizer=self.weights_regularizer_fp_out,
                                                    biases_initializer=tf.constant_initializer(0.0, tf.float32),
                                                    trainable=self.getitem('config', 'trainable', prefix))

        # Apply dropout (if necessary). Alternatively, keep_prob could have been forced to 1.0
        # when is_training is False.
        if self.is_training:
            node_fp = tf.nn.dropout(node_fp, self.getitem('config', 'keep_prob', prefix))

        # Compute the graph-level activation as the sum of the node-level activations
        # for all nodes in the graph
        graph_fp = tf.segment_sum(data=node_fp, segment_ids=self.input['node_graph_map'])

    return graph_fp, node_fp
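
The final tf.segment_sum call is the graph-level readout: node fingerprints that belong to the same molecule (as indicated by node_graph_map) are summed into a single graph fingerprint. Below is a minimal standalone sketch of just that pooling step, using the TensorFlow 2.x name tf.math.segment_sum and made-up tensor values; the shapes and ids are hypothetical, and segment ids are assumed to be sorted per graph as segment_sum requires.

python
import tensorflow as tf  # TF 2.x

# Hypothetical example: 5 nodes spread over 2 graphs, 3-dimensional node fingerprints.
node_fp = tf.constant([[1., 0., 0.],
                       [0., 1., 0.],
                       [0., 0., 1.],
                       [1., 1., 0.],
                       [0., 1., 1.]])

# node_graph_map[i] is the index of the graph that node i belongs to
# (segment ids must be in non-decreasing order for segment_sum).
node_graph_map = tf.constant([0, 0, 0, 1, 1])

# Graph-level fingerprint: sum the fingerprints of all nodes in each graph.
graph_fp = tf.math.segment_sum(data=node_fp, segment_ids=node_graph_map)

print(graph_fp.numpy())
# [[1. 1. 1.]
#  [1. 2. 1.]]

Because the readout is a plain sum, the graph fingerprint has the same dimensionality as the node fingerprints and is invariant to the order in which a graph's nodes appear, as long as all of a graph's nodes share one segment id.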