model.py source code

python

Project: Attention_Based_LSTM_AspectBased_SA    Author: gangeshwark
def _init_aspect_embeddings(self):
    with tf.variable_scope("AspectEmbedding") as scope:
        self.input_shape = tf.shape(self.inputs)
        # Uniform(-sqrt(3), sqrt(3)) has variance = 1.
        sqrt3 = tf.sqrt(3.0)
        initializer = tf.random_uniform_initializer(-sqrt3, sqrt3)

        # Alternative kept by the author: a trainable, randomly initialized
        # aspect embedding matrix.
        """self.aspect_embedding_matrix = tf.get_variable(
            name="aspect_embedding_matrix",
            shape=[self.aspect_vocab_size, self.aspect_embedding_size],
            initializer=initializer,
            dtype=tf.float32)"""

        # Non-trainable embedding matrix; its values are copied in at run time
        # from a placeholder (e.g. pre-trained aspect embeddings).
        self.aspect_embedding_matrix = tf.Variable(
            tf.constant(0.0, shape=[self.aspect_vocab_size, self.aspect_embedding_size]),
            trainable=False, name="aspect_embedding_matrix")
        self.aspect_embedding_placeholder = tf.placeholder(
            tf.float32, [self.aspect_vocab_size, self.aspect_embedding_size])
        self.aspect_embedding_init = self.aspect_embedding_matrix.assign(
            self.aspect_embedding_placeholder)

        # Look up the embedding of each input aspect, then tile it across the
        # N time steps of the input sequence.
        self.input_aspect_embedded = tf.nn.embedding_lookup(
            self.aspect_embedding_matrix, self.input_aspect)  # -> [batch_size, da]
        s = tf.shape(self.input_aspect_embedded)
        self.input_aspect_embedded_final = tf.tile(
            tf.reshape(self.input_aspect_embedded, (s[0], -1, s[1])),
            (1, self.input_shape[1], 1))  # -> [batch_size, N, da]