import tensorflow as tf
import tensorflow.contrib.slim as slim

# FLAGS (with num_classes and num_cat_dists) is assumed to be defined
# elsewhere in the file.


def encoder(x):
    # Variational posterior q(y|x), i.e. the encoder (shape=(batch_size, 200))
    net = slim.stack(x, slim.fully_connected, [512, 256])

    # Unnormalized logits for N separate K-categorical distributions
    logits_y = tf.reshape(
        slim.fully_connected(net,
                             FLAGS.num_classes * FLAGS.num_cat_dists,
                             activation_fn=None),
        [-1, FLAGS.num_cat_dists])

    # Softmax along the last axis gives the categorical posterior q(y|x);
    # the small constant keeps the log numerically stable.
    q_y = tf.nn.softmax(logits_y)
    log_q_y = tf.log(q_y + 1e-20)
    return logits_y, q_y, log_q_y
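
# The sketch below shows how the encoder's logits are typically consumed in a
# Gumbel-Softmax VAE: Gumbel(0, 1) noise is added to the logits, and a
# temperature-scaled softmax yields a differentiable relaxation of a
# categorical sample. These helpers are illustrative only; they assume the
# same TF 1.x setup as the encoder above and are not part of this file.
def sample_gumbel(shape, eps=1e-20):
    # Inverse-transform sampling from Gumbel(0, 1).
    u = tf.random_uniform(shape, minval=0, maxval=1)
    return -tf.log(-tf.log(u + eps) + eps)


def gumbel_softmax_sample(logits, temperature):
    # Perturb the logits with Gumbel noise and apply a temperature-scaled
    # softmax, giving a soft (differentiable) approximation of a one-hot sample.
    y = logits + sample_gumbel(tf.shape(logits))
    return tf.nn.softmax(y / temperature)


# Hypothetical usage with the encoder above (tau is the relaxation temperature):
#   logits_y, q_y, log_q_y = encoder(x)
#   y = gumbel_softmax_sample(logits_y, tau)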
Source file: vae_gumbel_softmax.py (Python)