with_locations_lstm.py source code

python

Project: neural-CWS    Author: Akuchi612
import numpy as np
from keras.utils import np_utils
from keras.models import Model
from keras.layers import Input, Embedding, LSTM, Bidirectional, Dropout, Dense, concatenate
from keras.callbacks import ModelCheckpoint


def build_model(data, word_weights, max_len, tag_window=5, embed_dim=100, location_dim=10):
    batch_size = 2048
    nb_epoch = 16
    nb_class = 4
    hidden_dim = 128

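    # Stack the per-sample windows into arrays; labels are one-hot encoded over the nb_class tags.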
    train_x = np.array(list(data['x']))
    train_l = np.array(list(data['l']))
    train_y = np.array(list(data['y']))
    train_y = np_utils.to_categorical(train_y, nb_class)

    print(train_x.shape)
    print(train_l.shape)
    print(train_y.shape)
    # Window inputs: character indices (input_x) and location/position indices (input_l).
    # Embedding layers expect integer indices, so 'int32' is the appropriate dtype here.
    input_x = Input(shape=(tag_window, ), dtype='int32', name='input_x')
    input_l = Input(shape=(tag_window, ), dtype='int32', name='input_l')

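    # Look up pretrained embeddings for the characters in the window (embed_x)
    # and a trainable location embedding for each position index (embed_l).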
    embed_x = Embedding(output_dim=embed_dim, 
            input_dim=word_weights.shape[0],
            input_length=tag_window,
            weights=[word_weights],
            name='embed_x')(input_x)
    embed_l = Embedding(output_dim=location_dim, 
            input_dim=max_len,
            input_length=tag_window,
            name='embed_l')(input_l)

    # Concatenate the two embeddings along the feature axis
    # (Keras 2 functional API; the Keras 1 merge(..., mode='concat') helper no longer exists).
    merge_embed = concatenate([embed_x, embed_l], axis=2)
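    # A Bi-LSTM summarizes the window; with merge_mode='sum' the forward and backward
    # outputs are added, and return_sequences=False keeps only the final output vector.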
    bi_lstm = Bidirectional(LSTM(hidden_dim, return_sequences=False), merge_mode='sum')(merge_embed)
    x_dropout = Dropout(0.5)(bi_lstm)
    x_output = Dense(nb_class,
        # kernel_regularizer=regularizers.l2(0.01),
        # kernel_constraint=maxnorm(3.0),
        # activity_regularizer=regularizers.l2(0.01),
        activation='softmax')(x_dropout)
    model = Model(inputs=[input_x, input_l], outputs=[x_output])
    model.compile(optimizer='adamax', loss='categorical_crossentropy', metrics=['accuracy'])
    print('Train...')
    model_path = './model/location_128hidden_2048batch'
    modelcheckpoint = ModelCheckpoint(model_path, verbose=1, save_best_only=True)
    # Pass the checkpoint to fit() via callbacks; otherwise it is created but never used
    # and the best model (by validation loss) is never saved.
    model.fit([train_x, train_l], [train_y], validation_split=0.2,
            batch_size=batch_size, epochs=nb_epoch, shuffle=True,
            callbacks=[modelcheckpoint])
    return model
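
Below is a minimal usage sketch with randomly generated placeholder data. The sizes, the n_samples name, and the ./model directory creation are illustrative assumptions, not part of the original project, which prepares the data dict (lists under 'x', 'l', 'y' holding character-index windows, position-index windows, and one tag id per window) and the pretrained embedding matrix from its own corpus.

import os

import numpy as np

# Illustrative sizes; the real project derives these from its corpus.
vocab_size, embed_dim, max_len, tag_window = 5000, 100, 80, 5
n_samples = 100

word_weights = np.random.rand(vocab_size, embed_dim)    # stand-in for pretrained character embeddings
data = {
    'x': [np.random.randint(0, vocab_size, tag_window) for _ in range(n_samples)],  # character-index windows
    'l': [np.random.randint(0, max_len, tag_window) for _ in range(n_samples)],     # position-index windows
    'y': [np.random.randint(0, 4) for _ in range(n_samples)],                       # one of the 4 tag classes
}

os.makedirs('./model', exist_ok=True)    # ModelCheckpoint needs the output directory to exist
model = build_model(data, word_weights, max_len, tag_window=tag_window, embed_dim=embed_dim)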