Group Meeting (郁): Aspect Level Sentiment Classification with Deep Memory Network

2020-03-01

  • 1.Aspect Level Sentiment Classification with Deep Memory Network. Duyu Tang, Bing Qin∗, Ting Liu. Harbin Institute of Technology, Harbin, China. {dytang, qinb, tliu}@ir.hit.edu.cn (presenter: 郁)
  • 2.Problem definition. SemEval-2014 Task 4 and SemEval-2016 Task 5: Aspect Based Sentiment Analysis. Each instance is a (Sentence, Aspect, Sentiment) triple. e.g.1 Sentence: "But the $T$ was so horrible to us ." Aspect: staff. Sentiment: -1.
  • 3.e.g.2 One sentence, five aspect annotations ($T$ marks the aspect term): "I had a terrific meal , and our server guided us toward a very nice wine in our price range , instead of allowing us to purchase a similarly priced wine that was n't as good ."
    - $T$ = meal → +1
    - $T$ = server → +1
    - $T$ = wine ("a very nice $T$") → +1
    - $T$ = wine ("a similarly priced $T$ that was n't as good") → -1
    - (the fifth example is cut off on the slide: "... a similarly $T$ wine that was")
  • 4.Related works: 1. Majority: assigns the most frequent sentiment label in the training set to each test instance. 2. Feature-based SVM: feature engineering, including n-gram features, parse features, and lexicon features (the best feature-based system). 3. LSTM -> TD-LSTM: uses two LSTM networks, a forward one and a backward one, both running towards the aspect word. 4. ContextAVG: the context word vectors are averaged and the result is added to the aspect vector (see the sketch below).
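The ContextAVG baseline in item 4 above is simple enough to sketch directly; a minimal illustration with randomly initialized embeddings (variable names are mine, not from any released code):

```python
import numpy as np

def context_avg(context_embeddings, aspect_vector):
    """ContextAVG baseline: average the context word vectors, then add the aspect vector."""
    return context_embeddings.mean(axis=1, keepdims=True) + aspect_vector

rng = np.random.default_rng(0)
d, k = 4, 6                                   # embedding size, number of context words
sentence_rep = context_avg(rng.normal(size=(d, k)), rng.normal(size=(d, 1)))
print(sentence_rep.shape)                     # (d, 1), fed to a softmax classifier
```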
  • 5.Memory Network
  • 6.(figure-only slide)
  • 7.Model: the input sentence is a word sequence $w_1, w_2, \dots, w_{i-1}, w_i, w_{i+1}, \dots, w_{n-1}, w_n$, where $w_i$ is the aspect word.
  • 8.Model: the context words $w_1, \dots, w_{i-1}, w_{i+1}, \dots, w_n$ are mapped into the context embedding matrix (the memory), and the aspect word $w_i$ into the aspect vector.
  • 9.Model: an attention module reads the context embedding matrix, conditioned on the aspect vector.
  • 10.Model: (same diagram as slide 9; animation step).
  • 11.Model: the attention output over the context memory is combined with the aspect vector to produce the output representation.
  • 12.Dimensions: hidden size $d$, context size $k$. Aspect vector $v_{aspect} \in \mathbb{R}^{d \times 1}$, context embedding matrix $M \in \mathbb{R}^{d \times k}$, attention output $\in \mathbb{R}^{d \times 1}$, output aspect vector $\in \mathbb{R}^{d \times 1}$.
  • 13.The memory is built from the context embeddings weighted by location information: $M = E_{context} \odot V(L_{context})$, with $M \in \mathbb{R}^{d \times k}$ and $v_{aspect} \in \mathbb{R}^{d \times 1}$ (see the sketch below).
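A minimal sketch of building such a location-weighted memory, assuming the simplest location model described in the paper (roughly $v_i = 1 - l_i/n$, where $l_i$ is the distance from word $i$ to the aspect word); names and sizes here are illustrative:

```python
import numpy as np

d, n = 4, 9                    # embedding size, sentence length
aspect_pos = 4                 # index of the aspect word
rng = np.random.default_rng(0)

E = rng.normal(size=(d, n))              # word embeddings, one column per word
l = np.abs(np.arange(n) - aspect_pos)    # l_i: distance of word i to the aspect word
v = 1.0 - l / n                          # location weight v_i = 1 - l_i / n (assumed location model)

# memory: embeddings scaled element-wise by their location weight,
# with the aspect word itself removed from the memory
M = np.delete(E * v, aspect_pos, axis=1)   # shape (d, n-1) = (d, k)
print(M.shape)
```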
  • 14.Attention scoring: for each memory column $m_i \in \mathbb{R}^{d \times 1}$, $g_i = \tanh(W_{att}[m_i; v_{aspect}] + b_{att}) \in \mathbb{R}^{1 \times 1}$, with $W_{att} \in \mathbb{R}^{1 \times 2d}$ and $b_{att} \in \mathbb{R}^{1 \times 1}$; the scores are normalized with a softmax, $\alpha_i = \exp(g_i) / \sum_{j=1}^{k} \exp(g_j)$.
  • 15.The attention output is the weighted sum of the memory columns: $vec = \sum_{i=1}^{k} \alpha_i m_i \in \mathbb{R}^{d \times 1}$ (see the sketch below).
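A minimal numpy sketch of the content-attention step on slides 14-15; the shapes follow the slides, while `W_att` and `b_att` stand for the learned attention parameters and are randomly initialized here just for illustration:

```python
import numpy as np

d, k = 4, 6                           # hidden size d, number of context words k
rng = np.random.default_rng(0)

M = rng.normal(size=(d, k))           # memory / context embedding matrix, one column per context word
v_aspect = rng.normal(size=(d, 1))    # aspect vector

W_att = rng.normal(size=(1, 2 * d))   # attention weights, R^{1 x 2d}
b_att = np.zeros((1, 1))              # attention bias, R^{1 x 1}

# g_i = tanh(W_att [m_i ; v_aspect] + b_att) for each memory column m_i
g = np.array([
    np.tanh(W_att @ np.vstack([M[:, i:i+1], v_aspect]) + b_att).item()
    for i in range(k)
])

# alpha_i = exp(g_i) / sum_j exp(g_j)  (max subtracted for numerical stability)
alpha = np.exp(g - g.max())
alpha /= alpha.sum()

# vec = sum_i alpha_i m_i  ->  shape (d, 1)
vec = M @ alpha.reshape(k, 1)
print(alpha.round(3), vec.shape)
```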
  • 16.The diagram maps the model onto memory-network components: the context embedding matrix serves as the memory $m$, and the model is annotated with the I (input), O (output), and R (response) components.
  • 17.One hop (layer) vs. multiple hops: hops are stacked, and the output of each hop serves as the query for the next (see the sketch below).
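A sketch of how the hops could be stacked, reusing the attention step above. The per-hop linear transform `W_linear` applied to the query and the sharing of parameters across hops follow the paper's description, but the function and variable names are mine:

```python
import numpy as np

def attention(M, query, W_att, b_att):
    """Content attention over memory M (d x k) given a query vector (d x 1)."""
    d, k = M.shape
    g = np.array([
        np.tanh(W_att @ np.vstack([M[:, i:i+1], query]) + b_att).item()
        for i in range(k)
    ])
    alpha = np.exp(g - g.max())
    alpha /= alpha.sum()
    return M @ alpha.reshape(k, 1)

def deep_memory_forward(M, v_aspect, hops, W_att, b_att, W_linear):
    """Stack `hops` identical layers; each hop's output becomes the next hop's query."""
    x = v_aspect
    for _ in range(hops):
        x = attention(M, x, W_att, b_att) + W_linear @ x   # attention result + linear transform
    return x    # final (d x 1) vector, fed to a softmax classifier in the paper

d, k = 4, 6
rng = np.random.default_rng(0)
out = deep_memory_forward(
    M=rng.normal(size=(d, k)),
    v_aspect=rng.normal(size=(d, 1)),
    hops=3,
    W_att=rng.normal(size=(1, 2 * d)),
    b_att=np.zeros((1, 1)),
    W_linear=rng.normal(size=(d, d)),
)
print(out.shape)   # (d, 1)
```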
  • 18.Result
  • 19.Discussion of how MemNet relates to other neural networks: to feed-forward networks; to recurrent networks; and why FastText works.
  • 20.Can a neural network do multiplication? Implementing an XOR gate (see the sketch below).
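The XOR question on slide 20 has a classic constructive answer: one hidden layer with two units and a nonlinearity suffices. A minimal sketch with hand-set weights (one of many valid weight choices, using a step activation):

```python
import numpy as np

def xor_mlp(x1, x2):
    """XOR via a 2-unit hidden layer: XOR(a, b) = OR(a, b) AND NOT AND(a, b)."""
    x = np.array([x1, x2])
    # hidden layer: h1 fires on OR, h2 fires on AND (step activation)
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([-0.5, -1.5])
    h = (W1 @ x + b1 > 0).astype(float)
    # output: OR minus AND, thresholded
    w2 = np.array([1.0, -1.0])
    b2 = -0.5
    return int(w2 @ h + b2 > 0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))   # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```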
  • 21.Can a neural network do multiplication? Can multiplication be implemented using only fully connected layers and nonlinear activation functions?
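One standard way to reason about the multiplication question (not from the slides): with piecewise-linear activations such as ReLU, a finite fully connected network computes a piecewise-linear function, so it can never represent $xy$ exactly, but it can approximate it arbitrarily well on a bounded domain via the polarization identity together with a one-hidden-layer approximation of the square function:

$$xy = \tfrac{1}{4}\bigl((x+y)^2 - (x-y)^2\bigr), \qquad t^2 \approx \sum_{j=1}^{m} c_j\,\mathrm{ReLU}(a_j t + b_j) \ \text{ on } [-B, B].$$

With smooth activations such as sigmoid or tanh, the same approximation-on-a-compact-domain argument follows from the universal approximation theorem.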