[2017-ACL] ESIM: Enhanced LSTM for Natural Language Inference

一、Architecture (for now, focus only on the left half of the architecture figure)

二、Details

1. Input Encoding

[Embedding] pre-trained word embeddings → [BiLSTM] concatenate the forward and backward hidden states at each time step
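The bidirectional encoding step can be sketched as follows. This is a toy illustration, not the paper's model: a plain tanh RNN cell with random weights stands in for the LSTM, and random vectors stand in for pre-trained embeddings; only the run-both-directions-and-concatenate pattern is the point.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_emb, d_hid = 5, 8, 4          # sequence length, embedding dim, hidden dim
W_x = rng.standard_normal((d_emb, d_hid)) * 0.1   # hypothetical cell weights
W_h = rng.standard_normal((d_hid, d_hid)) * 0.1

def rnn_states(x):
    """Run the toy cell over x (T, d_emb); return all hidden states (T, d_hid)."""
    h = np.zeros(d_hid)
    states = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_x + h @ W_h)
        states.append(h)
    return np.stack(states)

x = rng.standard_normal((T, d_emb))               # stands in for word embeddings
h_fwd = rnn_states(x)                             # forward pass
h_bwd = rnn_states(x[::-1])[::-1]                 # backward pass, re-aligned in time
a_bar = np.concatenate([h_fwd, h_bwd], axis=-1)   # (T, 2*d_hid) per-step concat
print(a_bar.shape)                                # (5, 8)
```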

2. Local Inference Modeling (key contribution)

(1) Local inference information: soft attention between the two sentences, a to b and b to a
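The soft-alignment step can be sketched like this, assuming the two sentences have already been encoded into a_bar and b_bar (toy random values here). Each score e_ij is the dot product of a_i and b_j; each a_i is then summarized by an attention-weighted sum over b, and vice versa.

```python
import numpy as np

rng = np.random.default_rng(1)
Ta, Tb, d = 4, 6, 8
a_bar = rng.standard_normal((Ta, d))      # stands in for encoded premise
b_bar = rng.standard_normal((Tb, d))      # stands in for encoded hypothesis

e = a_bar @ b_bar.T                       # (Ta, Tb) unnormalized attention scores

def softmax(z, axis):
    z = z - z.max(axis=axis, keepdims=True)
    p = np.exp(z)
    return p / p.sum(axis=axis, keepdims=True)

a_tilde = softmax(e, axis=1) @ b_bar      # (Ta, d): b-side summary for each a_i
b_tilde = softmax(e, axis=0).T @ a_bar    # (Tb, d): a-side summary for each b_j
print(a_tilde.shape, b_tilde.shape)
```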

(2) Enhancement of local inference information: concatenate each encoded vector with its attended counterpart, their difference, and their element-wise product
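The enhancement step is a single concatenation. The sketch below assumes an encoded sequence a_bar and its attended counterpart a_tilde (toy random values); stacking the two with their difference and element-wise product quadruples the feature dimension.

```python
import numpy as np

rng = np.random.default_rng(2)
Ta, d = 4, 8
a_bar = rng.standard_normal((Ta, d))      # stands in for encoded states
a_tilde = rng.standard_normal((Ta, d))    # stands in for attended summaries

# m_a = [a_bar; a_tilde; a_bar - a_tilde; a_bar * a_tilde]
m_a = np.concatenate([a_bar, a_tilde, a_bar - a_tilde, a_bar * a_tilde], axis=-1)
print(m_a.shape)                          # (4, 32): feature dim becomes 4 * d
```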

3. Inference Composition: a second BiLSTM over the enhanced representations

4. Prediction: average and max pooling over each composed sequence, then an MLP classifier with softmax
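The prediction layer can be sketched as below, assuming composed sequences v_a and v_b (toy random values) and a hypothetical single linear layer in place of the paper's MLP. Each sequence is reduced by both average and max pooling, the four pooled vectors are concatenated, and a softmax gives the class distribution.

```python
import numpy as np

rng = np.random.default_rng(3)
Ta, Tb, d, n_classes = 4, 6, 8, 3
v_a = rng.standard_normal((Ta, d))        # stands in for composed premise states
v_b = rng.standard_normal((Tb, d))        # stands in for composed hypothesis states

# Average- and max-pool each sequence over time, then concatenate: (4*d,)
v = np.concatenate([v_a.mean(axis=0), v_a.max(axis=0),
                    v_b.mean(axis=0), v_b.max(axis=0)])

W = rng.standard_normal((4 * d, n_classes)) * 0.1   # hypothetical classifier weights
logits = v @ W
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs)     # distribution over the three NLI labels
```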
