[NLP Project] LSTM + self-attention
Adding self-attention layers to the model: a SeqSelfAttention layer (from the keras-self-attention package) is inserted after each bidirectional LSTM, so each LSTM's output sequence is re-weighted by attention before being passed to the next layer.

model.py:

from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, Dense
from keras_self_attention import SeqSelfAttention  # pip install keras-self-attention

model = Sequential()
model.add(Bidirectional(LSTM(units=128, return_sequences=True, dropout=0.2, recurrent_dropout=0.2)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(Bidirectional(LSTM(units=64, return_sequences=True, dropout=0.2, recurrent_dropout=0.2)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# NOTE: return_sequences=True keeps the time dimension, so Dense(1) here yields
# one sigmoid output per time step; a Flatten or pooling layer is typically
# added before it for a single per-sequence prediction (see the sketch below).
model.add(Dense(1, activation='sigmoid'))
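For context, here is a minimal end-to-end training sketch around this architecture. It assumes an Embedding input layer, binary labels, and toy hyperparameters: vocab_size, max_len, embedding_dim, and the random data are illustrative assumptions, not values from the project.

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM, Dense, Flatten
from keras_self_attention import SeqSelfAttention

# Assumed hyperparameters for illustration only.
vocab_size, max_len, embedding_dim = 10000, 100, 128

model = Sequential()
# Map integer-encoded tokens to dense vectors; input_length fixes the
# sequence length so Flatten below has a known static shape.
model.add(Embedding(vocab_size, embedding_dim, input_length=max_len))
model.add(Bidirectional(LSTM(units=128, return_sequences=True, dropout=0.2, recurrent_dropout=0.2)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(Bidirectional(LSTM(units=64, return_sequences=True, dropout=0.2, recurrent_dropout=0.2)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# Collapse the time dimension so the final Dense layer emits one
# prediction per sequence instead of one per time step.
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Toy data: 32 integer-encoded sequences of length max_len with binary labels.
x = np.random.randint(1, vocab_size, size=(32, max_len))
y = np.random.randint(0, 2, size=(32,))
model.fit(x, y, epochs=1, batch_size=8)

One practical note: SeqSelfAttention is a custom layer, so reloading a saved model requires passing it to load_model, e.g. custom_objects={'SeqSelfAttention': SeqSelfAttention}.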