This repository reproduces the seq2seq attention mechanism proposed in the paper "Neural Machine Translation by Jointly Learning to Align and Translate" (Bahdanau et al., 2014).
seq2seq_translation.py
is an implementation of soft attention with a standard encoder-decoder model built on a GRU (a sketch of the core attention computation is given below the file descriptions)
seq2seq_translation_2.py
is a modified version of soft attention with a slightly different implementation and a multi-layer GRU
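As a rough illustration of the soft-attention step that both scripts implement, the sketch below computes additive (Bahdanau) attention over the encoder hidden states and returns the context vector for the decoder. The class and argument names (AdditiveAttention, dec_dim, enc_dim, attn_dim) are illustrative assumptions and do not necessarily match the variables used in seq2seq_translation.py or seq2seq_translation_2.py.

```python
# Minimal sketch of additive (Bahdanau) attention, assuming batch-first tensors.
# Names here are illustrative only and need not match the repository's code.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, dec_dim: int, enc_dim: int, attn_dim: int):
        super().__init__()
        self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)  # projects decoder state s_{i-1}
        self.W_h = nn.Linear(enc_dim, attn_dim, bias=False)  # projects encoder states h_j
        self.v = nn.Linear(attn_dim, 1, bias=False)          # scores e_ij = v^T tanh(...)

    def forward(self, dec_state, enc_outputs):
        # dec_state:   (batch, dec_dim)          previous decoder hidden state
        # enc_outputs: (batch, src_len, enc_dim) all encoder hidden states
        energy = torch.tanh(self.W_s(dec_state).unsqueeze(1) + self.W_h(enc_outputs))
        scores = self.v(energy).squeeze(-1)                  # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)              # alignment weights alpha_ij
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)  # (batch, enc_dim)
        return context, weights

# Usage with random tensors:
attn = AdditiveAttention(dec_dim=256, enc_dim=512, attn_dim=128)
s = torch.randn(4, 256)        # decoder state for a batch of 4
H = torch.randn(4, 20, 512)    # 20 encoder states per source sentence
context, alpha = attn(s, H)    # context: (4, 512), alpha: (4, 20)
```

In the full model, the context vector is concatenated with the previous target embedding and fed to the decoder GRU at each step, as described in the paper.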
Install the dependencies:

    pip install torch
    pip install numpy

Run the training script:

    python seq2seq_translation.py
@article{bahdanau2014neural,
title={Neural machine translation by jointly learning to align and translate},
author={Bahdanau, Dzmitry and Cho, Kyunghyun and Bengio, Yoshua},
journal={arXiv preprint arXiv:1409.0473},
year={2014}
}