Sequence to Sequence Attention

This repository reproduces the seq2seq attention model proposed in the paper NEURAL MACHINE TRANSLATION BY JOINTLY LEARNING TO ALIGN AND TRANSLATE.

Files

seq2seq_translation.py is an implementation of soft attention on top of a standard encoder-decoder model built with a GRU.

seq2seq_translation_2.py is a modified version of the soft attention model with a slightly different implementation and a multi-layer GRU. A sketch of the attention scoring used by this style of model is shown below.
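For reference, the following is a minimal sketch of the additive (Bahdanau-style) attention scoring that this kind of model computes at each decoder step. The module and parameter names here are illustrative and are not taken from seq2seq_translation.py or seq2seq_translation_2.py.

    # Minimal additive-attention sketch; names are hypothetical, not from this repo.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AdditiveAttention(nn.Module):
        def __init__(self, hidden_size):
            super().__init__()
            self.W_query = nn.Linear(hidden_size, hidden_size)  # projects decoder state s_{i-1}
            self.W_key = nn.Linear(hidden_size, hidden_size)    # projects encoder outputs h_j
            self.v = nn.Linear(hidden_size, 1, bias=False)      # e_ij = v^T tanh(W_q s + W_k h)

        def forward(self, query, keys):
            # query: (batch, hidden) decoder state; keys: (batch, src_len, hidden) encoder outputs
            scores = self.v(torch.tanh(self.W_query(query).unsqueeze(1) + self.W_key(keys)))
            weights = F.softmax(scores.squeeze(-1), dim=-1)     # alignment weights a_ij over source positions
            context = torch.bmm(weights.unsqueeze(1), keys)     # weighted sum of encoder outputs
            return context.squeeze(1), weights

    # Example: a batch of 2 decoder states attending over 5 encoder positions
    attn = AdditiveAttention(hidden_size=16)
    context, weights = attn(torch.randn(2, 16), torch.randn(2, 5, 16))
    print(context.shape, weights.shape)  # torch.Size([2, 16]) torch.Size([2, 5])

The context vector returned for each decoder step is concatenated with the decoder input (or state) before predicting the next target token, which is the core idea of the soft-attention decoder in the cited paper.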

Run

pip install torch
pip install numpy
python seq2seq_translation.py

Results

[Result images: see the repository page.]

Citation

@article{bahdanau2014neural,
  title={Neural machine translation by jointly learning to align and translate},
  author={Bahdanau, Dzmitry and Cho, Kyunghyun and Bengio, Yoshua},
  journal={arXiv preprint arXiv:1409.0473},
  year={2014}
}
