Date of Completion
3-10-2019
Degree Type
Honors Thesis
Discipline
Mathematics (MATH)
First Advisor
Dr. Thomas Laurent
Abstract
Neural machine translation (NMT) is the primary approach used in industry to perform machine translation. This state-of-the-art approach is an application of deep learning in which massive datasets of translated sentences are used to train a model capable of translating between two languages. The architecture behind neural machine translation consists of two recurrent neural networks used in tandem to form an encoder-decoder structure. Attention mechanisms have recently been developed to further increase the accuracy of these models. In this senior thesis, the various components of neural machine translation are explored toward the eventual creation of a tutorial on the topic. The first half of this paper explains in depth each of the components that go into creating an NMT model. With an understanding of the mechanics of NMT established, the second portion of this paper briefly outlines the enhancements made to the PyTorch tutorial on NMT to create an updated and more effective tutorial on the topic.
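For readers unfamiliar with the encoder-decoder structure described in the abstract, the sketch below gives a minimal, illustrative GRU-based encoder and decoder in PyTorch, similar in spirit to the PyTorch NMT tutorial the thesis extends. All class names, dimensions, and token indices here are assumptions for illustration only (and the attention mechanism is omitted); they are not the thesis's actual code.

```python
# Minimal illustrative encoder-decoder sketch (no attention); sizes are arbitrary.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, input_vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(input_vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src_tokens):
        # src_tokens: (batch, src_len) of source-language token indices
        embedded = self.embedding(src_tokens)
        outputs, hidden = self.gru(embedded)
        return outputs, hidden  # hidden state summarizes the source sentence

class Decoder(nn.Module):
    def __init__(self, output_vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(output_vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, output_vocab_size)

    def forward(self, tgt_tokens, hidden):
        # tgt_tokens: (batch, 1) previous target token; hidden: carried-over state
        embedded = self.embedding(tgt_tokens)
        output, hidden = self.gru(embedded, hidden)
        logits = self.out(output)  # scores over the target vocabulary
        return logits, hidden

# One decoding step on toy data
encoder = Encoder(input_vocab_size=1000, hidden_size=256)
decoder = Decoder(output_vocab_size=1200, hidden_size=256)
src = torch.randint(0, 1000, (2, 7))            # batch of 2 source sentences
enc_outputs, state = encoder(src)
prev_tok = torch.zeros(2, 1, dtype=torch.long)  # assumed <sos> index 0
logits, state = decoder(prev_tok, state)        # predicts the first target word
```

In the full model, decoding repeats this step token by token, feeding each predicted word back in, and an attention mechanism would additionally weight the encoder outputs at every step.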
Recommended Citation
Lanners, Quinn M. and Laurent, Thomas, "Neural Machine Translation" (2019). Honors Thesis. 201.
https://digitalcommons.lmu.edu/honors-thesis/201
Comments
Along with the final thesis paper, I have also included a copy of the PPT deck I used to present on the same topic.