Date of Completion

3-10-2019

Degree Type

Honors Thesis

Discipline

Mathematics (MATH)

First Advisor

Dr. Thomas Laurent

Abstract

Neural Machine Translation (NMT) is the primary approach used in industry to perform machine translation. This state-of-the-art technique is an application of deep learning in which massive datasets of translated sentences are used to train a model capable of translating between a pair of languages. The architecture behind neural machine translation consists of two recurrent neural networks working in tandem to form an encoder-decoder structure. Attention mechanisms have recently been developed to further increase the accuracy of these models. In this senior thesis, the various components of neural machine translation are explored with the goal of creating a tutorial on the topic. The first half of the paper explains in depth each of the elements that go into building an NMT model. Building on that understanding of the mechanics of NMT, the second portion briefly outlines the enhancements that were made to the PyTorch tutorial on NMT to produce an updated and more effective tutorial on the topic.
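
The encoder-decoder structure mentioned in the abstract can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration only; the hidden size, vocabulary sizes, GRU cells, and start-of-sentence token id are assumptions for the example and are not the exact configuration used in the thesis.

```python
# Minimal encoder-decoder sketch in PyTorch (illustrative assumptions throughout).
import torch
import torch.nn as nn

HIDDEN_SIZE = 256
SRC_VOCAB = 1000   # assumed source-language vocabulary size
TGT_VOCAB = 1000   # assumed target-language vocabulary size


class Encoder(nn.Module):
    """Reads the source sentence and compresses it into a hidden state."""
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(SRC_VOCAB, HIDDEN_SIZE)
        self.gru = nn.GRU(HIDDEN_SIZE, HIDDEN_SIZE, batch_first=True)

    def forward(self, src_tokens):
        embedded = self.embedding(src_tokens)      # (batch, src_len, hidden)
        outputs, hidden = self.gru(embedded)       # hidden: (1, batch, hidden)
        return outputs, hidden


class Decoder(nn.Module):
    """Generates the target sentence one token at a time from the encoder state."""
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(TGT_VOCAB, HIDDEN_SIZE)
        self.gru = nn.GRU(HIDDEN_SIZE, HIDDEN_SIZE, batch_first=True)
        self.out = nn.Linear(HIDDEN_SIZE, TGT_VOCAB)

    def forward(self, tgt_token, hidden):
        embedded = self.embedding(tgt_token)       # (batch, 1, hidden)
        output, hidden = self.gru(embedded, hidden)
        logits = self.out(output)                  # scores over the target vocabulary
        return logits, hidden


# Toy usage: encode a batch of source token ids, then decode one step.
encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (2, 7))          # batch of 2 sentences, length 7
_, hidden = encoder(src)
sos = torch.zeros(2, 1, dtype=torch.long)          # assumed start-of-sentence id 0
logits, hidden = decoder(sos, hidden)
print(logits.shape)                                # torch.Size([2, 1, 1000])
```

In this sketch the decoder consumes only the encoder's final hidden state; an attention mechanism, as discussed in the thesis, would additionally let each decoding step weigh all of the encoder outputs.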

Comments

Along with the final thesis paper, I have also included a copy of the PowerPoint deck I used to present on the same topic.
