Fusion: Practice and Applications (FPA)
ISSN: 2692-4048 | 2770-0070
DOI prefix: 10.54216/FPA
https://www.americaspg.com/journals/show/775
2018
Robust Neural Language Translation Model Formulation using Seq2seq approach
Meenu Gupta, Chandigarh University, India
Prince Kumar, Chandigarh University, India
This work applies sequence-to-sequence models, which have achieved excellent performance on language translation (encoding-decoding) tasks. The translation model is built on the sequence-to-sequence approach: a Long Short-Term Memory (LSTM) network maps the input sequence to a vector of fixed dimensionality, and a second deep LSTM decodes the target sequence from that vector. Model quality is evaluated with the BLEU score; the LSTM's BLEU score is penalized on out-of-vocabulary words, but the model does not have difficulty with long sentences. The deep LSTM setup performs English-Japanese translation at an order-of-magnitude faster speed, on both GPU and CPU. Varied data is then introduced to evaluate the model's robustness via the BLEU score. Finally, a better result is achieved by merging two different types of datasets, yielding the highest BLEU score of 40.1.
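As an illustration of the encoder-decoder architecture the abstract describes (a minimal sketch, not the authors' code), the following PyTorch snippet uses one LSTM to compress the source sentence into fixed-size hidden states and a second LSTM to decode the target sequence from them. All class names, vocabulary sizes, and hyperparameters below are illustrative assumptions.

import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Encoder LSTM: maps the source sequence to fixed-size (h, c) states.
        self.encoder = nn.LSTM(emb_dim, hid_dim, layers, batch_first=True)
        # Decoder LSTM: generates the target sequence from the encoder states.
        self.decoder = nn.LSTM(emb_dim, hid_dim, layers, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        # (h, c) is the fixed-dimensional summary of the whole input sentence.
        _, (h, c) = self.encoder(self.src_emb(src))
        dec_out, _ = self.decoder(self.tgt_emb(tgt), (h, c))
        return self.out(dec_out)  # per-step logits over the target vocabulary

# Usage sketch with random token ids (batch of 8, source length 20, target 15).
model = Seq2Seq(src_vocab=10000, tgt_vocab=12000)
src = torch.randint(0, 10000, (8, 20))
tgt = torch.randint(0, 12000, (8, 15))
logits = model(src, tgt)  # shape: (8, 15, 12000)

For the BLEU evaluation the abstract mentions, a corpus-level score over decoded hypotheses can be computed with standard tooling such as sacreBLEU or NLTK's corpus_bleu.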
2021, pp. 70-76
10.54216/FPA.050203
https://www.americaspg.com/articleinfo/3/show/775