Natural Language Processing (NLP): Language Translation in Multilingual Chatbots - Challenges and Solutions
EOI: 10.11242/viva-tech.01.05.001
Citation
Nitesh Kumar, Kaushal Korgaonkar, Jaykumar Rajane, "Natural Language Processing (NLP): Language Translation in Multilingual Chatbots - Challenges and Solutions", VIVA-IJRI Volume 1, Issue 7, Article 1, pp. 1-7, 2024. Published by Master of Computer Application Department, VIVA Institute of Technology, Virar, India.
Keywords
Context-aware Translation, Domain-specific Adaptations, Multilingual Chatbots, Natural Language Processing, Real-time Communication.
References
- Cho, K., van Merriënboer, B., Bahdanau, D., & Bengio, Y. (2014). On the properties of neural machine translation: Encoder-decoder approaches. arXiv preprint arXiv:1409.1259.
- Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). Attention is all you need. In Advances in neural information processing systems (pp. 5998-6008).
- Papineni, K., Roukos, S., Ward, T., & Zhu, W. J. (2002). BLEU: a method for automatic evaluation of machine translation. In Proceedings of the 40th annual meeting of the Association for Computational Linguistics (pp. 311-318).
- Johnson, M., Schuster, M., Le, Q. V., Krikun, M., Wu, Y., Chen, Z., ... & Dean, J. (2017). Google's multilingual neural machine translation system: Enabling zero-shot translation. Transactions of the Association for Computational Linguistics, 5, 339-351.
- Luong, T., Pham, H., & Manning, C. D. (2015). Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025.
- Koehn, P. (2005). Europarl: A parallel corpus for statistical machine translation. In MT summit (Vol. 5, pp. 79-86). https://www.pure.ed.ac.uk/ws/portalfiles/portal/26315407/MTS_2005_Koehn.pdf
- Johnson, R., & Zhang, T. (2017). Deep pyramid convolutional neural networks for text categorization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (Vol. 1, pp. 562-570).
- Model of real-time communication (image). https://benchpartner.com/model-of-real-time-communication
- Vaswani, A., Bengio, S., Brevdo, E., Chollet, F., Gomez, A. N., Gouws, S., ... & Zaremba, W. (2018). Tensor2Tensor for neural machine translation. arXiv preprint arXiv:1803.07416.
- Bahdanau, D., Cho, K., & Bengio, Y. (2015). Neural machine translation by jointly learning to align and translate. In Proceedings of the 3rd International Conference on Learning Representations (ICLR).
- A confirmatory factorial analysis of the Chatbot Usability Scale: a multilanguage validation. https://link.springer.com/article/10.1007/s00779-022-01690-0
- Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., & Zettlemoyer, L. (2019). BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. arXiv preprint arXiv:1910.13461.
- Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to sequence learning with neural networks. In Advances in neural information processing systems (pp. 3104-3112).
- Tiedemann, J. (2012). Parallel data, tools and interfaces in OPUS. In LREC (pp. 2214-2218).
- Ranzato, M., Chopra, S., Auli, M., & Zaremba, W. (2016). Sequence level training with recurrent neural networks. In Proceedings of the 4th International Conference on Learning Representations (ICLR). https://arxiv.org/pdf/1511.06732
- Schuster, M., & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673-2681. https://deeplearning.cs.cmu.edu/F20/document/readings/Bidirectional%20Recurrent%20Neural%20Networks.pdf
- Chen, M., Sun, J., Li, Q., & Zhu, H. (2018). Combining convolutional and recurrent neural networks for sentiment analysis of short texts. In Proceedings of the 27th International Conference on Computational Linguistics (pp. 3526-3536).