Advancements in Natural Language Processing for Human-Computer Interaction

Authors

  • Md Mizanur Rahman, Graduate Researcher, Civil Engineering, University of the District of Columbia, Washington, DC, United States

DOI:

https://doi.org/10.62304/jieet.v3i05.216

Keywords:

Natural Language Processing, Human-Computer Interaction, Deep Learning, Transformer Models, Multimodal Processing

Abstract

This systematic review explores advancements in Natural Language Processing (NLP) for Human-Computer Interaction (HCI) from 2010 to 2024. It highlights the significant breakthroughs achieved through deep learning models, particularly transformer architectures such as BERT and GPT, which have transformed the ability of machines to understand and generate human language. The integration of multimodal capabilities has further enriched user interactions by enabling the processing of diverse data types, including text, audio, and visual inputs. However, the review also identifies persistent challenges: maintaining coherence in long dialogues, resolving ambiguous language, addressing bias in training data, and developing resource-efficient models. Additionally, the paper emphasizes the importance of cross-lingual capabilities for low-resource languages and the necessity of personalized, adaptive systems. The findings underscore the need for ongoing research to overcome existing limitations and enhance the effectiveness and inclusivity of NLP technologies in HCI, ultimately contributing to a more intuitive and accessible user experience.

Published

2024-11-04

How to Cite

Rahman, M. M. (2024). Advancements in Natural Language Processing for Human-Computer Interaction. Global Mainstream Journal of Innovation, Engineering & Emerging Technology, 3(05), 1–10. https://doi.org/10.62304/jieet.v3i05.216