Advancements in Natural Language Processing for Human-Computer Interaction
DOI: https://doi.org/10.62304/jieet.v3i05.216

Keywords: Natural Language Processing, Human-Computer Interaction, Deep Learning, Transformer Models, Multimodal Processing

Abstract
This systematic review examines advancements in Natural Language Processing (NLP) for Human-Computer Interaction (HCI) from 2010 to 2024. It highlights the significant breakthroughs achieved through deep learning models, particularly transformer architectures such as BERT and GPT, which have transformed the ability of machines to understand and generate human language. The integration of multimodal capabilities has further enriched user interactions by enabling the processing of diverse data types, including text, audio, and visual inputs. However, the review also identifies persistent challenges: maintaining coherence in long dialogues, resolving ambiguous language, addressing bias in training data, and developing resource-efficient models. Additionally, the paper emphasizes the importance of cross-lingual capabilities for low-resource languages and the necessity of personalized, adaptive systems. The findings underscore the need for ongoing research to overcome existing limitations and to enhance the effectiveness and inclusivity of NLP technologies in HCI, ultimately contributing to a more intuitive and accessible user experience.