Abstract
This systematic review explores advancements in Natural Language Processing (NLP) for Human-Computer Interaction (HCI) from 2010 to 2024. It highlights the significant breakthroughs achieved through deep learning models, particularly transformer architectures such as BERT and GPT, which have transformed the ability of machines to understand and generate human language. The integration of multimodal capabilities has further enriched user interactions by enabling the processing of diverse data types, including text, audio, and visual inputs. However, the review also identifies persistent challenges, including maintaining coherence in long dialogues, resolving ambiguous language, addressing bias in training data, and developing resource-efficient models. Additionally, the paper emphasizes the importance of cross-lingual capabilities for low-resource languages and the necessity of personalized, adaptive systems. The findings underscore the need for ongoing research to overcome existing limitations and enhance the effectiveness and inclusivity of NLP technologies in HCI, ultimately contributing to a more intuitive and accessible user experience.