Abstract

In this paper, we apply the neural architecture search (NAS) method to Korean grammaticality judgment tasks. Since the word order of a language is the final result of complex syntactic operations, a successful neural architecture search on linguistic data suggests that NAS can automate the design of language models. Although NAS has been applied to language in the literature, we add a novel dataset containing Korean-specific linguistic operations, which add considerable complexity to the patterns. The experimental results suggest that NAS can discover a suitable architecture for the language. Interestingly, NAS proposed an unprecedented structure that would be unlikely to emerge from manual design. Analysis of the final topology of this architecture is the topic of our future research.

Highlights

  • In this paper, we apply a modified neural architecture search (NAS) proposed in [1,2,3,4,5,6] to a grammaticality task for Korean linguistic phenomena

  • Deep learning methods have been applied to psycholinguistics, a field that attempts to identify the cognitive processing of human language [17]

  • We show that the Korean patterns under study do not require recurrent neural network language models [27] to achieve the desired result


Introduction

We apply a modified neural architecture search (NAS) proposed in [1,2,3,4,5,6] to a grammaticality judgment task for Korean linguistic phenomena. This is a novel approach to language modeling of Korean linguistic phenomena in terms of automatic neural architecture design. The successful application of deep learning in various fields is due to its automation of pattern finding and its strong performance on difficult problems [7,8]. Deep learning methods have also been applied to psycholinguistics, a field that attempts to identify the cognitive processing of human language [17]. This success has created a need for architecture engineering, in which increasingly complex neural architectures are designed manually for each task. There are several approaches to autoML: hyper-parameter optimization (HPO), meta-learning, and NAS.
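To make the NAS idea concrete, the following is a minimal random-search sketch over a small space of feed-forward classifiers for a binary grammaticality judgment task. This is not the authors' modified NAS method: the search space, the toy data, and the helper names (sample_architecture, build_model, evaluate) are illustrative assumptions, and a real search would train each candidate before scoring it.

    # A minimal random-search NAS sketch for a binary grammaticality task.
    # Illustrative only: the paper's actual search space, dataset, and
    # training procedure are not specified here.
    import random
    import torch
    import torch.nn as nn

    # Hypothetical search space: hidden width, depth, and activation choice.
    SEARCH_SPACE = {
        "hidden": [32, 64, 128],
        "depth": [1, 2, 3],
        "act": [nn.ReLU, nn.Tanh],
    }

    def sample_architecture():
        # Randomly sample one candidate architecture from the search space.
        return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

    def build_model(arch, input_dim=50):
        # Build a feed-forward classifier (no recurrence) from the sampled spec.
        layers, width = [], input_dim
        for _ in range(arch["depth"]):
            layers += [nn.Linear(width, arch["hidden"]), arch["act"]()]
            width = arch["hidden"]
        layers.append(nn.Linear(width, 2))  # grammatical vs. ungrammatical
        return nn.Sequential(*layers)

    def evaluate(model, X, y):
        # Proxy fitness: accuracy on held-out sentence vectors X with labels y.
        with torch.no_grad():
            preds = model(X).argmax(dim=1)
        return (preds == y).float().mean().item()

    # Toy stand-in data: 200 random sentence vectors with binary labels.
    X, y = torch.randn(200, 50), torch.randint(0, 2, (200,))

    best_arch, best_acc = None, 0.0
    for _ in range(20):  # search budget of 20 sampled candidates
        arch = sample_architecture()
        model = build_model(arch)
        # A real NAS loop would train each candidate before scoring it.
        acc = evaluate(model, X, y)
        if acc > best_acc:
            best_arch, best_acc = arch, acc

    print("best architecture:", best_arch, "accuracy:", best_acc)

Random search is only the simplest baseline; the NAS methods cited in [1,2,3,4,5,6] replace it with learned controllers or gradient-based relaxation, but the overall structure of the loop (sample, evaluate, keep the best) is the same.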
