Abstract

In this work, we attempt to extract Deterministic Finite Automata (DFA) from sequential Recurrent Neural Networks (RNNs) trained on a set of regular grammars. We consider the Long Short-Term Memory (LSTM) architecture, a variant of the RNN. We classify a set of regular grammars by the imbalance between the strings they accept and the strings they reject, using an LSTM architecture. We formulate an Extended Tomita Grammar Set (ETGS) by adding a few more regular grammars to the original Tomita set. The imbalance classes we introduce are Nearly Balanced (NB), Mildly Imbalanced (MI), Highly Imbalanced (HI), and Extremely Imbalanced (EI). We use the L* algorithm for DFA extraction from the trained LSTM networks. We then report the performance of training an LSTM architecture for DFA extraction in the context of these imbalances for the resulting set of regular grammars. We were able to extract the correct minimal DFA for various imbalance classes of regular grammar, though in some cases we could not extract a minimal DFA from the network.

Keywords: Deterministic Finite Automata (DFA); Imbalance; Long Short-Term Memory (LSTM) Network; Sequential Neural Network; Extended Tomita Grammar Set (ETGS)
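To make the notion of imbalance concrete, the sketch below measures the acceptance ratio of a regular language over all binary strings up to a given length and buckets it into the NB/MI/HI/EI classes named above. The membership tests shown (Tomita 1: `1*`, Tomita 2: `(10)*`) are two classic Tomita grammars, and the class thresholds are illustrative assumptions; the abstract does not specify the paper's actual cut-offs or its extended grammar set.

```python
from itertools import product

# Membership tests for two classic Tomita grammars over the alphabet {0, 1}.
def tomita1(s: str) -> bool:
    """Tomita 1: strings containing only 1s (the language 1*)."""
    return "0" not in s

def tomita2(s: str) -> bool:
    """Tomita 2: the language (10)*."""
    return len(s) % 2 == 0 and s == "10" * (len(s) // 2)

def acceptance_ratio(accepts, max_len: int = 10) -> float:
    """Fraction of all binary strings of length 0..max_len that are accepted."""
    total = accepted = 0
    for n in range(max_len + 1):
        for bits in product("01", repeat=n):
            total += 1
            accepted += accepts("".join(bits))
    return accepted / total

def imbalance_class(ratio: float) -> str:
    """Bucket a ratio into an imbalance class.

    The cut-offs below are assumptions for illustration only; the paper's
    definitions of NB/MI/HI/EI are not given in the abstract.
    """
    minority = min(ratio, 1.0 - ratio)  # share of the smaller class
    if minority >= 0.4:
        return "NB"   # Nearly Balanced
    if minority >= 0.1:
        return "MI"   # Mildly Imbalanced
    if minority >= 0.01:
        return "HI"   # Highly Imbalanced
    return "EI"       # Extremely Imbalanced
```

For example, among the 127 binary strings of length at most 6, Tomita 1 accepts only the 7 all-ones strings (one per length, including the empty string), so its acceptance ratio is about 0.055 and it lands in a heavily imbalanced bucket, which illustrates why such languages yield few positive training examples for the LSTM.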

