Abstract

Recently, recurrent neural networks (RNNs) have been used to infer regular grammars from positive and negative examples. Several clustering algorithms have been proposed to extract a finite state automaton (FSA) from the activation patterns of a trained net. However, these methods do not guarantee that the extracted FSA is consistent with the examples, and typically some parameter of the clustering algorithm must be set arbitrarily (e.g., the number of clusters in the k-means method). In this paper we present a hybrid approach to regular grammatical inference based on neural learning and hierarchical clustering. The important new feature of the proposed method is the use of a symbolic representation (an unbiased FSA) and symbolic processing (a merge operation) alongside the clustering performed after neural learning, which guarantees the extraction of a deterministic FSA that is consistent with the examples and "minimal" in size (with respect to the consistent FSAs extractable by hierarchical clustering). Moreover, only the cluster distance measure criterion needs to be defined.

Keywords: Hierarchical Clustering, Recurrent Neural Network, Symbolic Representation, Hidden Unit, Regular Language (these keywords were added by machine and not by the authors; the process is experimental and the keywords may be updated as the learning algorithm improves).
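The idea of merging clusters only when the result stays consistent with the examples can be illustrated with a minimal sketch (this is not the authors' algorithm; the data, distance measure, and consistency test below are simplified assumptions). Each point stands for a hidden-unit activation vector labeled accept or reject, and agglomerative single-linkage merges are refused whenever they would mix the two labels, so the final partition, read as an FSA state set, remains consistent:

```python
from itertools import combinations


def dist(p, q):
    # Euclidean distance between two activation vectors (an assumed measure).
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5


def single_linkage(c1, c2, points):
    # Single-linkage cluster distance: closest pair across the two clusters.
    return min(dist(points[i], points[j]) for i in c1 for j in c2)


def consistent_merge_clustering(points, labels, threshold):
    """Agglomerative clustering constrained by a simplified consistency test:
    a merge is admissible only if the merged cluster does not contain both
    accepting and rejecting states."""
    clusters = [frozenset([i]) for i in range(len(points))]
    while True:
        best = None
        for c1, c2 in combinations(clusters, 2):
            # Skip merges that would mix accept/reject labels (inconsistent).
            if len({labels[i] for i in c1 | c2}) > 1:
                continue
            d = single_linkage(c1, c2, points)
            if d <= threshold and (best is None or d < best[0]):
                best = (d, c1, c2)
        if best is None:
            return clusters  # no admissible merge remains
        _, c1, c2 = best
        clusters.remove(c1)
        clusters.remove(c2)
        clusters.append(c1 | c2)
```

For example, two well-separated groups of activation vectors with uniform labels collapse into two clusters, while nearby vectors with conflicting labels are never merged, mirroring how the symbolic merge operation keeps the extracted FSA consistent.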
