From Generalized Phrase Structure Grammar to Categorial Grammar (and partway back again)

Abstract
The account of extraction using only generalized context-free phrase structure (put forth in a series of papers by Gazdar in the late 1970s and early 1980s and then codified in Generalized Phrase Structure Grammar) used slash as a feature to indicate that something was missing in wh-extraction constructions. Although this was (deliberately) reminiscent of the slash of Categorial Grammar (CG), which encodes argument selection, Gazdar et al. treated it as distinct from the CG slash. Subsequent work by Steedman proposed to unite them. This paper argues, first, that Gazdar et al. were correct to treat the two differently. Second, I advocate a natural view of syntactic categories under the CG world view: we take the function categories of CG to correspond to functions on strings, and with this we preclude what I call S-crossing composition, used in many CG analyses. With this in mind, we suggest that rightward extraction, as in Right Node Raising, really is function composition, while wh-extraction should be handled by something much closer to the account in Gazdar et al. The two behave differently under coordination chains involving a silent and or or. This behavior provides evidence that the two should be kept distinct (see also work by Oehrle on this point), while providing striking evidence for the view of syntactic categories advocated here.
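The contrast the abstract draws between the two combinatory modes can be made concrete. Below is a minimal, illustrative Python sketch (the names and the rightward-only simplification are ours, not the paper's formalism) of slash categories treated as functions on strings, with forward application and forward composition. Under this view composition yields a derived function category still awaiting its argument, which is the sense in which a Right Node Raising remnant "really is" a function; the GPSG slash, by contrast, is a mere feature marking a gap.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Cat:
    """A CG category: atomic (name set) or a rightward function res/arg."""
    name: str = ""
    res: Optional["Cat"] = None
    arg: Optional["Cat"] = None

    def __str__(self) -> str:
        return self.name if self.res is None else f"({self.res}/{self.arg})"

def atom(name: str) -> Cat:
    return Cat(name=name)

def slash(res: Cat, arg: Cat) -> Cat:
    return Cat(res=res, arg=arg)

def apply_fwd(f: Cat, x: Cat) -> Optional[Cat]:
    """Forward application: X/Y  Y  =>  X (consumes the argument outright)."""
    return f.res if f.res is not None and f.arg == x else None

def compose_fwd(f: Cat, g: Cat) -> Optional[Cat]:
    """Forward composition: X/Y  Y/Z  =>  X/Z (passes the missing Z outward)."""
    if f.res is not None and g.res is not None and f.arg == g.res:
        return slash(f.res, g.arg)
    return None

S, NP, N = atom("S"), atom("NP"), atom("N")
vp = slash(S, NP)    # hypothetical, simplified rightward "VP" type S/NP
det = slash(NP, N)   # determiner NP/N

print(apply_fwd(vp, NP))     # S
print(compose_fwd(vp, det))  # (S/N)  -- a derived constituent still missing an N
```

The composed result is itself a function category, so it can coordinate with other strings of the same derived type and then consume a single shared argument, which is the coordination behavior the paper uses to tease the two slashes apart.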

Similar Papers
  • Research Article
  • Cited by 1
  • 10.1162/coli_a_00179
Ivan A. Sag
  • Mar 1, 2014
  • Computational Linguistics
  • Emily M Bender

  • Research Article
  • 10.1353/lan.1994.0006
Formal grammar: Theory and implementation Ed. by Robert Levine (review)
  • Sep 1, 1994
  • Language
  • Tibor Kiss

Formal grammar: Theory and implementation. Ed. by Robert Levine. (Vancouver studies in cognitive science, 2.) Oxford & New York: Oxford University Press, 1992. Pp. x, 439. Cloth $60.00. The present volume grew out of a conference held at Simon Fraser University (Vancouver) in 1989. Most of the papers—those by Fodor (comment by Gawron), Oehrle (comment by Jacobson), Carpenter, and Stabler (comment by Dahl)—deal with formal aspects of syntax, whereas the other contributions cover phonology (Dresher, comment by Church), morphology (Zwicky), semantics (Crain & Hamburger), and neurolinguistics (Kean, Shapiro).
Given the subtitle of Formal grammar, one might expect the main focus of the contributions to be the relationship between linguistic theories and their implementations. However, the editor explains in his brief introduction (vii-x) that the 'notion of implementation was construed rather broadly [...] embracing not only machine-based applications [...] but real-time aspects of human linguistic capability ...' (vii). However broadly the notion of implementation may be construed, some of the contributions (those by Zwicky, Oehrle, and Crain & Hamburger) are related only very loosely to either machine-based applications or real-time aspects of linguistic capacities. Because some of the main articles are less directly related to issues of implementation in a narrow sense, one would expect the commentators to try to relate the theoretical advances of the articles to implementational aspects. This, however, is not the case. But it should be noted that most of the main contributions as well as the comments reflect important research in formal linguistics. Given this constellation, it would perhaps have been a wiser choice to omit the rather misleading subtitle. In 'Learnability of phrase structure grammars' (3-68), Janet Dean Fodor discusses whether standard Generalized Phrase Structure Grammar (GPSG) is learnable. She shows that GPSG does not obey the Subset Principle in its treatment of subjacency. Consequently, a language learner equipped with a GPSG is not capable of learning several languages, since the grammar cannot proceed from more restricted to less restricted rule schemata but predicts the correctness of the less restrictive schemata. The problem is illustrated by extraction data from Polish, English, and Swedish. To overcome the difficulty, Fodor proposes an alternative version of GPSG, where negative constraints on feature propagation are eliminated in favor of default specifications and a rule-based treatment of extraction.
This modification is the starting point for Jean Mark Gawron's remarks (69-78). He admits that the elimination of negative constraints seems plausible in the domain of extraction, but doubts that such a move can be made with respect to anaphoric phenomena. Richard T. Oehrle's contribution (79-128) is a presentation of a variant of categorial grammar—Dynamic Categorial Grammar (DCG)—which conforms to the Lambek calculus. Besides the more familiar scheme of function application, DCG includes function composition and type lifting as basic operations. DCG shows two interesting formal properties: the calculus is decidable, which means that it can be proven for any string whether or not it belongs to a particular language; and it is structurally complete—which amounts to the assumption that, if...

  • Book Chapter
  • Cited by 6
  • 10.1007/978-1-4615-3986-5_5
Parsing with Categorial Grammar in Predictive Normal Form
  • Jan 1, 1991
  • Kent Wittenburg + 1 more

Steedman (1985, 1987), Dowty (1987), Moortgat (1988), Morrill (1988), and others have proposed that Categorial Grammar, a theory of syntax in which grammatical categories are viewed as functional types, be generalized in order to analyze "noncanonical" natural language constructions such as wh-extraction and nonconstituent conjunction. A consequence of these augmentations is an explosion of semantically equivalent derivations admitted by the grammar, a problem we have characterized as spurious ambiguity from the parsing perspective (Wittenburg, 1986). In Wittenburg (1987), it was suggested that the offending rules of these grammars could take an alternate predictive form that would eliminate the problem of spurious ambiguity. This approach, consisting of compiling grammars into forms more suitable for parsing, is within the tradition of discovering normal forms for phrase structure grammars, and thus our title. Our approach stands in contrast to those which are attempting to address the spurious ambiguity problem in Categorial Grammars through the parsing algorithm itself rather than through the grammar (see Gardent & Bes, 1989; Pareschi & Steedman, 1987) and also to those addressing the problem by proof-theoretic means in the Lambek calculus tradition (Bouma, 1989; Hepple & Morrill, 1989; Koenig, 1989; Lambek, 1958; Moortgat, 1986, 1988). We follow the line of Steedman (1985, 1987), Dowty (1987), and various strains of Categorial Unification Grammar (Karttunen, 1986; Uszkoreit, 1986; Wittenburg, 1986; Zeevat, Klein & Calder, 1987) in that we assume a finite number of combinatory rules and study the behavior of parsers that apply these rewrite rules in roughly the phrase-structure parsing tradition.
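The spurious-ambiguity problem this abstract describes can be seen in a few lines. The sketch below is our own illustration, not Wittenburg's predictive normal form: it represents an atomic category as a string and a function X/Y as a pair (X, Y), then counts the derivation trees that forward application plus forward composition admit. For the sequence A/B B/C C, application alone yields one tree, but adding composition yields two trees with the same final category, exactly the semantically equivalent duplicates at issue.

```python
# Atomic category: a string like "A"; function category X/Y: a pair (X, Y).

def combine(x, y):
    """All results of combining adjacent categories x and y."""
    out = []
    if isinstance(x, tuple) and x[1] == y:           # forward application X/Y Y => X
        out.append(x[0])
    if isinstance(x, tuple) and isinstance(y, tuple) and x[1] == y[0]:
        out.append((x[0], y[1]))                     # forward composition X/Y Y/Z => X/Z
    return out

def count_derivations(seq):
    """Number of derivation trees reducing the sequence to a single category."""
    if len(seq) == 1:
        return 1
    total = 0
    for i in range(len(seq) - 1):
        for r in combine(seq[i], seq[i + 1]):
            total += count_derivations(seq[:i] + (r,) + seq[i + 2:])
    return total

# A/B  B/C  C: compose-first and apply-first both reduce to "A",
# giving two distinct trees for one meaning.
print(count_derivations((("A", "B"), ("B", "C"), "C")))  # 2
```

On longer sequences the count grows rapidly, which is why a parser that enumerates all derivations does redundant work unless the grammar is compiled into a normal form or the algorithm filters equivalent analyses.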

  • Book Chapter
  • Cited by 128
  • 10.1007/978-94-015-6878-4_7
Type Raising, Functional Composition, and Non-Constituent Conjunction
  • Jan 1, 1988
  • David Dowty

A very striking feature of the system of categorial grammar in Ades and Steedman (1982) and Steedman (1985), which differentiates it from most other current work in categorial grammar as well as from related theories like Generalized Phrase Structure Grammar (Gazdar et al., 1985), is its appeal to the operation of functional composition as a highly general rule of natural language grammars: Steedman and Ades include in their grammars not only the familiar functional application rule (1) but also the functional composition rule (2) (for which their term is partial combination):
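For reference, the two rule schemata mentioned can be stated as follows (we give the forward variants; Steedman and Ades's term for (2) is partial combination):

```latex
\begin{align*}
\text{(1) Functional application:} \quad & X/Y \quad Y \;\Rightarrow\; X\\
\text{(2) Functional composition:} \quad & X/Y \quad Y/Z \;\Rightarrow\; X/Z
\end{align*}
```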

  • Research Article
  • Cited by 37
  • 10.1017/s0022226700014134
An HPSG approach to Welsh
  • Sep 1, 1989
  • Journal of Linguistics
  • Robert D Borsley

Welsh differs from English in a number of ways. The most obvious point is that it is a VSO language, but it also has distinctive agreement phenomena and clitics. For this reason, it is natural to ask of any theory of syntax that has been developed primarily on the basis of English: how can it handle Welsh? Welsh has had fairly extensive attention within the Government-Binding theory (see, for example, Harlow, 1981; Sproat, 1985; Sadler, 1988, and Hendrick, 1988). It has also had some attention within Generalized Phrase Structure Grammar (GPSG) (see Harlow, 1983; Borsley, 1983; 1988a). In this paper, I will consider how some of the central features of Welsh can be accommodated within Head-driven Phrase Structure Grammar (HPSG). This is a framework developed over the last few years by Carl Pollard, Ivan Sag and others, which seeks to combine the insights of GPSG, categorial grammar and certain other theories (see Pollard, 1985, 1988; Sag & Pollard, 1987, and Pollard & Sag, 1988). In fact, I will be mainly concerned with the version of HPSG developed in Borsley (1986, 1987, 1988b), but I will also have something to say about standard HPSG.

  • Research Article
  • 10.4314/jcsia.v27i1.2
A Comparative Study of Deep and Shallow Parsing Approaches to Automated Grammaticality Evaluation
  • Aug 7, 2020
  • Journal of Computer Science and Its Application
  • Mk Aregbesola + 3 more

The concept of automated grammar evaluation of natural language texts is one that has attracted significant interest in the natural language processing community. It is the examination of natural language text for grammatical accuracy using computer software. The current work is a comparative study of different deep and shallow parsing techniques that have been applied to lexical analysis and grammaticality evaluation of natural language texts. The comparative analysis was based on data gathered from numerous related works. Shallow parsing using induced grammars was first examined along with its two main sub-categories, the probabilistic statistical parsers and the connectionist approach using neural networks. Deep parsing using handcrafted grammar was subsequently examined along with several of its subcategories, including Transformational Grammars, Feature Based Grammars, Lexical Functional Grammar (LFG), Definite Clause Grammar (DCG), Property Grammar (PG), Categorial Grammar (CG), Generalized Phrase Structure Grammar (GPSG), and Head-driven Phrase Structure Grammar (HPSG). Based on facts gathered from the literature on these formalisms, a comparative analysis of the deep and shallow parsing techniques was performed. It showed, among other things, that while the shallow parsing approach was usually domain dependent, influenced by sentence length and lexical frequency, and employed machine learning to induce grammar rules, the deep parsing approach was not domain dependent, was influenced by neither sentence length nor lexical frequency, and made use of a well-spelt-out set of precise linguistic rules. The deep parsing techniques proved to be a more labour-intensive approach, while the induced grammar rules were usually faster to obtain, with reliability increasing with the size, accuracy and coverage of the training data.
The shallow parsing approach has gained immense popularity owing to availability of large corpora for different languages, and has therefore become the most accepted and adopted approach in recent times.
 Keywords: Grammaticality, Natural language processing, Deep parsing, Shallow parsing, Handcrafted grammar, Precision grammar, Induced grammar, Automated scoring, Computational linguistics, Comparative study.

  • Research Article
  • Cited by 5
  • 10.1016/j.tcs.2007.07.027
S4 enriched multimodal categorial grammars are context-free
  • Jul 28, 2007
  • Theoretical Computer Science
  • Andrew R Plummer

  • Research Article
  • Cited by 77
  • 10.1017/s0022226700011816
Coordination and grammatical relations
  • Sep 1, 1988
  • Journal of Linguistics
  • Richard Hudson

The most serious recent work on the theory of coordination has probably been done in terms of three theories of grammatical structure: Generalized Phrase Structure Grammar (GPSG–see especially Gazdar, 1981; Gazdar et al., 1982, 1985; Sag et al., 1985; Schachter & Mordechay, 1983), Categorial Grammar (CG–see especially Steedman, 1985; Dowty, 1985) and Transformational Grammar (TG–notably Williams, 1978, 1981; Neijt, 1979; van Oirsouw, 1985, 1987). Each of these approaches is different in important respects: for instance, according to whether or not they allow deletion rules, and according to the kinds of information which they allow to be encoded in syntactic features. However, behind these differences lies an important similarity: in each case the theory concerned makes two assumptions about grammatical structure in general (i.e. about all structures, including coordinate ones): I. The basic syntagmatic relations in sentence-structure are part-whole relations (constituent structure) and temporal order; note that this is true whether or not syntactic structure is seen as a ‘projection’ of lexical properties, since these lexical properties are themselves defined in terms of constituent structure and temporal order.

  • Book Chapter
  • Cited by 5
  • 10.1016/b978-0-12-037101-3.50006-5
JPSG—A Phrase Structure Grammar for Japanese
  • Jan 1, 1990
  • Advances in Software Science and Technology
  • Yasunari Harada + 4 more

  • Research Article
  • Cited by 11
  • 10.1016/s0304-3975(97)00266-1
Algebraic structures in categorial grammar
  • Jun 1, 1998
  • Theoretical Computer Science
  • Wojciech Buszkowski

  • Conference Article
  • Cited by 15
  • 10.3115/981131.981137
Computational complexity of current GPSG theory
  • Jan 1, 1986
  • Eric Sven Ristad

An important goal of computational linguistics has been to use linguistic theory to guide the construction of computationally practical, real-world natural language processing systems. At first glance, generalized phrase structure grammar (GPSG) appears to be a blessing on two counts. First, the precise formalisms of GPSG might be a direct and transparent guide for parser design and implementation. Second, since GPSG has weak context-free generative power and context-free languages can be parsed in O(n³) time by a wide range of algorithms, GPSG parsers would appear to run in polynomial time. This widely assumed GPSG efficient-parsability result is misleading: here we prove that the universal recognition problem for current GPSG theory is exponential-polynomial time hard, and assuredly intractable. The paper pinpoints sources of complexity (e.g. metarules and the theory of syntactic features) in current GPSG theory and concludes with some linguistically and computationally motivated restrictions on GPSG.

  • Research Article
  • Cited by 9
  • 10.1007/bf00627292
Computational structure of GPSG models
  • Oct 1, 1990
  • Linguistics and Philosophy
  • Eric Sven Ristad

The primary goal of this essay is to demonstrate how considerations from computational complexity theory can inform grammatical theorizing. To this end, generalized phrase structure grammar (GPSG) linguistic theory is revised so that its power more closely matches the limited ability of an ideal speaker-hearer: GPSG Recognition is EXP-POLY time hard, while Revised GPSG Recognition is NP-complete. A second goal is to provide a theoretical framework within which to better understand the wide range of existing GPSG models, embodied in formal definitions as well as in implemented computer programs.

  • Book Chapter
  • 10.1093/oso/9780198804239.003.0004
Syntax calls the shots
  • Jun 9, 2022
  • Daniel Altshuler + 1 more

This chapter surveys syntactic analyses of extraction from coordinate structures. We begin by outlining some desiderata for a syntactic account, and describe the challenge posed by the poor fit between the Coordinate Structure Constraint (CSC) and other syntactic constraints on movement enumerated in recent work in the Chomskyan tradition. We distinguish three approaches to this challenge: some analyses (for example, those developed in Generalized Phrase Structure Grammar or Combinatory Categorial Grammar) reject central assumptions of Chomskyan locality theory; others (so-called ‘multiplanar’ analyses) derive the Coordinate Structure Constraint within the broad outline of Chomskyan locality theory by proposing a novel syntax of coordination which cannot be reduced to context-free phrase structure; and a final group of analyses (such as that of Munn 1993) reject the CSC as a syntactic constraint. The first two types of analyses are liable to undergenerate because they rule out asymmetric extraction patterns of the sort described in Chapter 3, while the final type of analysis is liable to overgenerate unless supplemented with nonsyntactic constraints of the sort to be discussed in Chapters 5 and 6.

  • Book Chapter
  • Cited by 32
  • 10.1007/3-540-54594-8_63
Verb order and head movement
  • Jan 1, 1991
  • Tibor Kiss + 1 more

This paper will focus on the treatment of so-called verb movement phenomena within non-transformational approaches to grammar, namely Categorial Grammar (CG) and Head-Driven Phrase Structure Grammar (HPSG). We will discuss a number of recent proposals put forward by categorial grammarians, including one of the approaches taken within the LILOG project, which is a version of Categorial Unification Grammar (CUG). Subsequently, we will present a proposal for the treatment of finite verb positions in German within the framework of HPSG, which represents a further development of the original LILOG grammar and which can also be viewed as an extension to the theory of HPSG in general.
Keywords: Word Order; Subordinate Clause; Categorial Grammar; Rule Schema; Head Feature. (These keywords were added by machine and not by the authors; the process is experimental and the keywords may be updated as the learning algorithm improves.)

  • Book Chapter
  • Cited by 8
  • 10.1090/trans2/192/02
Lambek calculus and formal grammars
  • May 4, 1999
  • Mati Pentus

The question of the position of categorial grammars in the Chomsky hierarchy arose in the late 1950s and early 1960s. In 1960 Bar-Hillel, Gaifman, and Shamir [1] proved that a formal language can be generated by some basic categorial grammar if and only if the language is context-free. They conjectured (see also [7]) that the same holds for Lambek grammars, i.e., for categorial grammars based on a syntactic calculus introduced in 1958 by J. Lambek [10] (this calculus operates with three connectives: multiplication or concatenation of languages, left division, and right division). The proof of one half of this conjecture (namely, that every context-free language can be generated by some Lambek grammar) in fact coincides with the proof
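The three connectives mentioned have a standard interpretation over languages (sets of strings), which we record here for the reader; this gloss is the usual one, not specific to Pentus's proof:

```latex
\begin{align*}
A \cdot B &= \{\, xy \mid x \in A,\; y \in B \,\} && \text{(multiplication / concatenation)}\\
A \backslash C &= \{\, y \mid xy \in C \text{ for all } x \in A \,\} && \text{(left division)}\\
C / B &= \{\, x \mid xy \in C \text{ for all } y \in B \,\} && \text{(right division)}
\end{align*}
```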
