Information-based syntax and semantics. Volume I: Fundamentals


While on the subject of editorial shortcomings, let me also mention the following categories: manifold violations of English grammar and spelling, printing errors in the text and in running titles (recurring at the top of every second page of the papers by Shieber and by Block and Haugeneder), arrows missing in figures (pp. 275, 276, 284, 285), missing glosses in non-English examples (Netter's paper), a missing example grammar (p. 11), use of unexplained abbreviations (like MT, NLP and GKPS), references missing in bibliographies, and an inadequate subject index (which for instance contains an entry X-component with a pointer to a page of the introduction but without a pointer to the relevant section in Wehrli's paper concerning the X-bar component of his parser). I always wonder why people who are willing to have their names printed on the cover of a book do not care to invest the necessary amount of time to do a proper job. Let me conclude on a positive note: this book contains a number of valuable contributions that will be of interest to workers in both theoretical linguistics and natural language processing. We can at least thank the editors for their efforts in collecting these materials and getting them published within a reasonable time span.


Carl Pollard and Ivan A. Sag, Information-based Syntax and Semantics. Volume I: Fundamentals (CSLI Lecture Notes Series No. 13). Stanford, CA: Centre for the Study of Language and Information, 1987. x + 233 pp.

Reviewed by: G.C. Horrocks, St. John's College, Cambridge, UK.

In the Introduction to this book (p. 10), published in the Lecture Notes series distributed by the University of Chicago Press, the authors draw attention to what

they see as 'the rapid obsolescence of a certain authoritarianism in the sociology of the field, which has dictated that one's investigations are to be conducted in the 'right' framework and that one's fruitful collegial interactions are to be confined to devotees of the 'right' research tradition'. Desirable though this outcome might be, there is surely still some way to go before all those who think of themselves as 'generative grammarians' come to share an interest in 'composing, decomposing, comparing and recombining the full range of current theories' (p. 11) by way of the neutral lingua franca of unification-based formalisms. Nevertheless, any contribution to a broadening of perspectives is to be welcomed, and this book is certainly that. Though it is the first of a promised pair (the second volume is apparently not yet available), and it is therefore difficult at present to give a full and fair account of the authors' achievement, the first volume is a clearly-written introduction to the fundamental assumptions and some of the basic technical apparatus of Head-driven Phrase Structure Grammar (HPSG). This is an 'information-based' theory of syntax and semantics that has drawn upon several different branches of contemporary syntactic and semantic theory and synthesised these with contributions from computer science, in particular work on knowledge representation and unification-based formalisms. Those familiar with recent advances in categorial grammar (e.g. Ades and Steedman (1982), Steedman (1985)), lexical-functional grammar (LFG) (e.g. Bresnan (1982)) and generalised phrase structure grammar (GPSG) (e.g. Gazdar et al. (1985)) will certainly find interesting areas of overlap and contrast, borrowing and (often subtle) adaptation, in HPSG. 'Standard'-version government-binding theory (GB), as presented in e.g. Chomsky (1981), also makes a contribution on the technical side, though the general style and orientation of HPSG definitely place it in the former camp. Since, however, HPSG is not specifically a theory of syntax but rather 'is concerned with the interaction among all the forms of information that bear upon the linguistic meaning relation, including (inter alia) both the syntactic information borne by signs . . . as well as their semantic content' (p. 16), many of the central concepts in fact derive from work in semantics, in particular the situation semantics (though often in highly modified form) of Barwise and Perry (1983), and syntactic and semantic aspects of grammatical structure are built up from the start in an integrated way; no question here, then, of semantics being 'bolted on' as an afterthought, still less of syntax being dismissed, à la Montague (1974: 223), as a tedious, but unfortunately necessary, preliminary to semantics.

First, some general observations. I said above that the book is clearly-written, and so it is, but only, I suspect, for those who are already well-acquainted with recent work within the fields and frameworks mentioned in the first paragraph. There are many fascinating comparative observations throughout (sometimes in footnotes and bracketed asides), but, in the absence of systematic contextualisation, these depend almost entirely for their effect on adequate prior knowledge on the part of the reader. While it is always possible in principle to teach theories 'synchronically' and 'in isolation', it remains true that many aspects of the way a particular theory works, and


the reasons for its tackling some particular range of problems in a particular way, are very much a reflection of the evolution of that theory and of the historical influences that have come to bear upon it. Furthermore, a key prerequisite to a proper understanding of theories like HPSG, which emphasise the traditional objective of explicitness, is a thorough grasp of the formalism as well as of the issues which motivate the choice of apparatus in specific cases. This book, then, is clearly aimed at the advanced student, since it would be very difficult indeed, without extensive background knowledge, to appreciate the full significance of the argumentation and comparisons offered or to cope with the sometimes very complex and detailed notation employed in the illustrative examples. No book, of course, can do everything, but it is surely a pity that the relatively inaccessible contents of this one will guarantee a smaller readership than its importance deserves. The single biggest obstacle to the promotion of the study of 'generative grammar' in the more liberal framework that the authors advocate is the shortage of straightforward textbook treatments of theories other than GB (Sells (1985) and Horrocks (1987) being rare contributions to this end). Without the help that such works might offer, and without careful and committed teaching based upon them, it is hard to see how GPSG, LFG, HPSG, etc., are ever to make a fully effective contribution outside the computational linguistics fraternity.

What, then, of specifics? Volume 1 of Information-based Syntax and Semantics contains eight chapters. The Introduction deals with the nature of natural language objects (things in the mind or the real world?), explains the emphasis in HPSG on language as a means of making information available to speaker/hearers, and provides a (highly condensed and rather allusive) preview of the theory. Chapter 2 lays the necessary formal foundations by introducing feature structures as information-bearing objects (signs) that can represent grammatical and semantic objects of all kinds, as well as explaining the ways (including the central unification operation) in which partial information structures can be legitimately augmented. Chapter 3 focuses on syntactic features and how these are employed to define syntactic categories and to determine the syntactic structure of signs, while chapter 4 provides an outline of the semantic apparatus used to characterise the 'contents' of such objects. Chapter 5 deals with the interesting view of subcategorisation adopted in HPSG, which links standard category selection facts with case assignment, government, semantic role assignment and various agreement phenomena (the latter in fact left over to volume 2). Chapter 6 presents the HPSG notion of grammar rule as a device for characterising the structure of phrasal signs, and explains how much of the information contained in such signs can be predicted from the lexical heads of phrases by general principles, thereby reducing the inventory of rules to a small set of highly abstract generalised schemata. Chapter 7 explains the principles of constituent order, which, as in GPSG, are 'factored out' as general constraints on all phrasal signs. And finally, chapter 8 deals with the cross-cutting hierarchical organisation of lexical types assumed in HPSG as a basis for capturing generalisations about classes of words with common properties,

and then moves on to the way in which the type hierarchy can define the domain of the lexical rules employed to handle various word-formation processes (and the structural and semantic changes that these imply). The treatment of many interesting topics, such as binding/control and long-distance dependencies, is promised for Volume 2.

Historically, HPSG emerged as a response to work in GPSG. In the latter framework schematic syntactic rules and (a sometimes highly complex interaction of) general and language-specific principles of feature instantiation combine to define all the possible well-formed structures of a given language, and simultaneously provide the basis for defining the semantic translations of those structures into a Montague-style intensional logic. One difficulty which emerged in pursuing this approach was that the well-formedness of a given structure had to be calculated not merely on the basis of the information associated with its daughter constituents, but also required reference to the set of possible instantiations of each category in the structure in question. The advocates of HPSG argue that this complexity can be drastically reduced if such manipulations of instantiated structures are eliminated by presenting all grammatical and semantic information (lexical entries, grammatical functions, syntactic categories, phrase structure trees, semantic contents, even rules and principles of grammar) in the form of 'partial information structures' (for the mathematically minded, a kind of directed acyclic graph), which can be related by subsumption (relative degree of informativeness) and combined by unification (merger of compatible information). (Despite its name, then, HPSG does not employ anything like conventional phrase structure rules, and one looks in vain for conventional phrase-marker representations of syntactic structure!) Partial information structures are simply matrices of attributes with specified values (e.g. in the simplest case, structures such as [GENDER MASCULINE]), which acquire their capacity to bear complex information through the possibility of having information structures (recursively) embedded inside other information structures (as the values of particular attributes), and of having a single structure as the value of two or more distinct attributes (obviously useful in dealing, for example, with control phenomena). The determination of any given linguistic object involves the straightforward unification of the information derived from lexical entries (which include syntactic and semantic information in HPSG), grammar rules, and various universal and language-specific principles of well-formedness. This is in line with, though constitutes a radical extension of, work in LFG (cf. functional structures), and follows some recent versions of categorial grammar (e.g. Karttunen (1986)) and computational formalisms such as PATR-II (Shieber (1984)).
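It may help the reader with a computational bent to see how little machinery the two core notions require. The following toy sketch (mine, not the authors'; written in Python, and deliberately simplified in that it models partial information structures as nested dictionaries, ignoring the structure sharing that makes the real objects directed acyclic graphs) illustrates unification as merger of compatible information and subsumption as relative informativeness:

FAIL = object()  # sentinel marking unification failure

def unify(fs1, fs2):
    """Merge two partial information structures, failing on conflict."""
    if fs1 == fs2:
        return fs1
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for attr, val in fs2.items():
            if attr in result:
                merged = unify(result[attr], val)
                if merged is FAIL:
                    return FAIL      # incompatible values for this attribute
                result[attr] = merged
            else:
                result[attr] = val   # fs2 contributes information fs1 lacked
        return result
    return FAIL                      # atomic clash, e.g. MASCULINE vs. FEMININE

def subsumes(fs1, fs2):
    """True if fs1 is no more informative than fs2."""
    if fs1 == fs2:
        return True
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        return all(attr in fs2 and subsumes(val, fs2[attr])
                   for attr, val in fs1.items())
    return False

# [GENDER MASCULINE] and [NUMBER SINGULAR] unify to a structure bearing
# both pieces of information; each of the inputs subsumes the result.
a = {'GENDER': 'MASCULINE'}
b = {'NUMBER': 'SINGULAR'}
ab = unify(a, b)    # {'GENDER': 'MASCULINE', 'NUMBER': 'SINGULAR'}
assert subsumes(a, ab) and subsumes(b, ab)
assert unify(ab, {'GENDER': 'FEMININE'}) is FAIL

Nothing in this sketch captures the crucial possibility, noted above, of a single structure serving as the value of two or more distinct attributes; that is precisely what the graph-based formalism provides.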

Another significant modification of the GPSG approach is the rejection of the role of metarules. Once it became clear that metarules had to be constrained by something like Flickinger's (1983) Lexical Head Constraint (in some ways an analogue of the GB requirement of proper government for traces), it began to seem that grammar rules were in fact being used for manipulations that were essentially lexical in nature. In HPSG, therefore, lexical ID rules are replaced by more complex lexical structures, and metarules by lexical rules (an approach in many ways reminiscent of LFG). Nevertheless, many features of GPSG are retained, though often in a modified form designed to overcome limitations which emerged from further research. Thus the idea that certain kinds of information associated with lexical items are also systematically associated with the phrases they head is formalised as the Head Feature Principle (a revision of the Head Feature Convention). Similarly, the Binding Inheritance Principle (incorporating the essence of the Foot Feature Principle) constrains the propagation through linguistic structures of information about the presence of dependent items such as gaps and relative/interrogative pronouns and guarantees that these items are bound by an appropriate element.

Almost equally important in HPSG, however, is the contribution of categorial grammar. At the heart of HPSG (and hence the name of the theory!) is the idea that grammars can be simplified radically if heads incorporate information about the categories they combine with, including subjects, as values for the attribute SUBCAT. In other words, it is argued, as in categorial frameworks, that the combinatorial properties of words and phrases are inherent in the words and phrases themselves, and are not determined by grammar rules that apply to them. This entails the abandonment of sets of highly specific rules (which merely duplicate lexical information) and effectively reduces the rule component to a small set of extremely general schemata. Apart from the fact that not only lexical items but also phrases have a 'subcategorisation' in this approach, this is, of course, reminiscent of the Projection Principle of GB theory. Another important contribution of categorial grammar is the adoption of Dowty's (1982) approach to grammatical functions/relations, whereby the position of a category in the SUBCAT 'frame' of a lexical item determines its grammatical function. This assumption of a universal underlying ordering of functions such as subject, object, second object, allows a systematic treatment of certain phenomena that have an apparently 'hierarchical' character (cf. for example Keenan and Comrie's (1977) cross-linguistic work on the hierarchy of grammatical functions with respect to accessibility to relativisation), but also allows for different surface orderings in different languages through a GPSG-style split between the treatment of dominance facts and that of linear precedence. The functional hierarchy is also exploited to deal with the data that fall under the headings of 'binding' and 'control' (as variously construed), though this is postponed to the second volume.

Since each lexical sign consists of phonological, syntactic and semantic information, something must be said here about the treatment of the semantic contents of partial information structures/signs (the book does not in fact have anything to say about phonology). The approach is founded on the situation semantics of Barwise and Perry (1983) but differs from it quite extensively in its technical details. The original version is a relational theory which holds that meaning arises from conventional constraints that hold between types of utterance situations and the types of things in the real world that utterances describe. In other words, the mind and mental representations are held not to be what semantics is about. This is obviously counter to current orthodoxy in much of the linguistics community and is also quite different from the associative bond between psychological objects (signifiant/phonological representation and signifié/concept) supposed in Saussure's sign theory. According to Barwise and Perry the world is made up of individuals, properties, relations and situations, where situations are parts of the world consisting of individuals having properties or being in relations. Conventional linguistic constraints between different kinds of 'real-world' situation are exploited for communicative purposes by people who are 'attuned' to them. But since meanings are accessible only to organisms that 'know' the constraints, a language can in fact be viewed either as 'the system of situation types that conform to the conventions, or as the system of shared knowledge by virtue of which the conventions can be used' (p. 5). Pollard and Sag therefore suggest that, if the relevant organisms and their minds are indeed constituents of linguistic-meaning situations, the difference between a 'conceptualist' and a 'realist' approach 'may well be far less significant than recent debates have suggested', and accordingly proceed to adopt an essentially agnostic position, considering natural languages in terms of the information they make available to the members of a given speech community.

As noted earlier, HPSG adopts an integrated approach to syntax and semantics. The semantic content of linguistic structures is determined by the syntactic and semantic information associated with their constituents in conjunction with universal principles and contextual factors. Crucially, the approach differs from Montague-type semantics in that the 'semantic content of a sentence is not determined by a syntax-directed process of model-theoretic interpretation; instead, it "falls out" from the semantic contents of its lexical constituents by virtue of general linguistic constraints which require that certain pieces of information associated with signs be unified with certain other pieces' (p. 18). As an example, the semantic content of the sentence Kim admires Sandy is given (p. 17) as

(1)  [RELATION   ADMIRE
      ADMIRER    KIM
      ADMIREE    SANDY]

The relation is ADMIRE and the individuals KIM and SANDY play the roles of ADMIRER and ADMIREE. KIM and SANDY are a part of the semantic contents of the subject and object NPs Kim and Sandy. The relation ADMIRE and the assignment of the relevant semantic roles to the subject and object come from the head verb, represented as a lexical sign (suppressing the phonology attribute)

(2)  (admires, V[SUBCAT(NPi, NPj)], [RELATION   ADMIRE
                                     ADMIRER    j
                                     ADMIREE    i])
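In terms of the toy sketch given earlier (and with the same caveats: the attribute and value names below are my own shorthand, and real parameters are not simply empty structures), the combination enforced by the principles described next can be pictured as the unification of the NPs' contents into the role slots of the verb's content:

# Uses unify() from the sketch above. A crude rendering of the content
# of the lexical sign (2): the role slots stand in for the parameters
# j and i, here as empty (wholly uninformative) structures.
admires_content = {'RELATION': 'ADMIRE',
                   'ADMIRER': {},    # j
                   'ADMIREE': {}}    # i

kim_content = {'PARAMETER': 'KIM'}      # contributed by the subject NP
sandy_content = {'PARAMETER': 'SANDY'}  # contributed by the object NP

# Unifying the subject's content into the ADMIRER slot and the object's
# into the ADMIREE slot yields the sentence content shown in (1):
sentence_content = unify(admires_content,
                         {'ADMIRER': kim_content,
                          'ADMIREE': sandy_content})
# => {'RELATION': 'ADMIRE',
#     'ADMIRER': {'PARAMETER': 'KIM'},
#     'ADMIREE': {'PARAMETER': 'SANDY'}}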


The variables i and j are 'parameters' which make up part of the semantic content of NPs (cf. reference markers in Discourse Representation Theory), and the specifications NPi and NPj demand the presence of noun phrases whose variables are to be unified with the fillers of the ADMIREE and ADMIRER roles. A proposed 'Subcategorisation Principle' (a kind of equivalent to the 'cancellation' of categories in categorial grammar) requires that the actual subject and object constituents of the sentence be unified with the subject and object specifications (identified by their order, subjects occurring rightmost) on the SUBCAT feature of the head verb. The variables j and i are therefore unified with KIM and SANDY respectively. A 'Semantics Principle' then enforces the requirement that the semantic contents of phrases be unified with those of their head daughters, and so guarantees that the semantic content of (2) is indeed that of the whole sentence (on the assumption that verbs head sentences).

There are obvious parallels here with θ-theory and the use of syntactic indices in GB theory, but a crucial difference between the NP variables and semantic roles of HPSG and the indices and θ-roles of GB theory is that the former are objects with a clear situation-theoretic interpretation (as parameters introduced by NP tokens and ways of participating in relations), while the latter are objects which, while playing a part in the determination of syntactic well-formedness, lack (in standard accounts) any comparable explication.

Part of the attraction of work in GPSG and LFG in the early 1980s was the fact that it initiated fruitful interactions between linguistics and other disciplines with an interest in language. It was also apparent that many of these 'alternative' approaches to generative grammar had much in common in terms of basic assumptions, goals and modus operandi. HPSG embodies this co-operative spirit, and has done much to combine the best of earlier theoretical and descriptive work and to pull together common threads. Information-based Syntax and Semantics constitutes an excellent introduction to what it is all about (I look forward to seeing Volume II), and sets high standards of argumentation and presentation. It is also refreshing to read a book written in a rhetorically unpretentious and unaggressive style, to see issues faced squarely (and failure admitted where appropriate), and to find discussion of a really wide range of phenomena (there is, for example, a whole section on the little-researched and poorly understood syntax of adjuncts). There are a few slips on the proof-reading side (errors of numbering, dittography, etc.) but none of these are likely to cause serious difficulty for the sort of readers the book is going to attract. I very much hope, however, that Volume II will provide an index to the whole work; an efficient means of following issues through and referring back to earlier expositions would be a great assistance to the learner.

References

Ades, A. and M. Steedman, 1982. On the order of words. Linguistics and Philosophy 4, 517-558.
Barwise, J. and J. Perry, 1983. Situations and attitudes. Cambridge, MA: MIT Press.

Bresnan, J. (ed.), 1982. The mental representation of grammatical relations. Cambridge, MA: MIT Press.
Chomsky, N., 1981. Lectures on government and binding. Dordrecht: Foris.
Dowty, D., 1982. Grammatical relations and Montague grammar. In: P. Jacobson and G.K. Pullum (eds.), The nature of syntactic representation. Dordrecht: Reidel.
Flickinger, D., 1983. Lexical heads and phrasal gaps. In: M. Barlow, D. Flickinger and M. Wescoat (eds.), Proceedings of the Second West Coast Conference on Formal Linguistics. Stanford, CA: Stanford University Linguistics Department.
Gazdar, G., E. Klein, G.K. Pullum and I.A. Sag, 1985. Generalized phrase structure grammar. Oxford: Blackwell; Cambridge, MA: Harvard University Press.
Horrocks, G., 1987. Generative grammar. London/New York: Longman.
Karttunen, L., 1986. Radical lexicalism. Report No. CSLI-86-68. Stanford, CA: CSLI.
Keenan, E.L. and B. Comrie, 1977. Noun phrase accessibility and universal grammar. Linguistic Inquiry 8, 63-99.
Montague, R., 1974. Formal philosophy. New Haven, CT: Yale University Press.
Sells, P., 1985. Lectures on contemporary syntactic theories. CSLI Lecture Notes Series No. 3. Stanford, CA: CSLI.
Shieber, S., 1984. The design of a computer language for linguistic information. In: S. Shieber, L. Karttunen and F. Pereira (eds.), Notes from the unification underground: A compilation of papers on unification-based grammar formalisms. SRI Technical Note 327. Menlo Park, CA: SRI International.
Steedman, M., 1985. Dependency and coordination in the grammar of Dutch and English. Language 61(3), 523-568.