This page is a sample of the focus pages under development. It is accompanied by our partial lexicon of key terms in the study of language. James Garson prepared it.

Grammatical Knowledge
One of Chomsky's most influential and controversial views is that all human languages have a structure in common called universal grammar. A number of different lines of evidence help support this view. All normal speakers of a language have extensive knowledge of its grammatical structure (or syntax).

For example, any speaker of English knows that 'He loves her' is grammatical, while 'Him loves she' is not. This knowledge of syntax can be quite subtle. The result of removing the word 'that' from the sentence 'the dog that I saw has rabies' is a new sentence that is still well formed. But 'that' cannot be removed from the sentence 'the dog that bit me has rabies'. Although most of us are unable to explain the rule that governs when 'that' may be safely deleted from a sentence, we are all able to recognize which sentences are correct and which are not. This ability to distinguish well formed from ill formed sentences is called our linguistic competence.

Competence - Performance Distinction
Linguistic competence is not always reflected in actual speech. Our linguistic performance is peppered with 'ums' and 'ahs', false starts, and sentence fragments. Nevertheless, when asked, we are still able to judge the difference between those utterances that live up to the rules of English and those that do not. Although we are probably not consciously aware of any of these rules, our unconscious mastery of them is revealed in our linguistic competence.

Productivity of Language
How is linguistic competence learned? One suggestion is that we simply learn a number of basic patterns that all English sentences exhibit. However, this idea won't work because language is productive (or creative). The productivity of language means that there is no limit to the complexity of structure that can be found in English sentences that we use and understand. We can add phrases for new animals to 'The old lady swallowed a cat that swallowed a mouse that swallowed a spider that swallowed a fly' without limit, each time creating a more complex structure. So there is no way to explain competence by listing a finite number of English sentence forms.

Recursive Rules
Instead we must use recursive rules. Recursive rules allow us to define structures to which the very same rule may be applied again. For example, we may say that a Noun Phrase may be constructed from another Noun Phrase ('a cat') followed by a Relative Clause ('that swallowed a mouse'). Since the result ('a cat that swallowed a mouse') is by definition a new Noun Phrase, the rule may be applied all over again (to form 'a cat that swallowed a mouse that swallowed a spider') and so on. One of Chomsky's major contributions to linguistics was to demonstrate the need for recursive rules in the theory of grammatical structure. His theory was based on phrase structure rules that could be applied recursively to create an unlimited collection of well formed sentences. He was inspired by the way wffs are recursively defined for formal languages like logic.
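The recursive rule NP -> NP + Relative Clause can be sketched in a few lines of code. The function below is a hypothetical illustration, not a piece of linguistic theory: it applies the same rule to its own output, which is exactly what makes an unbounded number of noun phrases possible.

```python
def noun_phrase(animals):
    """Apply the recursive rule NP -> Noun | Noun 'that swallowed' NP.
    Each recursive call produces a new, larger Noun Phrase from the
    previous one, so there is no limit to the depth of structure."""
    head, *rest = animals
    if not rest:
        return head  # base case: a simple Noun Phrase
    # Recursive case: the result is itself a Noun Phrase, so the
    # same rule can be applied to it again.
    return f"{head} that swallowed {noun_phrase(rest)}"

print(noun_phrase(["a cat", "a mouse", "a spider", "a fly"]))
# a cat that swallowed a mouse that swallowed a spider that swallowed a fly
```

Adding one more animal to the list applies the rule once more, mirroring the way the old-lady sentence can be extended without limit.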

Why Language Learning Is Hard: Poverty of Stimulus
If this is right, then acquiring linguistic competence means acquiring a collection of recursive grammatical rules. But how is that done? This question is particularly pressing because linguists have shown that the basic rules governing the syntax of human languages are very complex. We still have a long way to go in formulating a full-fledged theory of the grammatical structure of English. Despite the fact that decades of research have not uncovered a complete story, even little children who have learned English display excellent abilities at judging correct syntactic form. Language learning is an astonishing achievement. (A person with an average-sized vocabulary has learned 5-10 words a day during childhood, and must master hundreds of complicated rules, such as the rule for when 'that' is optional.) The accomplishment is all the more amazing when we examine the data the child has to go on. The child rarely gets crucial negative evidence. Poorly formed sentences that the child actually hears or uses are hardly ever corrected. How can the child learn to distinguish right from wrong? Chomsky argued that the data children actually have to go on is too poor to allow them to predict the grammar of the language they are trying to learn. (This is called the poverty of stimulus argument.)

Language Acquisition Device
Language learning must be guided by an innate mechanism called a language acquisition device (LAD). The LAD is part of our genetic endowment and explains why humans are so much better at language than other animals. The LAD assumes that the language to be learned has a basic form called universal grammar. While the languages humans speak vary in their syntax, there are basic principles common to them all that the LAD presumes are part of the language being learned.

Principles and Parameters Theory
What might some of these principles be? One problem is that languages vary quite widely along a number of different dimensions, for example, syllable structure, case assignment, and the placement of modifiers. To resolve this problem linguists have proposed a basic system of linguistic principles that still allow options (parameters) to be selected for each language. For example, modifiers can be placed before or after what they modify, but never (say) 5 words away. So a child learning a given language need only learn the parameter setting (before or after) for a kind of modifier. Similarly, there are phonological universals. Although languages vary as to whether consonants are allowed at the end of a syllable (English allows them, but Italian does not), there are no languages that require consonants to end every syllable. So the child needs only to learn whether the language allows final consonants or not. Similarly, theories that decide what a pronoun may refer to (binding theories) follow the same principle in all languages; it is just the nature of the grammatical chunk that varies (in English it is the clause, while in Icelandic it is the tensed clause). X-bar theory is an attempt to describe the principles and parameters for possible phrase structure grammars. One important idea is that phrases always contain a head (Noun Phrases contain a head Noun, Verb Phrases a head Verb, Prepositional Phrases a head Preposition, etc.). One interesting regularity is that in most languages the head of a phrase is either always on the right, or always on the left. For example, in Japanese the verb and the preposition both end their phrases, while in Hebrew they both lead them. (English is an exception, however.)
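The idea that a single parameter setting fixes word order across phrase types can be sketched as follows. The function and the way phrases are represented are hypothetical simplifications invented for this illustration; real X-bar structure is considerably richer.

```python
def build_phrase(head, complement, head_final):
    """A toy head-direction parameter: one boolean setting determines
    whether the head comes last (as in Japanese) or first (as in Hebrew)
    for every phrase type built with it."""
    return [complement, head] if head_final else [head, complement]

# Japanese-style setting (head-final): the verb ends the verb phrase.
print(build_phrase("eat", "sushi", head_final=True))    # ['sushi', 'eat']

# Hebrew-style setting (head-initial): the verb leads it.
print(build_phrase("eat", "sushi", head_final=False))   # ['eat', 'sushi']
```

The point of the sketch is that the child need not learn the order of every phrase type separately; on this picture, fixing one parameter settles them all at once.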

The Case for and Against Linguistic Universals
Not all cognitive scientists believe that language requires a special device such as the LAD. Many object that Chomsky's view is overly nativist. (A nativist explains abilities by postulating special purpose mechanisms which are the product of our genetic inheritance.) An alternative view would be that language learning is brought about by mechanisms that are useful more generally and are not specific to a specialized language learning module. Evidence for the alternative view includes the fact that there does not seem to be enough time for evolution to have coded linguistic information into our genes. Also, there is some evidence that abilities that serve language (such as categorical perception) also serve other abilities in non-humans. Despite these objections, Chomsky's view has been extremely influential. For example, it has inspired Fodor's widely discussed hypothesis that human thinking depends on an innate language of thought.

The Hierarchy of Grammars

Finite State Machines.
These are devices that merely move from one box or node to another, making a selection from each box and continuing along any arrow leading out of it. Chomsky showed that no finite state machine can generate all and only strings of the form aaaa...bbbb..., where the number of a's and b's in the string is the same. But language has structures with similar complexity, notably wherever there are rules of agreement (say) between noun phrase and verb phrase: 'John loves Mary', but 'Men love Mary'.
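The limitation can be made concrete with a recognizer for the language a^n b^n. The sketch below is an illustration, not from the text: it checks the two halves of the string against each other, which in effect requires counting without bound. A finite state machine has only a fixed set of states, so it cannot keep such a count for arbitrarily long strings.

```python
def is_anbn(s):
    """Return True exactly for strings of the form a^n b^n:
    some number of a's followed by the same number of b's.
    The comparison implicitly counts n, and n is unbounded;
    no fixed, finite set of states can track it."""
    n = len(s)
    if n % 2:
        return False  # odd length can never split into equal halves
    half = n // 2
    return s[:half] == "a" * half and s[half:] == "b" * half

print(is_anbn("aaabbb"))  # True
print(is_anbn("aabbb"))   # False (counts differ)
print(is_anbn("abab"))    # False (a's and b's interleaved)
```

Subject-verb agreement across intervening material poses an analogous bookkeeping problem, which is why grammars stronger than finite state machines are needed.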

Phrase Structure Grammars.
Phrase structure grammars allow the introduction of rewrite rules with variables referring to grammatical types. This vastly improves the power of the grammar to account for the structure of language. Rules of language are expressed not by transitions from one 'box' to another, but in terms of grammatical categories: NP, VP, Auxiliary Verb, and so on.
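The rewrite-rule idea can be sketched as a toy grammar. The rules and vocabulary below are invented for illustration, not drawn from any particular linguistic analysis; each rule rewrites a grammatical category (S, NP, VP, ...) as a sequence of categories or words.

```python
import random

# A toy phrase structure grammar: each category maps to a list of
# possible rewrites; anything not listed as a category is a word.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["saw"]],
}

def expand(symbol, rng):
    """Rewrite a category by choosing one of its rules and expanding
    each part in turn; terminal words are returned as-is. Because
    NP can appear inside VP, the rules apply recursively."""
    if symbol not in RULES:
        return [symbol]
    production = rng.choice(RULES[symbol])
    words = []
    for part in production:
        words.extend(expand(part, rng))
    return words

print(" ".join(expand("S", random.Random(0))))
```

Every sentence this grammar generates has the shape Det N V Det N ('the dog saw the cat', and so on); adding a recursive rule such as NP -> NP RelClause would make the set of sentences unlimited, as discussed above.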

Transformational Grammars.
These introduce yet another innovation: rules that transform phrase structures into alternative forms. Transformations provide especially economical explanations for the formation of questions and the passive voice, and also account for deletions ('John and Mary like Jill' instead of 'John likes Jill and Mary likes Jill') that we may use to support memory chunking, which helps overcome the 7-plus-or-minus-2 constraint on short-term memory.
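The deletion example can be mimicked with a toy transformation. The clause representation (subject, verb, object triples) and the crude verb-agreement rule are assumptions made for this sketch, not part of transformational theory itself; the point is only that one rule maps a redundant structure onto a more compact one.

```python
def conjunction_reduction(clause1, clause2):
    """Toy transformation: when two clauses share a verb and object,
    delete the repetition and conjoin the subjects.
    ('John likes Jill' + 'Mary likes Jill' -> 'John and Mary like Jill')"""
    subj1, verb1, obj1 = clause1
    subj2, verb2, obj2 = clause2
    if (verb1, obj1) != (verb2, obj2):
        return None  # the transformation only applies when they match
    # Crude stand-in for plural agreement: drop the 3rd-person 's'.
    plural_verb = verb1[:-1] if verb1.endswith("s") else verb1
    return f"{subj1} and {subj2} {plural_verb} {obj1}"

print(conjunction_reduction(("John", "likes", "Jill"),
                            ("Mary", "likes", "Jill")))
# John and Mary like Jill
```

The reduced sentence carries the same information in fewer chunks, which is the sense in which such deletions may ease the load on short-term memory.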