Academic Biography

1963 – 1975 Generative Semantics

George’s last year at MIT was the first year of MIT’s Linguistics program. He studied that year with Roman Jakobson, Morris Halle, and Noam Chomsky, and became part of Chomsky’s first student cohort. He then went to Indiana to study English literature, but soon switched to linguistics, working out early theoretical details of Chomsky’s transformational grammar.

Chomsky’s theory assumed that the structure of language is “autonomous” — independent of meaning and communication. In 1963, George stumbled upon the first of hundreds of counterexamples. Working on the grammar of baseball, he looked at the sentence “Yastrzemski doubled to left,” which contains the directional adverb “to left.” In autonomous syntax, directional adverbs should modify verbs of motion, and the grammar of the sentence should include the moving entity. But “double” is not a verb of motion, and the ball, whose motion “to left” describes, is not in the sentence at all. Instead, the meaning of the verb “double” includes a ball moving “to left.” It is the meaning, not the grammar, that determines the occurrence of “to left.”

It soon became clear that meaning and communication play a major role in grammar, and in the ensuing years hundreds of such cases were discovered. To deal with them, George brought formal logic (in 1963) and model theory (in 1968) into linguistics, creating the theory of generative semantics, in which he was joined at first by Haj Ross of MIT and James D. McCawley of the University of Chicago, and later by researchers throughout the world.

By 1968, generative semantics had developed technically into a constraint satisfaction system, keeping as much as possible of generative linguistics but linking grammatical and lexical form to meaning and context, including principles of communication (e.g., speech acts and implicatures).

1975 – Present: Cognitive Linguistics

By 1975, George was part of the nascent cognitive science community at Berkeley, and became convinced that linguistics had to be based on the nature of cognition. Counterexamples were discovered both to transformational linguistics in general and to formal logic as the right way to do semantics.

The Major Idea:

Language is just one aspect of human cognition, making use of general cognitive mechanisms. Linguistics is therefore constrained by discoveries in other branches of the cognitive sciences, and can contribute to them in turn.

In generative linguistics and in formal logic, language and logic are seen as autonomous — independent of perception, action, and social interaction. In cognitive linguistics, such autonomy is ruled out. Philosophy is also seen as constrained by discoveries in the cognitive sciences.

Cognitive Linguistics

George started integrating research by Charles Fillmore (on frame semantics), Gilles Fauconnier (on mental spaces), Len Talmy and Ron Langacker (on image schemas), Eleanor Rosch (on prototype theory and basic-level categories), and Paul Kay and Chad McDaniel (on the relation between color categories and the neurophysiology of color vision). In 1978, George (and Michael Reddy, independently) discovered the basic mechanism of metaphorical thought — frame-to-frame mappings across conceptual domains. In 1979, George was joined in metaphor research by philosopher Mark Johnson. Together they worked out the early details of the cognitive theory of metaphor, writing Metaphors We Live By.

What brought all these discoveries together was the realization that the semantics of language is embodied — that the body contributes, and places constraints on, the human conceptual system.

In 1975, George, together with his student Henry Thompson, adapted Bill Woods’ augmented transition network grammars to show that a grammar could serve both as a theoretical grammar and as a mechanism for real-time processing. They called this idea “cognitive grammar.”
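
To convey the flavor of the idea, here is a minimal sketch in Python (hypothetical, not the Lakoff-Thompson system itself; the lexicon and networks are invented for the example, and the registers that make such networks “augmented” are omitted). The arcs of a transition network test word categories or call subnetworks, so the very networks that define grammaticality can be run left to right as a parser.

```python
# A toy transition network parser -- a hypothetical sketch of the idea
# behind ATN grammars, with the "augmented" parts (registers and
# structure-building actions) left out for brevity.

LEXICON = {
    "the": "Det", "a": "Det",
    "fielder": "N", "ball": "N",
    "caught": "V", "doubled": "V",
}

# Each network maps a state to its outgoing arcs:
#   ("cat", C, next)    -- consume one word of category C
#   ("push", NET, next) -- recursively traverse subnetwork NET
#   ("pop", None, None) -- accept and return to the caller
NETWORKS = {
    "S":  {"s0": [("push", "NP", "s1")],
           "s1": [("push", "VP", "s2")],
           "s2": [("pop", None, None)]},
    "NP": {"np0": [("cat", "Det", "np1"), ("cat", "N", "np2")],
           "np1": [("cat", "N", "np2")],
           "np2": [("pop", None, None)]},
    "VP": {"vp0": [("cat", "V", "vp1")],
           "vp1": [("push", "NP", "vp2"), ("pop", None, None)],
           "vp2": [("pop", None, None)]},
}

def traverse(net, state, words, i):
    """Return the input position reached when `net` accepts, else None."""
    for arc, label, nxt in NETWORKS[net][state]:
        if arc == "pop":
            return i
        if arc == "cat" and i < len(words) and LEXICON.get(words[i]) == label:
            j = traverse(net, nxt, words, i + 1)
            if j is not None:
                return j
        if arc == "push":
            j = traverse(label, label.lower() + "0", words, i)
            if j is not None:
                k = traverse(net, nxt, words, j)
                if k is not None:
                    return k
    return None

words = "the fielder caught the ball".split()
print(traverse("S", "s0", words, 0) == len(words))  # True: sentence parses
```

Because the grammar and the parser are the same object, parsing is just traversing the networks against the input, which is what made real-time processing by a theoretical grammar plausible.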

In the late 1970’s and early 1980’s, George and Charles Fillmore began developing a theory of construction grammar, bringing together Fillmore’s case grammar and frame semantics with George’s generative semantics, metaphor theory, and cognitive grammar.

The first detailed study of construction grammar was George’s lengthy 1983 paper on There-constructions, published in 1987 as case study 3 of Women, Fire, and Dangerous Things, which was a book on how cognitive science and cognitive linguistics had created a new theory of human categorization.

1988 – Present: The Neural Theory of Language

In the early 1980’s, George became convinced through his research with Mark Johnson on embodiment that it was vital to understand how the brain could yield embodied conceptual systems and complex linguistic systems. At the time, his friend and colleague David Rumelhart was developing parallel distributed processing (PDP) at UC San Diego. George regularly visited Rumelhart to learn the basics and keep up with developments. In the summer of 1988, he taught at the Connectionist Summer School at Carnegie Mellon University.

In the fall of 1988, Jerry Feldman came to Berkeley as Director of the International Computer Science Institute. He and George formed the Neural Theory of Language (NTL) Group at the Institute. Jerry had developed an alternative account of neural computation that fit actual brain circuitry far better than PDP models did. In addition, George had come to the conclusion that PDP models were not adequate to characterize the complexities in conceptual systems and language uncovered by cognitive linguists. George and Jerry set to work integrating ongoing cognitive linguistics research with Feldman’s neural computational models.

The Major Idea:

The revolutionary idea behind NTL research is basic to neuroscience: Thought and language are physical, structured and given meaning by embodied experience, and carried out by brain circuitry at the level of neuronal groups, or “nodes.” Neural computation models what the circuitry does. The notation for cognitive linguistics characterizes the conceptual and linguistic functions carried out by the neural circuitry.

This idea replaces the old cognitive science idea from the 1970’s that the mind is software running on the brain as hardware. Software uses algorithmic symbol manipulation, changing strings of symbols into other strings of symbols. In the brain, there are no symbols and no symbol manipulation. There are just neural structures, functioning in a highly structured nervous system that extends throughout the body and coordinates the body’s activities.

NTL

Meanwhile, George continued his research on metaphorical thought and construction grammar. In 1989, he and his former student Mark Turner published More Than Cool Reason, an application of cognitive metaphor theory to poetic metaphor that later became the basis for research in cognitive literary theory. He also began work with Mark Johnson on what became Philosophy in the Flesh (1999), perhaps the most detailed study of conceptual metaphor in existence. It showed that the most basic concepts studied by philosophy are deeply metaphorical in nature, and that philosophers take a small number of those metaphors as literal and then work out the consequences in meticulous detail.

The NTL group blossomed in the 1990’s and 2000’s. Feldman’s outstanding book From Molecule to Metaphor provides an overview and recounts the major developments. Lokendra Shastri worked out computational models of neural binding. Terry Regier constructed neural computational models of image schemas and the language expressing them, along with a theory of learning without negative examples. David Bailey constructed a computational model of the learning of verbs of hand motion in various languages.

Perhaps the most ambitious work was done by Srini Narayanan. Narayanan began by working out a theory of neural computation that could control sequential actions in such a way as to provide a computational neural model of bodily movements and actions, and of the perception of bodily actions and events. His neural computational theory also characterized the structure and logic of aspect — the structure of actions and events built into every language in the world. To study the structure and logic of “abstract” concepts, Narayanan in his dissertation constructed a neural theory of conceptual metaphor, and showed that it characterized “abstract” reasoning via metaphorical mappings from embodied concepts.
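
A rough sketch of the control structure involved (a hypothetical simplification; Narayanan’s actual models were Petri-net-based executing schemas, or “x-schemas”): an action passes through phases such as ready, ongoing, and done, and aspectual distinctions in language profile those phases.

```python
# A toy executing schema ("x-schema") for a motor action, written as a
# simple phase machine. Hypothetical simplification: Narayanan's models
# were Petri nets, but the phase structure is the point here.

from enum import Enum, auto

class Phase(Enum):
    READY = auto()
    ONGOING = auto()
    DONE = auto()

class XSchema:
    def __init__(self, name, steps):
        self.name = name
        self.steps = list(steps)   # the motor routine, step by step
        self.phase = Phase.READY

    def start(self):
        """Inceptive aspect: 'began to walk' profiles this transition."""
        self.phase = Phase.ONGOING

    def tick(self):
        """Progressive aspect: 'is walking' profiles this phase."""
        if self.phase is Phase.ONGOING and self.steps:
            print(f"{self.name}: {self.steps.pop(0)}")
        if not self.steps:
            self.phase = Phase.DONE   # perfect aspect: 'has walked'

walk = XSchema("walk", ["lift foot", "swing leg", "plant foot"])
walk.start()
while walk.phase is Phase.ONGOING:
    walk.tick()
print(walk.phase)   # Phase.DONE
```

On this view, the progressive (“is walking”) profiles the ongoing phase, the inceptive (“began to walk”) profiles the transition into it, and the perfect (“has walked”) profiles the done state.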

Narayanan’s dissertation, together with dissertations by Christopher Johnson and Joe Grady, led to a neural theory of metaphor learning for “primary metaphors,” which are learned mostly prior to language just by living in the world and having bodily experiences. These metaphors have provided the basis for the contemporary experimental study of “embodied cognition,” in which subjects unconsciously act out conceptual metaphors behaviorally.

During the late 1990’s and early 2000’s, Feldman originated the Embodied Construction Grammar (ECG) project, an attempt to provide a natural language parser and semantic analyzer using cognitive semantics and construction grammar. ECG research has been done by Nancy Chang (on grammar learning), John Bryant (on best fit in grammar and semantics), Eva Mok (on language learning in Chinese), and Ellen Dodge (on the semantics and grammar of a crucial area of argument structure constructions). George worked with the group on the cognitive linguistics notation used in ECG and on working out details.

During the late 1990’s, George and Rafael Núñez began research on the neural cognitive foundations of mathematics. They discovered, following the work of Stanislas Dehaene, that mathematics is embodied in the neural system, and that so-called “abstract” mathematics is actually metaphorical. In 2000 they published a 600-page book, Where Mathematics Comes From, working out the details for many branches of mathematics. The most interesting result is The Basic Metaphor of Infinity, a single general conceptual metaphor that accounts for the conceptualization of infinity as a thing throughout many branches of mathematics — infinite sets, infinite decimals, infinite intersections, limits, infinite numbers, infinitesimal numbers, and so on.
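
A worked instance of how the metaphor operates, using the standard limit example (a minimal sketch of the kind of analysis given in the book): an unending iterative process, which literally has only intermediate results, is conceptualized as a process with a unique final resultant state.

```latex
% The unending process of taking partial sums yields only
% intermediate results:
s_n = \sum_{k=1}^{n} \frac{1}{2^k}
    = \tfrac{1}{2},\ \tfrac{3}{4},\ \tfrac{7}{8},\ \dots
% Under the Basic Metaphor of Infinity, the process is conceptualized
% as having a unique final resultant state -- the limit:
\lim_{n \to \infty} s_n = 1
```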