Mind, Technology, and Society Talk Series

Paul Smolensky
Professor, Johns Hopkins University and Microsoft Research
Time/Date: 3-4:30 p.m. Monday, October 26, 2015

Chancellor’s Conference Room, KL 232

Gradient symbols in grammar


Gradient Symbolic Computation (GSC) operates over representations that are discrete structures built of blends of symbols with gradient activity levels, or equivalently, discrete symbols occupying gradient blends of structural positions. Gradient variation in phonetic and semantic representations is in part the result of interpretation by the phonetic and semantic systems of gradience internal to predominantly-discrete phonological and syntactic representations. The phonological and syntactic systems construct gradient representations that are optimal with respect to grammars deploying numerically-weighted well-formedness constraints evaluating numerically-weighted symbolic constituents. These optimal representations result from continuous, stochastic activation spreading in a new type of neural network in which symbols and structural positions are encoded in distributed activity patterns. These patterns define continuously-varying degrees of similarity among different symbols and different structural positions. Gradient Symbolic Computation enables a deep unification of linguistic theories of grammatical competence with psycholinguistic theories of both discrete and continuous aspects of grammatical performance.
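The "blends of symbols with gradient activity levels" in the abstract can be sketched with tensor product representations (Smolensky 1990), where a structure is the sum of filler-role outer products. The following is a minimal illustrative sketch, not the speaker's implementation: the symbol vectors are arbitrary random encodings, and the roles are chosen orthonormal so that unbinding is exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distributed symbol encodings (random vectors, illustration only).
A = rng.standard_normal(8)
B = rng.standard_normal(8)

# Orthonormal role vectors, so unbinding a role is exact.
r1, r2 = np.eye(2)

def bind(filler, role):
    """Tensor-product binding: outer product of a filler vector and a role vector."""
    return np.outer(filler, role)

# A gradient symbol: role r1 holds a blend of A (activity 0.7) and B (activity 0.3);
# role r2 holds a fully discrete B.
structure = bind(0.7 * A + 0.3 * B, r1) + bind(B, r2)

# Unbinding role r1 recovers the gradient blend itself.
recovered = structure @ r1

# Expressing the recovered vector in the symbol basis exposes the activity levels.
coeffs, *_ = np.linalg.lstsq(np.stack([A, B], axis=1), recovered, rcond=None)
print(np.round(coeffs, 3))  # → [0.7 0.3]
```

A fully discrete representation is the special case in which every activity coefficient is 0 or 1; GSC's continuous dynamics operate over the intermediate blends.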

(with Matt Goldrick, Don Mathis, & the GSC Research Group)

Suggested reading:

Smolensky, Paul, Goldrick, Matthew & Mathis, Donald. 2014. Optimization and quantization in gradient symbol systems: A framework for integrating the continuous and the discrete in cognition. Cognitive Science, 38, 1102−1138. DOI: 10.1111/cogs.12047.


Paul Smolensky is Krieger-Eisenhower Professor of Cognitive Science at Johns Hopkins University, where he directed two NSF Integrated Graduate Education and Research Training Programs on the cognitive science of language (1999−2015). His research addresses mathematical unification of the continuous and the discrete facets of cognition: principally, the development of grammar formalisms that are grounded in cognitive and neural computation.

A member of the Parallel Distributed Processing (PDP) Research Group at UCSD (1986), he developed Harmony Theory, proposing what is now known as the ‘Restricted Boltzmann Machine’ architecture. He then developed Tensor Product Representations (1990), a compositional, recursive technique for encoding symbol structures as real-valued activation vectors. Combining these two theories, he co-developed Harmonic Grammar (1990) and Optimality Theory (1993), general grammatical formalisms now widely used in phonological theory.
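In Harmonic Grammar, the "numerically-weighted wellformedness constraints" mentioned above score each candidate form by a weighted sum of its constraint violations; the grammar selects the candidate with maximal harmony. A toy sketch, with made-up constraint names, weights, and violation counts (not drawn from the speaker's work):

```python
# Hypothetical constraint weights (illustration only).
weights = {"Onset": 2.0, "NoCoda": 1.0, "Faith": 3.0}

# Violation counts for each candidate output form (all values invented).
candidates = {
    "pat": {"Onset": 0, "NoCoda": 1, "Faith": 0},
    "pa":  {"Onset": 0, "NoCoda": 0, "Faith": 1},
    "at":  {"Onset": 1, "NoCoda": 1, "Faith": 1},
}

def harmony(violations):
    """Harmony = negative weighted sum of constraint violations (higher is better)."""
    return -sum(weights[c] * n for c, n in violations.items())

# The grammar's output is the maximum-harmony candidate.
best = max(candidates, key=lambda cand: harmony(candidates[cand]))
print(best)  # → pat
```

Optimality Theory replaces the numeric weights with a strict ranking of the constraints, so that no number of violations of a lower-ranked constraint can outweigh one violation of a higher-ranked one.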

His publications include the books Mathematical perspectives on neural networks (1996, with M. Mozer, D. Rumelhart), Optimality Theory: Constraint interaction in generative grammar (1993/2004, with A. Prince), Learnability in Optimality Theory (2000, with B. Tesar), and The harmonic mind: From neural computation to optimality-theoretic grammar (2006, with G. Legendre). He was awarded the 2005 David E. Rumelhart Prize for Outstanding Contributions to the Formal Analysis of Human Cognition, a Blaise Pascal Chair in Paris (2008−9), and the 2015 Sapir Professorship of the Linguistic Society of America.

His website is http://cogsci.jhu.edu/people/smolensky.html.