Calibrating Generative Models: The Probabilistic Chomsky–Schützenberger Hierarchy

Journal of Mathematical Psychology 95 (2020)

Abstract

A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case where the "semi-linear" languages all collapse into the regular languages, using analytic tools adapted from the classical setting we show there is no collapse in the probabilistic hierarchy: more distributions become definable at each level. We also address related issues such as closure under probabilistic conditioning.
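To make concrete what "distributions on (unary notations for) positive integers" means, and why the probabilistic hierarchy does not collapse, here is a minimal sampling sketch. The two toy grammars, their probabilities, and the function names are illustrative assumptions and are not drawn from the paper itself: a probabilistic regular grammar over a unary alphabet yields a geometric length distribution, while a simple probabilistic context-free grammar yields a Catalan-weighted length distribution whose generating function is algebraic but not rational, and hence lies outside the regular level.

```python
# Illustrative sketch (not from the paper): sampling unary strings, i.e.
# positive integers encoded as lengths, from two hypothetical probabilistic grammars.
import random
from collections import Counter

def sample_regular(p=0.5):
    """Probabilistic regular grammar  S -> a S (prob 1-p) | a (prob p).
    The length distribution is geometric on {1, 2, ...}."""
    n = 1
    while random.random() > p:
        n += 1
    return n

def sample_cfg(q=0.49, max_steps=10_000):
    """Probabilistic context-free grammar  S -> a S S (prob q) | a (prob 1-q).
    For q < 1/2 the derivation terminates almost surely; the resulting
    length distribution (odd lengths, weighted by Catalan numbers) has an
    algebraic, non-rational generating function."""
    pending, length = 1, 0          # number of S symbols still to rewrite
    for _ in range(max_steps):
        if pending == 0:
            return length
        pending -= 1                # rewrite one S ...
        length += 1                 # ... emitting one terminal 'a'
        if random.random() < q:
            pending += 2            # rule S -> a S S adds two fresh S symbols
    return None                     # derivation not finished (rare when q < 1/2)

if __name__ == "__main__":
    reg = Counter(sample_regular() for _ in range(10_000))
    cfg = Counter(s for s in (sample_cfg() for _ in range(10_000)) if s is not None)
    print("regular grammar lengths:     ", sorted(reg.items())[:5])
    print("context-free grammar lengths:", sorted(cfg.items())[:5])
```

The contrast in the sampled length distributions is one hedged way to picture the paper's point that strictly more distributions become definable as one moves up the hierarchy.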

Author's Profile

Thomas Icard
Stanford University
