
Language Design and Renormalization

4 August 2017
Ángel J. Gallego, Roman Orus
Abstract

Here we consider some well-known facts in syntax from a physics perspective, which allows us to establish some remarkable equivalences. Specifically, we observe that the operation MERGE, put forward by N. Chomsky in 1995, can be interpreted as a physical information coarse-graining. Thus, MERGE in linguistics entails information renormalization in physics, according to different time scales. We make this point mathematically formal in terms of language models, i.e., probability distributions over word sequences, widely used in natural language processing as well as in other fields. In this setting, MERGE corresponds to a 3-index probability tensor implementing a coarse-graining, akin to a probabilistic context-free grammar. The probability vectors of meaningful sentences are naturally given by stochastic tensor networks (TN) that are mostly loop-free, such as Tree Tensor Networks and Matrix Product States. By construction, these structures have short-ranged correlations in the syntactic distance and, because of the peculiarities of human language, they are extremely efficient to manipulate computationally. We also propose how to obtain such language models from the probability distributions of certain TN quantum states, which we show can be efficiently prepared on a quantum computer. Moreover, with tools from entanglement theory, we use these quantum states to prove classical lower bounds on the perplexity of the probability distribution for a set of words in a sentence. Implications of these results are discussed for theoretical and computational linguistics, artificial intelligence, programming languages, RNA and protein sequencing, quantum many-body systems, and beyond. Our work shows how many of the key linguistic ideas from the last century, including developments in computational linguistics, fit perfectly with known physical concepts linked to renormalization.
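To make the abstract's central construction concrete, here is a minimal sketch, not the authors' code: it assumes a toy vocabulary and random parameters, and shows MERGE as a stochastic 3-index tensor P(c | a, b) (a probabilistic coarse-graining of two constituents into one, akin to a PCFG rule), a small Tree Tensor Network contraction that assigns probabilities to 4-word sentences, and the perplexity of the resulting distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
V = 5  # toy vocabulary / syntactic-category size (an assumption)

# MERGE as a stochastic 3-index tensor: M[c, a, b] = P(c | a, b),
# normalized over the coarse-grained output index c.
M = rng.random((V, V, V))
M /= M.sum(axis=0, keepdims=True)

# Distribution over the root category, closing the network at the top.
root = rng.random(V)
root /= root.sum()

def sentence_weight(words):
    """Weight of a 4-word sentence from a balanced binary Tree Tensor Network.

    Leaves are one-hot word vectors; each MERGE tensor coarse-grains a pair
    of constituents into one, and the root vector closes the contraction.
    """
    w1, w2, w3, w4 = (np.eye(V)[w] for w in words)
    left = np.einsum('cab,a,b->c', M, w1, w2)   # MERGE(w1, w2)
    right = np.einsum('cab,a,b->c', M, w3, w4)  # MERGE(w3, w4)
    top = np.einsum('cab,a,b->c', M, left, right)
    return float(root @ top)

# Normalize over all V**4 four-word sentences to obtain a proper
# probability distribution, then compute its perplexity 2**H,
# where H is the Shannon entropy in bits.
sentences = [(a, b, c, d) for a in range(V) for b in range(V)
             for c in range(V) for d in range(V)]
p = np.array([sentence_weight(s) for s in sentences])
p /= p.sum()
H = -(p * np.log2(p)).sum()
print(f"perplexity over 4-word sentences: {2**H:.2f}")
```

Because the tree is loop-free, the contraction cost grows only linearly with sentence length, which is the efficiency property the abstract attributes to mostly loop-free stochastic tensor networks; the quantum-state preparation and the entanglement-based perplexity bounds of the paper are not reproduced here.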
