Transformers in Uniform TC^0

20 September 2024
David Chiang
arXiv:2409.13629
Abstract

Previous work has shown that the languages recognized by average-hard attention transformers (AHATs) and softmax-attention transformers (SMATs) are within the circuit complexity class TC^0. However, these results assume limited-precision arithmetic: using floating-point numbers with O(log n) bits (where n is the length of the input string), Strobl showed that AHATs can be approximated in L-uniform TC^0, and Merrill and Sabharwal showed that SMATs can be approximated in DLOGTIME-uniform TC^0. Here, we improve these results, showing that AHATs with no approximation, SMATs with O(poly(n)) bits of floating-point precision, and SMATs with at most 2^{-O(poly(n))} absolute error are all in DLOGTIME-uniform TC^0.
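As a rough reminder of the distinction between the two attention variants (standard definitions, not necessarily the paper's exact formalization): given attention scores s_1, ..., s_n and values v_1, ..., v_n at a position, softmax attention computes

  \alpha_i = \frac{\exp(s_i)}{\sum_{j=1}^{n} \exp(s_j)}, \qquad \text{output} = \sum_{i=1}^{n} \alpha_i v_i,

whereas average-hard attention puts uniform weight on the score-maximizing positions M = \{\, i : s_i = \max_j s_j \,\}:

  \text{output} = \frac{1}{|M|} \sum_{i \in M} v_i.

The precision assumptions in the abstract concern how many bits are needed to represent quantities such as the \alpha_i and their weighted sums, either exactly or to within the stated absolute error.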
