Analog Foundation Models

14 May 2025
Julian Büchel
Iason Chalas
Giovanni Acampa
An Chen
Omobayode Fagbohungbe
Sidney Tsai
Kaoutar El Maghraoui
Manuel Le Gallo
Abbas Rahimi
Abu Sebastian
Abstract

Analog in-memory computing (AIMC) is a promising compute paradigm to improve speed and power efficiency of neural network inference beyond the limits of conventional von Neumann-based architectures. However, AIMC introduces fundamental challenges such as noisy computations and strict constraints on input and output quantization. Because of these constraints and imprecisions, off-the-shelf LLMs are not able to achieve 4-bit-level performance when deployed on AIMC-based hardware. While researchers previously investigated recovering this accuracy gap on small, mostly vision-based models, a generic method applicable to LLMs pre-trained on trillions of tokens does not yet exist. In this work, we introduce a general and scalable method to robustly adapt LLMs for execution on noisy, low-precision analog hardware. Our approach enables state-of-the-art models, including Phi-3-mini-4k-instruct and Llama-3.2-1B-Instruct, to retain performance comparable to 4-bit weight, 8-bit activation baselines, despite the presence of analog noise and quantization constraints. Additionally, we show that as a byproduct of our training methodology, analog foundation models can be quantized for inference on low-precision digital hardware. Finally, we show that our models also benefit from test-time compute scaling, exhibiting better scaling behavior than models trained with 4-bit weight and 8-bit static input quantization. Our work bridges the gap between high-capacity LLMs and efficient analog hardware, offering a path toward energy-efficient foundation models. Code is available at this https URL.
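To illustrate the kind of hardware-aware training the abstract alludes to, the PyTorch sketch below wraps a linear layer with 8-bit input/output fake quantization and weight-noise injection during training. The class name, noise scale, and bit widths are hypothetical placeholders chosen for the example, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyQuantLinear(nn.Module):
    # Illustrative layer emulating AIMC constraints: 8-bit input/output fake
    # quantization plus weight noise injected only during training.
    # Hyperparameters are placeholders, not values from the paper.
    def __init__(self, in_features, out_features, noise_std=0.02, in_bits=8, out_bits=8):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.noise_std = noise_std
        self.in_levels = 2 ** (in_bits - 1) - 1   # 127 for 8-bit symmetric
        self.out_levels = 2 ** (out_bits - 1) - 1

    @staticmethod
    def _fake_quant(x, levels):
        # Symmetric per-tensor fake quantization with a straight-through estimator.
        scale = x.detach().abs().max().clamp(min=1e-8) / levels
        q = torch.clamp(torch.round(x / scale), -levels, levels) * scale
        return x + (q - x).detach()

    def forward(self, x):
        x = self._fake_quant(x, self.in_levels)    # emulate the input DAC
        w = self.linear.weight
        if self.training:
            # Additive Gaussian noise scaled to the weight range, emulating
            # conductance variations of analog memory devices.
            w = w + torch.randn_like(w) * self.noise_std * w.detach().abs().max()
        y = F.linear(x, w, self.linear.bias)
        return self._fake_quant(y, self.out_levels)  # emulate the output ADC

# Example: layer = NoisyQuantLinear(4096, 4096); out = layer(torch.randn(2, 4096))

In a hardware-aware fine-tuning loop, layers like this would stand in for a model's dense projections so the weights learn to tolerate quantized activations and noisy matrix-vector products; the actual training recipe, noise model, and data pipeline are those described in the paper itself.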

@article{büchel2025_2505.09663,
  title={Analog Foundation Models},
  author={Julian Büchel and Iason Chalas and Giovanni Acampa and An Chen and Omobayode Fagbohungbe and Sidney Tsai and Kaoutar El Maghraoui and Manuel Le Gallo and Abbas Rahimi and Abu Sebastian},
  journal={arXiv preprint arXiv:2505.09663},
  year={2025}
}