RCNet: ΔΣΔΣ IADCs as Recurrent AutoEncoders

Main: 13 pages, 7 figures, 2 tables
Bibliography: 1 page
Appendix: 1 page
Abstract

This paper proposes a deep learning model (RCNet) for Delta-Sigma (ΔΣ) ADCs. Recurrent Neural Networks (RNNs) make it possible to describe both modulators and filters. This analogy is applied to Incremental ADCs (IADCs). High-end optimizers combined with fully custom losses are used to encode additional hardware design constraints: quantized weights, signal saturation, temporal noise injection, and device area. Focusing on DC conversion, our early results demonstrate that the SNR, expressed as an Effective Number Of Bits (ENOB), can be optimized under a given hardware mapping complexity. The proposed RCNet succeeded in providing design tradeoffs in terms of SNR (>13 bits) versus area constraints (<14 pF total capacitance) at a given OSR (80 samples). Interestingly, it appears that the best RCNet architectures do not necessarily rely on high-order modulators, instead leveraging additional degrees of freedom in topology exploration.
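The modulator-as-RNN analogy at the heart of the abstract can be illustrated with a minimal sketch (not taken from the paper): a textbook first-order incremental ΔΣ modulator written as a recurrent state update, where the integrator plays the role of a hidden state and the 1-bit quantizer acts as a hard nonlinearity. The symbols (`u`, `v`, `y`, `osr`) and the simple averaging decoder are standard textbook conventions, not RCNet's actual architecture.

```python
# Hypothetical illustration of the modulator-as-recurrence analogy;
# RCNet's learned architecture is more general than this fixed loop.

def first_order_iadc(u, osr=80):
    """Convert a constant input u in (-1, 1) using osr samples.

    Each step is a recurrent update v[n+1] = v[n] + u - y[n], where
    y[n] = sign(v[n]) is the 1-bit quantizer output -- the same shape
    as an RNN cell with a hard nonlinearity as activation.
    """
    v = 0.0          # integrator state (the "hidden state")
    ones = 0         # count of +1 quantizer decisions
    for _ in range(osr):
        y = 1.0 if v >= 0 else -1.0   # 1-bit quantizer
        ones += y > 0
        v += u - y                    # integrate the feedback error
    # Decoding: the mean of the bitstream estimates u
    return (2 * ones - osr) / osr

estimate = first_order_iadc(0.3, osr=80)
```

Since v[osr] = osr*u - sum(y) with v[0] = 0 and the state stays bounded in (-2, 2), the estimate error of this first-order loop is below 2/osr, i.e. the resolution grows roughly linearly with OSR; higher-order or learned topologies (as explored by RCNet) trade this off against hardware complexity.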

@article{verdant2025_2506.16903,
  title={RCNet: $\Delta\Sigma$ IADCs as Recurrent AutoEncoders},
  author={Arnaud Verdant and William Guicquero and Jérôme Chossat},
  journal={arXiv preprint arXiv:2506.16903},
  year={2025}
}