RCNet: ΔΣ IADCs as Recurrent AutoEncoders

This paper proposes a deep learning model (RCNet) for Delta-Sigma (ΔΣ) ADCs. Recurrent Neural Networks (RNNs) can describe both ΔΣ modulators and filters, and this analogy is applied here to Incremental ADCs (IADCs). State-of-the-art optimizers combined with fully custom losses are used to enforce additional hardware design constraints: quantized weights, signal saturation, temporal noise injection, and device area. Focusing on DC conversion, our early results demonstrate that resolution, defined as an Effective Number Of Bits (ENOB), can be optimized under a given hardware mapping complexity. The proposed RCNet successfully provides design trade-offs in terms of resolution (13 bits) versus area constraints (14 pF of total capacitance) at a given conversion time (80 samples). Interestingly, the best RCNet architectures do not necessarily rely on high-order modulators; instead, they leverage additional degrees of freedom in topology exploration.
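To make the RNN analogy concrete, here is a minimal sketch (our illustration, not code from the paper; the function name and structure are hypothetical): a first-order incremental ΔΣ ADC written as a recurrent encoder, where the integrator state plays the role of the hidden state, the 1-bit quantizer acts as the nonlinearity, and the fed-back bitstream forms the recurrent connection; a simple averaging filter serves as the linear decoder.

import numpy as np

def iadc_first_order(x_dc, n_samples=80):
    """First-order incremental ΔΣ ADC viewed as a tiny recurrent
    autoencoder: the modulator loop is the recurrent encoder, the
    averaging decimation filter is the linear decoder (sketch only)."""
    h, y = 0.0, 0.0                      # integrator state (hidden state), fed-back bit
    bits = np.empty(n_samples)
    for t in range(n_samples):           # recurrent cell unrolled over time
        h += x_dc - y                    # state update: integrate the feedback error
        y = 1.0 if h >= 0.0 else -1.0    # 1-bit quantizer (nonlinearity)
        bits[t] = y
    return bits.mean()                   # decoder: first-order averaging filter

x = 0.3                                  # DC input in (-1, 1)
print(abs(iadc_first_order(x) - x))      # error shrinks roughly as 1/n_samples

Under this reading, training an IADC amounts to unrolling such a recurrence over the conversion samples and backpropagating through it, exactly as one trains an RNN.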
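The hardware-oriented loss terms can be sketched in the same spirit (again an assumption on our part; quantize_ste, the noise level, and the area proxy are hypothetical stand-ins, not the paper's formulation): weight quantization via a straight-through estimator, additive Gaussian noise for temporal noise injection, clamping for signal saturation, and a magnitude penalty as a crude proxy for capacitor area.

import torch

def quantize_ste(t, n_bits=1):
    # Quantize with a straight-through gradient estimator so that
    # quantized weights/signals remain trainable (assumed technique,
    # standard in quantization-aware training).
    if n_bits == 1:
        tq = torch.sign(t)
    else:
        scale = 2.0 ** (n_bits - 1)
        tq = torch.round(t * scale) / scale
    return t + (tq - t).detach()

# Unrolled hardware-aware conversion: quantized loop coefficient,
# noise injection, saturation, and an area-proxy penalty in the loss.
w = torch.tensor(0.5, requires_grad=True)          # loop coefficient (capacitor ratio)
x = torch.tensor(0.3)                              # DC input
h, bits = torch.zeros(()), []
for _ in range(80):                                # 80 samples per conversion
    y = quantize_ste(h, n_bits=1)                  # 1-bit quantizer output
    h = h + quantize_ste(w, n_bits=4) * (x - y)    # quantized-weight integration
    h = h + 1e-3 * torch.randn_like(h)             # temporal noise injection
    h = torch.clamp(h, -1.0, 1.0)                  # signal saturation
    bits.append(y)
x_hat = torch.stack(bits).mean()                   # linear decoder (averaging filter)
loss = (x_hat - x).pow(2) + 1e-3 * w.abs()         # reconstruction + area proxy
loss.backward()                                    # trainable with standard optimizers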
@article{verdant2025_2506.16903,
  title   = {RCNet: $ΔΣ$ IADCs as Recurrent AutoEncoders},
  author  = {Arnaud Verdant and William Guicquero and Jérôme Chossat},
  journal = {arXiv preprint arXiv:2506.16903},
  year    = {2025}
}