
ReDASH: Fast and efficient Scaling in Arithmetic Garbled Circuits for Secure Outsourced Inference

Main: 8 pages, 4 figures, 3 tables. Bibliography: 1 page. Appendix: 2 pages.
Abstract

ReDash extends Dash's arithmetic garbled circuits to provide a more flexible and efficient framework for secure outsourced inference. By introducing a novel garbled scaling gadget based on a generalized base extension for the residue number system, ReDash removes Dash's limitation of scaling exclusively by powers of two. This enables arbitrary scaling factors drawn from the residue number system's modular base, allowing for tailored quantization schemes and more efficient model evaluation. Through the new ScaleQuant+ quantization mechanism, ReDash supports optimized modular bases that can significantly reduce the overhead of arithmetic operations during convolutional neural network inference. ReDash achieves up to a 33-fold speedup in overall inference time compared to Dash. Despite these enhancements, ReDash preserves the robust security guarantees of arithmetic garbling. By delivering both performance gains and quantization flexibility, ReDash expands the practicality of garbled convolutional neural network inference.
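The core idea behind the scaling gadget, dividing an RNS-encoded value by one of its moduli and then recovering the dropped residue channel through a base extension, can be illustrated on plaintext integers. The sketch below is a minimal Python illustration of standard RNS scaling, not ReDash's garbled gadget or the paper's generalized base extension; the modular base and all helper names (rns_encode, scale_by_modulus, BASE) are invented for the example.

```python
# A minimal plaintext sketch of RNS scaling by a base modulus, assuming
# Python 3.8+. It illustrates the residue-number-system arithmetic only;
# it is NOT ReDash's garbled construction, and all names are hypothetical.
from math import prod

BASE = [7, 11, 13, 15]   # example pairwise-coprime base (not powers of two)
M = prod(BASE)           # dynamic range of the RNS

def rns_encode(x, base=BASE):
    """Represent an integer by its residues modulo each base element."""
    return [x % m for m in base]

def scale_by_modulus(res, i, base=BASE):
    """Floor-divide an RNS value by base[i]; restore channel i by base extension."""
    mi, ri = base[i], res[i]
    rest = [(j, m) for j, m in enumerate(base) if j != i]
    # Exact division in every surviving channel: ((x - x mod mi) * mi^-1) mod m
    scaled = {j: ((res[j] - ri) * pow(mi, -1, m)) % m for j, m in rest}
    # Base extension: the scaled value is below M / mi, so the surviving
    # channels determine it; reconstruct via CRT and re-derive channel i.
    M_rest = M // mi
    y = 0
    for j, m in rest:
        Mj = M_rest // m
        y += scaled[j] * Mj * pow(Mj, -1, m)
    scaled[i] = (y % M_rest) % mi
    return [scaled[j] for j in range(len(base))]

# Example: 1000 // 7 == 142, computed entirely on residues.
assert scale_by_modulus(rns_encode(1000), 0) == rns_encode(142)
```

Because every modulus of the base can serve as a divisor, scaling factors are no longer restricted to powers of two, which is what allows the quantization scheme to be tailored to the chosen modular base.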

@article{maurer2025_2506.14489,
  title={ReDASH: Fast and efficient Scaling in Arithmetic Garbled Circuits for Secure Outsourced Inference},
  author={Felix Maurer and Jonas Sander and Thomas Eisenbarth},
  journal={arXiv preprint arXiv:2506.14489},
  year={2025}
}