CryptoGRU: Low Latency Privacy-Preserving Text Analysis With GRU

22 October 2020
Bo Feng
Qian Lou
Lei Jiang
Geoffrey C. Fox
Abstract

Billions of text analysis requests containing private emails, personal text messages, and sensitive online reviews are processed every day by recurrent neural networks (RNNs) deployed on public clouds. Although prior secure networks combine homomorphic encryption (HE) and garbled circuits (GC) to preserve users' privacy, naively adopting the HE and GC hybrid technique to implement RNNs suffers from long inference latency due to slow activation functions. In this paper, we present an HE and GC hybrid gated recurrent unit (GRU) network, CryptoGRU, for low-latency secure inference. CryptoGRU replaces the computationally expensive GC-based tanh with a fast GC-based ReLU, and then quantizes sigmoid and ReLU with a smaller bit length to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on 4 public datasets. Experimental results show CryptoGRU achieves top-notch accuracy and improves the secure inference latency by up to 138× over one of the state-of-the-art secure networks on the Penn Treebank dataset.
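The abstract's two core ideas can be sketched in plaintext: a GRU cell whose candidate-state tanh is replaced by ReLU, with the gate activations quantized to a small bit length. This is a minimal illustrative sketch only; it performs no HE or GC computation, and the function names, weight shapes, and default bit widths are assumptions, not the paper's actual implementation.

```python
import numpy as np

def quantize(x, bits):
    """Uniformly quantize values in [0, 1] to a given bit length."""
    levels = (1 << bits) - 1
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def crypto_gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh, gate_bits=4):
    """One GRU step in the style the abstract describes:
    gates use quantized sigmoid, and the candidate state uses
    ReLU instead of tanh (ReLU is cheaper as a garbled circuit).
    Hypothetical bit widths; plaintext arithmetic only."""
    z = quantize(sigmoid(Wz @ x + Uz @ h), gate_bits)  # update gate
    r = quantize(sigmoid(Wr @ x + Ur @ h), gate_bits)  # reset gate
    # candidate hidden state: ReLU replaces the usual tanh
    h_tilde = np.maximum(0.0, Wh @ x + Uh @ (r * h))
    return (1.0 - z) * h + z * h_tilde
```

In a real HE/GC pipeline the linear layers would run under HE and the nonlinearities under GC, which is why shrinking the activations' bit length reduces garbled-circuit size and hence latency.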
