ResearchTrend.AI

arXiv:1705.02436
Nonlinear Information Bottleneck

6 May 2017
Artemy Kolchinsky
Brendan D. Tracey
David Wolpert
Abstract

Information bottleneck (IB) is a technique for extracting information in one random variable X that is relevant for predicting another random variable Y. IB works by encoding X in a compressed "bottleneck" random variable M from which Y can be accurately decoded. However, finding the optimal bottleneck variable involves a difficult optimization problem, which until recently has been considered for only two limited cases: discrete X and Y with small state spaces, and continuous X and Y with a Gaussian joint distribution (in which case optimal encoding and decoding maps are linear). We propose a method for performing IB on arbitrarily-distributed discrete and/or continuous X and Y, while allowing for nonlinear encoding and decoding maps. Our approach relies on a novel non-parametric upper bound for mutual information. We describe how to implement our method using neural networks. We then show that it achieves better performance than the recently-proposed "variational IB" method on several real-world datasets.
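The non-parametric upper bound on mutual information mentioned in the abstract can be sketched as follows. Assuming a stochastic encoder of the form M = f(X) + Gaussian noise with shared variance σ² (so the marginal of M is a mixture of Gaussians), a pairwise-distance bound on I(X; M) can be computed from the encoder means alone. The function name `mi_upper_bound` and the equal-variance Gaussian assumption are illustrative choices here, not taken verbatim from the paper:

```python
import numpy as np

def mi_upper_bound(mu, sigma):
    """Pairwise-distance upper bound on I(X; M) (in nats) for a
    Gaussian encoder M = f(x) + noise, noise ~ N(0, sigma^2 I).

    mu: (N, d) array of encoder means f(x_i) for N samples.
    """
    # Squared Euclidean distances between all pairs of encoder means.
    d2 = ((mu[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    # KL divergence between two equal-variance Gaussians:
    # KL(p_i || p_j) = ||mu_i - mu_j||^2 / (2 sigma^2).
    kl = d2 / (2.0 * sigma ** 2)
    # Bound: I(X; M) <= -(1/N) sum_i log[(1/N) sum_j exp(-KL_ij)].
    return float(-np.mean(np.log(np.exp(-kl).mean(axis=1))))
```

When all encoder means coincide the bound is 0 (the bottleneck carries no information about X); as the means spread apart it grows toward log N, matching the intuition that N well-separated codewords convey at most log N nats. Because the bound is a differentiable function of the means, it can serve as the compression term in a neural-network IB objective.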
