ResearchTrend.AI

arXiv:1801.04062v5 (latest)
MINE: Mutual Information Neural Estimation

12 January 2018
Mohamed Ishmael Belghazi, Aristide Baratin, Sai Rajeswar, Sherjil Ozair, Yoshua Bengio, Aaron Courville, R. Devon Hjelm
Abstract

We argue that the estimation of mutual information between high dimensional continuous random variables can be achieved by gradient descent over neural networks. We present a Mutual Information Neural Estimator (MINE) that is linearly scalable in dimensionality as well as in sample size, trainable through back-prop, and strongly consistent. We present a handful of applications on which MINE can be used to minimize or maximize mutual information. We apply MINE to improve adversarially trained generative models. We also use MINE to implement Information Bottleneck, applying it to supervised classification; our results demonstrate substantial improvement in flexibility and performance in these settings.
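The quantity MINE optimizes is the Donsker-Varadhan lower bound on mutual information, I(X;Z) ≥ E_P[T] − log E_{P⊗Q}[e^T], where T is a statistics network trained by gradient descent. As a minimal sketch of that bound (not the paper's trained network), the toy example below evaluates it on correlated Gaussians using the closed-form optimal critic T*(x,z) = log p(x,z) − log p(x)p(z), for which the bound is tight; in MINE proper, T would instead be a neural network T_θ maximizing this same objective by back-prop. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # correlation between X and Z
n = 200_000      # sample size

# Joint samples (x, z) from a standard bivariate Gaussian with correlation rho.
x = rng.standard_normal(n)
z = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
# Product-of-marginals samples: shuffle z to break the dependence on x.
z_shuf = rng.permutation(z)

def critic(x, z, rho=rho):
    """Optimal DV critic: the log density ratio log p(x,z) - log p(x)p(z)."""
    return (-0.5 * np.log(1 - rho**2)
            + (2 * rho * x * z - rho**2 * (x**2 + z**2)) / (2 * (1 - rho**2)))

# Donsker-Varadhan bound: E_P[T] - log E_{P x Q}[exp(T)].
dv = critic(x, z).mean() - np.log(np.exp(critic(x, z_shuf)).mean())

# Closed-form MI for correlated Gaussians, for comparison.
true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV estimate: {dv:.3f}  true MI: {true_mi:.3f}")
```

With the optimal critic, the first term converges to the true mutual information and the log-partition term to zero, so the estimate closely matches the analytic value −½ log(1 − ρ²) ≈ 0.511 here. MINE's contribution is making this work when no closed-form critic exists, by parameterizing T with a neural network and ascending the same objective.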
