
Learning with Latent Structures in Natural Language Processing: A Survey

3 January 2022
Zhaofeng Wu
arXiv:2201.00490
Abstract

While end-to-end learning with fully differentiable models has enabled tremendous success in natural language processing (NLP) and machine learning, there has been significant recent interest in learning with latent discrete structures to incorporate better inductive biases for improved end-task performance and better interpretability. This paradigm, however, is not straightforwardly amenable to mainstream gradient-based optimization methods. This work surveys three main families of methods for learning such models: surrogate gradients, continuous relaxation, and marginal likelihood maximization via sampling. We conclude with a review of applications of these methods and an inspection of the learned latent structures they induce.
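To make these families concrete, the sketch below (not from the survey; a minimal illustration assuming PyTorch, with hypothetical function names) shows two widely used instances: a straight-through Gumbel-Softmax estimator, which combines a surrogate gradient with a continuous relaxation of the categorical sample, and a REINFORCE-style score-function estimator, which estimates the marginal likelihood gradient by sampling discrete structures.

```python
# A minimal PyTorch sketch (not from the survey itself) of two standard
# estimators for learning with latent discrete structures. All function
# and variable names here are illustrative assumptions, not the paper's API.
import torch
import torch.nn.functional as F

def straight_through_gumbel(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Surrogate gradient + continuous relaxation in one estimator.

    Forward pass: a hard one-hot sample (a discrete structure choice).
    Backward pass: gradients flow through the soft Gumbel-Softmax
    relaxation, acting as a surrogate for the non-differentiable argmax.
    """
    y_soft = F.gumbel_softmax(logits, tau=tau, hard=False)  # relaxed sample
    index = y_soft.argmax(dim=-1, keepdim=True)
    y_hard = torch.zeros_like(y_soft).scatter_(-1, index, 1.0)
    # Straight-through trick: y_hard in the forward pass,
    # y_soft's gradient in the backward pass.
    return y_hard - y_soft.detach() + y_soft

def reinforce_surrogate(logits: torch.Tensor, reward_fn, num_samples: int = 8) -> torch.Tensor:
    """Marginal likelihood maximization via sampling (score-function /
    REINFORCE estimator): sample discrete structures and weight each
    sample's log-probability by its downstream reward (e.g. the task
    log-likelihood). Backpropagating through the returned surrogate
    yields an unbiased gradient estimate, with no relaxation needed.
    """
    dist = torch.distributions.Categorical(logits=logits)
    z = dist.sample((num_samples,))   # discrete latent samples
    rewards = reward_fn(z).detach()   # rewards are constants w.r.t. logits
    return (rewards * dist.log_prob(z)).mean()

# Toy usage: pick one of 5 latent labels for each of 3 tokens.
logits = torch.randn(3, 5, requires_grad=True)
hard_sample = straight_through_gumbel(logits, tau=0.5)
hard_sample.sum().backward()          # gradients reach `logits` via the relaxation

logits2 = torch.randn(3, 5, requires_grad=True)
# Toy reward: the sampled index itself, shaped to match log_prob.
surrogate = reinforce_surrogate(logits2, reward_fn=lambda z: z.float())
surrogate.backward()
```

The two estimators trade off differently: the relaxation gives low-variance but biased gradients, while the sampling-based score-function estimator is unbiased but typically higher-variance.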
