ResearchTrend.AI

Unbiased and Efficient Sampling of Dependency Trees
Miloš Stanojević
25 May 2022 · arXiv:2205.12621

Papers citing "Unbiased and Efficient Sampling of Dependency Trees"

4 papers:

1. Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale. Laurent Sartran, Samuel Barrett, A. Kuncoro, Miloš Stanojević, Phil Blunsom, Chris Dyer. 01 Mar 2022.
2. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection. Joakim Nivre, M. Marneffe, Filip Ginter, Jan Hajič, Christopher D. Manning, S. Pyysalo, Sebastian Schuster, Francis M. Tyers, Daniel Zeman. 22 Apr 2020.
3. Stanza: A Python Natural Language Processing Toolkit for Many Human Languages. Peng Qi, Yuhao Zhang, Yuhui Zhang, Jason Bolton, Christopher D. Manning. 16 Mar 2020.
4. Stochastic Beams and Where to Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement. W. Kool, H. V. Hoof, Max Welling. 14 Mar 2019.
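The last cited title names the Gumbel-Top-k trick, a standard way to sample k items without replacement from a categorical distribution: perturb each log-probability with independent Gumbel noise and keep the k largest perturbed values. A minimal sketch (the function name and inputs are illustrative, not from the paper):

```python
import math
import random


def gumbel_top_k(logits, k, rng=random):
    """Sample k distinct indices without replacement from softmax(logits).

    Each logit is perturbed with Gumbel(0, 1) noise, generated as
    -log(-log(U)) for U ~ Uniform(0, 1); the indices of the k largest
    perturbed logits are an exact sample without replacement.
    """
    keys = [l - math.log(-math.log(rng.random())) for l in logits]
    return sorted(range(len(logits)), key=lambda i: keys[i], reverse=True)[:k]


# Example: draw 3 of 5 indices, reproducibly, with a seeded generator.
sample = gumbel_top_k([0.1, 2.0, -1.0, 0.5, 1.5], k=3, rng=random.Random(0))
```

Because the noise is added independently per item, a single pass of perturb-then-top-k replaces k sequential draws with renormalization, which is what makes the trick attractive for sampling sequences in beam-search-style decoders.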