
GAAMA 2.0: An Integrated System that Answers Boolean and Extractive Questions

16 June 2022
Scott McCarley, Mihaela A. Bornea, Sara Rosenthal, Anthony Ferritto, Md Arafat Sultan, Avirup Sil, Radu Florian
arXiv: 2206.08441

Papers citing "GAAMA 2.0: An Integrated System that Answers Boolean and Extractive Questions"

17 / 17 papers shown
Do Answers to Boolean Questions Need Explanations? Yes
Sara Rosenthal, Mihaela A. Bornea, Avirup Sil, Radu Florian, Scott McCarley
14 Dec 2021

Poolingformer: Long Document Modeling with Pooling Attention
Hang Zhang, Yeyun Gong, Yelong Shen, Weisheng Li, Jiancheng Lv, Nan Duan, Weizhu Chen
10 May 2021

Improved Synthetic Training for Reading Comprehension
Yanda Chen, Md Arafat Sultan, T. J. W. R. Center
24 Oct 2020

Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension
Bo Zheng, Haoyang Wen, Yaobo Liang, Nan Duan, Wanxiang Che, Daxin Jiang, Ming Zhou, Ting Liu
12 May 2020

AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych
01 May 2020

MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder
30 Apr 2020

Talk to Papers: Bringing Neural Question Answering to Academic Search
Tiancheng Zhao, Kyusong Lee
04 Apr 2020

TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages
J. Clark, Eunsol Choi, Michael Collins, Dan Garrette, Tom Kwiatkowski, Vitaly Nikolaev, J. Palomaki
10 Mar 2020

Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
05 Nov 2019

CFO: A Framework for Building Production NLP Systems
Rishav Chakravarti, Cezar Pendus, Andrzej Sakrajda, Anthony Ferritto, Lin Pan, ..., Vittorio Castelli, J. William Murdock, Radu Florian, Salim Roukos, Avirup Sil
16 Aug 2019

BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions
Christopher Clark, Kenton Lee, Ming-Wei Chang, Tom Kwiatkowski, Michael Collins, Kristina Toutanova
24 May 2019

End-to-End Open-Domain Question Answering with BERTserini
Wei Yang, Yuqing Xie, Aileen Lin, Xingyu Li, Luchen Tan, Kun Xiong, Ming Li, Jimmy J. Lin
05 Feb 2019

Parameter-Efficient Transfer Learning for NLP
N. Houlsby, A. Giurgiu, Stanislaw Jastrzebski, Bruna Morrone, Quentin de Laroussilhe, Andrea Gesmundo, Mona Attariyan, Sylvain Gelly
02 Feb 2019

A BERT Baseline for the Natural Questions
Chris Alberti, Kenton Lee, Michael Collins
24 Jan 2019

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
11 Oct 2018

MS MARCO: A Human Generated MAchine Reading COmprehension Dataset
Payal Bajaj, Daniel Fernando Campos, Nick Craswell, Li Deng, Jianfeng Gao, ..., Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang
28 Nov 2016

SQuAD: 100,000+ Questions for Machine Comprehension of Text
Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang
16 Jun 2016