ISAAQ -- Mastering Textbook Questions with Pre-trained Transformers and Bottom-Up and Top-Down Attention
José Manuel Gómez-Pérez, Raúl Ortega
1 October 2020 · arXiv: 2010.00562
Papers citing "ISAAQ -- Mastering Textbook Questions with Pre-trained Transformers and Bottom-Up and Top-Down Attention" (7 papers)
1. Towards Language-driven Scientific AI
   José Manuel Gómez-Pérez · 27 Oct 2022

2. Artificial Intelligence and Natural Language Processing and Understanding in Space: A Methodological Framework and Four ESA Case Studies
   José Manuel Gómez-Pérez, Andrés García-Silva, R. Leone, M. Albani, Moritz Fontaine, C. Poncet, L. Summerer, A. Donati, Ilaria Roma, Stefano Scaglioni · 07 Oct 2022

3. MoCA: Incorporating Multi-stage Domain Pretraining and Cross-guided Multimodal Attention for Textbook Question Answering
   Fangzhi Xu, Qika Lin, Jun Liu, Lingling Zhang, Tianzhe Zhao, Qianyi Chai, Yudai Pan · 06 Dec 2021

4. Perhaps PTLMs Should Go to School -- A Task to Assess Open Book and Closed Book QA
   Manuel R. Ciosici, Joe Cecil, Alex Hedges, Dong-Ho Lee, Marjorie Freedman, R. Weischedel · 04 Oct 2021

5. RL-CSDia: Representation Learning of Computer Science Diagrams
   Shaowei Wang, Lingling Zhang, Xuan Luo, Yi Yang, Xin Hu, Jun Liu · 10 Mar 2021

6. Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
   M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro · 17 Sep 2019

7. Language Models as Knowledge Bases?
   Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel · 03 Sep 2019