Home › Papers › 2009.04765 › Cited By
Brain2Word: Decoding Brain Activity for Language Generation

10 September 2020
Nicolas Affolter, Béni Egressy, Damian Pascual, Roger Wattenhofer

Papers citing "Brain2Word: Decoding Brain Activity for Language Generation"

12 / 12 papers shown
• Human brain activity for machine attention
  Lukas Muttenthaler, Nora Hollenstein, Maria Barrett (09 Jun 2020)
• Inducing brain-relevant bias in natural language processing models
  Dan Schwartz, Mariya Toneva, Leila Wehbe (29 Oct 2019)
• Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer [AIMat]
  Colin Raffel, Noam M. Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu (23 Oct 2019)
• Linking artificial and human neural representations of language [AI4CE]
  Jiajun Liu, Roger Levy (02 Oct 2019)
• RoBERTa: A Robustly Optimized BERT Pretraining Approach [AIMat]
  Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov (26 Jul 2019)
• Interpreting and improving natural-language processing (in machines) with natural language-processing (in the brain) [MILM, AI4CE]
  Mariya Toneva, Leila Wehbe (28 May 2019)
• BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [VLM, SSL, SSeg]
  Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova (11 Oct 2018)
• Does the brain represent words? An evaluation of brain decoding studies of language understanding [NAI]
  Jon Gauthier, Anna A. Ivanova (02 Jun 2018)
• Neural Discrete Representation Learning [BDL, SSL, OCL]
  Aaron van den Oord, Oriol Vinyals, Koray Kavukcuoglu (02 Nov 2017)
• Attention Is All You Need [3DV]
  Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin (12 Jun 2017)
• XGBoost: A Scalable Tree Boosting System
  Tianqi Chen, Carlos Guestrin (09 Mar 2016)
• FaceNet: A Unified Embedding for Face Recognition and Clustering [3DH]
  Florian Schroff, Dmitry Kalenichenko, James Philbin (12 Mar 2015)