Dissecting Contextual Word Embeddings: Architecture and Representation

27 August 2018
Matthew E. Peters, Mark Neumann, Luke Zettlemoyer, Wen-tau Yih
ArXiv · PDF · HTML

Papers citing "Dissecting Contextual Word Embeddings: Architecture and Representation"

Showing 20 of 70 citing papers.

Language Models are Few-Shot Learners
Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, ..., Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, Dario Amodei
BDL · 40,023 citations · 28 May 2020

Spying on your neighbors: Fine-grained probing of contextual embeddings for information about surrounding words
Josef Klafka, Allyson Ettinger
42 citations · 04 May 2020

Emergence of Syntax Needs Minimal Supervision
Raphaël Bailly, Kata Gábor
5 citations · 03 May 2020

Probing Contextual Language Models for Common Ground with Visual Representations
Gabriel Ilharco, Rowan Zellers, Ali Farhadi, Hannaneh Hajishirzi
14 citations · 01 May 2020

A Financial Service Chatbot based on Deep Bidirectional Transformers
S. Yu, Yuxin Chen, Hussain Zaidi
33 citations · 17 Feb 2020

oLMpics -- On what Language Model Pre-training Captures
Alon Talmor, Yanai Elazar, Yoav Goldberg, Jonathan Berant
LRM · 300 citations · 31 Dec 2019

Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model
Wenhan Xiong, Jingfei Du, William Yang Wang, Veselin Stoyanov
SSL, KELM · 201 citations · 20 Dec 2019

On the Linguistic Representational Power of Neural Machine Translation Models
Yonatan Belinkov, Nadir Durrani, Fahim Dalvi, Hassan Sajjad, James R. Glass
MILM · 68 citations · 01 Nov 2019

Discovering the Compositional Structure of Vector Representations with Role Learning Networks
Paul Soulos, R. Thomas McCoy, Tal Linzen, P. Smolensky
CoGe · 43 citations · 21 Oct 2019

Shallow Syntax in Deep Water
Swabha Swayamdipta, Matthew E. Peters, Brendan Roof, Chris Dyer, Noah A. Smith
10 citations · 29 Aug 2019

On Identifiability in Transformers
Gino Brunner, Yang Liu, Damian Pascual, Oliver Richter, Massimiliano Ciaramita, Roger Wattenhofer
ViT · 186 citations · 12 Aug 2019

How multilingual is Multilingual BERT?
Telmo Pires, Eva Schlinger, Dan Garrette
LRM, VLM · 1,371 citations · 04 Jun 2019

What do you learn from context? Probing for sentence structure in contextualized word representations
Ian Tenney, Patrick Xia, Berlin Chen, Alex Jinpeng Wang, Adam Poliak, ..., Najoung Kim, Benjamin Van Durme, Samuel R. Bowman, Dipanjan Das, Ellie Pavlick
848 citations · 15 May 2019

BERT Rediscovers the Classical NLP Pipeline
Ian Tenney, Dipanjan Das, Ellie Pavlick
MILM, SSeg · 1,438 citations · 15 May 2019

Probing Biomedical Embeddings from Language Models
Qiao Jin, Bhuwan Dhingra, William W. Cohen, Xinghua Lu
116 citations · 03 Apr 2019

Linguistic Knowledge and Transferability of Contextual Representations
Nelson F. Liu, Matt Gardner, Yonatan Belinkov, Matthew E. Peters, Noah A. Smith
717 citations · 21 Mar 2019

Sentence Encoders on STILTs: Supplementary Training on Intermediate Labeled-data Tasks
Jason Phang, Thibault Févry, Samuel R. Bowman
467 citations · 02 Nov 2018

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
VLM, SSL, SSeg · 92,867 citations · 11 Oct 2018

What you can cram into a single vector: Probing sentence embeddings for linguistic properties
Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni
882 citations · 03 May 2018

OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush
1,896 citations · 10 Jan 2017