arXiv: 2206.12368
Using BERT Embeddings to Model Word Importance in Conversational Transcripts for Deaf and Hard of Hearing Users
24 June 2022
Akhter Al Amin, Saad Hassan, Cecilia Ovesdotter Alm, Matt Huenerfauth
Papers citing "Using BERT Embeddings to Model Word Importance in Conversational Transcripts for Deaf and Hard of Hearing Users" (6 papers):
Unpacking the Interdependent Systems of Discrimination: Ableist Bias in NLP Systems through an Intersectional Lens
Saad Hassan, Matt Huenerfauth, Cecilia Ovesdotter Alm. 01 Oct 2021.
Deriving Contextualised Semantic Features from BERT (and Other Transformer Model) Embeddings
Jacob Turton, D. Vinson, Robert Smith. 30 Dec 2020.
Social Biases in NLP Models as Barriers for Persons with Disabilities
Ben Hutchinson, Vinodkumar Prabhakaran, Emily L. Denton, Kellie Webster, Yu Zhong, Stephen Denuyl. 02 May 2020.
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. 11 Oct 2018.
A Corpus for Modeling Word Importance in Spoken Dialogue Transcripts
Sushant Kafle, Matt Huenerfauth. 29 Jan 2018.
A Convolutional Neural Network for Modelling Sentences
Nal Kalchbrenner, Edward Grefenstette, Phil Blunsom. 08 Apr 2014.