ResearchTrend.AI

The Lifecycle of "Facts": A Survey of Social Bias in Knowledge Graphs

Angelie Kraft, Ricardo Usbeck
7 October 2022 · arXiv:2210.03353 · KELM

Papers citing "The Lifecycle of 'Facts': A Survey of Social Bias in Knowledge Graphs"

10 of 10 citing papers shown:
• Social Biases in Knowledge Representations of Wikidata separates Global North from Global South
  Paramita Das, Sai Keerthana Karnam, Aditya Soni, Animesh Mukherjee
  05 May 2025 · 0 citations
• Knowledge Prompting: How Knowledge Engineers Use Large Language Models
  Elisavet Koutsiana, Johanna Walker, Michelle Nwachukwu, Albert Meroño-Peñuela, Elena Simperl
  02 Aug 2024 · 1 citation
• How Contentious Terms About People and Cultures are Used in Linked Open Data
  A. Nesterov, L. Hollink, Jacco van Ossenbruggen
  13 Nov 2023 · 3 citations
• Large Language Models and Knowledge Graphs: Opportunities and Challenges
  Jeff Z. Pan, Simon Razniewski, Jan-Christoph Kalo, Sneha Singhania, Jiaoyan Chen, ..., Gerard de Melo, A. Bonifati, Edlira Vakaj, M. Dragoni, D. Graux
  11 Aug 2023 · KELM · 72 citations
• Towards Automatic Bias Detection in Knowledge Graphs
  Daphna Keidar, Mian Zhong, Ce Zhang, Y. Shrestha, B. Paudel
  19 Sep 2021 · 11 citations
• Knowledge Enhanced Contextual Word Representations
  Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
  09 Sep 2019 · 656 citations
• The Woman Worked as a Babysitter: On Biases in Language Generation
  Emily Sheng, Kai-Wei Chang, Premkumar Natarajan, Nanyun Peng
  03 Sep 2019 · 616 citations
• A Survey on Bias and Fairness in Machine Learning
  Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
  23 Aug 2019 · SyDa, FaML · 4,203 citations
• Fair prediction with disparate impact: A study of bias in recidivism prediction instruments
  Alexandra Chouldechova
  24 Oct 2016 · FaML · 2,082 citations
• Efficient Estimation of Word Representations in Vector Space
  Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
  16 Jan 2013 · 3DV · 31,253 citations