Are Representations Built from the Ground Up? An Empirical Examination of Local Composition in Language Models

7 October 2022
Emmy Liu, Graham Neubig
Communities: CoGe

Papers citing "Are Representations Built from the Ground Up? An Empirical Examination of Local Composition in Language Models"

12 / 12 papers shown

Set-Theoretic Compositionality of Sentence Embeddings
Naman Bansal, Yash Mahajan, Sanjeev Kumar Sinha, S. Karmaker
Communities: CoGe
28 Feb 2025

Investigating Idiomaticity in Word Representations
Wei He, Tiago Kramer Vieira, Marcos García, Carolina Scarton, M. Idiart, Aline Villavicencio
04 Nov 2024

Semantics of Multiword Expressions in Transformer-Based Models: A Survey
Filip Miletic, Sabine Schulte im Walde
27 Jan 2024

Assessing Logical Reasoning Capabilities of Encoder-Only Transformer Models
Paulo Pirozelli, M. M. José, Paulo de Tarso P. Filho, A. Brandão, Fabio Gagliardi Cozman
Communities: LRM, ELM
18 Dec 2023

Transformers are uninterpretable with myopic methods: a case study with bounded Dyck grammars
Kaiyue Wen, Yuchen Li, Bing Liu, Andrej Risteski
03 Dec 2023

Divergences between Language Models and Human Brains
Yuchen Zhou, Emmy Liu, Graham Neubig, Michael J. Tarr, Leila Wehbe
15 Nov 2023

Unified Representation for Non-compositional and Compositional Expressions
Ziheng Zeng, Suma Bhat
29 Oct 2023

Bridging Continuous and Discrete Spaces: Interpretable Sentence Representation Learning via Compositional Operations
James Y. Huang, Wenlin Yao, Kaiqiang Song, Hongming Zhang, Muhao Chen, Dong Yu
24 May 2023

Construction Grammar Provides Unique Insight into Neural Language Models
Leonie Weissweiler, Taiqi He, Naoki Otani, David R. Mortensen, Lori S. Levin, Hinrich Schütze
04 Feb 2023

Can Transformer be Too Compositional? Analysing Idiom Processing in Neural Machine Translation
Verna Dankers, Christopher G. Lucas, Ivan Titov
30 May 2022

Investigating Robustness of Dialog Models to Popular Figurative Language Constructs
Harsh Jhamtani, Varun Gangal, Eduard H. Hovy, Taylor Berg-Kirkpatrick
01 Oct 2021

The paradox of the compositionality of natural language: a neural machine translation case study
Verna Dankers, Elia Bruni, Dieuwke Hupkes
Communities: CoGe
12 Aug 2021