Attention vs non-attention for a Shapley-based explanation method

26 April 2021 · arXiv:2104.12424
T. Kersten, Hugh Mee Wong, Jaap Jumelet, Dieuwke Hupkes

Papers citing "Attention vs non-attention for a Shapley-based explanation method"

2 of 2 citing papers shown:

1. Causal Transformers Perform Below Chance on Recursive Nested Constructions, Unlike Humans
   Yair Lakretz, T. Desbordes, Dieuwke Hupkes, S. Dehaene (14 Oct 2021)

2. The Rediscovery Hypothesis: Language Models Need to Meet Linguistics
   Vassilina Nikoulina, Maxat Tezekbayev, Nuradil Kozhakhmet, Madina Babazhanova, Matthias Gallé, Z. Assylbekov (02 Mar 2021)