ResearchTrend.AI


Less is Better: Recovering Intended-Feature Subspace to Robustify NLU Models
arXiv: 2209.07879

16 September 2022
Ting Wu, Tao Gui

Papers citing "Less is Better: Recovering Intended-Feature Subspace to Robustify NLU Models"

4 / 4 papers shown
Modeling the Q-Diversity in a Min-max Play Game for Robust Optimization
Ting Wu, Rui Zheng, Tao Gui, Qi Zhang, Xuanjing Huang
20 May 2023
Identifying and Mitigating Spurious Correlations for Improving Robustness in NLP Models
Tianlu Wang, Rohit Sridhar, Diyi Yang, Xuezhi Wang
14 Oct 2021
An Investigation of Why Overparameterization Exacerbates Spurious Correlations
Shiori Sagawa, Aditi Raghunathan, Pang Wei Koh, Percy Liang
09 May 2020
Hypothesis Only Baselines in Natural Language Inference
Adam Poliak, Jason Naradowsky, Aparajita Haldar, Rachel Rudinger, Benjamin Van Durme
02 May 2018