URECA: The Chain of Two Minimum Set Cover Problems exists behind Adaptation to Shifts in Semantic Code Search

11 February 2025
Seok-Ung Choi, Joonghyuk Hahn, Yo-Sub Han

Papers citing "URECA: The Chain of Two Minimum Set Cover Problems exists behind Adaptation to Shifts in Semantic Code Search"

8 / 8 papers shown
Entropy is not Enough for Test-Time Adaptation: From the Perspective of Disentangled Factors (12 Mar 2024)
Jonghyun Lee, Dahuin Jung, Saehyung Lee, Junsung Park, Juhyeon Shin, Uiwon Hwang, Sungroh Yoon
TTA, AAML

DocPrompting: Generating Code by Retrieving the Docs (13 Jul 2022)
Shuyan Zhou, Uri Alon, Frank F. Xu, Zhiruo Wang, Zhengbao Jiang, Graham Neubig
LLMAG

CoCoSoDa: Effective Contrastive Learning for Code Search (07 Apr 2022)
Ensheng Shi, Yanlin Wang, Wenchao Gu, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun

CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation (02 Sep 2021)
Yue Wang, Weishi Wang, Shafiq Joty, Guosheng Lin

GraphCodeBERT: Pre-training Code Representations with Data Flow (17 Sep 2020)
Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, ..., Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, M. Zhou

Language Models are Few-Shot Learners (28 May 2020)
Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, ..., Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, Dario Amodei
BDL

CodeBERT: A Pre-Trained Model for Programming and Natural Languages (19 Feb 2020)
Zhangyin Feng, Daya Guo, Duyu Tang, Nan Duan, Xiaocheng Feng, ..., Linjun Shou, Bing Qin, Ting Liu, Daxin Jiang, Ming Zhou

Learning to Mine Aligned Code and Natural Language Pairs from Stack Overflow (23 May 2018)
Pengcheng Yin, Bowen Deng, Edgar Chen, Bogdan Vasilescu, Graham Neubig