How Can Recommender Systems Benefit from Large Language Models: A Survey

9 June 2023
Jianghao Lin, Xinyi Dai, Yunjia Xi, Weiwen Liu, Bo Chen, Hao Zhang, Yong Liu, Chuhan Wu, Xiangyang Li, Chenxu Zhu, Huifeng Guo, Yong Yu, Ruiming Tang, Weinan Zhang

Papers citing "How Can Recommender Systems Benefit from Large Language Models: A Survey"

3 of 103 citing papers shown
  1. Extracting Training Data from Large Language Models
     Nicholas Carlini, Florian Tramèr, Eric Wallace, Matthew Jagielski, Ariel Herbert-Voss, ..., Tom B. Brown, D. Song, Ulfar Erlingsson, Alina Oprea, Colin Raffel
     14 Dec 2020

  2. The Lottery Ticket Hypothesis for Pre-trained BERT Networks
     Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
     23 Jul 2020

  3. Scaling Laws for Neural Language Models
     Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
     23 Jan 2020