Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision

20 February 2024
Ryosuke Yoshida, Taiga Someya, Yohei Oseki

Papers citing "Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision"

2 / 2 papers shown
  1. Training language models to follow instructions with human feedback
     Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
     Tags: OSLM, ALM
     04 Mar 2022

  2. Improving BERT Pretraining with Syntactic Supervision
     Georgios Tziafas, Konstantinos Kogkalidis, G. Wijnholds, M. Moortgat
     21 Apr 2021