Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation
arXiv:2306.00124 · 31 May 2023
Chunliu Wang, Huiyuan Lai, Malvina Nissim, Johan Bos

Papers citing "Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation"

5 / 5 papers shown
Neural Semantic Parsing with Extremely Rich Symbolic Meaning Representations
Xiao Zhang, Gosse Bouma, Johan Bos (NAI) · 41 · 0 · 0 · 19 Apr 2024

Gaining More Insight into Neural Semantic Parsing with Challenging Benchmarks
Xiao Zhang, Chunliu Wang, Rik van Noord, Johan Bos · 41 · 3 · 0 · 12 Apr 2024

MVP: Multi-task Supervised Pre-training for Natural Language Generation
Tianyi Tang, Junyi Li, Wayne Xin Zhao, Ji-Rong Wen · 63 · 24 · 0 · 24 Jun 2022

Rethinking Supervised Pre-training for Better Downstream Transferring
Yutong Feng, Jianwen Jiang, Mingqian Tang, Rong Jin, Yue Gao (SSL) · 64 · 39 · 0 · 12 Oct 2021

Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang (LM&MA, VLM) · 253 · 1,458 · 0 · 18 Mar 2020