ResearchTrend.AI
Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages

25 May 2023
Shivanshu Gupta
Yoshitomo Matsubara
Ankita N. Chadha
Alessandro Moschitti

Papers citing "Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages"

4 papers shown.

Distilling Linguistic Context for Language Model Compression
Geondo Park, Gyeongman Kim, Eunho Yang
17 Sep 2021

Will this Question be Answered? Question Filtering via Answer Model Distillation for Efficient Question Answering
Siddhant Garg, Alessandro Moschitti
14 Sep 2021

Modeling Context in Answer Sentence Selection Systems on a Latency Budget
Rujun Han, Luca Soldaini, Alessandro Moschitti
28 Jan 2021

PySBD: Pragmatic Sentence Boundary Disambiguation
Nipun Sadvilkar, Mark Neumann
19 Oct 2020