Transformer over Pre-trained Transformer for Neural Text Segmentation with Enhanced Topic Coherence

14 October 2021
Kelvin Lo, Yuan Jin, Weicong Tan, Ming Liu, Lan Du, Wray Buntine
arXiv:2110.07160 · PDF · HTML

Papers citing "Transformer over Pre-trained Transformer for Neural Text Segmentation with Enhanced Topic Coherence"

1 / 1 papers shown
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut
26 Sep 2019