Langformers: Unified NLP Pipelines for Language Models

12 April 2025
Rabindra Lamsal
Maria Rodriguez Read
Shanika Karunasekera
Abstract

Transformer-based language models have revolutionized the field of natural language processing (NLP). However, using these models often involves navigating multiple frameworks and tools, as well as writing repetitive boilerplate code. This complexity can discourage non-programmers and beginners, and even slow down prototyping for experienced developers. To address these challenges, we introduce Langformers, an open-source Python library designed to streamline NLP pipelines through a unified, factory-based interface for large language model (LLM) and masked language model (MLM) tasks. Langformers integrates conversational AI, MLM pretraining, text classification, sentence embedding/reranking, data labelling, semantic search, and knowledge distillation into a cohesive API, supporting popular platforms such as Hugging Face and Ollama. Key innovations include: (1) task-specific factories that abstract training, inference, and deployment complexities; (2) built-in memory and streaming for conversational agents; and (3) a lightweight, modular design that prioritizes ease of use. Documentation: this https URL
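To make the "task-specific factory" idea concrete, here is a minimal sketch of what a factory-based interface of this kind might look like. It is an illustration of the pattern the abstract describes, not the actual Langformers API: the class names, the `tasks.create_*` method names, and all parameters (`provider`, `model_name`, `labels`) are assumptions introduced for this example.

```python
# Hypothetical sketch of a task-specific factory interface (assumed names,
# not the Langformers API): one entry point per task hides the
# framework-specific setup behind a simple constructor call.
from dataclasses import dataclass, field


@dataclass
class Generator:
    """Placeholder for a conversational LLM task (e.g. backed by Ollama)."""
    provider: str
    model_name: str

    def generate(self, prompt: str) -> str:
        # A real implementation would call the provider's chat API here,
        # with memory and streaming handled internally.
        return f"[{self.provider}:{self.model_name}] response to: {prompt}"


@dataclass
class Classifier:
    """Placeholder for a text-classification task (e.g. a fine-tuned MLM)."""
    model_name: str
    labels: list = field(default_factory=list)

    def predict(self, text: str) -> str:
        # A real implementation would run the fine-tuned model here.
        return self.labels[0] if self.labels else ""


class tasks:
    """Factory entry point: one creation method per supported task."""

    @staticmethod
    def create_generator(provider: str, model_name: str) -> Generator:
        return Generator(provider=provider, model_name=model_name)

    @staticmethod
    def create_classifier(model_name: str, labels: list) -> Classifier:
        return Classifier(model_name=model_name, labels=labels)


if __name__ == "__main__":
    # Usage: the caller only chooses a task and a backend; the factory
    # absorbs the training/inference/deployment boilerplate.
    chatbot = tasks.create_generator(provider="ollama", model_name="llama3.1:8b")
    print(chatbot.generate("Summarize the Langformers paper."))

    sentiment = tasks.create_classifier(model_name="roberta-base",
                                        labels=["positive", "negative"])
    print(sentiment.predict("This library saves a lot of boilerplate."))
```

The design point of such a factory layer is that switching between platforms (e.g. Hugging Face vs. Ollama) changes only the arguments passed to the factory, while the task object's interface stays the same.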

@article{lamsal2025_2504.09170,
  title={Langformers: Unified NLP Pipelines for Language Models},
  author={Rabindra Lamsal and Maria Rodriguez Read and Shanika Karunasekera},
  journal={arXiv preprint arXiv:2504.09170},
  year={2025}
}