Embedding And Clustering Your Data Can Improve Contrastive Pretraining

Luke Merrick · arXiv:2407.18887 · 26 July 2024

Papers citing "Embedding And Clustering Your Data Can Improve Contrastive Pretraining" (3 of 3 shown):
  • Plan-and-Refine: Diverse and Comprehensive Retrieval-Augmented Generation
    Alireza Salemi, Chris Samarinas, Hamed Zamani (10 Apr 2025)
  • Improved Large Language Model Jailbreak Detection via Pretrained Embeddings
    Erick Galinkin, Martin Sablotny (02 Dec 2024)
  • OnlySportsLM: Optimizing Sports-Domain Language Models with SOTA Performance under Billion Parameters
    Zexin Chen, Chengxi Li, Xiangyu Xie, Parijat Dube (30 Aug 2024)