BLoad: Enhancing Neural Network Training with Efficient Sequential Data Handling

16 October 2023
Raphael Ruschel
A S M Iftekhar
B. S. Manjunath
Suya You
arXiv:2310.10879
Abstract

The increasing complexity of modern deep neural network models and the growing size of datasets necessitate the development of optimized and scalable training methods. In this white paper, we address the challenge of efficiently training neural network models on sequences of varying lengths. We propose a novel training scheme that enables efficient distributed data-parallel training on sequences of different sizes with minimal overhead. Using this scheme, we reduced the amount of padding by more than 100x without discarding a single frame, which translated into improvements in both training time and recall in our experiments.
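The abstract does not describe the scheme itself, but the claimed padding reduction suggests block-based handling of variable-length sequences. The sketch below is purely illustrative and is not the authors' implementation: the fixed block length, the helper names split_into_blocks and padding_overhead, and the packing policy are all assumptions made for this example. It only shows why splitting each sequence into fixed-size blocks (padding just the final block) wastes far less space than naive batching, which pads every sequence up to the longest one.

```python
from typing import List
import numpy as np

BLOCK_SIZE = 16  # assumed block length; a hypothetical hyperparameter for this sketch

def split_into_blocks(seq: np.ndarray, block_size: int = BLOCK_SIZE) -> List[np.ndarray]:
    """Split a (T, ...) sequence into fixed-size blocks, zero-padding only the last block."""
    blocks = []
    for start in range(0, len(seq), block_size):
        block = seq[start:start + block_size]
        if len(block) < block_size:
            # Pad only along the time axis, and only for the final block of the sequence.
            pad_width = [(0, block_size - len(block))] + [(0, 0)] * (seq.ndim - 1)
            block = np.pad(block, pad_width)
        blocks.append(block)
    return blocks

def padding_overhead(seqs: List[np.ndarray], block_size: int = BLOCK_SIZE) -> float:
    """Ratio of padded frames to real frames when every sequence is block-split."""
    real = sum(len(s) for s in seqs)
    stored = sum(len(split_into_blocks(s, block_size)) * block_size for s in seqs)
    return (stored - real) / real

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three toy "videos" of very different lengths, 8 features per frame.
    seqs = [rng.normal(size=(t, 8)) for t in (5, 40, 300)]
    longest = max(len(s) for s in seqs)
    naive = sum(longest - len(s) for s in seqs) / sum(len(s) for s in seqs)
    print(f"block splitting: {padding_overhead(seqs):.1%} padding overhead")
    print(f"naive batching : {naive:.1%} padding overhead")
```

In this toy example the block-split layout carries roughly 7% padding overhead versus roughly 161% for naive padding-to-longest; the gap grows as sequence lengths become more skewed, which is the kind of disparity a padding-reduction scheme like the one described would exploit.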
