arXiv:1906.04705
Fast and Accurate Least-Mean-Squares Solvers

11 June 2019
Alaa Maalouf, Ibrahim Jubran, Dan Feldman
Abstract

Least-mean-squares (LMS) solvers such as Linear/Ridge/Lasso regression, SVD, and Elastic-Net not only solve fundamental machine learning problems, but are also the building blocks of a variety of other methods, such as decision trees and matrix factorizations. We suggest an algorithm that gets a finite set of $n$ $d$-dimensional real vectors and returns a weighted subset of $d+1$ vectors whose sum is exactly the same. The proof of Caratheodory's Theorem (1907) computes such a subset in $O(n^2 d^2)$ time and is thus not used in practice. Our algorithm computes this subset in $O(nd + d^4 \log n)$ time, using $O(\log n)$ calls to Caratheodory's construction on small but "smart" subsets. This is based on a novel paradigm of fusion between different data summarization techniques, known as sketches and coresets. For large values of $d$, we suggest a faster construction that takes $O(nd)$ time (linear in the input's size) and returns a weighted subset of $O(d)$ sparsified input points; here, a sparsified point is one in which some of the entries were replaced by zeroes. As an example application, we show how the algorithm can be used to boost the performance of existing LMS solvers, such as those in the scikit-learn library, by up to a factor of 100. Generalization to streaming and distributed (big) data is straightforward. Extensive experimental results and complete open-source code are also provided.
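To make the classic primitive concrete, here is a minimal sketch of the original Caratheodory construction that the abstract refers to (the $O(n^2 d^2)$ one, not the paper's accelerated algorithm). It is an illustrative NumPy implementation, not the authors' released code: given $n > d+1$ weighted points, it repeatedly finds an affine dependence and shifts weight along it until one weight vanishes, preserving the weighted sum exactly.

```python
import numpy as np

def caratheodory(P, w, eps=1e-12):
    """Reduce a weighted set of n points in R^d to at most d+1 points
    whose weighted sum is exactly the same (classic construction).

    P: (n, d) array of points; w: (n,) nonnegative weights.
    Illustrative sketch; names and tolerances are our own choices."""
    P, w = np.asarray(P, dtype=float), np.asarray(w, dtype=float)
    while len(w) > P.shape[1] + 1:
        # Any n > d+1 points are affinely dependent: find v != 0 with
        # sum_i v_i = 0 and sum_i v_i * P_i = 0.
        A = np.vstack([P.T, np.ones(len(w))])  # shape (d+1, n), n > d+1
        # A null-space vector of A: the last right-singular vector.
        v = np.linalg.svd(A)[2][-1]
        # Shift weights along v until the smallest-ratio weight hits zero.
        # The weighted sum is unchanged because A @ v = 0.
        pos = v > eps                     # sum(v) = 0, so some entries are positive
        alpha = np.min(w[pos] / v[pos])
        w = w - alpha * v
        keep = w > eps                    # drop the point(s) whose weight vanished
        P, w = P[keep], w[keep]
    return P, w
```

Each iteration removes at least one point at the cost of an SVD, which is what yields the quadratic running time; the paper's contribution is to invoke this primitive only $O(\log n)$ times on small, carefully chosen subsets, bringing the total down to $O(nd + d^4 \log n)$.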
