We consider a generalization of the classic linear regression problem to the case when the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function $G:[0,\infty)\to[0,\infty)$ with $G(0)=0$: the Orlicz norm of a vector $x\in\mathbb{R}^n$ is defined as $\|x\|_G=\inf\{\alpha>0 \mid \sum_{i=1}^n G(|x_i|/\alpha)\le 1\}$. We consider the cases where the function $G$ grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix $A\in\mathbb{R}^{n\times d}$ with Orlicz norm into a lower-dimensional space with $\ell_2$ norm. Specifically, we show how to efficiently find an embedding matrix $S\in\mathbb{R}^{m\times n}$ with $m<n$ such that, for all $x\in\mathbb{R}^d$, $\Omega(1/(d\log n))\cdot\|Ax\|_G\le\|SAx\|_2\le O(d^2\log n)\cdot\|Ax\|_G$. By applying this subspace embedding technique, we obtain an approximation algorithm for the regression problem $\min_{x\in\mathbb{R}^d}\|Ax-b\|_G$, up to an approximation factor polynomial in $d$ and polylogarithmic in $n$. As a further application of our techniques, we show how to use them to improve on the algorithms for the $\ell_p$ low rank matrix approximation problem for $1\le p<2$.
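The Orlicz norm above is defined implicitly, as the infimum of scalings $\alpha$ for which $\sum_i G(|x_i|/\alpha)\le 1$. Since that sum is nonincreasing in $\alpha$, the infimum can be found numerically by bisection. The sketch below is an illustration of the definition only (the function names are ours, not the paper's); note that $G(t)=t$ recovers the $\ell_1$ norm and $G(t)=t^2$ recovers the $\ell_2$ norm.

```python
def orlicz_norm(x, G, tol=1e-9):
    """Compute ||x||_G = inf{a > 0 : sum_i G(|x_i| / a) <= 1} by bisection.

    Assumes G is convex and nondecreasing on [0, inf) with G(0) = 0,
    as in the definition above. Illustrative sketch, not the paper's code.
    """
    if all(v == 0 for v in x):
        return 0.0
    f = lambda a: sum(G(abs(v) / a) for v in x)  # nonincreasing in a
    # Grow hi until the constraint f(hi) <= 1 is satisfied.
    hi = 1.0
    while f(hi) > 1:
        hi *= 2
    lo = 0.0  # f(a) > 1 for a just above 0 (x is nonzero)
    # Bisect on the threshold where f crosses 1.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 1:
            lo = mid
        else:
            hi = mid
    return hi


# Sanity checks against the special cases noted above:
# G(t) = t   -> ell_1 norm; G(t) = t^2 -> ell_2 norm.
print(orlicz_norm([3.0, -4.0], lambda t: t))      # ~7.0 (ell_1)
print(orlicz_norm([3.0, 4.0], lambda t: t * t))   # ~5.0 (ell_2)
```

An $M$-estimator loss such as the Huber function (quadratic near zero, linear in the tails) is an example of a subquadratically growing $G$ covered by the setting above.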