Implicit bias of any algorithm: bounding bias via margin

Consider points in finite-dimensional Euclidean space, each having one of two colors. Suppose there exists a separating hyperplane for the points (identified with its unit normal vector $w$), i.e., a hyperplane such that points of the same color lie on the same side of the hyperplane. We measure the quality of such a hyperplane by its margin $\gamma(w)$, defined as the minimum distance between any of the points and the hyperplane. In this paper, we prove that the margin function satisfies a nonsmooth Kurdyka-Lojasiewicz inequality with exponent $1/2$. This result has far-reaching consequences. For example, let $\gamma^*$ be the maximum possible margin for the problem and let $w^*$ be the parameter of the hyperplane which attains this value. Given any other separating hyperplane with parameter $w$, let $\|w - w^*\|$ be the Euclidean distance between $w$ and $w^*$, also called the bias of $w$. From the previous KL inequality, we deduce a bound of the form $\|w - w^*\| \le c\,\sqrt{\gamma^* - \gamma(w)}$, where the constant $c$ depends only on $\gamma^*$ and on $R$, the maximum distance of the points from the origin. Consequently, for any optimization algorithm (gradient descent or not), the bias of its iterates converges at least as fast as the square root of the rate at which their margin converges. Thus, our work provides a generic tool for analyzing the implicit bias of any algorithm in terms of its margin, in situations where a specialized analysis might not be available: it suffices to establish a good rate of convergence for the margin, a task which is usually much easier.
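The following Python sketch is an illustration of the quantities in the abstract, not code from the paper: on a toy separable dataset (and assuming, for simplicity, hyperplanes through the origin), it computes the margin $\gamma(w)$ of candidate unit normals, a grid-search estimate of the maximum-margin direction $w^*$, and then prints the bias $\|w - w^*\|$ next to $\sqrt{\gamma^* - \gamma(w)}$ so the claimed square-root relationship can be eyeballed. All names (`margin`, `bias`, `R`) are illustrative choices, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2D data: two clouds of opposite color, separable by a hyperplane
# through the origin with normal close to [1, 0].
X_pos = rng.normal(loc=[+2.0, 0.0], scale=0.3, size=(50, 2))
X_neg = rng.normal(loc=[-2.0, 0.0], scale=0.3, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(50), -np.ones(50)])

def margin(w):
    """Margin of the homogeneous hyperplane with unit normal w:
    distance of the closest point to the hyperplane (negative if w does not separate)."""
    w = w / np.linalg.norm(w)
    return np.min(y * (X @ w))

R = np.max(np.linalg.norm(X, axis=1))  # max distance of the points from the origin

# Crude grid search over directions for the maximum-margin normal w*.
angles = np.linspace(0.0, 2 * np.pi, 10_000, endpoint=False)
candidates = np.stack([np.cos(angles), np.sin(angles)], axis=1)
margins = np.array([margin(w) for w in candidates])
w_star = candidates[np.argmax(margins)]
gamma_star = margins.max()

# For a sample of separating directions w, compare the bias ||w - w*||
# with sqrt(gamma* - gamma(w)); the abstract's bound says the former is
# at most a constant (depending on R and gamma*) times the latter.
for w in candidates[margins > 0][::500]:
    bias = np.linalg.norm(w - w_star)
    gap = gamma_star - margin(w)
    print(f"bias = {bias:.3f},  sqrt(margin gap) = {np.sqrt(gap):.3f}")
```

In this sketch any optimization algorithm could replace the grid search: the point of the abstract is that only the margins of its iterates need to be tracked to bound their bias.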