We study the problem of reducing a task cost functional, defined over Sobolev-class signals, when the cost is invariant under a global symmetry group G and accessible only as a black box. Such scenarios arise in machine learning, imaging, and inverse problems, where cost metrics reflect model outputs or performance scores but are non-differentiable and model-internal. We propose a variational method that exploits the symmetry structure to construct explicit, symmetry-breaking deformations of the input signal. A gauge field, obtained by minimizing an auxiliary energy functional, induces a deformation that generically lies transverse to the G-orbit of the input. We prove that, under mild regularity assumptions, the cost strictly decreases along this direction, either via Clarke subdifferential descent or by escaping locally flat plateaus; the exceptional set of degeneracies has zero Gaussian measure. Our approach requires no access to model gradients or labels and operates entirely at test time. It provides a principled tool for optimizing invariant cost functionals via Lie-algebraic variational flows, with applications to black-box models and symmetry-constrained tasks.
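As a toy numerical illustration of the idea (not the paper's algorithm), consider a black-box cost on R^2 that is invariant under G = SO(2), so each G-orbit is a circle and the transverse direction at a point is radial. The sketch below takes finite-difference descent steps along that transverse direction; the function names, step sizes, and the particular cost are all illustrative assumptions.

```python
import numpy as np

def cost(x):
    """Black-box, rotation-invariant cost: depends only on |x|."""
    r = np.linalg.norm(x)
    return (r - 1.0) ** 2  # minimized on the unit circle (a single G-orbit)

def orbit_tangent(x):
    """Tangent to the SO(2)-orbit through x (infinitesimal rotation of x)."""
    return np.array([-x[1], x[0]])

def transverse_direction(x):
    """A unit direction transverse to the G-orbit at x.

    Project out the orbit-tangent component; for SO(2) this leaves the
    radial direction, along which the invariant cost can actually change.
    """
    t = orbit_tangent(x)
    t /= np.linalg.norm(t)
    d = x / np.linalg.norm(x)
    d = d - (d @ t) * t  # already orthogonal here; kept for generality
    return d / np.linalg.norm(d)

x = np.array([2.0, 0.5])
eps, eta = 1e-4, 0.05
for _ in range(200):
    d = transverse_direction(x)
    # Central finite difference of the black-box cost along the
    # transverse direction (no gradients of the model are needed).
    g = (cost(x + eps * d) - cost(x - eps * d)) / (2 * eps)
    x = x - eta * g * d

print(cost(x))  # drives the cost toward its minimum on the unit circle
```

Moving along the orbit tangent would leave the cost unchanged by invariance; only the transverse component can produce descent, which is the geometric point the abstract makes.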
@article{osipov2025_2505.13578,
  title={Symmetry-Breaking Descent for Invariant Cost Functionals},
  author={Mikhail Osipov},
  journal={arXiv preprint arXiv:2505.13578},
  year={2025}
}