Symmetric Linear Bandits with Hidden Symmetry

22 May 2024
Nam-Phuong Tran
T. Ta
Debmalya Mandal
Long Tran-Thanh
Abstract

High-dimensional linear bandits with low-dimensional structure have received considerable attention in recent studies due to their practical significance. The most common structure in the literature is sparsity. However, it may not hold in practice. Symmetry, where the reward is invariant under certain groups of transformations on the set of arms, is another important inductive bias in the high-dimensional case that covers many standard structures, including sparsity. In this work, we study high-dimensional symmetric linear bandits where the symmetry is hidden from the learner, and the correct symmetry needs to be learned in an online setting. We examine the structure of a collection of hidden symmetries and provide a method based on model selection within the collection of low-dimensional subspaces. Our algorithm achieves a regret bound of $O(d_0^{1/3} T^{2/3} \log(d))$, where $d$ is the ambient dimension, which is potentially very large, and $d_0$ is the dimension of the true low-dimensional subspace such that $d_0 \ll d$. With an extra assumption on well-separated models, we can further improve the regret to $O(d_0 \sqrt{T \log(d)})$.
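
The idea sketched in the abstract, selecting a model from a collection of candidate low-dimensional subspaces and then acting within the chosen subspace, can be illustrated with a toy explore-then-commit procedure. The sketch below is an illustrative assumption, not the paper's algorithm: the axis-aligned coordinate-block candidates, the $T^{2/3}$ exploration length, and the least-squares scoring rule are stand-ins chosen only to echo the $O(d_0^{1/3} T^{2/3} \log(d))$ flavor of the bound.

```python
# Hedged toy sketch (not the authors' method): explore-then-commit with
# model selection over candidate low-dimensional subspaces for a linear
# bandit whose reward depends only on a hidden d0-dimensional subspace.
import numpy as np

rng = np.random.default_rng(0)

d, d0, T = 30, 3, 20_000
arms = rng.normal(size=(200, d))
arms /= np.linalg.norm(arms, axis=1, keepdims=True)

# Hidden truth: the reward depends only on the first d0 coordinates.
theta = np.zeros(d)
theta[:d0] = rng.normal(size=d0)

def pull(x):
    """Noisy linear reward for arm x."""
    return x @ theta + 0.1 * rng.normal()

# Candidate models: axis-aligned d0-dimensional coordinate blocks,
# a toy stand-in for a collection of symmetry-induced subspaces.
candidates = [np.arange(i, i + d0) for i in range(0, d - d0 + 1, d0)]

# Exploration phase of length ~ T^{2/3} (illustrative choice).
n_explore = int(T ** (2 / 3))
X, y = [], []
for _ in range(n_explore):
    x = arms[rng.integers(len(arms))]
    X.append(x)
    y.append(pull(x))
X, y = np.array(X), np.array(y)

def fit(idx):
    """Least-squares fit restricted to the coordinates in idx; returns (coef, mse)."""
    coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
    mse = np.mean((X[:, idx] @ coef - y) ** 2)
    return coef, mse

# Model selection: keep the candidate subspace with the best in-sample fit.
best_idx, (best_coef, _) = min(
    ((idx, fit(idx)) for idx in candidates), key=lambda t: t[1][1]
)

# Commit phase: play the greedy arm under the selected low-dimensional model.
values = arms[:, best_idx] @ best_coef
best_arm = arms[int(np.argmax(values))]
print("selected coordinates:", best_idx, "value of committed arm:", best_arm @ theta)
```

In this toy setup the selector typically recovers the coordinate block carrying the signal and the committed arm is near-optimal; the paper's guarantees concern general symmetry-induced subspaces and an online model-selection scheme, not the coordinate blocks assumed here.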
