
Label Attention Network for sequential multi-label classification: you were looking at a wrong self-attention

European Conference on Artificial Intelligence (ECAI), 2023
1 March 2023
Elizaveta Kovtun, Evgenia Romanenkova, Artem Zabolotnyi, Evgeny Burnaev, Martin Spindler, Alexey Zaytsev
arXiv:2303.00280
Abstract

Most available user information can be represented as a sequence of timestamped events. Each event is assigned a set of categorical labels whose future structure is of great interest: for instance, we may want to predict the group of items in a customer's next purchase or a client's transactions tomorrow. This is a multi-label classification problem for sequential data. Modern approaches rely on the transformer architecture, introducing self-attention over the elements of a sequence. Such models account for interactions between events over time but lose information on label inter-dependencies. Motivated by this shortcoming, we propose applying a self-attention mechanism over the labels preceding the predicted step. As our approach is a Label-Attention NETwork, we call it LANET. Experimental evidence suggests that LANET outperforms established models and captures interconnections between labels well. For example, the micro-AUC of our approach is 0.9536, compared to 0.7501 for a vanilla transformer. We provide an implementation of LANET to facilitate its wider usage.
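The core idea in the abstract is to apply self-attention across the set of labels observed before the predicted step, rather than across time steps, so that the model captures label inter-dependencies. A minimal numpy sketch of that label-wise scaled dot-product self-attention is below. The function name, the random projection matrices, and the toy dimensions are all illustrative assumptions, not LANET's actual trained architecture (which is described in the paper and its released implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def label_self_attention(label_emb):
    """Scaled dot-product self-attention over label embeddings.

    label_emb: (num_labels, d) array, one row per label observed
    before the predicted step. Returns contextualized embeddings of
    the same shape, plus the (num_labels, num_labels) attention map,
    so each label attends to every other label -- modeling label
    inter-dependencies instead of interactions between time steps.
    """
    L, d = label_emb.shape
    # Hypothetical projections; in a trained model these are learned.
    rng = np.random.default_rng(0)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = label_emb @ Wq, label_emb @ Wk, label_emb @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d))  # (L, L) label-to-label weights
    return attn @ V, attn

# Toy example: 5 labels with 8-dimensional embeddings.
ctx, attn = label_self_attention(np.random.default_rng(1).standard_normal((5, 8)))
print(ctx.shape, attn.shape)  # (5, 8) (5, 5)
```

Each row of `attn` sums to 1 and says how strongly one label conditions on the others; a vanilla sequence transformer computes the analogous weights between time steps instead.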
