User-centric Music Recommendations

16 May 2025
Jaime Ramirez Castillo, M. Julia Flores, Ann E. Nicholson
Abstract

This work presents a user-centric recommendation framework, designed as a pipeline with four distinct, connected, and customizable phases. These phases are intended to improve explainability and boost user engagement.

We have collected the historical Last.fm track playback records of a single user over approximately 15 years. The collected dataset includes more than 90,000 playbacks and approximately 14,000 unique tracks.

From the track playback records, we have created a dataset of user temporal contexts (each row is a specific moment when the user listened to certain music descriptors). As music descriptors, we have used community-contributed Last.fm tags and Spotify audio features. They represent the music that the user has been listening to over the years.

Next, given the most relevant Last.fm tags of a moment (e.g., the hour of the day), we predict the Spotify audio features that best fit the user's preferences at that particular moment. Finally, we use the predicted audio features to find tracks similar to those features. The final aim is to recommend (and discover) tracks that the user may feel like listening to at a particular moment.

For our initial case study, we have chosen to predict only a single audio feature target: danceability. The framework, however, allows more target variables to be included. The ability to learn the musical habits of a single user can be quite powerful, and this framework could be extended to other users.
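To make the phases concrete, below is a minimal, hypothetical Python sketch of the modeling steps: predicting an audio feature (danceability) from a temporal context, then retrieving catalog tracks close to the prediction. The toy data, column names, and the choice of a gradient-boosting regressor with nearest-neighbour retrieval are illustrative assumptions, not the paper's implementation.

# Hypothetical sketch of the pipeline described in the abstract.
# Toy data, column names, and model choices are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neighbors import NearestNeighbors

# Phases 1-2: temporal-context dataset (one row per listening moment).
# Here: hour of day plus two one-hot tag indicators; target = danceability.
contexts = pd.DataFrame({
    "hour":         [8, 8, 13, 22, 23],
    "tag_rock":     [1, 0, 0, 1, 0],
    "tag_ambient":  [0, 1, 1, 0, 1],
    "danceability": [0.55, 0.30, 0.35, 0.70, 0.25],
})

# Phase 3: learn to predict the audio feature that fits a given moment.
X = contexts.drop(columns="danceability")
y = contexts["danceability"]
model = GradientBoostingRegressor().fit(X, y)

# A new moment: 9 a.m. with the ambient tag active.
moment = pd.DataFrame({"hour": [9], "tag_rock": [0], "tag_ambient": [1]})
predicted = model.predict(moment)  # predicted danceability for this moment

# Phase 4: retrieve catalog tracks whose audio features are closest to the
# prediction (a single feature here; more targets would add columns).
catalog = pd.DataFrame({
    "track":        ["A", "B", "C", "D"],
    "danceability": [0.20, 0.33, 0.58, 0.80],
})
nn = NearestNeighbors(n_neighbors=2).fit(catalog[["danceability"]].values)
_, idx = nn.kneighbors(predicted.reshape(-1, 1))
print(catalog.iloc[idx[0]]["track"].tolist())  # recommended tracks

With more target variables, the same retrieval step would simply operate in a higher-dimensional audio-feature space, one column per predicted feature.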

@article{castillo2025_2505.11198,
  title={User-centric Music Recommendations},
  author={Jaime Ramirez Castillo and M. Julia Flores and Ann E. Nicholson},
  journal={arXiv preprint arXiv:2505.11198},
  year={2025}
}