The NES Music Database: A multi-instrumental dataset with expressive performance attributes
Chris Donahue, H. H. Mao, Julian McAuley
12 June 2018

Papers citing "The NES Music Database: A multi-instrumental dataset with expressive performance attributes" (5 of 5 papers shown)

The GigaMIDI Dataset with Features for Expressive Music Performance Detection
Keon Ju M. Lee, J. Ens, Sara Adkins, Pedro Sarmento, M. Barthet, Philippe Pasquier
24 Feb 2025

Counterpoint by Convolution
Cheng-Zhi Anna Huang, Tim Cooijmans, Adam Roberts, Aaron Courville, Douglas Eck
18 Mar 2019

Variational Recurrent Auto-Encoders
Otto Fabius, Joost R. van Amersfoort
20 Dec 2014

Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
Junyoung Chung, Çağlar Gülçehre, Kyunghyun Cho, Yoshua Bengio
11 Dec 2014

Modeling Temporal Dependencies in High-Dimensional Sequences: Application to Polyphonic Music Generation and Transcription
Nicolas Boulanger-Lewandowski, Yoshua Bengio, Pascal Vincent
27 Jun 2012