R-VAE: Live latent space drum rhythm generation from minimal-size datasets

Gabriel Vigliensoni, Louis McCallum, Esteban Maestre, and Rebecca Fiebrink. 2022. Journal of Creative Music Systems 1(1).

Abstract

In this article, we present R-VAE, a system designed for the modeling and exploration of latent spaces learned from rhythms encoded in MIDI clips. The system is based on a variational autoencoder neural network, uses a data structure capable of encoding rhythms in simple and compound meter, and can learn models from small amounts of training data. To facilitate the exploration of models, we implemented a visualizer that relies on the dynamic nature of the pulsing rhythmic patterns. To test the system in real-life musical practice, we collected small datasets of rhythms from contemporary music genres and trained models with them. We found that the non-linearities of the learned latent spaces, coupled with tactile interfaces for interacting with the models, were highly expressive and led to unexpected places in musical composition and live performance settings. A music album was recorded with the system and premiered at a major music festival, with the VAE latent space navigated live on stage.
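The "data structure capable of encoding rhythms in simple and compound meter" can be pictured as a grid whose per-beat resolution divides evenly by both two and three, so that sixteenth notes and triplets fall on exact grid points within the same representation. The following is a minimal, hypothetical sketch under that assumption; the constants, instrument list, and function names are illustrative and not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch: a drum pattern encoded on a fixed grid whose per-beat
# resolution is divisible by both 2 and 3, so onsets from simple meter
# (8th/16th notes) and compound meter (triplets) land on exact grid points.

TICKS_PER_BEAT = 24          # divisible by 2, 3, 4, 6, 8, 12 (assumed value)
BEATS_PER_BAR = 4            # assumed bar length for this illustration
DRUMS = ["kick", "snare", "closed_hat"]   # hypothetical instrument set

def encode_pattern(onsets_in_beats, n_bars=1):
    """Quantize per-instrument onset times (in beats) to a binary grid.

    onsets_in_beats: dict mapping instrument name -> list of onset positions,
    in beats from the start of the clip (e.g. 1.5 = the '&' of beat 2).
    Returns an array of shape (len(DRUMS), n_bars * BEATS_PER_BAR * TICKS_PER_BEAT).
    """
    n_ticks = n_bars * BEATS_PER_BAR * TICKS_PER_BEAT
    grid = np.zeros((len(DRUMS), n_ticks), dtype=np.float32)
    for row, drum in enumerate(DRUMS):
        for onset in onsets_in_beats.get(drum, []):
            tick = int(round(onset * TICKS_PER_BEAT)) % n_ticks
            grid[row, tick] = 1.0
    return grid

# A bar mixing simple and compound subdivisions: straight kick/snare plus
# hi-hat triplets on beat 3 (positions 2, 2 + 1/3, 2 + 2/3).
pattern = {
    "kick": [0.0, 2.0],
    "snare": [1.0, 3.0],
    "closed_hat": [0.0, 0.5, 1.0, 1.5, 2.0, 2 + 1/3, 2 + 2/3, 3.0, 3.5],
}
x = encode_pattern(pattern)
print(x.shape)   # (3, 96); flattened, this is the kind of vector a VAE could model
```

In this sketch, both the straight eighth-note hi-hats and the triplet hi-hats quantize without error because 24 ticks per beat is a common multiple of the binary and ternary subdivisions, which is the essential property a shared simple/compound encoding needs.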

Keywords

Rhythm, Meter, Latent space, Music visualization

How to Cite

Vigliensoni, G., McCallum, L., Maestre, E. & Fiebrink, R. (2022) “R-VAE: Live latent space drum rhythm generation from minimal-size datasets”, Journal of Creative Music Systems 1(1). doi: https://doi.org/10.5920/jcms.902