Standalone Software for Extracting Music Transformer Representations

This repository includes the code for extracting music transformer representations from the musicautobot model, since this portion of our analysis is not a standard method used across the field. We provide code for extracting these representations from MIDI files; the representations are then used to train temporal receptive field (TRF) models that predict neural responses.

System Requirements

This software requires Python >= 3.6 and MuseScore.

It has been tested with Python 3.9 and MuseScore 3.
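
If MuseScore is not found automatically, it can usually be pointed to explicitly. The snippet below is a minimal sketch, assuming MuseScore is accessed through music21 (the score/MIDI toolkit used by musicautobot); the paths are examples only and will differ by system.

# Optional: point music21 at the MuseScore 3 executable (example paths only).
from music21 import environment

us = environment.UserSettings()
# If no user settings file exists yet, create one first with: us.create()
us['musicxmlPath'] = '/usr/bin/musescore3'
us['musescoreDirectPNGPath'] = '/usr/bin/musescore3'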

Installation Guide

The remaining Python package dependencies can be installed into a conda environment from the provided environment.yml file by running the following command:

conda env create -f environment.yml

Demo

We provide Demo_Midi_Representations.ipynb, a Jupyter notebook that first downloads the pretrained Musicautobot transformer model weights, then instantiates the model and extracts layer activations from the MIDI files.

The expected output is a set of saved files containing the layer representations used for TRF analysis.

The longest portion of the runtime is downloading the model, which may take about 20-30 minutes.
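
For reference, the sketch below illustrates the general pattern the notebook follows: capturing per-layer activations with PyTorch forward hooks and saving them for TRF analysis. It is an illustration under assumptions, not the notebook's exact code; the model object, the tokenized MIDI input, and the model.layers attribute stand in for musicautobot's actual classes and encoding.

import numpy as np
import torch

def extract_layer_activations(model, token_ids):
    # Capture the output of every transformer layer during one forward pass.
    activations, hooks = {}, []

    def make_hook(name):
        def hook(module, inputs, output):
            out = output[0] if isinstance(output, tuple) else output
            activations[name] = out.detach().cpu().numpy()
        return hook

    # 'model.layers' is an assumed attribute name for the stack of transformer layers.
    for i, layer in enumerate(model.layers):
        hooks.append(layer.register_forward_hook(make_hook(f'layer_{i}')))

    with torch.no_grad():
        model(token_ids)

    for h in hooks:
        h.remove()
    return activations

# Hypothetical usage: 'model' and 'token_ids' would come from the pretrained
# Musicautobot transformer and its MIDI tokenization, respectively.
# acts = extract_layer_activations(model, token_ids)
# np.savez('stimulus_representations.npz', **acts)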

Instructions for Use

First, download the MIDI stimulus files here: https://datadryad.org/stash/dataset/doi:10.5061/dryad.g1jwstqmh

Then run Demo_Midi_Representations.ipynb with jupyter-notebook to extract layer representations from the MIDI files.

Paper

Music transformer embeddings were used in the following paper:

Mischler, G., Li, Y. A., Bickel, S., Mehta, A. D., & Mesgarani, N. (2024).
The impact of musical expertise on disentangled and contextual neural
encoding of music revealed by generative music models. bioRxiv, 2024-12.
