Open In Colab

LOOPNET: MUSICAL LOOP SYNTHESIS CONDITIONED ON INTUITIVE MUSICAL PARAMETERS

Pritish Chandna, António Ramires, Xavier Serra, Emilia Gómez

Music Technology Group, Universitat Pompeu Fabra, Barcelona

This repository contains the source code for loop synthesis. Audio examples can be found on the project website. An interactive notebook can be found here.

Installation

To install, clone the repository and run
pip install -r requirements.txt 
to install the required packages. The pretrained model weights can be downloaded along with the validation HDF5 file. The paths to the unzipped model weights and to the validation file need to be set in the command-line arguments of the main function.
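
As a minimal sketch of the setup (the repository URL below is a placeholder, not taken from the source):

git clone <repository-url> loopnet
cd loopnet
pip install -r requirements.txt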

The code for creating the validation outputs and calculating the FAD (Fréchet Audio Distance) is in the evaluate.py file.


usage: evaluate.py [-h] [--model MODEL] [--log_dir LOG_DIR] [--val_file VAL_FILE]
                   [--output_dir OUTPUT_DIR]

optional arguments:
  -h, --help            show this help message and exit
  --model MODEL         Model to use; must be one of multi_env, multi, wavespec,
                        wav or spec
  --log_dir LOG_DIR     The directory where the models are saved
  --val_file VAL_FILE   Path to the file containing validation features
  --output_dir OUTPUT_DIR
                        Directory to save the outputs in
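
An example invocation (the model choice and paths below are illustrative placeholders, not taken from the source) might look like:

python evaluate.py --model multi --log_dir ./logs --val_file ./val_features.hdf5 --output_dir ./outputs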
  

Acknowledgments

This work is partially supported by the Towards Richer Online Music Public-domain Archives (TROMPA) European project (H2020 770376) and by the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 765068 (MIP-Frontiers). The TITAN X GPU used for this research was donated by the NVIDIA Corporation.