UPMT

Overview

UPMT runs in three steps: generate fine-tuning samples, fine-tune the model, and generate new samples by transferring Note On events.

See main.py for an example:

from model import PopMusicTransformer
import argparse
import tensorflow as tf
import os
import pickle
import numpy as np
from glob import glob
parser = argparse.ArgumentParser(description='')
parser.add_argument('--prompt_path', dest='prompt_path', default='./test/prompt/test_input.mid', help='path of the prompt MIDI file')
parser.add_argument('--output_path', dest='output_path', default='./test/output/test_generate.mid', help='path of the output MIDI file')
parser.add_argument('--favorite_path', dest='favorite_path', default='./test/favorite/test_favorite.mid', help='path of the favorite MIDI file')
parser.add_argument('--trainingdata_path', dest='trainingdata_path', default='./test/data/training.pickle', help='path of the fine-tuning training data')
parser.add_argument('--output_checkpoint_folder', dest='output_checkpoint_folder', default='./test/checkpoint/', help='folder for the fine-tuned checkpoints')
parser.add_argument('--alpha', default=0.1, help='weight of events')
parser.add_argument('--temperature', default=300, help='sampling temperature')
parser.add_argument('--topk', default=5, help='sampling top-k')
parser.add_argument('--smpi', default=[-2,-2,-1,-2,-2,2,2,5], help='signature music pattern interval')
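# Note on --smpi (an aside, not from the original code): the bracketed list above
# only takes effect as the default; argparse will not parse "[-2,-2,...]" passed on
# the command line. One way to make it settable from the CLI would be:
#   parser.add_argument('--smpi', type=int, nargs='+',
#                       default=[-2, -2, -1, -2, -2, 2, 2, 5],
#                       help='signature music pattern interval')
# which would then accept: --smpi -2 -2 -1 -2 -2 2 2 5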

parser.add_argument('--type', dest='type', default='generateno', help='generateno or pretrain or prepare')

args = parser.parse_args()


def main(_):
    # TensorFlow 1.x style session setup.
    tfconfig = tf.ConfigProto(allow_soft_placement=True)
    with tf.Session(config=tfconfig) as sess:
        if args.type == 'prepare':
            # Step 1: turn the favorite MIDI files into fine-tuning samples.
            midi_paths = glob('./test/favorite/*.mid')
            model = PopMusicTransformer(
                checkpoint='./test/model',
                is_training=False)
            model.prepare_data(midi_paths=midi_paths)
        elif args.type == 'generateno':
            # Step 3: generate a new piece by transferring Note On events.
            model = PopMusicTransformer(
                checkpoint='./test/model',
                is_training=False)
            model.generate_noteon(
                temperature=float(args.temperature),
                topk=int(args.topk),
                output_path=args.output_path,
                smpi=np.array(args.smpi),
                prompt=args.prompt_path)
        elif args.type == 'pretrain':
            # Step 2: fine-tune the pretrained model on the prepared samples.
            with open(args.trainingdata_path, 'rb') as f:
                training_data = pickle.load(f)
            if not os.path.exists(args.output_checkpoint_folder):
                os.mkdir(args.output_checkpoint_folder)
            model = PopMusicTransformer(
                checkpoint='./test/model',
                is_training=True)
            model.finetune(
                training_data=training_data,
                alpha=float(args.alpha),
                favoritepath=args.favorite_path,
                output_checkpoint_folder=args.output_checkpoint_folder)

if __name__ == '__main__':
    tf.app.run()
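
For reference, the three modes could be invoked as follows (a sketch assuming the pretrained checkpoint lives in ./test/model and the default ./test directory layout from the arguments above):

# 1. Turn the MIDI files in ./test/favorite into fine-tuning samples
python main.py --type prepare
# 2. Fine-tune the model on the prepared samples
python main.py --type pretrain
# 3. Generate a new MIDI file by transferring Note On events from the prompt
python main.py --type generateno --prompt_path ./test/prompt/test_input.mid --output_path ./test/output/test_generate.mid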

Thanks to https://github.com/YatingMusic/remi for the open-source code this project builds on.
