Open-source implementation of Google Vizier for hyperparameter tuning

Overview

Advisor

Introduction

Advisor is a hyperparameter tuning system for black-box optimization.

It is an open-source implementation of Google Vizier with these features:

  • Easy to use with the API, SDK, web UI and CLI
  • Supports the Study and Trial abstractions
  • Includes search and early-stopping algorithms
  • Recommends parameters with the trained model
  • Same programming interfaces as Google Vizier
  • Command-line tool just like Microsoft NNI

Supported Algorithms

  • Grid Search
  • Random Search
  • Bayesian Optimization
  • TPE (Hyperopt)
  • Random Search (Hyperopt)
  • Simulated Annealing (Hyperopt)
  • Quasi-Random (Chocolate)
  • Grid Search (Chocolate)
  • Random Search (Chocolate)
  • Bayes (Chocolate)
  • CMA-ES (Chocolate)
  • MO-CMA-ES (Chocolate)
  • SMAC Algorithm
  • Bayesian Optimization (Skopt)
  • Early Stop First Trial Algorithm
  • Early Stop Descending Algorithm
  • Performance Curve Stop Algorithm

Quick Start

It is easy to set up the Advisor service on a local machine.

pip install advisor

advisor_admin server start

Then go to http://127.0.0.1:8000 in the browser and submit tuning jobs. To run the example tuning job from the command line, clone the repository and use the command-line tool.

git clone --depth 1 https://github.com/tobegit3hub/advisor.git && cd ./advisor/

advisor run -f ./advisor_client/examples/python_function/config.json

advisor study describe -s demo

Advisor Server

Run the server with the official package.

advisor_admin server start

Or run with the official Docker image.

docker run -d -p 8000:8000 tobegit3hub/advisor

Or run with docker-compose.

wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/docker-compose.yml

docker-compose up -d

Or run in a Kubernetes cluster.

wget https://raw.githubusercontent.com/tobegit3hub/advisor/master/kubernetes_advisor.yaml

kubectl create -f ./kubernetes_advisor.yaml

Or run from scratch with the source code.

git clone --depth 1 https://github.com/tobegit3hub/advisor.git && cd ./advisor/

pip install -r ./requirements.txt

./manage.py migrate

./manage.py runserver 0.0.0.0:8000

Advisor Client

Install with pip or use the Docker container.

pip install advisor

docker run -it --net=host tobegit3hub/advisor bash

Use the command-line tool.

export ADVISOR_ENDPOINT="http://127.0.0.1:8000"

advisor study list

advisor study describe -s "demo"

advisor trial list --study_name "demo"

Use the admin tool to start/stop the server.

advisor_admin server start

advisor_admin server stop

Use the Python SDK.

# Assumed import path for the client SDK
from advisor_client.client import AdvisorClient

client = AdvisorClient()

# Create the study
study_configuration = {
        "goal": "MAXIMIZE",
        "params": [
                {
                        "parameterName": "hidden1",
                        "type": "INTEGER",
                        "minValue": 40,
                        "maxValue": 400,
                        "scalingType": "LINEAR"
                }
        ]
}
study = client.create_study("demo", study_configuration)

# Get suggested trials
trials = client.get_suggestions(study, 3)

# Complete the trial
trial = trials[0]
trial_metrics = 1.0
client.complete_trial(trial, trial_metrics)

Please check out the examples for more usage.
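
For a rough end-to-end picture, here is a hedged sketch of a suggest-evaluate-complete loop built only from the calls shown above (create_study, get_suggestions, complete_trial). The import path and the trial attribute used to read the suggested parameters (parameter_values, parsed as JSON) are assumptions for illustration; consult the client examples for the exact API.

import json

from advisor_client.client import AdvisorClient  # assumed import path

client = AdvisorClient()

study_configuration = {
    "goal": "MINIMIZE",
    "params": [
        {
            "parameterName": "x",
            "type": "DOUBLE",
            "minValue": -10.0,
            "maxValue": 10.0,
            "scalingType": "LINEAR"
        }
    ]
}
study = client.create_study("sdk_loop_demo", study_configuration)

def objective(params):
    # Toy objective with its minimum at x = 1.5
    x = params["x"]
    return (x - 1.5) ** 2

for _ in range(10):
    for trial in client.get_suggestions(study, 1):
        # Assumption: parameter_values holds the suggested parameters as a JSON string
        params = json.loads(trial.parameter_values)
        client.complete_trial(trial, objective(params))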

Configuration

The study configuration describes the search space of the parameters. It supports four parameter types; here is an example.

{
  "goal": "MAXIMIZE",
  "randomInitTrials": 1,
  "maxTrials": 5,
  "maxParallelTrials": 1,
  "params": [
    {
      "parameterName": "hidden1",
      "type": "INTEGER",
      "minValue": 1,
      "maxValue": 10,
      "scalingType": "LINEAR"
    },
    {
      "parameterName": "learning_rate",
      "type": "DOUBLE",
      "minValue": 0.01,
      "maxValue": 0.5,
      "scalingType": "LINEAR"
    },
    {
      "parameterName": "hidden2",
      "type": "DISCRETE",
      "feasiblePoints": "8, 16, 32, 64",
      "scalingType": "LINEAR"
    },
    {
      "parameterName": "optimizer",
      "type": "CATEGORICAL",
      "feasiblePoints": "sgd, adagrad, adam, ftrl",
      "scalingType": "LINEAR"
    },
    {
      "parameterName": "batch_normalization",
      "type": "CATEGORICAL",
      "feasiblePoints": "true, false",
      "scalingType": "LINEAR"
    }
  ]
}
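
To make the four parameter types concrete, here is a minimal sketch of how a random sampler could interpret such a configuration. This is an illustration only, not the server's suggestion code: INTEGER and DOUBLE use minValue/maxValue, while DISCRETE and CATEGORICAL pick from the comma-separated feasiblePoints.

import random

def sample_parameter(param):
    # Interpret one parameter definition from the study configuration.
    if param["type"] == "INTEGER":
        return random.randint(param["minValue"], param["maxValue"])
    if param["type"] == "DOUBLE":
        return random.uniform(param["minValue"], param["maxValue"])
    if param["type"] == "DISCRETE":
        return random.choice([float(p) for p in param["feasiblePoints"].split(",")])
    if param["type"] == "CATEGORICAL":
        return random.choice([p.strip() for p in param["feasiblePoints"].split(",")])
    raise ValueError("Unknown parameter type: {}".format(param["type"]))

def sample_trial(study_configuration):
    # Map each parameterName to one sampled value.
    return {p["parameterName"]: sample_parameter(p) for p in study_configuration["params"]}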

Here is the configuration file in JSON format for advisor run.

{
  "name": "demo",
  "algorithm": "BayesianOptimization",
  "trialNumber": 10,
  "concurrency": 1,
  "path": "./advisor_client/examples/python_function/",
  "command": "./min_function.py",
  "search_space": {
      "goal": "MINIMIZE",
      "randomInitTrials": 3,
      "params": [
          {
              "parameterName": "x",
              "type": "DOUBLE",
              "minValue": -10.0,
              "maxValue": 10.0,
              "scalingType": "LINEAR"
          }
      ]
  }
}

Or use the equivalent configuration file in YAML format.

name: "demo"
algorithm: "BayesianOptimization"
trialNumber: 10
path: "./advisor_client/examples/python_function/"
command: "./min_function.py"
search_space:
  goal: "MINIMIZE"
  randomInitTrials: 3
  params:
    - parameterName: "x"
      type: "DOUBLE"
      minValue: -10.0
      maxValue: 10.0
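
The command field points to an executable objective script, such as min_function.py in the examples. The exact convention for passing parameters and reporting the metric is defined by the repository's examples, not here; purely as a hypothetical sketch, such a script could take the parameter as a command-line flag and print the metric.

#!/usr/bin/env python
# Hypothetical objective script for illustration only; the flag name and the
# print-the-metric convention are assumptions, not Advisor's documented contract.
import argparse

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--x", type=float, required=True)
    args = parser.parse_args()
    # Toy objective with a minimum near x = 1.5
    print(args.x * args.x - 3 * args.x + 2)

if __name__ == "__main__":
    main()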

Screenshots

List all the studies and create/delete the studies easily.

study_list.png

List the details of a study and all the related trials.

study_detail.png

List all the trials and create/delete the trials easily.

trial_list.png

List the details of a trial and all the related metrics.

trial_detail.png

Development

You can edit the source code and test without re-deploying the server and client.

git clone [email protected]:tobegit3hub/advisor.git

cd ./advisor/advisor_client/

python ./setup.py develop

export PYTHONPATH="/Library/Python/2.7/site-packages/:$PYTHONPATH"

Comments
  • INTEGER out of bound

    I have a parameter defined like:

            {
                "parameterName": "l1024",
                "type": "INTEGER",
                "minValue": 0,
                "maxValue": 2,
                "feasiblePoints": "",
                "scallingType": "LINEAR"
            },
    

    but I got suggestion l1024 = 16

    image

    opened by Earthson 11
  • Support Python 3 and run advisor with PyCharm

    /usr/bin/python3.5 /home/hadoop/PycharmProjects/advisor/advisor_server/manage.py runserver 0.0.0.0:8000
    Performing system checks...

    Unhandled exception in thread started by <function check_errors..wrapper at 0x7f1339688840>
    Traceback (most recent call last):
      File "/usr/local/lib/python3.5/dist-packages/django/utils/autoreload.py", line 225, in wrapper
        fn(*args, **kwargs)
      File "/usr/local/lib/python3.5/dist-packages/django/core/management/commands/runserver.py", line 121, in inner_run
        self.check(display_num_errors=True)
      File "/usr/local/lib/python3.5/dist-packages/django/core/management/base.py", line 364, in check
        include_deployment_checks=include_deployment_checks,
      File "/usr/local/lib/python3.5/dist-packages/django/core/management/base.py", line 351, in _run_checks
        return checks.run_checks(**kwargs)
      File "/usr/local/lib/python3.5/dist-packages/django/core/checks/registry.py", line 73, in run_checks
        new_errors = check(app_configs=app_configs)
      File "/usr/local/lib/python3.5/dist-packages/django/core/checks/urls.py", line 40, in check_url_namespaces_unique
        all_namespaces = _load_all_namespaces(resolver)
      File "/usr/local/lib/python3.5/dist-packages/django/core/checks/urls.py", line 57, in _load_all_namespaces
        url_patterns = getattr(resolver, 'url_patterns', [])
      File "/usr/local/lib/python3.5/dist-packages/django/utils/functional.py", line 36, in get
        res = instance.dict[self.name] = self.func(instance)
      File "/usr/local/lib/python3.5/dist-packages/django/urls/resolvers.py", line 536, in url_patterns
        patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
      File "/usr/local/lib/python3.5/dist-packages/django/utils/functional.py", line 36, in get
        res = instance.dict[self.name] = self.func(instance)
      File "/usr/local/lib/python3.5/dist-packages/django/urls/resolvers.py", line 529, in urlconf_module
        return import_module(self.urlconf_name)
      File "/usr/lib/python3.5/importlib/init.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "", line 986, in _gcd_import
      File "", line 969, in _find_and_load
      File "", line 958, in _find_and_load_unlocked
      File "", line 673, in _load_unlocked
      File "", line 665, in exec_module
      File "", line 222, in _call_with_frames_removed
      File "/home/hadoop/PycharmProjects/advisor/advisor_server/advisor/urls.py", line 29, in
        url(r'^suggestion/', include('suggestion.urls')),
      File "/usr/local/lib/python3.5/dist-packages/django/urls/conf.py", line 34, in include
        urlconf_module = import_module(urlconf_module)
      File "/usr/lib/python3.5/importlib/init.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "", line 986, in _gcd_import
      File "", line 969, in _find_and_load
      File "", line 958, in _find_and_load_unlocked
      File "", line 673, in _load_unlocked
      File "", line 665, in exec_module
      File "", line 222, in _call_with_frames_removed
      File "/home/hadoop/PycharmProjects/advisor/advisor_server/suggestion/urls.py", line 2, in
        from . import views
      File "/home/hadoop/PycharmProjects/advisor/advisor_server/suggestion/views.py", line 16, in
        from suggestion.algorithm.random_search import RandomSearchAlgorithm
      File "/home/hadoop/PycharmProjects/advisor/advisor_server/suggestion/algorithm/random_search.py", line 6, in
        from base_algorithm import BaseSuggestionAlgorithm
    ImportError: No module named 'base_algorithm'

    opened by codlife 7
  • Study is not done if it doesn't have trials

    If the study doesn't have any trials, the method fails since trial.studyname in line 107 is not initialized. This change defines that a study without trials is not completed.
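
    A minimal sketch of the kind of guard this implies (hypothetical names, not the actual patch): a study with no trials should never be reported as completed.

    def is_study_done(study, trials):
        # Hypothetical helper: no trials means the study is not completed yet.
        if not trials:
            return False
        return all(trial.status == "Completed" for trial in trials)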

    bug 
    opened by andresmore 3
  • Cannot minimize a simple function

    I did a simple test to minimize x^2-3x+2 using BayesianOpt.

    It seems incapable of minimizing; it always maximizes the function. I can maximize the inverse of the function.

    examples.zip

    bug 
    opened by michellemay 3
  • storage backend

    Hi, does the advisor server have any way to persist trials to a storage back-end?

    If such an option isn't available atm could you point me to the source code best suited to extend with such capabilities?

    opened by ibayer 2
  • Wrong order for bounds in Python 2.7

    Hi, when using Advisor to get suggestions, I got a wrong trial whose parameter was out of the range I set. The study conf is shown below:

    {
      maxTrails: 20,
      randomInitTrail: 5,
      params: [
        {
          scallingType: "LINEAR",
          type: "INTEGER",
          maxValue: 20,
          minValue: 5,
          parameterName: "max_depth",
          feasiblePoints: ""
        },
        {
          scallingType: "LINEAR",
          type: "DOUBLE",
          maxValue: 0.2,
          minValue: 0,
          parameterName: "eta",
          feasiblePoints: ""
        }
      ],
      goal: "MAXIMIZE"
    }

    and i get a trial like this

    {"eta": 19.999988873780502, "max_depth": 0}

    Then I found line 153 of bayesian_optimization.py: bound_dict = {}. In Python 2.7 a plain dict is not iterated in insertion order, so the bounds array constructed at lines 182-184 may end up in the wrong order:

    for key in bound_dict.keys():
      bounds.append(bound_dict[key])
    bounds = np.asarray(bounds)
    

    I think it's better to use bound_dict = OrderedDict() to keep the insertion order in bound_dict and get the right suggestion.
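
    A minimal sketch of the suggested fix, assuming the bounds are built by iterating over bound_dict (this only illustrates the ordering behavior and is not the repository's actual bayesian_optimization.py code):

    from collections import OrderedDict

    import numpy as np

    # An OrderedDict iterates in insertion order even on Python 2.7, so the
    # rows of the bounds array line up with the parameters as inserted.
    bound_dict = OrderedDict()
    bound_dict["max_depth"] = (5, 20)
    bound_dict["eta"] = (0.0, 0.2)

    bounds = []
    for key in bound_dict.keys():
      bounds.append(bound_dict[key])
    bounds = np.asarray(bounds)
    print(bounds)  # rows follow insertion order: max_depth first, then eta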

    bug 
    opened by gaomochi 2
  • Sign in error

    I followed your instructions to open http://127.0.0.1:8000 in the browser and sign in with my GitHub account, but it shows a 404 page-not-found error. When I use the command "pip install advisor_clients", it shows "Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 503 Forwarding failure',))': /simple/advisor-clients/"

    Thank you!!

    opened by lk1983823 2
  • Update urls.py

    While running the command "./manage.py migrate" to run Advisor from scratch on macOS, I got the following error: "AttributeError: module Django.contrib.auth.views has no attribute". Therefore I made some changes in the script, and now the server runs fine on the local system.

    opened by amansingh1608 1
  • feat: Avoid redundant trials

    The new_trials look like this, so we should not always use the first item.

    [
        {'state': 0, 'tid': 2, 'spec': None, 'result': {'status': 'new'
            }, 'misc': {'tid': 2, 'cmd': ('domain_attachment', 'FMinIter_Domain'), 'workdir': None, 'idxs': {'param-1': [
                        2
                    ], 'param-2': [
                        2
                    ], 'param-3': [
                        2
                    ], 'param-4': [
                        2
                    ]
                }, 'vals': {'param-1': [
                        2
                    ], 'param-2': [
                        0
                    ], 'param-3': [
                        2
                    ], 'param-4': [
                        1.496156082719898
                    ]
                }
            }, 'exp_key': None, 'owner': None, 'version': 0, 'book_time': None, 'refresh_time': None
        },
        {'state': 0, 'tid': 3, 'spec': None, 'result': {'status': 'new'
            }, 'misc': {'tid': 3, 'cmd': ('domain_attachment', 'FMinIter_Domain'), 'workdir': None, 'idxs': {'param-1': [
                        3
                    ], 'param-2': [
                        3
                    ], 'param-3': [
                        3
                    ], 'param-4': [
                        3
                    ]
                }, 'vals': {'param-1': [
                        0
                    ], 'param-2': [
                        1
                    ], 'param-3': [
                        0
                    ], 'param-4': [
                        2.8898132037592097
                    ]
                }
            }, 'exp_key': None, 'owner': None, 'version': 0, 'book_time': None, 'refresh_time': None
        }
    ]
    

    Signed-off-by: Ce Gao [email protected]

    opened by gaocegege 1
  • pip install error.

    Hi, there.

    Is anything wrong with the pip installation? I tried different pip mirror sources, with and without conda, on Python 2 and Python 3, but I got a similar error and it installs a package named unknown. Has anyone seen the same?

    (master)⚡ [130] % pip install advisor --user
    Collecting advisor
      Downloading https://files.pythonhosted.org/packages/c1/fd/5cf49e6fbf34e3879ac12561e258827a5aa422469fca6faf3a17aaf3e7a9/advisor-0.1.6.tar.gz
      Running setup.py (path:/tmp/pip-install-v6LivQ/advisor/setup.py) egg_info for package advisor produced metadata for project name unknown. Fix your #egg=advisor fragments.
    Building wheels for collected packages: unknown, unknown
      Running setup.py bdist_wheel for unknown ... done
      Stored in directory: /home/myname/.cache/pip/wheels/97/10/41/fd24dfce3d66fe01c1a4ee2fca1e33c9dc5e68eef0d6177d88
      Running setup.py bdist_wheel for unknown ... error
      Complete output from command /home/tools/anaconda2/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-v6LivQ/unknown/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-JAGH_G --python-tag cp27:
      Traceback (most recent call last):
        File "<string>", line 1, in <module>
      IOError: [Errno 2] No such file or directory: '/tmp/pip-install-v6LivQ/unknown/setup.py'
      
      ----------------------------------------
      Failed building wheel for unknown
      Running setup.py clean for unknown
      Complete output from command /home/tools/anaconda2/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-v6LivQ/unknown/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" clean --all:
      Traceback (most recent call last):
        File "<string>", line 1, in <module>
      IOError: [Errno 2] No such file or directory: '/tmp/pip-install-v6LivQ/unknown/setup.py'
      
      ----------------------------------------
      Failed cleaning build dir for unknown
    Successfully built unknown
    Failed to build unknown
    Installing collected packages: unknown
    Successfully installed unknown-0.0.0
    You are using pip version 10.0.1, however version 19.0.3 is available.
    You should consider upgrading via the 'pip install --upgrade pip' command.
    
    opened by ideaRunner 1
  • choose bytes or str based on python 2 or 3

    The code failed to run in my Python 3.6 environment because bytes were used where Python 3 expects str, so this change chooses bytes or str based on Python 2 or 3. Now I can run the Advisor server through Docker and use a Python 3 Advisor client to communicate with it; in other words, I can use Python 3.6 to run "advisor run -f ./advisor_client/examples/python_function/config.json".
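
    A generic sketch of the kind of compatibility shim this refers to (not the actual patch): decode bytes to str where Python 3 expects text, and leave str untouched so Python 2 keeps working.

    def ensure_text(value):
        # On Python 3, payloads may arrive as bytes where str is expected.
        if isinstance(value, bytes):
            return value.decode("utf-8")
        return value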

    opened by colinzuo 1
  • PiP install error

    If I use pip install advisor, it warns ERROR: More than one .egg-info directory found in C:\Users\xxx\AppData\Local\Temp\pip-pip-egg-info-y3dtu1ju. @tobegit3hub #45

    opened by william0620 0
  • ERROR: More than one .egg-info directory found in /tmp/pip-pip-egg-info-<somehash>

    Installing advisor through pip install advisor fails with the following error:

    ERROR: More than one .egg-info directory found in /tmp/pip-pip-egg-info-<somehash>

    As @phdru pointed out on SO, there are two setup() calls in setup.py

    setup.py:22 setup.py:29

    opened by R2D2oid 1
  • Walk through request

    Newb here. A quick walkthrough of something like the Chocolate Chip cookie recipe optimization, INCLUDING INSTALL OF everything, would be really really really useful. This has been really painful to try to figure out how to get this going, and I still don't feel like I'm even that close.

    I attended a talk on this about 2 years ago at Google X and was told this "should" be easy enough for even a dumb mechanical engineer.

    opened by divenpuke 2
  • Installation of advisor with command "pip3 install advisor" failed

    Installation of advisor with the command "pip3 install advisor" failed, but with the command "pip2 install advisor" it succeeded. Is Advisor currently only compatible with Python 2 and not Python 3?

    opened by duni123 4
  • advisor_admin server start not work(python version)

    When I run advisor_admin server start, it raises a SyntaxError (invalid syntax) in site-packages/advisor-0.1.6-py3.6.egg/advisor_client/commandline/admin_command.py, caused by the Python 2-only "except subprocess.CalledProcessError, e:" syntax:

    def is_server_running():
      command = "docker ps |grep 'tobegit3hub/advisor'"
      print("Run the command: {}".format(command))

      try:
        command_output = subprocess.check_output(command, shell=True)
        if command_output != "":
          return True
      except subprocess.CalledProcessError, e:
        if e.output == "":
          pass
        else:
          print("Get error: {}".format(e.output))
          return False

    opened by fairleehu 7
  • how to manage security and users

    Hello, we set up the Advisor server on Linux, but how can we manage security? Anyone can delete the studies and trials, and there are no instructions on how to create users or what the default admin credentials are. Please let us know how to proceed with users and security. Thanks.

    opened by sachinbanugariya 1
Owner
tobe
Work in @Xiaomi, @UnitedStack and @4Paradigm for Storage(HBase), IaaS(OpenStack, Kubernetes), Big data(Spark, Flink) and Machine Learning(TensorFlow).