## News

**November 19**: Git repo is now public.
## Documentation

## Google Colab Examples
See the examples folder for notebooks you can download or run on Google Colab.
## Overview
This library consists of 11 modules:
| Module | Description |
| --- | --- |
| Adapters | Wrappers for training and inference steps |
| Containers | Dictionaries for simplifying object creation |
| Datasets | Commonly used datasets and tools for domain adaptation |
| Frameworks | Wrappers for training/testing pipelines |
| Hooks | Modular building blocks for domain adaptation algorithms |
| Layers | Loss functions and helper layers |
| Meta Validators | Post-processing of metrics, for hyperparameter optimization |
| Models | Architectures used for benchmarking and in examples |
| Utils | Various tools |
| Validators | Metrics for determining and estimating accuracy |
| Weighters | Functions for weighting losses |
## How to...

### Use in vanilla PyTorch
```python
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
```
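For context, here is a minimal sketch of how the objects assumed above could be created. The architectures, learning rate, and model names are placeholders, not requirements of the library:

```python
import torch

# Hypothetical setup: DANN needs a generator/feature extractor G,
# a classifier C, and a domain discriminator D.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
G = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 128)).to(device)
C = torch.nn.Linear(128, 10).to(device)
D = torch.nn.Sequential(torch.nn.Linear(128, 1), torch.nn.Flatten(start_dim=0)).to(device)

models = {"G": G, "C": C, "D": D}
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
```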
### Build complex algorithms
Let's customize `DANNHook` with:
- virtual adversarial training
- entropy conditioning
```python
import torch
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook, EntropyReducer, MeanReducer, VATHook
from pytorch_adapt.utils.common_functions import batch_to_device

# G and C are the Generator and Classifier models
misc = {"combined_model": torch.nn.Sequential(G, C)}
reducer = EntropyReducer(
    apply_to=["src_domain_loss", "target_domain_loss"], default_reducer=MeanReducer()
)
hook = DANNHook(optimizers, reducer=reducer, post_g=[VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})
```
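Roughly speaking, `EntropyReducer` implements entropy conditioning: rather than simply averaging the source and target domain losses, it weights each sample's loss based on the entropy of the classifier's predictions, so confidently classified samples have more influence on domain alignment. `VATHook`, passed via `post_g`, adds a virtual adversarial training loss after the generator's losses are computed.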
### Wrap with your favorite PyTorch framework
For additional functionality, adapters can be wrapped with a framework (currently just PyTorch Ignite).
```python
import torch

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers
from pytorch_adapt.datasets import DataloaderCreator
from pytorch_adapt.frameworks.ignite import Ignite

# Assume G, C, and D are existing models
models = {"G": G, "C": C, "D": D}
models_cont = Models(models)

# Override the default optimizer for G and C
optimizers_cont = Optimizers((torch.optim.Adam, {"lr": 0.123}), keys=["G", "C"])
adapter = DANN(models=models_cont, optimizers=optimizers_cont)

dc = DataloaderCreator(num_workers=2)
trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)
```
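The `datasets` argument above is a dict of source/target splits. As a sketch, one way to build it is with the library's dataset getters (the folder path and download flag here are illustrative):

```python
from pytorch_adapt.datasets import get_mnist_mnistm

# Downloads MNIST and MNIST-M, and returns a dict of datasets
# in the format expected by DataloaderCreator and trainer.run.
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], folder=".", download=True)
```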
Wrappers for other frameworks (e.g. PyTorch Lightning and Catalyst) are planned.
### Check your model's performance
You can do this in vanilla PyTorch:
```python
from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator.score(epoch=1, target_train=target_train)
```
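As a sketch of how `preds` might be collected, assuming batches are dicts containing a `target_imgs` key (as produced by the library's dataset wrappers) and that `G` and `C` are the trained models:

```python
import torch

# Hypothetical loop that collects softmax predictions on the target train split.
preds = []
with torch.no_grad():
    for data in target_train_loader:
        imgs = data["target_imgs"].to(device)
        preds.append(torch.softmax(C(G(imgs)), dim=1))
preds = torch.cat(preds, dim=0)
```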
You can also do this using a framework wrapper:
```python
validator = SNDValidator()
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)
```
### Run the above examples
See this notebook and the examples page for other notebooks.
## Installation

### Pip

```
pip install pytorch-adapt
```
To get the latest dev version:

```
pip install pytorch-adapt --pre
```
To use `pytorch_adapt.frameworks.ignite`:

```
pip install pytorch-adapt[ignite]
```
### Conda
Coming soon...
### Dependencies
Required dependencies:
- numpy
- torch >= 1.6
- torchvision
- torchmetrics
- pytorch-metric-learning >= 1.0.0.dev5
## Acknowledgements

### Contributors
Pull requests are welcome!
### Advisors
Thank you to Ser-Nam Lim, and my research advisor, Professor Serge Belongie.
### Logo
Thanks to Jeff Musgrave for designing the logo.