Deep Learning with PyTorch made easy
## Carefree?

`carefree-learn` aims to provide CAREFREE usages for both users and developers. It also provides a corresponding repo for production.
### User Side

#### 📈 Machine Learning

```python
import cflearn
import numpy as np

x = np.random.random([1000, 10])
y = np.random.random([1000, 1])
m = cflearn.api.fit_ml(x, y, carefree=True)
```
#### 🖼️ Computer Vision

```python
import cflearn

data = cflearn.cv.MNISTData(batch_size=16, transform="to_tensor")
m = cflearn.api.resnet18_gray(10).fit(data)
```
### Developer Side
This is a WIP section :D
### Production Side
`carefree-learn` could be deployed easily because:

- It can be exported to `onnx` format with one line of code (`m.to_onnx(...)`); see the sketch after this list.
- A native repo called `carefree-learn-deploy` can do the rest of the job, using `FastAPI`, `uvicorn` and `docker` as its backend.
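Here is a minimal sketch of the export step, reusing the `fit_ml` example from the User Side section. The argument passed to `to_onnx` (an export folder name) is an assumption for illustration; please check the Quick Start guide for the exact signature.

```python
import cflearn
import numpy as np

# train a simple ML model (same as the User Side example)
x = np.random.random([1000, 10])
y = np.random.random([1000, 1])
m = cflearn.api.fit_ml(x, y, carefree=True)

# export to onnx with one line of code;
# "onnx_export" is a hypothetical output folder used for illustration
m.to_onnx("onnx_export")
```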
Please refer to Quick Start and Developer Guides for detailed information.
## Why carefree-learn?
`carefree-learn` is a general Deep Learning framework based on PyTorch. Since `v0.2.x`, `carefree-learn` has extended its usage from tabular datasets to (almost) all kinds of datasets. In the meantime, the APIs remain (almost) the same as in `v0.1.x`: still simple, powerful and easy to use!
Here are some main advantages that `carefree-learn` holds:
### 📈 Machine Learning

- Provides a scikit-learn-like interface with much more 'carefree' usages, including:
  - Automatically deals with data pre-processing.
  - Automatically handles datasets saved in files (.txt, .csv); a minimal sketch is given at the end of this subsection.
  - Supports Distributed Training, which means hyper-parameter tuning can be very efficient in `carefree-learn`.
- Includes some brand new techniques which may boost vanilla Neural Network (NN) performance on tabular datasets, including:
  - `TreeDNN` with `Dynamic Soft Pruning`, which makes NN less sensitive to hyper-parameters.
  - `Deep Distribution Regression (DDR)`, which is capable of modeling the entire conditional distribution with one single NN model.
- Supports many convenient features in deep learning, including:
  - Early stopping.
  - Model persistence.
  - Learning rate schedulers.
  - And more...
- Fully utilizes the WIP ecosystem `cf*`, such as:
  - `carefree-toolkit`: provides a lot of utility classes & functions which are 'stand alone' and can be leveraged in your own projects.
  - `carefree-data`: a lightweight tool to read -> convert -> process ANY tabular datasets. It also utilizes cython to accelerate critical procedures.
From the above, it follows that `carefree-learn` could be treated as a minimal Automatic Machine Learning (AutoML) solution for tabular datasets when it is fully utilized. However, this does not come at the cost of flexibility: the functionality mentioned above is wrapped into individual modules in `carefree-learn`, which users can customize easily.
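As a quick illustration of the file-handling bullet above, here is a minimal sketch. It assumes that `fit_ml` also accepts a path to a `.csv` file (the file name `train.csv` is hypothetical); please refer to the Quick Start guide for the exact signature.

```python
import cflearn

# assumption: fit_ml can consume a csv file directly, with pre-processing
# (reading, converting, encoding) handled automatically in "carefree" mode
m = cflearn.api.fit_ml("train.csv", carefree=True)
```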
### 🖼️ Computer Vision

- Also provides a scikit-learn-like interface with much more 'carefree' usages.
- Provides many out-of-the-box pre-trained models and well hand-crafted training defaults for reproduction & finetuning.
- Seamlessly supports efficient `ddp` (simply call `cflearn.api.run_ddp("run.py")`, where `run.py` is your normal training script); see the sketch after this list.
- A bunch of utility functions for research and production.
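To make the `ddp` bullet concrete, here is a minimal sketch. `run.py` is simply the Computer Vision example from the User Side section, and the launcher file (named `launch.py` here only for illustration) calls `run_ddp` exactly as shown above.

```python
# run.py - a normal training script (same as the Computer Vision example above)
import cflearn

data = cflearn.cv.MNISTData(batch_size=16, transform="to_tensor")
m = cflearn.api.resnet18_gray(10).fit(data)
```

```python
# launch.py - launches `run.py` with distributed data parallel
import cflearn

cflearn.api.run_ddp("run.py")
```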
## Installation
`carefree-learn` requires Python 3.6 or higher.
### Pre-Installing PyTorch
`carefree-learn` requires `pytorch>=1.9.0`. Please refer to the official PyTorch installation instructions; it is highly recommended to pre-install PyTorch with conda, for example as sketched below.
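The following conda-based setup is only illustrative (the environment name is hypothetical, and the exact command depends on your platform and CUDA version); please take the precise command from the official PyTorch instructions.

```bash
# create & activate a dedicated environment, then install PyTorch via conda
conda create -n carefree python=3.8
conda activate carefree
conda install "pytorch>=1.9.0" -c pytorch
```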
### pip installation
After installing PyTorch, installation of `carefree-learn` would be rather easy:

> If you pre-installed PyTorch with conda, remember to activate the corresponding environment!

```bash
pip install carefree-learn
```
### Docker
#### Prepare
`carefree-learn` has already been published on DockerHub, so it can be pulled directly:

```bash
docker pull carefree0910/carefree-learn:dev
```
or can be built locally:

```bash
docker build -t carefree0910/carefree-learn:dev .
```
#### Run
```bash
docker run --rm -it --gpus all carefree0910/carefree-learn:dev
```
## Examples
- Iris – perhaps the best known database to be found in the pattern recognition literature.
- Titanic – the best, first challenge for you to dive into ML competitions and familiarize yourself with how the Kaggle platform works.
- Operations – toy datasets for us to illustrate how to build your own models in `carefree-learn`.
## Citation
If you use `carefree-learn` in your research, we would greatly appreciate it if you cited this library using the following Bibtex:
```
@misc{carefree-learn,
  year={2020},
  author={Yujian He},
  title={carefree-learn, Deep Learning with PyTorch made easy},
  howpublished={\url{https://github.com/carefree0910/carefree-learn/}},
}
```
## License
`carefree-learn` is MIT licensed, as found in the `LICENSE` file.