PyTorch misc
Collection of code snippets I've written for the PyTorch discussion board.
All scripts were tested using the PyTorch 1.0 preview and torchvision 0.2.1.
Additional libraries, e.g. numpy or pandas, are used in a few scripts.
Some scripts might be a good starting point for a tutorial.
Overview
- accumulate_gradients - Comparison of accumulated gradients/losses to a vanilla batch update (sketch below).
- adaptive_batchnorm - Adaptive BN implementation using two additional parameters: `out = a * x + b * bn(x)` (sketch below).
- adaptive_pooling_torchvision - Example of using adaptive pooling layers in pretrained models to use different spatial input shapes.
- batch_norm_manual - Comparison of PyTorch BatchNorm layers and a manual calculation (sketch below).
- change_crop_in_dataset - Change the image crop size on the fly using a Dataset.
- channel_to_patches - Permute image data so that channel values of each pixel are flattened to an image patch around the pixel.
- conv_rnn - Combines a 3DCNN with an RNN; uses windowed frames as inputs.
- csv_chunk_read - Provide data chunks from continuous .csv file.
- densenet_forwardhook - Use forward hooks to get intermediate activations from `densenet121` and process these activations further in separate modules (sketch below).
- edge_weighting_segmentation - Apply weighting to edges for a segmentation task.
- image_rotation_with_matrix - Rotate an image given an angle using 1.) a nested loop and 2.) a rotation matrix and mesh grid.
- LocallyConnected2d - Implementation of a locally connected 2d layer.
- mnist_autoencoder - Simple autoencoder for MNIST data. Includes visualizations of output images, intermediate activations and conv kernels.
- mnist_permuted - MNIST training using permuted pixel locations.
- model_sharding_data_parallel - Model sharding with `DataParallel` using 2 pairs of 2 GPUs.
- momentum_update_nograd - Script to see how parameters are updated when an optimizer is used with momentum/running estimates, even if gradients are zero (sketch below).
- pytorch_redis - Script to demonstrate loading data from Redis using a PyTorch Dataset and DataLoader.
- shared_array - Script to demonstrate the usage of shared arrays using multiple workers.
- shared_dict - Script to demonstrate the usage of shared dicts using multiple workers.
- unet_demo - Simple UNet demo.
- weighted_sampling - Usage of WeightedRandomSampler on an imbalanced dataset with a class imbalance of 99 to 1 (sketch below).
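
A few sketches of the ideas above follow. They are simplified, hedged versions for orientation only, not the tested scripts themselves. First, gradient accumulation as in accumulate_gradients: the model, data, and chunk count below are made up for illustration, and each per-chunk loss is scaled so the accumulated update matches one update on the full batch (assuming a loss that averages over the batch).

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

data = torch.randn(64, 10)
target = torch.randint(0, 2, (64,))

# Accumulate gradients over 4 chunks of 16 samples each.
accumulation_steps = 4
optimizer.zero_grad()
for chunk_data, chunk_target in zip(data.chunk(accumulation_steps),
                                    target.chunk(accumulation_steps)):
    # Scale each loss so the summed gradients match a single full-batch update.
    loss = criterion(model(chunk_data), chunk_target) / accumulation_steps
    loss.backward()  # gradients accumulate in the .grad attributes
optimizer.step()
```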
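A minimal sketch of the adaptive_batchnorm idea, `out = a * x + b * bn(x)`; the parameter shapes and the initialization (a=1, b=0) are assumptions here, not necessarily what the script uses.

```python
import torch
import torch.nn as nn

class AdaptiveBatchNorm2d(nn.Module):
    """BatchNorm2d blended with the identity via two learnable scalars:
    out = a * x + b * bn(x). Shapes/initialization are assumptions."""
    def __init__(self, num_features, **kwargs):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features, **kwargs)
        self.a = nn.Parameter(torch.ones(1, 1, 1, 1))
        self.b = nn.Parameter(torch.zeros(1, 1, 1, 1))

    def forward(self, x):
        return self.a * x + self.b * self.bn(x)

out = AdaptiveBatchNorm2d(3)(torch.randn(8, 3, 32, 32))
```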
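For batch_norm_manual, the training-mode calculation can be reproduced roughly like this (assuming a recent PyTorch that accepts dim tuples in mean/var):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(16, 3, 8, 8)

bn = nn.BatchNorm2d(3)
bn.train()
out_ref = bn(x)

# Manual training-mode BatchNorm: normalize with the per-channel batch
# statistics (biased variance), then apply the affine parameters.
mean = x.mean(dim=(0, 2, 3), keepdim=True)
var = x.var(dim=(0, 2, 3), unbiased=False, keepdim=True)
x_hat = (x - mean) / torch.sqrt(var + bn.eps)
out_manual = bn.weight.view(1, -1, 1, 1) * x_hat + bn.bias.view(1, -1, 1, 1)

print(torch.allclose(out_ref, out_manual, atol=1e-6))  # should print True
```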
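A forward-hook sketch in the spirit of densenet_forwardhook; the hooked blocks (denseblock1/denseblock2) are arbitrary choices for illustration, not necessarily the ones the script targets.

```python
import torch
import torchvision.models as models

model = models.densenet121()  # randomly initialized weights
activations = {}

def get_hook(name):
    # Store the output of the hooked module under the given name.
    def hook(module, input, output):
        activations[name] = output.detach()
    return hook

# Register hooks on two intermediate dense blocks.
model.features.denseblock1.register_forward_hook(get_hook('denseblock1'))
model.features.denseblock2.register_forward_hook(get_hook('denseblock2'))

out = model(torch.randn(1, 3, 224, 224))
print({name: act.shape for name, act in activations.items()})
```

The stored activations could then be fed into separate downstream modules, as the script does.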
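For momentum_update_nograd, the effect can be seen with a tiny SGD example: once the momentum buffers are populated, a step with explicitly zeroed gradients still moves the parameters. The model and learning rate below are arbitrary.

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# First step with a real gradient populates the momentum buffers.
loss = model(torch.randn(4, 2)).mean()
loss.backward()
optimizer.step()

# Second step with explicitly zeroed (not None) gradients: the momentum
# buffers still push the parameters, so they keep changing.
for p in model.parameters():
    p.grad = torch.zeros_like(p)
w_before = model.weight.clone()
optimizer.step()
print((model.weight - w_before).abs().sum())  # non-zero change
```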
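Finally, a sketch of weighted_sampling with WeightedRandomSampler; the toy dataset and the inverse-frequency weights are assumptions for illustration.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader, WeightedRandomSampler

# Imbalanced toy dataset: 99% class 0, 1% class 1.
targets = torch.cat([torch.zeros(990), torch.ones(10)]).long()
data = torch.randn(len(targets), 4)
dataset = TensorDataset(data, targets)

# Weight each sample by the inverse frequency of its class.
class_counts = torch.bincount(targets)
class_weights = 1.0 / class_counts.float()
sample_weights = class_weights[targets]

sampler = WeightedRandomSampler(sample_weights,
                                num_samples=len(sample_weights),
                                replacement=True)
loader = DataLoader(dataset, batch_size=100, sampler=sampler)

for _, batch_targets in loader:
    print(batch_targets.float().mean())  # roughly 0.5 per batch
```

With `replacement=True` and inverse-frequency weights, each batch should contain roughly equal numbers of both classes.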
Feedback is very welcome!