NeuroGen: activation optimized image synthesis for discovery neuroscience

Overview

NeuroGen is a framework for synthesizing images that control brain activations. Details can be found here: https://www.sciencedirect.com/science/article/pii/S1053811921010831. Supplementary Material can be found here: https://drive.google.com/drive/folders/1333yhTqTro6UgRS4sr6WAiR6a-J50PHK?usp=sharing

Requirements

  • Python 3.7
  • PyTorch 1.4.0
  • Other standard scientific computing packages (e.g. NumPy)

Instructions

  1. The output directory contains the trained encoding models for the 8 subjects in the NSD dataset.
  2. encoding.py is called when loading the encoding model into NeuroGen.
  3. getROImask.py is used to get the ROI masks for the 24 ROIs used.
  4. getmaskedROI.py is used to get the voxel responses within a certain ROI.
  5. getmaskedROImean.py is used to get the mean voxel response within a certain ROI (a hypothetical sketch of this computation follows the note below).
  6. neurogen.py is the main script for NeuroGen (a minimal sketch of its optimization idea follows this list); it can be called by

python neurogen.py --roi 1 --steps 1000 --gpu 0 --lr 0.01 --subj 1 --reptime 1 --truncation 1

  7. visualize.py contains some useful functions to save images and visualize them.
  8. pytorch_pretrained_biggan is available here: https://github.com/huggingface/pytorch-pretrained-BigGAN
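
At its core, neurogen.py performs gradient ascent on a BigGAN latent code so that the encoding model's predicted response for a chosen ROI is maximized. The following is a minimal, self-contained sketch of that idea, not the exact neurogen.py loop: the random linear encoder below is only a stand-in for the trained encoding model that encoding.py loads, and details such as class selection are omitted.

import torch
from pytorch_pretrained_biggan import (BigGAN, one_hot_from_int,
                                       truncated_noise_sample, save_as_images)

truncation = 1.0
generator = BigGAN.from_pretrained('biggan-deep-256')
generator.eval()

# Fixed class vector (class 0, purely for illustration) and a learnable noise vector.
class_vec = torch.from_numpy(one_hot_from_int(0, batch_size=1))
noise = torch.from_numpy(truncated_noise_sample(truncation=truncation, batch_size=1))
noise.requires_grad_(True)

# Stand-in encoder: a fixed random linear readout mapping an image to a scalar
# "ROI response"; the real encoding model predicts measured NSD responses.
readout = torch.randn(3 * 256 * 256)
def encoder(image):
    return image.flatten(start_dim=1) @ readout

optimizer = torch.optim.Adam([noise], lr=0.01)
for step in range(1000):
    optimizer.zero_grad()
    image = generator(noise, class_vec, truncation)  # (1, 3, 256, 256), values in [-1, 1]
    loss = -encoder(image).mean()                    # maximize activation = minimize its negative
    loss.backward()
    optimizer.step()

save_as_images(image.detach().cpu(), file_name='synthesized')

The --truncation, --lr, and --steps flags of neurogen.py correspond to the truncation value, learning rate, and number of steps above; saving and visualization helpers live in visualize.py.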

Note: getROImask.py, getmaskedROI.py and getmaskedROImean.py deal with the NSD data, which has not been released yet, and are not necessary to run NeuroGen at this time. Paths in all scripts may need to be changed according to your setup.
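
Although the NSD data these scripts require is not yet public, the computation they perform is straightforward. Below is a hypothetical illustration of the masked-ROI mean that getmaskedROImean.py computes; the array shapes and contents are random placeholders, not the actual NSD format.

import numpy as np

n_trials, n_voxels = 750, 100000
betas = np.random.randn(n_trials, n_voxels)   # stand-in for trial-wise voxel responses
roi_mask = np.zeros(n_voxels, dtype=bool)     # stand-in for one of the 24 ROI masks
roi_mask[:500] = True                         # mark this ROI's voxels

roi_responses = betas[:, roi_mask]            # getmaskedROI.py: voxel responses within the ROI
roi_mean = roi_responses.mean(axis=1)         # getmaskedROImean.py: one mean response per trial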

Citation

@article{gu2022neurogen,
title={NeuroGen: activation optimized image synthesis for discovery neuroscience},
author={Gu, Zijin and Jamison, Keith Wakefield and Khosla, Meenakshi and Allen, Emily J and Wu, Yihan and Naselaris, Thomas and Kay, Kendrick and Sabuncu, Mert R and Kuceyeski, Amy},
journal={NeuroImage},
volume={247},
pages={118812},
year={2022},
publisher={Elsevier}
}

Comments
  • Can't find a few files in the src folder

    I have forked the NeuroGen repo and found a problem. The problematic code, located on lines 15-16 of src/data_preparation.py, is copied below; I am afraid the author may have missed uploading the lasagne_utility.py file, which should be located in the src folder.

    from src.lasagne_utility import deconv, conv, batch_norm, batch_norm_n, fc_concat, conv_concat, avg, flatten, sigmoid, tanh
    from src.lasagne_utility import print_lasagne_network

    opened by danyan1
  • Lack of code for calculating the optimized synthetic images

    Hello, Gu,

    I see that you use the top 10 synthetic images in NeuroGen; the corresponding code is on line 117 of your neurogen file: "top_idx = np.load('./img/S%d' % args.subj + '/top10_class.npy')". But there is no related code or results for the top 10 synthetic images in your repository.

    Could you please attach this part?

    Thank you so much. Qin

    opened by danyan1
  • No image transform procedures

    Hello Gu,

    I see that you use "S%d_stimuli_227.h5py" % subject in the fwrf_ROIvoxel_mean.py file (line 119), which I guess might be the result of image transforms (normalization, RandomResizedCrop, etc.), but I didn't find the relevant function or file that does this. What does the "227" represent?

    Could you attach this part of the processing?

    Thank you.

    opened by danyan1