Optimized Gillespie algorithm for simulating Stochastic sPAtial models of Cancer Evolution (OG-SPACE)


Introduction

Optimized Gillespie algorithm for simulating Stochastic sPAtial models of Cancer Evolution (OG-SPACE) is a computational framework to simulate the spatial evolution of cancer cells together with the experimental procedure of bulk and single-cell DNA-seq experiments. OG-SPACE relies on an optimized Gillespie algorithm, able to handle a large number of cells and a variety of birth-death processes on a lattice, and on an efficient procedure to reconstruct the phylogenetic tree and the genotype of the sampled cells.

REQUIRED SOFTWARE AND PACKAGES

  • R (tested on version 4.0) https://cran.r-project.org
  • The following R libraries (an installation sketch is given after this list):
    • igraph
    • gtools
    • ggplot2
    • gridExtra
    • reshape2
    • stringi
    • stringr
    • shiny
    • manipulateWidget
    • rgl
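
If any of these packages are missing, they can be installed from CRAN. A minimal sketch, assuming a standard CRAN mirror is configured:

    # Install any required CRAN packages that are not already present.
    required_pkgs <- c("igraph", "gtools", "ggplot2", "gridExtra", "reshape2",
                       "stringi", "stringr", "shiny", "manipulateWidget", "rgl")
    missing_pkgs <- setdiff(required_pkgs, rownames(installed.packages()))
    if (length(missing_pkgs) > 0) install.packages(missing_pkgs)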

RUN OG-SPACE

  • Download the OG-SPACE folder.
  • From a terminal, run "Rscript.exe my_path\Run_OG-SPACE.R", where "my_path" is the path to the folder containing the OG-SPACE scripts.
  • When the pop-up window appears, select the file "Run_OG-SPACE.R" in the working folder. Alternatively, you can launch OG-SPACE with software like RStudio: simply run the script "Run_OG-SPACE.R" and, when the pop-up window appears, select the file "Run_OG-SPACE.R" in the working folder. A sketch of both launch methods follows this list.
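
For reference, a minimal sketch of the two launch methods described above; "my_path" is a placeholder for the folder containing the OG-SPACE scripts:

    # From a terminal (Windows syntax, as in the command above):
    #   Rscript.exe my_path\Run_OG-SPACE.R
    # From an interactive R session such as RStudio:
    setwd("my_path")          # placeholder: folder containing the OG-SPACE scripts
    source("Run_OG-SPACE.R")  # a pop-up window then asks to select Run_OG-SPACE.R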

PARAMETERS OF OG-SPACE

Most of the parameters of OG-SPACE can be modified by editing the file "input/Parameters.txt" with a text editor. A brief description of each parameter follows; an illustrative configuration is sketched after the list.

  • simulate_process = one of "contact", "voter", or "h_voter"; selects which model OG-SPACE simulates.
  • generate_lattice = if 1, OG-SPACE generates a regular lattice for the dynamics. If 0, OG-SPACE uses an igraph object named "g.Rdata" located in the folder "input".
  • dimension = an integer number, the dimensionality of the generated regular lattice.
  • N_e = an integer number, the number of nodes along each edge of the generated regular lattice.
  • dist_interaction = an integer number, the distance of interaction between nodes of the lattice.
  • simulate_experiments = if 1, OG-SPACE generates bulk and sc-DNA-seq experiment data. If 0, no.
  • do_bulk_exp = if 1, OG-SPACE generates bulk-seq experiment data. If 0, no.
  • do_sc_exp = if 1, OG-SPACE generates sc-DNA-seq experiment data. If 0, no.
  • to_do_plots_of_trees = if 1, OG-SPACE generates the plots of the trees. If 0, no.
  • do_pop_dyn_plot = if 1, OG-SPACE generates the plots of the population dynamics. If 0, no.
  • do_spatial_dyn_plot = if 1, OG-SPACE generates the plots of the spatial dynamics. If 0, no.
  • do_geneaology_tree = if 1, OG-SPACE generates the plots of the cell genealogy trees. If 0, no.
  • do_phylo_tree = if 1, OG-SPACE generates the plots of the phylogenetic trees. If 0, no.
  • size_of_points_lattice = an integer number, size of the points in the plot of spatial dynamics.
  • size_of_points_trees = an integer number, size of the points in the plot of trees.
  • set_seed = the random seed of the computation.
  • Tmax = maximum time of the computation [arb. units].
  • alpha = birth rate of the first subpopulation [1/time].
  • beta = death rate of the first subpopulation [1/time].
  • driv_mut = probability of developing a driver mutation (between 0 and 1).
  • driv_average_advantadge = average birth rate advantage per driver [1/time].
  • random_start = if 1, OG-SPACE selects the spatial position of the first cell at random. If 0, it uses the variable "node_to_start".
  • node_to_start = if random_start = 0, this variable should be set to the label of the starting node.
  • N_starting = Number of starting cells. Works only with random_start=1.
  • n_events_saving = integer number, the number of events between successive saves of the dynamics used for the plots.
  • do_random_sampling = if 1, OG-SPACE samples the cells at random; if 0, it samples a spatial region (see "dist_sampling").
  • n_sample = integer number, the number of sampled cells. Ignored if do_random_sampling = 0.
  • dist_sampling = the radius of the spatially sampled region. Ignored if do_random_sampling = 1.
  • genomic_seq_length = number of bases of the genome under study.
  • neutral_mut_rate = neutral mutational rate per base [1/time].
  • n_time_sample = integer number, the number of time points at which the dynamics are plotted.
  • detected_vaf_thr = VAF detection threshold. VAFs below this value are considered not observed.
  • sequencing_depth_bulk = integer number, the sequencing depth of bulk sequencing.
  • prob_reads_bulk = number between 0 and 1; 1 minus the probability of a false negative in a bulk read.
  • mean_coverage_cell_sc = integer number, mean number of reads per cell.
  • fn_rate_sc_exp = number between 0 and 1; 1 minus the probability of a false negative in a single-cell read.
  • fp_rate_sc_exp = number between 0 and 1; 1 minus the probability of a false positive in a single-cell read.
  • minimum_reads_for_cell = integer number, the minimum number of reads per cell required to call a mutation.
  • detection_thr_sc = ratio of successful reads necessary to call a mutation
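
The exact syntax of "input/Parameters.txt" is not reproduced here; the R list below is only an illustration of a coherent set of values (all values are hypothetical, not defaults) for a small 2D contact-process run with random sampling of 50 cells:

    # Illustrative values only; in practice they are set by editing input/Parameters.txt.
    example_config <- list(
      simulate_process   = "contact",  # contact process on a regular lattice
      generate_lattice   = 1,          # let OG-SPACE generate the lattice
      dimension          = 2,          # 2D lattice
      N_e                = 100,        # 100 nodes along each edge
      Tmax               = 50,         # maximum simulated time [arb. units]
      alpha              = 0.4,        # birth rate of the first subpopulation
      beta               = 0.1,        # death rate of the first subpopulation
      driv_mut           = 1e-4,       # probability of developing a driver mutation
      do_random_sampling = 1,          # sample cells at random
      n_sample           = 50          # number of sampled cells
    )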

OUTPUTS OF OG-SPACE

In the folder "output", you will find all the .txt data files of the output. Note that the trees are returned as edge list matrices. The files will contain:

  • The state of the lattice, with the position of each cell.
  • The Ground Truth (GT) genotype of the sampled cells.
  • The GT Variant Allele Frequency (VAF) spectrum of the sampled cells.
  • The GT genealogy tree of the sampled cells.
  • The GT phylogenetic tree of the sampled cells.
  • The mutational tree of the driver mutations that appeared during the simulation of the dynamics.
  • The genotype of the sampled cells after simulating a sc-DNA-seq experiment (if required).
  • The VAF spectrum of the sampled cells after simulating a bulk DNA-seq experiment (if required).

In the folder "output/plots", you will find all required plots.
