GT China coal model

Overview

The full version of a China coal transport model with very high spatial resolution.

What it does

The code works in a few steps:

  1. Take easily understandable and readable xlsx input files on networks, plants, demand, etc., and create project build files from these (done in R).
  2. Take the build files and create an LP problem file from them (in Python, either locally or on AWS SageMaker).
  3. Solve the problem from the LP problem file, and write the solution to a txt file (either in the CPLEX interactive optimizer or in Python; a minimal sketch is included below).
  4. Process the solution txt file and write the results to easily understandable and readable xlsx (in R).

The packages required to run these scripts are listed in environment.yml (for Python) and renv.lock (for R; this first requires installation of the renv package: https://rstudio.github.io/renv/articles/renv.html; after installation, run renv::restore()).
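For step 3, solving the LP file in Python might look like the minimal sketch below. It assumes the IBM cplex Python package is installed; the file names are illustrative placeholders, not paths from this repository.

```python
import cplex

# Read the LP problem file produced in step 2 (file name is illustrative).
problem = cplex.Cplex("model.lp")

# Solve with CPLEX's default settings.
problem.solve()

# Write the objective value and one line per variable to a txt file,
# mirroring the solution txt that step 4 post-processes in R.
with open("solution.txt", "w") as f:
    f.write(f"objective {problem.solution.get_objective_value()}\n")
    for name, value in zip(problem.variables.get_names(),
                           problem.solution.get_values()):
        f.write(f"{name} {value}\n")
```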

The model

The model minimizes the total cost of production + transport + transmission.
Production means coal mining costs; transport means rail/truck/riverborne/ocean-going transport and handling costs; and transmission means inter-provincial transport of electricity via UHV cables.
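In symbols, the objective might be written as below. This is an illustrative formulation with assumed notation, not one taken from the code: s is mine supply by coal type, x is coal flow by link and coal type, e is electrical energy sent over a transmission link, and p, t, u are the corresponding unit costs.

```latex
\min \; \sum_{m}\sum_{c} p_{m,c}\, s_{m,c}
      + \sum_{l \in L_{\mathrm{coal}}}\sum_{c} t_{l}\, x_{l,c}
      + \sum_{l \in L_{\mathrm{UHV}}} u_{l}\, e_{l}
```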

Constraints in the optimization

The constraints in the mini testbench are the same as in the full model (an illustrative sketch of the main ones follows this list). These are:

  • Mines (or any other node) cannot supply types of coal they do not produce.
  • The flow of coal of each type out of a node cannot exceed the flow of that type into the node plus the node's supply (supply being non-zero only for mines).
  • The energy content of the supply and of the flows of coal of each type into a node must be at least equal to the demand for electricity, plus other thermal coal demand, plus the energy content of the flows of coal of each type out of that node. Note that only mines can supply coal, all demand for electrical power occurs in provincial demand nodes, and demand for other thermal coal is placed at city-level nodes.
  • The amount of hard coking coal (HCC) flowing into a node must be at least equal to the steel demand multiplied by 0.581. Note that all steel demand is placed in provincial-level steel demand nodes, which are connected with uni-directional links from steel plants to steel demand nodes. This means no coal can flow out of a steel demand node, so no further formulae are needed for mass balances. Also note that we presume a mix of coking coal needed to produce a ton of steel of 581 kg of hard coking coal (HCC), 176 kg of soft coking coal (SCC), and 179 kg of pulverized coal for injection (PCI).
  • The amount of soft coking coal (SCC) flowing into a node must be at least equal to the amount of HCC flowing into that node, multiplied by 0.176/0.581.
  • The amount of pulverized coal for injection (PCI) flowing into a node must be at least equal to the amount of HCC flowing into that node, multiplied by 0.179/0.581.
  • The total volume of all coal types transported along a link cannot exceed the transport capacity of that link. Note that this constraint is applied only to links with a non-infinite transport capacity. In practice this means rail links are assumed to have a finite transport capacity, while ocean routes, rivers, and road links are assumed to have infinite capacity.
  • The total amount of energy transported along a link cannot exceed the transmission capacity of that link. That is, the amount of each coal type multiplied by the energy content of each coal type cannot exceed the electrical transmission capacity of the link. This constraint is applied only to links between power plant units and provincial electricity demand nodes, and to UHV transmission links between provincial electricity demand nodes; these are the only links along which electrical energy is transported, while all other links transport physical quantities of coal. This constraint simultaneously deals with the production capacity (MW) and conversion efficiency of power plants: the energy transported over a link cannot exceed the volume of each coal type, multiplied by the energy content of each coal type, multiplied by the energy conversion factor of the link. For links between coal-fired power plant units and provincial electricity demand nodes, this factor is equal to the conversion efficiency of the power plant unit. For UHV transmission links between two provincial-level electricity demand nodes, it is equal to (1 - transmission losses) over that UHV line, with transmission losses calculated from the transmission distance and a benchmark loss for UHV-DC or UHV-AC lines.
  • The handling capacity of ports cannot be exceeded. Specifically, the total amount of coal flowing out of a port cannot exceed its handling capacity.
  • The production capacity of steel plants cannot be exceeded. Specifically, the total amount of hard coking coal, soft coking coal, and pulverized coal for injection flowing out of a steel plant node (towards a provincial steel demand node) cannot exceed the steel plant's production capacity multiplied by 0.581 + 0.176 + 0.179, the mix of coking coals needed to produce a ton of steel.
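As an illustration, the first few of these constraints might be written as below. The notation is assumed for this sketch, not taken from the code: x_{l,c} is the flow of coal type c on link l, I(n) and O(n) are the sets of links into and out of node n, s_{n,c} is supply (non-zero only for mines), D_n is steel demand, and K_l is the transport capacity of a rail link.

```latex
% Mass balance: outflow of each coal type cannot exceed inflow plus supply
\sum_{l \in O(n)} x_{l,c} \;\le\; \sum_{l \in I(n)} x_{l,c} + s_{n,c}
  \qquad \forall\, n, c

% Coking coal requirements at a steel demand node n
\sum_{l \in I(n)} x_{l,\mathrm{HCC}} \;\ge\; 0.581\, D_{n}
\qquad
\sum_{l \in I(n)} x_{l,\mathrm{SCC}} \;\ge\; \frac{0.176}{0.581} \sum_{l \in I(n)} x_{l,\mathrm{HCC}}

% Rail link capacity
\sum_{c} x_{l,c} \;\le\; K_{l} \qquad \forall\, l \in L_{\mathrm{rail}}
```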

Technical notes

  • All transport costs are pre-calculated for each link, and include a fixed handling cost and a distance-based transport cost, based on the type of handling (origin and destination) and the type of transport (separate for rail, truck, riverborne, and ocean-going transport; a small number of coal rail lines have specific handling and transport costs). A sketch of this calculation follows these notes.
  • Some of the capacities are already reported in the input sheet for the edges; the physical transport capacity is taken from this sheet. Port, steel plant, and electrical transmission capacities are taken from the separate port/steel plant/electrical capacities sheets.
  • An example LP file is included to make this repository as self-contained as possible. This LP file is zipped to stay within GitHub file size limits.
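As an illustration of the pre-calculated link costs in the first note above, a per-tonne cost might be computed as in the sketch below. All node types, modes, and rates here are hypothetical placeholders, not values from the input sheets.

```python
# Hypothetical per-tonne handling costs by node type, and per
# tonne-kilometre transport rates by mode (illustrative values only).
HANDLING_COST = {"mine": 4.0, "rail_yard": 6.0, "port": 8.0, "plant": 3.0}
TRANSPORT_RATE = {"rail": 0.10, "truck": 0.45, "river": 0.06, "ocean": 0.03}

def link_cost(origin_type: str, destination_type: str,
              mode: str, distance_km: float) -> float:
    """Cost of moving one tonne of coal along a link: fixed handling
    at both ends plus a distance-based transport cost."""
    handling = HANDLING_COST[origin_type] + HANDLING_COST[destination_type]
    return handling + TRANSPORT_RATE[mode] * distance_km

# Example: a 650 km rail link from a mine to a port.
cost = link_cost("mine", "port", "rail", 650.0)
```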

Contributions

This project was developed by:

  • Jorrit Gosens: conceptualization, bulk of the data preparation, software implementation and debugging in R & Python;
  • Alex H. Turnbull: conceptualization, some data preparation, proof-reading and debugging of software.

License

MIT License as separately included.
In short, do what you want with this script, but refer to the original authors when you use or develop this code.

