jaxdf - JAX-based Discretization Framework
Overview | Example | Installation | Documentation
⚠️
This library is still in development. Breaking changes may occur.
Overview
jaxdf is a JAX-based package defining a coding framework for writing differentiable numerical simulators with arbitrary discretizations.
The intended use is to build numerical models of physical systems, such as wave propagation, or the numerical solution of partial differential equations, that are easy to customize to the user's research needs. Such models are pure functions that can be included in arbitrary differentiable programs written in JAX: for example, they can be used as layers of neural networks, or to build a physics loss function.
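To illustrate the idea of a simulator as a pure, differentiable function, here is a minimal generic JAX sketch. It does not use the jaxdf API: `simulate` is a hypothetical stand-in for a numerical model, showing how a physics-style loss built on top of it remains differentiable end to end.

```python
import jax.numpy as jnp
from jax import grad, jit

# A toy "simulator": a pure function mapping parameters to a field.
# (Illustrative stand-in; real jaxdf models are built from its operators.)
def simulate(params):
    x = jnp.linspace(0.0, 1.0, 64)
    return params["amplitude"] * jnp.sin(2.0 * jnp.pi * x)

# A physics-style loss built on top of the simulator.
def loss(params):
    field = simulate(params)
    return jnp.mean(field ** 2)

params = {"amplitude": 2.0}
d_loss = jit(grad(loss))(params)  # gradients flow through the simulator
```

Because `simulate` is a pure function, `grad` and `jit` compose with it like with any other JAX code; the same holds for models written with jaxdf operators.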
Example
The following script builds the non-linear operator (∇² + sin), using a Fourier spectral discretization on a square 2D domain. The output is given over the whole collocation grid.
from jaxdf import operators as jops
from jaxdf.core import operator
from jaxdf.geometry import Domain
from jaxdf.discretization import FourierSeries
from jax import numpy as jnp
from jax import jit, grad
# Defining operator
@operator()
def custom_op(u):
    grad_u = jops.gradient(u)
    diag_jacobian = jops.diag_jacobian(grad_u)
    laplacian = jops.sum_over_dims(diag_jacobian)
    sin_u = jops.elementwise(jnp.sin)(u)
    return laplacian + sin_u
# Defining discretizations
domain = Domain((256, 256), (1., 1.))
fourier_discr = FourierSeries(domain)
u_fourier_params, u = fourier_discr.empty_field(name='u')
# Discretizing operators: getting pure functions and parameters
result = custom_op(u=u)
op_on_grid = result.get_field_on_grid(0)
global_params = result.get_global_params() # This contains the Fourier filters
# Compile and use the pure function
result_on_grid = jit(op_on_grid)(
    global_params,
    {"u": u_fourier_params}
)
# Define a differentiable loss function
def loss(u_params):
    op_output = jit(op_on_grid)(global_params, {"u": u_params})
    return jnp.mean(jnp.abs(op_output)**2)
gradient = grad(loss)(u_fourier_params)
Installation
Before installing jaxdf, make sure that you have JAX installed. Follow the JAX installation instructions for NVIDIA GPU support if you want to use jaxdf on GPUs.
Install jaxdf by cd-ing into the repo folder and running
pip install -r requirements.txt
pip install .
If you want to run the notebooks, you should also install the following packages:
pip install jupyterlab tqdm
Citation
This package will be presented at the Differentiable Programming workshop at NeurIPS 2021.
@article{stanziola2021jaxdf,
author={Stanziola, Antonio and Arridge, Simon and Cox, Ben T. and Treeby, Bradley E.},
title={A research framework for writing differentiable PDE discretizations in JAX},
year={2021},
journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}
Related projects
odl
Operator Discretization Library (ODL) is a Python library for fast prototyping, focusing on (but not restricted to) inverse problems.
deepXDE
A TensorFlow and PyTorch library for scientific machine learning.
SciML
A NumFOCUS-sponsored open-source software organization created to unify the packages for scientific machine learning.