Tools for the Cleveland State Human Motion and Control Lab

Overview

Introduction

This is a collection of tools that are helpful for gait analysis. Some are specific to the needs of the Human Motion and Control Lab at Cleveland State University, but other portions may be useful more generally. The collection is relatively modular, so you can use only the pieces you need. It is primarily structured as a Python distribution, but the Octave files are also accessible independently.

Build status (Travis CI badge): https://travis-ci.org/csu-hmc/GaitAnalysisToolKit.png?branch=master

Python Packages

The main Python package is gaitanalysis and it contains five modules listed below. oct2py is used to call Octave routines in the Python code where needed.

gait.py
General tools for working with gait data, such as gait landmark identification and 2D inverse dynamics. The main class is GaitData (a usage sketch follows below).
controlid.py
Tools for identifying control mechanisms in human locomotion.
markers.py
Routines for processing marker data.
motek.py
Tools for processing and cleaning data from Motek Medical's products, e.g. the D-Flow software outputs.
utils.py
Helper functions for the other modules.

Each module has a corresponding test module in the gaitanalysis/tests sub-package, which contains unit tests for the classes and functions in the respective module.
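
As a quick orientation, the sketch below shows the intended workflow. It is a hedged example, not copied from the documentation: the file name and column lists are placeholders, and only the class, method, and keyword argument names that appear elsewhere in this README are taken as given.

import pandas as pd

from gaitanalysis.gait import GaitData

# Load a time-indexed DataFrame of synchronized marker and force plate data
# (hypothetical file name and HDF5 key).
df = pd.read_hdf('walking_trial.h5', 'mocap')

gait_data = GaitData(df)

# 2D inverse dynamics for both legs. The keyword names follow the
# inverse_dynamics_2d signature quoted in the comments section below; the
# column name lists are illustrative only.
left_markers = ['LGTRO.PosX', 'LGTRO.PosY']    # ...plus the rest of the left leg
right_markers = ['RGTRO.PosX', 'RGTRO.PosY']   # ...plus the rest of the right leg
left_forces = ['FP1.ForX', 'FP1.ForY', 'FP1.MomZ']
right_forces = ['FP2.ForX', 'FP2.ForY', 'FP2.MomZ']

gait_data.inverse_dynamics_2d(left_markers, right_markers,
                              left_forces, right_forces,
                              body_mass=72.0, low_pass_cutoff=6.0)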

Octave Libraries

Several Octave routines are included in the gaitanalysis/octave directory.

2d_inverse_dynamics
Implements the joint angle and moment computations for a 2D lower body human model (a call sketch via oct2py follows this list).
inertial_compensation
Compensates force plate forces and moments for inertial effects and re-expresses the forces and moments in the camera reference frame.
mmat
Fast matrix multiplication.
soder
Computes the rigid body orientation and location of a group of markers.
time_delay
Deals with the analog signal time delays.
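
The Python modules call these routines through oct2py. Below is a rough sketch of a direct call to leg2d.m from the 2d_inverse_dynamics library; the array shapes, the options fields, and the path are assumptions, while the argument and output names match the call quoted in the comments section below.

import numpy as np
from oct2py import octave

octave.addpath('gaitanalysis/octave/2d_inverse_dynamics')

time = np.linspace(0.0, 1.0, 101)
marker_array = np.zeros((101, 12))           # placeholder marker trajectories
normalized_force_array = np.zeros((101, 6))  # placeholder force plate data
options = {'freq': 6.0}                      # hypothetical options struct

# nout tells oct2py how many Octave output arguments to request.
angles, velocities, moments, forces = octave.leg2d(
    time, marker_array, normalized_force_array, options, nout=4)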

Installation

You will need Python 2.7 or 3.7+ and setuptools to install the packages. It's best to install the dependencies first (NumPy, SciPy, matplotlib, Pandas, PyTables).

Supported versions:

  • python >= 2.7 or >= 3.7
  • numpy >= 1.8.2
  • scipy >= 0.13.3
  • matplotlib >= 1.3.1
  • tables >= 3.1.1
  • pandas >= 0.13.1, <= 0.24.0
  • pyyaml >= 3.10
  • DynamicistToolKit >= 0.4.0
  • oct2py >= 2.4.2
  • octave >= 3.8.1

For users in our lab, we recommend installing Anaconda to get all of the dependencies.

We also make use of Octave code, so an installation of Octave is also required. See http://octave.sourceforge.net/index.html for installation instructions.

You can install using pip (or easy_install). Pip will theoretically [1] get the dependencies for you (or at least check if you have them):

$ pip install https://github.com/csu-hmc/GaitAnalysisToolKit/zipball/master

Or download the source with your preferred method and install manually.

Using Git:

$ git clone [email protected]:csu-hmc/GaitAnalysisToolKit.git
$ cd GaitAnalysisToolKit

Or wget:

$ wget https://github.com/csu-hmc/GaitAnalysisToolKit/archive/master.zip
$ unzip master.zip
$ cd GaitAnalysisToolKit-master

Then for basic installation:

$ python setup.py install

Or install for development purposes:

$ python setup.py develop

[1] You will need all build dependencies, and also note that matplotlib doesn't play nice with pip.

Dependencies

It is recommended to install the software dependencies as follows:

Octave can be installed from your package manager or from a downloadable binary, for example on Debian based Linux:

$ sudo apt-get install octave

For oct2py to work, you should be able to start Octave from the command line after it is installed. For example,

$ octave
GNU Octave, version 3.8.1
Copyright (C) 2014 John W. Eaton and others.
This is free software; see the source code for copying conditions.
There is ABSOLUTELY NO WARRANTY; not even for MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  For details, type 'warranty'.

Octave was configured for "x86_64-pc-linux-gnu".

Additional information about Octave is available at http://www.octave.org.

Please contribute if you find this software useful.
For more information, visit http://www.octave.org/get-involved.html

Read http://www.octave.org/bugs.html to learn how to submit bug reports.
For information about changes from previous versions, type 'news'.

octave:1>
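
You can also check from Python that oct2py finds your Octave installation (a quick sketch, not part of the toolkit):

from oct2py import octave

print(octave.eval('OCTAVE_VERSION'))  # prints the Octave version string, e.g. 3.8.1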

The core dependencies can be installed with conda in a conda environment:

$ conda create -n gait python=2.7 pip numpy scipy matplotlib pytables pandas pyyaml nose sphinx numpydoc oct2py mock
$ source activate gait

And the dependencies which do not have conda packages can be installed into the environment with pip:

(gait)$ pip install DynamicistToolKit

Tests

When in the repository directory, run the tests with nose:

$ nosetests

Vagrant

A Vagrantfile and provisioning scripts are included to test the code on both an Ubuntu 12.04 and an Ubuntu 13.10 box. To load a box and run the tests, simply type:

$ cd vagrant
$ vagrant up

See the Vagrantfile and the *bootstrap.sh files for the details of the provisioning.

Documentation

The documentation is hosted at ReadTheDocs:

http://gait-analysis-toolkit.readthedocs.org

You can build the documentation (currently sparse) if you have Sphinx and numpydoc:

$ cd docs
$ make html
$ firefox _build/html/index.html

Release Notes

0.2.0

  • Support Python 3. [PR #149]
  • Minimum dependencies bumped to Ubuntu 14.04 LTS versions and tests run on latest conda forge packages as of 2018/08/30. [PR #140]
  • The minimum version of the required dependency, DynamicistToolKit, was bumped to 0.4.0. [PR #134]
  • Reworked the DFlowData class so that interpolation and resampling are based on the FrameNumber column in the mocap data instead of the unreliable TimeStamp column. [PR #135]
  • Added note and setup.py check about higher oct2py versions required for Windows.

0.1.2

  • Fixed bug preventing GaitData.plot_grf_landmarks from working.
  • Removed inverse_data.mat from the source distribution.

0.1.1

  • Fixed installation issue where the octave and data files were not included in the installation directory.

0.1.0

  • Initial release
  • Copied the walk module from DynamicistToolKit @ eecaebd31940179fe25e99a68c91b75d8b8f191f

Comments
  • The right force plate is named FP1 instead of FP2 in 2d inverse dynamics test data

    @tvdbogert

    In Octave-Matlab-Codes/2D-Inverse-Dynamics/test/rawdata.txt the forces and moments are labeled FP1.* whereas the markers are R*. The right force plate should be called FP2. I don't think there is an issue with the data, i.e. all the data is from the same side, but the labels are just not consistent. Still haven't found the real issue yet...

    opened by moorepants 13
  • SimpleControlSolver.solve() is extremely slow for full gain matrix and large # gait cycles

    I'm having trouble figuring out what the slow down is. One bottleneck is surely the parameter covariance computations in dtk.process.least_squares_variance(), but even when that computation is skipped it still seems to be slow.

    performance 
    opened by moorepants 8
  • Mass and inertia are now correctly computed for each segment.

    The mass and inertia selection/computations were done in the wrong loop, so the same inertial properties were used for each segment. This remedies that by moving the computations into the force/moment loop.

    opened by moorepants 8
  • DFlow mocap module time stamp is unreliable

    I just discovered that using the TimeStamp column in the mocap module was a bad idea. I knew that the TimeStamp column had some anomalies (see http://gait-analysis-toolkit.readthedocs.org/en/latest/motek.html#data-column-descriptions) but none of my test data showed the additional anomalies I've just uncovered. Once I used a new differentiation method in PR #128 these anomalies were exposed, whereas the previous differentiation method quietly but incorrectly smoothed them over. The issue is that DFlow receives streaming data from Cortex and makes it accessible from the mocap module. Cortex likely handles the data stream correctly, i.e. within the 0.01 s sampling period it collects data from all cameras and computes the marker coordinates, providing a value that was collected at some time during the sampling period. Each Cortex value is delivered along with the sequential frame number, and it must have a mechanism to flag a frame number that has missing marker values. DFlow receives this data over ethernet and time stamps it using the DFlow CPU clock. The issue arises because the data flows into the DFlow computer serially and likely asynchronously with respect to time. A FIFO is likely used so that no data is dropped, but this means that data can "stack up" with respect to the DFlow CPU time, and if the TimeStamp column is relied upon there can be issues.

    The following plot shows the same graph from (http://gait-analysis-toolkit.readthedocs.org/en/latest/motek.html#data-column-descriptions) except with a data set (trial 31) that doesn't have as clean a time stamp. The plot shows the difference between each time value and the next: [plot: 031-time-diff]. Note that most of the differences are around 0.01 s, but there are some large outliers, and notice the values that are close to zero. These correspond to the data stacking up with respect to time.

    The next figure shows three plots in which I've flagged all of the values associated with the time stack-ups in red. The first plot is TimeStamp vs. FrameNumber, and the following two show the RGTRO marker coordinate against time and against frame number: [plot: 031-bad-idxs]. Notice that there are a fair number of time stack-ups in the data. We can zoom in on one instance to see what is going on: [plot: 031-bad-idxs-zoom]. This graph plots the same things, but we can see how the frames get stacked up with respect to time.

    The motek.py module parses the data and uses the TimeStamp column as if it were reliable. You can see how this will cause issues when trying to differentiate marker positions with respect to time. The real-time differentiator can differentiate these instances without too much error, so it wasn't noticed before, but the values are still wrong. Traditional central differencing methods have major issues with this incorrect data.

    Note that this issue is in addition to missing marker values. The DFlowData class needs to be reworked to make use of the FrameNumber instead of the TimeStamp column.

    The other issue with this is that synchronizing the mocap module and record module data relies on the time stamp values. I assume that the TimeStamp column from each module is generated from the DFlow cpu time. We'll need to look into the synchronization after the time stamp issues are taken care of to ensure that synchronization still works as expected.
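
    A sketch of the kind of check described above (the TimeStamp and FrameNumber column names are the D-Flow mocap columns discussed here; the file name, delimiter, and threshold are assumptions):

    import pandas as pd

    df = pd.read_csv('mocap-module-031.txt', delimiter='\t')  # hypothetical trial file

    # Time steps between consecutive samples; most should be about 0.01 s.
    dt = df['TimeStamp'].diff()
    stacked = dt < 0.001  # near-zero steps are the "stack ups"
    print('%d of %d samples have near-zero time steps' % (stacked.sum(), len(df)))

    # A more reliable time vector built from the sequential frame numbers,
    # assuming the nominal 0.01 s (100 Hz) sampling period.
    time = (df['FrameNumber'] - df['FrameNumber'].iloc[0]) / 100.0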

    bug 
    opened by moorepants 6
  • License

    We should put a license with the code so it is clear to people how they can use it. I've been using http://unlicense.org/ lately, which makes it public domain (no restrictions). If you all desire something more restrictive (require citation, non-commercial, etc.) we should discuss that. Right now the UNLICENSE is in the directory as that is what I had in DynamicistToolKit.

    question 
    opened by moorepants 6
  • Accel

    Added in motek.py

    • function to calibrate accelerometers
    • function to relabel analog channels to user-defined or sensible defaults

    Added in gait.py

    • function to find heelstrikes/toeoffs from accel signal
    • function to plot gait landmarks

    test_motek/test_gait

    • tests added to reflect additions
    opened by campanawanna 5
  • Adds the ability to remap marker labels from the meta.yml file.

    This fixes issue #68. It adds a new field to the meta data in which the user can provide a mapping between the marker names in the raw data files and the desired marker names in the resulting data frames returned by the DFlowData methods. It follows the same design pattern as the analog channel relabeling method.

    enhancement 
    opened by moorepants 4
  • Error in leg2d.m: interp1q and number of output args.

    1. I found a typo in gaitanalysis/octave/2d_inverse_dynamics/leg2d.m where interp1 was called interp1q instead.

    Now I'm getting the error:

    /usr/local/lib/python2.7/site-packages/gaitanalysis/gait.pyc in inverse_dynamics_2d(self, left_leg_markers, right_leg_markers, left_leg_forces, right_leg_forces, body_mass, low_pass_cutoff)
        240     angles, velocities, moments, forces =
        241         octave.leg2d(time, marker_array, normalized_force_array,
    --> 242             options)
        243
        244     dynamics = angles, velocities, moments, forces

    ValueError: too many values to unpack

    It seems that leg2d.m only returns its first output argument, the angles, for some reason.

    opened by moorepants 3
  • Improves the plotting speed of the gait landmarks.

    This speeds up the plots, makes the code a little more efficient, and also adds some arguments to let the user plot sections of the data more easily.

    I can't reproduce this, so I'm skipping it:

    • [ ] The heel strike lines don't always span the entire y range.
    opened by moorepants 3
  • Fixed time delay function and added test.

    @campanawanna Can you check this and see if you think my change is correct and that the test is correct too. I'm not 100% confident that it is doing what I thought it should be doing.

    opened by moorepants 3
  • Bump pyyaml from 3.10 to 5.1

    Bumps pyyaml from 3.10 to 5.1.

    Changelog

    Sourced from pyyaml's changelog.

    5.1 (2019-03-13)

    3.13 (2018-07-05)

    • Resolved issues around PyYAML working in Python 3.7.

    3.12 (2016-08-28)

    • Wheel packages for Windows binaries.
    • Adding an implicit resolver to a derived loader should not affect the base loader.
    • Uniform representation for OrderedDict across different versions of Python.
    • Fixed comparison to None warning.

    3.11 (2014-03-26)

    • Source and binary distributions are rebuilt against the latest versions of Cython and LibYAML.
    Commits
    • e471e86 Updates for 5.1 release
    • 9141e90 Windows Appveyor build
    • d6cbff6 Skip certain unicode tests when maxunicode not > 0xffff
    • 69103ba Update .travis.yml to use libyaml 0.2.2
    • 91c9435 Squash/merge pull request #105 from nnadeau/patch-1
    • 507a464 Make default_flow_style=False
    • 07c88c6 Allow to turn off sorting keys in Dumper
    • 611ba39 Include license file in the generated wheel package
    • 857dff1 Apply FullLoader/UnsafeLoader changes to lib3
    • 0cedb2a Deprecate/warn usage of yaml.load(input)
    • Additional commits viewable in compare view
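
    Since pyyaml is a direct dependency of this toolkit (it reads the meta.yml files), the practical consequence of the yaml.load deprecation noted in the commits above is choosing a loader explicitly. A generic PyYAML sketch, not toolkit code:

    import yaml

    with open('meta.yml') as f:
        meta = yaml.safe_load(f)   # or yaml.load(f, Loader=yaml.FullLoader)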

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot ignore this [patch|minor|major] version will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Building old conda environments

    In 2018 this env.yml file worked:

    name: gaitanalysis-dev-oldest
    channels:
      - defaults
    dependencies:
      - python =2.7.*
      - numpy =1.8.2
      - scipy =0.13.3
      - matplotlib =1.3.1
      - pytables =3.1.1
      - pandas =0.13.1
      - pyyaml =3.10
      - nose =1.3.1
      - sphinx =1.2.2
      - coverage
      - mock =1.0.1
      - numpydoc =0.4
      - pip:
        - dynamicisttoolkit==0.4.0
        - oct2py==2.4.2
    

    But apparently the "free" channel was moved from "defaults", so this is now required:

    name: gaitanalysis-dev-oldest
    channels:
      - free
      - defaults
    dependencies:
      - python =2.7.*
      - numpy =1.8.2
      - scipy =0.13.3
      - matplotlib =1.3.1
      - pytables =3.1.1
      - pandas =0.13.1
      - pyyaml =3.10
      - nose =1.3.1
      - sphinx =1.2.2
      - coverage
      - mock =1.0.1
      - numpydoc =0.4
      - pip:
        - dynamicisttoolkit==0.4.0
        - oct2py==2.4.2
    

    conda will resolve the packages in this updated version.

    opened by moorepants 0
  • argmin is deprecated

    gaitanalysis.tests.test_motek.TestDFlowData.test_extract_processed_data ... /home/moorepants/miniconda3/envs/gaitanalysistoolkit-dev/lib/python2.7/site-packages/numpy/core/fromnumeric.py:51: FutureWarning: 'argmin' is deprecated, use 'idxmin' instead. The behavior of 'argmin'
    will be corrected to return the positional minimum in the future.
    Use 'series.values.argmin' to get the position of the minimum now.
      return getattr(obj, method)(*args, **kwds)
    
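    The replacements suggested in the warning, shown on a toy Series (a sketch, not code from the test suite):

    import pandas as pd

    s = pd.Series([3.0, 1.0, 2.0], index=['a', 'b', 'c'])

    s.idxmin()         # 'b' -- the index label of the minimum
    s.values.argmin()  # 1   -- the positional minimum
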
    opened by moorepants 0
  • Non-tuple multidimensional indexing deprecated.

    gaitanalysis.tests.test_gait.test_find_constant_speed ... /home/moorepants/miniconda3/envs/gaitanalysistoolkit-dev/lib/python2.7/site-packages/scipy/signal/_arraytools.py:45: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.
      b = a[a_slice]
    
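    The fix the warning asks for, in isolation (a toy example, not the scipy internals that trigger it):

    import numpy as np

    a = np.arange(12).reshape(3, 4)
    seq = [slice(0, 2), slice(1, 3)]

    b = a[tuple(seq)]  # preferred over a[seq], which triggers the FutureWarning
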
    opened by moorepants 0
  • Pandas deprecating get_store()

    /home/moorepants/miniconda3/envs/gaitanalysistoolkit-dev/lib/python2.7/unittest/case.py:393: FutureWarning: get_store is deprecated and be removed in a future version
    HDFStore(path, **kwargs) is the replacement
      return self.run(*args, **kwds)
    
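    The replacement named in the warning, sketched with hypothetical file and key names:

    import pandas as pd

    with pd.HDFStore('walking_data.h5') as store:
        store.put('gait_cycles', pd.DataFrame({'x': [1.0, 2.0]}))
        df = store['gait_cycles']
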
    opened by moorepants 1
  • NumPy lstsq rcond deprecation

    /home/moorepants/src/GaitAnalysisToolKit/gaitanalysis/controlid.py:489: FutureWarning: `rcond` parameter will change to the default of machine precision times ``max(M, N)`` where M and N are the input matrix dimensions.
    To use the future default and silence this warning we advise to pass `rcond=None`, to keep using the old, explicitly pass `rcond=-1`.
      x, sum_of_residuals, rank, s = np.linalg.lstsq(A, b)
    
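    The two options the warning offers, shown on a small arbitrary least squares problem:

    import numpy as np

    A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    x_new, res, rank, sv = np.linalg.lstsq(A, b, rcond=None)  # future default
    x_old, res, rank, sv = np.linalg.lstsq(A, b, rcond=-1)    # old behavior
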
    opened by moorepants 0
  • Pandas is deprecating Panel

    Relevant warning from the tests:

    gaitanalysis.tests.test_controlid.TestSimpleControlSolver.test_compute_estimated_controls ... /home/moorepants/miniconda3/envs/gaitanalysistoolkit-dev/lib/python2.7/site-packages/nose/util.py:471: FutureWarning: 
    Panel is deprecated and will be removed in a future version.
    The recommended way to represent these types of 3-dimensional data are with a MultiIndex on a DataFrame, via the Panel.to_frame() method
    Alternatively, you can use the xarray package http://xarray.pydata.org/en/stable/.
    Pandas provides a `.to_xarray()` method to help automate this conversion.
    
      return func()
    
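    The representation the warning recommends in place of Panel, sketched with illustrative labels (gait cycle number and time) and an illustrative column name that are not taken from the real data:

    import pandas as pd

    index = pd.MultiIndex.from_product([[0, 1], [0.00, 0.01, 0.02]],
                                       names=['cycle', 'time'])
    df = pd.DataFrame({'RKneeAngle': [0.1, 0.2, 0.3, 0.1, 0.2, 0.3]}, index=index)

    df.xs(1, level='cycle')  # pull out one gait cycle, like indexing a Panel item
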
    opened by moorepants 1

Releases (v0.2.0)
  • v0.2.0 (Jun 19, 2021)

    • Support Python 3. [PR #149_]
    • Minimum dependencies bumped to Ubuntu 14.04 LTS versions and tests run on latest conda forge packages as of 2018/08/30. [PR #140_]
    • The minimum version of the required dependency, DynamicistToolKit, was bumped to 0.4.0. [PR #134_]
    • Reworked the DFlowData class so that interpolation and resampling is based on the FrameNumber column in the mocap data instead of the unreliable TimeStamp column. [PR #135_]
    • Added note and setup.py check about higher oct2py versions required for Windows.

    .. _#149: https://github.com/csu-hmc/GaitAnalysisToolKit/pull/149
    .. _#134: https://github.com/csu-hmc/GaitAnalysisToolKit/pull/134
    .. _#135: https://github.com/csu-hmc/GaitAnalysisToolKit/pull/135
    .. _#140: https://github.com/csu-hmc/GaitAnalysisToolKit/pull/140

  • v0.1.2 (Dec 8, 2014)

  • v0.1.1 (Dec 8, 2014)

  • v0.1.0 (Dec 1, 2014)

Owner
CSU Human Motion and Control Lab