Human Action Controller - A human action controller running on different platforms.

Overview

Human Action Controller (HAC)

Goal

A human action controller running on different platforms.

Fun. Easy-to-use. Accurate. Anywhere.

Fun Examples

Mouse Control

Keyboard Control

Playing Game: Pikachu

Enhancing Interaction: Gather Town

Solutions provided by HAC

Platform      Module             Progress   Comment
PC / Win10    Mouse Control      V
PC / Win10    Keyboard Control   V
PC / Ubuntu   Mouse Control
PC / Ubuntu   Keyboard Control

A "V" in the Progress column marks a module that is already available.

Getting started

Installation

$ pip install pyhac

Run the demo of mouse control

$ git clone https://github.com/dabit-lucas/hac.git
$ cd hac
$ python demo.py

Recording custom actions

$ python recording.py -d {action name} -k True

Press the "r" key to start recording; the recorded data will be saved to ./data.
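For example, to record the right-hand "five" gesture used by the mouse control module (the action name is up to you, as long as it matches the name used in your config later):

$ python recording.py -d r_five -k True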

Training a custom module

Here is an example of an action-set config file:

{
    "actions": [
        "r_five",
        "r_zero",
        "l_five",
        "l_zero",
        "two_index_fingers_up",
        "two_index_fingers_down",
        "33",
        "55",
        "sit"
    ],
    "type": "gesture_only"
}
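The names in the "actions" list should match the action names used when recording data with recording.py, so the training script can locate the corresponding recordings under ./data. The "type" field appears to select the input modality; "gesture_only" suggests that only hand-gesture features are used.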

A model covering these actions is built by running the training process:

$ python train.py --conf {path_of_action} --model_name {name_of_model}
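For example, assuming the config above is saved as conf/custom_gestures.json and the model should be named custom_gestures (both names are illustrative), the command would be:

$ python train.py --conf conf/custom_gestures.json --model_name custom_gestures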

The generated model becomes a module. Taking the mouse control module as an example, mappings between actions and controls can be created with the following code:

mouse_module = hac.add_module("mouse_control")
hac.set_init_module(mouse_module)

# create mapping between controls and actions
mouse_module.add_mouse_mapping("mouse_left_down", ["r_five", "r_zero"])
mouse_module.add_mouse_mapping("mouse_left_up", "r_five")
mouse_module.add_mouse_mapping("mouse_right_down", ["l_five", "l_zero"])
mouse_module.add_mouse_mapping("mouse_right_up", "l_five")
mouse_module.add_mouse_mapping("right_move_diff", ["r_five", "r_five"])
mouse_module.add_mouse_mapping("right_move_diff", ["r_zero", "r_zero"])
mouse_module.add_mouse_mapping("left_move_diff", ["l_five", "l_five"])
mouse_module.add_mouse_mapping("left_move_diff", ["l_zero", "l_zero"])
mouse_module.add_mouse_mapping("roll_up", "two_index_fingers_up")
mouse_module.add_mouse_mapping("roll_down", "two_index_fingers_down") 

If the right-hand "five" gesture appears in two consecutive frames (["r_five", "r_five"]), the right_move_diff control is triggered, which moves the mouse cursor. This rule is expressed by the following line:

mouse_module.add_mouse_mapping("right_move_diff", ["r_five", "r_five"])
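Conceptually, each mapping is a lookup keyed by the gestures recognized in recent frames: a single gesture triggers its control immediately, while a two-element list requires the gesture to appear in two consecutive frames. The following standalone sketch only illustrates that idea; the class and method names are invented here and are not HAC's actual implementation:

from collections import deque

class GestureMapper:
    """Illustrative sketch only; not HAC's real implementation."""

    def __init__(self, history_len=2):
        self.history = deque(maxlen=history_len)  # gestures from the most recent frames
        self.rules = {}                           # gesture sequence -> control name

    def add_mapping(self, control, gestures):
        # Accept a single gesture or a list of consecutive gestures.
        key = (gestures,) if isinstance(gestures, str) else tuple(gestures)
        self.rules[key] = control

    def update(self, gesture):
        # Called once per frame with the recognized gesture; returns the control to trigger.
        self.history.append(gesture)
        for length in range(len(self.history), 0, -1):  # prefer longer (more specific) rules
            key = tuple(self.history)[-length:]
            if key in self.rules:
                return self.rules[key]
        return None

mapper = GestureMapper()
mapper.add_mapping("right_move_diff", ["r_five", "r_five"])
mapper.add_mapping("mouse_left_down", ["r_five", "r_zero"])

print(mapper.update("r_five"))  # None: only one frame of r_five so far
print(mapper.update("r_five"))  # right_move_diff: r_five in two consecutive frames
print(mapper.update("r_zero"))  # mouse_left_down: r_five followed by r_zero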

Development guideline

The structure of HAC

Community

Feel free to ask any question in the issues.

Contributing

Any contribution is welcome. Please fork this repo and submit a pull request.

