Overview
Hand Physics Toolkit (HPTK) is a toolkit for implementing hand-driven interactions in a modular, scalable way. It is platform-independent, input-independent, and scale-independent, and can be combined with MRTK-Quest for UI interactions.
This documentation is under construction.
Can't find what you need? Please report it in our community.
You can clone a ready-to-go project at HPTK-Sample.
Main features
Data model to access parts, components, or calculated values with very little code
Code architecture based on MVC-like modules. Support for custom modules
Platform-independent. Tested on VR/AR/non-XR applications
Input-independent. Use hand tracking or controllers
Scale-independent. Valid for any hand size
State-of-the-art configurable hand physics
Define strategies to deal with tracking loss
Physics-based touch/grab detection
Tracking noise smoothing
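As a rough illustration of how a data model like this keeps interaction code short, the Unity C# sketch below reads a hand part, one of its components, and a calculated value from a single model reference. All type and member names here (`HandModel`, `index.tip`, `pinchLerp`) are assumptions made for this example, not HPTK's actual API; check the Setup guide for the real component names.

```csharp
using UnityEngine;

// Illustrative sketch only: the HandModel type and its members are
// hypothetical stand-ins for HPTK's data model, not its real API.
public class PinchLogger : MonoBehaviour
{
    // Hypothetical reference to the hand's data model, assigned in the Inspector.
    public HandModel hand;

    void Update()
    {
        // A part (the index fingertip), a component (its Transform),
        // and a calculated value (pinch strength) in a few lines.
        Vector3 tipPosition = hand.index.tip.transform.position; // hypothetical path
        float pinch = hand.pinchLerp;                            // hypothetical 0..1 value

        if (pinch > 0.9f)
            Debug.Log("Pinching near " + tipPosition);
    }
}
```

The point of the pattern is that gameplay scripts depend only on the model, so the same code works regardless of which input module (hand tracking or controllers) feeds it.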
Supported versions
Unity 2020.x
Unity 2019.x
Supported input
Hand tracking
Oculus Quest 1/2 - Android
HoloLens 2 - UWP
Controllers
Oculus Touch
WMR
Vive
OpenVR
Supported render pipelines
Universal Render Pipeline (URP)
Standard RP
Getting started with HPTK
Obtain HPTK.
Import Oculus Integration.
Configure Build Settings (Oculus Quest).
Configure Project Settings (!).
Setup a scene with hand tracking support (Oculus Quest).
Setup HPTK specific components.
Setup platform specific HPTK components (Oculus Quest).
Modify/Create HPTK Configuration Assets (if needed).
Check Setup for a detailed step-by-step guide.
Author
Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Acknowledgements
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
Kiran Nasim and Young J. Kim. 2016. Physics-based Interactive Virtual Grasping. In Proceedings of HCI Korea (HCIK '16). Hanbit Media, Inc., Seoul, KOR, 114–120. DOI: https://doi.org/10.17210/hcik.2016.01.114
Linn, Allison. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog. Microsoft, 2016. https://blogs.microsoft.com/
License