Overview
Hand Physics Toolkit (HPTK) is a toolkit for implementing hand-driven interactions in a modular and scalable way. It is platform-independent, input-independent and scale-independent, and can be combined with MRTK-Quest for UI interactions.
This documentation is under construction.
Are you unable to find what you need? Please report it in our community.
You can clone a ready-to-go project at HPTK-Sample.
- Data model to access parts, components or calculated values with very little code (see the first sketch after this list)
- Code architecture based on MVC-like modules, with support for custom modules (see the second sketch after this list)
- Platform-independent. Tested on VR/AR/non-XR applications
- Input-independent. Use hand tracking or controllers
- Puppeteering for any avatar or body structure
- Scale-independent. Valid for any hand size
- Realistic, configurable hand physics
- Define strategies to deal with tracking loss
- Physics-based touch/grab detection
- Tracking noise smoothing
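
The data model and the physics-based touch detection are meant to be consumed from regular scripts. The snippet below is a minimal sketch of that idea; the types and member names (`HandData`, `FingerData`, `pinchLerp`, `isTouching`) are stand-ins defined here for illustration, not the actual HPTK classes.

```csharp
using UnityEngine;

// Illustrative sketch only: these types are stand-ins defined for this
// example, not the real HPTK data model. The point is that parts and
// calculated values hang off a single model object, so client code stays short.
public class FingerData
{
    public string name;
    public float flexLerp;   // calculated value: 0 = extended, 1 = fully flexed
    public bool isTouching;  // set by physics-based touch detection
}

public class HandData
{
    public FingerData[] fingers = new FingerData[0];
    public float pinchLerp;  // calculated pinch amount between thumb and index
}

public class HandDebugReader : MonoBehaviour
{
    public HandData hand = new HandData();

    void Update()
    {
        // Reading a calculated value takes one line.
        Debug.Log($"Pinch: {hand.pinchLerp:0.00}");

        // Touch state can be polled per finger.
        foreach (FingerData finger in hand.fingers)
        {
            if (finger.isTouching)
                Debug.Log($"{finger.name} is touching a physical surface");
        }
    }
}
```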
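To illustrate the MVC-like module idea, here is a sketch of a tiny custom module split into a model (data), a view (events) and a controller (logic). All class names and the `UpdateModule` entry point are assumptions made for this example; the real HPTK base classes and module registration flow may differ.

```csharp
using System;
using UnityEngine;

// Sketch of an MVC-like custom module. Every class here is a stand-in
// written for this example, not part of the verified HPTK API.
[Serializable]
public class GraspModel                       // Model: plain data and calculated values
{
    public float graspLerp;                   // 0 = open hand, 1 = closed fist
    public bool isGrasping;
}

public class GraspView : MonoBehaviour       // View: exposes events to UI/other systems
{
    public event Action OnGraspBegin;
    public event Action OnGraspEnd;

    public void EmitBegin() => OnGraspBegin?.Invoke();
    public void EmitEnd() => OnGraspEnd?.Invoke();
}

public class GraspController : MonoBehaviour // Controller: updates the model, drives the view
{
    public GraspModel model = new GraspModel();
    public GraspView view;

    [Range(0f, 1f)] public float graspThreshold = 0.8f;

    // Called with a measured grasp amount (e.g. derived from finger flexion).
    public void UpdateModule(float measuredGraspLerp)
    {
        model.graspLerp = measuredGraspLerp;

        bool wasGrasping = model.isGrasping;
        model.isGrasping = measuredGraspLerp >= graspThreshold;

        if (model.isGrasping && !wasGrasping) view.EmitBegin();
        if (!model.isGrasping && wasGrasping) view.EmitEnd();
    }
}
```

Splitting the module this way keeps the data serializable and testable on its own, while other systems only subscribe to the view's events.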
Supported Unity versions:
- Unity 2021.x
- Unity 2020.x
- Unity 2019.x

Supported XR platforms:
- Oculus Quest 1/2 - Android
- HoloLens 2 - UWP
- Leap Motion - PC
- WebXR - Web

Supported controllers:
- Oculus Touch
- WMR
- Vive
- OpenVR

Supported render pipelines:
- Universal Render Pipeline (URP)
- Standard RP
Getting started:
1. Obtain HPTK
2. Change ProjectSettings & BuildSettings
3. Import the built-in integration package (if needed)
4. Drag & drop the default setup into your scene
5. Build and test
Check Setup for a detailed step-by-step guide.
Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
References:
Kiran Nasim and Young J. Kim. 2016. Physics-based Interactive Virtual Grasping. In Proceedings of HCI Korea (HCIK '16). Hanbit Media, Inc., Seoul, KOR, 114–120. DOI: https://doi.org/10.17210/hcik.2016.01.114
Linn, Allison. 2016. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog. Microsoft. https://blogs.microsoft.com/