Overview
Hand Physics Toolkit (HPTK) is a toolkit for implementing hand-driven interactions in a modular and scalable way. It is platform-independent, input-independent and scale-independent, and it can be combined with MRTK-Quest for UI interactions.
This documentation is under construction.
Can't find what you need? Please report it in our community.
You can clone a ready-to-go project at HPTK-Sample.
Main features

Data model to access parts, components and calculated values with very little code
Code architecture based on MVC-like modules, with support for custom modules
Platform-independent: tested on VR, AR and non-XR applications
Input-independent: use hand tracking or controllers
Scale-independent: valid for any hand size
State-of-the-art configurable hand physics
Configurable strategies to deal with tracking loss
Physics-based touch/grab detection
Tracking noise smoothing
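To illustrate how a data model like this keeps interaction code short, the sketch below reads a fingertip position and a calculated grab value from a hand model in a Unity component. The type and member names used here (`HandModel`, `fingers`, `fingerTip`, `transformRef`, `isGrabbing`) are illustrative assumptions, not the exact HPTK API; check the Setup pages for the real names in your HPTK version.

```csharp
using UnityEngine;

// Hypothetical component built on an HPTK-like data model.
// HandModel and the members accessed below are assumptions for
// illustration; consult the HPTK documentation for the actual types.
public class GrabLogger : MonoBehaviour
{
    public HandModel hand; // hand data model, assigned in the Inspector

    void Update()
    {
        // Read a hand part through the model instead of walking the scene hierarchy.
        Vector3 indexTip = hand.fingers[1].fingerTip.transformRef.position;

        // Read a calculated value that the toolkit keeps up to date.
        if (hand.isGrabbing)
            Debug.Log("Grabbing near " + indexTip);
    }
}
```

The point of the pattern is that parts, components and derived values are reachable from one model reference, so interaction scripts stay a few lines long.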
Supported Unity versions

Unity 2020.x
Unity 2019.x

Supported devices

Oculus Quest 1/2 - Android
HoloLens 2 - UWP

Supported controllers

Oculus Touch
WMR
Vive
OpenVR

Supported render pipelines

Universal Render Pipeline (URP)
Standard RP
Getting started

1. Obtain HPTK.
2. Import Oculus Integration.
3. Configure Build Settings (Oculus Quest).
4. Configure Project Settings (!).
5. Set up a scene with hand tracking support (Oculus Quest).
6. Set up HPTK-specific components.
7. Set up platform-specific HPTK components (Oculus Quest).
8. Modify or create HPTK configuration assets (if needed).

Check Setup for a detailed step-by-step guide.
Authors

Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
References

Kiran Nasim and Young J. Kim. 2016. Physics-based Interactive Virtual Grasping. In Proceedings of HCI Korea (HCIK '16). Hanbit Media, Inc., Seoul, KOR, 114–120. DOI: https://doi.org/10.17210/hcik.2016.01.114

Allison Linn. 2016. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog, Microsoft. https://blogs.microsoft.com/