# Overview

**Hand Physics Toolkit (HPTK)** is a toolkit for implementing hand-driven interactions in a modular and scalable way. It is platform-independent, input-independent and scale-independent, and it can be combined with [MRTK-Quest](https://github.com/provencher/MRTK-Quest) for UI interactions.

{% hint style="info" %}
**This documentation is under construction.**

Are you unable to find what you need? Please report it in our [community](broken://pages/-MYtw2246pyX-qG_uGZT).
{% endhint %}

You can clone a **ready-to-go project** at [HPTK-Sample](https://github.com/jorgejgnz/HPTK-Sample).


## Main features

* **Data model** to access parts, components or calculated values with **very little code** (see the sketch after this list)
* **Code architecture** based on MVC-like modules. Support for **custom modules**
* **Platform-independent**. Tested on VR/AR/non-XR applications
* **Input-independent**. Use hand tracking or controllers
* **Scale-independent**. Valid for any hand size
* **State-of-the-art** configurable **hand physics**
* Define strategies to deal with tracking loss
* Physics-based touch/grab detection
* Tracking noise smoothing
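
As a rough illustration of the data model, the sketch below reads a calculated per-finger value from a hand model component. The type and member names used here (`ProxyHandModel`, `index`, `pinchLerp`, the namespace) are assumptions for illustration, not the verified HPTK API; check the module documentation for the actual names.

```csharp
using UnityEngine;
using HPTK.Models.Avatar; // assumed namespace; adjust to your HPTK version

// Minimal sketch (not the verified HPTK API): reads a calculated value from the
// hand data model with very little code.
public class PinchLogger : MonoBehaviour
{
    public ProxyHandModel hand; // assign the hand's data model in the Inspector

    void Update()
    {
        // Assumed calculated value: normalized thumb-index pinch strength (0..1)
        float pinch = hand.index.pinchLerp;

        if (pinch > 0.9f)
            Debug.Log($"Index pinch: {pinch:F2}");
    }
}
```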

## Supported versions

* Unity 2020.x
* Unity 2019.x

## Supported input

### Hand tracking

* Oculus Quest 1/2 - Android
* HoloLens 2 - UWP

### Controllers

* Oculus Touch
* Windows Mixed Reality (WMR)
* Vive
* OpenVR

## Supported render pipelines

* Universal Render Pipeline (URP)
* Standard RP

## Getting started with HPTK

1. Obtain *HPTK*.
2. Import *Oculus Integration*.
3. Configure *Build Settings* (Oculus Quest).
4. Configure *Project Settings* (!).
5. Setup a scene with *hand tracking support* (Oculus Quest).
6. Setup *HPTK specific components*.
7. Setup *platform specific HPTK components* (Oculus Quest).
8. Modify/Create *HPTK Configuration Assets* (if needed).

Check [Setup](broken://pages/-MYsizSJYpGqiRe1h4YZ) for a detailed **step-by-step guide**.
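
Step 4 (*Project Settings*) is mostly about physics configuration. The exact values HPTK expects are listed in the Setup guide; the sketch below only shows how you might log the settings that jointed hand physics is typically sensitive to, as a quick sanity check during development.

```csharp
using UnityEngine;

// Development-time sanity check: logs physics Project Settings that articulated
// hand physics is typically sensitive to. The required values are documented in
// the Setup guide; nothing here changes your project configuration.
public class PhysicsSettingsCheck : MonoBehaviour
{
    void Awake()
    {
        Debug.Log($"Fixed timestep: {Time.fixedDeltaTime}");
        Debug.Log($"Default solver iterations: {Physics.defaultSolverIterations}");
        Debug.Log($"Default solver velocity iterations: {Physics.defaultSolverVelocityIterations}");
        Debug.Log($"Default contact offset: {Physics.defaultContactOffset}");
    }
}
```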

## Author

**Jorge Juan González** - *HCI Researcher at I3A (University of Castilla-La Mancha)*

[LinkedIn](https://www.linkedin.com/in/jorgejgnz/) - [Twitter](https://twitter.com/jorgejgnz) - [GitHub](https://github.com/jorgejgnz)

### Acknowledgements

**Oxters Wyzgowski** - [GitHub](https://github.com/oxters168) - [Twitter](https://twitter.com/OxGamesCo)

**Michael Stevenson** - [GitHub](https://github.com/mstevenson)

Kiran Nasim and Young J. Kim. 2016. [Physics-based Interactive Virtual Grasping](https://dl.acm.org/doi/10.17210/hcik.2016.01.114). In Proceedings of HCI Korea (HCIK '16). Hanbit Media, Inc., Seoul, KOR, 114–120. DOI: <https://doi.org/10.17210/hcik.2016.01.114>

Linn, Allison. 2016. [Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse](https://blogs.microsoft.com/ai/talking-hands-microsoft-researchers-moving-beyond-keyboard-mouse/). The AI Blog. Microsoft.

## License

[MIT](https://github.com/jorgejgnz/HPTK/blob/master/LICENSE.md)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://jorge-jgnz94.gitbook.io/hptk/master.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
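
For reference, such a query could be issued from a Unity script as sketched below using `UnityWebRequest`; the question text is only an example.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: sends an `ask` query to the documentation endpoint described above
// and logs the response. The question string is just an example.
public class DocsQuery : MonoBehaviour
{
    IEnumerator Start()
    {
        string question = UnityWebRequest.EscapeURL("How do I enable hand tracking on Oculus Quest?");
        string url = $"https://jorge-jgnz94.gitbook.io/hptk/master.md?ask={question}";

        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            yield return request.SendWebRequest();

            // On Unity 2020.2+ prefer request.result; these flags also work on 2019.x.
            if (request.isNetworkError || request.isHttpError)
                Debug.LogError(request.error);
            else
                Debug.Log(request.downloadHandler.text); // direct answer plus excerpts/sources
        }
    }
}
```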
