The OpenXR hand tracking¶
Hand tracking is the process by which the position and orientation of the player's hands are tracked, including the orientation of the player's fingers. We can identify three categories of this:
One, hand tracking through external sensors such as cameras. This is what HoloLens, Quest, UltraLeap and similar devices do. This often results in very accurate tracking of all fingers of the player's hands.
Two, hand tracking through VR gloves. This method is still mostly experimental but is likely to gain popularity soon. Gloves often have good finger tracking capabilities, but their real selling point is the ability to restrict movement, which allows for the sensation of touch. Gloves are often also recognised as controllers and will often have additional controls, such as buttons, embedded.
Three, inferred hand tracking. This has been the de facto approach since the early days of VR. As we know the player is holding a controller and we know the position of this controller, we can infer where to render the player's hand. Fingers can be positioned based on the controls the player is interacting with. Many modern VR controllers have additional sensors to help determine finger positions on the controller.
Traditionally, inferred hand tracking has been the responsibility of the game; however, the principles behind the action map have somewhat limited the viable options here. Valve's SteamVR is currently the only XR runtime that has implemented inferred hand tracking as part of the hand tracking extension. There is an expectation that other XR runtimes will follow this example in the near future.
Until then, if your game depends on inferred hand tracking, we recommend using the hand assets that are part of Godot XR Tools.
Tracking through interaction profiles¶
Tracking the location and state of controllers is performed through interaction profiles. Bindings can be set within the action map.
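As an illustration, a minimal GDScript sketch of polling a controller in a Godot 4 project might look like the following. It assumes OpenXR is enabled, the controller node sits under an XROrigin3D, and the action map defines a "trigger_click" action (as the default action map does); the action name is an assumption of this sketch.

```gdscript
# Attached to an XRController3D node under an XROrigin3D.
# "left_hand" is the tracker name Godot maps to the OpenXR
# /user/hand/left top level path.
extends XRController3D

func _ready() -> void:
	tracker = "left_hand"

func _process(_delta: float) -> void:
	# Actions defined in the action map can be polled by name.
	if is_button_pressed("trigger_click"):
		print("Trigger pressed on ", tracker)
```

The same node can instead connect to the `button_pressed` signal if event-driven handling is preferred over polling.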
However, it is important to realise that in OpenXR, controllers are bound to paths indicating the usage of these controllers: the controller held in the player's left hand is bound to /user/hand/left, while the controller in the player's right hand is bound to /user/hand/right. And while not yet supported outside of the HTC tracker extension, it is likely OpenXR will be extended with paths such as /user/foot/right at some point.
This paradigm therefore raises the question of what happens to a controller that is not being held by a user. There is no definitive answer yet; this is still being debated and the specification may change in the near future. The behavior is thus undefined and can differ between platforms.
The most common behavior is that the controller remains bound regardless of whether the player is actually holding it.
However, there are runtimes, such as the Meta Quest runtime, that can unbind a controller when it is not being held by the user.
This may become the norm in the future, and the expectation is that the action map system will be enhanced accordingly.
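Because this behavior is runtime dependent, it can be worth checking whether a controller is currently being tracked at all. A small sketch, assuming Godot 4 where XRNode3D (the base class of XRController3D) exposes get_is_active():

```gdscript
# Attached to an XRController3D node. Hides the controller model
# while its tracker is inactive, e.g. on runtimes that unbind a
# controller the player has put down.
extends XRController3D

func _process(_delta: float) -> void:
	visible = get_is_active()
```

Driving visibility this way avoids rendering a controller frozen at its last known position when tracking is lost.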
The hand tracking extension¶
OpenXR has an extension that exposes hand tracking functionality. This extension allows a game to request the hand skeleton with all bone positions for each hand.
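In Godot 4 the hand tracking extension has to be enabled before it can be used. Assuming the project settings layout of recent 4.x releases (XR > OpenXR > Extensions > Hand Tracking in the editor), this corresponds to an entry like the following in project.godot; treat the exact key names as an assumption and verify them against your Godot version:

```ini
[xr]

openxr/enabled=true
openxr/extensions/hand_tracking=true
```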
The above image shows the joints that have to be provided by each XR runtime that implements this extension.
Currently Godot exposes this functionality through the OpenXRHand node.
This is a helper node that will retrieve the hand tracking data from OpenXR and apply it to a skeleton in Godot.
You select either the left or right hand through the hand property on this node.
The hand asset itself has to be provided by the developer and is thus separate from the OpenXRHand node.
You set the hand_skeleton property on the OpenXRHand node to the skeleton it needs to apply the tracking data to.
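Putting this together, here is a hedged sketch of setting up an OpenXRHand from script. The node names and the skeleton path are assumptions for illustration; the skinned hand mesh and its Skeleton3D must come from your own asset.

```gdscript
extends Node3D

func _ready() -> void:
	var hand := OpenXRHand.new()
	add_child(hand)
	# Track the left hand; use OpenXRHand.HAND_RIGHT for the right.
	hand.hand = OpenXRHand.HAND_LEFT
	# Point the node at the Skeleton3D it should drive each frame.
	# The path is relative to the OpenXRHand node and assumes a
	# sibling node holding the developer-provided hand mesh.
	hand.hand_skeleton = NodePath("../LeftHandMesh/Skeleton3D")
```

The same properties can of course be set in the editor inspector instead of from script.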