# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2014-present Juan Linietsky, Ariel Manzur and the Godot community (CC BY 3.0)
# This file is distributed under the same license as the Godot Engine package.
# FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Godot Engine latest\n"
"Report-Msgid-Bugs-To: \n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:4
msgid "OpenXR hand tracking"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:7
msgid "Introduction"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:11
msgid "This page focuses specifically on the feature set exposed through OpenXR. Parts of the functionality presented here also applies to WebXR and can by provided by other XR interfaces."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:15
msgid "When discussing hand tracking it is important to know that there are differences of opinion as to where lines are drawn. The practical result of this is that there are differences in implementation between the different OpenXR runtimes. You may find yourself in a place where chosen hardware doesn't support a piece of the puzzle or does things differently enough from the other platforms that you need to do extra work."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:20
msgid "That said, recent improvements to the OpenXR specification are closing these gaps and as platforms implement these improvements we are getting closer to a future where we have either full portability between platforms or at least a clear way to detect the capabilities of a platform."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:24
msgid "When we look at the early days of VR the focus of the major platforms was on tracked controller based input. Here we are tracking a physical device that also has buttons for further input. From the tracking data we can infer the location of the player's hands but no further information is known, traditionally it was left up to the game to implement a mechanism to display the player's hand and animate the fingers based on further input from the controller, be it due to buttons being pressed or through proximity sensors. Often fingers are also placed based on context, what the user is holding, and what action a user is performing."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:32
msgid "More recently optical hand tracking has become a popular solution, where cameras track the user's hands and full tracking data for the hand and finger positions becomes available. Many vendors saw this as completely separate from controller tracking and introduced independent APIs to access hand and finger positions and orientation data. When handling input, it was up to the game developer to implement a gesture detection mechanism."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:38
msgid "This split also exists in OpenXR, where controller tracking is handled primarily by the action map system, while optical hand tracking is primarily handled by the hand tracking API extension."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:41
msgid "However, the world is not that black and white and we're seeing a number of scenarios where we cross the line:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:44
msgid "Devices that fit in both categories, such as tracked gloves and controllers such as the Index controller that also perform finger tracking."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:46
msgid "XR Runtimes that implement inferred hand tracking from controller data as a means to solve proper finger placement for multiple controllers."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:48
msgid "XR applications that wish to seamlessly switch between controller and hand tracking offering the same user experience regardless of approach used."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:51
msgid "OpenXR is answering this call by introducing further extensions that lets us query the capabilities of the XR runtime/hardware or that add further functionality across this divide. The problem that currently does remain is that there are gaps in adopting these extensions, with some platforms thus not reporting capabilities to their full extent. As such you may need to test for the features available on specific hardware and adjust your approach accordingly."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:59
msgid "Demo project"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:61
msgid "The information presented on this page was used to create a demo project that can be found `here <https://github.com/godotengine/godot-demo-projects/tree/master/xr/openxr_hand_tracking_demo>`_."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:66
msgid "The Hand Tracking API"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:68
msgid "As mentioned in our introduction, the hand tracking API is primarily used with optical hand tracking and on many platforms only works when the user is not holding a controller. Some platforms support controller inferred hand tracking meaning that you will get hand tracking data even if the user is holding a controller. This includes SteamVR, Meta Quest (currently native only but Meta link support is likely coming), and hopefully soon others as well."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:75
msgid "The hand tracking implementation in Godot has been standardized around the Godot Humanoid Skeleton and works both in OpenXR and WebXR. The instructions below will thus work in both environments."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:78
msgid "In order to use the hand tracking API with OpenXR you first need to enable it. This can be done in the project settings:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:83
msgid "For some standalone XR devices you also need to configure the hand tracking extension in export settings, for instance for Meta Quest:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:88
msgid "Now you need to add 3 components into your scene for each hand:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:90
msgid "A tracked node to position the hand."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:91
msgid "A properly skinned hand mesh with skeleton."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:92
msgid "A skeleton modifier that applies finger tracking data to the skeleton."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:97
msgid "Hand tracking node"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:99
msgid "The hand tracking system uses separate hand trackers to track the position of the player's hands within our tracking space."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:102
msgid "This information has been separated out for the following use cases:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:104
msgid "Tracking happens in the local space of the :ref:`XROrigin3D <class_xrorigin3d>` node. This node must be a child of the `XROrigin3D` node in order to be correctly placed."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:106
msgid "This node can be used as an IK target when an upper body mesh with arms is used instead of separate hand meshes."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:108
msgid "Actual placement of the hands may be loosely bound to the tracking in scenarios such as avatar creation UIs, fake mirrors, or similar situations resulting in the hand mesh and finger tracking being localized elsewhere."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:112
msgid "We'll concentrate on the first use case only."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:114
msgid "For this you need to add an :ref:`XRNode3D <class_xrnode3d>` node to your ``XROrigin3D`` node."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:116
msgid "On this node the ``tracker`` should be set to ``/user/hand_tracker/left`` or ``/user/hand_tracker/right`` for the left or right hand respectively."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:118
msgid "The ``pose`` should remain set to ``default``, no other option will work here."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:119
msgid "The checkbox ``Show When Tracked`` will automatically hide this node if no tracking data is available, or make this node visible if tracking data is available."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:123
msgid "Rigged hand mesh"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:125
msgid "In order to display our hand we need a hand mesh that is properly rigged and skinned. For this Godot uses the hand bone structure as defined for the :ref:`Godot Humanoid <class_skeletonprofilehumanoid>` but optionally supporting an extra tip bone for each finger."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:129
msgid "The `OpenXR hand tracking demo <https://github.com/godotengine/godot-demo-projects/tree/master/xr/openxr_hand_tracking_demo>`_ contains example glTF files of properly rigged hands."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:132
msgid "We will be using those here and add them as a child to our ``XRNode3D`` node. We also need to enable editable children to gain access to our :ref:`Skeleton3D <class_skeleton3d>` node."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:136
msgid "The hand skeleton modifier"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:138
msgid "Finally we need to add an :ref:`XRHandModifier3D <class_xrhandmodifier3d>` node as a child to our ``Skeleton3D`` node. This node will obtain the finger tracking data from OpenXR and apply it the hand model."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:141
msgid "You need to set the ``Hand Tracker`` property to either ``/user/hand_tracker/left`` or ``/user/hand_tracker/right`` depending on whether we are apply the tracking data of respectively the left or right hand."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:144
msgid "You can also set the ``Bone Update`` mode on this node."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:146
msgid "``Full`` applies the hand tracking data fully. This does mean that the skeleton positioning will potentially reflect the size of the actual hand of the user. This can lead to scrunching effect if meshes aren't weighted properly to account for this. Make sure you test your game with players of all sizes when optical hand tracking is used!"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:150
msgid "``Rotation Only`` will only apply rotation to the bones of the hands and keep the bone length as is. In this mode the size of the hand mesh doesn't change."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:153
msgid "With this added, when we run the project we should see the hand correctly displayed if hand tracking is supported."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:156
msgid "The hand tracking data source"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:158
msgid "This is an OpenXR extension that provides information about the source of the hand tracking data. At this moment only a few runtimes implement it but if it is available, Godot will activate it."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:161
msgid "If this extension is not supported and thus unknown is returned, you can make the following assumptions:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:163
msgid "If you are using SteamVR (including Steam link), only controller based hand tracking is supported."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:164
msgid "For any other runtime, if hand tracking is supported, only optical hand tracking is supported (Note, Meta Link currently fall into this category)."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:166
msgid "In all other cases, no hand tracking is supported at all."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:168
msgid "You can access this information through code:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:188
msgid "This example logs the state for the left hand."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:190
msgid "If in this example no hand tracker is returned by ``get_tracker``, this means the hand tracking API is not supported on the XR runtime at all."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:193
msgid "If there is a tracker but `has_tracking_data` is false, the user's hand is currently not being tracked. This is likely caused by one of the following reasons:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:196
msgid "The player's hand is not visible by any of the tracking cameras on the headset"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:197
msgid "The player is currently using a controller and the headset only supports optical hand tracking"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:198
msgid "The controller is turned off and only controller hand tracking is supported."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:201
msgid "Handling user input"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:203
msgid "Reacting to actions performed by the user is handled through :ref:`doc_xr_action_map` if controllers are used. In the action map you can map various inputs like the trigger or joystick on the controller to an action. This can then drive logic in your game."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:208
msgid "When hand tracking is used we originally had no such inputs, inputs are driven by gestures made by the user such as making a fist to grab or pinching the thumb and index finger together to select something. It was up to the game developer to implement this."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:213
msgid "Recognizing that there is an increasing demand for applications that can switch seamlessly between controller and hand tracking and the need some form of basic input capability, a number of extensions were added to the specification that provide some basic gesture recognition and can be used with the action map."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:219
msgid "The hand interaction profile"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:221
msgid "The `hand interaction profile extension <https://github.khronos.org/OpenXR-Inventory/extension_support.html#XR_EXT_hand_interaction>`_ is a new core extension which supports pinch, grasp, and poke gestures and related poses. There is still limited support for this extension but it should become available in more runtimes in the near future."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:228
msgid "The pinch gesture is triggered by pinching your thumb and index finger together. This is often used as a select gesture for menu systems, similar to using your controller to point at an object and press the trigger to select and is thus often mapped as such."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:232
msgid "The ``pinch pose`` is a pose positioned in the middle between the tip of the thumb and the tip of the index finger and oriented such that a ray cast can be used to identify a target."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:234
msgid "The ``pinch`` float input is a value between 0.0 (the tip of the thumb and index finger are apart) and 1.0 (the tip of the thumb and index finger are touching)."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:236
msgid "The ``pinch ready`` input is true when the tips of the fingers are (close to) touching."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:238
msgid "The grasp gesture is triggered by making a fist and is often used to pick items up, similar to engaging the squeeze input on controllers."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:241
msgid "The ``grasp`` float input is a value between 0.0 (open hand) and 1.0 (fist)."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:242
msgid "The ``grasp ready`` input is true when the user made a fist."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:244
msgid "The poke gesture is triggered by extending your index finger, this one is a bit of an exception as the pose at the tip of your index finger is often used to poke an interactable object. The ``poke pose`` is a pose positioned on the tip of the index finger."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:248
msgid "Finally the ``aim activate (ready)`` input is defined as an input that is 1.0/true when the index finger is extended and pointing at a target that can be activated. How runtimes interpret this, is not clear."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:252
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:279
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:321
msgid "With this setup the normal ``left_hand`` and ``right_hand`` trackers are used and you can thus seamlessly switch between controller and hand tracking input."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:257
msgid "You need to enable the hand interaction profile extension in the OpenXR project settings."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:260
msgid "Microsoft hand interaction profile"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:262
msgid "The `Microsoft hand interaction profile extension <https://github.khronos.org/OpenXR-Inventory/extension_support.html#XR_MSFT_hand_interaction>`_ was introduced by Microsoft and loosely mimics the simple controller profile. Meta has also added support for this extension but only on their native OpenXR client, it is currently not available over Meta Link."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:269
msgid "Pinch support is exposed through the ``select`` input, the value of which is 0.0 when the tip of the thumb and index finger are apart and 1.0 when they are together."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:273
msgid "Note that in this profile the ``aim pose`` is redefined as a pose between thumb and index finger, oriented so a ray cast can be used to identify a target."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:276
msgid "Grasp support is exposed through the ``squeeze`` input, the value of which is 0.0 when the hand is open, and 1.0 when a fist is made."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:283
msgid "HTC hand interaction profile"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:285
msgid "The `HTC hand interaction profile extension <https://github.khronos.org/OpenXR-Inventory/extension_support.html#XR_HTC_hand_interaction>`_ was introduced by HTC and is defined similarly to the Microsoft extension. It is only supported by HTC for the Focus 3 and Elite XR headsets."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:291
msgid "See the Microsoft hand interaction profile for the gesture support."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:293
msgid "The defining difference is that this extension introduces two new trackers, ``/user/hand_htc/left`` and ``/user/hand_htc/right``. This means that extra logic needs to be implemented to switch between the default trackers and the HTC specific trackers when the user puts down, or picks up, their controller."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:299
msgid "Simple controller profile"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:301
msgid "The simple controller profile is a standard core profile defined as a fallback profile when a controller is used for which no profile exists."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:304
msgid "There are a number of OpenXR runtimes that will mimic controllers through the simple controller profile when hand tracking is used."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:307
msgid "Unfortunately there is no sound way to determine whether an unknown controller is used or whether hand tracking is emulating a controller through this profile."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:312
msgid "XR runtimes are free to define how the simple controller profile operates, so there is also no certainty to how this profile is mapped to gestures."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:315
msgid "The most common mapping seems to be that ``select click`` is true when the tip of the thumb and index fingers are touching while the user's palm is facing away from the user. ``menu click`` will be true when tip of the thumb and index fingers are touching while the user's palm is facing towards the user."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:326
msgid "As some of these interaction profiles have overlap it is important to know that you can add each profile to your action map and the XR runtime will choose the best fitting profile."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:330
msgid "For instance, a Meta Quest supports both the Microsoft hand interaction profile and simple controller profile. If both are specified the Microsoft hand interaction profile will take precedence and will be used."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:335
msgid "The expectation is that once Meta supports the core hand interaction profile extension, that profile will take precedence over both Microsoft and simple controller profiles."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:340
msgid "Gesture based input"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:342
msgid "If the platform doesn't support any interaction profiles when hand tracking is used, or if you're building an application where you need more complicated gesture support you're going to need to build your own gesture recognition system."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:346
msgid "You can obtain the full hand tracking data through the :ref:`XRHandTracker <class_xrhandtracker>` resource for each hand. You can obtain the hand tracker by calling ``XRServer.get_tracker`` and using either ``/user/hand_tracker/left`` or ``/user/hand_tracker/left`` as the tracker. This resource provides access to all the joint information for the given hand."
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:351
msgid "Detailing out a full gesture recognition algorithm goes beyond the scope of this manual however there are a number of community projects you can look at:"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:354
msgid "`Julian Todd's Auto hands library <https://github.com/Godot-Dojo/Godot-XR-AH>`_"
msgstr ""
#: ../../docs/tutorials/xr/openxr_hand_tracking.rst:355
msgid "`Malcolm Nixons Hand Pose Detector <https://github.com/Malcolmnixon/GodotXRHandPoseDetector>`_"
msgstr ""