Various grammar and spelling fixes

A Thousand Ships
2025-06-23 16:37:26 +02:00
parent b2de954be7
commit 92cd36b50e
49 changed files with 84 additions and 84 deletions

View File

@@ -189,7 +189,7 @@ Every face whose front-side is hit by the light rays is lit, while the others
stay dark. Unlike most other light types, directional lights don't have specific
parameters.
The directional light also offers a **Angular Distance** property, which
The directional light also offers an **Angular Distance** property, which
determines the light's angular size in degrees. Increasing this above ``0.0``
will make shadows softer at greater distances from the caster, while also
affecting the sun's appearance in procedural sky materials. This is called a
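
As a side note, the same property can be set from a script. A minimal sketch,
assuming a ``DirectionalLight3D`` node at the (hypothetical) path below:

.. code-block:: gdscript

    func _ready():
        # Give the directional light an angular size of half a degree to
        # soften shadows far from the caster.
        $DirectionalLight3D.light_angular_distance = 0.5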

View File

@@ -24,7 +24,7 @@ and then use an ``AnimationTree`` to control the playback.
``AnimationPlayer`` and ``AnimationTree`` can be used in both 2D and 3D scenes. When importing 3D scenes and their animations, you can use
`name suffixes <https://docs.godotengine.org/en/stable/tutorials/assets_pipeline/importing_3d_scenes/node_type_customization.html#animation-loop-loop-cycle>`_
to simplify the process and import with the correct properties. At the end, the imported Godot scene will contain the animations in a ``AnimationPlayer`` node.
to simplify the process and import with the correct properties. At the end, the imported Godot scene will contain the animations in an ``AnimationPlayer`` node.
Since you rarely use imported scenes directly in Godot (they are either instantiated or inherited from), you can place the ``AnimationTree`` node in your
new scene which contains the imported one. Afterwards, point the ``AnimationTree`` node to the ``AnimationPlayer`` that was created in the imported scene.
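
A minimal sketch of that last step from a script, assuming the ``anim_player``
property and hypothetical node names for the imported scene:

.. code-block:: gdscript

    @onready var anim_tree: AnimationTree = $AnimationTree

    func _ready():
        # Point the AnimationTree at the AnimationPlayer created inside the
        # imported scene (node names here are hypothetical).
        anim_tree.anim_player = ^"../ImportedScene/AnimationPlayer"
        anim_tree.active = true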

View File

@@ -169,7 +169,7 @@ to another format for viewing on the web or by Godot with the VideoStreamPlayer
node. MJPEG does not support transparency. AVI output is currently limited to a
file of 4 GB in size at most.
To use AVI, specify a path to an ``.avi`` file to be created in the
To use AVI, specify a path to a ``.avi`` file to be created in the
**Editor > Movie Writer > Movie File** project setting.
PNG

View File

@@ -22,7 +22,7 @@ as it was too buggy and difficult to maintain.
.. note::
You may find videos with an ``.ogg`` or ``.ogx`` extensions, which are generic
You may find videos with a ``.ogg`` or ``.ogx`` extensions, which are generic
extensions for data within an Ogg container.
Renaming these file extensions to ``.ogv`` *may* allow the videos to be
@@ -34,7 +34,7 @@ Setting up VideoStreamPlayer
1. Create a VideoStreamPlayer node using the Create New Node dialog.
2. Select the VideoStreamPlayer node in the scene tree dock, go to the inspector
and load an ``.ogv`` file in the Stream property.
and load a ``.ogv`` file in the Stream property.
- If you don't have your video in Ogg Theora format yet, jump to
:ref:`doc_playing_videos_recommended_theora_encoding_settings`.
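
The same setup can also be done from a script. A minimal sketch, assuming a
video already imported into the project at a hypothetical path:

.. code-block:: gdscript

    func _ready():
        var player := VideoStreamPlayer.new()
        player.stream = load("res://video.ogv")  # Hypothetical path.
        add_child(player)
        player.play()
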
@@ -183,7 +183,7 @@ maximize the quality of the output Ogg Theora video, but this can require a lot
of disk space.
`FFmpeg <https://ffmpeg.org/>`__ (CLI) is a popular open source tool
for this purpose. FFmpeg has a steep learning curve, but it's powerful tool.
for this purpose. FFmpeg has a steep learning curve, but it's a powerful tool.
Here are example FFmpeg commands to convert an MP4 video to Ogg Theora. Since
FFmpeg supports a lot of input formats, you should be able to use the commands

View File

@@ -105,7 +105,7 @@ Contiguous memory stores imply the following operation performance:
though. Done by re-sorting the Array after every edit and writing an
ordered-aware search algorithm.
Godot implements Dictionary as an ``HashMap<Variant, Variant, VariantHasher, StringLikeVariantComparator>``. The engine
Godot implements Dictionary as a ``HashMap<Variant, Variant, VariantHasher, StringLikeVariantComparator>``. The engine
stores a small array (initialized to 2^3 or 8 records) of key-value pairs. When
one attempts to access a value, they provide it a key. It then *hashes* the
key, i.e. converts it into a number. The "hash" is used to calculate the index
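
On the scripting side, this is the difference between a hashed lookup and an
ordered search. A minimal sketch:

.. code-block:: gdscript

    func _ready():
        # Dictionary: the key is hashed, so a lookup doesn't scan the container.
        var stats := {"hp": 10, "mp": 4}
        print(stats["hp"])

        # Array: keep it sorted and use a binary search instead of a linear scan.
        var ids := [7, 3, 12, 1]
        ids.sort()
        print(ids.bsearch(7))  # Index of 7 in the sorted array.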

View File

@@ -10,7 +10,7 @@ declarative code.
Each system's capabilities are different as a result.
Scenes can define how an extended class initializes, but not what its
behavior actually is. Scenes are often used in conjunction with a script,
the scene declaring a composition of nodes, and the script adding behaviour with imperative code.
the scene declaring a composition of nodes, and the script adding behavior with imperative code.
Anonymous types
---------------

View File

@@ -16,7 +16,7 @@ suitable for this workflow.
On Windows and Linux, you can run a Godot binary in a terminal by specifying
its relative or absolute path.
On macOS, the process is different due to Godot being contained within an
On macOS, the process is different due to Godot being contained within a
``.app`` bundle (which is a *folder*, not a file). To run a Godot binary
from a terminal on macOS, you have to ``cd`` to the folder where the Godot
application bundle is located, then run ``Godot.app/Contents/MacOS/Godot``

View File

@@ -110,7 +110,7 @@ After making changes, open the **Editor** menu at the top of the editor then
choose **Editor Layouts**. In the dropdown list, you will see a list of saved
editor layouts, plus **Default** which is a hardcoded editor layout that can't
be removed. The default layout matches a fresh Godot installation with no
changes made to the docks' position and size, and no floating docks.
changes made to the docks' positions and sizes, and no floating docks.
You can remove a layout using the **Delete** option in the **Editor Layouts**
dropdown.

View File

@@ -13,7 +13,7 @@ Editor's interface
------------------
The following pages explain how to use the various windows, workspaces, and
docks that make up the Godot editor. We cover some specific editors' interface
docks that make up the Godot editor. We cover some specific editors' interfaces
in other sections where appropriate. For example, the :ref:`animation editor
<doc_introduction_animation>`.

View File

@@ -216,7 +216,7 @@ You can generate an MO file with the command below:
msgfmt fr.po --no-hash -o fr.mo
If the PO file is valid, this command will create a ``fr.mo`` file besides
If the PO file is valid, this command will create an ``fr.mo`` file besides
the PO file. This MO file can then be loaded in Godot as described above.
The original PO file should be kept in version control so you can update
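
Loading the resulting MO file from a script might look like this minimal
sketch, assuming the file was added to the project so it is imported as a
``Translation`` resource (the path and locale are hypothetical):

.. code-block:: gdscript

    func _ready():
        var translation := load("res://fr.mo") as Translation
        TranslationServer.add_translation(translation)
        TranslationServer.set_locale("fr")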

View File

@@ -126,7 +126,7 @@ File logging can also be disabled completely using the
``debug/file_logging/enable_file_logging`` project setting.
When the project crashes, crash logs are written to the same file as the log
file. The crash log will only contain an usable backtrace if the binary that was
file. The crash log will only contain a usable backtrace if the binary that was
run contains debugging symbols, or if it can find a debug symbols file that
matches the binary. Official binaries don't provide debugging symbols, so this
requires a custom build to work. See

View File

@@ -169,7 +169,7 @@ Audio/video files
-----------------
Godot supports loading Ogg Vorbis, MP3, and WAV audio at runtime. Note that not *all*
files with an ``.ogg`` extension are Ogg Vorbis files. Some may be Ogg Theora
files with a ``.ogg`` extension are Ogg Vorbis files. Some may be Ogg Theora
videos, or contain Opus audio within an Ogg container. These files will **not**
load correctly as audio files in Godot.
@@ -191,7 +191,7 @@ Example of loading an Ogg Theora video file in a :ref:`class_VideoStreamPlayer`
var video_stream_theora = VideoStreamTheora.new()
# File extension is ignored, so it is possible to load Ogg Theora videos
# that have an `.ogg` extension this way.
# that have a `.ogg` extension this way.
video_stream_theora.file = "/path/to/file.ogv"
$VideoStreamPlayer.stream = video_stream_theora
@@ -203,7 +203,7 @@ Example of loading an Ogg Theora video file in a :ref:`class_VideoStreamPlayer`
var videoStreamTheora = new VideoStreamTheora();
// File extension is ignored, so it is possible to load Ogg Theora videos
// that have an `.ogg` extension this way.
// that have a `.ogg` extension this way.
videoStreamTheora.File = "/Path/To/File.ogv";
GetNode<VideoStreamPlayer>("VideoStreamPlayer").Stream = videoStreamTheora;
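
For audio, a common pattern is to read the file's bytes and hand them to the
matching stream class. A minimal sketch for MP3 (the paths and the
``AudioStreamPlayer`` node are hypothetical):

.. code-block:: gdscript

    func load_mp3(path: String) -> AudioStreamMP3:
        var file := FileAccess.open(path, FileAccess.READ)
        var stream := AudioStreamMP3.new()
        stream.data = file.get_buffer(file.get_length())
        return stream

    func _ready():
        $AudioStreamPlayer.stream = load_mp3("/path/to/music.mp3")
        $AudioStreamPlayer.play()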

View File

@@ -65,7 +65,7 @@ other). This can be done by artists, or programmatically within Godot using an a
There is also a cost to batching together objects in 3D. Several objects
rendered as one cannot be individually culled. An entire city that is off-screen
will still be rendered if it is joined to a single blade of grass that is on
screen. Thus, you should always take objects' location and culling into account
screen. Thus, you should always take objects' locations and culling into account
when attempting to batch 3D objects together. Despite this, the benefits of
joining static objects often outweigh other considerations, especially for large
numbers of distant or low-poly objects.

View File

@@ -134,7 +134,7 @@ the project or the environment. The pipeline precompilation system will keep
track of these features as they're encountered for the first time and enable
precompilation of them for any meshes or surfaces that are created afterwards.
If your game makes use of these features, **make sure to have an scene that uses
If your game makes use of these features, **make sure to have a scene that uses
them as early as possible** before loading the majority of the assets. This
scene can be very simple and will do the job as long as it uses the features the
game plans to use. It can even be rendered off-screen for at least one frame if

View File

@@ -34,7 +34,7 @@ This has some implications:
Controlling the on / off behavior of 2D nodes therefore requires a little more
thought and planning.
- On the positive side, pivot behavior in the scene tree is perfectly preserved
during interpolation in 2D, which gives super smooth behaviour.
during interpolation in 2D, which gives super smooth behavior.
Resetting physics interpolation
-------------------------------

View File

@@ -60,7 +60,7 @@ Adapt the tick rate?
~~~~~~~~~~~~~~~~~~~~
Instead of designing the game at a fixed physics tick rate, we could allow the tick
rate to scale according to the end users hardware. We could for example use a fixed
rate to scale according to the end user's hardware. We could for example use a fixed
tick rate that works for that hardware, or even vary the duration of each physics
tick to match a particular frame duration.
@@ -70,7 +70,7 @@ run in the ``_physics_process``) work best and most consistently when run at a
that has been designed for 60 TPS (ticks per second) at e.g. 10 TPS, the physics
will behave completely differently. Controls may be less responsive, collisions /
trajectories can be completely different. You may test your game thoroughly at 60
TPS, then find it breaks on end users machines when it runs at a different tick
TPS, then find it breaks on end users' machines when it runs at a different tick
rate.
This can make quality assurance difficult with hard to reproduce bugs, especially

View File

@@ -8,7 +8,7 @@ Introduction
The Jolt physics engine was added as an alternative to the existing Godot Physics
physics engine in 4.4. Jolt is developed by Jorrit Rouwe with a focus on games and
VR applications. Previously it was available as a extension but is now built into
VR applications. Previously it was available as an extension but is now built into
Godot.
It is important to note that the built-in Jolt Physics module is considered

View File

@@ -138,7 +138,7 @@ Use the following steps if you have a v1 Android plugin you want to migrate to v
3. After updating the Godot Android library dependency, sync or build the plugin and resolve any compile errors:
- The ``Godot`` instance provided by ``GodotPlugin::getGodot()`` no longer has access to a ``android.content.Context`` reference. Use ``GodotPlugin::getActivity()`` instead.
- The ``Godot`` instance provided by ``GodotPlugin::getGodot()`` no longer has access to an ``android.content.Context`` reference. Use ``GodotPlugin::getActivity()`` instead.
4. Delete the ``gdap`` configuration file(s) and follow the instructions in the `Packaging a v2 Android plugin`_ section to set up the plugin configuration.

View File

@@ -241,7 +241,7 @@ you do this you have to be careful when you add more presets.
This is the method which defines the available options.
:ref:`_get_import_options() <class_EditorImportPlugin_private_method__get_import_options>` returns
an array of dictionaries, and each dictionary contains a few keys that are
checked to customize the option as its shown to the user. The following table
checked to customize the option as it's shown to the user. The following table
shows the possible keys:
+-------------------+------------+----------------------------------------------------------------------------------------------------------+
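
A minimal sketch of such a method, showing only the ``name`` and
``default_value`` keys (the option name is hypothetical):

.. code-block:: gdscript

    func _get_import_options(path, preset_index):
        # One dictionary per option; additional keys further customize how
        # the option is shown to the user.
        return [
            {"name": "use_red_anyway", "default_value": false},
        ]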

View File

@@ -243,7 +243,7 @@ done with caution.
On any Godot project, you can use the ``--disable-vsync``
:ref:`command line argument <doc_command_line_tutorial>` to forcibly disable V-Sync.
Since Godot 4.2, ``--max-fps <fps>`` can also be used to set a FPS limit
Since Godot 4.2, ``--max-fps <fps>`` can also be used to set an FPS limit
(``0`` is unlimited). These arguments can be used at the same time.
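
For reference, similar caps can be applied from a running project through
script. A minimal sketch (the values are arbitrary):

.. code-block:: gdscript

    func _ready():
        Engine.max_fps = 120  # 0 means unlimited.
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
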
Hardware/OS-specific

View File

@@ -3,7 +3,7 @@
C# API differences to GDScript
==============================
This is a (incomplete) list of API differences between C# and GDScript.
This is an (incomplete) list of API differences between C# and GDScript.
General differences
-------------------

View File

@@ -223,7 +223,7 @@ an underscore (``_``) as a prefix for private fields (but not for methods or pro
.. code-block:: csharp
private Vector3 _aimingAt; // Use a `_` prefix for private fields.
private Vector3 _aimingAt; // Use an `_` prefix for private fields.
private void Attack(float attackStrength)
{

View File

@@ -330,7 +330,7 @@ Compiling the plugin
--------------------
To compile the project we need to define how SCons using should compile it
using a ``SConstruct`` file which references the one in ``godot-cpp``.
using an ``SConstruct`` file which references the one in ``godot-cpp``.
Writing it from scratch is outside the scope of this tutorial, but you can
:download:`the SConstruct file we prepared <files/cpp_example/SConstruct>`.
We'll cover a more customizable, detailed example on how to use these

View File

@@ -75,8 +75,8 @@ For example:
- ``template_scripts/Node/smooth_camera.gd``
- ``template_scripts/CharacterBody3D/platformer_movement.gd``
Default behaviour and overriding it
-----------------------------------
Default behavior and overriding it
----------------------------------
By default:
@@ -89,7 +89,7 @@ By default:
* the template will not be set as the default for the given node
It is possible to customize this behaviour by adding meta headers at the start
It is possible to customize this behavior by adding meta headers at the start
of your file, like this:
.. tabs::

View File

@@ -227,7 +227,7 @@ the total bandwidth usage at any given moment.
Monitors
--------
The monitors are graphs of several aspects of the game while its running such as
The monitors are graphs of several aspects of the game while it's running such as
FPS, memory usage, how many nodes are in a scene and more. All monitors keep
track of stats automatically, so even if one monitor isn't open while the game
is running, you can open it later and see how the values changed.
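
Several of the same values can also be read from a script through the
``Performance`` singleton. A minimal sketch:

.. code-block:: gdscript

    func _process(_delta):
        print("FPS: ", Performance.get_monitor(Performance.TIME_FPS))
        print("Static memory: ", Performance.get_monitor(Performance.MEMORY_STATIC))
        print("Node count: ", Performance.get_monitor(Performance.OBJECT_NODE_COUNT))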

View File

@@ -1802,7 +1802,7 @@ To complete this tutorial, let's see how you can register a custom signal and
emit it when appropriate. As you might have guessed, we'll need a few more
function pointers from the API and more helper functions.
In the ``api.h`` file we're adding two things. One is a an API function to
In the ``api.h`` file we're adding two things. One is an API function to
register a signal, the other is a helper function to wrap the signal binding.
.. code-block:: c

View File

@@ -2255,7 +2255,7 @@ This is better explained through examples. Consider this scenario:
There are a few things to keep in mind here:
1. If the inherited class (``state.gd``) defines a ``_init`` constructor that takes
1. If the inherited class (``state.gd``) defines an ``_init`` constructor that takes
arguments (``e`` in this case), then the inheriting class (``idle.gd``) *must*
define ``_init`` as well and pass appropriate parameters to ``_init`` from ``state.gd``.
2. ``idle.gd`` can have a different number of arguments than the base class ``state.gd``.
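
A minimal sketch of the two files in this scenario, assuming Godot 4's
``super()`` call syntax (the extra parameter in ``idle.gd`` is hypothetical):

.. code-block:: gdscript

    # state.gd
    var entity

    func _init(e):
        entity = e

.. code-block:: gdscript

    # idle.gd
    extends "state.gd"

    func _init(e, idle_time := 0.0):
        # Forward the required argument to the constructor of state.gd.
        super(e)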

View File

@@ -483,7 +483,7 @@ annotations, you can use ``@export_custom`` instead. This allows defining any
property hint, hint string and usage flags, with a syntax similar to the one
used by the editor for built-in nodes.
For example, this exposes the ``altitude`` property with no range limits but a
For example, this exposes the ``altitude`` property with no range limits but an
``m`` (meter) suffix defined:
::
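
    # A sketch of such a declaration, assuming the "suffix:m" hint string.
    @export_custom(PROPERTY_HINT_NONE, "suffix:m") var altitude: float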

View File

@@ -17,7 +17,7 @@ arrange user interfaces, etc. **Resources** are **data containers**. They don't
do anything on their own: instead, nodes use the data contained in resources.
Anything Godot saves or loads from disk is a resource. Be it a scene (a ``.tscn``
or an ``.scn`` file), an image, a script... Here are some :ref:`Resource <class_Resource>` examples:
or a ``.scn`` file), an image, a script... Here are some :ref:`Resource <class_Resource>` examples:
- :ref:`Texture <class_Texture>`
- :ref:`Script <class_Script>`

View File

@@ -148,7 +148,7 @@ to a non-zero value, the code block is included, otherwise it is skipped.
To evaluate correctly, the condition must be an expression giving a simple
floating-point, integer or boolean result. There may be multiple condition
blocks connected by ``&&`` (AND) or ``||`` (OR) operators. It may be continued
by a ``#else`` block, but **must** be ended with the ``#endif`` directive.
by an ``#else`` block, but **must** be ended with the ``#endif`` directive.
.. code-block:: glsl
@@ -163,7 +163,7 @@ by a ``#else`` block, but **must** be ended with the ``#endif`` directive.
Using the ``defined()`` *preprocessor function*, you can check whether the
passed identifier is defined a by ``#define`` placed above that directive. This
is useful for creating multiple shader versions in the same file. It may be
continued by a ``#else`` block, but must be ended with the ``#endif`` directive.
continued by an ``#else`` block, but must be ended with the ``#endif`` directive.
The ``defined()`` function's result can be negated by using the ``!`` (boolean NOT)
symbol in front of it. This can be used to check whether a define is *not* set.
@@ -273,7 +273,7 @@ Like with ``#if``, the ``defined()`` preprocessor function can be used:
This is a shorthand for ``#if defined(...)``. Checks whether the passed
identifier is defined by ``#define`` placed above that directive. This is useful
for creating multiple shader versions in the same file. It may be continued by a
for creating multiple shader versions in the same file. It may be continued by an
``#else`` block, but must be ended with the ``#endif`` directive.
.. code-block:: glsl

View File

@@ -239,9 +239,9 @@ only available for shaders that are in ``Particles`` mode.
Keep in mind that not all 3D objects are mesh files. a glTF file can't be dragged
and dropped into the graph. However, you can create an inherited scene from it,
save the mesh in that scene as it's own file, and use that.
save the mesh in that scene as its own file, and use that.
.. image:: img/vs_meshemitter.webp
You can also drag and drop obj files into the graph editor to add the node
for that specific mesh, other mesh files will not work for this.
for that specific mesh, other mesh files will not work for this.

View File

@@ -599,7 +599,7 @@ clicked URLs using the user's default web browser:
# to avoid script errors at runtime.
OS.shell_open(str(meta))
For more advanced use cases, it's also possible to store JSON in an ``[url]``
For more advanced use cases, it's also possible to store JSON in a ``[url]``
tag's option and parse it in the function that handles the ``meta_clicked`` signal.
For example:
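
A minimal sketch of such a handler, assuming it is connected to the
``meta_clicked`` signal (the JSON layout is hypothetical):

.. code-block:: gdscript

    # The option is written in the label's text as e.g.
    # [url={"type":"quest","id":42}]Open quest[/url]
    func _on_rich_text_label_meta_clicked(meta):
        var data = JSON.parse_string(str(meta))
        if data is Dictionary:
            print("Clicked a link of type: ", data["type"])
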
@@ -1001,7 +1001,7 @@ All examples below mention the default values for options in the listed tag form
.. note::
Text effects that move characters' position may result in characters being
Text effects that move characters' positions may result in characters being
clipped by the RichTextLabel node bounds.
You can resolve this by disabling **Control > Layout > Clip Contents** in

View File

@@ -21,7 +21,7 @@ We are introducing 3 signals to our script so that our game can add further logi
- ``focus_lost`` is emitted when the player takes off their headset or when the player enters the menu system of the headset.
- ``focus_gained`` is emitted when the player puts their headset back on or exits the menu system and returns to the game.
- ``pose_recentered`` is emitted when the headset requests the players position to be reset.
- ``pose_recentered`` is emitted when the headset requests the player's position to be reset.
Our game should react accordingly to these signals.
@@ -93,7 +93,7 @@ Our updated ready function
We add a few things to the ready function.
If we're using the mobile or forward+ renderer we set the viewports ``vrs_mode`` to ``VRS_XR``.
If we're using the mobile or forward+ renderer we set the viewport's ``vrs_mode`` to ``VRS_XR``.
On platforms that support this, this will enable foveated rendering.
If we're using the compatibility renderer, we check if the OpenXR foveated rendering settings
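
A minimal sketch of the ``vrs_mode`` assignment described above; checking for a
rendering device is one way to detect the Forward+ and mobile renderers:

.. code-block:: gdscript

    func _ready():
        if RenderingServer.get_rendering_device():
            # Forward+ or mobile renderer: enable XR variable rate shading.
            get_viewport().vrs_mode = Viewport.VRS_XR
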
@@ -199,7 +199,7 @@ This signal is emitted by OpenXR when our session is setup.
This means the headset has run through setting everything up and is ready to begin receiving content from us.
Only at this time various information is properly available.
The main thing we do here is to check our headsets refresh rate.
The main thing we do here is to check our headset's refresh rate.
We also check the available refresh rates reported by the XR runtime to determine if we want to set our headset to a higher refresh rate.
Finally we match our physics update rate to our headset update rate.
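
A minimal sketch of that last step, assuming the OpenXR interface has already
been found and initialized:

.. code-block:: gdscript

    func _match_physics_to_headset(xr_interface: OpenXRInterface):
        var refresh_rate := xr_interface.get_display_refresh_rate()
        if refresh_rate > 0.0:
            # Keep the physics tick rate in sync with the headset.
            Engine.physics_ticks_per_second = int(refresh_rate)
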
@@ -297,13 +297,13 @@ Not matching the physics update rate will cause stuttering as frames are rendere
On visible state
----------------
This signal is emitted by OpenXR when our game becomes visible but is not focussed.
This signal is emitted by OpenXR when our game becomes visible but is not focused.
This is a bit of a weird description in OpenXR but it basically means that our game has just started
and we're about to switch to the focussed state next,
that the user has opened a system menu or the users has just took their headset off.
and we're about to switch to the focused state next,
that the user has opened a system menu or the user has just took their headset off.
On receiving this signal we'll update our focussed state,
we'll change the process mode of our node to disabled which will pause processing on this node and it's children,
On receiving this signal we'll update our focused state,
we'll change the process mode of our node to disabled which will pause processing on this node and its children,
and emit our ``focus_lost`` signal.
If you've added this script to your root node,
@@ -377,12 +377,12 @@ the game stays in 'visible' state until the user puts their headset on.
It is thus important to keep your game paused while in visible mode.
If you don't the game will keep on running while your user isn't interacting with your game.
Also when the game returns to focussed mode,
Also when the game returns to the focused mode,
suddenly all controller and hand tracking is re-enabled and could have game breaking consequences
if you do not react to this accordingly.
Be sure to test this behaviour in your game!
Be sure to test this behavior in your game!
While handling our signal we will update the focusses state, unpause our node and emit our ``focus_gained`` signal.
While handling our signal we will update the focuses state, unpause our node and emit our ``focus_gained`` signal.
.. tabs::
.. code-tab:: gdscript GDScript

View File

@@ -162,9 +162,9 @@ This has two consequences:
objects cast shadows on real world objects [#]_.
.. figure:: img/xr_passthrough_example.webp
:alt: Image showing shadow to opacity being used to show the users desk.
:alt: Image showing shadow to opacity being used to show the user's desk.
Image showing shadow to opacity being used to show the users desk.
Image showing shadow to opacity being used to show the user's desk.
This enabled the following use cases:

View File

@@ -24,7 +24,7 @@ So to prevent our player from infinitely falling down we'll quickly add a floor
We start by adding a :ref:`StaticBody3D <class_staticbody3d>` node to our root node and we rename this to ``Floor``.
We add a :ref:`MeshInstance3D <class_meshinstance3d>` node as a child node for our ``Floor``.
Then create a new :ref:`PlaneMesh <class_planemesh>` as it's mesh.
Then create a new :ref:`PlaneMesh <class_planemesh>` as its mesh.
For now we set the size of the mesh to 100 x 100 meters.
Next we add a :ref:`CollisionShape3D <class_collisionshape3d>` node as a child node for our ``Floor``.
Then create a ``BoxShape`` as our shape.
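
For reference, the same floor could be assembled from a script. A minimal
sketch using Godot 4 class names (``BoxShape3D`` for the shape):

.. code-block:: gdscript

    func _add_floor():
        var floor_body := StaticBody3D.new()
        floor_body.name = "Floor"

        var mesh_instance := MeshInstance3D.new()
        var plane := PlaneMesh.new()
        plane.size = Vector2(100, 100)
        mesh_instance.mesh = plane
        floor_body.add_child(mesh_instance)

        var collision := CollisionShape3D.new()
        var box := BoxShape3D.new()
        box.size = Vector3(100, 1, 100)
        collision.shape = box
        collision.position.y = -0.5  # Keep the top of the box level with the plane.
        floor_body.add_child(collision)

        add_child(floor_body)
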
@@ -76,7 +76,7 @@ Godot XR Tools supports this through the teleport function and we will be adding
Add a new child scene to your left hand :ref:`XRController3D <class_xrcontroller3d>` node by selecting the ``addons/godot-xr-tools/functions/function_teleport.tscn`` scene.
With this scene added the player will be able to teleport around the world by pressing the trigger on the left hand controller, pointing where they want to go, and then releasing the trigger.
The player can also adjust the orientation by using the left hand controllers joystick.
The player can also adjust the orientation by using the left hand controller's joystick.
If you've followed all instructions correctly your scene should now look something like this:

View File

@@ -142,7 +142,7 @@ This implementation only works for our ``OpenXRCompositionLayerQuad`` node.
...
We also define a helper function that takes our ``intersect`` value and
returns our location in the viewports local coordinate system:
returns our location in the viewport's local coordinate system:
.. code:: gdscript
@@ -195,7 +195,7 @@ If so, we check if our button is pressed and place our pointer at our intersecti
...
If we were intersecting in our previous process call and our pointer has moved,
we prepare a :ref:`InputEventMouseMotion <class_InputEventMouseMotion>` object
we prepare an :ref:`InputEventMouseMotion <class_InputEventMouseMotion>` object
to simulate our mouse moving and send that to our viewport for further processing.
.. code:: gdscript
@@ -216,7 +216,7 @@ to simulate our mouse moving and send that to our viewport for further processin
...
If we've just released our button we also prepare
a :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
an :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
to simulate a button release and send that to our viewport for further processing.
.. code:: gdscript
@@ -234,7 +234,7 @@ to simulate a button release and send that to our viewport for further processin
...
Or if we've just pressed our button we prepare
a :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
an :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
to simulate a button press and send that to our viewport for further processing.
.. code:: gdscript
@@ -292,5 +292,5 @@ the XR compositor will now draw the viewport first, and then overlay our renderi
.. figure:: img/openxr_composition_layer_hole_punch.webp
:align: center
Use case showing how the users hand is incorrectly obscured
Use case showing how the user's hand is incorrectly obscured
by a composition layer when hole punching is not used.

View File

@@ -135,7 +135,7 @@ We also need to enable editable children to gain access to our :ref:`Skeleton3D
The hand skeleton modifier
~~~~~~~~~~~~~~~~~~~~~~~~~~
Finally we need to add a :ref:`XRHandModifier3D <class_xrhandmodifier3d>` node as a child to our ``Skeleton3D`` node.
Finally we need to add an :ref:`XRHandModifier3D <class_xrhandmodifier3d>` node as a child to our ``Skeleton3D`` node.
This node will obtain the finger tracking data from OpenXR and apply it the hand model.
You need to set the ``Hand Tracker`` property to either ``/user/hand_tracker/left`` or ``/user/hand_tracker/right``
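
A minimal sketch of adding the modifier from a script, assuming a reference to
the hand's ``Skeleton3D`` node:

.. code-block:: gdscript

    func _add_hand_modifier(skeleton: Skeleton3D):
        var hand_modifier := XRHandModifier3D.new()
        # Use &"/user/hand_tracker/right" for the right hand.
        hand_modifier.hand_tracker = &"/user/hand_tracker/left"
        skeleton.add_child(hand_modifier)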

View File

@@ -175,7 +175,7 @@ Sets the foveation level used when rendering provided this feature is supported
Foveation is a technique where the further away from the center of the viewport we render content, the lower resolution we render at.
Most XR runtimes only support fixed foveation, but some will take eye tracking into account and use the focal point for this effect.
The higher the level, the better the performance gains, but also the more reduction in quality there is in the users peripheral vision.
The higher the level, the better the performance gains, but also the more reduction in quality there is in the user's peripheral vision.
.. Note::
**Compatibility renderer only**,

View File

@@ -150,7 +150,7 @@ The columns in our table are as follows:
* - 3
- 0
- This is the priority of the action set.
If multiple active action sets have actions bound to the same controllers inputs or
If multiple active action sets have actions bound to the same controller's inputs or
outputs, the action set with the highest priority value will determine the action
that is updated.
@@ -572,7 +572,7 @@ These settings are used as follows:
* ``On Haptic`` lets us define a haptic output that is automatically activated
when an action becomes pressed.
* ``Off Haptic`` lets us define a haptic output that is automatically activated
when a action is released.
when an action is released.
Binding modifiers on individual bindings

View File

@@ -28,7 +28,7 @@ The movement through controller input, and the physical movement of the player i
As a result, the origin node does not represent the position of the player.
It represents the center, or start of, the tracking space in which the player can physically move.
As the player moves around their room this movement is represented through the tracking of the players headset.
As the player moves around their room this movement is represented through the tracking of the player's headset.
In game this translates to the camera node's position being updated accordingly.
For all intents and purposes, we are tracking a disembodied head.
Unless body tracking is available, we have no knowledge of the position or orientation of the player's body.
@@ -229,7 +229,7 @@ In this approach step 1 is where all the magic happens.
Just like with our previous approach we will be applying our physical movement to the character body,
but we will counter that movement on the origin node.
This will ensure that the players location stays in sync with the character body's location.
This will ensure that the player's location stays in sync with the character body's location.
.. code-block:: gdscript
@@ -377,7 +377,7 @@ The problem with this approach is that physical movement is now not replicated i
This will cause nausea for the player.
What many XR games do instead, is to measure the distance between where the player physically is,
and where the players virtual body has been left behind.
and where the player's virtual body has been left behind.
As this distance increases, usually to a distance of a few centimeters, the screen slowly blacks out.
Our solutions up above would allow us to add this logic into the code at the end of step 1.