Spatial tracking

From XVRWiki
[Image: Position and orientation tracking using laser interferometers]
[Image: WorldSense positional tracking, an example of camera-based tracking]

Spatial tracking, also called 3D tracking or positional tracking, refers to tracking an object's position and orientation in three-dimensional space, and plays a crucial role in 3D human-computer interaction (3D HCI). 3D tracking systems are used to track handheld devices such as controllers and 3D styluses, as well as headsets.

3D tracking can be categorized into the following types:

  • 3DOF Position Tracking: Determines the location of an object along three spatial axes (X, Y, Z).
  • 3DOF Rotation Tracking: Measures an object's rotation around three axes (pitch, yaw, roll).
  • 6DOF Tracking: Combines both position and rotation tracking, providing a full representation of an object's pose in space.
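The distinction above can be made concrete with a minimal pose type. This is an illustrative sketch, not the representation of any particular engine; the field names and the quaternion convention (w, x, y, z) are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 6DOF pose: 3DOF position plus 3DOF orientation."""
    # Position along the three spatial axes (units are system-dependent).
    x: float
    y: float
    z: float
    # Orientation as a unit quaternion. A quaternion stores four numbers
    # but encodes only three rotational degrees of freedom, and avoids
    # the gimbal lock that raw pitch/yaw/roll angles can suffer from.
    qw: float = 1.0
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

# Identity orientation, one metre in front of the origin:
p = Pose(0.0, 0.0, 1.0)
```

A 3DOF-position-only tracker would fill in only `x`, `y`, `z`; a 3DOF-rotation-only tracker (such as many early headsets) only the quaternion.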

One of the basic goals of 3D tracking is to specify a point in 3D space, such as the tip of a stylus. The resulting position and orientation (PnO) data can be fed to a 3D engine like Unity or Unreal, and is used in devices like the Meta Quest 2. PnO tracking constitutes an absolute positioning system: it reports pose relative to a fixed reference frame rather than accumulating relative motion.

There are a few different methods for 3D tracking, such as electromagnetic 3D tracking and acoustic 3D tracking.

3D tracking can be done using multilateration with pointed laser interferometers.[1]

Systems

Spatial tracking systems track two separate things: the position of an object and its rotation (also known as orientation). Some systems use the same physical principle to track both, while others are more effective using one method for position and a different one for rotation.

For example, a hybrid system tracks position using multilateration and tracks rotation using a magnetic sensor.

Methods

Multilateration

The first class of spatial 5DOF or 6DOF tracking methods uses multilateration: distances to known reference points are computed from time-of-flight measurements of signals, either electromagnetic (such as visible light or ultra-wideband radio) or ultrasonic acoustic.[2]
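A time-of-flight measurement gives a distance (d = v·t, with v ≈ 343 m/s for ultrasound in air, or the speed of light for radio), and multilateration then solves for position from several such distances. A minimal 2D sketch with three fixed anchors, using illustrative names:

```python
import math

def trilaterate_2d(anchors, dists):
    """Solve a 2D position from distances to three known anchors.

    Subtracting the first circle equation from the other two cancels
    the quadratic terms, leaving a 2x2 linear system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Two linear equations of the form a*x + b*y = c
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the anchors are collinear
    if abs(det) < 1e-12:
        raise ValueError("anchors must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Distances to a target at (1, 2), as time-of-flight would provide them:
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
dists = [math.dist(a, (1.0, 2.0)) for a in anchors]
pos = trilaterate_2d(anchors, dists)  # recovers (1.0, 2.0)
```

In 3D the same linearization needs at least four non-coplanar anchors, and real systems use least squares over many measurements to suppress noise.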

Triangulation

This method uses the angles of arrival of signals rather than their distances, but is otherwise similar to multilateration.

It can be done using phase difference of arrival (PDOA) in UWB.

Perspective n-point

The second class of methods uses perspective-n-point (PnP) calculations on camera images: the pose is computed from the 2D image positions of known 3D points.

See List of camera-based tracking methods

Electromagnetic

A third class of methods uses the field strength of magnetic coils (electromagnetic tracking). Electromagnetic tracking becomes jittery when metal is nearby, since metal objects distort the field; the severity depends on the type of tracking used.
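The core idea can be sketched with the far-field dipole model, where on-axis field magnitude falls off with the cube of distance; inverting it gives a range estimate. The function name and calibration constant below are hypothetical:

```python
def distance_from_dipole_field(B, k):
    """Invert the on-axis dipole far-field model B(r) = k / r**3.

    k is a per-transmitter calibration constant (field strength times
    distance cubed); B and k must use consistent units. Real systems
    combine readings from three orthogonal coils to recover direction
    and orientation as well, not just range.
    """
    return (k / B) ** (1.0 / 3.0)

# With a hypothetical calibration constant of 8e-6 T*m^3, a reading of
# 1e-6 T places the sensor about 2 m from the coil:
r = distance_from_dipole_field(1.0e-6, 8.0e-6)
```

The cubic falloff is also why these systems have limited range: doubling the distance cuts the signal by a factor of eight.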

Neural

A fourth class of methods uses neural networks to compute position, for example from acoustic measurements made with an artificial structure such as an artificial ear, whose direction-dependent filtering acts like a head-related transfer function (HRTF).

Neural nets can be run on a local machine to deduce the functions corresponding to an HRTF when using acoustic sensors.

Phase-based

Phase-based ultrasonic tracking was used in Ivan Sutherland's original head-mounted display.
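In a phase-based system the measured phase shift of a continuous wave resolves distance as a fraction of one wavelength. A minimal sketch (the function name and the 40 kHz carrier are illustrative):

```python
import math

def phase_to_distance(delta_phi, wavelength, n_cycles=0):
    """Convert a measured phase shift (radians) into a distance.

    The fractional part of a cycle resolves distance modulo one
    wavelength; the integer cycle count n_cycles is ambiguous and
    must be tracked incrementally, which is why phase-based systems
    are best at measuring *changes* in distance.
    """
    return (n_cycles + delta_phi / (2.0 * math.pi)) * wavelength

# 40 kHz ultrasound in air (speed of sound ~343 m/s):
lam = 343.0 / 40000.0               # wavelength, about 8.6 mm
d = phase_to_distance(math.pi, lam)  # half a cycle -> half a wavelength
```

The short ultrasonic wavelength is what makes the method precise: a few degrees of phase error corresponds to a small fraction of a millimetre.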

Capacitive 3D tracking


References

  1. Nitsche, Jan; Franke, Matthias; Haverkamp, Nils; Heißelmann, Daniel (2021-02-19). "Six-degree-of-freedom pose estimation with µm/µrad accuracy based on laser multilateration". Journal of Sensors and Sensor Systems (Copernicus GmbH) 10 (1): 19–24. doi:10.5194/jsss-10-19-2021.
  2. 3D User Interfaces: Theory and Practice, 1st edition, p. 97.