Absolute 3D tracking

From XVRWiki
[Image] Position and orientation tracking using laser interferometers
[Image] WorldSense positional tracking, an example of camera-based tracking
[Image] Razer Hydra, a 3D input device that uses electromagnetic 3D tracking

Absolute 3D tracking refers to systems that determine the position and orientation of objects in three-dimensional space. It plays a crucial role in three-dimensional human-computer interaction (HCI).

3D tracking systems can be used to track headsets and handheld devices such as controllers or phones. A system may provide 3DOF position, 3DOF rotation, or full 6DOF input.

3D tracking can be categorized into the following types:

  • 3DOF Position Tracking: Determines the location of an object along three spatial axes (X, Y, Z).
  • 3DOF Rotation Tracking: Measures an object's rotation around three axes (pitch, yaw, roll).
  • 6DOF Tracking: Combines both position and rotation tracking, providing a comprehensive representation of an object's pose in space.
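The 6DOF case combines the other two. A minimal sketch of one common representation, a position vector plus a unit quaternion for orientation (the helper names below are illustrative, not from any particular engine):

```python
import math

# A minimal 6DOF pose: 3DOF position plus 3DOF rotation (unit quaternion).

def quat_from_axis_angle(axis, angle_rad):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def apply_pose(position, quat, point):
    """Transform a point from tracker-local space to world space: R*p + t."""
    w, x, y, z = quat
    px, py, pz = point
    # Rotate via the rotation matrix expanded from the quaternion.
    rx = (1 - 2*(y*y + z*z))*px + 2*(x*y - w*z)*py + 2*(x*z + w*y)*pz
    ry = 2*(x*y + w*z)*px + (1 - 2*(x*x + z*z))*py + 2*(y*z - w*x)*pz
    rz = 2*(x*z - w*y)*px + 2*(y*z + w*x)*py + (1 - 2*(x*x + y*y))*pz
    return (rx + position[0], ry + position[1], rz + position[2])

# Example: a controller held 1 m up, rotated 90 degrees about the vertical Y axis.
pose_pos = (0.0, 1.0, 0.0)
pose_rot = quat_from_axis_angle((0.0, 1.0, 0.0), math.pi / 2)
print(apply_pose(pose_pos, pose_rot, (1.0, 0.0, 0.0)))  # ≈ (0.0, 1.0, -1.0)
```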

A basic goal of 3D tracking is to specify a point in 3D space; full pose tracking additionally specifies an orientation.

3D tracking is pivotal for various applications in human-computer interaction, including:

  • Input Devices: Enhancing the usability of various input devices such as game controllers, 3D mice, and styluses.

It is useful for providing position and orientation data to a 3D engine such as Unity or Unreal, and is used in devices like the Meta Quest 2. Position-and-orientation (PnO) tracking constitutes an absolute positioning system.

There are several methods for 3D tracking, such as electromagnetic tracking, acoustic tracking, and camera-based tracking such as SLAM or marker-based tracking.

Markerless camera-based tracking includes methods based on SLAM and visual-inertial odometry (VIO).

3D tracking can also be done by multilateration using aimed laser interferometers.[1]

Methods

Multilateration

The first class of methods uses multilateration. This includes methods based on time-of-flight measurements of electromagnetic signals, such as visible light or ultra-wideband radio, or alternatively of ultrasonic acoustic signals.[2]
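The idea can be sketched as follows: each range measurement (time of flight times propagation speed) constrains the tracked point to a sphere around a known beacon, and with four beacons the intersection can be found by linearizing the sphere equations against one of them. A minimal pure-Python sketch with hypothetical beacon positions:

```python
# Multilateration sketch: recover a 3D position from ranges to four beacons.

def multilaterate(beacons, ranges):
    """Solve |p - b_i| = r_i by linearizing against the first beacon."""
    b1, r1 = beacons[0], ranges[0]
    n1 = sum(c * c for c in b1)
    A, y = [], []
    for bi, ri in zip(beacons[1:], ranges[1:]):
        # Subtracting sphere equation 1 from i gives a linear equation in p.
        A.append([2 * (bi[k] - b1[k]) for k in range(3)])
        y.append(r1 * r1 - ri * ri + sum(c * c for c in bi) - n1)

    # Solve the resulting 3x3 system A p = y by Cramer's rule.
    def det3(m):
        return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det3(A)
    p = []
    for k in range(3):
        Ak = [row[:] for row in A]
        for i in range(3):
            Ak[i][k] = y[i]
        p.append(det3(Ak) / d)
    return p

# Four non-coplanar beacons (metres) and exact ranges to the point (1, 2, 0.5).
beacons = [(0, 0, 0), (4, 0, 0), (0, 4, 0), (0, 0, 3)]
target = (1.0, 2.0, 0.5)
ranges = [sum((t - b) ** 2 for t, b in zip(target, bc)) ** 0.5 for bc in beacons]
print(multilaterate(beacons, ranges))  # ≈ [1.0, 2.0, 0.5]
```

Real systems use more than four beacons and a least-squares fit to average out timing noise; the linearization above is the noise-free core of the calculation.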

Perspective n-point

The second class of methods uses perspective-n-point (PnP) calculations on a camera image.

See List of camera-based tracking methods
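A simple relative of PnP solvers is the Direct Linear Transform (DLT), which estimates the full 3x4 projection matrix from six or more 3D-2D correspondences; calibrated PnP solvers then recover pose given known camera intrinsics. A minimal sketch assuming NumPy, with a hypothetical synthetic camera and points:

```python
import numpy as np

# DLT sketch: recover the 3x4 projection matrix P from 3D points and their
# image projections. Position and orientation can then be decomposed from P.

def dlt(points3d, points2d):
    rows = []
    for (X, Y, Z), (u, v) in zip(points3d, points2d):
        # Each correspondence contributes two linear constraints on vec(P).
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, vt = np.linalg.svd(np.array(rows))
    return vt[-1].reshape(3, 4)  # null vector of the system, up to scale

# Synthesize a ground-truth camera: identity rotation, translated along +Z.
P_true = np.hstack([np.eye(3), [[0.2], [-0.1], [5.0]]])
pts3d = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0.5, 0.3, 0.7]])
proj = (P_true @ np.hstack([pts3d, np.ones((7, 1))]).T).T
pts2d = proj[:, :2] / proj[:, 2:3]

P_est = dlt(pts3d, pts2d)
P_est /= P_est[-1, -1] / P_true[-1, -1]  # fix the arbitrary overall scale
print(np.allclose(P_est, P_true, atol=1e-6))  # True
```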

Electromagnetic

A third class of methods uses the field strength of magnetic coils (electromagnetic tracking). Electromagnetic tracking becomes jittery when metal is nearby, such as a metal laptop chassis, because conductive and ferromagnetic materials distort the field.
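The distance cue behind these systems can be illustrated with the inverse-cube falloff of a small transmitter coil's field. The constant below is made up, and real systems use three orthogonal transmitter and sensor coils to recover full position and orientation:

```python
# Illustrative sketch: a small coil's field magnitude falls off with the cube
# of distance, so a measured field strength yields a range estimate.

K = 4.2e-6  # hypothetical coil gain: field = K / r**3 (tesla at r metres)

def distance_from_field(b_measured):
    """Invert the inverse-cube falloff to estimate range in metres."""
    return (K / b_measured) ** (1.0 / 3.0)

# A sensor 0.5 m away would read K / 0.5**3:
reading = K / 0.5 ** 3
print(distance_from_field(reading))  # ≈ 0.5
```

This also shows why metal is a problem: eddy currents in nearby conductors perturb `b_measured`, and the cube-root inversion turns small field errors into position jitter.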

Neural

A fourth class of methods uses neural networks to compute position, for example by exploiting an artificial structure such as an artificial ear and neurally deducing the corresponding head-related transfer function (HRTF).

Neural networks can be run on a local machine to infer the transfer functions corresponding to an HRTF from acoustic sensor data.

Phase-based

Phase-based ultrasonic tracking was used in Ivan Sutherland's original head-mounted display.
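The principle can be sketched as follows: the received carrier's phase gives distance modulo one wavelength, so a continuous-wave tracker integrates phase changes over time to follow motion. A small sketch with illustrative parameter values:

```python
import math

# Phase-based ranging sketch for a continuous-wave ultrasonic tracker.

FREQ = 40_000.0            # 40 kHz ultrasonic carrier
SPEED = 343.0              # speed of sound in air, m/s
WAVELENGTH = SPEED / FREQ  # about 8.6 mm

def unwrap_step(prev_phase, new_phase):
    """Smallest signed phase change, assuming motion < half a wavelength per sample."""
    d = new_phase - prev_phase
    return d - 2 * math.pi * round(d / (2 * math.pi))

def track(initial_distance, phases):
    """Accumulate distance from a sequence of measured phases (radians)."""
    dist, prev = initial_distance, phases[0]
    for ph in phases[1:]:
        dist += unwrap_step(prev, ph) * WAVELENGTH / (2 * math.pi)
        prev = ph
    return dist

# Emitter moves from 1.000 m to 1.010 m in 1 mm steps; simulate the phases.
true_dists = [1.0 + 0.001 * i for i in range(11)]
phases = [(2 * math.pi * d / WAVELENGTH) % (2 * math.pi) for d in true_dists]
print(track(1.0, phases))  # ≈ 1.010
```

Because the raw phase is ambiguous modulo one wavelength, such systems need a known starting distance (or a coarser ranging method) to anchor the integration.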

Capacitive 3D tracking

See also

References

  1. Nitsche, Jan; Franke, Matthias; Haverkamp, Nils; Heißelmann, Daniel (2021-02-19). "Six-degree-of-freedom pose estimation with µm/µrad accuracy based on laser multilateration". Journal of Sensors and Sensor Systems (Copernicus GmbH) 10 (1): 19–24. doi:10.5194/jsss-10-19-2021.
  2. Bowman, Doug A.; Kruijff, Ernst; LaViola, Joseph J.; Poupyrev, Ivan (2004). 3D User Interfaces: Theory and Practice, 1st edition. Addison-Wesley. p. 97.