Wiiheadtrack works with two kinds of positions: the eye-position and the virtual position. This section describes how Wiiheadtrack handles these different positions.
The first step is to determine the user's position in front of the Wiimote-camera. This position is called the eye-position, because the IR-LEDs represent the user's eyes; the terms eye and IR-LED are therefore used synonymously. The Wiimote-camera captures an image of 1024 × 768 px. Wiiheadtrack uses the center Wiiheadtrack::eyeC between the eyes (Wiiheadtrack::eyeL and Wiiheadtrack::eyeR) as the starting point of its computations. You can imagine the captured image of the Wiimote as an "imaginary screen" on which the user's position is detected:
The image shows the imaginary screen from the Wiimote's point of view. You can see the user wearing glasses with two IR-LEDs (red) and eyeC (green). Independent of the distance between the user and the Wiimote-camera, eyeC represents the coordinates of the user on the screen, with the origin at the center of the screen, so that -512 ≤ x ≤ 512 and -384 ≤ y ≤ 384. From the user's point of view, the sign of the x-coordinate is inverted.
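For illustration, the following sketch (not the actual Wiiheadtrack code; the type and function names are made up) shows how a center point such as eyeC can be computed as the midpoint of the two detected IR points:

```cpp
#include <iostream>

// Hypothetical 2D point on the Wiimote's "imaginary screen" (in px).
struct IRPoint {
    float x;
    float y;
};

// Midpoint between the two detected IR-LEDs, corresponding to eyeC.
IRPoint centerOfEyes(const IRPoint& eyeL, const IRPoint& eyeR) {
    return { (eyeL.x + eyeR.x) / 2.0f, (eyeL.y + eyeR.y) / 2.0f };
}

int main() {
    IRPoint eyeL = {480.0f, 400.0f};   // example values in camera coordinates
    IRPoint eyeR = {560.0f, 404.0f};
    IRPoint eyeC = centerOfEyes(eyeL, eyeR);
    std::cout << "eyeC = (" << eyeC.x << ", " << eyeC.y << ")\n";
    return 0;
}
```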
The distance between the user and the Wiimote-camera, Wiiheadtrack::eyeZ, is derived directly from the Wiimote's image: it is computed from the distance between Wiiheadtrack::eyeL and Wiiheadtrack::eyeR and therefore depends on how the IR-LEDs are mounted.
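Wiiheadtrack's exact formula for eyeZ is not reproduced here; as a rough sketch, such a depth value can be estimated from the pixel distance between the two IR points with a simple pinhole-camera model, assuming the physical LED spacing and a focal length in pixels are known (these parameters and the function name are assumptions, not part of Wiiheadtrack):

```cpp
#include <cmath>

// Hypothetical sketch (not the actual Wiiheadtrack formula): estimate the
// distance between the user and the camera from the pixel distance of the
// two IR-LEDs, using a simple pinhole-camera model.
// ledSpacingMM  : physical distance between the IR-LEDs on the glasses.
// focalLengthPx : camera focal length expressed in pixels (calibration value).
float estimateEyeZ(float eyeLx, float eyeLy, float eyeRx, float eyeRy,
                   float ledSpacingMM, float focalLengthPx) {
    float dx = eyeRx - eyeLx;
    float dy = eyeRy - eyeLy;
    float pixelDist = std::sqrt(dx * dx + dy * dy);
    // Similar triangles: distance = focal length * real LED spacing / image spacing.
    return focalLengthPx * ledSpacingMM / pixelDist;
}
```

The closer the user is to the camera, the larger the pixel distance between eyeL and eyeR, which is why the result depends on how far apart the LEDs are mounted.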
Background: As explained, Wiiheadtrack::eyeC is computed with the help of Wiiheadtrack::eyeL and Wiiheadtrack::eyeR, which are detected by the Wiimote-camera. Initially, the origin is placed in the lower left corner of the imaginary screen (from the user's point of view), so that 0 ≤ x ≤ 1024 and 0 ≤ y ≤ 768. For further processing, eyeC is translated (x - 512 and y - 384), so the origin is set to the imaginary screen's center. Because eyeC is computed from eyeL and eyeR, it is actually impossible for eyeC to reach the left or right border of the imaginary screen: in that case eyeL or eyeR would lie outside the screen and eyeC could not be computed. This problem is solved by adding an "offset" to eyeC that grows from zero at the screen's center towards the left or right border.
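A minimal sketch of the coordinate handling described above, assuming the 1024 × 768 px image and a simplified linear offset (the actual Wiiheadtrack offset computation may differ):

```cpp
// Hypothetical sketch, not the actual Wiiheadtrack code.
struct Point2D {
    float x;
    float y;
};

// Translate a point from the camera's lower-left origin to a centered origin,
// assuming a 1024 x 768 px image.
Point2D toCenteredOrigin(Point2D p) {
    return { p.x - 512.0f, p.y - 384.0f };
}

// Simplified border offset: push the centered x-coordinate outwards the
// further it is from the center, so eyeC can still reach the left or right
// border even though eyeL and eyeR must stay inside the image.
// maxOffsetPx is an assumed tuning value (roughly half the pixel distance
// between eyeL and eyeR).
float applyBorderOffset(float centeredX, float maxOffsetPx) {
    float fraction = centeredX / 512.0f;       // 0 at the center, +/-1 at the borders
    return centeredX + fraction * maxOffsetPx; // offset grows from 0 at the center
}
```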
The second step is to map the eye-position to a (virtual) position Wiiheadtrack::pos (in the following simply "position") in 3-dimensional space. The computation depends on the sensitivity and the mode you work with. Both modes are explained in the context of the sensitivity.
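As an orientation only, here is a minimal sketch of such a mapping, assuming a plain linear scaling of the eye-position by per-axis sensitivity factors (the names and the scaling itself are assumptions; the actual computation depends on the chosen mode and is described with the sensitivity):

```cpp
// Hypothetical sketch: map the eye-position to a virtual 3D position by a
// simple linear scaling. The real mapping depends on Wiiheadtrack's mode and
// sensitivity settings.
struct Position3D {
    float x;
    float y;
    float z;
};

Position3D mapToVirtualPosition(float eyeCx, float eyeCy, float eyeZ,
                                float sensXY, float sensZ) {
    Position3D pos;
    pos.x = -eyeCx * sensXY;   // x-sign inverted: user's view vs. camera's view
    pos.y =  eyeCy * sensXY;
    pos.z =  eyeZ  * sensZ;
    return pos;
}
```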