3. Integration in SNAKE

3.1. Face Tracking System Limitations

3.1.1. Estimating Depth

So far the position of the user relative to the screen is assumed to be known. However, that is not the case with SNAKE. The system’s face tracking feature can determine the user’s angle to the screen with good precision, but it cannot compute the user’s distance from the screen.

Instead, the system assumes that the user is located at the optimal distance, which is the distance at which the 3D effect is best perceived.
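This scheme can be sketched as follows. The function name, the camera field of view, and the optimal-distance value are all illustrative assumptions, not part of the SNAKE system itself: the face's pixel offset in the camera image yields the user's angle, and the unmeasurable depth is replaced by the assumed optimal distance.

```python
import math

def estimate_head_position(face_px, image_width_px, camera_hfov_deg, optimal_distance_mm):
    """Estimate the user's head position from a single camera image (sketch).

    The horizontal pixel offset of the detected face gives the user's angle
    to the screen; depth cannot be measured, so the user is assumed to sit
    at the optimal viewing distance.
    """
    # Pixel offset from the image centre, normalised to [-1, 1].
    offset = (face_px - image_width_px / 2) / (image_width_px / 2)
    # Angle relative to the screen normal (pinhole-camera model).
    angle = math.atan(offset * math.tan(math.radians(camera_hfov_deg) / 2))
    # Depth is not measurable: substitute the assumed optimal distance.
    x = optimal_distance_mm * math.tan(angle)
    z = optimal_distance_mm
    return x, z
```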

3.1.2. Using an Average For Distance Between Eyes

Even though the system can track the position of both eyes, the real distance between the eyes of a particular user is unknown. The apparent eye spacing in the camera image varies with the user's distance from the screen, so it cannot stand in for the actual distance between the eyes. Instead, a virtual user with an average eye distance, positioned at the optimal distance from the screen, is assumed; the same assumption is applied to the pictures of the user retrieved from the camera. The eye positions of this virtual user are used in all computations in place of the raw measurements.
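A minimal sketch of the virtual-user substitution follows. The 63 mm average interpupillary distance and the function name are illustrative assumptions: the tracked head centre is kept, and the two virtual eyes are placed symmetrically around it at the assumed average spacing and optimal distance.

```python
AVERAGE_IPD_MM = 63.0  # assumed average interpupillary distance (illustrative)

def virtual_eye_positions(head_x_mm, optimal_distance_mm, ipd_mm=AVERAGE_IPD_MM):
    """Place virtual eyes around the tracked head centre (sketch).

    The per-user eye distance cannot be recovered from the camera image,
    so an average value at the optimal viewing distance is substituted
    for all downstream computations.
    """
    half = ipd_mm / 2
    left = (head_x_mm - half, optimal_distance_mm)
    right = (head_x_mm + half, optimal_distance_mm)
    return left, right
```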

3.1.3. Handling User Detection Failure

Sometimes, the user might move out of the field of view of the tracking system. Other times, poor conditions might make recognition impossible. In these situations, the system calculates an estimate for the user’s position, based on prior position information and input from the gyro sensors. The estimate begins with the last-detected position of the user. If there is no gyro sensor input for some time—implying that the SNAKE system is not moving—the estimated position is shifted over time toward the center of the screen. If the tracking system remains unable to detect the user, it assumes that the user has returned to a position facing the center of the screen. If the SNAKE system is moving, the calculations assume that the user is static, and estimate the user's position relative to the moving console.
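One update step of this fallback estimator can be sketched in one dimension. The function name, the recentering rate, and the treatment of the gyro rate as an equivalent lateral displacement are all simplifying assumptions for illustration: a detected position wins outright, a stationary console decays the estimate toward the screen centre, and a moving console shifts the estimate as if the user were static.

```python
def update_estimate(est_x, detected_x, gyro_rate, dt, recenter_rate=0.5):
    """One step of the fallback position estimator (1-D sketch).

    - If the face is detected, use the measured position directly.
    - If not, and the gyro reports no motion, drift the estimate toward
      the screen centre (x = 0) over time.
    - If the console is moving, assume the user is static and move the
      estimate opposite to the console's motion.
    """
    if detected_x is not None:
        return detected_x
    if abs(gyro_rate) < 1e-3:
        # Console at rest: decay the last-known position toward the centre.
        return est_x * max(0.0, 1.0 - recenter_rate * dt)
    # Console moving: the user is assumed static relative to the world.
    return est_x - gyro_rate * dt
```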

This behavior has been designed to provide a user experience at least as good as with 3D screens that do not support Super-Stable 3D. In addition, it provides a good user experience when the user moves the device.

3.2. Comfort and Maximum Parallax

Experiments have shown the importance of keeping parallax within limits. There is a maximum parallax value, and it must be taken into consideration when setting the on-screen separation of an object between the images presented to the left and right eyes. If this limit is exceeded, most users will feel uncomfortable.

With the ideal stereoscopic camera described in the previous section, this can happen when an object is situated very far behind the window plane: an object located infinitely far from the user is rendered with a parallax equal to the full distance between the eyes, which is far too large. The following methods can be used to ensure that this constraint is enforced.

3.2.1. Near and Far Planes

Moving the near and far planes closer to the window plane reduces the maximum parallax that any visible object can have, since the largest parallax magnitudes occur at those planes.
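Under the convergence-at-window-plane model above (all values illustrative), the largest positive parallax of any visible object occurs at the far plane, so pulling the far plane in directly lowers the bound:

```python
def max_far_parallax(far_mm, eye_mm, window_mm):
    # Largest positive parallax of any visible object, reached at the far plane.
    return eye_mm * (far_mm - window_mm) / far_mm
```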

3.2.2. Reducing Distance Between Eyes

Another way to reduce maximum parallax is to render the scene for a reduced distance between eyes. This reduces the parallax on the farthest objects. However, it also reduces the 3D effect on the nearest objects.
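Because parallax scales linearly with the rendering eye distance, halving it halves the parallax at every depth, which is exactly the trade-off described above: the farthest objects become comfortable, but the nearest objects lose half of their 3D effect. A sketch with illustrative values:

```python
def parallax(z_mm, eye_mm, window_mm):
    # Signed on-screen parallax, converging at the window plane.
    return eye_mm * (z_mm - window_mm) / z_mm
```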

3.2.3. Increasing FOV

As shown in the following section, parallax in the far plane can also be reduced by increasing the FOV.
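The mechanism can be previewed here under the same convergence-at-window-plane model (function name and values illustrative): widening the virtual camera's FOV widens the world-space extent covered by the screen, so the same world-space parallax becomes a smaller fraction of the image width.

```python
import math

def normalized_far_parallax(eye_mm, window_mm, far_mm, hfov_deg):
    """Far-plane parallax as a fraction of the screen width (sketch).

    The screen spans 2 * window * tan(fov / 2) in world units at the
    window plane, so increasing the FOV shrinks the normalized parallax.
    """
    screen_width = 2 * window_mm * math.tan(math.radians(hfov_deg) / 2)
    world_parallax = eye_mm * (far_mm - window_mm) / far_mm
    return world_parallax / screen_width
```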


CONFIDENTIAL