This change occurred because magnetic trackers neither improved much nor came down in cost, which limited the usable work space. Optical tracker technology, on the other hand, developed significantly, mainly through CCD and CMOS matrix sensors. Three reasons can be given to explain this progress.
The first reason concerns the image sensors. A few years ago, image sensors were
still bound by television norms defined in the 1930s: interlaced images and a fixed, limited bandwidth (768 × 576 pixels at 25 images per second or 640 × 480 pixels at 30 images per second). These sensors were designed to facilitate the display of images on a screen, not the processing of each image. Now that sensor signals are no longer interlaced, we speak of “progressive scan” technology. Resolution and frequency are sufficient to solve most of the problems faced; the bottleneck lies rather in the communication link between the camera and the workstation. Moreover, the cost of these sensors is reasonable compared to other technologies.
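A back-of-the-envelope calculation makes it clear why the camera-to-workstation link, rather than the sensor itself, becomes the bottleneck. The function below is ours, for illustration only; it computes the raw, uncompressed data rate of a video stream:

```python
def raw_data_rate_mb_s(width, height, fps, bytes_per_pixel=1):
    """Raw, uncompressed video data rate in megabytes per second."""
    return width * height * fps * bytes_per_pixel / 1e6

# The two classic television resolutions, as 8-bit monochrome streams:
pal_rate = raw_data_rate_mb_s(768, 576, 25)    # ~11.1 MB/s
ntsc_rate = raw_data_rate_mb_s(640, 480, 30)   # ~9.2 MB/s
```

Even a monochrome stream at these modest resolutions approaches 10 MB/s, which early camera links struggled to carry, let alone higher-resolution progressive-scan streams.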
The second reason is computing power. The computing power available in a workstation, or embedded in a stand-alone system, has increased considerably, making it possible to run quite complex algorithms at video rate.
Finally, the third reason lies in the algorithms of computer vision. Camera calibration and 3D reconstruction algorithms are now well mastered by the computer vision community.
6.4.2 Principle
In general, these trackers combine optical sources with photosensitive sensors: either point-detection sensors (phototransistors) or flat-panel detectors (cameras using CCD or CMOS technology). The optical sources are generally light-emitting diodes (LEDs), emitting in the visible spectrum or the near infrared.
The physical principle relies on the sensitivity of silicon, and hence of image sensors, in the spectral band of 350-1100 nanometres (nm). The 350-750 nm band corresponds to the visible spectrum, and the 750-1100 nm band is a sub-band of the near-infrared spectrum (750-3000 nm). Trackers in virtual reality use only the infrared part, because the human eye is not sensitive to infrared: an immersive room must be kept in partial darkness to give the users good visual immersion, so the use of any visible light source is prohibited.
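The band boundaries above can be summarised in a small helper that classifies a source's wavelength against the silicon-sensitive range; this is a sketch of ours using the limits given in the text, not part of any tracker API:

```python
# Band limits as stated in the text (silicon is sensitive over 350-1100 nm).
VISIBLE = (350, 750)            # visible band
NEAR_IR_SILICON = (750, 1100)   # part of the near infrared (750-3000 nm) that silicon still sees

def band(wavelength_nm):
    """Classify a wavelength against the bands relevant to optical trackers."""
    if VISIBLE[0] <= wavelength_nm < VISIBLE[1]:
        return "visible"
    if NEAR_IR_SILICON[0] <= wavelength_nm <= NEAR_IR_SILICON[1]:
        return "near-infrared (silicon-sensitive)"
    return "outside silicon sensitivity"
```

An 880 nm LED, a common choice for infrared tracking, falls in the silicon-sensitive near-infrared band: invisible to users, but visible to the cameras.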
However, the visible spectrum can be used for portable augmented reality or virtual reality systems such as head-mounted displays. Using one or more cameras fixed on the display to calculate the position and orientation of the display in space can be an advantageous solution. The ARToolkit (Augmented Reality Toolkit) program, provided as open source, can be used to implement this solution (Figure 6.4). It works on the following principle: predefined targets are printed on a standard printer and mounted on a fixed support. By analysing the images, the program calculates the position and orientation of the camera, and thereby that of the head-mounted display.
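ARToolkit itself is a C library, so the following is only a sketch of the underlying computation, not its actual API. Given the camera's intrinsic matrix K (assumed known from calibration), the camera pose relative to a flat printed target can be recovered from the homography between the marker plane and the image; all names here are illustrative:

```python
import numpy as np

def homography_dlt(obj_xy, img_uv):
    """Direct linear transform: homography mapping marker-plane (X, Y) to pixels (u, v)."""
    rows = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 3)          # solution = null vector of the 8x9 system

def marker_pose(K, obj_xy, img_uv):
    """Camera pose (R, t) from a planar marker, using H = K [r1 r2 t] up to scale."""
    B = np.linalg.inv(K) @ homography_dlt(obj_xy, img_uv)
    lam = 2.0 / (np.linalg.norm(B[:, 0]) + np.linalg.norm(B[:, 1]))
    if lam * B[2, 2] < 0:                # marker must lie in front of the camera (t_z > 0)
        lam = -lam
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # snap to the nearest true rotation matrix
    return U @ Vt, t
```

In use, `obj_xy` would hold the four corner coordinates of the printed square in its own plane (Z = 0) and `img_uv` their detected pixel positions; the returned (R, t) locate the camera, and hence the head-mounted display, relative to the target.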
The geometric principle of three-dimensional location of points is generally based on triangulation. With two cameras, the problem is posed as follows: Figure 6.5 shows two cameras with their optical centres marked C and C′, their relative position specified by a translation T and their relative orientation by a rotation R. The position of a point M in the space observed by the cameras can be calculated from knowledge of the cameras' parameters and their relative position with respect to each other.
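The triangulation step can be sketched with the standard linear (DLT) method, assuming each camera is described by a known 3 × 4 projection matrix P = K[R | t]; the function below is an illustrative sketch of ours, not taken from a specific library:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point seen by two calibrated cameras.

    P1, P2: 3x4 projection matrices; uv1, uv2: pixel coordinates in each image.
    Each observation u = (P x)/(P x)_3 contributes two linear constraints on
    the homogeneous point x; the solution is the null vector of the stack.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                  # homogeneous -> Euclidean coordinates
```

With noisy detections the four constraints are no longer exactly consistent, and the SVD gives the least-squares solution; practical systems refine it further by minimising reprojection error.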