Game Development Reference
vibration. Today this is still the most commonly used form of touch feedback. Some
of the first game peripherals to include vibration as a feedback mechanism
were the Nintendo 64 Rumble Pak and Sony PlayStation's DualShock controller.
You can make an iOS or Android device (with the required capability)
vibrate by executing the code iPhoneUtils.Vibrate();
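As a minimal sketch, a Unity script could trigger vibration in response to a screen tap like this (the class name VibrateOnTap is illustrative; note that in current Unity versions the legacy iPhoneUtils.Vibrate() call has been superseded by Handheld.Vibrate()):

```csharp
using UnityEngine;

// Minimal sketch: vibrate the device when the player taps the screen.
// Assumes an iOS or Android build target; on hardware without a
// vibration motor the call is simply ignored.
public class VibrateOnTap : MonoBehaviour
{
    void Update()
    {
        // A touch registers as mouse button 0 on mobile platforms.
        if (Input.GetMouseButtonDown(0))
        {
            // Older Unity versions use iPhoneUtils.Vibrate();
            // Handheld.Vibrate() is the current equivalent.
            Handheld.Vibrate();
        }
    }
}
```

Attaching this script to any active GameObject in the scene is enough to enable the behavior.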
In 2003, Sony released the EyeToy for PlayStation 2. This is essentially a
webcam that attaches to the gaming console. Through the processing of
the image taken by the camera, computer vision and gesture recognition
algorithms can estimate the movements of a player.
Although the Nintendo Wii took game haptics to a new level with its Wii
Remote, it was Microsoft that broke new ground by releasing the Kinect
in 2010. Although players do not feel the game environment through a
sense of touch, they still interact with it kinesthetically and therefore
experience a level of immersion not found in preceding games. The Kinect
provides full-body 3D motion capture by continually projecting an infrared
laser pattern, in a grid of dots, out in front of the screen. Depth data
are gathered by a monochrome sensor that measures the reflections of this
pattern. Currently the Kinect is capable of tracking six people
simultaneously. The Kinect is also able to perform facial and voice
recognition.
The Kinect differs from the PlayStation Move, a game controller wand with
built-in accelerometers and a glowing light on top, by sensing actual 3D
data. The Move, also released in 2010, uses a combination of motion sensing
and visual recognition via the PlayStation Eye (the successor to the EyeToy)
to recognize player movements.
Numerous groups are attempting to make access to the Kinect open source
and cross platform. These include OpenKinect (http://openkinect.org/wiki/Main_Page)
and OpenNI (http://openni.org/).
OpenKinect is a community of developers bringing free programming
interfaces for the Kinect to Windows, Linux, and Mac such that independent
application developers and researchers can implement their own
Kinect-based games and applications. Links to their source code and
tutorials for installation and use can be found on their Web site.
OpenNI is a not-for-profit, industry-driven organization that promotes the
use of natural human-computer interaction (HCI) in applications and games.
Their OpenNI framework is open source and helps developers integrate
low-level vision and audio sensors with high-level computer vision and
tracking algorithms, including third-party middleware solutions built on
top of the framework.