Departament d’Enginyeria Informàtica i Matemàtiques
Motion Controls for Pointer-based Games
DEGREE: Grau en Enginyeria Informàtica
AUTHOR: Juan Javier López Reyes
DIRECTOR: Carles Aliagas Castell
DATE: September / 2014.
Index
1. Project Objectives................................................................................5
2. Project Specifications..........................................................................6
3. Design...................................................................................................7
3.1 Problem Background.........................................................................................7
3.2 Study of Commercially Available Motion Control Peripherals.....................9
3.2.1 Nintendo “Wii Remote”.......................................................................9
3.2.2 Sony “PlayStation Move”...................................................................12
3.2.3 Namco “Guncon 3”............................................................................13
3.2.4 Sixense “Hydra”.................................................................................14
3.3 Calibration Algorithm......................................................................................15
3.4 Technical Demonstration.................................................................................19
4. Development......................................................................................20
4.1 Understanding the SixenseInput Plugin.........................................................20
4.2 Mathematical Model of the Lightgun.............................................................21
4.2.1 Screen Depth Offset Correction.........................................................26
4.2.2 GlovePIE Port of the Calibration Algorithm....................................27
4.3 First Person Shooter Prototype.......................................................................27
4.4 SixenseInput Plugin Input Smoothing............................................................35
5. Testing................................................................................................36
5.1 Calibration Logic Test.....................................................................................36
5.2 Calibration Performance Test.........................................................................39
5.2.1 15.6 Inches Results.............................................................................39
5.2.2 24 Inches Results................................................................................40
5.2.3 32 Inches Results................................................................................41
5.3 Prototype Logic Test........................................................................................42
5.4 Prototype Challenges Test...............................................................................47
6. Conclusions........................................................................................49
7. Resources............................................................................................50
7.1 References.........................................................................................................50
7.2 Third Party Resources.....................................................................................51
8. Manuals..............................................................................................53
8.1 Installing Unity.................................................................................................53
8.2 Installing the Hydra drivers............................................................................55
8.3 Setting up the technical demonstration in Unity...........................................56
8.4 Using the pre-compiled Technical Demonstration build on Windows........62
8.5 Using the GlovePIE implementation for Mouse emulation..........................62
9. Annexes..............................................................................................64
9.1 CalibrationController.cs..................................................................................64
9.2 CalibrationSceneController.cs........................................................................73
9.3 CannonAngleController.cs..............................................................................74
9.4 CharacterRotationController.cs.....................................................................76
9.5 ColorChanger.cs...............................................................................................78
9.6 FPSInputControllerSixense.cs........................................................................79
9.7 JoystickLookSixense.cs....................................................................................82
9.8 MainMenuSceneController.cs.........................................................................84
9.9 PausePlaneController.cs..................................................................................86
9.10 SixenseInput.cs................................................................................................87
9.11 SlidingDoorController.cs...............................................................................89
9.12 TargetController.cs........................................................................................92
9.13 WeaponController.cs......................................................................................95
9.14 GlovePIE Wiimote Script..............................................................................98
9.15 GlovePIE Hydra Script................................................................................101
9.16 Arm Cannon license agreement..................................................................107
1. Project Objectives
CRT screens allowed for lightgun peripherals that provided accurate and reliable on-screen reticle tracking without any major drawbacks. However, because these tracking
techniques relied on the specific display method of CRT screens (changing the brightness
and color of phosphor through electron beams), peripherals designed around these
techniques no longer work on newer, LCD-based screens.
Due to the widespread adoption of LCD screens, mostly replacing CRT as the de facto household screen standard, new techniques that provide lightgun-like on-screen
reticle tracking have appeared. These are mostly based on tracking of infrared signals,
emitted from LED sources, using infrared-sensing cameras. Others have attempted to
provide this sort of functionality through motion sensors such as accelerometers, or
orientation sensors such as gyroscopes and terrestrial magnetometers (much like those used
in current smartphones).
Even though some of these methods have proven as accurate as those used on CRT
screens, none have proven as reliable and hassle-free, often suffering interference and
calibration errors due to common environment elements such as light sources, or even due
to something as simple as a sudden movement of the lightgun peripheral, a common
situation in fast-paced shooting games.
This project aims to study the currently commercially available motion control
peripherals and their fitness for the purpose of lightgun-like reticle tracking on modern
LCD screens, and to analyze the potential benefits of switching to magnetic field-based
technologies. Finally, it aims to design a calibration algorithm that enables a magnetic field-based peripheral to provide lightgun-like functionality (a functionality that is not currently
implemented in any of the manufacturer-provided libraries for this peripheral) and to
implement it as a generic library in a cross-platform game engine (such as Unreal Engine
or Unity3D), along with a small technical demonstration of this calibration library working
within a first-person-shooter game prototype, designed both to showcase the benefits of this
technology and to provide a functional example for developers who wish to utilize the
calibration library.
2. Project Specifications
Due to the widespread adoption of LCD-based screens incompatible with the
lightgun technologies developed for CRT monitors, as well as the unique drawbacks of the
currently employed LCD-compatible lightgun technologies, the goals of this research
project are based on solving the most common initial issues that any developer seeking to
produce a lightgun game must solve before being able to work on the full software:
– To research and evaluate currently available end-user peripherals for the purposes of lightgun gameplay on LCD monitors, generating documentation on both their strengths and weaknesses for the purposes of household lightgun tracking, as well as producing video material showcasing these drawbacks.
– To make a justified choice of one of these peripherals based on this research.
– To develop a platform-agnostic calibration and tracking algorithm, based on mathematics and data from the peripheral, that provides lightgun-like operation from said peripheral on LCD monitors and is easily portable across platforms. This algorithm must not rely on built-in functions provided by game engines, further ensuring portability.
– Data from the devices may be obtained through both first-party and third-party plugins such as the UniWii Wiimote plugin for Unity3D, so long as they are free for non-commercial and/or educational use.
– To implement at least two working versions of this calibration algorithm (one main version and one port) on two different platforms currently capable of reading data from the peripheral, to demonstrate both functionality and portability (ensuring the previous steps won't need to be repeated with any platform and display technology that supports the peripheral).
– Finally, to implement an early prototype of the most common videogame genre that utilizes lightgun technology: a “first person shooter” prototype (where at least the aiming reticle is controlled by the player) showcasing the basic functionalities of the algorithm in a working state.
– The lightgun game prototype demonstration, along with one of the calibration algorithm implementations, will be built in a cross-platform game engine such as CryEngine, Unreal Engine or Unity3D, with the main target platform being Windows PC.
3. Design
3.1 Problem Background
So-called “lightgun games”, games in which the primary method of interaction is to
aim a gun-shaped peripheral at the screen and shoot at targets, have existed since the
1930s, but they did not become a mainstream feature of household videogames until the
debut of Nintendo's NES (Nintendo Entertainment System) and SEGA's SMS (Sega
Master System), where lightguns such as the “Zapper” and the “Light Phaser” became
available as purchasable accessories for the aforementioned videogame systems.[1]
Initially, their screen detection systems were very primitive. The earliest, known as the
“Sequential Targets” method, involved blacking out the screen for the duration of a few
frames upon pressing the lightgun's trigger. During the first frame, a white square would be
drawn on the position of the first target. During the second frame, this square would
instead appear on the position of the second target, and so on. Thus, it was possible to
know which target was hit by knowing whether the gun was pointed at a white or black
light (the target versus the rest of the screen), and during which of these frames this
happened.[2]
This technology had a severe limit: Many frames would need to be drawn on the
screen for the duration of the trigger press. Too many targets, and the screen detection
system would obscure the screen for too long, as well as introducing potential errors due to
the player moving his hand away after the shot. At the time, NTSC-region
televisions (from North America and Japan) displayed 60 frames every second and PAL-region
televisions (Europe and Australia) displayed 50, which meant that the speed at which
these images could be displayed was limited, and so the number of on-screen targets
suffered as well: Nintendo's popular videogame, “Duck Hunt”, displayed up to two
simultaneous targets at any given time.[3]
Another limitation of this technology is very well known among gamers: Simply by
aiming the gun at a white light source, one could trick poorly programmed games into
always shooting one or many on-screen targets successfully with every trigger press.
Eventually, another targeting method was developed and very widely used during the
90s in several well-known lightguns such as Nintendo's “Super Scope”, Namco's “GunCon
45” (or simply “GunCon” in some regions), Konami's “Justifier” and SEGA's “Virtua
Gun” (or “Stunner” in North America). This method, known as “Cathode ray timing”,
solved the limitations of Sequential Targeting, and is still used to this day in some Arcade
machines, where CRT monitors are still relatively common thanks to their virtually
nonexistent input lag.
This targeting system makes use of the nature of the cathode ray tubes inside CRT
monitors, which were the most common type of home display during the 1980s and 1990s
thanks to their affordability. CRT monitors draw every image through a scanning electron
beam (or cathode ray) that travels across the screen. Starting at the top of the monitor, the
beam draws one line of the image until it hits the opposite end of the screen, and then it
moves right below to continue drawing the next line. This process is repeated until the
entire image is drawn, and the process is not visible to the human eye due to its speed: As
previously mentioned, the standard refresh rates were 50Hz and 60Hz depending on the
region, so every screen update is done in roughly either 20 or 16.7 milliseconds.[4][5]
This method of updating the screen is used by the computer to calculate the aiming
spot: The lightgun's camera sends a signal to the computer after detecting a sudden change
in brightness at the spot where it is aimed. This sudden change of brightness happens when
the electron beam excites the phosphor at the screen location it is currently refreshing. The
computer knows how long it takes for the entire screen to refresh, and it also knows how
long it took to detect the refresh at the aimed spot since the refresh started, but it does not
initially know how fast every line is filled or how fast the electron beam travels to the next
line. Once it knows one of these timings (and with it the vertical or horizontal axis
component of the aiming spot), it can compute the missing one.
The computer obtains this information by either providing a time base for the
horizontal refresh rate through the controller's connector (a technique used by Nintendo's
Super Scope), or most commonly by analyzing the composite video signal (which it reads
through a T-connector on the A/V cable) during a couple calibration shots at the start of the
game.
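As a numeric illustration of this timing arithmetic (all figures below are assumptions chosen for readability, not measurements from any particular console or lightgun), the elapsed time since the start of a refresh can be split into a scanline index and a fraction of that scanline:

// Illustrative sketch of cathode ray timing; all values are assumed examples.
using System;

class CathodeRayTimingSketch
{
    static void Main()
    {
        // Hypothetical NTSC-like figures: 60 full refreshes per second, 240 visible lines.
        double frameTime = 1.0 / 60.0;            // seconds per full screen refresh (~16.7 ms)
        int lineCount = 240;                      // visible scanlines (assumed)
        double lineTime = frameTime / lineCount;  // seconds per scanline (the horizontal time base)

        // Time elapsed between the start of the refresh and the gun detecting the beam.
        double detectionTime = 0.010;             // example value: 10 ms into the frame

        // Vertical component: which scanline was being drawn when the beam was detected.
        int scanline = (int)(detectionTime / lineTime);

        // Horizontal component: how far along that scanline the beam had travelled.
        double acrossLine = (detectionTime % lineTime) / lineTime;

        Console.WriteLine("Aiming spot: line " + scanline + ", " + (acrossLine * 100.0) + "% across the line");
    }
}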
This technology had no major drawbacks: Most of the time, “it just worked” with
very little setup. With the lights on or off, with or without sunlight, close up or far away
from the screen... However, for this technique to work, there needs to always be a sudden
change in brightness (from dark to bright) in spots of the screen that are being refreshed.
This was provided by default by CRT monitors, as the screen phosphor would darken
between refreshes due to not being excited by the electron beam. Modern display
technologies such as Plasma and LCD do not have this “off” state between refreshes,
rendering cathode ray timing-based peripherals unusable on most any contemporary
household monitor.
With the advent of Plasma and LCD monitors as the household standard, alternatives
had to be sought. The first two main commercial alternatives, Nintendo's “Wii Remote”
and Namco's “GunCon 3”, relied on two slightly different implementations of the same
technology: Two clusters of IR-emitting diodes would be placed near the screen (on top or
right below), one on the left side and the other one on the right.[6]
The lightgun peripheral was outfitted with an IR tracking camera that would detect
this pair of IR light sources and provide the computer with their vertical and horizontal
coordinates, allowing it to compute an on-screen position for the aiming reticle based on
these coordinates, as well as the distance between the lights (which allows the computer to
estimate movement on the depth axis, movement towards the screen or away from it).[7]
To compensate for temporary loss of the IR signal, some Wii games (such as “Red
Steel 2”) have used gyroscopes (included in the “Wii Motion Plus” peripheral and later
built into the “Wii Remote Plus”) to estimate the movement of the aiming reticle from the
last known IR-based position, but due to the gyroscope phenomenon known as “drift”
(varying offsets appearing on the original angle calibration values over time), gyroscopeonly aiming never became the norm on Wii games without serious drawbacks (as seen on
“The Legend of Zelda – Skyward Sword”, where the player would need to frequently recenter and re-calibrate the controller and aiming reticle).[8]
Today, IR tracking is rapidly becoming the standard for modern, newly produced
lightgun games in arcades, such as “Time Crisis Razing Storm”.[9] The controlled
nature of the arcade environment allows operators to disguise most of the
disadvantages associated with the technology. However, as of 2014, there is no such
standard for lightgun games in the household environment. The Xbox 360 and Xbox One
currently have no lightgun-game oriented peripheral, and the Wii and the Wii U rely on the IR
tracking technology used in the Wii Remote, whereas the PlayStation 3 and PlayStation 4
use the PlayStation Eye and PlayStation Move peripherals. These make use of visible
spectrum light as opposed to IR: an RGB LED light source of configurable color is placed
on the wand, the camera that tracks this light to compute the gun's position is placed
near the screen instead, and accelerometers, rate sensors and
magnetometers are used to compute the angle and rate of movement.[10]
3.2 Study of Commercially Available Motion Control Peripherals
Prior to designing a calibration algorithm to act as the basis of our lightgun game
prototype, the main lightgun peripheral alternatives must be studied and evaluated.
Calibration algorithms could vary wildly depending on the data said peripherals make
available to the programmer, making this a necessary preliminary step.
In this study, we will take an in-depth look at the main, first-party motion control
peripherals released for the seventh and eighth generations of home video game systems
(Nintendo's Wii Remote and Sony's PlayStation Move) and the biggest profile example of
a commercially available infrared-based dedicated lightgun peripheral (Namco's GunCon
3, compatible with Time Crisis 4 on the PlayStation 3). As an alternative to line-of-sight
based motion control peripherals, Razer's “Hydra” controller (manufactured by Sixense,
and previously known as Sixense Truemotion), which relies on magnetic motion tracking
in order to provide six degrees of freedom (namely three dimensional position and three
dimensional rotation), but has no camera-based tracking capabilities, will also be
considered. In particular, their fitness for the purpose of lightgun-style tracking of an on-screen reticle that accurately follows the line of sight of the peripheral will be analyzed,
regardless of their performance in other motion control-related scenarios.
As the Microsoft Kinect and Kinect 2.0 do not make use of any handheld peripheral
that could act as a lightgun, they were deemed unfit for this purpose without further
analysis.
3.2.1 Nintendo “Wii Remote”
Figure 1. Wii Remote with Wii Remote Jacket and strap
First up is Nintendo's “Wii Remote” (also known as Wiimote and Wii Rimokon in
Japan). Outfitted with an ADXL330 accelerometer[11] and a PixArt optical sensor[12], the
Wii Remote is capable of sensing acceleration along three axes (which can be used to
estimate both positional and rotational movement) and, most importantly, is capable of
tracking the coordinates of infrared light sources in two dimensions (vertical and
horizontal), which can be used to estimate where the Wii Remote is pointing in relation to
these IR sources (as well as the remote's distance from said sources if there are more than
one, by analyzing how the perceived distance between them increases or decreases).
Though the PixArt sensor is theoretically able to track up to four IR sources, the
official implementation (the Nintendo Sensor Bar) uses two IR sources aligned horizontally,
placed right on top of or under the screen where the Wii output is being displayed.
Later in the Wii's lifespan, an accessory for the Wii Remote called “Wii MotionPlus”
was released as an attachable unit first, and ultimately integrated into the current (as of the
year 2014) revision of the Wii Remote (known as the Wii Remote Plus). This accessory
uses an InvenSense IDG-600 sensor[13] (upgraded to IDG-650 in later units). This sensor
is a multi-axis gyroscope, which complements (and usually replaces) the Wii Remote's
accelerometers when attempting to estimate its rotational movements, due to their
inaccuracy for this purpose.
When it comes to using the Wii Remote as a lightgun, there are two main
approaches: To simply use the Wii Remote as an infrared lightgun, or to attempt to harness
its accelerometers for positional movement and its gyroscopes for rotational movement.
Let us analyze the first alternative: Using the Wii Remote as an infrared lightgun. In
order to be able to do this from a comfortably close distance from the screen, its optical
sensor must have a wide field of view in both the vertical and horizontal axes, so as to be
able to keep the TV-mounted Sensor Bar IR sources in sight when aiming at any part of the
screen from up close, especially on bigger television sets.
The Wii Remote's infrared camera has an effective field of view of approximately 33
degrees horizontally and 23 degrees vertically[14], which makes it sub-optimal for this
purpose. As the camera needs to keep either the bottom or the top of the screen visible at
all times (depending on the Sensor Bar's position), either a smaller screen size or a further
playing distance is needed. On very large screens, this is a serious issue: standing too far
away causes the Wii Remote to lose track of the IR sources altogether, whereas getting closer
will result in inaccurate tracking.
A video showcasing the field of view deficiencies of the Wii Remote can be found
here (http://youtu.be/ErVBGL7ne-M): Using a GlovePIE script (included in this document
as an annex) that will make the Wii Remote's fourth LED indicator light up when at least
one of the IR sensors is in sight, it can be seen that, sitting relatively up close to the 32 inch
TV screen, the infrared signal is lost frequently when aiming at the screen's bottom half.
Another problem arises when using the Wiimote as an infrared lightgun: It is
inherently sensitive to interference from other infrared sources. These can come in the
form of natural sunlight through a window, candle or lighter fire, or even a household
lightbulb. Windows and lightbulbs especially are common in most household rooms where
a gaming system may be placed.
As seen in the following video (http://youtu.be/h97wwcDk-4c), interference from
infrared sources can cause the Wii Remote to intermittently lose any IR signal altogether,
or to incorrectly detect the position of the IR sensors.
Taking a look at the other possibilities, if one were to omit IR tracking and instead
focus on the motion sensors, both positional and rotational tracking would be needed. The
Wii Remote can provide these through accelerometers and, if using MotionPlus,
gyroscopes.
When attempting to use the Wii Remote's gyroscopes for rotational tracking, serious
issues appear. Gyroscopes commonly suffer a problem known as “Gyroscope drift”.
Gyroscopes experience a non-zero bias error even when not experiencing any rotation. One
could easily account for a fixed bias error simply by measuring said bias, but unfortunately
gyroscope bias varies due to a number of factors (calibration errors, temperature, effects of
shock, mechanical wear...). It is this variation that is known as gyroscope drift, or “bias
drift”, and causes gyroscope measurements to become increasingly inaccurate over time or
even right after sudden movements[15].
If using the Wii Remote's IR tracking capabilities, one could continuously recalibrate
the MotionPlus gyroscope whenever IR sensors are in sight, allowing the gyroscope to take
over during brief moments where the player is aiming just outside of the IR tracking range
(for instance, the bottom portion of the screen). Unfortunately, sudden movements can
cause instantaneous and very considerable offsets to appear, as seen in the following
footage: (http://youtu.be/Ix2wkHkp8JI).
Nintendo themselves were unable to avoid the gyroscope drift problem, as evidenced
by the frequent need of recalibrating and recentering the MotionPlus in The Legend of
Zelda: Skyward Sword, noted by several independent game reviewers[8].
The Wii Remote's accelerometer can be used to estimate the controller's tilt thanks to
the way it behaves with regard to gravity: When left in free fall, it will report zero
acceleration. However, when held still, it will report an upwards acceleration
approximately equal to gravity. When rotating the controller, this upward acceleration
would begin to switch from one axis to another, as long as it is held reasonably still,
enabling the programmer to estimate the controller tilt[16].
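For illustration, the general tilt-from-gravity computation can be sketched as follows; this is generic accelerometer math with an assumed axis convention, not the Wii Remote's actual API or firmware:

// Minimal sketch of tilt estimation from a gravity reading; axis convention assumed.
using System;

static class TiltFromGravitySketch
{
    // (ax, ay, az) is the measured acceleration while the controller is held
    // reasonably still, so it is approximately the gravity vector.
    public static void Estimate(double ax, double ay, double az,
                                out double pitchDegrees, out double rollDegrees)
    {
        // Pitch: rotation about the controller's lateral axis.
        pitchDegrees = Math.Atan2(-az, Math.Sqrt(ax * ax + ay * ay)) * 180.0 / Math.PI;
        // Roll: rotation about the controller's longitudinal axis.
        rollDegrees = Math.Atan2(ax, ay) * 180.0 / Math.PI;
    }
}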
This technique, of course, relies on the controller remaining reasonably still so that
the only upwards acceleration of the Wii Remote is that of gravity. As soon as any other
acceleration is introduced from moving the controller, this method is no longer reliable,
hence the introduction of the Wii MotionPlus. As the scenario of using a lightgun
intrinsically involves movement (in order to aim from one target to another), this technique
is rendered unreliable as an estimation of the Wii Remote's rotational
movement.
In summary, the only reliable way of using the Wii Remote as a lightgun is by using
its infrared tracking capabilities, a method which introduces its own set of issues, but at
least provides near-perfect accuracy with windows closed, lights off, the correct television and
the correct sitting distance (http://youtu.be/pl8dU9uqiwY), and with input lag virtually
indistinguishable from a dedicated IR lightgun peripheral such as a Guncon 3.
3.2.2 Sony “PlayStation Move”
Figure 2. PlayStation Move Controller
The Sony PlayStation Move motion controller is outfitted with a three-axis
gyroscope, a three-axis accelerometer, a terrestrial magnetic field sensor and a visible
light-emitting sphere capable of changing colors, which can be tracked by a camera known
as the PlayStation Eye, usually placed atop or under the TV screen and commonly used in
conjunction with the Move motion controller.[17]
Figure 3. PlayStation Eye camera
The PlayStation Eye is capable of tracking the positional movement of several
PlayStation Move controllers at once, provided that each of them is emitting a different
color through its sphere. Because it can detect the relative size of the sphere, it is also
capable of detecting position along the depth axis. Since the PlayStation Eye is commonly
placed near the TV as specified by Sony and most games that utilize the peripheral[18],
this provides the PlayStation 3 (and 4) with a reference for how far the controllers are from
the television with a relatively small margin of error. Then, it is possible to know where the
PlayStation Move controllers are aiming thanks to a combination of their gyroscopes and
terrestrial magnetometers, allowing developers to compute the TV screen's size through
simple interactive calibrations that ask the players to shoot at the screen's edges, finding
where the Move controller's line of sight intersects with the plane at the PlayStation Eye's
depth.
Because of the aforementioned gyroscope drift issues, the PlayStation Move
controllers were outfitted with magnetometers capable of sensing the Earth's magnetic
field, allowing them to continuously recalibrate the built-in gyroscopes and reduce the
problem of gyroscope drift.
Unfortunately, these magnetometers suffer calibration issues of their own, and while
the problem of drift is severely reduced, it is not eliminated in a meaningful enough way
for accurate lightgun calibration, as showcased in the following experiments, where even a
single sudden movement from one target to another is enough to produce lightgun tracking
issues: (http://youtu.be/n7ka9_rhN5o). Once calibration drift manifests itself, the only way
to correctly re-center the aiming reticle is to go through the calibration process again.
Besides these, the PlayStation Move does not suffer any other major issues. The
PlayStation Eye does not track infrared, but visible light, and it is capable of configuring
the colors that it will track, allowing it to single out undesired light sources in the
environment. It has a very wide field of view (exceeding 45 degrees in the horizontal and
vertical), and is capable of tracking the PS Move spheres without issue from up-close, and
even slightly upwards of 3 meters away [19], and it boasts relatively low input latency
(http://youtu.be/3UjqfaOhY60), rendering the PlayStation Move a viable and rather hassle-free alternative for casual lightgun play that does not include sudden violent movements, as
seen here (http://youtu.be/36-9Irjka6k), but it is not fit for traditional action-heavy arcade
lightgun experiences.
3.2.3 Namco “Guncon 3”
Figure 4. Guncon 3, Japanese domestic model
Namco's Guncon 3 shares much in common with the Wii Remote's IR sensing
capabilities. Outfitted with an IR sensor camera, it is capable of tracking two IR-emitting
sources placed atop the TV screen[20].
As the Guncon 3 was designed specifically for lightgun use, it sports a significantly
wider field of view, as well as independent IR sources that can be manually adjusted to the
TV screen's size. While this still does not allow the player to sit extremely close to the
screen, it offers enough leeway for reasonably close sitting positions (working well from
approximately one meter away from a 32 inch television set). Testing of its sensing range
provided accurate tracking at distances up to approximately five meters in ideal conditions
with very little IR contamination (enclosed room with closed window shutters and no
lights on other than the TV screen).
However, on the subject of infrared interference, the Guncon 3 too
suffers from this issue, losing calibration entirely and displaying a warning LED when
faced by enough IR interference. As shown by the following footage, a desk lamp is
enough to cause this effect: (http://youtu.be/Jwsw2t8AlhM)
Namco's Guncon remains the best performing modern lightgun for arcade shooting
enthusiasts, as it is impervious to calibration issues related to fast movements, but like the
Wii Remote's IR tracking, it does require that the user remove as many IR sources as
possible from the environment, as well as maintain direct line of sight from the lightgun to the
provided IR emitters at all times during a play session. Provided that the sensors are in the
Guncon 3's field of view (which is quite wide, as demonstrated here
(http://youtu.be/sMu84OACPhQ), at very close proximity to a very large screen), the
peripheral provides remarkable tracking accuracy with low latency
(http://youtu.be/lNCEFxwj9P0), albeit with some reticle jitter
(http://youtu.be/X4Th8zOyj3k).
3.2.4 Sixense “Hydra”
Figure 5. Razer Hydra, designed by Sixense Entertainment
The Razer Hydra, designed by Sixense Entertainment but manufactured and
distributed by Razer, does not include any line-of-sight-based tracking systems. Instead, it
expands on the concept of the terrestrial magnetometer by providing its own low-intensity
magnetic field through a base outfitted with electromagnetic coils.
The wands are equipped with magnetometers calibrated to detect the base's magnetic
field. Through this method, the Razer Hydra is able to provide both three-axis positional
and rotational data, with single-millimeter precision in the first and single-degree precision
in the second[21].
Currently, the wands are wired and attached to the base, restricting their maximum
range to approximately 1.8 meters from said base, but a wireless version with further
improvements to range and tracking accuracy is in development in the form of the Sixense
STEM System, which will also allow a combination of multiple electromagnetic bases to
further extend the operating range of the handheld controllers. The Sixense STEM will
make use of its own SDK, but will also be backwards compatible with the Razer Hydra
SDK.
Upon testing the Hydra controller, it was observed that both rotational and positional
data are very slightly biased. While the margin of error is very small, it is enough to produce
noticeable calibration issues, and it is not fixed, displaying seemingly random variations.
However, upon further analysis, its drift appears to be related to the controller's position in
relation to its base. This means that, so long as the player does not move from his playing
position during a play session, the data bias will drift very slightly. During these tests, the
Hydra's drift does not seem to vary in any noticeable way under the effects of shock, as
shown in the following footage: (http://youtu.be/Ai8H6_pmc_Y). Further, the Hydra's
input latency does not seem to be perceivably different from that of a dedicated lightgun
peripheral such as the Guncon 3 (http://youtu.be/iDz-16Fqv5k).
In order to take advantage of this bias drift behavior, a proposed calibration
algorithm should rely mostly on data provided from the player's playing position. This
way, recalibrations would only be needed if this playing position is changed, and would be
unrelated to the duration of the play session or the nature of the player's hand movements.
Should a calibration algorithm meet this criterion, it is likely that the Sixense Hydra
tracking technology could provide accurate and responsive lightgun-like on-screen reticle
tracking functionality, thanks to its high precision and its magnetic tracking technology,
which is impervious to IR contamination and to bias drift appearing over time or due to sudden shock,
and also far more accurate than terrestrial magnetic tracking, likely due to the controlled
nature of the magnetic field.
3.3 Calibration Algorithm
Because of the promising results obtained from initial testing of the Razer Hydra, as
well as a newer, improved version of this system being in the works, the calibration
algorithm will be developed with the Hydra's particularities in mind.
In particular, the Hydra is impervious to most of the issues found in more traditional
motion sensing technologies, such as IR contamination, low fields of view, line of sight
loss, inaccurate magnetometer tracking or rotational bias drift caused by the effects of
shock. At the same time, its main drawback seems like it could be easily circumvented
by an appropriate calibration algorithm design given a stationary playing position.
The Sixense Hydra provides us with positional data (height, length, depth in
millimeters) and rotational data (yaw, pitch, roll in Euler angles), from which we could
estimate both the position and angle of an “aiming ray” that would follow the line of sight
of the Hydra wand. Our objective here is to find where on the screen an intersection with
this line of sight is taking place, so as to place the aiming reticle there.
Figure 6. Calibration algorithm model
Two main problems arise: Even though the Hydra base must be placed in front of the
screen, several factors could influence the exact distance, such as the television's stand
having a large base. Furthermore, the size and aspect ratio of the television screen are
unknown.
By finding out the diagonal size and the aspect ratio of the television screen, as well
as its depth in relation to the Hydra base, a simple mathematical algorithm can be designed
to find where the Hydra wand's line-of-sight aiming vector in 3D
coordinate space intersects a vertical plane placed at the television's depth. Then, by
finding whether or not this intersection point is within the boundaries of the screen
rectangle, and if so, where, one can then proceed to draw the aiming reticle on the screen.
For instance, one could take one corner of the screen as being the 0% vertical and
0% horizontal position, and the opposite corner of the screen to be the 100% vertical and
100% horizontal position, compute where within this range the intersection is taking place,
and then simply multiply this percentage by the screen's vertical and horizontal resolution
and place the aiming reticle on the resulting on-screen pixel coordinate.
Figure 7. Screen co-ordinate system
A solution to the first problem, that of the screen's depth relative to the Hydra base, is
to have the user place the Hydra wand right in front of the screen, reading its depth and
storing it as the screen's depth. Other solutions exist, such as offering a standard preset and
simply having the end-user deal with a small error margin, allowing the end-user to modify
this preset within a small range so that he can attempt to adjust the experience to his
liking, or even allowing the end-user to simply enter this numerical distance, which he
could measure himself.
In the end, the first solution is probably the most user-friendly, as it
requires the user to have the least amount of knowledge about the set-up, and does not
require carefully adjusting settings manually. Because of the bias drift affecting this
measurement, the user should be asked to place his Hydra base in as close proximity to the
monitor as possible, much like other motion sensors such as the PlayStation Eye, and
facing directly towards the player's playing position (a requirement which is included as
part of the standard Razer Hydra set-up manual)[22].
Now that we have a reference for the screen's depth, we can find the screen's size and
aspect ratio simply by asking the user to aim at any pair of opposite corners on his screen.
Considering a rectangular screen, we would only need to assume these as the 0% vertical,
0% horizontal coordinate and the 100% vertical, 100% horizontal coordinate, providing us
with the complete screen boundaries.
However, when doing this, inaccurate tracking is experienced near the screen's
edges. This is caused by rotational bias, and if one finds the intersection points of the
Hydra's line of sight at the screen's depth when aiming at all four corners, it is found that
the screen appears not rectangular, but trapezoidal. Thus, information from the two missing
corners is erroneously extrapolated.
Figure 8. Erroneous model due to Hydra rotational bias
To compensate for this, one can instead opt to find the intersection points for all four
corners instead of only one pair, and then find the equations for the four straight lines
between each pair of corners in the same side of the screen, which form the screen's outer
boundaries. Then, when aiming at a certain coordinate (x,y) within the plane at screen
depth, one can find the horizontal values of the lines that form the vertical (left and right)
boundaries of the screen at height “y”, and thus at which horizontal percentage of the screen this
coordinate “x” resides. Similarly, by finding the vertical values of the lines that form the
horizontal (top and bottom) boundaries of the screen at horizontal coordinate “x”, one can
find at what height percentage within the trapezoidal screen boundary coordinate “y”
resides.
Figure 9. Translating (x,y) to vertical and horizontal screen percentage
The final conceptual design of the calibration algorithm, then, is as follows:
– Place the Hydra wand at screen depth “z” and save the depth value in millimeters.
– Go back to the playing position and aim at all four corners of the screen.
– Find the equation of the Hydra wand's line of sight from its positional 3D coordinates in millimeters and its rotational 3D values in Euler angles.
– Find the intersection of said vector with the vertical plane at screen depth “z”, obtaining the (x,y) coordinates of all four intersections at depth “z”, one for each corner of the screen.
– Find the equations of the straight lines between these four intersection points, which correspond to the four outer edges of the screen.
– Calibration is now complete: when aiming normally, find the (x,y) intersection of the Hydra wand's line of sight at depth “z”, find the unknown values of the screen borders' equations (using height “y” for the left and right borders and length “x” for the top and bottom borders), take these values as the 0-100% length and 0-100% height respectively and find where within these percentage ranges the original (x,y) coordinates fall.
– Using these percentage height and length values, simply draw the aiming reticle at the desired pixel position, knowing the current screen's vertical and horizontal resolution.
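A minimal structural sketch of this flow, written against Unity types, is shown below. The class, method and field names are illustrative, and the final step uses a simple bounding box where the project's annexed script evaluates the four border-line equations; it sketches the steps listed above, it is not the project's code.

// Structural sketch of the calibration flow; names and bounding-box shortcut are assumptions.
using UnityEngine;

public class CalibrationFlowSketch
{
    float screenDepth;                  // step 1: depth stored with the wand held at the screen
    Vector2[] corners = new Vector2[4]; // steps 2-4: 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right

    public void SaveScreenDepth(Vector3 wandPosition)
    {
        screenDepth = wandPosition.z;
    }

    // Intersect the wand's line of sight with the vertical plane at screenDepth.
    Vector2 Intersect(Vector3 origin, Vector3 direction)
    {
        float t = (screenDepth - origin.z) / direction.z; // assumes the wand points towards the screen
        return new Vector2(origin.x + direction.x * t, origin.y + direction.y * t);
    }

    public void SaveCorner(int index, Vector3 wandPosition, Vector3 wandDirection)
    {
        corners[index] = Intersect(wandPosition, wandDirection);
    }

    // Translate the current aim into 0-1 horizontal and vertical screen fractions.
    public Vector2 AimToScreenPercent(Vector3 wandPosition, Vector3 wandDirection)
    {
        Vector2 aim = Intersect(wandPosition, wandDirection);
        float minX = Mathf.Min(corners[0].x, corners[2].x);
        float maxX = Mathf.Max(corners[1].x, corners[3].x);
        float minY = Mathf.Min(corners[2].y, corners[3].y);
        float maxY = Mathf.Max(corners[0].y, corners[1].y);
        return new Vector2((aim.x - minX) / (maxX - minX),
                           (aim.y - minY) / (maxY - minY));
    }
}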
3.4 Technical Demonstration
The technical demonstration built for the calibration algorithm must concisely
showcase the benefits of Hydra's magnetic field motion tracking, as well as the fitness of
the algorithm for the purposes of providing lightgun-like reticle tracking by harnessing the
Hydra's technology.
By putting the player in situations in which he is likely to trigger the tracking
deficiencies experienced with other peripherals, one can demonstrate whether or not the
Hydra, along with the calibration algorithm implementation, can properly and reliably
provide reticle tracking during and after said situations.
Of course, infrared contamination cannot be tested by utilizing only software, so the
technical demonstration will focus on a scenario that forces the player to make sudden
movements, either by quickly aiming from one target to another or by following moving
targets. Afterwards, the player will need to shoot a very distant target accurately, proving
that the reticle tracking is still accurate after these extreme cases.
Thus, the technical demonstration will consist of the following:
– A room filled with a few stationary, easy-to-hit targets, so that the player may get accustomed to the controls.
– Two targets, placed at a distance from one another, which must be shot in rapid succession, so as to force a sudden, violent movement when shifting aim from one target to the other. If the targets are not shot in rapid succession, the next event will not activate.
– One moving target, which the player must shoot. By attempting to aim at a rapidly moving target, the player will be forced to track it, ensuring the algorithm's reticle tracking does not introduce significant overhead and the input latency remains low, and ensuring reticle tracking remains reasonably accurate after the previous challenge.
– Finally, one very distant target, which the player cannot approach. It must be shot accurately from a relatively large distance, ensuring that the Hydra wand and the on-screen reticle still line up properly after the previous challenges.
4. Development
Due to the ample documentation available from both official sources (such as Unity
and Sixense) and unofficial sources (end-users of the engine), as well as the availability of
a Hydra input plugin provided by Sixense, Unity3D has been chosen as the platform on
which the calibration library and technical demonstration will be built.
4.1 Understanding the SixenseInput Plugin
In order to make a correct implementation of the proposed calibration algorithm, the
format in which SixenseInput (the Hydra plugin for Unity provided by Sixense
Entertainment) provides us with the Razer Hydra's data must first be understood.
Looking first at rotational data, SixenseInput provides us with a four-dimensional
vector through the call “RotationRaw”. Through observation, it has been found that the (x,
y, z) elements provided within this structure correspond to the rotational elements of pitch,
yaw and roll, in this order.
These elements are provided in a range of [-1.0, 1.0] and behave as follows,
considering that the player remains looking at the TV screen at all times, and during the
Roll measurements, that the Hydra also remains pointing at the TV:
RotationRaw value | Pitch (x)                | Yaw (y)                    | Roll (z)
0.0 / -0.0        | 0º (aiming at TV)        | 0º (aiming at TV)          | 0º (buttons facing upwards)
0.5               | 90º (aiming at ceiling)  | 90º (aiming to the left)   | 90º (buttons facing to the left)
1.0 / -1.0        | 180º (aiming backwards)  | 180º (aiming backwards)    | 180º (buttons facing downwards)
-0.5              | 270º (aiming at ground)  | 270º (aiming to the right) | 270º (buttons facing to the right)
Table 1. Behavior of the x, y, z components of “RotationRaw”
From this behavior, we can infer the following equivalences:
            | RotationRaw | Arcdegree | Radian
Upper limit | 1.0         | 180º      | π (Pi)
Lower limit | -1.0        | -180º     | -π (-Pi)
Table 2. Equivalence between x, y, z components of “RotationRaw”, Arcdegree and Radian
So, one can convert the RotationRaw values to Arcdegree simply by multiplying the
value by 180, or to Radian by multiplying by Pi.
Shifting our attention to positional data, matters are much simpler. Simply by calling
“PositionRaw”, we receive a three-dimensional vector (x, y, z) where the elements
correspond to length, height and depth respectively, in millimeters.
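For illustration, the snippet below reads one wand's raw values each frame and applies the conversions above. RotationRaw, PositionRaw and GetController are the plugin calls discussed in this section; the hand-selection enum value and all local variable names should be treated as assumptions of this sketch.

// Illustrative use of the SixenseInput calls described above (names other than
// GetController, RotationRaw and PositionRaw are assumptions).
using UnityEngine;

public class HydraRawReader : MonoBehaviour
{
    void Update()
    {
        var wand = SixenseInput.GetController(SixenseHands.RIGHT);
        if (wand == null) return; // wand not connected yet

        // Rotational data: x = pitch, y = yaw, z = roll, each in [-1.0, 1.0].
        float pitchRadians = wand.RotationRaw.x * Mathf.PI; // multiply by Pi for radians
        float yawDegrees   = wand.RotationRaw.y * 180.0f;   // multiply by 180 for arcdegrees

        // Positional data: length, height and depth in millimeters.
        Vector3 positionMm = wand.PositionRaw;

        Debug.Log("pitch " + pitchRadians + " rad, yaw " + yawDegrees + " deg, position " + positionMm + " mm");
    }
}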
4.2 Mathematical Model of the Lightgun
Up to this point, we have been able to extract the following information from the
SixenseInput plugin:
– Point of Origin
– Pitch
– Yaw
– Roll
These elements are enough to create a parametric equation for a vector in the 3D
coordinate space, which we will intersect with a plane representing the real-world TV
screen at which the player will aim, as explained in the conceptual design.
Since Roll does not affect our spherical coordinates (those that affect only the
direction of the line of sight), it is not to be taken into account for the mathematical model,
as only the Lightgun's line of sight is relevant for this purpose.
In order to create said parametric equation, first we need to construct a directional
vector from the data we already know, so that we may combine it with our point of origin.
For this purpose, we must first construct a rotation matrix using the data we obtained
from Table 1 in the previous section and Unity3D's coordinate system:
(pitch, yaw) | Unity X axis | Unity Y axis | Unity Z axis | Meaning
(0, 0)       | 0            | 0            | 1            | Pointing along Unity's Z axis
(π/2, 0)     | 0            | 1            | 0            | Pointing along Unity's Y axis
(0, -π/2)    | 1            | 0            | 0            | Pointing along Unity's X axis
Table 3. Rotation matrix constructed from “RotationRaw”
From this information, we can construct the correct direction vector according to
Unity3D coordinates using the following formula, where (a, b, c) are the direction vector's
components:
a = cos(yaw) · cos(pitch)    (1)
b = sin(yaw) · cos(pitch)    (2)
c = sin(pitch)    (3)
Where yaw and pitch may be represented in either Radian or Arcdegree according to
Unity's “Mathf” library.
These equations can be derived from analyzing the projections of the sine and cosine
of the yaw and pitch components at the states in which each of the Unity axes would equal
“1” in the direction vector (as seen in the Rotation matrix).
Figure 10. Rotation matrix constructed from “RotationRaw”
Figure 11. Rotation matrix constructed from “RotationRaw”
Figure 12. Rotation matrix constructed from “RotationRaw”
Figure 13. Rotation matrix constructed from “RotationRaw”
The construction of this direction vector, using SixenseInput and Unity.Mathf, can be
done with the following code:
rotVar.x = (SixenseInput.GetController (hand).RotationRaw.x);
rotVar.y = SixenseInput.GetController (hand).RotationRaw.y;
// Compute Direction Vector
dirVar.x = Mathf.Cos (rotVar.x*Mathf.PI)*Mathf.Sin (-rotVar.y*Mathf.PI);
dirVar.y = Mathf.Sin (rotVar.x*Mathf.PI);
dirVar.z = Mathf.Cos (rotVar.x*Mathf.PI)*Mathf.Cos (rotVar.y*Mathf.PI);
Where “hand” corresponds to “LEFT” or “RIGHT”, depending on which of the
Hydra's wands one aims to track.
In order to construct the parametric equation we also need to obtain the positional
data, which can be done with the following code:
// Get Controller Position and Rotation
posVar.x = SixenseInput.GetController (hand).PositionRaw.x*0.001f;
posVar.y = SixenseInput.GetController (hand).PositionRaw.y*0.001f;
posVar.z = -SixenseInput.GetController (hand).PositionRaw.z*0.001f;
rotVar.x = (SixenseInput.GetController (hand).RotationRaw.x);
rotVar.y = SixenseInput.GetController (hand).RotationRaw.y;
Finally, the parametric equation would be as follows, where (x0, y0, z0) is our point
of origin (from the Hydra's positional data) and (a, b, c) is our direction vector:
x = x0 + at    (4)
y = y0 + bt    (5)
z = z0 + ct    (6)
Where “t” can take any Real value.
Now, our goal is to obtain (x, y) at depth (z), which is known, as it is the TV screen's
depth. This would tell us where in the plane at said depth the Hydra wand's line of sight is
intersecting.
In order to do this, the value of “t” must be obtained. A trivial task, since the Hydra's
point of origin is known (and thus, “z0”), the direction vector is known as well (thus
knowing the value of “c”) and depth (z) is also known.
t = ( z - z0 ) / c    (7)
Simply solving equations (4) and (5) for values (x, y) will tell us our intersection
point at depth (z). The following code will take care of this task:
// Get intersection between plane and vector (Solve Parametric Equation for depth "screenDepth")
paramEquationT = (screenDeep - posVar.z)/dirVar.z;
intersectionPoint.y = posVar.y + dirVar.y*paramEquationT;
intersectionPoint.x = posVar.x + dirVar.x*paramEquationT;
Following the list at the end of the conceptual design, we have now obtained the
intersection point of the Hydra's line of sight at the TV screen's depth. Using this code, we
can now obtain the intersection points of the four corners of the screen. Through those
intersection points, we can create four equations for straight lines in a 2D coordinate space
that will correspond to the screen's borders at depth (z).
For any pair of points (x1, y1), (x2, y2), an equation for a straight line in a 2D
coordinate space can be constructed as follows, using the “general form”:
x (y2 - y1) - y (x2 - x1) = x1·y2 - x2·y1    (8)
Which we can then rewrite as either of these forms:
y = - ( x1·y2 - x2·y1 - x (y2 - y1) ) / (x2 - x1)    (9)
x = ( x1·y2 - x2·y1 + y (x2 - x1) ) / (y2 - y1)    (10)
Both of these forms will be useful, as we need to obtain the top and bottom vertical
boundaries, and the left and right horizontal boundaries.
Assuming that we know the (x,y) values of the four screen corners, where
“screenCorner1” is the top-left corner, “screenCorner2” is the top-right corner,
“screenCorner3” is the bottom-left corner and “screenCorner4” is the bottom-right corner
and screenAimPoint is the intersection at TV screen depth of our current aiming's line of
sight, we can find the maximum and minimum X and Y screen boundaries with the
following code, which uses formulas (9) and (10):
// Obtain current edges of the screen
maxY = -(((screenCorner1.x * screenCorner2.y) - (screenCorner2.x *
screenCorner1.y) - screenAimPoint.x*(screenCorner2.y - screenCorner1.y))/
(screenCorner2.x - screenCorner1.x));
minY = -(((screenCorner3.x * screenCorner4.y) - (screenCorner4.x *
screenCorner3.y) - screenAimPoint.x*(screenCorner4.y - screenCorner3.y))/
(screenCorner4.x - screenCorner3.x));
maxX = (((screenCorner2.x * screenCorner4.y) - (screenCorner4.x *
screenCorner2.y) + screenAimPoint.y*(screenCorner4.x - screenCorner2.x))/
(screenCorner4.y - screenCorner2.y));
minX = (((screenCorner1.x * screenCorner3.y) - (screenCorner3.x *
screenCorner1.y) + screenAimPoint.y*(screenCorner3.x - screenCorner1.x))/
(screenCorner3.y - screenCorner1.y));
Now that we know the maximum and minimum boundaries of the TV screen on the
X and Y axes at the horizontal and vertical spot that we are aiming at within the plane at
depth (z), all that is left is to find out at which vertical and horizontal percentage of the
screen we are aiming at, assuming that the minimum boundaries represent the 0%
horizontal and vertical positions of the screen, and the maximum boundaries represent the
100% positions of the screen.
// Compute at which percentage of the X and Y coordinates of the screen the pointer is
aimPointPercentX = (screenAimPoint.x - minX)/(maxX - minX);
aimPointPercentY = -(((screenAimPoint.y - minY)/(maxY - minY))-1);
Finally, we can multiply these percentages by the current screen
resolution, finding the pixel where we must place the tracking reticle.
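As a minimal illustration of this last step (the reticle variable name is assumed; Screen.width and Screen.height are Unity's current resolution in pixels):
// Convert the screen percentages computed above into a pixel coordinate.
reticlePixel.x = aimPointPercentX * Screen.width;
reticlePixel.y = aimPointPercentY * Screen.height;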
All the aforementioned calculations have been implemented in the
“CalibrationHandler.cs” script, which will serve as the main library, developed during this
project, that provides lightgun-like reticle tracking harnessing the particular functionalities
of the Hydra and the plugin provided by Sixense Entertainment for Unity3D.
4.2.1 Screen Depth Offset Correction
During exhaustive testing of the calibration algorithm, it was found that the on-screen
reticle was more sensitive to rotational movement than it should be. This was caused
by positional bias drift making the Hydra appear further away from the screen than it
should be.
Furthermore, the Hydra wand's center of positional measurement is not at the tip of
the wand, but at the center, which is approximately 5 to 6 centimeters away from the
front of the wand.
To solve this issue, a correcting offset has been added to the TV screen's depth
measurement, that makes the screen appear closer to the Hydra wand than the actual
measured distance by a fixed amount.
Through trial and error, an optimal adjustment of -150 millimeters (-50 of which
correspond to the Hydra wand's length from center to front) worked well at the Hydra's
recommended operating distance of 20 to 40 inches[22] on screens ranging from 15.6 to 32
inches.
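A minimal sketch of how such a correction can be applied (the names are illustrative and the sign of the adjustment depends on the depth-axis convention in use; the intent is that the stored depth places the screen roughly 150 millimeters closer to the wand than the raw measurement):
// Fold the fixed correction into the stored screen depth so every later plane
// intersection uses the corrected value.
const float screenDepthCorrection = 0.150f; // meters; ~50 mm of this is the wand's center-to-tip length
screenDepth = measuredScreenDepth - screenDepthCorrection;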
In the future, this problem may be exacerbated by the increased range of the
upcoming Sixense STEM System. However, a variable offset that depends on how far
away the user is from the STEM's base could be easily implemented through further trial-and-error testing.
4.2.2 GlovePIE Port of the Calibration Algorithm
Since the calibration algorithm relies mostly on mathematics, it is easily portable to
other languages and platforms that can read the Hydra's data.
In order to demonstrate this easy portability, the calibration algorithm has been
ported to the GlovePIE scripting language. Using the exact same mathematical formulas, it
allows the Hydra to control the Windows mouse pointer by aiming directly at the screen,
like a Lightgun reticle.
The complete code of this implementation is annexed to this document. However,
GlovePIE does not have native support for the Hydra, and no plugin is provided by Sixense
Entertainment.
A third party plugin, developed by Joshua Olson (known on GitHub as
“MrMormon”) and distributed on GitHub as “Hydra-OSC”, allows the Hydra to send its
data through the OpenSound Control protocol (a communications protocol meant for
musical devices capable of networking).
GlovePIE does natively support the receipt of messages through the OSC protocol,
allowing it to read the Hydra's data so long as “Hydra-OSC” is running and sending its
messages on the same port that GlovePIE is configured to listen for OSC messages on. Be
warned however that the OSC implementation of the latest versions of GlovePIE is
bugged. The last known version to support OSC correctly is GlovePIE 0.43.
The GlovePIE implementation of the Hydra calibration algorithm has already been
configured to receive OSC messages on port 7022, the port which “Hydra-OSC” sends its
messages through by default.
The data and calculations involved in this implementation are identical to those of the
Unity3D implementation. Thus, it also provides identical tracking accuracy.
4.3 First Person Shooter Prototype
Now that we have a library that provides reticle tracking, it is time to develop a
simple FPS (First Person Shooter) game prototype that showcases the functionalities of the
library within the main context in which it is most likely to be used.
Besides providing a simple playground in which aspiring developers may play-test
the functionalities of the library, it will also be built with stress-testing of both the Hydra's
capabilities and the calibration algorithm in mind, by implementing the challenges
proposed during the design stage.
The prototype will consist of three basic elements: Level geometry, gateway doors
and targets.
Level geometry will be non-interactive, and will simply provide a stage on which the player avatar can move, as well as physical boundaries so that the avatar may not leave the test area, complete challenges out of order or approach the final target, which would defeat the purpose of the test.
Gateway doors will separate the various challenges, and will remain closed until the
current challenge has been completed. Upon completion, the gateway will open, allowing
access to the next challenge.
Targets will be of three types:
– Stationary targets, which will have no special properties. Once they are shot, they will change color permanently, providing a visual indicator that they have already been shot before.
– Succession targets, which have to be shot in rapid succession until a goal total of targets has been shot. Their purpose is to force the player to aim in rapid succession with sudden, violent movements, so as to demonstrate that the Hydra's technology is impervious to shock resulting from said movements.
– Moving targets, which will act much like stationary targets, but while rapidly bouncing between two walls, forcing the player to track them as they shoot. A high degree of accuracy, as well as a low input latency, will be demonstrated thanks to these targets.
Because level geometry is to be non-interactive, and is to serve a very basic purpose,
a simple option is to construct extremely basic level geometry out of Unity
PrimitiveTypes: Basic geometrical shapes such as cubes, planes and spheres with a plain
color texture and no further functionality other than collision detection. These objects can
be easily resized from within the Unity editor by using the “Scale” property of the object's
“Transform” component, allowing, for instance, for construction of platforms, walls and
ceilings from basic cubes.
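As a purely illustrative sketch (not one of the project's scripts), the same kind of wall could also be created from code, using a cube primitive and the Transform's scale:
// Create a cube primitive and stretch it into a wall through the "Scale" property
// of its Transform; the position and size values below are arbitrary examples.
GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
wall.transform.position = new Vector3(0, 2, 10);
wall.transform.localScale = new Vector3(20, 4, 1);
// Plain color, no further functionality other than collision detection
wall.renderer.material.color = Color.gray;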
Figure 14. Level geometry constructed from Unity PrimitiveTypes
Gateways will serve a similarly basic purpose, so they can be constructed out of modified PrimitiveTypes as well, and then joined together in a single Unity prefab for simple reuse.
Figure 15. SlidingDoor prefab constructed from Unity PrimitiveTypes
A very simple script attached to the main door geometry can provide very basic
vertical sliding door functionality, raising this geometry relative to the rest of the prefab
when opening it, and lowering it again when closing it. Because sliding doors will remain
fixed at the same position they are placed in the editor, their “closed” position can simply
be their original position, making their “open” position the same, but with a small vertical
offset.
// Compute closed and open positions of the sliding door based on the current position
openPosition = this.transform.localPosition.y + 6;
closedPosition = this.transform.localPosition.y;
By using Unity's “Update” method, which is called once per frame at runtime, the
door can be moved upwards or downwards slightly during each frame until it has reached
its target position, giving the impression of a smooth movement.
Because of floating-point imprecision, it is generally not a good idea to compare Float types with “equal” operators. Thus, instead of checking whether or not the door has reached its target position exactly, the script checks whether or not the next increment would reach or overshoot the target.
void Update () {
    if (opening){
        if (this.transform.localPosition.y + 0.2f <= openPosition){
            this.transform.localPosition += new Vector3(0, 0.2f, 0);
        }
        else {
            opening = false;
            this.transform.localPosition = new Vector3(this.transform.localPosition.x,
                openPosition, this.transform.localPosition.z);
        }
    }
    else {
        if (closing){
            if (this.transform.localPosition.y - 0.2f >= closedPosition){
                this.transform.localPosition -= new Vector3(0, 0.2f, 0);
            }
            else {
                closing = false;
                this.transform.localPosition = new Vector3(this.transform.localPosition.x,
                    closedPosition, this.transform.localPosition.z);
            }
        }
    }
}
Sliding doors will automatically open once a certain quota of successfully hit targets has been met. This quota can be configured by the developer on a door-by-door basis thanks to a public variable in their script.
Figure 16. Configurable number of targets to hit for Sliding Doors to open
The sliding door's script will continuously compare this configurable number against a static, globally accessible variable provided (and increased) by the targets' script. Once the quota has been met, the script will automatically call its own “openDoor” method, which plays the door's opening sound and enables the logic found in the Update() method to raise the door.
if (TargetController.targetsHit == numTargets && !targetReached){
    openDoor();
    targetReached = true;
}

void openDoor () {
    this.audio.PlayOneShot(doorMoving);
    closing = false;
    opening = true;
}
Finally, the targets. Since their only visual requirement is to be able to change their
basic color (in order to be able to differentiate between hit and not-yet-hit targets), a
requirement met by Unity's PrimitiveTypes, these too will be built from these geometrical
building blocks (in particular, the Cylinder primitive).
Figure 17. Target prefab constructed from Unity PrimitiveTypes
Firstly, the targets' controller script provides a static, globally available variable that
other scripts can access to check the current quota of targets that have been successfully hit
by the player.
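The declaration in question, as it appears in the annexed TargetController.cs, is a single static field shared by every target instance:
// Static variable, accessible from other scripts, that indicates how many
// targets have been successfully shot in total
public static int targetsHit = 0;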
When targets are hit by the player, their “targetHit” function will trigger, playing
their “hit” sound and changing their status to the “already hit” status. However, if these
targets are Succession targets, their status will be restored within 0.75 seconds unless the
player is able to hit the remaining targets in time.
// Upon being hit, the target is "deactivated" into its secondary color
// permanently (or until the succession shots time frame expires)
public void targetHitDemo (){
    this.audio.PlayOneShot (targetHitSound);
    if (!hit){
        this.hit = true;
        this.gameObject.renderer.material.color = flashColor;
        targetsHit++;
        if (succession){
            Invoke ("statusRestore", 0.75f);
        }
    }
}
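The restoration itself is performed by the statusRestore method (also part of the annexed TargetController.cs), which undoes the hit if the quota has not been reached within the time frame:
// Restores the original, non-shot status and color
void statusRestore(){
    if (targetsHit < targetAmount){
        this.gameObject.renderer.material.color = firstColor;
        this.hit = false;
        targetsHit--;
    }
}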
If the target is a Moving target, it will continuously move by a set amount during
each Update() call, similar to how the sliding doors operate, so long as the target's
renderer is enabled.
if (moving && this.gameObject.renderer.enabled){
    this.transform.position += new Vector3 (moveAmount, 0, 0);
}
Moving targets are invisible until a certain quota of targets to hit has been met. Their Start() method (called when the object the script is attached to is loaded in the level) disables their renderer (making them invisible), and their Update() method continuously checks the total amount of hit targets. Once the quota has been met, their renderer is enabled again, and their movement logic starts.
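The corresponding excerpt from the annexed TargetController.cs shows this hide-and-reveal logic (the movement itself is handled by the fragment shown above):
// Moving targets start hidden and non-collidable...
void Start () {
    this.gameObject.renderer.material.color = firstColor;
    if (moving){
        this.gameObject.collider.enabled = false;
        this.gameObject.renderer.enabled = false;
    }
}

// ...and are revealed once the configured quota of hit targets has been met.
void Update () {
    if (targetsHit >= targetAmount){
        this.gameObject.collider.enabled = true;
        this.gameObject.renderer.enabled = true;
    }
    // (movement logic follows here, as shown above)
}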
Since moving targets are meant to bounce between walls, their moving direction will
change once they enter a collision.
// This is called only on collisions
void OnCollisionEnter (UnityEngine.Collision hit){
if (moving){
moveAmount = moveAmount * -1;
}
}
Much like sliding doors, the functionalities of the “TargetController.cs” script are
controlled by public variables, so they are configurable in the Unity component editor.
Figure 18. Configurable variables of the Target Controller script
Another component needed for the Target prefab to work is a component that will
provide collision detection from the Lightgun's shots. A prototype weapon controller,
which handles basic gunshot collision detection, serves this purpose.
The weapon controller's “shootWeapon” function plays a single instance of the gunshot sound and enables the gun's muzzle flash light source (a purely cosmetic feature). Then, it casts a ray (Unity's Physics.Raycast) from the main camera out into the game world. Unity provides a function (“Camera.ScreenPointToRay”) that builds a ray passing through any given point on the game screen, rather than only through the center of the camera's field of view. Thanks to this, a Raycast can be shot from the Lightgun's on-screen reticle position outwards into the game world.
Unity game objects can be categorized in “layers” (some of which are provided by
Unity3D, and some of which can be defined by the programmer). Thanks to this, Raycast
can be configured to ignore collisions with objects from certain layers.
In this case, targets, doors and level geometry were all placed in the Default layer,
which the Raycast will detect, as all of these elements are meant to be “physical” and be
able to stop bullets.
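As a small illustrative sketch (the project's exact declaration may differ), a layer mask covering only the Default layer, matching the targetLayer parameter used in the excerpt below, could be built like this:
// Build a layer mask so that Physics.Raycast only detects colliders placed
// in Unity's built-in "Default" layer.
int targetLayer = 1 << LayerMask.NameToLayer("Default");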
Physics.Raycast will report the first object, closest to the Raycast's origin, that the ray has collided with and that is categorized in whichever layers it has been configured to detect. Upon receiving said object, the weapon controller's script checks this object's tag.
Unity tags are similar to layers. However, Unity tags are not tied in functionality to
any Unity-provided functions (such as Physics.Raycast), and come in handy when trying
to categorize objects while guaranteeing that their behavior will not be altered in any other
way that has not been explicitly specified by the programmer.
In this instance, the user-defined tag “Target” was created and applied to all targets
in the game scene. If the object returned by Physics.Raycast has this “Target” tag, it is
assumed to be a Target prefab, and its “targetHit” function is called from within the
weapon controller script.
switch (curWeapon) {
    case 1:
        Camera.mainCamera.audio.PlayOneShot(gunshotWeapon1);
        muzzleFlash.enabled = true;
        Invoke ("toggleMuzzleFlash", 0.05f);
        if (Physics.Raycast (Camera.mainCamera.ScreenPointToRay(new Vector3(
                (CalibrationController.aimPointPercentX * Screen.width),
                ((1 - CalibrationController.aimPointPercentY) * Screen.height), 0)),
                out hit, 100, targetLayer)){
            otherObj = hit.collider.gameObject;
            if (otherObj.tag == "Target"){
                controller = (TargetController)otherObj.GetComponent(typeof(TargetController));
                controller.targetHitDemo();
            }
        }
        break;
}
4.4 SixenseInput Plugin Input Smoothing
The provided plugin enables filtering of the Hydra's input by default to make it appear smoother and more fluid. The downside of this, however, is that tracking becomes more approximate, and a very slight input delay is introduced. The plugin includes a function to disable this input filtering, and the CalibrationController script calls this function upon initialization. While the on-screen reticle experiences more jitter as a consequence (similar to other devices such as the Guncon 3), overall responsiveness and accuracy are noticeably improved.
// Disable the Sixense Plugin's input smoothing filter
SixensePlugin.sixenseSetFilterEnabled(0);
5. Testing
5.1 Calibration Logic Test
In order to test the calibration logic, one simply has to complete the Calibration
Handler's calibration steps successfully, as well as test whether or not the script pauses
and re-engages correctly when the Hydra is disconnected from the computer.
Upon detecting that the Hydra has been correctly connected, the calibration menu
will prompt you to press “START” on the right controller. This brings us to the first
calibration step, the screen depth:
Figure 19. Calibration Menu – Screen Depth
As the on-screen text says, this step requires the player to bring the Hydra as close to
the on-screen reticle as possible. There is no software control for this, as this distance is
entirely variable, so whether or not this step will be performed correctly depends entirely
on the user.
What follows is a series of four shots to be performed from the player's place of
play, one for each corner.
Figure 20. Calibration Menu – Screen Corners
Finally, reticle tracking will be enabled, and the end-user can test by himself
whether or not the calibration was satisfactory.
Figure 21. Calibration Menu – Test Calibration
If the player chooses to re-calibrate, the entire calibration procedure will re-start and
function the same as the first time. This procedure can be repeated until the user is
satisfied with the achieved degree of accuracy.
Accepting the calibration will automatically load the next scene, which is the
technical demonstration. The player can quit the calibration menu altogether by pressing
button “ONE” on the controller at the start of the calibration procedure, going back to the
main menu.
Figure 22. Main Menu
If at any time during the calibration the player disconnects the Hydra from the
computer, the calibration procedure pauses correctly and the following message is
displayed:
Figure 23. Calibration Menu – Hydra Unplugged
Upon re-connecting the Hydra, the calibration procedure will resume from wherever
it was at the point of disconnection, with no issues.
5.2 Calibration Performance Test
For the performance test, calibrations will be performed at an approximate distance of one meter away from the screen and Hydra base, with screen sizes of 15.6, 24 and 32 inches.
The results will be shown in picture format, with one picture of the Hydra aiming at the center of the screen, and another picture aiming near one of the corners.
5.2.1 15.6 Inches Results
Figure 24. 15.6 inch screen – center shot
Figure 25. 15.6 inch screen – corner shot
5.2.2 24 Inches Results
Figure 26. 24 inch screen – center shot
Figure 27. 24 inch screen – corner shot
5.2.3 32 Inches Results
Figure 28. 32 inch screen – center shot
Figure 29. 32 inch screen – corner shot
5.3 Prototype Logic Test
As the geometrical design of the test level forces every element of the prototype's scene logic to be triggered, the logic test will consist of simply playing through the technical demonstration and ensuring everything is triggered correctly.
For starters, upon loading the scene, reticle tracking is already enabled due to the
calibration scene which has already played before. However, the player can invoke the
calibration handler again simply by pushing the “START” button on the right controller.
This dims the screen, pauses the character's movement and brings up the calibration menu.
Figure 30. FPS Prototype – Calibration Menu
The calibration menu is exactly the same as the one found in the dedicated
calibration scene, including the option to quit back to the menu.
Unplugging the Hydra from the computer during the technical demonstration also dims the screen and pauses the character's movement, bringing up the same messages as in the calibration menu.
Figure 31. FPS Prototype – Hydra Unplugged
The first room consists of relatively ample movement space with a row of five
targets lined together. They can be shot in any order, and when fired upon, they correctly
play their sound effect and turn blue, indicating their “already shot” state.
Figure 32. FPS Prototype – Shooting targets
When all five targets have been shot, the sliding door opens, revealing the next area
of the demonstration.
Figure 33. FPS Prototype – Door opens
Shooting the player's weapon causes the muzzle flash light source to become visible
for a brief instant, lighting up the surrounding area. This flash is correctly attached to the
end of the player weapon at all times.
Figure 34. FPS Prototype – Muzzle flash
When moving the reticle across the screen, the player's weapon will follow. This
behavior is purely cosmetic.
Figure 35. FPS Prototype – Gun movement 1
Figure 36. FPS Prototype – Gun movement 2
Jumping over the wall right behind the next set of targets is not possible. Also, when
fired upon, these targets correctly revert back to their “not shot” state if their partner target
is not shot within 0.75 seconds. This behavior plays out correctly, regardless of the order
in which the targets are shot.
When both targets are fired upon within the correct time span, the next target
becomes visible.
Figure 37. FPS Prototype – Succession and moving targets
This target will immediately begin moving, and correctly changes directions as soon
as it comes in contact with either of the walls at both of its sides. Once the target is shot,
the next sliding door opens, revealing the last target.
Figure 38. FPS Prototype – Final target
Approaching this target further is not possible, thanks to the wall that separates it from the player. It can only be shot from a great distance, showcasing the accuracy of the Hydra even after sudden movements. When shot, this target lights up in blue like the others, and nothing more happens. At this point, the user is free to end the demonstration at any time from the calibration menu, which can be brought up with the “START” button on the right controller.
5.4 Prototype Challenges Test
The purpose of this test is simply to demonstrate that the Hydra is indeed unaffected by the challenges in the technical demonstration.
Using the same calibration on the 32 inch screen as before, both succession targets
are rapidly shot. The reticle still aligns correctly with the Hydra controller.
Figure 39. FPS Prototype Challenges – first shock effects
Upon shooting the moving target, the final target appears. Before shooting at it, we
align the Hydra controller with it. As expected, the aiming reticle also aligns itself with
the Hydra, and is correctly placed on top of the final target.
Figure 40. FPS Prototype Challenges – second shock effects
6. Conclusions
As expected, the Razer Hydra peripheral was able to provide remarkable tracking
accuracy with no major end-user inconveniences or technical issues such as those
experienced with other currently available motion control peripherals. Although the
calibration algorithm had to account for both rotational and positional bias drift
experienced by the Razer Hydra wands related to their position with regards to the Hydra
base, satisfactory results were achieved within Razer's optimal playing distance of 20 to 40
inches from the Hydra base.
Although the Razer Hydra suffers from a somewhat limited range due to its wired
nature, the next iteration of this Sixense magnetic tracking technology, the Sixense STEM,
is fully wireless, and even allows for the combination of several magnetic coil bases to
further enhance the operating range.
The Sixense STEM is a very promising platform for lightgun-like tracking and
games which are operated via on-screen cursors or aiming reticles, as it boasts enhanced
end-user convenience thanks to its wireless technology, as well as improved tracking
accuracy from all positions within operating range, and all angles.
This project was able to successfully produce a working calibration library for one of
the major 3D video game engines currently in use by aspiring developers worldwide,
Unity3D, as well as a technical demonstration in the form of a first person shooter
videogame prototype. These assets provide a solid foundation for any developer who
wishes to start developing a videogame which makes use of lightgun-style reticle tracking
through the use of a Razer Hydra or the upcoming Sixense STEM wireless platform.
Further, the calibration library's code is mostly based on simple mathematics, making
it very easily portable to any other platform and game engine, as demonstrated by the
simple porting of the Unity3D library to the GlovePIE scripting language.
In conclusion, the project's originally outlined objectives were successfully met,
providing a new functionality to the Razer Hydra and Sixense STEM platforms that is easy
to use by developers, includes a working example in one of the currently most popular
game genres for one of the most widely used videogame engines among aspiring
developers, and is easy to port to other game engines.
7. Resources
7.1 References
[1] http://en.wikipedia.org/wiki/Light_gun#Use_in_video_games 20/07/2014
[2] http://worldwide.espacenet.com/publicationDetails/biblio?CC=US&NR=4813682&KC=&FT=E 20/07/2014
[3] http://www.nindb.net/game/duck-hunt.html 20/07/2014
[4] http://en.wikipedia.org/wiki/Light_gun#Cathode_ray_timing 20/07/2014
[5] http://en.wikipedia.org/wiki/Ntsc#Lines_and_refresh_rate 20/07/2014
[6] http://en-americas-support.nintendo.com/app/answers/detail/a_id/2954/p/604 20/07/2014
[7] https://www.cs.cmu.edu/~15-821/CDROM/PAPERS/lee2008.pdf 20/07/2014
[8] http://www.wired.com/2011/11/skyward-sword-review/2/ 20/07/2014
[9] http://shelltoon.typepad.com/shelltoons_view/2010/07/razing-storm-arcadereview.html 20/07/2014
[10] http://en.wikipedia.org/wiki/PlayStation_Move#Technology 24/07/2014
[11] http://www.analog.com/en/pressrelease/May_09_2006_ADI_Nintendo_Collaboration/press.html 24/07/2014
[12] http://www.nintendoworldreport.com/news/11557/nintendo-and-pixart-team-up 24/07/2014
[13] http://arstechnica.com/gaming/2008/08/wii-motion-sensor/ 24/07/2014
[14] http://wiibrew.org/wiki/Wiimote#Optical_Characteristics 24/07/2014
[15] http://sensorwiki.org/doku.php/sensors/gyroscope#specifications 24/07/2014
[16] http://wiibrew.org/wiki/Wiimote#Accelerometer 24/07/2014
[17] http://www.develop-online.net/tools-and-tech/full-tech-specs-playstationmove/0116722 24/07/2014
[18] https://support.us.playstation.com/app/answers/detail/a_id/2092/~/playstation%C2%AEmove-setup 24/07/2014
[19] https://support.us.playstation.com/app/answers/detail/a_id/2100/~/playstation%C2%AEmove-calibration-faq 24/07/2014
[20] http://support.bandainamcogames.com/index.php?/Knowledgebase/Article/GetAttachment/5/82 24/07/2014
[21] http://sixense.com/razerhydra 24/07/2014
[22] http://dl.razerzone.com/master-guides/Hydra/HydraOMG-ENG.pdf 24/07/2014
7.2 Third Party Resources
• “SixenseUnityPlugin”, by Sixense Studios, freely distributed through the Unity Asset Store. Only redistributable as part of compiled Unity projects. Redistributed as part of the compiled Unity executable of the lightgun technical demonstration, but not included as part of the demonstration's Unity project.
http://u3d.as/content/sixense-studios/sixense-unity-plugin/
• “Hydra-OSC”, by Joshua Olson “MrMormon”, freely distributed through GitHub under no specific license, used for educational purposes only, and not distributed with this document.
https://github.com/MrMormon/hydra-osc
• “GlovePIE 0.43”, by Carl Kenner, freely distributed through GlovePIE.org under a license that forbids its use for military purposes or within the country of Israel. More information about its license can be found in the README.txt included in the GlovePIE download. Used for educational purposes only, and not distributed with this document.
http://glovepie.org/lpghjkwer.php
• “Unity Pro 4.5.3”, by Unity Technologies, freely distributed through unity3d.com under a free trial license that limits its use to 30 days. More information about its license can be found at https://unity3d.com/legal/eula. Used for educational purposes only, and not distributed with this document.
https://unity3d.com/unity/download
• Figure 1, picture of Wii Remote with safety jacket and strap, by Evan-Amos, freely distributed through Wikimedia Commons and released onto the Public Domain.
http://commons.wikimedia.org/wiki/File:Wiimote-Safety-First.jpg
• Figure 2, picture of a PlayStation Move Controller, by Evan-Amos, freely distributed through Wikimedia Commons under a Creative Commons Attribution-Share Alike 3.0 Unported license.
http://commons.wikimedia.org/wiki/File:PlayStation-Move-Controller.png
• Figure 3, picture of a PlayStation Eye camera, by Evan-Amos, freely distributed through Wikimedia Commons under a Creative Commons Attribution-Share Alike 3.0 Unported license.
http://commons.wikimedia.org/wiki/File:PlayStation-Eye.jpg
• Figure 4, picture of a Japanese Domestic Guncon 3 Lightgun, by Chinpokomon5, freely distributed through Wikimedia Commons under a Creative Commons Attribution-Share Alike 3.0 Unported license.
http://commons.wikimedia.org/wiki/File:Guncon-3.jpg
• Figure 5, picture of a Razer Hydra controller, by Razer, freely distributed through Wikimedia Commons under an Attribution license.
http://commons.wikimedia.org/wiki/File:Razer-Hydra-Motion-Controller.jpg
• Gunshot sound, GUN_FIRE-GoodSoundForYou-820112263.mp3, recorded by “GoodSoundForYou” and freely distributed through SoundBible under a Creative Commons Attribution 3.0 license.
http://soundbible.com/1998-Gun-Fire.html
• Target hit sound, Swords_Collide-Sound_Explorer-2015600826.mp3, recorded by “Sound Explorer” and freely distributed through SoundBible under a Creative Commons Attribution 3.0 license.
http://soundbible.com/1980-Swords-Collide.html
• Sliding door sound, Old Door Creaking-SoundBible.com-1197162460.mp3, recorded by “Stephan”, freely distributed through SoundBible and released onto the Public Domain.
http://soundbible.com/1362-Old-Door-Creaking.html
• Arm cannon geometry, modeled by Joseba Garcia Pablos, donated specifically to this project under a Creative Commons Attribution-Share Alike 3.0 Unported license (CC BY-SA 3.0).
8. Manuals
8.1 Installing Unity
The Unity3D free SDK (http://unity3d.com/unity/download) must be installed by simply following the on-screen prompts during the installation procedure. The following instructions correspond to the installation procedure of Unity3D version 4.5.2, the latest version available as of this document's writing.
Upon running the installation file and clicking “Next” and then “I Agree” on the first
two screens (the Welcome screen and the End-User License Agreement), it is necessary to
select which components of Unity will be installed.
Figure 41. Unity installation – Step 1
For the purpose of running and testing the components of this project, only the main
checkbox is strictly necessary. Other components may be installed without issue, but they
are not required.
Next, an install location must be chosen. No particular location is required, so any may be selected. To proceed, click “Install”.
Figure 42. Unity installation – Step 2
Once the installation is finished, a prompt will notify the user of this fact. A checkbox to run Unity automatically once this prompt is closed is checked by default. It is not necessary to run Unity yet; uncheck this box and click “Finish”.
Figure 43. Unity installation – Step 3
Figure 44. Unity installation – Step 4
8.2 Installing the Hydra drivers
Windows 7, 8 and 8.1 will automatically detect and install the correct drivers for the
Razer Hydra when it is connected to a PC with internet access. Upon connecting the
peripheral on one of the USB ports, an automatic driver installation prompt will appear. No
end-user input is required.
Figure 45. Hydra installation – Windows 8.1
Additional software can be downloaded from:
http://drivers.razersupport.com//index.php?
_m=downloads&_a=viewdownload&downloaditemid=678&nav=0,349,166,181
However, this software is not necessary.
As per the Hydra Master Guide, which can be found on the following link:
(http://dl.razerzone.com/master-guides/Hydra/HydraOMG-ENG.pdf), the Razer Hydra
base should be placed with the “Razer” icon facing outwards from the PC screen.
Additionally, for correct operation for the purposes of this project, the Hydra base should be placed as close in front of the screen as possible.
8.3 Setting up the technical demonstration in Unity
Now it is time to run Unity. After running the program, Unity will ask you to activate
your license. The premium version's features are required for this project, as they are
used by Sixense's basic plugin. Thus, select “Activate the 30 day free trial of Unity” and
press “OK”.
Figure 46. Unity set-up – Step 1
In order to activate a trial license for Unity, a personal Unity account is required. Create an account if you do not already have one. Otherwise, enter your log-in credentials, click “OK”, and then “Start using Unity”.
Figure 47. Unity set-up – Step 2
Figure 48. Unity set-up – Step 3
Finally, the Lightgun Project folder must be placed inside the computer's file system.
In Unity, click the “File” tab and select “Open Project” (you won't need to do this if you
are opening Unity for the first time, as the project window will open automatically). If
“The Lightgun Project” does not appear in the next window, select “Open Other” and
browse your file system until you find the Lightgun Project folder. Select it, and click
“Select Folder”.
Figure 49. Unity set-up – Open Project
Unity might give a warning that the project must be upgraded to the current version
of Unity in order to be opened. Select “Continue”.
Now that the project has been opened, any of the project's three scenes (main menu,
standalone calibration and FPS prototype) may be opened and tested separately. To open
them, select “File”, then “Open Scene”. Browse to the root of the “The Lightgun Project”
folder, and then open the “Assets” folder. Now, open the “Scenes” folder, and all three
scenes should be visible. The intended application flow starts with the “Main Menu.unity”
scene.
Because the Asset Store end-user license only allows for assets downloaded from the
Asset Store to be used in compiled products, the project does not contain any of the official
Sixense Plugin files. In order to run the technical demonstration from the Unity editor, it is
necessary to download the SixenseUnityPlugin files (available for free) from the Asset
Store and import them into the project.
First, open the SixenseUnityPlugin's Asset Store page. To do this, click “Window”
on the Unity editor's menu, and then click “Asset Store”. Then, search for
“SixenseUnityPlugin”, and click on the SixenseUnityPlugin asset. Then, on its page, select
“Download”.
Figure 50. Downloading SixenseUnityPlugin
You may be prompted to log in. Simply re-enter your Unity account credentials and accept the Asset Store terms of service.
Then, a package import window will appear. Select only the assets that are marked
on the following figure, and then click “Import”.
Figure 51. Importing SixenseUnityPlugin
Finally, these assets should now be part of the Project's file system. In order for the
different scenes to be able to read the Sixense Hydra's output, the SixenseInput.prefab
game object must be placed within each one of the three scenes.
To do this, simply navigate to the newly created “SixenseInput” folder in the
project's file viewer in the lower part of the Unity editor window, and then drag the
SixenseInput.prefab component into the scene's gameobject hierarchy list on the left part of
the editor window. Click “File”, then “Save Scene”, and do this for the remaining two
scenes of the project.
Figure 52. Adding the SixenseInput.prefab to the project's scenes
You can now test any scene. Once any of the scenes has been loaded, simply click
the “Play” button. It is recommended that you maximize the “Game” tab before playing
any of the scenes, for an optimal experience. To do this, you must first hold the left click
button on top of the “Game” tab and drag it out of the Unity window.
Figure 53. Unity set-up – Running the project
To stop the demonstration, simply hit the “Stop” button.
8.4 Using the pre-compiled Technical Demonstration build on Windows
Alternatively, you can use the already compiled “lightdemo.exe” executable, found
at the root of the project's filesystem, which includes all of these elements built-in, and is
ready for testing on Windows x86 and x64 operating systems. Simply run it, enter your
screen's resolution and uncheck the “windowed” option.
8.5 Using the GlovePIE implementation for Mouse emulation
At this point, the Hydra should already be installed and connected to the computer,
as per the instructions in section 8.2 of this document.
First, it is necessary to download Joshua “MrMormon” Olson's “Hydra-OSC” program, which allows us to read the Hydra's data through the OSC port-based protocol. It can be downloaded freely from its GitHub page (https://github.com/MrMormon/hydra-osc).
Then, GlovePIE version 0.43 without Emotiv support must be downloaded freely
from its download page at http://glovepie.org/lpghjkwer.php.
Neither program requires installation. Simply extract both downloads (in the form of
compressed archives) in separate folders. Then, navigate to the Hydra-OSC folder and run
“OSC.exe” while the Hydra controller is plugged in to the computer.
Once Hydra-OSC is running, navigate to the GlovePIE 0.43 folder and run
“GlovePIE.exe”.
In the GlovePIE window, click “File” and then “Open...”, then navigate to the
“GlovePIE RAZER HYDRA 2 PLAYERS FINAL.PIE” script file, which is included
alongside this documentation. (Alternatively, simply open Notepad, copy the contents of
the GlovePIE Hydra Script in section 9.15 of this document into a new text file and save it
with a .PIE extension).
Figure 54. GlovePIE setup
After opening the GlovePIE Hydra script, hit the “Run” button to execute it. Make
sure that you are holding the left Hydra wand. You can identify it by looking at the trigger
and bumper buttons: The left wand has “LT” and “LB” imprinted, whereas the right wand
has “RT” and “RB” imprinted.
The calibration procedure is similar to that of the Unity implementation: Place the
Hydra wand as close to the center of the screen as possible and press the “4” button on the
wand. Then, press the “2” button. The “2” button is necessary to go from one step to the
next, and must be pressed between each step. Finally, take aim at each of the four corners
(in the same order as the Unity implementation: Top left, top right, bottom left, bottom
right) and shoot at each with the “4” button, then press the “2” button before moving to the next shot.
Once the four calibration shots have been performed, the on-screen Windows mouse
will follow the Hydra's line of sight. Because it uses DirectInput to simulate mouse
movement in Mickeys, instead of simply drawing the cursor at certain on-screen pixel coordinates, this script can also be used to play most mouse-based PC games (including
lightgun-style rail shooters). The Hydra wand's trigger is mapped to the left mouse click,
its bumper is mapped to the right mouse click and its “1” button is mapped to the middle
mouse click.
To reset the calibration procedure, press the “3” and “4” buttons simultaneously.
This will also disable the mouse tracking, so it is recommended to reset the calibration
procedure before attempting to use the real computer mouse to stop the GlovePIE script.
9. Annexes
9.1 CalibrationController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Calibration controller handles the entire calibration menu flow, as
well as the tracking of the crosshair reticle after calibration.
/// </summary>
public class CalibrationController : MonoBehaviour {
// These are the current maximum and minimum values on the X and Y axes
of the trapezoidal model of the screen
float maxY;
float minY;
float maxX;
float minX;
// Where the sixense aiming intersects with the screen depth plane
Vector2 screenAimPoint;
// Calibration values are made static for persistence across Scenes
static public int calibrationStep = 0;
static float screenDepth;
static Vector2 screenCorner1;
static Vector2 screenCorner2;
static Vector2 screenCorner3;
static Vector2 screenCorner4;
static public bool reticleTracking = false;
// Percentage of the screen resolution where reticles should be drawn
static public float aimPointPercentY;
static public float aimPointPercentX;
64
// "T" component of the Parametric Equation
float paramEquationT;
// Calibration Aid
public Texture2D calCrosshairTexture;
// Aiming Crosshair
public Texture2D redCrosshairTexture;
// Generic Gunshot sound used only for sound feedback during
calibration
public AudioClip gunshot;
// Use this for initialization
void Start () {
// Disable the Sixense Plugin's input smoothing filter
SixensePlugin.sixenseSetFilterEnabled(0);
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (reticleTracking){
if (calibrationStep < 6){
// Cannot enable crosshair tracking if calibration is in
progress!
reticleTracking = false;
}
else{
// Compute where the sixense aiming intersects with the screen plane
screenAimPoint = calibrationShot(SixenseHands.RIGHT, screenDepth);
// Obtain current edges of the screen
maxY = -(((screenCorner1.x * screenCorner2.y) - (screenCorner2.x * screenCorner1.y) - screenAimPoint.x*(screenCorner2.y - screenCorner1.y))/(screenCorner2.x - screenCorner1.x));
minY = -(((screenCorner3.x * screenCorner4.y) - (screenCorner4.x * screenCorner3.y) - screenAimPoint.x*(screenCorner4.y - screenCorner3.y))/(screenCorner4.x - screenCorner3.x));
maxX = (((screenCorner2.x * screenCorner4.y) - (screenCorner4.x * screenCorner2.y) + screenAimPoint.y*(screenCorner4.x - screenCorner2.x))/(screenCorner4.y - screenCorner2.y));
minX = (((screenCorner1.x * screenCorner3.y) - (screenCorner3.x * screenCorner1.y) + screenAimPoint.y*(screenCorner3.x - screenCorner1.x))/(screenCorner3.y - screenCorner1.y));
// Compute at which percentage of the X and Y coordinates of the screen the pointer is
aimPointPercentX = (screenAimPoint.x - minX)/(maxX - minX);
aimPointPercentY = -(((screenAimPoint.y - minY)/(maxY - minY)) - 1);
}
}
if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown
(SixenseButtons.TRIGGER) && calibrationStep < 7){
Camera.mainCamera.audio.PlayOneShot(gunshot);
}
switch (calibrationStep){
// Calibration Step 1 - Controller Detection
case 0:
reticleTracking = false;
if (SixenseInput.GetController
(SixenseHands.RIGHT).GetButtonDown (SixenseButtons.START)){
calibrationStep = 1;
}
if (SixenseInput.GetController
(SixenseHands.RIGHT).GetButtonDown (SixenseButtons.ONE)){
Application.LoadLevel ("Main Menu");
}
break;
// Calibration Step 2 - Screen Position Detection
case 1:
    if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.START)){
        screenDepth = -(SixenseInput.GetController (SixenseHands.RIGHT).PositionRaw.z+150)*0.001f;
        //screenDepth = -150 * 0.001f;
        calibrationStep = 2;
    }
    break;
// Calibration Step 3 - Shoot at Screen Corner 1 - Top Left
case 2:
    if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TRIGGER)){
        screenCorner1 = calibrationShot (SixenseHands.RIGHT, screenDepth);
        calibrationStep = 3;
    }
    break;
// Calibration Step 4 - Shoot at Screen Corner 2 - Top Right
case 3:
    if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TRIGGER)){
        screenCorner2 = calibrationShot (SixenseHands.RIGHT, screenDepth);
        calibrationStep = 4;
    }
    break;
// Calibration Step 5 - Shoot at Screen Corner 3 - Bottom Left
case 4:
    if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TRIGGER)){
        screenCorner3 = calibrationShot (SixenseHands.RIGHT, screenDepth);
        calibrationStep = 5;
    }
    break;
// Calibration Step 6 - Shoot at Screen Corner 4 - Bottom Right
case 5:
    if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TRIGGER)){
        screenCorner4 = calibrationShot (SixenseHands.RIGHT, screenDepth);
        calibrationStep = 6;
    }
    break;
// Calibration Step 7 - Confirm Calibration?
case 6:
reticleTracking = true;
if (SixenseInput.GetController
(SixenseHands.RIGHT).GetButtonDown (SixenseButtons.START)){
calibrationStep = 7;
WeaponController.weaponsEnabled = true;
}
else {
if (SixenseInput.GetController
(SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TWO)){
calibrationStep = 0;
}
}
break;
// Calibration Step 8 - Pause and Recalibrate
case 7:
if (SixenseInput.GetController
(SixenseHands.RIGHT).GetButtonDown (SixenseButtons.START)){
WeaponController.weaponsEnabled = false;
calibrationStep = 0;
68
}
break;
}
}
catch (System.NullReferenceException){
}
}
}
void OnGUI () {
if (SixenseDetect.hydraEnabled){
try{
if (reticleTracking){
    GUI.DrawTexture(new Rect((aimPointPercentX*Screen.width) - (redCrosshairTexture.width/2),
        (aimPointPercentY*Screen.height) - (redCrosshairTexture.height/2),
        redCrosshairTexture.width, redCrosshairTexture.height), redCrosshairTexture);
}
switch (calibrationStep){
// Calibration Step 1 - Controller Detection
case 0:
GUI.Label(new Rect(Screen.width * 0.5f - 310f, Screen.height
* 0.5f - 10f,
1000f, 20f), "Now initiating Calibration process: Please
hit 'START' on the Red Lightgun (Right Controller), or hit 'ONE' to
quit");
break;
// Calibration Step 2 - Screen Position Detection
case 1:
GUI.Label(new Rect(Screen.width * 0.5f - 400f, Screen.height
* 0.65f,
1000f, 20f), "Point the controller at the screen and
place it as close as possible to the on-screen reticle (almost touching
the screen), then hit START.");
GUI.DrawTexture(new Rect((0.5f*Screen.width)(calCrosshairTexture.width/2), (0.5f*Screen.height)(calCrosshairTexture.height/2), calCrosshairTexture.width,
calCrosshairTexture.height), calCrosshairTexture);
69
break;
// Calibration Step 3 - Shoot at Screen Corner 1
case 2:
GUI.Label(new Rect(Screen.width * 0.5f - 300f, Screen.height
* 0.5f - 10f,
1000f, 20f), "Go back to your place of play. Then, aim at
the on-screen reticle and pull the trigger.");
GUI.DrawTexture(new Rect((0.0f*Screen.width)(calCrosshairTexture.width/2), (0.0f*Screen.height)(calCrosshairTexture.height/2), calCrosshairTexture.width,
calCrosshairTexture.height), calCrosshairTexture);
break;
// Calibration Step 4 - Shoot at Screen Corner 2
case 3:
GUI.Label(new Rect(Screen.width * 0.5f - 300f, Screen.height
* 0.5f - 10f,
1000f, 20f), "While still at your place of play, aim at
the new on-screen reticle and pull the trigger.");
GUI.DrawTexture(new Rect((1.0f*Screen.width)(calCrosshairTexture.width/2), (0.0f*Screen.height)(calCrosshairTexture.height/2), calCrosshairTexture.width,
calCrosshairTexture.height), calCrosshairTexture);
break;
// Calibration Step 5 - Shoot at Screen Corner 3
case 4:
GUI.Label(new Rect(Screen.width * 0.5f - 300f, Screen.height
* 0.5f - 10f,
1000f, 20f), "While still at your place of play, aim at
the new on-screen reticle and pull the trigger.");
GUI.DrawTexture(new Rect((0.0f*Screen.width)(calCrosshairTexture.width/2), (1.0f*Screen.height)(calCrosshairTexture.height/2), calCrosshairTexture.width,
calCrosshairTexture.height), calCrosshairTexture);
break;
// Calibration Step 6 - Shoot at Screen Corner 4
case 5:
GUI.Label(new Rect(Screen.width * 0.5f - 300f, Screen.height
* 0.5f - 10f,
1000f, 20f), "While still at your place of play, aim at
the new on-screen reticle and pull the trigger.");
GUI.DrawTexture(new Rect((1.0f*Screen.width)(calCrosshairTexture.width/2), (1.0f*Screen.height)-
70
(calCrosshairTexture.height/2), calCrosshairTexture.width,
calCrosshairTexture.height), calCrosshairTexture);
break;
// Calibration Step 7 - Confirm Calibration?
case 6:
GUI.Label(new Rect(Screen.width * 0.5f - 300f, Screen.height
* 0.5f - 10f,
1000f, 20f), "Is this calibration OK? (YES = Start // NO
= Button 2).");
break;
}
}
catch (System.NullReferenceException){
}
}
}
Vector2 calibrationShot (SixenseHands hand, float screenDeep) {
Vector2 screenCorner;
float paramEquationT;
Vector3 posVar, dirVar;
Vector2 rotVar;
// Get Controller Position and Rotation
posVar.x = SixenseInput.GetController (hand).PositionRaw.x*0.001f;
posVar.y = SixenseInput.GetController (hand).PositionRaw.y*0.001f;
posVar.z = -SixenseInput.GetController (hand).PositionRaw.z*0.001f;
// rotVar.x = Pitch, rotVar.y = Yaw
rotVar.x = (SixenseInput.GetController (hand).RotationRaw.x);
rotVar.y = SixenseInput.GetController (hand).RotationRaw.y;
// Compute Direction Vector
dirVar.x = Mathf.Cos (rotVar.x*Mathf.PI)*Mathf.Sin (rotVar.y*Mathf.PI);
dirVar.y = Mathf.Sin (rotVar.x*Mathf.PI);
dirVar.z = Mathf.Cos (rotVar.x*Mathf.PI)*Mathf.Cos
(rotVar.y*Mathf.PI);
71
// Get intersection between plane and vector (Solve Parametric
Equation for depth "screenDepth")
paramEquationT = (screenDeep - posVar.z)/dirVar.z;
screenCorner.y = posVar.y + dirVar.y*paramEquationT;
screenCorner.x = posVar.x + dirVar.x*paramEquationT;
return screenCorner;
}
}
72
9.2 CalibrationSceneController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Calibration scene controller provides the logic of the Calibration
scene outside of the calibration process itself.
/// </summary>
public class CalibrationSceneController : MonoBehaviour {
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
if (CalibrationController.calibrationStep == 7 &&
WeaponController.weaponsEnabled == true){
Application.LoadLevel("FPS Tech Demo");
}
}
}
73
9.3 CannonAngleController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Cannon angle controller angles the first person weapon so that it
follows the position of the on-screen crosshair reticle.
/// </summary>
/// <remarks>
/// To function correctly, this script has to be applied to the joint
that controls the angle of the weapon geometry. It also needs
/// the CalibrationHandler prefab (or the CalibrationController script)
to be present in the scene.
/// </remarks>
public class CannonAngleController : MonoBehaviour {
private Vector3 baseAngle;
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
// Use the Main Camera's local rotation as the base rotation of the
Arm Cannon
baseAngle = Camera.mainCamera.transform.localEulerAngles;
// Adjustments so that the local (relative to parent objects!)
rotation, as applied to the Arm Cannon, makes it face the camera's
direction
baseAngle += new Vector3(0, 45, -20);
// Do not apply crosshair-based rotation if crosshair exceeds the
Screen bounds
if (WeaponController.weaponsEnabled &&
CalibrationController.aimPointPercentX > 0 &&
74
CalibrationController.aimPointPercentX < 1 &&
CalibrationController.aimPointPercentY > 0 &&
CalibrationController.aimPointPercentY < 1){
this.transform.localEulerAngles = new Vector3(0, baseAngle.y,
baseAngle.z);
this.transform.localEulerAngles += new Vector3 (0,
90*CalibrationController.aimPointPercentX,
45*CalibrationController.aimPointPercentY);
}
}
}
75
9.4 CharacterRotationController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Character rotation controller rotates the Player Character prefab
horizontally using the Sixense's right joystick.
/// This, in turn, also rotates its child Camera component horizontally.
/// </summary>
/// <remarks>
/// Use in conjunction with JoysticLookSixense.cs to achieve camera
rotation in both X and Y axes.
/// </remarks>
public class CharacterRotationController : MonoBehaviour {
// Amount by which the character will rotate, in Euler Angles
private float angH;
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (WeaponController.weaponsEnabled) {
// Joystick deadzone of 50%
if (SixenseInput.GetController(SixenseHands.RIGHT).JoystickX >=
0.80f || SixenseInput.GetController(SixenseHands.RIGHT).JoystickX <=
-0.80f){
angH =
SixenseInput.GetController(SixenseHands.RIGHT).JoystickX * 2;
76
this.transform.eulerAngles += new Vector3(0, angH, 0);
}
}
}
catch (System.NullReferenceException){
}
}
}
}
77
9.5 ColorChanger.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Color changer is a simple script that changes the color of basic
gameobjects. It also supports a variable transparency amount.
/// </summary>
public class ColorChanger : MonoBehaviour {
public Color color;
public float alpha;
// Use this for initialization
void Start () {
this.renderer.material.color = new Color (color.r, color.g, color.b,
alpha);
}
// Update is called once per frame
void Update () {
}
}
78
9.6 FPSInputControllerSixense.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
// Based on the FPSInputController.js Unity Standard Asset, written in
JavaScript
//
using UnityEngine;
using System.Collections;
/// <summary>
/// FPS input controller sixense is a C# translation of the
FPSInputController Standard Asset, adapted to use the Sixense as its
input method.
/// </summary>
public class FPSInputControllerSixense : MonoBehaviour {
private CharacterMotor motor;
private Vector3 directionVector;
private float directionLength;
// Use this for initialization
void Start () {
motor = (CharacterMotor)GetComponent("CharacterMotor");
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (WeaponController.weaponsEnabled) {
// Joystick deadzone of 50%
if (SixenseInput.GetController(SixenseHands.LEFT).JoystickX >=
0.80f || SixenseInput.GetController(SixenseHands.LEFT).JoystickX <=
-0.80f || SixenseInput.GetController(SixenseHands.LEFT).JoystickY >=
79
0.80f || SixenseInput.GetController(SixenseHands.LEFT).JoystickY <=
-0.80f ){
// Get the input vector from kayboard or analog stick
directionVector = new
Vector3(SixenseInput.GetController(SixenseHands.LEFT).JoystickX, 0,
SixenseInput.GetController(SixenseHands.LEFT).JoystickY);
}
else {
directionVector = Vector3.zero;
}
if (directionVector != Vector3.zero) {
// Get the length of the directon vector and then normalize
it
// Dividing by the length is cheaper than normalizing when we
already have the length anyway
directionLength = directionVector.magnitude;
directionVector = directionVector / directionLength;
// Make sure the length is no bigger than 1
directionLength = Mathf.Min(1, directionLength);
// Make the input vector more sensitive towards the extremes
and less sensitive in the middle
// This makes it easier to control slow speeds when using
analog sticks
directionLength = directionLength * directionLength;
// Multiply the normalized direction vector by the modified
length
directionVector = directionVector * directionLength;
}
motor.inputJump =
SixenseInput.GetController(SixenseHands.RIGHT).GetButtonDown(SixenseButto
ns.ONE);
80
// Apply the direction to the CharacterMotor
motor.inputMoveDirection = this.transform.rotation *
directionVector;
}
else{
directionVector = Vector3.zero;
}
}
catch (System.NullReferenceException){
directionVector = Vector3.zero;
}
}
// Apply the direction to the CharacterMotor
motor.inputMoveDirection = this.transform.rotation * directionVector;
}
}
81
9.7 JoystickLookSixense.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Joystick look sixense rotates the attached Camera vertically, using
the Sixense right stick.
/// </summary>
/// <remarks>
/// Use in conjunction with CharacterRotationController.cs for rotation
in both X and Y axes.
/// </remarks>
public class JoystickLookSixense : MonoBehaviour {
// Amount by which the camera will rotate, in Euler Angles
private float angV;
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (WeaponController.weaponsEnabled) {
// Joystick deadzone of 20%
if (SixenseInput.GetController(SixenseHands.RIGHT).JoystickY >=
0.20f || SixenseInput.GetController(SixenseHands.RIGHT).JoystickY <=
-0.20f ){
angV =
-SixenseInput.GetController(SixenseHands.RIGHT).JoystickY * 2;
if (this.transform.eulerAngles.x + angV < 80 ||
this.transform.eulerAngles.x + angV > 280){
82
this.transform.localEulerAngles += new Vector3(angV, 0,
0);
}
}
}
}
catch (System.NullReferenceException){
}
}
}
}
83
9.8 MainMenuSceneController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Main menu scene controller provides the basic scene logic of the Main
Menu scene.
/// </summary>
public class MainMenuSceneController : MonoBehaviour {
// Use this for initialization
void Start () {
CalibrationController.calibrationStep = 0;
WeaponController.weaponsEnabled = false;
TargetController.targetsHit = 0;
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown
(SixenseButtons.START)){
Application.LoadLevel ("Calibration Testing");
}
if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown
(SixenseButtons.ONE)){
Application.Quit();
}
}
84
catch (System.NullReferenceException){
}
}
}
}
85
9.9 PausePlaneController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Pause plane controller toggles the visibility of the semi-transparent
Pause Plane, which dims the main camera's view of the scene when it is
paused.
/// </summary>
public class PausePlaneController : MonoBehaviour {
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
if (!SixenseDetect.hydraEnabled || !WeaponController.weaponsEnabled){
this.renderer.enabled = true;
}
else {
this.renderer.enabled = false;
}
}
}
86
9.10 SixenseDetect.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// SixenseDetect detects whether or not the hydra is in a state of
readiness for being used. This means the base must be plugged in, and
both controllers bound.
/// </summary>
public class SixenseDetect : MonoBehaviour {
// hydraEnabled is used by other scripts to pause the game logic when
the hydra is disconnected or otherwise not ready to be used (IE: Unbound
controllers)
static public bool hydraEnabled = false;
protected bool guiConnectHydra = true;
// Use this for initialization
void Start () {
}
// Update is called once per frame
void Update () {
try{
if (!SixenseInput.IsBaseConnected (0)) {
hydraEnabled = false;
}
else{
guiConnectHydra = false;
// Both wands must be bound for the Hydra to be considered ready
if (!SixenseInput.GetController(SixenseHands.LEFT).Enabled ||
    !SixenseInput.GetController(SixenseHands.RIGHT).Enabled){
hydraEnabled = false;
}
else{
87
hydraEnabled = true;
}
}
}
catch(System.NullReferenceException e){
hydraEnabled = false;
guiConnectHydra = false;
}
}
void OnGUI () {
if (guiConnectHydra) {
GUI.Label (new Rect (Screen.width * 0.5f - 300f, Screen.height *
0.5f - 10f, 1000f, 20f), "Please connect the Hydra base to the
computer");
}
}
}
88
9.11 SlidingDoorController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Sliding door controller enables the Sliding Door prefab's door to
slide into open and closed position upon reaching a certain
/// number of shot Targets.
/// </summary>
/// <remarks>
/// The number of shot Targets is increased by the WeaponController.cs
and TargetController.cs scripts, as well as the Target prefab.
/// </remarks>
public class SlidingDoorController : MonoBehaviour {
protected bool closing = false;
protected bool opening = false;
protected float openPosition;
protected float closedPosition;
protected bool targetReached = false;
// Audio clip to play when the door opens or closes
public AudioClip doorMoving;
// Number of targets to shoot so the door opens
public int numTargets;
// Use this for initialization
void Start () {
// Compute closed and open positions of the sliding door based on the
current position
openPosition = this.transform.localPosition.y + 6;
closedPosition = this.transform.localPosition.y;
}
89
// Update is called once per frame
void Update () {
if (TargetController.targetsHit == numTargets && !targetReached){
openDoor();
targetReached = true;
}
if (opening){
if (this.transform.localPosition.y + 0.2f <= openPosition){
this.transform.localPosition += new Vector3(0,0.2f,0);
}
else {
opening = false;
this.transform.localPosition = new
Vector3(this.transform.localPosition.x, openPosition,
this.transform.localPosition.z);
}
}
else {
if (closing){
if (this.transform.localPosition.y - 0.2f >= closedPosition){
this.transform.localPosition -= new Vector3(0,0.2f,0);
}
else {
closing = false;
this.transform.localPosition = new
Vector3(this.transform.localPosition.x, closedPosition,
this.transform.localPosition.z);
}
}
}
}
void openDoor () {
this.audio.PlayOneShot(doorMoving);
closing = false;
90
opening = true;
}
void closeDoor () {
this.audio.PlayOneShot(doorMoving);
opening = false;
closing = true;
}
}
91
9.12 TargetController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Target controller provides several functions to the Target prefab:
Being hit and playing a hit sound, being able to move bouncing between
two walls and
/// being set-up so that several targets need to be shot in rapid
succession.
/// </summary>
public class TargetController : MonoBehaviour {
// Static variable, accessible from other scripts, that indicates how
many targets have been successfully shot in total
public static int targetsHit = 0;
// Sound clip of the Targets being hit
public AudioClip targetHitSound;
// Starting color and secondary color. The secondary color is used as either a flashing visual feedback to indicate a hit,
// or as a permanent color change to indicate the target has already been shot before.
public Color firstColor;
public Color flashColor;
// Configure target so that it needs to be shot in rapid succession along with other targets until targetAmount has been reached
public bool succession;
// Configure target so that it appears once targetAmount has been reached, and it moves back and forth between two walls
public bool moving;
public int targetAmount;
// Movement speed for "moving" configuration flag
private float moveAmount = 0.3f;
// Denotes whether or not this target has already been shot
private bool hit = false;
// Use this for initialization
void Start () {
this.gameObject.renderer.material.color = firstColor;
if (moving){
this.gameObject.collider.enabled = false;
this.gameObject.renderer.enabled = false;
}
}
// Update is called once per frame
void Update () {
if (targetsHit >= targetAmount){
this.gameObject.collider.enabled = true;
this.gameObject.renderer.enabled = true;
}
if (moving && this.gameObject.renderer.enabled){
this.transform.position += new Vector3 (moveAmount, 0, 0);
}
}
// This is called only on collisions
void OnCollisionEnter (UnityEngine.Collision hit){
if (moving){
moveAmount = moveAmount * -1;
}
}
// Upon being hit, the target flashes briefly in its secondary color
public void targetHit (){
Camera.mainCamera.audio.PlayOneShot(targetHitSound);
this.gameObject.renderer.material.color = flashColor;
Invoke("colorRestore", 0.1f);
}
// Upon being hit, the target is "deactivated" into its secondary color permanently (or until the succession shots time frame expires)
public void targetHitDemo (){
this.audio.PlayOneShot (targetHitSound);
if (!hit){
this.hit = true;
this.gameObject.renderer.material.color = flashColor;
targetsHit++;
if (succession){
Invoke ("statusRestore", 0.75f);
}
}
}
// Restores the original color
void colorRestore(){
this.gameObject.renderer.material.color = firstColor;
}
// Restores the original, non-shot status and color
void statusRestore(){
if (targetsHit < targetAmount){
this.gameObject.renderer.material.color = firstColor;
this.hit = false;
targetsHit--;
}
}
}
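Note that targetsHit is a static field, so its value persists across scene loads for the whole play session. A hypothetical reset hook (not part of the tech demo) attached to any object in a level would simply clear it on startup:

using UnityEngine;

// Hypothetical helper: clears the shared hit counter when the level starts,
// since static fields are not reset when a new scene is loaded.
public class TargetCounterReset : MonoBehaviour {
    void Start () {
        TargetController.targetsHit = 0;
    }
}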
9.13 WeaponController.cs
//
// Author: Juan Javier Lopez Reyes, "Rukumouru"
//
using UnityEngine;
using System.Collections;
/// <summary>
/// Weapon controller provides basic weapon switching/cycling functionality (unused in this tech demo) and handles the weapon shooting.
/// </summary>
public class WeaponController : MonoBehaviour {
static public bool weaponsEnabled = false;
static public int currentWeapon = 1;
// muzzle flash for weapon shots
public Light muzzleFlash;
// Gunshot sound for Weapon 1
public AudioClip gunshotWeapon1;
// Use this for initialization
void Start () {
// Hide the muzzle flash light source
muzzleFlash.enabled = false;
}
// Update is called once per frame
void Update () {
if (SixenseDetect.hydraEnabled){
try{
if (weaponsEnabled) {
if (CalibrationController.reticleTracking == false){
weaponsEnabled = false;
}
else{
// Weapon Switching
if (SixenseInput.GetController(SixenseHands.RIGHT).GetButtonDown(SixenseButtons.TWO)){
currentWeapon++;
// Weapon Cycling
if (currentWeapon > 1){
currentWeapon = 1;
}
}
// Weapon Shooting
if (SixenseInput.GetController (SixenseHands.RIGHT).GetButtonDown (SixenseButtons.TRIGGER)){
shootWeapon(currentWeapon);
}
}
}
}
catch (System.NullReferenceException){
}
}
}
// Handles the basic weapon shooting by using Physics.Raycast and the TargetController's targetHitDemo method
void shootWeapon(int curWeapon){
RaycastHit hit;
GameObject otherObj;
TargetController controller;
int targetLayer = 1 << 0;
switch (curWeapon) {
case 1:
Camera.mainCamera.audio.PlayOneShot(gunshotWeapon1);
muzzleFlash.enabled = true;
Invoke ("toggleMuzzleFlash", 0.05f);
if (Physics.Raycast (Camera.mainCamera.ScreenPointToRay(new Vector3((CalibrationController.aimPointPercentX * Screen.width), ((1 - CalibrationController.aimPointPercentY) * Screen.height), 0)), out hit, 100, targetLayer)){
otherObj = hit.collider.gameObject;
if (otherObj.tag == "Target"){
controller = (TargetController)otherObj.GetComponent(typeof(TargetController));
controller.targetHitDemo();
}
}
break;
}
}
void toggleMuzzleFlash(){
muzzleFlash.enabled = false;
}
}
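For reference, the raycast in shootWeapon converts the calibration output into Unity screen coordinates with the following mapping, where $p_x$ and $p_y$ stand for CalibrationController.aimPointPercentX and aimPointPercentY and $W \times H$ is the screen resolution (this is simply a restatement of the expression in the listing above):

$$x_{pixel} = p_x \cdot W, \qquad y_{pixel} = (1 - p_y) \cdot H$$

The vertical component is flipped because the calibration percentage is measured from the top of the screen downwards (the same top-down convention used by the mouse output in the GlovePIE scripts of sections 9.14 and 9.15), whereas Unity's screen-space origin lies at the bottom-left corner.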
9.14 GlovePIE Wiimote Script
debug = "Step 1: "+Var.step1+" - Step 2: "+Var.step2+" - Donestep:
"+Var.donestep+" - Step 3: "+Var.step3+" - Current X,Y: "+
(wiimote1.dot1x+wiimote1.dot2x)+","+(wiimote1.dot1y+wiimote1.dot2y)+" MaxX/MinX & MaxY/MinY: "+Var.left+"/"+Var.right+" &
"+Var.bottom+"/"+Var.top+" - LIMIT: "+Var.limit
wiimote1.Led4 = (wiimote1.dot1vis || wiimote1.dot2vis)
Var.resX = 640
Var.resY = 448
if (wiimote1.B && Var.step1 == 0 && Var.donestep == 0)
Var.left = (wiimote1.dot1x + wiimote1.dot2x)
Var.top = (wiimote1.dot1y + wiimote1.dot2y)
Var.donestep = 1
Var.step1 = 1
end if
if (wiimote1.Home)
Var.donestep = 0
endif
if (wiimote1.B)
wiimote1.rumble = 1
endif
if (!wiimote1.B)
wiimote1.rumble = 0
endif
wiimote.Led2 = ((Var.step1 == 1 && Var.step2 == 0) || Var.step3)
wiimote.Led3 = (Var.step2)
if (wiimote1.B && Var.step1 == 1 && Var.step2 == 0 && Var.donestep == 0)
Var.right = (wiimote1.dot1x + wiimote1.dot2x)
Var.bottom = (wiimote1.dot1y + wiimote1.dot2y)
Var.donestep = 1
Var.step2 = 1
end if
if (wiimote1.B && Var.step1 == 1 && Var.step2 == 1 && Var.step3 == 0 && Var.donestep == 0)
Var.centerX = (wiimote1.dot1x + wiimote1.dot2x)
Var.centerY = (wiimote1.dot1y + wiimote1.dot2y)
Var.left = (Var.left + ((Var.left - Var.centerX) * 0.33))
Var.right = (Var.right - ((Var.centerX - Var.right) * 0.33))
Var.top = (Var.top - ((Var.centerY - Var.top) * 0.33))
Var.bottom = (Var.bottom + ((Var.bottom - Var.centerY) * 0.33))
Var.donestep = 1
Var.step3 = 1
end if
if (wiimote1.Plus + wiimote1.Minus)
Var.step1 = 0
Var.step2 = 0
Var.step3 = 0
Var.donestep = 0
end if
if ((Var.step1 == 1) && (Var.step2 == 1) && (Var.step3 == 1) && Var.donestep == 0 && (wiimote1.dot1vis || wiimote1.dot2vis) && (!Var.limit))
//mouse.DirectInputX = (1-((wiimote1.dot1x + wiimote1.dot2x)-Var.right)/(Var.left - Var.right))*Var.resX
//mouse.DirectInputY = (((wiimote1.dot1y + wiimote1.dot2y)-Var.top)/(Var.bottom - Var.top))*Var.resY
PPJoy1.Analog0 = (((1-((wiimote1.dot1x + wiimote1.dot2x)-Var.right)/(Var.left - Var.right)) * 2)-1) * 1.085
PPJoy1.Analog1 = (((((wiimote1.dot1y + wiimote1.dot2y)-Var.top)/(Var.bottom - Var.top)) * 2 )- 1) * 0.75
end if
Var.limit = (!wiimote1.2) && ( ((wiimote1.dot1x + wiimote1.dot2x) >= (Var.left * 0.99)) || ((wiimote1.dot1x + wiimote1.dot2x) <= (Var.right * 1.01)) || ((wiimote1.dot1y + wiimote1.dot2y) >= (Var.bottom * 0.995)) || ((wiimote1.dot1y + wiimote1.dot2y) <= (Var.top * 1.01)) )
PPJoy1.Digital0 = wiimote1.B
PPJoy1.Digital1 = wiimote1.A
//mouse.LeftButton = wiimote1.B
//mouse.RightButton = wiimote1.A
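The two PPJoy assignments near the end of the script map the summed IR dot coordinates from the calibrated range into the virtual joystick's [-1, 1] range, with the factors 1.085 and 0.75 acting as additional scale corrections. Writing $x$ and $y$ for the summed dot coordinates (the symbols here are introduced only for readability), the assignments are equivalent to:

$$\mathrm{Analog0} = \left(2\left(1 - \frac{x - \mathrm{right}}{\mathrm{left} - \mathrm{right}}\right) - 1\right) \cdot 1.085, \qquad \mathrm{Analog1} = \left(2\,\frac{y - \mathrm{top}}{\mathrm{bottom} - \mathrm{top}} - 1\right) \cdot 0.75$$

The horizontal term is inverted (one minus the ratio) because the IR camera sees a mirrored image of the sensor bar: as the script's own debug line indicates, Var.left holds the maximum raw X value and Var.right the minimum, so aiming towards the right edge of the screen moves the dots towards smaller raw X values.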
9.15 GlovePIE Hydra Script
OSC.listening = true
OSC.ListenPort = 7022
// Enter your screen's current resolution
Var.resX = 1920
Var.resY = 1080
debug = Var.RPosX+" --- "+Var.RPosY+" --- "+Var.RPosZ+" --- "+Var.AimPointPercentX+" // "+Var.AimPointPercentY+" --- "+Var.RPosX2+" --- "+Var.RPosY2+" --- "+Var.RPosZ2
Var.RPosX = OSC.l.p.x * 0.001 // X component of Right Position (meters)
Var.RPosY = OSC.l.p.y * 0.001 // Y component of Right Position (meters)
Var.RPosZ = -(OSC.l.p.z * 0.001) // Z component of Right Position (meters)
Var.RYaw = OSC.l.r.x // Yaw component of Right Rotation
Var.RPitch = OSC.l.r.y // Pitch component of Right Rotation
Var.RDirX = (Math.cos(Var.RYaw*Math.pi)) * (Math.sin(Var.RPitch*Math.pi))
Var.RDirY = (Math.sin(Var.RYaw*Math.pi))
Var.RDirZ = (Math.cos(Var.RYaw*Math.pi)) * (Math.cos(Var.RPitch*Math.pi))
Var.ParamEquationT = (Var.ScreenDepth - Var.RPosZ)/Var.RDirZ
Var.ScreenCornerY = Var.RPosY + Var.RDirY * Var.ParamEquationT
Var.ScreenCornerX = Var.RPosX + Var.RDirX * Var.ParamEquationT
// Compute the screen edges
Var.MaxY = -(((Var.ScreenTopLeftX * Var.ScreenTopRightY) - (Var.ScreenTopRightX * Var.ScreenTopLeftY) - Var.ScreenCornerX*(Var.ScreenTopRightY - Var.ScreenTopLeftY))/(Var.ScreenTopRightX - Var.ScreenTopLeftX))
Var.MinY = -(((Var.ScreenBottomLeftX * Var.ScreenBottomRightY) - (Var.ScreenBottomRightX * Var.ScreenBottomLeftY) - Var.ScreenCornerX*(Var.ScreenBottomRightY - Var.ScreenBottomLeftY))/(Var.ScreenBottomRightX - Var.ScreenBottomLeftX))
Var.MaxX = (((Var.ScreenTopRightX * Var.ScreenBottomRightY) - (Var.ScreenBottomRightX * Var.ScreenTopRightY) + Var.ScreenCornerY*(Var.ScreenBottomRightX - Var.ScreenTopRightX))/(Var.ScreenBottomRightY - Var.ScreenTopRightY))
Var.MinX = (((Var.ScreenTopLeftX * Var.ScreenBottomLeftY) - (Var.ScreenBottomLeftX * Var.ScreenTopLeftY) + Var.ScreenCornerY*(Var.ScreenBottomLeftX - Var.ScreenTopLeftX))/(Var.ScreenBottomLeftY - Var.ScreenTopLeftY))
// Percentage of screen where the aimpoint is
Var.AimPointPercentX = (Var.ScreenCornerX - Var.MinX)/(Var.MaxX - Var.MinX)
Var.AimPointPercentY = -(((Var.ScreenCornerY - Var.MinY)/(Var.MaxY - Var.MinY))-1)
// Find Screen Depth
if (OSC.l.b.4 && Var.step == 0 && Var.donestep == 0)
Var.ScreenDepthRaw = Var.RPosZ // Take current screen depth
Var.ScreenDepth = Var.ScreenDepthRaw + (150*0.001) // Adjust for Screen Depth Offset caused by Bias Drift
Var.donestep = 1
Var.step = 1
end if
if (OSC.l.b.2)
Var.donestep = 0
endif
// Find Top Left Corner
if (OSC.l.b.4 && Var.step == 1 && Var.donestep == 0)
Var.ScreenTopLeftY = (Var.ScreenCornerY)
Var.ScreenTopLeftX = (Var.ScreenCornerX)
Var.donestep = 1
Var.step = 2
end if
// Find Top Right Corner
if (OSC.l.b.4 && Var.step == 2 && Var.donestep == 0)
Var.ScreenTopRightY = (Var.ScreenCornerY)
Var.ScreenTopRightX = (Var.ScreenCornerX)
Var.donestep = 1
Var.step = 3
end if
// Find Bottom Left Corner
if (OSC.l.b.4 && Var.step == 3 && Var.donestep == 0)
Var.ScreenBottomLeftY = (Var.ScreenCornerY)
Var.ScreenBottomLeftX = (Var.ScreenCornerX)
Var.donestep = 1
Var.step = 4
end if
// Find Bottom Right Corner
if (OSC.l.b.4 && Var.step == 4 && Var.donestep == 0)
Var.ScreenBottomRightY = (Var.ScreenCornerY)
Var.ScreenBottomRightX = (Var.ScreenCornerX)
Var.donestep = 1
Var.step = 5
end if
if (OSC.l.b.3 && OSC.l.b.4)
Var.step = 0
Var.donestep = 0
end if
if ((Var.step == 5) && Var.donestep == 0)
if (Var.AimPointPercentX >= 0 && Var.AimPointPercentX <= 1 && Var.AimPointPercentY >= 0 && Var.AimPointPercentY <= 1)
mouse1.DirectInputX = Var.ResX * Var.AimPointPercentX
mouse1.DirectInputY = Var.ResY * Var.AimPointPercentY
//PPJoy1.Analog0 = ((Var.AimPointPercentX*2) -1)-0.12
//PPJoy1.Analog1 = ((Var.AimPointPercentY*2) -1)-0.2
end if
mouse1.LeftButton = OSC.l.t.t
mouse1.MiddleButton = OSC.l.b.1
mouse1.RightButton = OSC.l.b.b
//PPJoy1.Digital0 = OSC.l.t.t
//PPJoy1.Digital1 = OSC.l.b.1
//PPJoy1.Digital2 = OSC.l.b.2
end if
// LIGHTGUN 2
Var.RPosX2 = OSC.r.p.x * 0.001 // X component of Right Position (meters)
Var.RPosY2 = OSC.r.p.y * 0.001 // Y component of Right Position (meters)
Var.RPosZ2 = -(OSC.r.p.z * 0.001) // Z component of Right Position (meters)
Var.RYaw2 = OSC.r.r.x // Yaw component of Right Rotation
Var.RPitch2 = OSC.r.r.y // Pitch component of Right Rotation
Var.RDirX2 = (Math.cos(Var.RYaw2*Math.pi)) * (Math.sin(Var.RPitch2*Math.pi))
Var.RDirY2 = (Math.sin(Var.RYaw2*Math.pi))
Var.RDirZ2 = (Math.cos(Var.RYaw2*Math.pi)) * (Math.cos(Var.RPitch2*Math.pi))
Var.ParamEquationT2 = (Var.ScreenDepth2 - Var.RPosZ2)/Var.RDirZ2
Var.ScreenCornerY2 = Var.RPosY2 + Var.RDirY2 * Var.ParamEquationT2
Var.ScreenCornerX2 = Var.RPosX2 + Var.RDirX2 * Var.ParamEquationT2
// Compute the screen edges
Var.MaxY2 = -(((Var.ScreenTopLeftX2 * Var.ScreenTopRightY2) - (Var.ScreenTopRightX2 * Var.ScreenTopLeftY2) - Var.ScreenCornerX2*(Var.ScreenTopRightY2 - Var.ScreenTopLeftY2))/(Var.ScreenTopRightX2 - Var.ScreenTopLeftX2))
Var.MinY2 = -(((Var.ScreenBottomLeftX2 * Var.ScreenBottomRightY2) - (Var.ScreenBottomRightX2 * Var.ScreenBottomLeftY2) - Var.ScreenCornerX2*(Var.ScreenBottomRightY2 - Var.ScreenBottomLeftY2))/(Var.ScreenBottomRightX2 - Var.ScreenBottomLeftX2))
Var.MaxX2 = (((Var.ScreenTopRightX2 * Var.ScreenBottomRightY2) - (Var.ScreenBottomRightX2 * Var.ScreenTopRightY2) + Var.ScreenCornerY2*(Var.ScreenBottomRightX2 - Var.ScreenTopRightX2))/(Var.ScreenBottomRightY2 - Var.ScreenTopRightY2))
Var.MinX2 = (((Var.ScreenTopLeftX2 * Var.ScreenBottomLeftY2) - (Var.ScreenBottomLeftX2 * Var.ScreenTopLeftY2) + Var.ScreenCornerY2*(Var.ScreenBottomLeftX2 - Var.ScreenTopLeftX2))/(Var.ScreenBottomLeftY2 - Var.ScreenTopLeftY2))
// Percentage of screen where the aimpoint is
Var.AimPointPercentX2 = (Var.ScreenCornerX2 - Var.MinX2)/(Var.MaxX2 - Var.MinX2)
Var.AimPointPercentY2 = -(((Var.ScreenCornerY2 - Var.MinY2)/(Var.MaxY2 - Var.MinY2))-1)
// Find Screen Depth
if (OSC.r.b.4 && Var.step2 == 0 && Var.donestep2 == 0)
Var.ScreenDepthRaw2 = Var.RPosZ2 // Take current screen depth
Var.ScreenDepth2 = Var.ScreenDepthRaw2 + (60*0.001) // Adjust for controller length
Var.donestep2 = 1
Var.step2 = 1
end if
if (OSC.r.b.2)
Var.donestep2 = 0
endif
// Find Top Left Corner
if (OSC.r.b.4 && Var.step2 == 1 && Var.donestep2 == 0)
Var.ScreenTopLeftY2 = (Var.ScreenCornerY2)
Var.ScreenTopLeftX2 = (Var.ScreenCornerX2)
Var.donestep2 = 1
Var.step2 = 2
end if
// Find Top Right Corner
if (OSC.r.b.4 && Var.step2 == 2 && Var.donestep2 == 0)
Var.ScreenTopRightY2 = (Var.ScreenCornerY2)
Var.ScreenTopRightX2 = (Var.ScreenCornerX2)
Var.donestep2 = 1
Var.step2 = 3
end if
// Find Bottom Left Corner
if (OSC.r.b.4 && Var.step2 == 3 && Var.donestep2 == 0)
Var.ScreenBottomLeftY2 = (Var.ScreenCornerY2)
Var.ScreenBottomLeftX2 = (Var.ScreenCornerX2)
Var.donestep2 = 1
Var.step2 = 4
end if
// Find Bottom Right Corner
if (OSC.r.b.4 && Var.step2 == 4 && Var.donestep2 == 0)
Var.ScreenBottomRightY2 = (Var.ScreenCornerY2)
Var.ScreenBottomRightX2 = (Var.ScreenCornerX2)
Var.donestep2 = 1
Var.step2 = 5
end if
if (OSC.r.b.3 && OSC.r.b.4)
Var.step2 = 0
Var.donestep2 = 0
end if
if ((Var.step2 == 5) && Var.donestep2 == 0)
if (Var.AimPointPercentX2 >= 0 && Var.AimPointPercentX2 <= 1 && Var.AimPointPercentY2 >= 0 && Var.AimPointPercentY2 <= 1)
//mouse2.DirectInputX = Var.ResX * Var.AimPointPercentX2
//mouse2.DirectInputY = Var.ResY * Var.AimPointPercentY2
PPJoy2.Analog0 = ((Var.AimPointPercentX2*2) -1)
PPJoy2.Analog1 = ((Var.AimPointPercentY2*2) -1)
end if
//mouse2.LeftButton = OSC.r.t.t
//mouse2.MiddleButton = OSC.r.b.1
//mouse2.RightButton = OSC.r.b.b
PPJoy2.Digital0 = OSC.r.t.t
PPJoy2.Digital1 = OSC.r.b.1
PPJoy2.Digital2 = OSC.r.b.2
end if
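The MaxY, MinY, MaxX and MinX expressions in both lightgun blocks are the two-point line equation evaluated at the aim point. Taking the top edge as an example and writing $(x_{TL}, y_{TL})$ and $(x_{TR}, y_{TR})$ for the calibrated top-left and top-right corners and $x_c$ for Var.ScreenCornerX (symbols introduced here only for readability), the assignment to Var.MaxY is algebraically equivalent to

$$\text{MaxY} = \frac{y_{TL}\,x_{TR} - y_{TR}\,x_{TL} + x_c\,(y_{TR} - y_{TL})}{x_{TR} - x_{TL}}$$

i.e. the Y coordinate at which the vertical line through the aim point crosses the top edge of the calibrated screen quadrilateral. The other three edges are computed analogously, and AimPointPercentX/Y then express the aim point as a fraction of the distance between opposite edges.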
9.16 Arm Cannon license agreement