The Real Augmented Reality: Real-time Game Editor
in a Spatial Augmented Environment
Patrick Oswald, University of Applied Sciences Potsdam, 14469 Potsdam, Germany, [email protected]
Jordi Tost, University of Applied Sciences Potsdam, 14469 Potsdam, Germany, [email protected]
Reto Wettach, University of Applied Sciences Potsdam, 14469 Potsdam, Germany, [email protected]
ABSTRACT
Video games are conventionally screen-bound, restricted to predefined character movements, and offer a limited set of interaction possibilities depending on the controller and the level architecture. Although the ways in which we can interact with games have improved over recent years, the digital world we interact with is still normally confined to the screen and restricted by predefined scenarios. In this paper, we introduce the i.Ge engine, a real-time video game level editor that allows users to interact with their own environment and create game content with real everyday objects, making them part of the level design. Our engine thus reduces the gap between playing and creating by making both possible at the same time in a spatial augmented reality, thereby introducing new concepts in the field of game interaction and game design.
We propose to bring the game out of the screen and into the user's daily environment, connecting the virtual and the real world in one single spatial augmented reality. We do so by including the user's surrounding physical environment in the virtual world they are playing in, thereby enhancing the interactive gaming experience.
Furthermore, we want to encourage and enhance gamers' creativity by making it possible to create level content with almost any physical object, using it directly as a level element or creating augmented virtual content with it. Finally, our approach aims to inspire entirely new collaborative and challenging multiplayer concepts and physical-virtual game mechanics.
Accordingly, we developed a prototype based on a jump-and-run game to demonstrate the novel interaction paradigms proposed by our engine.
Author Keywords
Mixed Reality; Spatial Augmented Reality; Virtual Reality; Tangible Interaction; Projection Mapping; HCI; Human-Computer Interaction; Game Design.

ACM Classification Keywords
H5.2 [Information interfaces and presentation]: User Interfaces—Input devices and strategies, Interaction styles
H5.1 [Information interfaces and presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities
INTRODUCTION
The screen, in any variation (flat, portable, touch-sensitive or projected), is still the place where video game interaction usually happens. The world we explore in a game is traditionally a virtual one. While the ways of interacting with a character or stage have improved over the years, level editors, the tools used by game designers to create virtual worlds, are still limited in their functions and content; they mostly rely on dragging and dropping predefined elements, so the user is again limited to the game engine and the content already built for it.
ACE '14, November 11 - 14 2014, Funchal, Portugal
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-2945-3/14/11…$15.00
http://dx.doi.org/10.1145/2663806.2663853
Figure 1. Demo of the i.Ge engine.
I.GE SYSTEM PRINCIPLES
i.Ge is an interactive video game engine that allows users to interact and play with their own environment in real time. As illustrated in Figure 2, all interaction takes place inside the room, which is turned into the game scenario by mixing it with the virtual game content, mediated through a Microsoft Kinect and a projector. For our demo scenario, we chose Nintendo's Super Mario Bros. characters (Super Mario Bros.® is a registered trademark of Nintendo®) and an ordinary gamepad, both widely known by users and the gaming community, in order to focus the players' attention and learning process on the novel interaction paradigms and not on how to play the game.
Figure 2. Set-up scenario of our Real Augmented Reality concept.
Creating a level
The real world, understood as the interface, provides the tools to create a level. Figure 3a shows a typical situation in which the character must be helped to reach a distant platform. Users can either move the cube or the blue card closer, or attach new objects to the stage (Figure 3b). Other possibilities are to create platforms by sketching blobs (Figure 3c), or even to hold the character by hand (Figure 3d).
Figure 3. Creating level content.
Figure 4. Creating digital content.
Besides physical objects, the game character can also interact with other digital elements like opponents, projectiles or bonus objects. These can be generated by detecting characteristics of the attached physical objects, for example their position and color. In our demo, when a new green object (or a sketched one) whose height is greater than its width is detected, it is treated as a teleportation tube and a carnivorous plant is placed above it. If there is another teleportation tube in the stage, the character can be teleported from one tube to the other. As Figure 2 sketches, a green object could be used, for example, to teleport the character so it can reach the content inside a frame. Our demo also detects black objects, adding a cannon to the stage that shoots bullets that can damage our character, as shown in Figure 4b. This generated digital content can also interact and collide with the physical objects that form the stage. In Figure 4c, for example, a user places a red card in front of the character to protect him from the bullets.
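As an illustration, such color- and proportion-based rules can be expressed as a small classification function. The following Processing sketch is a minimal, hypothetical rendering of the rules described above; the constants, thresholds and names are ours, not i.Ge's actual code:

    // Hypothetical sketch of the demo's color/proportion rules; the
    // constants, thresholds and function name are illustrative only.
    final int NONE = 0, TELEPORT_TUBE = 1, CANNON = 2;

    // Classify one detected object from its bounding box and mean color.
    int classifyObject(float w, float h, float r, float g, float b) {
      boolean tall = h > w;                          // proportion rule: height > width
      boolean isGreen = g > 1.5 * r && g > 1.5 * b;  // dominant green channel
      boolean isBlack = r < 60 && g < 60 && b < 60;  // low values in all channels
      if (isGreen && tall) return TELEPORT_TUBE;     // gets a carnivorous plant on top
      if (isBlack) return CANNON;                    // shoots bullets at the character
      return NONE;                                   // plain platform, no extra content
    }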
Enhancing collaboration
i.Ge allows gamers to act as level designers. Although playing as a single player is possible, we propose to enhance multiplayer collaboration, bringing interaction to a new level. While traditional video games have multiplayer modes, the roles of the players are normally similar, sometimes playing to achieve a common goal, sometimes as rivals. We introduce the figure of the game level creator, who modifies the stage content by adding, moving or replacing objects in real time to help the game player accomplish the tasks or the level (see Figures 3 and 5). This player acts like a level designer. We also propose another player role, the opponent or enemy, who tries to eliminate or damage the game character (Figure 4b). Thus, we enable a new range of multiplayer interaction possibilities by bringing the play and build parts together into two simultaneously playable game aspects. By adapting our game engine to other game concepts and genres, we want to inspire and challenge new collaborative multiplayer ideas.
IMPLEMENTATION
Our system includes a common projector, a game controller and a Microsoft Kinect (640x480 px at up to 30 fps) to scan the environment. The software side was programmed with Processing and the computer vision library OpenCV [9]. The game level elements are detected by running an edge detection algorithm over the RGB channel provided by the Kinect, as long as they are not too small (because of the Kinect resolution) and are rich in contrast. The game engine extracts their bounding boxes (position and size) and applies physical properties to them in order to detect collisions between them and the game character or other projected digital content. i.Ge also performs color detection for each real object in order to assign different behaviours to the objects or generate digital content depending on their color (see Figure 4).
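A minimal Processing sketch of this detection step, using the OpenCV for Processing library [9], might look as follows; the Canny thresholds, the minimum size and the way the Kinect RGB frame is acquired are assumptions, not i.Ge's actual values:

    import gab.opencv.*;
    import java.awt.Rectangle;
    import java.util.ArrayList;

    OpenCV opencv;
    ArrayList<Rectangle> levelElements = new ArrayList<Rectangle>();

    void setup() {
      size(640, 480);
      opencv = new OpenCV(this, 640, 480);   // matches the Kinect's RGB resolution
    }

    // rgbFrame stands for the Kinect RGB image; acquiring it depends on
    // the Kinect library used and is omitted here.
    void detectLevelElements(PImage rgbFrame) {
      opencv.loadImage(rgbFrame);
      opencv.gray();
      opencv.findCannyEdges(20, 75);         // edge detection over the RGB frame
      levelElements.clear();
      for (Contour contour : opencv.findContours()) {
        Rectangle box = contour.getBoundingBox();
        // Discard regions too small to be tracked reliably at 640x480.
        if (box.width > 20 && box.height > 20) {
          levelElements.add(box);            // becomes a collider in the game world
        }
      }
    }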
To prevent the game creators (see Player 2 in Figure 1) from obstructing the viewport and being detected as level elements, we use the depth camera to ignore their bodies. However, this filtering is only applied up to a threshold distance, so that game creators can still use their hands to hold the game character in some situations (see Figure 3d).
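The depth-based filtering could look roughly like the sketch below; the threshold value and the representation of the depth map as a flat array of millimetre readings are assumptions based on typical Kinect libraries, not i.Ge's actual implementation:

    // Assumed: depthMap is a 640x480 array of depth values in millimetres.
    // The threshold is illustrative: bodies nearer to the camera than ~1.5 m
    // are masked out, while hands reaching close to the projection wall
    // (beyond the threshold) are kept, so they can hold the character (Figure 3d).
    final int CREATOR_THRESHOLD_MM = 1500;

    boolean[] buildIgnoreMask(int[] depthMap) {
      boolean[] ignore = new boolean[depthMap.length];
      for (int i = 0; i < depthMap.length; i++) {
        int d = depthMap[i];
        ignore[i] = (d > 0 && d < CREATOR_THRESHOLD_MM);  // 0 means no depth reading
      }
      return ignore;  // pixels flagged true are excluded from object detection
    }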
PRELIMINARY EVALUATION – PLAYING WITH I.GE
To explore the potential of our engine and to observe users' behaviour in terms of interaction, usability, creativity and collaboration, we conducted preliminary qualitative evaluation sessions. These preliminary studies served to identify the benefits of the system, but also some issues to be improved.
Twenty participants of different backgrounds, ages and levels of experience with video games were asked to play in a simple scenario on the wall. Subjects were divided into ten teams of two members and were given an introduction consisting of a brief demonstration of the concept and the basics of play (how to move the character with the gamepad, jump, etc.), but no advanced training or explanation of user roles. The teams were assigned a task: collect all coins projected on an almost empty wall (Figure 5a). To support them, participants received a set of Post-its, sticky papers and boxes. We chose sticky papers because they are easy to handle, and to focus users' attention on accomplishing the level (Figure 5b).
Figure 5. User test scenario.
Results showed that all users completed the task properly and spontaneously took on the roles of player and level creator. We received positive feedback from the sessions, but also encountered some issues with the perspective calibration, overall illumination and performance that we will improve in future versions.
RELATED WORK
Spatial Augmented Reality (SAR) is the technique of merging the real world with projected images [4]. Existing SAR game projects mostly use our real environment as a canvas for the game in a static way, or use the projector as an interaction tool. We argue that i.Ge is different from these existing approaches.
IllumiRoom [8] proposes to augment the physical environment that surrounds a television, for example by projecting game content like a grenade rolling out of the screen and bouncing off the table around the living room. This augmented digital content is responsive to the static physical environment. Build Your World and Play In It [7] uses a prebuilt wooden block construction as a projection surface, on which virtual content can be created by gesturing with an IR stylus, adapting the angle and direction of the virtual particles to the surface. We propose an active use of the physical objects as a separate aspect of the game, using them to create interactive level elements which can independently generate digital content in real time, in order to offer players a more creative use of their surrounding objects.
Recent studies of handheld and wearable interactive projectors have also shown promising results. Twinkle [15], an interface for interacting with an arbitrary flat physical surface using a handheld projector, proposes a gaming interface as an application of this technology. The projected character can interact with non-white physical objects in the viewport of the camera attached to the projector; by colliding with these objects, the character changes its state depending on their color and size. With the MotionBeam [12] project, on the other hand, users can interact with the character by gesturing with the handheld projector itself. These works present different directions for augmenting environments and handling characters, but the interaction, and thus the augmentation with digital content, is limited to the region where the character is (where the projector points), due to the alignment of the character with the projection center.
A similar approach is used by Beamatron [13], where the user controls a steerable system formed by a projector and a Kinect placed on the ceiling. Beamatron augments an avatar (a virtual toy car) onto a physical environment using the depth data, but not other properties like color and proportions.
In our work, we use a game controller to directly control the main game character, which can be moved independently of the projector direction, in order to achieve higher interactivity with all objects and digital content placed in the entire projected area. We also enable the augmentation of real objects with interactive digital content depending on their characteristics and independent of the character's position. Additionally, we introduce game design as a playable part of the game (which can be played by another person), building complex scenarios in real time.
We implemented a jump-and-run demo game using our engine principles as a proof of concept. Nevertheless, we could also use them to develop other game concepts and genres. In the future, we plan to provide i.Ge as a framework or toolbox to the community, in order to allow people to develop their own video game concepts based on the same principles.
CONCLUSION AND FURTHER WORK
In this paper, we introduced The Real Augmented Reality project, which proposes to use the real world as a video game stage. The main contribution is the concept of interactive game editing in real time in a spatial environment, which we support with the engine prototype i.Ge. By placing, moving or removing everyday objects into and from the stage, players can create and play the level simultaneously, opening up new possibilities for playing and collaborating.
Our engine contains a series of novel elements:
• The possibility of interacting with our own physical
environment to create a game level.
• The ability to modify the level by simply placing,
replacing and removing elements in real time.
• Collaborative playing with different user roles: game
player, level creator, rival...
First evaluations of the engine showed promising results concerning users' imagination and creativity while interacting with their surroundings. Most casual participants were keen to play with our prototype, mostly in pairs as a team. In the near future we plan to run extensive user tests to qualitatively evaluate users' learning time and engagement compared to traditional video game systems, and to observe their interaction and behaviour in different environments.
Some limitations also became apparent. At the current technical state we are mainly limited by the size and rectangularity of the objects. We also perform almost flat object tracking and make little use of the depth data.
As future work, we plan to extend i.Ge to enable more interaction possibilities and game concepts. To achieve this we plan to a) increase the use of depth to enhance the use of space and provide users with more ways to interact with physical objects, b) provide a better and more efficient approach to create, manage and interact with digital content (bonus items, enemies, etc.), and c) empower embodied interaction. In order to achieve this, a couple of technical issues have to be solved: we have to reconsider the way we deal with 3D objects and their perspective, to improve Kinect calibration and projection mapping, and the way we deal with light, to make the system adaptive to changing lighting environments.
REFERENCES
1. Benko, H., Jota, R., and Wilson, A. MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop. In Proc. CHI '12, ACM (2012), 199-208.
2. Bimber, O., Emmerling, A., and Klemmer, T. Embedded entertainment with smart projectors. IEEE Computer 38, 1 (2005), 48-55.
3. Bimber, O., Coriand, F., Kleppe, A., Bruns, E., Zollmann,
S., and Langlotz, T. Superimposing pictorial artwork with
projected imagery. IEEE Multimedia 12, 1 (2005), 16–26.
4. Bimber, O. and Raskar, R. Spatial augmented reality:
Merging real and virtual worlds. AK Peters Ltd (2005).
5. Cauchard, J.R., Fraser, M., Han, T., and Subramanian, S. Steerable projection: exploring alignment in interactive mobile displays. In Personal Ubiquitous Computing (2012), 27-37.
6. Harrison, C., Benko, H., and Wilson, A.D. OmniTouch: wearable multitouch interaction everywhere. In Proc. UIST '11, ACM (2011), 441-450.
7. Jones, B. R., Sodhi, R., Campbell, R. H., Garnett, G., and
Bailey, B. P. Build your world and play in it: Interacting
with surface particles on complex objects. In 9th
International Symposium on Mixed and Augmented
Reality (ISMAR), IEEE (2010), 165-174.
8. Jones, B. R., Benko, H., Ofek, E., and Wilson, A. D. IllumiRoom: peripheral projected illusions for interactive experiences. In Proc. CHI '13, ACM (2013), 869-878.
9. OpenCV for Processing, by Greg Borenstein. A Processing library for the OpenCV computer vision library. https://github.com/atduskgreg/opencv-processing
10. Pinhanez, C. The Everywhere Displays Projector: A Device to Create Ubiquitous Graphical Interfaces. In Proc. UbiComp, ACM (2001).
11. Raskar, R., Van Baar, J., Beardsley, P., Willwacher, T., Rao, S., and Forlines, C. iLamps: geometrically aware and self-configuring projectors. In ACM SIGGRAPH 2006 Courses (2006), 7.
12. Willis, K. D., Poupyrev, I., and Shiratori, T. MotionBeam: a metaphor for character interaction with handheld projectors. In Proc. CHI '11, ACM (2011), 1031-1040.
13. Wilson, A., Benko, H., Izadi, S., and Hilliges, O. Steerable augmented reality with the Beamatron. In Proc. UIST '12, ACM (2012), 413-422.
14. Wilson, A., and Robbins, D.C. PlayTogether: Playing games across multiple interactive tabletops. In IUI Workshop on Tangible Play, Citeseer (2006).
15.Yoshida, T., Hirobe, Y., Nii, H., Kawakami, N., and
Tachi, S. Twinkle: Interacting with physical surfaces using handheld projector. In Proc. Virtual Reality
Conference (VR), IEEE (2010), 87-90.