Lecture 8

Transcription

Lecture 8
Virtual Reality Technology and Programming
TNM086:
Lecture 08: Augmented Reality
(Based on the MUM2003 Tutorials by Mark Billinghurst and Mark Ollila)

Milgram's Reality-Virtuality (RV) Continuum
Real Environment – Augmented Reality (AR) – Augmented Virtuality (AV) – Virtual Environment
Mixed Reality covers everything between the two extremes
Adapted from Milgram, Takemura, Utsumi, Kishino. Augmented Reality: A class of displays on the reality-virtuality continuum
Augmented Reality
Virtual Reality: Replaces Reality
– Immersive Displays
Augmented Reality: Enhances Reality
– See-through Displays
Example AR image: Youngkwan Cho, STAR system
Characteristics
– Combines Real and Virtual Images
– Interactive in real-time
– Registered in 3D
Why Augmented Reality?
Virtual Reality is Ideal for:
– Replacing the Real World
– Simulation, Training, Games
Augmented Reality is Ideal for:
– Enhancing the real world
– Sophisticated interaction in the real world
– "Intelligence Amplification"
Portability:
– VE: User stays in one place – in the VE
– AR: User moves to the task – in the real world

Is AR easier/harder than VR?
Rendering: easier – there's less of it!
Display (resolution, FOV, colour): easier?
Tracking and sensing: *much* harder
– But we need faster updates
– Greater bandwidth requirements (video, range data, etc.)
– Support occlusion, general environmental knowledge
– A big problem for registration!
Additional problems of AR
Computer graphics: faster updates
– Objects must appear in the right place in the real world
Tracking must be:
– More accurate, with respect to the real world
– Faster, to stay aligned with the real world
So artificial objects are correctly 'registered'

A Brief History of AR (1)
1960's: Sutherland / Sproull's first HMD system was see-through

A Brief History of AR (2)
Early 1990's: Boeing coined the term "AR"
– Wire harness assembly application
Early 1990's: UNC ultrasound project
1994: Motion stabilized display
1994: Fiducial tracking in video see-through AR
Applications
Medicine
Manufacturing
Training
Architecture
Museum
A Brief History of AR (3)
1996: UNC hybrid electromagnetic-vision tracker
1998: Dedicated conferences begin
Late 90's: Collaboration, outdoor, interaction
2000: Augmented sports broadcasts
Medical
"X-ray vision" for surgeons
Aid visualization, minimally-invasive operations, training; MRI, CT data
– Ultrasound project, UNC Chapel Hill
(Image courtesy UNC Chapel Hill)
Assembly and Maintenance
© 1993 S. Feiner, B. MacIntyre, & D. Seligmann, Columbia University

Application: annotated environment
© 1996 S. Feiner, B. MacIntyre, & A. Webster, Columbia University
Public and private annotations
Aid recognition, "extended memory"
– Libraries, maps [Fitzmaurice93]
– Windows [Columbia]
– Mechanical parts [many places]
– Reminder notes [Sony, MIT Media Lab]
– Navigation and spatial information access
Annotation pictures
Columbia University
© 1993 S. Feiner, B. MacIntyre, M. Haupt, & E. Solomon, Columbia University

Application: broadcast augmentation
Adding virtual content to live sports broadcasts
– "First down" line in American football
– Hockey puck trails, virtual advertisements
– National flags in swimming lanes in the 2000 Olympics
Commercial application
– Princeton Video Image is one company
– http://www.pvi-inc.com/

Broadcast Examples
HRL
Key AR Technologies
Input
– Tracking technologies
– Input devices
Output
– Display (visual, audio, haptic)
– Fast Image Generation
AR Displays

Optical see-through HMDs
[Diagram: virtual images from monitors are merged with the direct view of the real world through optical combiners]

Strengths of optical AR
Simpler (cheaper)
Direct view of real world
– Full resolution
– No lag (time delay), at least for the real world
– Safety
– Lower distortion

Video see-through HMDs
[Diagram: video cameras capture the real world; graphics are combined with the video and shown on monitors in front of the eyes]
Examples: Virtual Vision VCAP, Sony Glasstron

Strengths of video AR
True occlusion
Digitized image of real world
– rather than composited as in optical
– Flexibility in composition
– Matchable time delays
– More registration, calibration strategies
Wide FOV is easier to support

MR Laboratory's COASTAR HMD
(Co-Optical Axis See-Through Augmented Reality)
Parallax-free video see-through HMD
No eye displacement

The Virtual Retinal Display
Image scanned onto retina
Commercialized through Microvision
– Nomad System - www.mvis.com
Video Monitor AR
[Diagram: video cameras capture the real world; graphics are combined with the video and shown on a monitor, viewed with stereo glasses]

Brains and Bricks…
AR interface for visualizing sensor data
– Using portable video see-through device
– Commonly available technology: a mobile phone
Projector-based AR
[Diagram: a projector illuminates real objects with retroreflective covering; the user is possibly head-tracked]
Examples:
– Raskar, UNC Chapel Hill
– Inami, Tachi Lab, U. Tokyo
Example of projector-based AR: Ramesh Raskar, UNC Chapel Hill
Projection Screen AR
Place a static (angled?) glass screen (window) between the user and the real world
Project on the screen with (angled?) displays
Align displayed objects with the real world by tracking the user's head (see the sketch below)
– Or by other means?
[Diagram: projector, 'window', virtual object, real objects, user (possibly head-tracked)]
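One way the head-tracked alignment can be done is sketched below, under simplifying assumptions of my own (not from the lecture): the 'window' is taken to lie in the z = 0 plane of the tracking coordinate system, spanning given x/y extents, with the tracked head in front of it at positive z. The off-axis frustum makes the near-plane edges pass through the window corners as seen from the head.

// Minimal sketch: off-axis ('window') frustum from a tracked head position.
// Assumes the glass screen spans [xl,xr] x [yb,yt] in the z = 0 plane and the
// user looks towards -z from ez > 0. All values are illustrative.
struct Frustum { float left, right, bottom, top, zNear, zFar; };

Frustum windowFrustum(float ex, float ey, float ez,            // tracked eye position
                      float xl, float xr, float yb, float yt,  // window corners in z = 0
                      float zNear, float zFar) {
    float s = zNear / ez;                 // project the window edges onto the near plane
    return { (xl - ex) * s, (xr - ex) * s,
             (yb - ey) * s, (yt - ey) * s, zNear, zFar };
}
// Usage: feed the six values to glFrustum() (or glm::frustum), then translate the
// modelview matrix by (-ex, -ey, -ez); virtual objects expressed in the tracking
// coordinate system then line up with the real objects behind the window.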
The importance of tracking
Tracking is the basic enabling technology for Augmented Reality
Tracking is significantly more difficult in AR than in Virtual Environments

AR Tracking
– Realistic merged real-virtual environment
– Greater precision is required
– Latency can not be tolerated
Sources of registration errors
Static errors
– Optical distortions
– Mechanical misalignments
– Tracker errors
– Incorrect viewing parameters
Dynamic errors
– System delays (largest source of error)
For an 'arm's length' display:
– 1 ms delay ~ 1/3 mm registration error
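(A back-of-the-envelope check of that rule of thumb, with assumed numbers rather than figures from the slide: suppose the head rotates at a moderate 30°/s ≈ 0.52 rad/s and the augmentation sits at arm's length, about 0.7 m away. The real point then sweeps past at roughly 0.52 × 0.7 ≈ 0.37 m/s, so each millisecond of end-to-end delay leaves the virtual object about 0.37 mm behind it, on the order of the quoted 1/3 mm per ms.)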
Types of Trackers
– Mechanical
Armature with position sensors
– Electromagnetic
AC or DC field emitters/sensors
Compass
– Optical
Target tracking (LEDs, beads)
Line of sight, may require landmarks to work well
Computer vision is computationally intensive
– Acoustic
Ultrasonic
– Inertial & dead reckoning
Acceleration and impulse forces
Sourceless but drifts (see the fusion sketch after this list)
– GPS
Outdoor Augmented Reality
Accuracy not great
Line of sight, jammable
– Hybrid
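To make "sourceless but drifts" and "hybrid" concrete, here is a minimal sketch (my own illustration, not from the lecture) of a complementary filter: pure dead reckoning integrates a biased rate sensor and wanders off, while blending in an occasional absolute measurement (vision, compass, GPS) keeps the estimate bounded. The gyro bias and filter gain are made-up numbers.

#include <cstdio>

int main() {
    const double dt = 0.01;          // 100 Hz inertial updates
    const double true_rate = 0.0;    // the head is actually not rotating
    const double gyro_bias = 0.02;   // rad/s of uncorrected sensor bias (illustrative)
    const double alpha = 0.98;       // complementary-filter weight on the gyro

    double yaw_inertial = 0.0;       // pure dead reckoning
    double yaw_hybrid   = 0.0;       // gyro fused with a slow, drift-free sensor

    for (int i = 1; i <= 1000; ++i) {            // simulate 10 seconds
        double gyro = true_rate + gyro_bias;     // biased rate measurement
        yaw_inertial += gyro * dt;               // integration accumulates the bias

        double predicted = yaw_hybrid + gyro * dt;
        double absolute  = 0.0;                  // e.g. vision/compass: slow but absolute
        yaw_hybrid = alpha * predicted + (1.0 - alpha) * absolute;
    }
    std::printf("after 10 s: inertial drift = %.3f rad, hybrid = %.4f rad\n",
                yaw_inertial, yaw_hybrid);
    return 0;
}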
Fiducial tracking
Since we have a real world…
…and (often) a video capture of it
We can use the real world to track:
– Use object tracking (hard) or…
– Use fiducial 'markers' to provide position, scale and orientation information

Markers…
Can look like anything
Can be attached to anything
May not be visible in the scene:
– Video 'see-through' can overpaint them
Must be easily identified
Must be distinct and clearly orientable
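A concrete, hypothetical illustration of the idea: the sketch below uses OpenCV's aruco contrib module (roughly the pre-4.7 API), not the ARToolKit library covered later in this lecture. One detected square marker is enough to recover a full 6-DOF camera pose. The camera intrinsics and the 5 cm marker size are placeholder values.

#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>

int main() {
    cv::VideoCapture cap(0);                          // live video of the real world
    cv::Mat K = (cv::Mat_<double>(3, 3) <<            // placeholder camera calibration
                 800, 0, 320,   0, 800, 240,   0, 0, 1);
    cv::Mat dist = cv::Mat::zeros(5, 1, CV_64F);      // assume no lens distortion
    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    cv::Mat frame;
    while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);     // find and identify markers

        if (!ids.empty()) {
            std::vector<cv::Vec3d> rvecs, tvecs;                  // 6-DOF pose per marker
            cv::aruco::estimatePoseSingleMarkers(corners, 0.05,   // marker side: 5 cm
                                                 K, dist, rvecs, tvecs);
            cv::drawFrameAxes(frame, K, dist, rvecs[0], tvecs[0], 0.05);
        }
        cv::imshow("marker tracking", frame);
        if (cv::waitKey(1) == 27) break;              // Esc quits
    }
    return 0;
}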
Natural Feature Tracking
Goal:
– Overlay virtual imagery onto normal printed material (maps, photos, etc.)
Method:
– AR registration based on matching templates generated from image texture
Hard to do reliably and *generally*
– Markers are easier and more reliable
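A rough idea of how texture-based registration can work, sketched with generic OpenCV feature matching (my illustration, not the specific method behind this slide): match the live frame against the known printed template and estimate a homography, which then carries virtual overlays from template coordinates into the video.

#include <opencv2/opencv.hpp>

// Returns the homography mapping template pixels into the camera frame,
// or an empty matrix when too little texture was matched.
cv::Mat registerToTemplate(const cv::Mat& templ, const cv::Mat& frame) {
    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);
    std::vector<cv::KeyPoint> k1, k2;
    cv::Mat d1, d2;
    orb->detectAndCompute(templ, cv::noArray(), k1, d1);   // features of the printed material
    orb->detectAndCompute(frame, cv::noArray(), k2, d2);   // features in the live image

    cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
    std::vector<cv::DMatch> matches;
    matcher.match(d1, d2, matches);

    std::vector<cv::Point2f> src, dst;
    for (const cv::DMatch& m : matches) {
        src.push_back(k1[m.queryIdx].pt);
        dst.push_back(k2[m.trainIdx].pt);
    }
    if (src.size() < 4) return cv::Mat();                   // not enough natural features

    // RANSAC discards bad matches; doing this reliably in general scenes is the
    // hard part, which is why markers are easier.
    return cv::findHomography(src, dst, cv::RANSAC, 3.0);
}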
ARToolKit
Enabling technology
Library for vision-based AR applications
– Open Source, multi-platform
Solves two significant problems in AR
– Tracking
– Interaction
Overlays 3D virtual objects on real markers
– Uses single tracking marker
– Determines camera pose information (6 DOF)
ARToolKit Website
http://www.hitl.washington.edu/artoolkit/

Hardware
Camera
– 320x240+
Computer
– Pentium 500 MHz+
– 3D graphics video card
– Video capture card
HMD (optional)
– Video see-through or Optical see-through
– Binocular or Monocular
Typical ARToolKit System
Pentium 4 2 GHz PC - €1000
Good 'gaming' graphics card - €200
Video capture card - €50
Marshall Board CCD Camera - €200
Sony Glasstron PLM-A35 - €400
VGA to NTSC converter - €100
Total Cost ~ €1950

ARToolKit Coordinate Frame

Tangible AR Coordinate Frames

ARToolKit Tracking
ARToolKit - Computer vision based tracking libraries
Tracking Limitations
Computer vision based
– Camera pose found only when marker is visible
– Shadows/lighting can affect tracking
– Tracking range varies with marker size
– Tracking accuracy varies with marker angle
– Tracking speed decreases with the number of visible markers
An ARToolKit Application
Basic Outline
– Step 1. Image capture & display
– Step 2. Marker detection
– Step 3. Marker identification
– Step 4. Getting 3D information
– Step 5. Object interactions
– Step 6. Display virtual objects
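A hedged sketch of how steps 1-6 look in code, modelled on the classic ARToolKit 2.x simpleTest example. Function names are from that C API; the threshold, pattern id handling and 80 mm marker width are example values, and the one-time setup (arVideoOpen, arParamLoad/arInitCparam, arLoadPatt, argInit, argMainLoop) is omitted.

#include <AR/ar.h>
#include <AR/gsub.h>
#include <AR/video.h>
#include <GL/gl.h>

static int    patt_id;                               // set by arLoadPatt() during setup
static double patt_width     = 80.0;                 // marker side in millimetres
static double patt_center[2] = { 0.0, 0.0 };

static void mainLoop(void) {
    ARUint8 *image = arVideoGetImage();              // Step 1: grab a video frame
    if (image == NULL) return;
    argDrawMode2D();
    argDispImage(image, 0, 0);                       //         and draw it as the background

    ARMarkerInfo *marker_info;
    int           marker_num;
    arDetectMarker(image, 100, &marker_info,         // Steps 2-3: detect squares and match
                   &marker_num);                     //            them against loaded patterns
    arVideoCapNext();

    for (int i = 0; i < marker_num; ++i) {
        if (marker_info[i].id != patt_id) continue;
        double patt_trans[3][4];
        arGetTransMat(&marker_info[i], patt_center,  // Step 4: 6-DOF transform between
                      patt_width, patt_trans);       //         marker and camera
        // Step 5 (object interactions) would inspect or modify application state here.
        double gl_para[16];
        argConvGlpara(patt_trans, gl_para);          // Step 6: use the pose as the OpenGL
        argDrawMode3D();                             //         modelview matrix and render
        argDraw3dCamera(0, 0);                       //         the virtual object on the marker
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixd(gl_para);
        // ... draw virtual geometry here ...
    }
    argSwapBuffers();
}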
AR Interaction

AR interfaces as context-based information browsers
Information is registered to real-world context
– Hand-held AR displays
– Video see-through (Rekimoto, 1997)
– Magnetic trackers or computer vision
Interaction
– Manipulation of a window into information space
Applications
– Context-aware information displays
AR Interfaces as 3D data browsers
3D virtual objects are registered in 3D
– "VR in Real World"
– See-through HMDs; 6 DOF optical, magnetic trackers
Interaction
– 3D virtual viewpoint control

3D AR Interfaces
Virtual objects in 3D physical space - can be freely manipulated
– See-through HMDs and 6DOF head-tracking are required
– 6DOF magnetic, ultrasonic, etc. hand trackers for input
Interaction
– Viewpoint control
– Traditional 3D UI interaction
Applications
– Visualization, guidance, training
Augmented Surfaces
Images are projected on a surface
– back or overhead projection
Collaborative (Kiyokawa, et al. 2000)
Physical objects are used as controls for virtual objects
– Tracked on the surface
– Virtual objects are registered to the physical objects
– Physical embodiment of the user interface elements
– manipulation, selection, adding, removing...

Tangible AR: Generic Interface Semantics
Tiles semantics
– data tiles
– operation tiles: menu, clipboard, trashcan, help
Operation on tiles
– proximity
– spatial arrangements
– space-multiplexed

Space-multiplexed Interface
Tangible AR: Time-multiplexed interaction
Use of natural physical object manipulations to control virtual objects
VOMAR Demo
– Catalog book: turn over the page
– Paddle operation: push, shake, incline, hit, scoop
Data authoring in Tiles
VOMAR Interface
Summary
Face-to-face collaboration
– AR often preferred over immersive VR
– AR facilitates seamless/natural communication
Remote Collaboration
– AR spatial cues can enhance communication
– AR conferencing improves video conferencing
– Many possible confounding factors
Promising Research Directions
Natural Feature Tracking
Outdoor (out of lab) AR UI Design
Other Modalities
HMD Design
AR on Everyday Devices

Natural feature tracking
Matris Project
– Collaboration between:
LiU
– ISY C&C – Sensor Fusion – Fredrik Gustafsson
– IDA CVL – Computer Vision – Michael Felsberg
Fraunhofer Institute
BBC
Christian-Albrechts-University, Kiel, Germany
Xsens Technologies, Netherlands
Matris Tracker