→ INTERACT SPACE EXPERIMENT
Online Fact Sheet
telerobotics and haptics laboratory
→ BACKGROUND
Interactive robotics demonstration from
on-board the ISS
In early September this year, Danish astronaut Andreas Mogensen will perform a
groundbreaking space experiment called Interact, developed by ESA in close
collaboration with the TU Delft Robotics Institute. During the 2015 ESA Short
Duration Mission, Mogensen will take control of the Interact Centaur rover on Earth
from the International Space Station in real-time with force-feedback.
The activity is currently planned for Monday the 7th of September, but is
subject to change depending on the ISS activity schedule.
The Mission
The Interact experiment, conceived and implemented by the ESA Telerobotics &
Haptics Laboratory, will be the first demonstration of teleoperation of a rover from
space to ground in which the operator receives force-feedback during part of the
experiment. The astronaut’s task is to maneuver the
rover located at ESA’s ESTEC technical center in Noordwijk through a special obstacle
course, to locate a mechanical task board and to perform a mechanical assembly
task. Once the task board is located and approached, the astronaut will use a
specially designed haptic joystick in space to take control of one of the Centaur’s
robotic arms on Earth. With the arm he will execute a “peg-in-hole” assembly task
to demonstrate the ability to perform connector mating through teleoperation with
tight mechanical tolerances of well below one millimeter. The haptic feedback allows
the astronaut to actually feel whether the connector is correctly inserted and, if
necessary, to fine-tune the insertion angle and alignment. The complete operation is
performed from on-board the International Space Station, at approximately 400 km
altitude, using a data connection via a geosynchronous satellite constellation at
36,000 km altitude. The communication between the haptic joystick and the
ground system is bi-directional: the two systems are essentially coupled. This so-called bilateral system is particularly sensitive to time delay, which can cause
instability. The satellite connection, called the Tracking and Data Relay Satellite
System (TDRSS), results in communication time delays as large as 0.8 seconds,
which makes this experiment especially challenging. ESA copes with these
challenges through specialized control algorithms developed at ESA’s Telerobotics
Laboratory, through augmented graphical user interfaces with predictive displays
and with ‘force sensitive’ robotic control algorithms on the ground. These ESA
technologies allow the operator to work in real-time from space on a planetary
surface. It is as if the astronaut could extend his arm from space to ground.
ESA TELEROBOTICS LAB
Noordwijk, Netherlands
www.esa-telerobotics.net
→ THE ASTRONAUT
Astronaut Andreas Mogensen
Set to launch to the International Space Station on the 2nd of September, Danish
ESA astronaut Andreas Mogensen is preparing for a short-duration mission of at
most 10 days. Andreas has a background as an aerospace engineer and has
familiarized himself with the technology at ESA’s Telerobotics Laboratory.
Andreas can be followed by visiting andreasmogensen.esa.int
→ THE TEAM
ESA Telerobotics & Haptics Laboratory
The Interact Experiment was conceived and developed by ESA’s Directorate of
Technical and Quality Management, in particular, within ESA’s Telerobotics &
Haptics Laboratory and in collaboration with the TU Delft Robotics Institute.
The Interact experiment is supported by the ESA Human Spaceflight and
Exploration Directorate, in particular by its ISS Programme and Exploration
Department.
The ESA Telerobotics & Haptics Lab consists of a small but highly dynamic team of
engineers and engineering academics. Led by Dr. André Schiele, Associate Professor
at the Delft University of Technology, the team performs fundamental research in
mechatronics, robotics and control theory. The Laboratory hosts several ESA staff
members, research contractors and a varying number of Ph.D. and M.Sc. candidates
supported via the Delft University of Technology.
The Interact Centaur design was created in close collaboration with a team of
Industrial Design master’s students from TU Delft in 2014.
Follow the ESA Telerobotics & Haptics Lab by visiting esa-telerobotics.net
→ TECHNICAL FEATURES
→ INTERACT CENTAUR
The mobile robotic platform called the Interact Centaur was specifically designed to
be able to maneuver through rough terrain at high speeds and to have the dexterity
to perform very delicate and precise manipulation tasks through remote control.
The custom vehicle design was brought from concept to reality in little over a year.
COMPUTING
The robot makes use of seven high performance
computers running software that has been
programmed in a highly modular, model-based
and distributed way.
ROBOTIC ARMS
Two KUKA lightweight robotic arms on the front
of the rover allow the operator to perform very
precise manipulation tasks. The arms can be
‘soft controlled’ to safely interact with humans
or delicate structures and can be programmed to
be compliant (like a spring and/or damper) when
they hit an object. The arms are equipped with
highly ‘force sensitive’ sensors and can flex and
adapt in a similar manner to human arms during
remote control. This makes it possible to tightly
couple the arms to an operator located far away by
means of haptic (i.e. force-transmitting)
interfaces. Their operation during the Interact
experiment is very intuitive, allowing delicate
and dexterous remote operations to take place
across very long distances with fine-grained
force feedback to the operator despite
the communication time delay.
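The compliant behavior described above can be sketched in a few lines. The snippet below is not ESA's or KUKA's controller, just a minimal single-axis illustration with made-up gains: the arm tracks a commanded position through a virtual spring and damper, so when the command passes into a rigid surface the contact force stays bounded at roughly stiffness times overlap instead of growing without limit.

```python
# Single-axis impedance (spring-damper) control sketch; all numbers are
# illustrative. The command slides 10 cm "into" a wall, and the arm ends
# up pressing with a bounded force of about K * overlap = 30 N.
M = 2.0       # effective arm inertia [kg]
K = 300.0     # virtual stiffness [N/m]; lower = softer contact
D = 60.0      # virtual damping [N*s/m]
WALL = 0.10   # rigid surface at x = 0.10 m
DT = 0.001

x, v = 0.0, 0.0
for step in range(int(3.0 / DT)):
    t = step * DT
    x_des = min(0.2, 0.1 * t)            # desired position ramps to 20 cm
    f_cmd = K * (x_des - x) - D * v      # the spring-damper control law
    f_wall = max(0.0, 1e5 * (x - WALL))  # stiff one-sided contact model
    v += (f_cmd - f_wall) / M * DT       # semi-implicit Euler integration
    x += v * DT

print(f"rest position: {x:.4f} m, press force: {K * (0.2 - x) - D * v:.1f} N")
```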
ROVER MOBILE PLATFORM
The drivetrain and wheels for the Interact
Centaur are a customized version of the remote
controlled platform manufactured by AMBOT.
This battery-powered, four-wheel-drive,
four-wheel-steering platform is weatherproof and
gives the rover over 8 hours of run-time in
challenging terrain.
ROBOTIC PAN-AND-TILT NECK AND HEAD
A robotic neck with 6 degrees of freedom gives
the cameras in the rover’s head an enormous field
of view, useful both for driving and for close
visual inspection tasks.
REAL-TIME CAMERAS
The rover has four dedicated real-time streaming
cameras that the astronaut can use during the
mission: a head pan-tilt camera that provides a
general contextual overview of the situation
during driving and exploration of the
environment; a tool camera mounted on the
right robotic arm for vision during precise tool
manipulation; and two hazard cameras (front and
back) that view the near-proximity area otherwise
occluded by the chassis during driving.
EXTERIOR DESIGN
A custom-made exterior protects all delicate
mechatronic and computing hardware from dust
and ensures a good thermal design.
→ AUGMENTED REALITY
Virtual model overlays in real-time
To provide extra support to the astronaut while driving the rover, an augmented
reality (AR) overlay was developed. This allows for virtual markers such as predicted
position markers to be displayed on top of the camera feed.
[Annotated screenshot: yellow blocks in front of the wheels show the current rover
position; white blocks show the predicted rover position, so that before the rover
moves the operator can see where it is going to end up; green blocks are used to
align the rover with the task board.]
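As a rough sketch of the idea behind the predicted-position markers (the actual overlay software is not public), the pose the rover will reach can be dead-reckoned from its current drive command with a simple unicycle model:

```python
import math

def predict_pose(x, y, heading, v, omega, horizon=2.0, dt=0.05):
    """Dead-reckon a unicycle-model pose `horizon` seconds ahead.

    v is forward speed [m/s], omega is turn rate [rad/s]; the model and
    horizon are illustrative, not the mission's actual parameters.
    """
    for _ in range(int(horizon / dt)):
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        heading += omega * dt
    return x, y, heading

# Driving at 0.5 m/s while turning at 0.3 rad/s: pose two seconds ahead.
print(predict_pose(0.0, 0.0, 0.0, v=0.5, omega=0.3))
```

The predicted pose would then be projected through the camera model to draw the marker blocks on top of the video feed.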
→ LASER GUIDANCE
Embedded laser tool support
To visually support the astronaut when performing the mechanical alignment
during the peg-in-hole assembly task, a laser has been embedded within the tool.
When the tool hovers over the hole, the laser spot becomes invisible, indicating
that the connection can be attempted. The laser creates an artificial depth
impression through a dedicated depth cue. This allows such complex 3D tasks to be executed without
requiring a dedicated stereo 3D video system, which would consume excessive data
bandwidth.
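One plausible way such a laser depth cue can work, shown purely as an illustration (the actual tool geometry is not given in the fact sheet), is triangulation: a laser mounted slightly off the camera axis and toed in by a small angle produces a spot whose position in the image shifts with distance, so a single 2D view carries depth information.

```python
import math

BASELINE = 0.03           # laser offset from camera axis [m] (assumed)
TOE_IN = math.radians(4)  # laser tilt toward the axis (assumed)

def spot_offset(distance):
    """Lateral offset of the laser spot from the camera axis [m]."""
    return BASELINE - distance * math.tan(TOE_IN)

for d in (0.10, 0.20, 0.30, 0.43):
    print(f"{d:.2f} m -> spot {1000 * spot_offset(d):+6.1f} mm off-axis")
# The spot crosses the axis near BASELINE / tan(TOE_IN) ≈ 0.43 m; nearer
# or farther than that, it drifts visibly to one side, encoding depth.
```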
→ SPACE TO GROUND
Tracking and Data Relay Satellite System (TDRSS)
Satellite communications
As a complicating factor, the signals between the astronaut and the robot must
travel via a dedicated and highly complex network of satellites in geo-synchronous
orbit. The signals will travel from the International Space Station via NASA’s TDRSS
to ground facilities in the U.S. From there, they cross the Atlantic Ocean to the ESA
facilities in Noordwijk, the Netherlands. Forces between the robot and its
environment, as well as video and status data, travel back to the graphical user
interface and the haptic joystick. On this round trip, the signals cover a distance of
nearly 90,000 km, and the resulting time delay approaches one second.
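A quick back-of-the-envelope check of these figures: light-travel time over the quoted path is only part of the delay, with network routing, switching and processing making up the rest.

```python
C_KM_S = 299_792.458     # speed of light in vacuum [km/s]
ROUND_TRIP_KM = 90_000   # signal path length quoted above

propagation = ROUND_TRIP_KM / C_KM_S
print(f"propagation alone: {propagation:.2f} s")          # about 0.30 s
print(f"left for routing and processing out of ~1 s: "
      f"{1.0 - propagation:.2f} s")                       # about 0.70 s
```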
ESA developed a model-mediated control approach that allows force-feedback to be
performed between distributed systems across multiple seconds of time delay,
without a noticeable reduction in performance compared with directly coupled
systems. While this smart software and these control methods enable the
astronaut to perform such tasks on Earth, research suggests that humans can only
handle signal transmission time delays of up to about three seconds for control
tasks that require hand-eye coordination. In theory, this would allow haptic control
from Earth of robotic systems as far away as the surface of our Moon.
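The essence of model-mediated teleoperation can be shown in a short sketch (a minimal illustration, not ESA's implementation): the joystick renders forces against a local model of the remote environment, so the operator feels contact immediately, and the delayed link only carries model updates rather than raw force samples.

```python
class LocalEnvironmentModel:
    """Operator-side proxy of the remote world (illustrative only)."""

    def __init__(self):
        self.wall_x = None       # estimated contact surface position [m]
        self.stiffness = 2000.0  # estimated contact stiffness [N/m]

    def feedback_force(self, joystick_x):
        """Haptic force rendered locally, with zero added delay."""
        if self.wall_x is None or joystick_x < self.wall_x:
            return 0.0                        # free space: no force
        return -self.stiffness * (joystick_x - self.wall_x)

    def apply_remote_update(self, measured_wall_x):
        """Absorb a possibly seconds-old estimate from the robot."""
        self.wall_x = measured_wall_x

model = LocalEnvironmentModel()
model.apply_remote_update(0.05)     # robot reported contact at 5 cm
print(model.feedback_force(0.052))  # operator feels the wall instantly
```

Because stability no longer hinges on the round-trip time, only the accuracy of the local model does, this is what makes force-feedback workable across delays of a second or more.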
[Diagram: the signal path from the International Space Station via the NASA ground
terminals in New Mexico, USA, to ESTEC in Noordwijk, the Netherlands; nearly
90,000 km per round trip.]
→ HAPTICS-1 JOYSTICK
Teleoperation of earthbound robotics with
real-time force-feedback from Space
On-board the ISS, the astronaut will re-use equipment from the previous
Telerobotics & Haptics Lab experiments called Haptics-1 and Haptics-2. For these
experiments, a tablet PC and a small force-reflective joystick were flown to the ISS
with the goal of evaluating human haptic perception in space and validating
real-time telerobotic operations from space to ground. During Haptics-1, on the 30th of
December 2014, haptic feedback was used for the first time in the microgravity
environment of the ISS. During Haptics-2, on June 3rd (21:00 CEST) 2015, for the
first time in history a handshake with force-feedback was performed between two
humans, one located in space and one on the ground.
[Mission patch: interact ✦ MOGENSEN ✦, ringed by the words “TU Delft Robotics
Institute”.]
WITH INTERACT, ESA AIMS TO PRESENT AND VALIDATE
A FUTURE WHERE HUMANS AND ROBOTS EXPLORE SPACE
TOGETHER. ROBOTS WILL PROVIDE THEIR OPERATORS MUCH
WIDER SENSORY FEEDBACK OVER MUCH GREATER DISTANCES
THAN IS POSSIBLE WITH TERRESTRIAL ROBOTS TODAY.
NOT ONLY IN SPACE, BUT ALSO ON EARTH, REMOTE
CONTROLLED ROBOTICS WILL PROVE HIGHLY ENABLING IN
DANGEROUS AND INACCESSIBLE ENVIRONMENTS. THEY CAN
BE USED IN ARCTIC CONDITIONS, IN THE DEEP SEA OR FOR
ROBUST INTERVENTION IN NUCLEAR DISASTER SITES.
WE CAN EXPECT THAT FUTURE HUMAN EXPLORATION
MISSIONS TO THE MOON AND MARS WILL BENEFIT FROM
SUCH ADVANCED HUMAN-ROBOTIC OPERATIONS. ESA’S
RESEARCH IN TELEROBOTIC TECHNOLOGIES AND ADVANCED
CREW OPERATIONS FROM ORBIT WILL PLAY A KEY ROLE
IN THESE COMING ADVENTURES. THE ESA TELEROBOTICS
AND HAPTICS LABORATORY, ALONG WITH ESA’S TECHNICAL
AND SPACE EXPLORATION DIRECTORATES, IS DEDICATED
TO TAKING THE NEXT BIG STEPS IN HUMAN-ROBOT
COLLABORATION IN SPACE.
telerobotics and
haptics laboratory