A TETHERED AEROBOT FOR PLANETARY ROVER MISSIONS
Laurence Tyler(1), Dave Barnes(1), Mark Neal(1), Fred Labrosse(1), Stephen Pugh(1), Gerhard Paar(2) & the PRoViScout Team
(1) Department of Computer Science, Aberystwyth University, SY23 3DB, Wales, UK
(2) Joanneum Research, Graz, Austria, A-8010
ABSTRACT
We present an overview of the PRoViScout rover and
tethered aerobot. The PRoViScout project involves
visually-guided autonomous science target assessment,
operations planning and navigation for a planetary rover
class mission. A wheeled rover will carry a variety of
primarily visual sensors with which to explore a Mars
analogue site. Decisions about waypoints and science
target selection will be made autonomously during the
mission. A tethered aerobot provides area context for
the rover in the form of multispectral imagery that is
used to generate both a terrain model for navigation and
a mineralogical map to inform science assessment and
mission planning. Experiences and some results from
early tests of the aerobot are presented.
1. INTRODUCTION

PRoViScout is a project within the European FP7-SPACE framework. It aims to demonstrate the utility and feasibility of equipping long-duration planetary rover missions with vision-based systems for autonomous planning, terrain analysis and science target selection. The project will culminate in a field trial demonstration conducted in a Mars-analogue environment.

The demonstrator system will comprise an autonomous rover platform carrying a number of vision-based sensors. Given an initial science plan and a Digital Elevation Model (DEM) of the area to be explored, the rover will conduct a fully autonomous scouting mission, including re-planning to allow for the opportunistic investigation of serendipitous science targets. The PRoViScout project is described in greater detail in [1]; see also [7].
Figure 2. Autonomous tethered aerobot
Complementing the rover platform will be an
autonomous tethered aerobot with a high-resolution
multispectral camera. The aerobot provides vital area
context for the rover, at a scale intermediate between
ground observations and orbital imagery. Images
captured by the aerobot will be used to generate an area
DEM for mission planning and initial waypoint
selection. The addition of mineralogical data derived
from the multispectral bands of the aerobot images will
allow further refinement of the initial science target
selection.
Figure 1. Rover platform "Idris"

2. AEROBOT RESEARCH AT ABERYSTWYTH
The Space and Planetary Robotics group at Aberystwyth
University has an ongoing interest and involvement in
aerobot research. The group has taken part in several
aerobot-based projects in conjunction with other
partners.
A previous ESA-funded project investigated autonomous image-based localisation for a future Mars aerobot mission [2]; project partners included SciSys Ltd, the University of Leicester and Joanneum Research. The project culminated in a demonstration of the developed localisation algorithms
controlling a real aerobot. Fig. 3a shows the completed
aerobot undergoing acceptance trials at the ESA ESTEC
Planetary Testbed Facility.
3a: Aerobot acceptance trials at ESTEC
3b: Cooperating aerobots flying in formation
Figure 3. Aerobot research at AU

AU has also investigated the use of cooperative control methods for the coordination of multiple aerobots [3], in partnership with SciSys Ltd. This project demonstrated the developed control algorithms by autonomously flying a group of aerobots in formation in a large enclosed space (fig. 3b). Each aerobot operated its own behaviour-based controller, and the cooperative control algorithm was able to maintain the formation pattern of the aerobots in the face of external perturbations.

3. ROVER

The autonomous rover platform used by PRoViScout is based on a robuCAR-TT chassis [8]. The rover platform, known as "Idris" (fig. 1), is a 4-wheel drive, all-terrain vehicle. On-board systems control movement and steering in both autonomous and teleoperated modes, and also implement safety features such as obstacle detection and emergency stop. The rover has a substantial payload capacity of up to 150kg and can provide accommodation, power and network services for a number of independent payload systems.

4. AEROBOT

Raw data for DEM generation will be provided by the tethered aerobot. This consists of a helium balloon platform capable of taking wide-angle images up to a height ceiling of 100m. It is equipped with a lightweight, high-resolution monochrome camera that points downwards. The camera is mounted on a stabilising platform under the balloon, along with its controlling computer, wireless network link and sensors for measuring the position and attitude of the aerobot.

The projected ground view of the aerobot camera at an altitude of 100m is approximately 106m by 88m, with a resolution of 43mm/pixel. This combines a useful level of local detail with broader area coverage for mission planning. The level of detail discernible is significantly higher than would normally be achieved by orbital imaging, and the wider surface coverage and elevated viewpoint of the aerobot are advantageous in traverse planning compared to the more restricted view from the rover's own sensors.

4.1. Aerobot Hardware

The prototype aerobot measures approximately 20cm x 18cm x 18cm and has a framework constructed from thin aluminium plate with nylon bracing bars. The aerobot is shown in fig. 4.

4a: Bottom face with camera aperture
4b: Oblique view showing suspension points
4c: Left view - control PC, IMU, GPS
4d: Right view – camera and battery pack
Figure 4. Views of prototype aerobot hardware
The following components are mounted on the
framework: control computer, inertial measurement unit
(IMU) and global positioning system (GPS) device (fig.
4c); high-resolution camera and battery pack (fig. 4d);
miniature pulleys for suspension from the envelope (fig.
4b). The downwards-facing camera aperture can be seen
in fig. 4a. The aperture is covered by an infra-red
blocking filter and the camera itself is shrouded in dark
material to keep out stray light.
4.2. Camera
The camera used in the current version of the aerobot is
a Prosilica GC2450 [9]. This camera has a Sony
ICX625 monochrome CCD sensor with a resolution of
2448x2050 (5 Mpixels) and a gigabit ethernet interface.
It is capable of imaging up to 15 frames per second at
full sensor resolution with 8 or 12 bit pixel data. The
camera allows operation modes such as single or
triggered frame capture, and supports configurable auto-exposure and auto-gain algorithms. Image sub-framing
and pixel binning are also possible.
The camera is currently fitted with an 8mm focal length
c-mount lens adjusted to have a hyperfocal distance of
6m; thus targets from 3m to infinity are well-focused.
The optical characteristics of the camera system are
summarised in tab. 1. From this table it can be seen that
the camera can distinguish ground features down to a
size of approx. 5cm at 100m altitude, and approx. 2.5cm
at 50m altitude.
Table 1. Aerobot camera optical properties

Sensor: Sony ICX625 monochrome
Active sensor size: 8.45mm x 7.07mm (2/3-inch format)
Resolution: 2448 x 2050 pixels
Lens focal length: 8mm
Hyperfocal distance: 6m (focus 3m – infinity)
Field of view: 55.65° x 47.69°
Ground footprint: 106m x 88m at 100m altitude; 53m x 44m at 50m altitude
Pixel field of view: 0.0247° (0.431 milliradians)
Ground pixel resolution: 0.043m at 100m altitude; 0.022m at 50m altitude
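The figures in tab. 1 follow from the standard pinhole camera model. As a cross-check, a short sketch (Python), assuming a nadir-pointing camera over flat terrain:

```python
import math

def footprint(sensor_w_mm, sensor_h_mm, focal_mm, pixels_w, altitude_m):
    """Ground footprint and pixel resolution for a nadir-pointing camera
    (pinhole model, flat terrain assumed)."""
    fov_w = 2 * math.atan(sensor_w_mm / (2 * focal_mm))  # horizontal FOV, rad
    fov_h = 2 * math.atan(sensor_h_mm / (2 * focal_mm))  # vertical FOV, rad
    ground_w = 2 * altitude_m * math.tan(fov_w / 2)
    ground_h = 2 * altitude_m * math.tan(fov_h / 2)
    return ground_w, ground_h, ground_w / pixels_w

# Values from tab. 1: Sony ICX625 behind an 8mm lens, at 100m altitude
w, h, res = footprint(8.45, 7.07, 8.0, 2448, 100.0)
# w ~ 105.6m, h ~ 88.4m, res ~ 0.043 m/pixel, matching the quoted footprint
```

Halving the altitude argument to 50.0 reproduces the 53m x 44m footprint and 0.022m pixel resolution in the same way.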
4.3. Position and Attitude Sensors

Two additional sensor subsystems are used to provide estimates of the aerobot position and attitude. This information gives context to the captured images and is used to inform the subsequent image co-registration and DEM generation processes.

A Sparkfun RAZOR 9-DOF IMU is used to provide roll and pitch angles and tilt-compensated magnetic heading [10]. This miniature IMU has 3-axis magnetic, acceleration and gyro sensors constructed using MEMS technology, and supports custom firmware. Communication with the unit is via a serial connection, implemented by a USB-to-serial piggyback board that also provides power for the IMU. Filtered and smoothed attitude readings are read on demand by the aerobot control computer; internally, the sensors are read and processed at 100Hz.

Positional information is provided by a miniature commercial off-the-shelf GPS unit ("dongle") that plugs directly into a USB port and provides standard NMEA data to the aerobot control computer. After settling, this can provide latitude and longitude information to an accuracy of approximately 10m at best. Altitude information is also available from the GPS unit, but its accuracy is 30m at best, so it is of limited use.

4.4. Control and Interfaces

The aerobot is controlled by an onboard Fit-PC2i miniature computer [11]. This device measures 101mm x 115mm x 27mm, and has a 1.6 GHz CPU, 2GB main memory, two gigabit ethernet ports, four USB 2.0 ports, an SD card slot and built-in WiFi. The computer weighs 0.37kg and consumes 6-8W of power in normal operation. We have added a fast 60GB solid-state disk to store the operating system and captured data. The computer runs a version of the Linux operating system.

Figure 5. Aerobot control system

The aerobot computer handles all of the aerobot sensors and provides control of and data access to them for other systems. Fig. 5 shows a block diagram of the overall system. A server process directly handles the camera and sensors and offers a command and data connection using an efficient binary protocol over TCP/IP. Using a portable client library as an adapter, this protocol can be used to operate the camera and obtain sensor data from any other network-attached system.

The protocol is neutral about client location, but bandwidth limitations may restrict remote access. The raw data frames from the aerobot camera are some 5MB in size and take several seconds to transfer even over a good wireless network link; when reception conditions are poor, this may lengthen considerably.

Because of this, the usual mode of operating the aerobot is to issue high-level image sequence commands that are executed on the aerobot itself, with the images being stored in predefined locations on the solid-state disk. Sample images can then be downloaded over the wireless network for examination, and the bulk data copied from the aerobot after the imaging run has been completed.

The protocol library has been implemented in C, Java and Python on several systems. For the purposes of PRoViScout integration, a CORBA interface will be implemented to enable commanding of the aerobot from other subsystems within the project.

4.5. Power and Mass

Power for all components of the aerobot is provided by a single Lithium-Polymer (Li-Po) battery pack with a nominal voltage of 11.1V. In laboratory tests, with all components in use, the total current draw at 11.1V was between 900mA and 1100mA (a power consumption of about 10-12W). Based on these figures, a single 2200mAh battery pack should power the aerobot for up to 2 hours before needing replacement, and this estimate has been borne out by experience during field tests. To avoid the possibility of over-discharging the battery pack, aerobot flights are normally restricted to about 90 minutes duration.

The total weight of the aerobot is currently 1.2kg including all components. The net maximum payload of the currently used envelope when fully inflated with fresh helium is approx. 2kg, giving a payload mass margin of 0.8kg. This is equivalent to 7.85N of excess lifting force (though the weight of the tether and attachments will further reduce this).

Although lightweight items have been used wherever possible, the framework itself has not been optimised with respect to weight. It may be possible to replace some of the metal panels with carbon fibre composite to reduce weight. Also, the infra-red blocking filter currently fitted is larger than necessary and could be replaced with a smaller and lighter one.

However, the planned multispectral filter wheel will also add to the payload mass. This will be offset to some extent by the use of an OEM board version of the camera.

4.6. Envelope and Suspension

The envelope currently used with the aerobot is a 1.8m diameter spherical HDPE balloon, providing approx. 2kgf (approx. 20N) of lift. For the tether, 36kg (80lb) breaking-strain Spectra fishing line is used in conjunction with a speed-controlled electric winch mounted on the rover platform. This tether combines high strength with low mass, and should withstand tether forces up to approximately 350N.

The aerobot is suspended from the balloon tether using a Picavet suspension [12] connected to both ends of a lightweight rigid rod (fig. 6). This is a technique often used in kite aerial photography. Since the connection to the balloon itself is a single thread, the platform is not affected by any rotational motion of the balloon, and the Picavet suspension keeps the platform level despite changes in the angle of the tether. The platform can still be affected by horizontal swaying movements and random air turbulence.

Figure 6. Picavet suspension
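The mass, lift and endurance figures in sects. 4.5 and 4.6 can be cross-checked with simple arithmetic. The sketch below (Python) assumes sea-level air and helium densities and an illustrative envelope mass of 1.2kg; the envelope mass is an assumption for the sketch, not a measured figure from the project:

```python
import math

G = 9.81                              # m/s^2
RHO_AIR, RHO_HE = 1.225, 0.1786       # kg/m^3, sea-level values (assumed)

# Envelope: 1.8m diameter sphere -> ~3.05 m^3 of helium
volume = (4 / 3) * math.pi * 0.9 ** 3
gross_lift_kg = volume * (RHO_AIR - RHO_HE)   # ~3.2 kgf gross buoyant lift
envelope_mass = 1.2                           # kg, illustrative assumption
net_payload = gross_lift_kg - envelope_mass   # ~2 kgf net, as quoted

# Payload margin: 1.2 kg aerobot against ~2 kgf net lift
margin_kg = net_payload - 1.2
print(f"margin {margin_kg:.2f} kg = {margin_kg * G:.2f} N")

# Battery endurance: 2200 mAh pack at ~1000 mA average draw
endurance_h = 2200 / 1000
print(f"endurance ~{endurance_h:.1f} h")      # hence the 90 minute flight limit
```

With these assumed densities the numbers land close to the quoted 0.8kg (approx. 7.8N) margin and "up to 2 hours" endurance, which is consistent with the 90-minute operational restriction.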
5. FIELD TESTS
5.1. Initial Test Flight
To test the envelope, suspension and basic hardware, an
initial test flight was conducted during January 2011 at
the Aberystwyth University robotics workshop at
Llanbadarn. On a clear day with a low steady breeze,
the aerobot was launched from a fixed platform and
allowed to ascend until approximately 110m of tether
had been paid out. Tests were made of platform stability,
wireless connectivity and basic image capture. Fig. 7
shows the “first light” image from the aerobot camera.
Accurate height measurements were not available, but
from the apparent sizes of objects in the image, the
aerobot height was estimated to be around 25m.
From the initial tests, the platform stability was found to be good under static conditions, i.e. when the system was left to settle. The envelope tended to move around more during deployment and retrieval, though this was not recorded quantitatively.

It was also noted that, although the wind was light, the tether finally settled at quite a shallow angle. This meant that the anchor point (the notional rover position) was not visible in the images taken.

The image quality from the aerobot camera on this initial test was acceptable; however, the infra-red sensitivity of the camera, in conjunction with the lens being used, resulted in a loss of sharpness. The lens was designed for visible light and is not achromatic in the infra-red, so the infra-red component of the image (e.g. the white appearance of the foliage) was not as sharp as the visible component. Because of this, it was decided to add an infra-red blocking filter to the aerobot camera system, at least while this particular lens is in use.

Figure 7. "First light" image from aerobot initial test. The sheep visible in this image are approx. 1m long.

5.2. Field Test at Clarach Bay

An outdoor field test of the aerobot was scheduled for February 2011 as part of the PRoViScout project. The test was to take place at Clarach Bay beach, Aberystwyth. This location has previously been characterised as a visual Mars analogue test site [6]; a panoramic stereo reconstruction of the terrain can be viewed at [13]. For this test, the aerobot was to be tethered to the rover, which was to traverse a path along the beach, stopping at regular intervals to take a series of aerobot images. The rover was also to take a 3D laser scan of the terrain at each waypoint as a "ground truth" measurement for later comparison with the generated DEM.

Unfortunately, due to adverse weather conditions – specifically, increasingly high and gusty wind – it proved impossible to launch the aerobot safely on this occasion. After several attempts (and a couple of collisions), the aerobot was removed from the envelope and taken to the top of the nearby cliffs, about 15-20m above the beach level. Here a number of image sets were captured while the aerobot was pointed manually at the scene below.

Figure 8. High winds at Clarach Bay beach

Although several image sets were obtained, the pointing of the aerobot was not well controlled; this, combined with the difficult terrain at the site (not all sections of cliff top were safely accessible), meant that there was little or no overlap between successive image sets. This made subsequent DEM reconstruction difficult or impossible. However, a 3D laser scan of the beach was successfully captured for later use.

Figure 9. Aerobot image of Clarach Bay beach from the cliff top (~18m)
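The tether-angle observation from the initial flight in sect. 5.1 can be quantified with a little geometry. The sketch below (Python) uses a hypothetical helper and assumes a straight tether and a level, nadir-pointing camera with the across-track half-FOV from tab. 1 (approx. 27.8°):

```python
import math

def anchor_visible(tether_angle_deg, half_fov_deg=27.8):
    """True if the tether anchor point falls inside the nadir camera footprint.

    Assumes a straight tether at the given angle above horizontal, and a
    level, downward-pointing camera. With aerobot height h = L*sin(angle)
    and horizontal offset d = L*cos(angle), the anchor is imaged when
    d/h = cot(angle) is within tan(half FOV)."""
    a = math.radians(tether_angle_deg)
    return 1 / math.tan(a) <= math.tan(math.radians(half_fov_deg))

print(anchor_visible(30))   # shallow tether: anchor outside the image
print(anchor_visible(75))   # steep tether: anchor inside the image
```

Under these assumptions the anchor point only enters the footprint once the tether is steeper than roughly 62°, which is consistent with the shallow settled tether hiding the notional rover position.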
5.3. Controlled Test at Robotics Workshop

In place of the planned field test at Clarach Bay, a controlled test of aerobot image capture was conducted at the robotics workshop at Aberystwyth University. A test area was prepared with a number of imaging targets scattered over an area of sloping terrain. Some obstacles were placed within the test area, and several other natural features were also included.

The aerobot was disconnected from its envelope and tether, and was instead mounted on a cantilevered platform overlooking the test area. The height of the platform was 4-5m above ground level.

At each viewpoint, images were captured by the following sequence:

• The camera auto-exposure algorithm is run once, to set the initial shutter time. The target mean pixel value is set to 40% of maximum.

• The computed shutter time is read from the camera (the nominal shutter time).

• An image is captured at the nominal shutter time and also at factors of 0.5 and 1.5 times the nominal shutter time (exposure bracketing).

This whole sequence is repeated a number of times at each viewpoint (10 in initial trials, later reduced to 4). The combination of exposure bracketing and multiple image capture is intended to allow for movement of the aerobot platform and rapidly-changing light conditions.

Images were captured by the aerobot from a number of overlapping viewpoints above the test area, moving around a corner and back again. Altogether, images were captured from 23 different viewpoints (a total of 690 images, or 3.3GB of data). Representative captured images are shown in fig. 10. The data captured during this test was successfully used for DEM reconstruction; the methodology used for this process is outlined in section 6.

Figure 10. Captured aerobot images

6. VISION DATA PROCESSING

The aerobot is tethered to the rover platform, so it is possible to take images from multiple viewpoints with overlapping footprints (fig. 11). Images from the aerobot are rectified and co-registered to produce a consistent aerial view of the rover mission environment.

Figure 11. Aerobot image capture with overlap

The aerobot images (fig. 10) contain rough orientation parameters (position from GPS and pointing from the IMU). Together with the disparities obtained from automatic interest point matching (shown in fig. 12), these are fed into a bundle block adjustment procedure to produce refined camera positions, orientations and interest point 3D coordinates (fig. 13) [4].

Figure 12. Interest point matching between two images

A subsequent 3D reconstruction uses the disparities and image orientations for dense DEM generation (figs. 14 & 15). The DEM is used in PRoViScout as an initial map for scientific context and navigation [5]. In addition, overlaying multispectral views of the same terrain on the DEM will allow a spectral, and therefore mineralogical, analysis of the area, further informing the initial mission plan.

Figure 13. Left: Camera x-y positions; Right: Camera pose and interest points recovered in 3D.
Images © Center for Machine Perception, Czech Technical University in Prague.

Figure 14. Generated DEM as elevation map
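The geometric core of this reconstruction step, recovering a 3D point from matched image coordinates once camera poses are known, can be illustrated with a minimal two-view triangulation sketch (Python/NumPy). The camera matrices and test point below are toy values chosen to resemble the aerobot geometry (focal length of about 2318 pixels, from the 8mm lens and 3.45µm pixel pitch), not project data:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: pixel coordinates (u, v)."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenise

# Two nadir-pointing cameras with a 10m baseline, 100m above the ground
K = np.array([[2318.0, 0, 1224], [0, 2318.0, 1025], [0, 0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-10.0], [0.0], [0.0]])])

X_true = np.array([3.0, -2.0, 100.0])         # a ground point 100m below
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
X_rec = triangulate(P1, P2, x1, x2)
print(X_rec)                                  # recovers approx. [3, -2, 100]
```

The real pipeline solves a much larger joint problem (bundle block adjustment over many views and points, then dense matching), but each reconstructed DEM point ultimately rests on this two-view relation.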
Figure 15. Generated DEM rendered in 3D

7. DISCUSSION AND FUTURE WORK

The design of the aerobot is still at an early stage, though the results from the initial tests are encouraging. We have shown that the camera system and software pipeline are capable of producing a valid DEM from captured images. The aerobot control systems and interfaces are also fully functional. Nevertheless, several points present themselves for further consideration:

• The physical construction of the aerobot framework needs some attention, to reduce weight and further increase robustness in case of collisions.

• The limited stability and lifting power of the envelope used has presented problems during periods of strong wind. In order to broaden the range of weather conditions under which the aerobot can be flown, alternative envelope configurations should be considered. A larger sphere would increase lift force and hence improve the tether angle, while a more "blimp-like" envelope would allow increased stability at higher wind speeds. With both of these options there is a tradeoff of lifting power against the cost of helium to fill the envelope (which can be quite substantial).

• The current altitude measurement is inaccurate. This is not a problem for DEM generation provided that an object of known scale (for example, the rover) appears in the image sequence. If required, a more accurate altitude estimate could be obtained from a suitably calibrated barometric altimeter, or by measuring the tether angle and length with suitable instrumentation.

A major outstanding item of work is the provision of a filter wheel for the aerobot camera. Carrying a selection of broadband (visual red-green-blue) and narrowband (geology) filters, this will allow not only colour image reconstruction for human consumption, but also spectral analysis of the terrain and its classification according to detected mineralogy. As an example of the intended output, fig. 16 shows a DEM with overlaid mineralogy information produced by the CRISM instrument of NASA's Mars Reconnaissance Orbiter.

Figure 16. Mineralogy map of Nili Fossae region, Mars
Image: NASA/JPL/JHUAPL/University of Arizona/Brown University
ACKNOWLEDGEMENTS

The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013), Grant Agreement No. 241523 PRoViScout. Some images were provided by project partner Czech Technical University, Prague.
REFERENCES
[1] Paar, G., Woods, M. & the PRoViScout Team. FP7-SPACE PRoViScout – Planetary Robotics Vision Scout. Proc. 11th Symposium on Advanced Space
Technologies in Robotics and Automation,
ESA/ESTEC, The Netherlands, 12-14 April, 2011
[2] Barnes, D., Shaw, A., Summers, P., Ward, R., Woods, M., Evans, M., Paar, G., Sims, M. Autonomous Image Based Localisation for a Martian Aerobot. ISPRS Symposium "From Sensors to Imagery", Paris, July 3-6, 2006.
[3] Honary, E., McQuade, F., Ward, R., Woodrow, I.,
Shaw, A., Barnes, D., Fyfe, M. Robotic
experiments with cooperative aerobots and
underwater swarms, Robotica, Vol. 27, Issue 01, pp
37-49, 2009.
[4] Jancosek, M. and Pajdla, T. Hallucination-free Multi-View Stereo, RMLE 2010 (ECCV workshop), Hersonissos, Heraklion, Crete, Greece, September 5–11, 2010.
[5] Woods, M., Shaw, A., Rendell, P., Long, D., Paar,
G. High-Level Autonomy and Image prioritisation
for long distance Mars Rovers. Proc. 11th
Symposium on Advanced Space Technologies in
Robotics and Automation, ESA/ESTEC, The
Netherlands, 12-14 April, 2011
[6] Pullan, D. Robotic Geology Assessment of Clarach Bay, Aberystwyth, UK. Technical Reference, PRoViScout Project, July 2010.
WEB REFERENCES
Retrieved on 2011-04-01

[7] PRoViScout official project web page:
http://www.proviscout.eu
[8] robuCAR-TT mobile platform:
http://www.robosoft.fr/eng/sous_categorie.php?id=1025
[9] Prosilica GC2450 GigE camera:
http://www.alliedvisiontec.com/emea/products/cameras/gigabit-ethernet/prosilica-gc/gc2450.html
[10] Sparkfun RAZOR 9-DOF miniature IMU:
http://www.sparkfun.com/products/9623
[11] Fit-PC2i miniature computer:
http://www.fit-pc.com/web/
[12] Picavet suspension system:
http://en.wikipedia.org/wiki/Kite_aerial_photography#Picavet_suspension
[13] Rendered video fly-through of Clarach Bay stereo
panorama reconstruction:
http://www.youtube.com/watch?v=6gRo8QSXX5c