euRathlon 2013
Scenario Application Paper (SAP) – Review Sheet
Team/Robot: ARTOR
Scenario: Autonomous Navigation
For each of the following aspects, especially concerning the team's approach to scenario-specific challenges, please give a short comment on whether it is covered adequately in the SAP.
Keep in mind that this evaluation, albeit anonymized, will be published online; private comments to the
organizers should be sent separately.
Robot Hardware
The SAP gives a thorough description of the ARTOR robot, a 6-wheeled, skid-steered robot based on the commercial LandShark platform.
Processing
High-level processing is undertaken on a laptop running Linux and ROS: a very reasonable approach, and it is good to see the use of ROS.
Communication
Communications provision on the ARTOR robot is very comprehensive, with three separate wireless systems: short-range WiFi, UMTS for global internet connectivity, and COFDM for video/data transmission.
Localization
ARTOR's localisation is based on the Velodyne LIDAR plus GPS, although the SAP describes good results from iterative closest point matching (ICP) without GPS.
Sensing
The robot is very well equipped with sensors, including 3D and 2D laser scanners, stereo
and mono camera systems, an IMU and a GPS receiver.
Vehicle Control
The SAP provides a nice block-diagram level overview of the system control, and good descriptions of the team's approaches to autonomous localization, mapping and navigation.
System Readiness
The SAP makes it clear that although the ICP-based mapping and localization has been tested and achieves promising results with dense waypoints, the team expects the euRathlon navigation scenario to be more demanding, and work is ongoing to develop a motion planner between waypoints for 3D terrain.
Overall Adequacy to Scenario-Specific Challenges
This is a clear and thorough SAP. Our assessment is that the very comprehensively equipped ARTOR robot is certainly adequate to meet the autonomous navigation challenge; its actual performance will depend on the success of the team's work on extending the autonomous control for sparse waypoints in 3D terrain.
Autonomous Navigation using GPS waypoints
euRathlon Scenario Application Paper
Team ARTOR
Philipp Krüsi
Autonomous Systems Lab, ETH Zürich, Switzerland
e-mail: [email protected]
Date: June 13, 2013
Abstract
ARTOR is a rugged, electrically powered all-terrain robot equipped with laser range finders, several cameras, an IMU and GPS. All processes running on the robot are implemented as ROS [1] nodes, which results in a highly modular system. Our navigation software enables the robot to localize itself and build three-dimensional maps of the environment even if no GPS signal is available. At the core of our software is a highly accurate and robust mapping and localization module based on iterative closest point matching (ICP), using data from a spinning 3D laser scanner. Together with an obstacle detection technique and a motion planner, this allows us to navigate autonomously over large distances in dynamic and changing environments, as well as in rough, unstructured, three-dimensional terrain.

1. Introduction
Team ARTOR is a collaboration between the Autonomous Systems Lab [2] (ASL) at ETH Zürich, RUAG Defence [3] and armasuisse W+T [4]. The team is composed of PhD students and postdocs at ASL and of technical staff at RUAG Defence, under the leadership of Philipp Krüsi (PhD student, ETH/ASL) and Dr. Thomas Nussbaumer (Head of the armasuisse research program UGV, RUAG Defence). We successfully participated in the ELROB competition in 2012, where our robot autonomously drove over several kilometers in different scenarios.

Our robot ARTOR (Autonomous Rough Terrain Outdoor Robot) is a 6-wheeled, skid-steered electric vehicle. An array of on-board sensors is used for monitoring the robot's state and gathering information about the environment for on-line mapping, localization and obstacle avoidance. The equipment includes a rotating 3D laser scanner, two 2D laser scanners, a stereo camera, a GPS receiver and an inertial measurement unit (IMU). Furthermore, a pan-tilt-zoom unit containing both a visual and a thermal camera is installed. All data processing for autonomous navigation, including mapping, localization, path planning, obstacle avoidance and motion control, is performed on the on-board computer, using the robot operating system ROS.

The remainder of this paper is organized as follows. Section 2 describes our vehicle and all the installed sensors, computers and communication systems in detail. In Section 3, an overview of the entire robotic system and the navigation software architecture is given, followed by a description of our localization and mapping techniques and, finally, an outline of our approach to autonomous navigation. Last but not least, we briefly discuss the readiness of the system in Section 4 and mention the most important developments that remain to be done for the euRathlon competition.

[1] Robot Operating System, http://www.ros.org
[2] http://www.asl.ethz.ch
[3] http://www.ruag.com/en/Defence/Defence_Home/About_us
[4] http://www.ar.admin.ch/internet/armasuisse/de/home/themen/armasuisseWissenschaftundTechnologie/Forschungsmanagement/technologiemonitoring/forschungsprogramm4.html
2. Vehicle Description
ARTOR is a robotic vehicle capable of driving in rough terrain and at relatively high velocities. The maximum speed of the platform is around 3.5 m/s, but it is typically driven at around 1 m/s. It is based on a LandShark, a 6-wheeled, skid-steered electric vehicle built by the American company Black-I Robotics [5]. The LandShark's front axle is raised by a few centimeters compared to the middle and rear axles, which reduces friction forces in tight turns and enhances the robot's climbing capabilities. The platform is well suited for driving on grass, gravel, or paved surfaces, even in the presence of steep slopes. However, it is not able to climb stairs or to overcome obstacles that are higher than 10 cm. All three wheels on either side are connected by a chain, and thus driven at the same speed at all times. The vehicle is powered by two electric motors, one for the left side and one for the right side. As the LandShark does not have any steerable axles, steering is accomplished by driving the left and right wheels at different speeds. While offering a reasonable amount of space for additional payload (such as sensors), the LandShark is still small enough to fit through standard doors.

Autonomous operation of the robot requires an array of sensors for perception of the environment and for monitoring the vehicle's state. The LandShark in its basic configuration is equipped with two cameras as the only exteroceptive sensors. We therefore added a series of sensors to the base vehicle. On top, a 3D laser scanner is mounted for accurate perception of obstacles and the terrain shape. Two 2D laser scanners – one at the front, one at the back – can be used for navigation in flat environments and for obstacle avoidance. A stereo camera at the front and a monocular camera at the back provide valuable information for localization and mapping. Furthermore, the vehicle is equipped with an Inertial Measurement Unit (IMU), and with a GPS receiver for global localization or ground-truth collection. Last but not least, a pan-tilt-zoom unit is mounted on the robot, containing both a visual and a thermal camera. It is primarily used for providing information about the environment to the operator when the robot is out of sight. The specifications of the most important sensors installed on ARTOR are listed below.

[5] http://www.blackirobotics.com

Figure 1: ARTOR (Autonomous Rough Terrain Outdoor Robot): a 6-wheeled, skid-steered robot, equipped with an array of sensors for perception of the environment and equipment for communication with a base station.
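The skid-steer principle described above can be sketched in a few lines: the desired forward speed and turn rate are realized purely through a left/right wheel-speed difference. This is an illustrative sketch, not the team's controller; the track width here is a placeholder value, not a measured ARTOR parameter.

```python
# Differential (skid-steer) drive sketch: a desired body velocity is realized
# purely by a speed difference between the left and right wheel chains.
TRACK_WIDTH = 0.62     # m, hypothetical distance between wheel centerlines
MAX_WHEEL_SPEED = 3.5  # m/s, platform maximum from the vehicle description

def skid_steer(v, omega):
    """Map linear velocity v (m/s) and yaw rate omega (rad/s)
    to (left, right) wheel speeds, clamped to the platform limit."""
    left = v - omega * TRACK_WIDTH / 2.0
    right = v + omega * TRACK_WIDTH / 2.0
    clamp = lambda s: max(-MAX_WHEEL_SPEED, min(MAX_WHEEL_SPEED, s))
    return clamp(left), clamp(right)

# Driving straight at the typical 1 m/s: both sides run at the same speed.
print(skid_steer(1.0, 0.0))  # (1.0, 1.0)
# Turning in place: the two sides spin in opposite directions,
# roughly (-0.31, 0.31) for 1 rad/s.
print(skid_steer(0.0, 1.0))
```

Because the wheels on each side are chained together, a single speed per side fully determines the robot's motion, which is why no steerable axle is needed.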
3D laser scanner (Velodyne HDL-32E)
• 32 laser/detector pairs
• field of view: +10° to -30° vertical, 360° horizontal
• frame rate: 10 Hz
• range: 5 cm to 100 m
• output: up to 800'000 points/s

2D laser scanner (Sick LMS151)
• field of view: 270°
• scanning frequency: 50 Hz
• range: 0.5 m to 50 m

Stereo camera (front) (Point Grey Bumblebee2)
• 2 color CCD sensors
• resolution: 1024 x 768 pixels
• baseline: 120 mm
• frame rate: 20 fps
• field of view: 97°
Monocular camera (rear) (AVT Stingray F-046C)
• color CCD sensor
• resolution: 780 x 580 pixels
• max. frame rate: 61 fps

Inertial Measurement Unit (Xsens MTi)
• output: 3D linear acceleration, 3D angular velocity, 3D magnetic field measurement, 3D orientation
• max. update rate: 512 Hz

GPS receiver (Trimble Pathfinder ProXT)
• update rate: 1 Hz
• additional external antenna
All data processing is executed on board, on a laptop featuring an Intel Core i7 processor running Ubuntu Linux and the Robot Operating System ROS. The system is designed such that during autonomous navigation it can operate without a permanent connection to the ground station.

Three different communication systems are installed on ARTOR and used depending on the specific application. We have a standard 802.11 Wi-Fi interface, which enables high-bandwidth communication over short distances. Moreover, a UMTS modem is used to transmit sensor data, such as camera images, to a server on the internet, which can be accessed from anywhere in the world. Last but not least, our robot is equipped with a COFDM video/data transmission system with a high range even without line of sight.

A separate communication channel is used for the wireless emergency stop system. This system is intended to provide the operator with full control over the robot at all times. Power on the robot is immediately shut down as soon as either the connection to the e-stop sender is lost or the operator pushes the button on the sender. Furthermore, the emergency brakes on the vehicle are automatically engaged whenever power is out.

Instead of attaching all the sensory and computing equipment directly to the platform, we built a modular system consisting of an exchangeable box on top of the robot, which is connected to the base vehicle via a standardized interface (power: 24 V DC, data: Ethernet). All sensors, computers, and communication equipment are attached to or mounted inside the box. As a consequence, the set of sensors and computers can quickly be replaced by a different one by simply exchanging the box on top of the robot, depending on the particular operation purpose. Moreover, the box protects all the sensitive parts of the hardware from water and dust. The entire robot is splash-proof, and in experiments it has been shown to be capable of driving in the rain for several hours without any damage. Table 1 summarizes the most important specifications of ARTOR in its current configuration.
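The fail-safe e-stop rule above can be captured as a small truth table. This is a hypothetical sketch of the described behavior, not the Hetronic firmware: power is cut when the link to the sender is lost or the stop button is pressed, and the brakes engage whenever power is out.

```python
# Illustrative sketch of the e-stop rule (hypothetical code, not the actual
# Hetronic MINI V1 logic): power stays on only while the radio link is alive
# and the stop button is not pressed; brakes engage whenever power is out.
def estop_state(link_alive, button_pressed):
    """Return (power_on, brakes_engaged) for the current e-stop inputs."""
    power_on = link_alive and not button_pressed
    brakes_engaged = not power_on  # brakes engage automatically on power loss
    return power_on, brakes_engaged

print(estop_state(True, False))   # (True, False)  normal operation
print(estop_state(False, False))  # (False, True)  link lost -> safe halt
print(estop_state(True, True))    # (False, True)  operator stop -> safe halt
```

The design is fail-safe because the dangerous state (power on, brakes off) is only reachable while the link is actively confirmed and the button is released.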
Dimensions: length 140 cm, width 75 cm, height 125 cm
Weight: 330 kg
Drivetrain: 6 wheels (15 inch), skid-steered, raised front axle
Propulsion: 2 brushed DC motors, 3 lead-acid batteries (55 Ah, 24 V)
Maximum speed: 3.5 m/s
Endurance: 2-3 h
Ground clearance: 14 cm
Climbing performance: 30°
Emergency stop system: Hetronic MINI V1, operating range 50-100 m

Table 1: Specifications of ARTOR
3. Autonomous Operations
3.1 System Overview
Figure 2 shows a schematic overview of ARTOR's setup in terms of computing, sensing and communication. All computation for autonomous navigation, including sensor data processing, localization, mapping, path planning and motion control, is performed on the on-board computer. In the current setting, this computer is a laptop with a quad-core Intel i7 processor. A similar computer is used as the command station, which enables the operator to remotely control the robot (manual teleoperation, setting waypoints, starting/ending autonomous navigation, adjusting settings, etc.) and at the same time to supervise the vehicle's state, e.g. through video streams from on-board cameras.

We use the Robot Operating System ROS on both the on-board computer and the command station as a framework for all processing. ROS is an open-source meta-operating system ("middleware"), providing functionalities such as hardware abstraction, low-level device control, and message passing between processes. All processes running on the robot's computer and on the command station are implemented as ROS nodes, and communicate with other nodes using ROS messages which are published on designated ROS topics.
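The node/topic pattern can be illustrated with a minimal, self-contained publish/subscribe sketch. This is plain Python mimicking the pattern, not the actual ROS client library, and the topic name is only an example:

```python
# Minimal in-process broker mimicking the ROS topic pattern (illustration
# only; real nodes would use the rospy/roscpp client libraries and
# inter-process transport).
from collections import defaultdict

class TopicBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every callback registered on this topic.
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
received = []
# A "planner node" subscribes to a lidar topic (example name)...
bus.subscribe("/velodyne_points", received.append)
# ...and a "driver node" publishes a message on that topic.
bus.publish("/velodyne_points", {"points": 3})
print(received)  # [{'points': 3}]
```

The key property this models is decoupling: the publisher does not know which nodes consume its messages, which is what makes the ROS-based system modular.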
Figure 2: Schematic description of ARTOR's data processing and control setup. The high-level computer on the vehicle processes sensor data and commands given by the user via the command station, and controls the robot's motions. The operator can shut down the robot at any time using the emergency stop system, which will bring the vehicle to a safe halt.
3.2 Localization and Mapping
Our localization and mapping system is based on iterative closest point matching (ICP) and uses the 3D point clouds from the Velodyne lidar. This sensor has the advantage of providing high-range, high-density and high-frequency measurements in 3D, independent of ambient lighting. We build a topological/metric map of the environment, a so-called pose graph, which is an undirected graph of coordinate frames with attached metric submaps, connected by relative coordinate transformations. The metric submaps are three-dimensional point clouds. ICP [6] is used for 6D localization within a known map. When an area is explored for the first time, the system builds a new map by first localizing with respect to the existing part of the map and then inserting the points into the current submap.

We have extensively tested our mapping and localization system in long-range route-following experiments, where we first taught the robot a route and then let it repeat this path autonomously. These experiments have been done both in dynamic urban environments and in rough, unstructured, 3D terrain. Although we do not aim at building globally consistent metric maps (our maps only have to be locally accurate to enable navigation along the graph), we have observed that our system exhibits only little drift even over large distances: typically the errors in (x, y) are below 0.5% of the distance traveled.

[6] We use the open source registration library libpointmatcher (https://github.com/ethz-asl/libpointmatcher)

Figure 3: A map recorded during an experiment near the ETH campus in Zurich. The width of the map is around 500 m. Note the high consistency of the map, which we achieve without loop closure and without any external global localization, such as GPS. The map is built using only the 3D lidar data and ICP.
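The core ICP idea — iteratively match each point to its nearest neighbor in the reference cloud and update the transform from the residuals — can be shown in a deliberately simplified form. This toy estimates only a 2D translation; the real system performs full 6D registration on dense Velodyne clouds via libpointmatcher.

```python
# Toy, translation-only ICP in 2D (hypothetical illustration; the actual
# system registers 3D point clouds with a full 6D transform).
def icp_translation(reference, reading, iterations=20):
    """Estimate the 2D translation aligning `reading` to `reference`.
    Each iteration matches every reading point to its nearest reference
    point and shifts the reading by the mean residual."""
    tx, ty = 0.0, 0.0
    for _ in range(iterations):
        dx_sum = dy_sum = 0.0
        for (px, py) in reading:
            # Nearest-neighbor correspondence under the current transform.
            qx, qy = min(reference,
                         key=lambda q: (q[0] - (px + tx)) ** 2
                                     + (q[1] - (py + ty)) ** 2)
            dx_sum += qx - (px + tx)
            dy_sum += qy - (py + ty)
        # Update the transform by the mean matched residual.
        tx += dx_sum / len(reading)
        ty += dy_sum / len(reading)
    return tx, ty

ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
# The same points shifted by (0.5, -0.25): ICP should recover that offset.
moved = [(x - 0.5, y + 0.25) for (x, y) in ref]
print(icp_translation(ref, moved))  # close to (0.5, -0.25)
```

Real ICP additionally estimates rotation, filters outlier matches, and uses spatial data structures for the nearest-neighbor search, but the match-then-update loop is the same.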
3.3 Autonomous Navigation (waypoint following)
In the “Autonomous Navigation” scenario at euRathlon, our robot will have to navigate along a series of GPS waypoints. In addition to localization and mapping, this requires the ability to distinguish drivable from untraversable terrain, as well as motion planning. Our navigation system uses the mapping and localization module presented above, together with an obstacle detection algorithm and a motion planner. The task of the latter is to find a path from the robot's current location to the next waypoint, using the map built by the mapping module. When following this path, we use an obstacle detection algorithm based on the 3D lidar data to react to dynamic and static obstacles appearing on the route. We are able to detect both positive and negative obstacles with high reliability, but only in the vicinity of the robot. Since our sensor has a limited range, and parts of the environment might be obstructed and not visible in the beginning, it cannot be guaranteed that all parts of the path generated by the motion planner are actually feasible. In many cases, unforeseen obstacles are easy to avoid by slightly deviating from the originally planned path. However, this is not possible when the entire path is blocked, for example in a cul-de-sac. In such cases we can exploit the route-following capabilities of our system: the robot is told to retrace its own path until it reaches a place where the planner is able to find an alternative route to the next waypoint.
4. System Readiness
We have tested our ICP-based mapping and localization system in several long-range field experiments and achieved very promising results in both urban and unstructured terrain. Moreover, we have integrated and tested an obstacle detection module and a local motion planner, which allow for fully autonomous, adaptive route following even in dynamic and changing environments. However, this system requires an input reference path consisting of dense waypoints, such that the straight-line connection between two waypoints is in general guaranteed to be drivable (obstacles might appear and can be avoided, but no dead ends etc.). We do not expect the waypoints at euRathlon to be equally dense, and we assume that the robot will have to move in rough, three-dimensional terrain. Consequently, we are currently working on a motion planner that is able to compute a feasible path between two waypoints in 3D terrain, based on the maps we construct from the lidar point clouds. As outlined in Section 3.3, this planner is going to be a central element of our approach to autonomous waypoint following.
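The "dense waypoints" assumption — that the straight segment between consecutive waypoints is drivable — can itself be checked mechanically. A hypothetical sketch, assuming a 2-D traversability grid derived from the lidar maps (the real open problem, planning through 3-D terrain when this check fails, is what the planner under development addresses):

```python
# Illustrative check of the dense-waypoint assumption (hypothetical code):
# sample the straight segment between two waypoints and verify every sample
# falls on a drivable cell of a 2-D traversability grid.
def segment_drivable(a, b, drivable, samples=20):
    """True if every sampled point on segment a-b maps to a drivable cell."""
    for i in range(samples + 1):
        t = i / samples
        x = a[0] + t * (b[0] - a[0])
        y = a[1] + t * (b[1] - a[1])
        if (int(round(x)), int(round(y))) not in drivable:
            return False
    return True

# A short corridor of drivable cells along the x axis.
grid = {(0, 0), (1, 0), (2, 0)}
print(segment_drivable((0, 0), (2, 0), grid))  # True: stays in the corridor
print(segment_drivable((0, 0), (3, 0), grid))  # False: leaves the mapped area
```

When such a check fails for sparse waypoints, a straight-line follower is no longer sufficient and a genuine between-waypoint planner is required, which motivates the ongoing work described above.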