Minibot III - Autonomous Terrain Mapping Robot

University of Southampton
Faculty of Engineering, Science and Mathematics
School of Engineering Sciences
A Group Design Project report submitted for the award of
MEng Aerospace Engineering,
MEng Electronic Engineering
and
MEng Mechanical Engineering
Minibot III: Autonomous Terrain Mapping Robot
Edward Gray
Steve McDowell
Laura McGurk
Neil Thompson
Iain Vowles
Project supervisor: Prof. Sandor Veres
Project examiner: Dr. Suleiman Abu-Sharkh
Abstract
The Minibot project has been running at Southampton for a number of years. The challenge
this year was to develop, from scratch, an autonomous terrain-mapping robot with a robust
mechanical base, suitable for expansion by future teams. As an autonomous robot, the
Minibot had to be capable of operating independently of any human user for extended
periods of time.

The developed robot is equipped with dual webcams, the images from which are processed
by a stereo matching algorithm implemented in MATLAB. A two-dimensional map is
generated, around which the robot navigates by means of a stochastic, greedy algorithm, also
written in MATLAB. The robot's stereo vision and navigation functions are supported by
lower-level sensing functionality: an ultrasonic rangefinder for collision avoidance and
photo reflector ICs used to perform odometry.
Acknowledgements
The team wishes to thank the following people for their help and support during the project:
Professor Sandor Veres (Project supervisor)
Mr Peter Malson (Second supervisor)
Dr Suleiman Abu-Sharkh (Project examiner)
Mr Tim Forcer (ECS laboratory support)
Brian Newell (EDMC stores manager)
Table of Contents
1 Introduction
  1.1 Background
  1.2 Aims and Objectives
  1.3 Project Management
    1.3.1 Project Schedule
    1.3.2 Project meetings
    1.3.3 Group website
    1.3.4 Project Budget
2 Modular Design Approach
  2.1 System Design
    2.1.1 Whole System
    2.1.2 Stereo Vision
    2.1.3 Navigation System
    2.1.4 Module 1: Motor Control and Dead Reckoning
    2.1.5 Module 2: Collision Detection System
    2.1.6 Chassis Design
3 Interface Communication
  3.1 RS-232
    3.1.1 Introduction
    3.1.2 Interfacing electronic circuits
  3.2 I2C Bus
    3.2.1 Introduction
    3.2.2 PIC Microcontroller
  3.3 Wireless Fidelity (Wi-Fi)
    3.3.1 Wi-Fi Standards
    3.3.2 Wi-Fi Configuration and Use
4 Terrain Mapping
  4.1 Terrain mapping sensors
    4.1.1 Tactile Sensors
    4.1.2 Ultrasonic Sensors
    4.1.3 Infra Red (IR) Sensors
    4.1.4 Radar
    4.1.5 Laser Rangefinder
    4.1.6 Stereo Vision
    4.1.7 Sensor selection
  4.2 Stereo Vision Systems
    4.2.1 Camera Selection
    4.2.2 Image size
    4.2.3 Camera Type
  4.3 Logitech Quickcam Messenger
    4.3.1 Camera Properties
    4.3.2 Field of view
    4.3.3 Focal distance
5 Stereo Vision
  5.1 Background
    5.1.1 Depth estimation
    5.1.2 Epipolar geometry
  5.2 Camera Setup
    5.2.1 Arrangement
    5.2.2 Positioning
    5.2.3 Alignment
    5.2.4 Epipolar considerations
    5.2.5 Separation
    5.2.6 Calibration
  5.3 Candidate Regions
    5.3.1 True limits
    5.3.2 Artificial limits
  5.4 Stereo Matching
    5.4.1 Colour vs. Greyscale
    5.4.2 Matching techniques
    5.4.3 Intensity values
    5.4.4 Intensity gradient
    5.4.5 Tiling
    5.4.6 Image pyramiding
    5.4.7 Filtering
  5.5 Programme Implementation
    5.5.1 Intensity correction
    5.5.2 Noise removal
    5.5.3 Time-averaging
    5.5.4 Image adjustment
    5.5.5 Resolution
    5.5.6 Matrix calculation
    5.5.7 Parameter values
  5.6 Performance
    5.6.1 Execution Time
    5.6.2 Evaluation
  5.7 Disparity Transformation
6 Navigation and Control
  6.1 Map Construction
    6.1.1 Discussion of mapping techniques
    6.1.2 Employed Solution
    6.1.3 Method
  6.2 Navigation and Exploration Strategies
  6.3 Robot Control Programme
    6.3.1 Initiation
    6.3.2 Get Sensor Data
    6.3.3 Update Map
    6.3.4 Route Selection
    6.3.5 Output Commands
    6.3.6 Update Robot Location
    6.3.7 Display Current Status
    6.3.8 Loop and Exit Conditions
    6.3.9 Termination Commands
7 Low Level System Design
  7.1 Requirements of the low level systems
  7.2 Module 1
    7.2.1 Dead Reckoning sensors
    7.2.2 Motor Controllers
    7.2.3 Design Decisions of Module 1
  7.3 Module 2
    7.3.1 Design Decisions of Module 2
    7.3.2 Testing of the range finder
  7.4 Low Level System Testing
    7.4.1 Testing Dead Reckoning Sensors
    7.4.2 Dead Reckoning Results
    7.4.3 Summary of dead reckoning results
    7.4.4 Reaction of Module 2 to an object
    7.4.5 Reaction of Module 1 to an object
    7.4.6 Complete response reaction time
    7.4.7 Reaction to Dead Reckoning System
    7.4.8 Conclusion of low level system testing
8 The Chassis
  8.1 Chassis Requirements
  8.2 The Chosen Chassis
  8.3 The Motors
    8.3.1 Motor Housing
    8.3.2 Motor Noise
    8.3.3 Motor Calibration
    8.3.4 Motor Loading Calculations
  8.4 Battery Selection
    8.4.1 Motor Batteries
    8.4.2 Computer Battery
  8.5 The Motor Controllers
    8.5.1 Programming the Motor Controllers
  8.6 Mounting Components
    8.6.1 Chassis Layout
    8.6.2 Mounting the Cameras
    8.6.3 Mounting the Dead Reckoning System
    8.6.4 Mounting the SRF08 Range Finder
    8.6.5 Mounting the Batteries
    8.6.6 Mounting of the Motherboard and Hard Drive
    8.6.7 Mounting the Software Monitoring Display
  8.7 Chassis and Mounting Conclusion
9 Robot Computer
  9.1 Choice of Platform
    9.1.1 Onboard or Remote Computing
    9.1.2 Expandability and Cost
    9.1.3 Level of Abstraction
  9.2 Component Selection
    9.2.1 Motherboard
    9.2.2 Mass Storage Device
    9.2.3 Processor
    9.2.4 RAM (Random Access Memory)
    9.2.5 PC Power Supply
  9.3 Operating System
  9.4 Remote Control
    9.4.1 Wireless Communication
    9.4.2 Remote Access Software
  9.5 Complete Onboard PC Specification
  9.6 Robot Computer Summary
10 Conclusion
11 Recommendations
12 References
13 Appendix A: Disparity-depth equations
  13.1 Depth estimation
  13.2 Redefining projected image location
14 Appendix B: Motor Wiring
15 Appendix C: Module 1 & 2 Circuit Diagram
16 Appendix D: Layout of Chassis – Layer 1
17 Appendix E: Layout of Chassis – Layer 2
1 Introduction
1.1 Background
Since the early-to-mid 20th century, robots have played an increasingly significant role in
engineering and in society as a whole. In 1962 General Motors installed the first commercial
robot on a production line, just three years after the Artificial Intelligence Laboratory was
founded at the Massachusetts Institute of Technology. The technology reached Japan in 1967,
and during the 1970s robots were developed with the ability to navigate first structured, and
later unstructured, environments. Until recently the primary use of robotics has remained
within manufacturing, where industrial robots can relieve humans of highly repetitive or
dangerous tasks. Introducing robots onto a production line can increase quality and
efficiency, and help to drive down unit production costs.

Present technology is such that robots can be cheap and reliable, and available to large and
small enterprises and even the home user. Indeed, small biomorphic and humanoid robots,
such as the Robosapien and Sony's Aibo, have recently emerged as market-leading toys.
The most significant current developments in the field of robotics have been those of
autonomous robots. An autonomous robot is distinguished by its ability to act according to
external stimuli without human guidance. A fully autonomous robot has the ability to obtain
information about its environment, adjust its strategy based on its findings and navigate
without human assistance, all while avoiding situations that may be harmful to humans or the
robot itself.
A basic level of autonomy can be seen in recently released commercial vacuum cleaners and
lawn mowers, which employ navigation strategies ranging from sophisticated algorithms to
simple random motion, while avoiding collisions with obstacles. At the other extreme, the
most sophisticated autonomous robot systems are found in military and space applications. It
is a goal of the US Department of Defence that "by 2015, one-third of operational ground
combat vehicles are unmanned." To this end, Unmanned Air Vehicles (UAVs) and their
combat equivalents (UCAVs) have already been successfully employed for both surveillance
and combat missions, in which the aircraft can stay airborne in excess of 24 hours while
posing no danger to pilots' lives. A high degree of autonomy is especially useful in space exploration
due to communication delays and interruptions. The Mars Rover vehicles can distinguish safe
and unsafe areas of the planet's surface and compute optimum routes for navigation. Such
behaviour is essential to the vehicles' operation, as radio commands sent from Earth take 20
minutes to arrive at the planet.
Research into autonomous robotic systems is currently very healthy, and much of it is driven
by academic institutions and competitions. Previous Minibot projects at the University of
Southampton have aimed to develop robots that meet the performance criteria of the UK
National Micromouse Competition [36] and the Trinity College Fire-Fighting Home Robot
Contest [37], which are just two of many competitions that take place annually. The highly
publicised DARPA Grand Challenge [33] is perhaps leading the way in pushing the
boundaries of autonomous performance. Robots are required to complete a 175 mile course
over desert terrain featuring both natural and man-made obstacles within 10 hours. Guided by
GPS waypoints, the robot may receive no human intervention for the duration of the journey.
To date the challenge has not been met, with the best competitor only completing 7.5 miles of
the course. This illustrates the considerable amount of work yet to be undertaken in the field
of autonomous robotics.
Sources [27], [28], [29], [30], [31], [32], [34] & [35].
1.2 Aims and Objectives
The primary aim of this project is to:

• Design and develop an autonomous vehicle, or robot, with the ability to explore and
map an unfamiliar environment, i.e. one of which the robot has no previous
knowledge.

Furthermore, the project is intended to achieve the following objectives:

• Develop a robust, flexible and expandable mechanical platform that can support all
other sub-systems necessary for the robot to operate.

• Develop an intelligent "control-centre", in a common programming language, that
can translate sensor inputs into coherent movement commands sufficient to achieve
the primary project aim (above).

• The completed robot should be able to generate a map of sufficient integrity both to
allow the vehicle to navigate thoroughly but safely (i.e. avoiding collisions without
leaving areas unexplored) and to allow human recognition of features in the
completed map. The completed map should be accessible through a human, preferably
graphical, interface.

• All robot sub-systems should be integrated to allow coherent operation. The finished
design should allow for easy upgrading and modification in the future.

As a secondary aim, this project should develop the engineering reasoning and
problem-solving skills of all project team members, and further their depth of knowledge in
areas relevant to each individual's degree programme. Team members should furthermore
develop the leadership and team-working skills necessary to be successful in their future
careers, whether or not in the engineering industry.
This project will at all times adhere to the code of conduct of the University of Southampton
and the School of Engineering Sciences.
1.3 Project Management
The project team consisted of five members from three different degree specialisations (see
Table 1.1). This broad spectrum of engineering backgrounds ensured the project could move
in most directions without help from a third party. The team nominated one leader, who was
responsible for the overall coordination of the project, including report writing and
presentations.
Name               Degree Programme          Contribution
Edward Gray        Aerospace Engineering     Treasurer; Map Construction & Navigation System
Steven McDowell    Aerospace Engineering     Design and assembly of robot's chassis
Laura McGurk       Electronic Engineering    Robot's IT suite
Neil Thompson      Mechanical Engineering    Website designer; Stereo Vision system
Iain Vowles        Electronic Engineering    Team Coordinator; Sensor monitoring; Motor Control; Compiled project report

Table 1.1: Project team and responsibilities
Although each individual's background primarily decided the area in which they worked, the
work was often shared and overseen by other members of the group. On a project of this
scale, decisions were often made using techniques learnt through management courses: the
ideal solution was not always the practical or affordable one, and quite often an element of
compromise was required.
1.3.1 Project Schedule
A Gantt Chart was produced to outline the project timescale. It indicated all tasks and
highlighted milestones and deadlines (See Table 1.2). The complete chart can be found on the
project CD.
Description                            Date
Submission of project proposal         22/10/04
Project Review                         17/12/04
Christmas Vacation (Lab closure)       17/12/04 – 10/01/05
Semester 1 Examination Period          17/01/05 – 04/02/05
Easter Vacation (limited lab usage)    18/03/05 – 15/04/05
Final Project Hand-in                  25/04/05

Table 1.2: Project milestones
1.3.2 Project meetings
Project meetings were held biweekly with Prof. Veres, with additional meetings arranged as
required. Two additional progress review meetings were held with the second examiner: one
in December 2004 and one in March 2005 (minutes of these meetings can be found on the
project CD).

One day each week was allocated specifically to GDP lab work. This allowed people to work
individually or in small groups throughout the rest of the week, while providing one day on
which ideas could be shared and discussed. This ensured the group worked coherently and
towards the same goal.
1.3.3 Group website
The group constructed and maintained a project website, hosted on the university home file
store (see Figure 1.1). It served to inform third parties about the project, as well as providing
a central location for all group members to share information. In particular, the website was
used to overlay the members' timetables, which aided the planning of meetings and the
scheduling of work. A copy of the website can be found on the project CD.
Figure 1.1: Project Website
1.3.4 Project Budget
The group nominated one person to act as treasurer and central point for all purchases. This
allowed the group to continually monitor all purchases, ensuring the correct buying
procedures were followed and that the group stayed within budget.
Owing to an incident last year, in which Minibot II and its components were stolen, this
year's group had to reacquire and redesign all aspects of the project. Due to the high price of
many components, the group had no option but to request access to additional funding, which
was granted by Prof. Veres. This funding was used to pay for particularly expensive
components such as the robot chassis.
Minibot III was allocated £700 (£300 plus £80 per person) of general funding. Table 1.3
summarises the spending of this budget; it does not include the additional costs associated
with the robot chassis, as these came from a different source.
Date        Item                            Cost (£)
17/11/04    SRF08 Range Finder                30.49
08/12/04    PC Components                     36.85
14/12/04    PC Components                    253.88
28/01/05    PC Components                     79.53
18/02/05    Motor Batteries and Chargers      61.54
21/05/05    PC Motherboard                    43.54
23/02/05    Sheet Metal                      103.40
23/02/05    SRF08 Range Finder Mount          15.15
26/02/05    B&Q Hardware                      11.16
09/03/05    B&Q Hardware                       9.20
15/03/05    B&Q Hardware                       2.49
15/03/05    Motherboard Battery               37.98
16/03/05    PC Cables                         11.10
            Total                            696.31

Table 1.3: GDP Budget Spending
2 Modular Design Approach
When tackling a large and complicated task it is essential to break the project down into
distinct areas, define the requirements of each section and identify the inputs and outputs of
each module. This enables a modular design approach to be taken: a technique employed by
most engineering businesses, where work is distributed around the company or outsourced.

Each module has not only been designed to meet its requirements but has also been developed
in such a way that future groups can easily continue the work. This applies to everything from
the layout of the hardware to the choice of software tools and conformance to standards in the
programming code.

This section describes what was actually produced and breaks the project down using simple
flow diagrams. The design decisions and relevant explanations follow in the appropriate
sections.
2.1 System Design
The following flow diagrams show the whole system, followed by a breakdown of the various
sections that make up the complete system. They show how the systems were linked together
and how data flowed between them.
2.1.1 Whole System
Figure 2.1 shows an overview of the whole system. The robot is monitored via a wireless link
to a remote computer, which allows the user to observe what the robot is 'thinking' and to
intervene to start or stop the robot. The remote computer uses a 'remote desktop' function to
communicate wirelessly with the onboard PC, giving the user direct control of the robot.

The specification of the onboard PC is very high (see the PC specification, Section 9.5),
allowing the computer to process large amounts of image data at an acceptable speed. The
onboard PC primarily runs MATLAB. With this software it was possible to acquire two
digital images, process them to create a disparity map (representing scene depth by
contrasting shades between black and white) and, from the resulting image, generate a depth
map, which the computer can use to identify the distance and angle of an object in its path.
The navigation system sets the robot's speed of travel, direction and distance. It also monitors
the movement of the robot in a basic form: although accurate data about the robot's actual
movements is not fed back to the navigation system, the system can estimate where it believes
the robot to have travelled.
Figure 2.1: Whole System Flow Diagram
Modules 1 and 2 are low level systems that work together to control all the robot's sensors and
controllers. Effectively, they interface between the sensors and the onboard PC using several
different communication techniques.
The arrows in the flow diagram show the flow of data around the system.
Figure 2.2: Whole System Flow Diagram
2.1.2 Stereo Vision
The stereo vision software was contained within a module named Robot_Vision, the main
processes of which are displayed in Figure 2.3. The script was designed to be called without
any input arguments and to produce disparity maps in the form of two-dimensional arrays.

The first of the three sections was designed to obtain suitable images from the webcams,
eliminating noise and intensity variation. Next, the images were adjusted to transform them
into the format required for correlation. Finally, stereo matching was performed, with the
disparity calculated from the location of the best correlation; filtering was applied to the
results before they were passed to the robot's navigation system.
Figure 2.3: Flow Diagram of Stereo Vision System
2.1.3 Navigation System
Figure 2.4: Flow diagram of Navigation System
Figure 2.4 shows the routine used to control the robot. The routine is based on a loop which
sequentially collects stereo image data and then outputs commands to the motors as
necessary. Relevant information is additionally displayed graphically to the user. The
programme utilises USB, RS-232 and Wi-Fi interfaces.
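The loop structure can be sketched as follows. This is a skeleton only: the handler functions
are trivial stand-ins, defined inline so that the sketch runs as written, and are not the project's
actual routines.

    % Skeleton of the sense-map-act loop (illustrative only). The
    % handlers below are stand-ins for the stereo, mapping, route
    % selection and serial-output code described in this report.
    getStereoData = @() rand(60, 90);         % stand-in disparity array
    updateMap     = @(m, d) m + 1;            % stand-in map update
    selectRoute   = @(m) uint8([5 5 20 20]);  % stand-in command bytes
    sendCommands  = @(cmd) fprintf('cmd: %d %d %d %d\n', cmd);

    map = 0;
    for step = 1:3                            % stand-in exit condition
        d   = getStereoData();                % get sensor data
        map = updateMap(map, d);              % update map
        cmd = selectRoute(map);               % route selection
        sendCommands(cmd);                    % output motor commands
    end
    fprintf('termination: stop motors\n');    % termination commands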
2.1.4 Module 1: Motor Control and Dead Reckoning
Module 1 communicates with the PC via RS-232. It receives four bytes of data (left side
speed, right side speed, left side distance and right side distance). Using the I2C data link,
Module 1 commands the left and right motor controllers to operate the motors. Module 1
monitors the distance travelled by each wheel using logic signals from the dead reckoning
system, and also checks the status of Module 2 to determine whether an obstacle exists in
front of the robot.
Figure 2.5 shows how data is transferred between systems.
Figure 2.5: Flow Diagram of Module 1 (PC to Module 1 via RS-232; Module 1 to left and
right motor controllers via its I2C data link; left and right dead reckoning sensors to Module 1
via logic signals; Module 1 to Module 2 via logic signals; Module 2 to SRF08 range finder
via a second I2C data link)
2.1.5 Module 2: Collision Detection System
Figure 2.5 also shows where Module 2 fits into the system. It connects to Module 1 via two
logic signals. The first logic signal is generated by Module 1 and commands
Module 2 to initiate the range finder. If an object is detected, Module 2 generates a second
logic signal, which is directly read and acknowledged by Module 1.
The collision detection system is based on the SRF08 ultrasonic range finder. It is an
independent module that is communicated with via I2C. The range finder periodically emits
an ultrasonic pulse; upon receiving the echoes, the range measurements are stored as high and
low bytes, with each echo stored in a different register.

For the purposes of this project it was only necessary to read the lower byte of data, as this
provides range data up to 255cm (within the requirements). Similarly, only the first echo
result is analysed, because there is no robust way of determining which register holds the
valid range result.
Module 2 uses a form of intelligent discrimination to determine whether the range data held
in the first register is acceptable. This discrimination is based upon the light sensor reading (a
light level reading is taken before any range results are logged): a common trend was found
whereby false range readings were given when the light sensor reading measured zero.
Having determined which range results were valid, Module 2 averaged them over several
readings and, on the balance of probabilities (accounting for spurious echo results),
determined whether an object was actually present in front of the robot.
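This discrimination might be expressed as in the sketch below: an illustrative MATLAB
reconstruction of the logic, not the PIC firmware itself, with the function name and the
majority threshold being assumptions.

    % Illustrative reconstruction of Module 2's discrimination logic
    % (the real logic runs on the PIC; the name and threshold here are
    % assumed). Save as objectPresent.m.
    function present = objectPresent(ranges, lights, limit)
    % ranges: first-echo low bytes (cm) from successive pings
    % lights: light sensor readings taken before each result is logged
    % limit:  range (cm) below which a reading counts as an obstacle
    valid = ranges(lights > 0);   % zero light level => false reading
    hits  = valid < limit;        % valid readings inside the limit
    % balance of probabilities: a majority of valid readings must agree
    present = ~isempty(hits) && mean(hits) > 0.5;
    end

For example, objectPresent([40 42 0 39 200], [12 15 0 14 13], 60) discards the third reading
(zero light level) and returns true, since three of the four remaining readings lie inside the
limit.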
2.1.6 Chassis Design
Figure 2.6: Chassis Flow Diagram
Figure 2.6 shows the set-up of the chassis and the location of every component. The
components mounted on the first layer are mainly the heavier parts, so that the centre of
gravity is kept as low as possible; these components include the batteries, the four motors, the
motor housing and the wheels. The two batteries are both mounted centrally, one on each side
of the robot. This keeps the centre of gravity close to the centre of the robot, providing it with
stability and balance. The battery mounts are of a rugged design so that when the robot drives
off-road the batteries will not move or vibrate, which could otherwise shake the design apart.

The other components on this layer are the circuitry, such as the two microcontroller modules
and the two motor controllers connecting to the four motors underneath the chassis. Mounting
the microcontrollers on the first layer keeps the design modular: the second layer can be
removed and the robot still controlled, at a basic level, through a hardwired connection to a
desktop PC. The dead reckoning system is mounted on the first layer because it must measure
wheel revolutions, and the range
finder is mounted on the first layer because the optimal height for accurate measurement is
approximately 10cm.
The second layer holds the large computer components, such as the motherboard, the hard
drive and the computer's Power Supply Unit, again keeping the chassis design modular. Also
on this layer are the stereo vision cameras, which are mounted on the front of the robot to
face the direction of travel. Finally, there is a third layer, which acts as a roof to protect the
vulnerable computer components from damage.
3 Interface Communication
Wherever possible, industry-standard communication methods were used between modules.
This allowed a solution to be developed that would conform with other commercially
available technology. Additionally, the standards are globally recognised, which allowed
detailed research to be made into the capabilities and limitations of each method.
3.1 RS-232
3.1.1 Introduction
The term RS-232 is normally used to refer to the generic standards and protocols used for the
overwhelming majority of PC serial communication links. The scheme has been around for
many years and has been revised several times. However, since the basic scheme is so well
established and so widely used, it is likely to remain available on the PC for years to come.
Figure 3.1: RS-232 Serial Communication
The basic RS-232 link requires a 9-way D-type connector, as shown in Figure 3.1. Using this
standard connection type, bidirectional data can be exchanged using only three lines:
Transmit data, Receive data and Signal ground. The remaining pins are used to provide
control of modems and other such devices.
RS-232 uses bi-polar signals to allow for relatively error free transmission over distances up
to 30m (far in excess of the requirements of this project). Logic ‘0’ is represented by a
positive signal within the range +5V to +15V and logic ‘1’ by a negative signal within the
range -5V to -15V. Any signal within the range -5V to +5V is not regarded as a valid signal
level.
Data is transmitted in binary form independently of the transmitter's or receiver's clock. To
allow asynchronous data transfer, each data word transmitted or received is preceded by a
'start' bit and followed by a 'stop' bit. Figure 3.2 shows the transmission of the ASCII
character 'S' (hexadecimal 53); the figure shows the use of the start and stop bits together
with the negative logic levels used.
Figure 3.2: Transmission of ASCII 'S' over RS-232
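The framing can also be reproduced numerically: 'S' is hexadecimal 53, binary 01010011,
and the data bits are sent least-significant bit first between the start and stop bits. A brief
MATLAB sketch:

    % Frame an 8N1 character as it appears on the line (LSB first).
    ch    = 'S';                      % ASCII 0x53 = 01010011
    bits  = bitget(uint8(ch), 1:8);   % data bits, LSB first: 1 1 0 0 1 0 1 0
    frame = [0, double(bits), 1];     % start bit (0), 8 data bits, stop bit (1)
    % On the wire, logic '0' is a positive voltage and logic '1' a
    % negative one, so the start bit appears as a positive excursion.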
3.1.2 Interfacing electronic circuits
In order to interface the PC to the digital electronic circuit, the RS-232 link was set up as
shown in Figure 3.3.
Figure 3.3: RS-232 Interface
Hyper Terminal (communication software available in Microsoft Windows) was configured
with the following settings:

• Baud rate: 19200 baud
• Data bits: 8
• Parity: none
• Stop bits: 1
• Flow control: none
• Communication port number: 1 or 2
The same communications port, as defined in Hyper Terminal, was selected and connected to
the voltage level converter IC. Most digital electronic circuits operate with logic '0' and logic
'1' at voltage levels of 0V and 5V respectively, not within the ranges used by the RS-232
link. Therefore, a MAX232 IC was used to make the relevant voltage level conversions.
Hyper Terminal is a basic programme used to communicate with external devices. Although
it works well, its simplicity meant it was eventually replaced by a more sophisticated
alternative: MATLAB. MATLAB provides a greater degree of programming freedom than
Hyper Terminal and was therefore more appropriate for this project.
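As an illustration, a serial link with the settings listed above could be opened from MATLAB
as sketched below, here sending the four-byte Module 1 command packet of Section 2.1.4.
The port name and command values are examples only; the sketch uses the serial-object
interface of the MATLAB releases of the time.

    % Open the serial link with the Hyper Terminal settings and send a
    % Module 1 command packet (port name and byte values illustrative).
    s = serial('COM1', 'BaudRate', 19200, 'DataBits', 8, ...
               'Parity', 'none', 'StopBits', 1, 'FlowControl', 'none');
    fopen(s);
    packet = uint8([10 10 50 50]);    % left/right speed, left/right distance
    fwrite(s, packet, 'uint8');       % one byte per field
    fclose(s);
    delete(s);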
3.2 I2C Bus
3.2.1 Introduction
The Inter-Integrated Circuit (I2C) bus was developed by Philips Semiconductor [1]. It is a
well-established standard supported by many commercially available electronic components.
It utilises a 2-wire bus to transmit and receive data: Serial Clock (SCL) and Serial Data
(SDA). Each device connected to the bus is addressed individually by a unique address, thus
preventing conflicts between devices. Data is transferred serially, in synchronisation with the
clock.
The I2C interface supports three modes in hardware:

• Master Mode
• Multi-Master Mode
• Slave Mode
3.2.2 PIC Microcontroller
The I2C bus provides the PIC with a simple means of communicating with another PIC or a
peripheral device. In spite of its relatively modest speed (400 kbit/s), the I2C bus interface
still operates quickly enough for most applications involving the PIC.

Transfers on the I2C bus take 9 bits at a time: the first 8 bits on the SDA line are sent by the
transmitter, while the ninth 'acknowledge' bit is a response from the receiver.

To initiate or terminate any transfer on the I2C bus, a 'start' or 'stop' condition is sent
respectively. These two conditions are unique and only occur when the relevant condition
signal is being sent. This ensures that all devices connected to the I2C bus remain
synchronised with the master controller.

The PIC microcontroller was set up in master mode. This meant that all other devices
connected to the bus took their timing from the PIC, reducing timing issues between devices.
3.3 Wireless Fidelity (Wi-Fi)
3.3.1 Wi-Fi Standards
Wi-Fi is a form of Wireless Local Area Network (WLAN). WLAN refers to a data network
wirelessly linking computers and other electronic devices within a small geographical area,
for example a home or an office building.
The first WLAN standard was created in 1997 by the Institute of Electrical and Electronics
Engineers (IEEE). It was called 802.11 after the name of the group formed to oversee its
development. The maximum bandwidth of the 802.11 standard was 2 Mbps. This was too
slow for most applications and is no longer in wide use [18].
There have been a number of extensions to this standard, with different bandwidths and
operating frequencies. The most commonly used standard at present is 802.11b. This operates
at the same unregulated frequency as the original 802.11 standard, 2.4 GHz, but has a greater
bandwidth of 11 Mbps.
Newer standards include 802.11g, which is already in operation and offers a bandwidth of 54
Mbps. It is backwards compatible with 802.11b as it operates at 2.4 GHz. The future 802.11n
standard, which is currently in development, will have a bandwidth in excess of 100 Mbps. It
is not expected to be completed until November 2006, but will be fully backwards compatible
with 802.11b and g products.
3.3.2 Wi-Fi Configuration and Use
There are a number of possible configurations for a WLAN [19]:

• Infrastructure Network
• Hotspots
• Ad Hoc Network (peer-to-peer)
An infrastructure network allows wireless clients to connect to a wired network through a
wireless access point. The client can then function just as a wired client would. This
configuration is particularly suitable where a small number of wireless clients require access
to a predominantly wired system.
Figure 3.4: Infrastructure Network WLAN Configuration
Hotspots are similar to infrastructure networks. They tend to exist in public places, such as
libraries or schools, and provide a WLAN service, either free of charge or for a fee, to which
wireless clients within range can connect.
Ad hoc networks consist of nodes (clients) which communicate directly with each other over
a wireless channel. There is no centralised access point in an ad hoc network; devices
communicate as peers. The performance of an ad hoc network suffers as the number of
connected devices grows, and as such these networks tend to feature a small number of
devices, which are geographically close to each other. Ad hoc networks cannot bridge directly
to wired LANs or the Internet without using a gateway device. This type of network is
particularly suitable for constructing a small WLAN quickly and without the need for extra
equipment.
Figure 3.5: Ad Hoc WLAN Configuration
4 Terrain Mapping
The primary role of Minibot's terrain mapping system was to provide sufficient knowledge of
the local environment to allow the robot to navigate successfully. Consequently, the main aim
of the module was the detection and recording of obstacles. The usefulness of the sensor
information for other applications, such as constructing a user visualisation of the
surroundings, was deemed of secondary importance.

The terrain mapping task naturally sub-divided into two main parts, which were tackled
separately: sensing and interpreting the external landscape, and compiling the information
into a map that would be readily usable by the robot.
The remainder of this section is therefore dedicated to the selection of terrain sensors. The
following chapter on "Stereo Vision" details the implementation of the selected terrain
mapping strategy, whilst Section 5.7 considers the conversion of the data into a readable map.
4.1 Terrain mapping sensors
In order to achieve autonomous function it was necessary to provide Minibot with one or
more onboard terrain mapping sensors that could quickly and accurately obtain information
about the surroundings. This key attribute would allow Minibot to be deployed in an entirely
new location and maintain function without the need for any external input or setup. The
choice of sensor type would have far-reaching implications for Minibot's ability to perform,
and thus potential solutions were considered in detail.
4.1.1 Tactile Sensors
Tactile, touch or bump sensors utilise a micro-switch that is activated by the force of contact
with an object. As such these devices are only able to determine the presence of an obstacle
when they collide with it.
Figure 4.1: Bump Sensor [2]
Figure 4.2: Potential Bump Sensor Configurations – Bumper and Whisker
Whilst it would have been possible to employ a bumper or whisker arrangement to extend the
sensing ability past the perimeter of the robot, the limited range of these devices renders them
ineffective. The requirement for Minibot to perform advanced obstacle avoidance demands a
remote sensing capability.
4.1.2 Ultrasonic Sensors
Ultrasonic rangefinders consist of a closely mounted transmitter and receiver. The transmitter
emits a short pulse of inaudible sound, typically at a frequency of 40 kHz. The receiver listens
for reflections of the pulse from nearby objects and records the delay between transmission
and reception of the echo. Knowledge of the speed of sound then allows the distance to the
object to be calculated by what is known as the "time of flight" method.
Ultrasonic rangefinders typically operate over a range of 0.03-3m and provide a reasonable
degree of accuracy at short distance.
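As a worked example of the time of flight calculation (assuming a speed of sound of roughly
343 m/s in room-temperature air):

    % Time-of-flight range calculation (assumes c = 343 m/s at ~20 degC).
    c = 343;              % speed of sound, m/s
    t = 5.83e-3;          % measured round-trip delay, s
    range = c * t / 2;    % halved: the pulse travels out and back
    % range is approximately 1.00 m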
Figure 4.3: Ultrasonic Rangefinder [3]
Ultrasonic sensors are a popular choice for many robots because of their low cost, small size
and ease of implementation; self-contained units that return a distance value without the need
for further calculation are readily available for under £20.

Since the sensor unit would return a preformatted result, the amount of signal processing
required would be reduced. However, the time taken to obtain measurements is potentially
longer than for other methods, due to the slower speed of sound waves and the fact that a
short delay must be introduced to permit previous echoes to die away before the next reading
can be taken.
Ultimately, ultrasonic rangefinders suffer from one major drawback that limits their
application to terrain mapping: the pulse of sound emitted by the transmitter invariably
spreads as it travels, and echoes are recorded from any object that lies within the resulting
cone. Given that the spread of the signal may be around 30 degrees, the ultrasonic rangefinder
has very poor angular resolution, since there is no way of knowing where in the cone the
obstacle lies; at a range of 1m, for example, a 30 degree cone is already over 0.5m wide.
[51] and [52] describe methods of improving the accuracy and angular resolution of
ultrasonic sensors through the use of sensor arrays.
Figure 4.4: Possible issues arising from the use of ultrasonic sensors
A further disadvantage is that any object presenting an oblique face to the transmitter would
scatter reflections away from the receiver and thus the sensor would fail to detect the obstacle.
4.1.3 Infra Red (IR) Sensors
IR sensors operate on an almost identical principle to ultrasonic rangefinders. However, the
infra-red signal confers a slightly different set of characteristic properties on the device. For
instance, the divergence of an infra-red beam is lower, resulting in a cone of vision of only
around 5 degrees [2]. This gives the sensor improved directionality over ultrasound.
Figure 4.5: IR Rangefinder [4]
The size, cost and ease of implementation of IR rangefinders are comparable to those of the
ultrasonic modules, as is their potential accuracy. However, the precision of IR detectors is
susceptible to variation in the colour and reflectance of a surface.

Furthermore, the range of an IR sensor is limited to barely 80cm, which suggested that the
accurate range was only likely to be a few tens of centimetres. For a robot that itself measured
45cm, this could potentially lead to issues with Minibot's ability to avoid obstacles
effectively.
4.1.4 Radar
A radar unit provides ranging information based on the time of flight of a radio- or
micro-wave pulse reflected off any nearby obstacles. Radar is capable of detecting objects
through adverse conditions, such as rain or fog, and with sufficient power radar can have an
effective range of the order of miles.

However, due to the relatively long wavelengths characteristic of radio waves, radar has low
resolution and requires a substantial receiving dish or wave-guide, which is impractical to
mount on a small robot. Radar was therefore not considered to be a realistic solution to the
terrain mapping problem at the scale on which Minibot was required to function.
4.1.5 Laser Rangefinder
Laser rangefinders work by transmitting and receiving a beam of Infra Red (IR) light. The
sensor may either rely on recording the time of flight, or on measuring the phase difference
between transmitted and reflected signals for a series of modulation frequencies [5]. The
devices are accurate over a large range, potentially thousands of metres, and are able to detect
objects regardless of their size and orientation.
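To illustrate the phase-difference principle: a signal modulated at frequency f and reflected
from range d returns with a phase shift of 2*pi*f*(2d/c), so the range follows as
d = c*dphi/(4*pi*f). The sketch below uses example values, not figures from any particular
device.

    % Phase-shift ranging sketch (example values only).
    c    = 3e8;                 % speed of light, m/s
    f    = 10e6;                % modulation frequency, Hz
    dphi = pi/2;                % measured phase difference, rad
    d = c * dphi / (4*pi*f);    % d = 3.75 m
    % A single frequency is ambiguous beyond c/(2f) = 15 m; using a
    % series of modulation frequencies, as noted above, resolves this.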
Figure 4.6: Laser Rangefinder [6]
As with ultrasonic detectors, laser rangefinders are often sold as self-contained units capable
of returning a calculated distance. This again reduces the level of signal processing required
of the robot.
Due to the coherent nature of laser light, the transmitted beam remains narrow along the
whole of its path. This ensures that only a single reflection is received, and thus the measured
distance is subject to less uncertainty than for an ultrasonic or IR sensor. Since the beam does
not spread, the device also has a more precise directionality than either of the low cost
rangefinders, allowing the position of any obstacle to be accurately pinpointed.
The restricted cross-section of the laser beam also confers some disadvantages, since an
obstacle can only be detected if it is directly in the line of sight. In order that no obstruction
should be overlooked, the laser would be required to perform a scan across a range of angles
and at multiple heights. This survey would be potentially time consuming and would
introduce the complication of controlling servo motors.
4.1.6 Stereo Vision
A recently emerged solution to the problem of obstacle detection lies in the employment of
stereo imaging. The technique requires the use of two or more cameras programmed to
capture images at the same instant, with depth information being extracted by comparing the
positions of objects between the viewpoints.
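For the standard pinhole model with two parallel, identical cameras (the project's own
disparity-depth equations are derived in Appendix A), depth Z follows from the disparity d
between the two image positions as Z = f*B/d, where f is the focal length and B the camera
separation. A numerical sketch with assumed values:

    % Parallel-camera depth from disparity (pinhole model; values assumed).
    f = 500;            % focal length, in pixels
    B = 0.10;           % camera separation (baseline), m
    d = 25;             % measured disparity, pixels
    Z = f * B / d;      % depth Z = 2 m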
The stereo vision method requires a large amount of onboard processing to obtain useful
information, and the capability of the system depends as much on the quality of the
programme as on the quality of the input images. This had the disadvantage that the accuracy
of the system would be unknown until the module was programmed and installed.

The use of digital cameras would permit a stereo vision system to collect information over a
large field of vision in one go, with little loss of directionality. Furthermore, the wide range of
image acquisition devices available, from low cost webcams to expensive digital cameras,
would allow the purchase of a component that optimised the quality/cost trade-off.
4.1.7 Sensor selection
Table 4.1 qualitatively summarises the key features of each of the contender sensor classes in
order to facilitate easy comparison.
Sensor       Effective Range   Accuracy   Directionality   Cost
Ultrasonic   medium            moderate   poor             low
IR           short             moderate   moderate         low
Laser        long              good       good             high
Stereo       medium            moderate   moderate         low – medium

Table 4.1: Qualitative summary of sensor properties
In order for Minibot to be able to make intelligent route selections it had to be aware of any
obstacles well before they were encountered. This effectively ruled out the use of short-range
IR sensors.
The obvious choice from an economic standpoint was the ultrasonic rangefinder. However,
with such poor angular resolution it remained a basic, low-budget approach and failed to
reach the standard required for the Minibot terrain sensing module.
The laser rangefinder meanwhile was considered to be more appropriate for a surveying
application where a high degree of accuracy would be required. In the case of Minibot, where
only moderate resolution is necessary for navigation, the expense was unjustifiable.
Stereo vision exhibits performance characteristics matching those which are required from the
robot's terrain sensors; the ability to capture reasonably accurate data quickly whilst covering
a wide cone of vision. When coupled with the moderate expense, stereo vision emerged with
the best ratio of performance to cost and as such was selected as Minibot’s terrain mapping
system.
4.2 Stereo Vision Systems
Stereo vision systems were available as ready made components, such as Point Grey’s
“bumblebee” [12] illustrated below. Unfortunately, the cost [13] of such devices was well
above Minibot’s allocated budget and so it was necessary to construct a system from more
affordable components. The choice of image acquisition devices is considered in the
following section.
Figure 4.7: “Bumblebee” pre-calibrated vision system
Stereo vision may be implemented with any number of cameras although, as discussed above,
a stereo algorithm will only consider a single pair of images at a time.
The advantage of installing more cameras is that a greater number of image pairs are possible,
which should enhance the ability of a system to detect objects. For instance, using the
minimum of two viewpoints allows a single comparison AB, whereas adding just one more
camera facilitates three pairings AB, AC and BC, which increases the relative usefulness
of each image. Additional cameras can also allow disparity measurements in different planes,
as with the ‘L’ shaped arrangement present in the “Digiclops” [12] unit.
Figure 4.8: Digiclops
However, whilst the price of purchasing an extra camera may be small, the cost in terms of
processing speed is a different matter. Since the image pairs must be considered separately a
three camera system would consume triple the calculation time of the basic system and yet
produce relatively little extra information. As a result, it was decided that Minibot would
utilise a two camera system.
4.2.1 Camera Selection
Stereo imaging has only become feasible thanks to the relatively recent introduction of digital
imaging equipment. The current increase in the use of stereo methods is no doubt due to the
reducing cost of the CCD devices that form the cameras.
When assessing the potential of stereo vision systems for terrain mapping purposes it was
noted that there was a wide range of digital imaging equipment available, both in terms of
quality and of cost. The definition of performance criteria to aid in the selection of a suitable
device was somewhat difficult since there was almost no way of knowing whether one camera
would function better than another: the choice of products was still so large that the final
decision was based largely on price and availability.
4.2.2 Image size
No quantitative measures of a camera's quality were readily available; digital cameras were
found to be sold more on the basis of the number of pixels they offered rather than the class of
their optics. Whilst capturing a higher number of pixels produced a better resolution, it was no
guarantee of a clearer picture.
It was decided that almost all cameras were capable of providing images at a larger than
necessary size, although exceeding the requirement was not considered a disadvantage since
MATLAB’s image processing toolbox was capable of accurately resizing images.
4.2.3 Camera Type
The ability to access and extract image data from the cameras would be vital for the system's
success. To have employed a standard digital camera would have required extensive
additional programming: A programme would be needed to automatically download image
data from the camera’s memory onto Minibot's computer. Unfortunately, without specialist
knowledge in the field, this would require the data to be copied from the camera to the hard
drive and loaded from there into the programme. This extra step, requiring data to be saved to
the hard disk, would inevitably lead to a slower image acquisition sequence.
More crucially however, it would have been very difficult, if not impossible, to command the
camera to capture an image remotely, without physically pushing the shutter button.
By contrast, webcams were specifically designed to continually stream images to a computer;
as such, they transfer to a buffer and not to the hard drive. MATLAB'S ability to handle video
input devices meant that a webcam was therefore the most desirable image acquisition device.
4.3 Logitech Quickcam Messenger
The Logitech Quickcam Messenger [14] was selected for use as the stereo vision sensor. The
camera was a low to medium budget model, costing only £19 but capable of providing images
at a resolution of 352x288.
Figure 4.9: Logitech Quickcam Messenger
Whilst the spherical nature of the device would introduce mounting issues, the sturdy base
was considered to be a distinct advantage over other models, as it would allow the camera to
be rigidly fixed in position. This was necessary so that, should Minibot suffer any knocks, the
cameras would maintain their alignment.
Also, the Quickcam Messenger featured a manual focusing ring rather than the digital
auto-focus present on many higher specification webcams. This meant that the focus could be
set to some suitable level and left, which was preferable to the scenario where two
auto-focusing devices focused to different depths, based on their particular viewpoints, and
thus returned images that were poorly matched.
4.3.1 Camera Properties
Virtually no technical specifications were available for the camera, except for image size and
frame rate; presumably since the product was a basic webcam and had not been envisaged for
use in a scientific environment. Camera properties that were required were therefore derived
from experiment and by calculation.
4.3.2 Field of view
With the camera placed on a flat surface, markers were positioned to indicate the extent of
the device's field of view: This was achieved by monitoring a live video stream and placing
the markers so that they were just visible at the left and right hand sides of the image.
Figure 4.10: Experimental layout
The perpendicular distance z was measured from the front of the camera casing to the marker,
along with the separation, s, between the left and right markers. The average distance from the
centreline, s’ = ½s, was plotted as a function of z in Excel and the line of best fit calculated as
s’ = 0.2976z + 0.0011.
z / m    s / m    s' / m
0.25     0.16     0.08
0.50     0.31     0.15
0.75     0.45     0.22
1.00     0.59     0.30
1.50     0.89     0.45
2.00     1.20     0.60

Table 4.2: Field of view measurements
Figure 4.11: Trend line fitted to data (s' / m plotted against z / m; line of best fit y = 0.2976x + 0.0011)
The small constant term was assumed to be a consequence of measuring distance from the
front of the cameras rather than from the focal plane, whilst the gradient of the trend line was
taken to be representative of the field of view:
tan(θ) = 0.2976
θ = 16.57°
2θ = 33.15°
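For reference, the trend line and field of view can be reproduced from the Table 4.2 data in a few lines of MATLAB; this is an illustrative sketch rather than the original Excel workflow:

    % Field of view data from Table 4.2
    z  = [0.25 0.50 0.75 1.00 1.50 2.00];   % perpendicular distance / m
    s1 = [0.08 0.15 0.22 0.30 0.45 0.60];   % half-separation s' / m

    p = polyfit(z, s1, 1);        % linear fit: s' = p(1)*z + p(2)
    halfFov = atand(p(1));        % half field of view, approx 16.6 degrees
    fov = 2 * halfFov;            % full horizontal field of view, approx 33 degrees

The gradient recovered from the tabulated (rounded) values agrees with the quoted 0.2976 to within rounding.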
Hence the cameras were found to have a horizontal field of view of 33 degrees. The vertical
field of view was not required but could very easily have been calculated in a similar manner.
4.3.3 Focal distance
Knowledge of the field of view and image width allowed the evaluation of focal distance.
Since the cameras produced a digital image, the image width was only available in terms of
pixels and therefore the focal length derived was an abstract quantity, also in terms of pixels.
Figure 4.12: Focal length
The width of the image, w, was variable since the image could be resized by MATLAB thus
changing the effective focal length. Therefore, rather than specifying a particular distance it
was more appropriate to derive an expression for f that could be substituted into equations as
necessary.
tan(16.57°) = (½w) / f

f = 1.68w

Equation 4.1
5 Stereo Vision
5.1 Background
The principle of stereo vision is founded upon the fact that the relative position of an object
alters with changing viewpoint: Specifically, features nearer the observer appear to shift more
than those in the distance. By careful analysis of the relative displacements of objects in a pair
of images it is possible to formulate some estimate of the distance to that object.
5.1.1 Depth estimation
The extraction of distance measurements from a pair of stereo images is essentially just
triangulation [8], requiring knowledge of the camera separation and focal length.
Figure 5.1: Focal Length
The focal length of an optical device is the distance from the focus to the image plane. In the
case of digital cameras, the image plane represents the Charge-Coupled Device (CCD) that
captures the image, whilst the projected image is the two dimensional view of the scene as
observed from that point, that is, the resulting picture.
The following diagram, Figure 5.2, illustrates a possible setup for a pair of stereo imaging
cameras. The cameras are located on a common base line, have parallel axes of sight and
identical focal lengths.
Figure 5.2: Stereo depth estimation
From the construction of similar triangles it may be shown (Appendix A) that:
z = Df / (x1' − x2')

Equation 5.1
This equation relates the perpendicular distance of an object from the cameras, z, to the
known focal length and separation of the cameras, f and D, and to the difference in projected
object position x1' - x2'. The quantity x1' - x2' is known as the optical disparity and its
computation is the main goal of stereo imaging programs.
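As a minimal illustration, Equation 5.1 maps directly onto an element-wise MATLAB expression; the disparity values below are arbitrary examples, while D and f take the values derived elsewhere in this report:

    % Depth from disparity (Equation 5.1), applied over a whole disparity map
    D = 0.06;                     % camera separation / m
    f = 80;                       % focal distance / pixels
    d = [5 5 6; 4 5 5; 4 4 3];    % example disparity map / pixels
    z = D * f ./ d;               % perpendicular distance to each feature / m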
5.1.2 Epipolar geometry
In order for the above approach to be applied, the feature at point P must be identified in both
images. This is known as the stereo matching problem and solution methods are considered in
detail in section 5.4.2.
Initially, it seems necessary to assume that an object in the left image may appear anywhere in
the right image; thus requiring an extensive and time consuming 2-dimensional search for
each feature. However, the application of epipolar geometry reveals that for any object in
image 1, the corresponding feature in image 2 is constrained to lie along a unique curve [9].
This knowledge permits a much simpler and quicker 1-dimensional search to be implemented.
Figure 5.3 illustrates the concept of epipolar geometry [10]. The diagram shows the projected
images as observed from two viewpoints or centres of projection (COP), assumed to have
convergent optical axes.
Figure 5.3: Epipolar geometry (the epipolar plane through P and the baseline COP1-COP2 cuts the second image along the epipolar line e2-p2'; p1 and p2 are the projections of P, and e1, e2 the epipoles)
A feature, P, appearing in image 1 at p1 is known to exist somewhere along the projected line COP1-p1. A plane, termed the epipolar plane, is uniquely defined by this projected line and the
baseline COP1-COP2. The epipolar plane cuts projected image 2 along a single, epipolar line
e2-p2'. The point p2 in image 2, corresponding to the observed object P, is constrained to exist
along this line.
An epipolar line exists in the right image for each point in the left image and vice versa. The
path of each line is dependent upon the geometry of the cameras and the position of the point
in the first image. The mapping function can be derived as the fundamental matrix [11].
5.2 Camera Setup
5.2.1 Arrangement
Minibot was to travel along the ground with freedom to explore in the xy plane but having no
control over its elevation. It was therefore considered that when mapping a location more
importance should be attached to estimating an object’s lateral position rather than its height.
Indeed, Minibot could potentially navigate using just a 2D, top-down map of its environment
and the ability to record the position of an object in 3D is more a requirement of potential
applications than an absolute necessity.
With this aim in mind it was decided that the stereo imaging cameras should be mounted side
by side in a horizontal plane, rather than on top of each other. This would provide a horizontal
disparity, which would best facilitate the estimation of lateral position. As a consequence, the
stereo vision programme would be required to search for matching pixels in horizontal scan
lines.
5.2.2 Positioning
The mounting height of stereo vision cameras is potentially less critical than for some other
range finding techniques such as sonar or laser. Stereo vision creates a 3D map of the
surroundings and so objects at all relevant elevations will be sensed and recorded. Therefore,
the main consideration when positioning the cameras was that they had an unobstructed view
of the surroundings; this led to the cameras being placed centrally at the front edge of the
second level of the robot.
5.2.3 Alignment
In the derivation of equation 5.1 it was assumed that the optical axes of the cameras were
parallel, yet since convergent and divergent configurations were also possible it was
necessary to consider their relative merits.
Figure 5.4: Possible Optical Alignments
A divergent arrangement would give a much wider general field of view, but at the cost of
providing only a narrow region of overlap. Since an object must appear in both images in
order for it to be detected by stereo vision, the characteristic was regarded as detrimental to
performance and therefore to be avoided.
A convergent configuration on the other hand, would offer much greater overlap than parallel
alignment. This would enable a larger proportion of the image to be matched-up and hence
ensure a more efficient operation.
However, non-parallel arrangements have additional properties that render them far less
attractive solutions for applications where processing speed is important. The concept of
epipolar geometry was raised in section 5.1.2 and its implications are critical here.
5.2.4 Epipolar considerations
As noted in the derivation of the general case, a match for a point in the first image may be
located along an arbitrary line in the second image. Therefore a basic stereo matching
programme would need to include an algorithm to calculate each epipolar path and extract the
necessary data.
A more sophisticated and commonly applied alternative is to perform a transformation prior
to processing that deforms the images such that the epipolar lines are horizontal [15]. This
method is known as re-projection and can simplify the extraction of stereo matches, since it is
far less demanding to describe and reference a horizontal line. However, the re-sampling of
the images can be a time consuming procedure in itself.
The parallel axes scenario is a unique case since, by virtue of the fundamental matrix, the
epipolar lines are already horizontal. This alleviates the requirement to run any additional,
time consuming algorithms to process the image data into a usable format. Furthermore, there
is no degradation of image quality, a factor that would be unavoidable if there were a
requirement to manipulate pixelated images.
Since the parallel alignment provided speed of processing and image clarity over those of a
skewed arrangement, it was selected for use on Minibot. The reduction in image overlap was
judged to be of less significance than the potential simplification of the stereo algorithm.
5.2.5 Separation
The ideal separation distance of the two cameras was subject to much debate. In theory, a
longer baseline between viewpoints would allow a more accurate prediction of distance to be
made from the triangulation, since any uncertainties in angular position would become less
significant.
However, the further the two cameras are placed apart the harder it becomes to correctly
match pixels since there is a larger region of potential partners. Moreover, any difference in
an object’s appearance between the two viewpoints is exacerbated by increasing the
separation. Finally, a larger separation would mean less overlap between views and therefore
a greater area of image that could not be matched up.
Whilst these arguments seemed to favour a smaller separation, it was still not known what
value the optimal distance would take. An experiment was conducted whereby disparity maps
of a scene were produced from pairs of images of varying separation.
Images were taken using a Logitech Quickcam along a horizontal baseline at intervals of
20mm. The leftmost image was paired, in turn, with each subsequent image and the two
passed to the stereo algorithm. The images and resulting maps are displayed in Figure 5.5.
Since it was suspected that the depth of field of the scene would have a bearing on the optimal
separation, it was necessary to create a setting with objects at distances typical of those to be
encountered by Minibot.
Figure 5.5: Right image and left disparity map for varying camera separations
The image sequence shows that the programme is quite capable of dealing with images of
various separations. At the smallest separation the minimal number of discrete disparities is
evidenced by the fact that only around three depth levels are visible. This was undesirable
since the distance information that could be gained from this map was very limited.
Increasing the separation has the effect of providing better definition although, by a
separation of around 8cm it was possible to see mismatched regions appearing on the right
hand side of the image.
This, and other similar experiments, suggested that a separation in the region of 4-6cm would
provide the optimal balance of depth information and reduction in exposure to mismatched
pixels. Due to the size of the cameras however, the minimum achievable separation between
focal points was 6cm; therefore this limiting value was chosen and the cameras were to be
mounted as close together as possible.
5.2.6 Calibration
Before the cameras could be used for capturing stereo image pairs they required calibration.
Colour correction was performed automatically by the cameras and so all that a user was
required to do was ensure that the cameras were focused correctly and pointing in the right
direction.
Focusing was achieved by adjusting the manual focus ring whilst previewing a video feed.
The cameras were set to focus at a distance of around 1-2m, the range at which Minibot was
required to detect most obstacles, although the main concern was to ensure that the cameras
were both focused to the same depth.
Ensuring that the cameras were aligned correctly was a rather more involved task. To
guarantee that both had the same elevation and were exactly parallel, a target was produced
featuring two circles spaced to match the separation of the cameras. The target was placed
directly in front of the camera mount at a reasonable distance and images streamed from each
camera. These images were overlaid with a crosshair such that the centre of the image could
be exactly aligned with the centre of the appropriate circle.
Figure 5.6: Camera calibration (target board viewed in the left and right images)
Alignment re-calibration was found to be necessary after long periods of robot operation,
since the movements were enough to shake the cameras out of their delicate positioning.
5.3 Candidate Regions
Before considering how best to search for a matching point it is timely to consider where the
desired match can be found. The candidate region describes the area of the complementary
image where the corresponding pixel must lie.
From the application of epipolar geometry we are already able to limit the search to a single,
horizontal scan line. By applying geometry again to the parallel camera alignment we can
reduce this region to a specific portion of that line.
5.3.1 True limits
Figure 5.7 depicts an object appearing at a horizontal location of x1 in the left hand image. If
the object is at an infinite distance from the cameras then the lines of sight from both
viewpoints will be parallel and the object will appear at the same horizontal location in the
right image. For any object closer than infinity, the corresponding point in the second image
will be to the left of this extreme.
Figure 5.7: True candidate range
In short, for a pixel in image 1 with horizontal position x1, the corresponding pixel in image 2
must have a horizontal location in the range [0 x1]. Similarly for a pixel in image 2 with
position x2 the corresponding point in image 1 must lie in the range [x2 w], where w is the
width of the image.
5.3.2 Artificial limits
In the above derivation the edge of the image was used to define one end of the candidate
range: Whilst this is strictly correct, it means that pixels over one side of the image have a
candidate region that approaches the width of the image and thus offers little improvement.
Large candidate regions slow image processing and increase the chance of mismatching a
pixel since there is a greater probability of another point appearing similar to the algorithm.
By imposing an artificial limit we can reduce the range that must be searched but at the risk of
excluding the correct match.
A reasonable compromise is to set the limit to exclude objects that are very close to the
camera. This may appear counter-intuitive, but it is nearby objects that create the large
disparities that we wish to avoid. Also, Minibot’s navigation programme is designed to turn
the robot well before it reaches any obstacles and the ultrasonic range finder is capable of
surveying up to 40-50cm ahead.
Figure 5.8: Artificial candidate region
From similar triangles,

x / f = D / zmin

Equation 5.2

Equation 5.2 shows that the maximum disparity, x, that must be considered is related to the
minimum distance we wish to be discernible, zmin, and to the separation, D, and focal
distance, f, of the cameras. Substituting

f = 1.68w

where w is the width of the image, permits the rearrangement

x / w = 1.68D / zmin

Equation 5.3
Substituting D as 60mm, Equation 5.3 reveals that if we are prepared to forgo seeing any
obstacles closer than 50cm the maximum disparity can be reduced to around 20% of the
image width.
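This figure is easily verified; a one-line check using the quantities above:

    % Maximum disparity as a fraction of image width (Equation 5.3)
    D    = 0.06;                    % camera separation / m
    zmin = 0.5;                     % closest distance to be resolved / m
    maxDispFrac = 1.68 * D / zmin;  % = 0.2016, i.e. roughly 20% of the width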
5.4 Stereo Matching
Stereo matching is the task of identifying corresponding features between the two images.
The challenge of programming a computer to ‘recognise’ equivalent points when viewed from
different positions is certainly not a trivial one and the problem has inspired a variety of
approaches. Most of the methods can be broadly categorised into one of two styles.
Sparse disparity maps - Some algorithms utilise a form of feature extraction to
identify regions of an image that are likely to yield good matches and then
concentrate resources on finding the right pairing. The extraction need not be
complex and often a form of edge detection is used.
Dense disparity maps - The alternative approach is to form a dense disparity
map by attempting to find a match for every pixel in an image, with any resulting
mismatches filtered by a series of fitness conditions.
Whilst the first method appeared efficient it was also a more complicated programming task
and yielded only a sparse disparity map; any area that was not considered as a feature would
not receive a depth estimate. Conversely, the second, bulk method was potentially easier to
implement and indeed lent itself to MATLAB’s matrix handling ability since an entire image
could be processed at once, reducing calculation times.
Therefore, the stereo matching algorithm developed was to pursue a bulk implementation: for
every pixel in the left image a corresponding pixel was selected from the right image and then
the process repeated the other way around to create a disparity map for both left and right
images. The second matching stage is necessary since it provides data that can be used in
error checking the maps and also ensures that the robot's navigation is not subject to an
off-centred view.
5.4.1 Colour vs. Greyscale
From a computer's perspective an image is simply an array of intensity values; any programme
that attempts to compare and evaluate pixels must do so on the basis of these values.
Colour images are composed of multiple channels of intensity; commonly Red, Green and
Blue (RGB) or Hue, Saturation and Intensity (HSI). Consideration of each of these channels
is possible either separately or in parallel, however, converting a coloured image to greyscale
reduces the number of channels and hence the processing time threefold and yet can still
preserve contrast.
Figure 5.9: Image channels
In practice, many of the problems encountered in stereo matching were found to relate to an
inability to identify a particular point on an object due to the similarity in appearance of the
immediate surroundings. This predicament was not resolved by the use of colour since
pixels in the region were likely to be of similar colour anyway and so multiple channels
provided no extra contrast to enable differentiation.
Since there was a potentially reasonable time saving to be gained for little or no disadvantage,
it was decided that the stereo imaging programme should operate using greyscale images.
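Assuming MATLAB's Image Processing Toolbox, the conversion is a single call; a minimal sketch using a stock test image in place of a captured frame:

    % Reduce an RGB frame to a single greyscale intensity channel
    rgbFrame  = imread('peppers.png');   % stand-in for a webcam frame
    greyFrame = rgb2gray(rgbFrame);      % one channel to process instead of three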
5.4.2 Matching techniques
As mentioned above, any attempt to characterise pixels needed to focus on their intensity
values. Consider these traces of intensity values along an arbitrary horizontal scan line from a
pair of stereo images [16]. Note that the actual data is discrete, due to the pixelated nature of
the images.
Figure 5.10: Stereo pair and intensity plot
The two lines exhibit the same general form with the plot for the right image showing a slight
negative phase shift by virtue of the camera positioning. From these plots it is possible to
identify the main corresponding features, although in some regions there are subtle
discrepancies.
5.4.3 Intensity values
An initial, basic algorithm was simply to search for the pixel within the candidate region with
the most similar intensity value. Unfortunately, this technique was easily confused by small
differences between the images. For instance, consider the enlarged view of the intensity plot
below. It is clear that the data at point A corresponds to the pixel at B, yet a match based on a
single intensity value would pair A with C.
Figure 5.11: Enlarged intensity plot
The approach was further undermined by the presence of any noise in the image data since
when considering only a single pixel at a time, the interference element would completely
obscure the true intensity.
5.4.4 Intensity gradient
A desirable improvement on this basic method was to encapsulate some sense of the shape of
the curve at the point in question. This was thought possible through the estimation of
gradients at each point.
Due to the widely spaced and discrete nature of the data, estimating the gradient based upon
the two neighbouring points would drastically reduce the level of information contained in the
data. This would be particularly apparent at local maxima or minima, where the averaged
slope would fail to show the zero gradient.
Instead, it was decided to calculate the gradient between two consecutive points and assign this
value to the second point in the pair. This quantity would represent the gradient before each
point and was sufficient to characterise the pixel. Assigning values of before and after
gradient to every point was redundant since the gradient after each point would be equivalent
to the gradient before the next point and so no additional information would be gained. Figure
5.12 shows an overlaid plot of the gradient before each point for both the above images.
Figure 5.12: Stereo pair gradient plot
Consider an enlargement of the gradient plot over the same region as before and notice that a
pairing based on the nearest gradient value would facilitate the correct matching of point A to
B.
Figure 5.13: Enlarged gradient plot
However, whilst in this example the method was successful, utilising gradient values by
themselves would, by definition, encounter exactly the same problems as using intensity
values alone. The solution was to combine both measures of fit so that only points that were
similar in both criteria could be paired.
The combination and relative weighting of the two factors was possible in a variety of ways;
since no theoretical reasoning was able to suggest one method in preference to another, the
parameters were determined through experiment, as detailed in the implementation section (Section 5.5).
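The sketch below illustrates how the two measures might be combined for a single candidate pairing; the scan-line data is hypothetical, and multiplication is used since that was the combination method eventually adopted (Table 5.1):

    % Hypothetical intensity scan lines from the left and right images
    leftLine  = [10 12 40 40 38 15 11];
    rightLine = [11 40 41 39 14 12 10];

    % Gradient 'before' each point, assigned to the second point of each pair
    leftGrad  = [0, diff(leftLine)];
    rightGrad = [0, diff(rightLine)];

    % Combined measure of fit between left pixel i and right candidate j;
    % small values indicate agreement in both intensity and gradient
    i = 3;  j = 2;
    fit = abs(leftLine(i) - rightLine(j)) * abs(leftGrad(i) - rightGrad(j));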
5.4.5 Tiling
A further way to develop the basic algorithm was to include consideration of the intensity
values of pixels neighbouring the point of interest. This approach would provide further
indication of the ‘shape’ of the intensity plot and also increase the resilience to noise.
Selecting an image tile centred on the original pixel and attempting to match this to a similar
sized region requires the calculation of a ‘goodness of fit’ statistic based upon the individual
intensity differences. A variety of methods have been suggested:
Sum of Absolute Differences (SAD) - sum of the absolute magnitude of each
individual intensity difference.
Sum of Squared Differences (SSD) - sum of the square of each individual
intensity difference.
Normalised Cross Correlation (NXC) - convolution matrix operation available
within MATLAB capable of providing correlation values.
In experiments the first two methods produced very similar results in near-identical times.
Normalised cross correlation took far longer to calculate results that showed no obvious sign
of superiority.
Due to the large similarity in output maps, it was decided to select a correlation function
based mainly upon speed of operation and so SAD was selected. As an additional advantage,
since SAD and SSD were implemented via an original script, rather than a pre-written
MATLAB function, there was opportunity to attach different weightings to the intensity
difference of each pixel within the tile: This would allow pixels nearer the centre of the tile to
be assigned more importance than those nearer the perimeter and allow more flexibility when
fine tuning the algorithm parameters.
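To illustrate the tiling idea, the SAD over a tile centred on every pixel can be obtained in one pass by convolving a map of per-pixel absolute differences with a summing kernel; the weighted kernel below is a hypothetical example of the centre-weighting described above:

    % Stand-in data: left image and right image pre-shifted by a test disparity
    leftImg      = rand(72, 88);
    rightShifted = rand(72, 88);

    absDiff = abs(leftImg - rightShifted);        % per-pixel differences

    % 5x5 tile SAD at every pixel; 'same' preserves the array size
    tileSAD = conv2(absDiff, ones(5), 'same');

    % Centre-weighted alternative: inner pixels count for more than the edge
    kernel      = [1 1 1 1 1; 1 2 2 2 1; 1 2 4 2 1; 1 2 2 2 1; 1 1 1 1 1];
    weightedSAD = conv2(absDiff, kernel, 'same');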
5.4.6 Image pyramiding
An innovative way to increase the speed and performance of stereo matching algorithms is
through the use of image pyramiding [17], so called since the image size is increased at each
level. The basic proposal is to perform stereo matching in a hierarchical manner, starting with
an image pair of reduced resolution. At each level the resolution is increased and matching
performed using the previous disparity map to target the search. This can increase the
efficiency of the algorithm and provide denser disparity maps, since in the case where no
match can be found at a particular level, the result for the previous level can be substituted.
Figure 5.14: Image pyramid sequence
Whilst sound in theory and successfully implemented in other programs, resolution
pyramiding was found not to be particularly useful in this application. One reason was that the
algorithm did not seem able to match pixels so accurately at low resolution and thus any
errors were carried forward, degrading the final disparity map.
Another factor that reduced the effectiveness was that the matrix method used to produce
disparity maps in one sweep could not be targeted to individual regions. In order to utilise the
previous disparity maps, correlation was performed across the entire candidate range and then
the specified target regions were extracted. Therefore, there was no reduction in the amount of
processing required and, in fact, an additional working step had been introduced. When
combined with the fact that stereo matching then had to be performed over a whole sequence
of images, this led to a not insignificant increase in execution time.
5.4.7 Filtering
When pairing pixels, the stereo algorithm selects a match on the basis of the best correlation
value available; at this stage no consideration is made of how well the pixels are matched or
whether any almost equally deserving matches are available. Resolution of the situation is
best achieved through subsequent filtering of the results. Three types of filter are prescribed to
remove the main classes of mismatched pixels (a sketch of one such filter follows the list):
Correlation threshold - Removal of any points that are insufficiently well
matched. This is achieved by simple consideration of the correlation statistic; if
the value is too high then the points are significantly different and should not be
paired.
Uniqueness – Removal of any points that could reasonably be matched to more
than one partner. One possible system requires the second and third best matches
to be extracted along with the top correlation; the second placed value is
discarded since it is common for two neighbouring pixels to produce very similar
correlations. However if the third value is also similar, say within 5% of the
original, then there is deemed to be insufficient evidence that the correct match
has been made and the pixels are eliminated.
Reciprocation – Removal of any points that match to a pixel and do not receive
a match from that pixel in return. This filter is one of the main reasons why
disparity maps are produced for both images: If a pixel in the left image pairs
with a pixel in the right image and that pixel subsequently selects a different
pixel when it is matched, then this suggests that the original match was
erroneous. However, some allowance must be made since the positioning of the
cameras often means that more of an object is visible in one image than in the
other; thus many pixels would legitimately be matched to fewer and yet could
not all receive a return match. Therefore a match is rejected if the selected pixel
does not return the match to either the original pixel or one of its immediate
neighbours.
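As an illustration, the reciprocation check might be sketched as follows; dispL and dispR are hypothetical left and right disparity maps, filled here with stand-in data:

    % Stand-in disparity maps (pixels), one value per image pixel
    dispL = randi([0 15], 72, 88);
    dispR = randi([0 15], 72, 88);

    valid = false(size(dispL));
    for r = 1:size(dispL, 1)
        for cL = 1:size(dispL, 2)
            cR = cL - dispL(r, cL);          % column selected in the right image
            if cR >= 1
                back = cR + dispR(r, cR);    % column the right pixel points back to
                valid(r, cL) = abs(back - cL) <= 1;   % allow immediate neighbours
            end
        end
    end
    dispL(~valid) = 0;     % reject matches that are not reciprocated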
5.5 Programme Implementation
The stereo matching algorithm was to be implemented in MATLAB and capable of fulfilling
the following design requirements:
• Fitness based upon intensity
• Fitness based upon intensity gradients
• Average values over a definable tile area
• Locate best match
• Filter results
Figure 5.15 illustrates the programme's operation. The subsequent sections detail the requirement
for, and operation of, specific sections of the software, including the selected value for any
user variable parameters. The source code can be found on the project CD.
Figure 5.15: Programme flow diagram
5.5.1 Intensity correction
The cameras were capable of automatically adjusting colour balance so as to provide an
optimum view under variable lighting conditions. Whilst this was a useful feature that would
allow Minibot to traverse areas of different lighting levels more freely, it conferred a major
drawback: The images supplied whilst the camera was adjusting exhibited large variations in
intensity.
Figure 5.16: Mean and Individual pixel intensity variation
The plot shows the variation of intensity vs. frame number for a typically lit scene. The mean
curve exhibits the characteristic form of a feedback loop, with a rise (fall), overshoot and
settling. The second curve shows the same adjustment for a particular pixel in the images; the
initial drop in intensity is still apparent, although the effect is confounded with noise.
The auto correction procedure occurred each time data was requested from the cameras,
regardless of how long the device input had been open. Since the initially high intensity was
prone to saturating the image, especially when in brightly lit environments, the only feasible
solution was to disregard the first 30 image frames.
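A sketch of the capture step, assuming MATLAB's Image Acquisition Toolbox with a 'winvideo' adaptor (the adaptor name and device ID are installation-specific):

    % Open the first webcam and request a burst of frames from one trigger
    vid = videoinput('winvideo', 1);
    vid.FramesPerTrigger = 80;
    start(vid);
    frames = getdata(vid, 80);      % height x width x bands x frames array
    delete(vid);

    % Discard the first 30 frames, captured while the camera auto-adjusts
    frames = frames(:, :, :, 31:end);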
5.5.2 Noise removal
The level of noise present in the images supplied by the webcams was enough to seriously
impair the capability of the stereo vision system.
One solution was to use a smoothing algorithm to remove the noise. MATLAB provided three
such filters; median, averaging and adaptive, of which median appeared to be the most
effective. However, whilst smoothing the image would alleviate noise it could also have the
effect of reducing clarity and removing the contrasts that allow points to be matched.
An alternative method was to average each pixel intensity value over time by acquiring a
series of consecutive frames from the camera. This would reduce the effect of noise without
sacrificing sharpness, although the time taken to capture the image would be increased.
Figure 5.17, below, allows a comparison of the two methods under different lighting regimes;
daylight and low level. Each row shows a single, original frame followed by the same image
smoothed by a median filter. The right hand column depicts the scene when averaged over
100 frames.
Figure 5.17: Filtering vs. averaging under different lighting conditions
Whilst the filtered and averaged images are almost indistinguishable in good light, the averaged
picture clearly offers superior noise reduction under adverse conditions. Interestingly the loss
of sharpness in the filtered images is less significant than expected.
The comparison suggests that the time-averaging technique is an effective and more robust
method than filtering. Also, since the camera's auto-intensity correction already required
multiple frames to be captured (see Section 5.5.1), the capture time penalty was a less significant
factor. Therefore, time averaging was selected as the best way to eliminate noise from
captured images.
5.5.3 Time-averaging
Time-averaging was performed by simply calculating the mean intensity value for each
individual pixel over a series of image frames.
Averaging over a large number of frames was preferable in order to best reduce the effect of
noise and allow a closer estimate of a pixel's true intensity. However, each additional frame
captured would increase the time required to execute the algorithm.
Figure 5.18: Convergence of averaged images
Figure 5.18 shows the discrepancy between an image averaged over the indicated number of
frames and the overall average, taken over 80 frames. The difference is measured as the mean
sum of absolute differences in intensity values. It shows, for instance, that when
considering a single frame we would expect each pixel to deviate from the 'true' value by just
over 0.9.
It was decided that 40-50 frames would be used to calculate pixel intensity as this appeared to
provide a reasonable approximation without requiring an excessively long capture period.
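The averaging itself then reduces to a single call; a minimal sketch with stand-in data for a stack of greyscale frames:

    % Stack of 50 greyscale frames along the third dimension (stand-in data)
    frames = rand(72, 88, 50);

    % Per-pixel mean over time suppresses noise without blurring edges
    avgImg = mean(frames, 3);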
5.5.4 Image adjustment
Whilst the webcams were capable of adjusting brightness to suit variable lighting conditions,
the images produced were still lacking in contrast. By inserting a single contrast enhancement
command, imadjust, the adjusted images would better show the variations in intensity that
would help distinguish features and facilitate superior matching.
Figure 5.19: Image as captured from camera and after contrast enhancement
5.5.5 Resolution
The size of the image upon which the disparity calculations are performed can have a
profound effect on the run time of the algorithm. For instance, by halving the width and
height of an image the number of pixels to be matched is reduced by a factor of four.
Furthermore, any reduction in image width can be accompanied by a reduction in the
maximum disparity that must be searched to. Therefore we may expect the calculation time to
be related to an image’s characteristic length raised to the third power.
Figure 5.20: Processing time as a function of image size
The figure shows the time taken to calculate disparity maps for a pair of images of the size
specified. Since the value of the maximum disparity parameter is dependent upon other
factors, measurements were taken at levels of both 10 and 20% of the image width for
illustrative purposes. Times were calculated as the average of three runs.
The graph clearly shows that large image sizes can drastically increase run time; processing a
pair of original, 288x352, images straight from the cameras can even cause the programme to
crash.
However, higher resolution images allow objects to be plotted on a finer scale and can
improve the programme's depth estimation capability. This is because the optical disparity is
measured in terms of pixels and therefore takes only integer values. By increasing the number
of pixels across an image you enable a finer measurement of disparity and are thus capable of
resolving a larger number of discrete depths.
Eventually, it was decided that high resolution disparity maps were unnecessary since
mapping on a centimetre scale would be gratuitous for a robot that had dimensions of the
order of tens of centimetres. By passing images of size 72x88 to the stereo imaging
programme maps of sufficient detail and adequate disparity levels can be produced in times
well below five seconds.
Image resizing was performed by an inbuilt MATLAB function, imresize. The command was
capable of scaling images by a variety of methods; nearest neighbour, bilinear or bicubic.
Bicubic sampling was selected as the best option as it was less prone to being affected by
individual pixel values than nearest neighbour, and took no longer to compute than bilinear.
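Assuming the Image Processing Toolbox, the resizing and contrast steps of Sections 5.5.5 and 5.5.4 reduce to two calls; a sketch with a stand-in frame:

    % Stand-in averaged greyscale frame at the cameras' native resolution
    img = rand(288, 352);

    small    = imresize(img, [72 88], 'bicubic');   % bicubic resample to working size
    adjusted = imadjust(small);                     % contrast stretch before matching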
5.5.6 Matrix calculation
One distinct advantage of using the MATLAB programming environment was the ability to
manipulate data in the form of matrices. This feature was ideal for processing stereo images,
which were ultimately two-dimensional arrays of intensity data, quickly and efficiently.
Having obtained suitable intensity and intensity gradient data for both images, Minibot’s
stereo vision programme relied on an original procedure to calculate disparity (a condensed
sketch follows Figure 5.21):
• First, the correlation coefficient was calculated at every point in the left image, for both intensity and gradient, at a disparity of zero: this was achieved by aligning the right image arrays over the left images and computing the absolute difference in corresponding values.
• Then the correlation was found at a disparity of one by shifting the right arrays one pixel to the right and again taking the absolute difference between elements.
• This process was repeated until the maximum disparity was reached; at each stage the correlation values were recorded in 3d correlation matrices, one each for intensity and intensity gradient. The correlation matrices’ x and y dimensions corresponded to the horizontal and vertical position of the pixel being matched in the left image, whilst the z dimension indexed the disparity level.
• Next, the intensity and intensity gradient correlation matrices were combined, either by addition or element-wise multiplication, with suitable weightings applied.
• Multiple versions of the overall correlation matrix, each one offset by a different amount, were summed to replicate the effect of pixel tiling.
• Finally, the minimum correlation statistic along each z direction was located: the z coordinate of this value was the disparity calculated for that point in the left image.
• The left correlation matrix formed during this procedure would only be partially filled with values. This is because, for pixels near the edge, disparity values above a certain level make reference to points outside of the right image. In this region the correlation coefficient was set arbitrarily high.
Figure 5.21: Left correlation matrix
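The following is a condensed, intensity-only sketch of this procedure; the actual robot_vision.m also incorporates the gradient correlation, weightings and subsequent filters:

    % Stand-in greyscale image data at the working resolution
    leftImg  = rand(72, 88);
    rightImg = rand(72, 88);

    [h, w]  = size(leftImg);
    maxDisp = 15;
    corrMat = inf(h, w, maxDisp + 1);    % unmatched regions stay arbitrarily high

    for d = 0:maxDisp
        % Shift the right image d pixels to the right and compare the overlap
        diffI = abs(leftImg(:, d+1:w) - rightImg(:, 1:w-d));
        % 5x5 tile summation replicates the tiling step
        corrMat(:, d+1:w, d+1) = conv2(diffI, ones(5), 'same');
    end

    % Disparity is the z index of the minimum correlation at each pixel
    [~, idx] = min(corrMat, [], 3);
    dispL = idx - 1;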
There was no need to create the right correlation matrix completely from scratch since it
merely represents a transformation of the left correlation. Recognising this shortcut allowed
some of the more processor intensive calculations to be performed once, rather than twice.
The transformation from left to right correlation matrix required each xy plane of actual
values to be shifted to a 'left alignment.'
Figure 5.22: Left and right correlation matrices
5.5.7 Parameter values
The process of fine tuning the algorithm was largely a case of selecting values for the many
programme parameters; image size, maximum disparity, tile size, correlation weighting and
filtering levels. Unfortunately, altering one quantity would most likely affect the optimum
setting of another, for instance, changing the image resolution would mean a different tile size
was appropriate.
Furthermore, there was no definitive way of assessing the performance of each run since the
true depth values were unknown. Whilst some stereo pairs available online were accompanied
by 'true disparity maps' they were invariably computer generated images or disparity maps
calculated by other programmes and thus not necessarily correct. Either way, perfecting the
algorithm to perform well on downloaded images would not ensure optimum performance
with real world data.
In order to best calibrate the stereo vision system, it was therefore necessary to conduct a
number of trial runs on images captured by the webcams of typically encountered settings. By
varying parameters slowly and comparing performance by inspection it was possible to arrive
at a reasonable set of operation settings:
Parameter                      Value
image size                     72x88
maximum disparity              15
tile size                      5x5
Correlation matrix:
  combination method           multiplication
  intensity weighting          ^1
  gradient weighting           ^1
Filters:
  correlation threshold        ~1500
  uniqueness threshold         ~100
  reciprocation range          1

Table 5.1: Parameter settings
5.6 Performance
Upon implementation and tuning the stereo vision programme performed well, as illustrated
by Figure 5.23. These disparity maps were created for a pair of file photos where the picture
clarity is higher than for images produced by the webcams. As such more of the image can be
matched successfully; the result is included here to demonstrate the capability of the
algorithm.
Figure 5.23: Disparity map for test image
The disparity map provides an indication of depth to the object at that point; the lighter the
pixel the closer the object to the observer, although the relationship between disparity and
depth is not linear but inverse. The wide black bands at the outer edge of each disparity map
are due to the fact that parts of the image in this region do not appear in the other view and
cannot be correctly matched.
Figure 5.24: Disparity map for real image
Figure 5.24 shows the results for a pair of stereo images captured by Minibot’s camera unit.
In this instance, the uniqueness filter has removed matches made on the floor and door
leaving the toy clearly visible, along with the walls on either side.
5.6.1 Execution Time
Interestingly, the time taken to execute robot_vision.m was more due to the collection of
image data than to the calculation of disparity maps. Table 5.2 shows the time typically spent
performing each section of the stereo algorithm.
Operation               Time / sec
Image capture           13
Adjusting / Resizing    0.7
Correlation             0.05
Tiling                  0.7
Sort / Locate match     0.05
Filtering               0
Total                   14.5
Table 5.2: Typical process run time
As can be seen, even when image preparation was included, the processing time of the stereo
vision system only amounted to 1.5 seconds. The vast majority of run time was dedicated to
acquiring images from the cameras, mainly because of the large number of frames required.
5.6.2 Evaluation
The in-service performance of Minibot’s stereo vision module occasionally failed to achieve
its full potential. Where this occurred, two general manifestations were observed: noisy
regions of mismatched pixels that appeared to represent an obstacle that did not exist, or an
inability to recognise real objects because of insufficiently unique matching.
Both of these faults could be attributed to a failure of the filters, letting either too many or too
few points through. However, it is perhaps fairer to cite an inability to optimise filter
parameters to be applicable to the widely variable lighting environment and image
composition.
Another factor which would certainly have a detrimental effect on execution was the
difficulty in aligning the cameras and maintaining the arrangement whilst operating. The
cameras could only be aligned by eye and were sensitive to very small changes in orientation,
which could be caused by the robot bumping and jerking during movement. Any slight
discrepancy in height would lead to the algorithm searching along the wrong scan line for an
object. Worse, a deviation from the parallel arrangement could lead to subtle changes in
disparity that would transform to substantial differences in estimated distance.
Whilst inconsequential to the quality of the disparity maps produced, the module's
performance could be vastly improved if the time taken to obtain images could be reduced.
This could be achieved by utilising a superior quality camera to capture a single image, thus
alleviating the need to remove saturated frames and noise.
5.7 Disparity Transformation
The output from the Stereo Vision module (Section 5.5) is in the form of a double-precision
integer matrix representing a positive disparity between right and left images (see Figure
5.25). This matrix, along with some necessary information about the camera installation,
allows for the transformation of each disparity value into a robot-relative position. For a given
robot location, a further transformation can be made from robot-space into global-space, and
as such the features identified in the disparity image can be recorded onto a global map.
Reference [38] describes the triangulation method for two identically orientated cameras
(Section 5.2 contains a discussion of different camera orientation strategies). Equation 5.4
summarises this calculation which, when ideal cameras are employed, is dependent on two
constants only: the focal distance f of the two cameras, and the distance between them, D.
z = Df / (x1 − x2)

Equation 5.4
where z is the image depth in metres, D is measured in metres, and f, x1 and x2 are all in
image pixels. An ideal camera is assumed to have a thin, plane lens with pixel aspect ratio
equal to 1 and no distortion [38] and [39].
The matrix of disparity values (x1 − x2) contains only positive integers, which causes some
interesting results and limitations on the transformation. The smallest measurable disparity is
1 pixel, which, ignoring errors, corresponds to a distance (or image depth) of Df, or about 5
metres, based on the constants defined below. The uncertainty of this measurement equates to
values of between 3.3 and 10 metres.
infinite distance, but due to the discrete nature of the disparity matrix as mentioned above,
zero disparity can in fact represent distances of 10 metres and above. This uncertainty varies
in inverse proportion to the pixel disparity as shown in Figure 5.26, and so distances can only
be considered accurate to ±10% for disparities of 6 pixels and above, representing distances
below 1 metre.
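These figures follow from propagating a half-pixel disparity uncertainty through Equation 5.4; a quick numerical check:

    D = 0.06;  f = 80;             % separation / m and focal distance / pixels
    d = 1:10;                      % integer pixel disparities
    z     = D * f ./ d;            % nominal depths / m
    zNear = D * f ./ (d + 0.5);    % depth if the true disparity is half a pixel more
    zFar  = D * f ./ (d - 0.5);    % depth if the true disparity is half a pixel less
    % At d = 1 this gives z = 4.8 m with bounds of 3.2 m and 9.6 m; with Df
    % rounded to 5 m these become the 3.3 m and 10 m quoted above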
5  5  6  7  9
4  5  5 10 12
4  4  3  3  7

Figure 5.25: Disparity represented as a) Grayscale Shading and b) Numerical Values
Figure 5.26: Maximum error (± %) plotted against disparity (px)
Figure 5.27: Transformation of a perfect disparity map, after filtering (D and f were unknown for this image, and estimates were used)
Figure 5.28: Disparity transformation of a real image
It should be noted that, subsequent to the theoretical derivation and analysis of these results,
[38] states that the relationship “in real-world situations is actually close to being useless” due
to the accumulation of measurement errors, inevitable differences in orientation between the
two cameras, and different focal lengths (f) for the two cameras. In this project it was
considered that the advance in present camera technology would alleviate some of these
issues, and initial tests showed that the triangulation technique was sufficiently robust to
overcome other errors.
The disparity matrix is transformed into a projected-distance matrix and the method of similar
triangles used to place features into 3d space. Figure 5.27 shows the complete transformation
process on a perfect test image from [41] and Figure 5.28 on the real image introduced in
Section 5.6. An empirically derived scaling factor, η, was used to correct skew in the image.
Although the exact cause of the skew has not been identified, the curvature of the camera lens
and finite size of photo-sensor (i.e. non-ideal cameras) have been acknowledged as
contributory factors [40].
For these images the cameras were mounted 6cm apart (D = 0.06) where the focal distance f =
80 pixels. The scaling factor η = 0.5.
Following this stage the data set may be filtered to remove the ceiling or floor so as to prevent
clutter when viewed as a top-down projection. This would be necessary for flattened (two-dimensional) maps, although the full data set should be included when plotting true 3d vector
maps (see Section 6.1.1 for a discussion of mapping techniques). The data is filtered by
removing the top 20% and bottom 5% of points in the y-direction (height). The other filtering
methods considered were based on vertical distribution density, where it is assumed that a
significant proportion of the original image is taken up by the floor, and on an actual height,
which filters data-points below 5cm in height. While both of these alternatives could
potentially produce floor filters with a higher accuracy, the chosen method was considered to
be the most effective and simple to implement. A filter based on actual height would require
very precise calibration, and the distribution density filter produces erroneous results when
little or no floor is in view. The drop in accuracy for the chosen method will become most
relevant where low level obstacles in an indoor or outdoor environment could become lost.
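A sketch of the chosen percentile filter, assuming a hypothetical N-by-3 matrix pts of [x y z] points with y as the height coordinate:

    % Stand-in 3D point cloud: columns are [x y z], y being height
    pts = rand(500, 3);

    ys  = sort(pts(:, 2));
    n   = numel(ys);
    yLo = ys(ceil(0.05 * n));      % height bounding the bottom 5% of points
    yHi = ys(floor(0.80 * n));     % height bounding the top 20% of points

    % Keep the middle band: floor and ceiling points are discarded
    pts = pts(pts(:, 2) >= yLo & pts(:, 2) <= yHi, :);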
6 Navigation and Control
6.1 Map Construction
6.1.1 Discussion of mapping techniques
Robotic maps can be classified in a number of different ways. Perhaps the most obvious is
into two or three-dimensions. The simplest is a top-down two-dimensional (2d) map,
commonly used for navigational maps for walking or driving, or as floor plans for buildings.
These can be further enhanced into ‘two-and-a-half’ dimensions by the addition of further
information. Shipping charts, for example, use numerical values representing depth superimposed onto a 2d chart, and contour-lines on Ordnance Survey maps display relief
information in a similar way. Building plans often use two or more 2d maps ‘layered’ on top
of one another to represent a third dimension.
All of the described methods are used because true three-dimensional (3d) representations of
environments are very difficult to achieve without complex graphics or computation. Only
recently have Computer Aided Design (CAD) packages enabled technical drawings to be
represented easily in 3d, and some specialist computer modelling packages, such as
MATLAB, now offer significant toolboxes dedicated to 3d modelling.
A further important classification of robotic mapping is into metric (either grid or vector
based) or topological maps. Metric maps are so called because they record distances between
geometric features in an environment. Grid based maps divide an environment into discrete
segments, and record the status of each segment. Well-suited to computational calculations
because of their rectangular properties, grid-based maps unfortunately require a high
resolution (i.e. small segment size) to achieve high accuracy, which leads to long processing
times and requires large amounts of data storage.
Vector maps record the exact position of features using mathematical expressions and so are
much more economical with storage. A single expression may be used for large wall
segments or other features. The primary drawback to vector maps is that while arcs and lines
can be quite easily plotted, more complex shapes can be difficult to represent mathematically.
Both these methods can represent 3d data as a fairly straightforward extension of 2d
modelling, although in the case of occupancy grids, the aforementioned issues of processing
time and storage may grow by orders of magnitude.
Rather than recording geometric data, topological maps focus on connectivity between
different places, for example doorways or corridors [50]. Environments are represented as
significant features, annotated with information on how to move between them, and as such
are very efficient as they only store relevant information. Most topological maps, however,
rely on metric data in practice [42]. Furthermore, while topological maps allow robots to
navigate with high efficiency, a further transformation of the map is required to allow human
interpretation. For this reason topological maps were not considered for implementation in
this project.
6.1.2 Employed Solution
Initially this project investigated 3d vector mapping, making use of the MATLAB function
patch.m. Obstacle data taken from one or more sensors, and represented in 3d space, was
modelled by fitting 3d planes to the data using a least-squares approach. This method did not
suffer many of the limitations of a grid based system, and has the potential to map quite
complex environments including stairwells. Figure 6.1 demonstrates a simple three-walled
room mapped in this way, using 0.5 metre panels. By only creating panels where a cluster of
more than n points occurs within a given radius (n is user defined) the method can effectively
filter out anomalous data points. The routine uses the plane fitting function fitplane.m from
[46].
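As a hedged illustration of the least-squares step (our sketch, not the fitplane.m routine from [46]), a plane z = ax + by + c can be fitted to a cluster of points as follows, where pts is a hypothetical n-by-3 matrix of [x y z] coordinates:

% Least-squares plane fit z = a*x + b*y + c -- illustrative sketch.
M    = [pts(:,1), pts(:,2), ones(size(pts,1), 1)];
coef = M \ pts(:,3);                      % least-squares solution [a; b; c]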
When tested, the method behaved poorly when fed noisy data, as can be seen in Figure 6.2.
When a small cluster radius was used there was an insufficient number of points to plot, but
as the radius was increased the best-fit plane failed to represent the contour of the
environment. It was concluded that this method was robust only when used with very accurate
sensor data containing little noise.
A grid-based approach to mapping was implemented to overcome the issues experienced with
noise. The grid was defined in two dimensions only, with a 10cm resolution, i.e. 100 unit
squares per square metre. Figure 6.3 shows how a possible MATLAB representation of a 3d grid
using voxels [45] might look, but this approach was not followed during the project for
two main reasons. The first was complexity: a 2d square environment of 10m each side has
100² unit spaces. Adding a third dimension of 2m height increases this figure by a factor of
20. Secondly, to navigate successfully a robot is concerned only with obstacles either on the
floor or below its maximum height. In most common environments obstacles occur from the
ground up (chairs, tables, walls etc.) and so the third spatial dimension would add very little if
any useful information.
A significant limitation to the grid method was the choice of resolution. The chosen resolution
must be sufficient for the robot to recognise gaps between obstacles to an accuracy of
approximately a quarter of the robot width, while minimising the increase in processing time
and data storage capacity. Furthermore, it was decided that the map should only represent a
fixed size environment, which would be pre-determined. This was to avoid complicated
expressions needed to represent a dynamically changing size of map. As such, the size of
mapping environment chosen must be large enough to fully cover the area to be explored,
whilst again minimising the cost to processing time.
Further advantages to the grid approach are the ability to represent obstacles with probability
distributions (see below) and ease of robot control.
Figure 6.1: 2 views of a 3d vector map
Figure 6.2: Vector-based representation of noisy data
Figure 6.3: 3d grid ‘voxel’ representation. Although visually pleasing, this method is not well
suited to mapping complex or cluttered environments
6.1.3 Method
The 10l × 10l obstacle matrix A represents a square environment of side l metres, or 10l
decimetres. From a top-down perspective, cell A(1,1) is the south-west (or bottom-left) corner
and A(5l,5l) is the centre cell. Initially A = 0, as all cells are assumed to be free from obstacles.
Section 8.3.3 shows that for one complete wheel rotation, the robot travels 52cm. To simplify
calculations, the Robot Metre, defined as 104cm (exactly two wheel rotations), was used in the
mapping algorithm.
Data from the disparity transformation is first converted from robot-space to global space
using a straightforward geometric transformation, based on the robot position at the time the
scan was taken. As y-data (height) is being ignored, a 2 × n matrix P contains the x- and
z-coordinates for the n data points determined by the disparity function.
If we let A(P_k) = A_ij, where P_k = [i, j] at time t, then at time interval t + 1, A is updated by
the formula

A(P_k)_{t+1} = A(P_k)_t + 1/n,   for k = 1 : n

Equation 6.1
This formula states that for each data point P_k a value equal to 1/n is added to the previous
value of the corresponding cell in A. In this way, the greater the number of data points
calculated to be in any location, the larger the number added to that cell. After each iteration,
A is normalised by the formula A = A / max(A) so that the minimum and maximum values of
A are always 0 and 1 respectively. The result of this is that at any given time cell A_ij stores the
probability of an obstacle existing at coordinates [i, j] in the mapping environment. The effect
of normalising the obstacle matrix is also to remove anomalous data points over time. Figure
6.4 shows how Equation 6.1 is implemented within the MATLAB environment. The obstacle
matrix A is represented in a MATLAB figure window using the colormap ‘bone’, whereby a
white square indicates no obstacle, with increasing probability as the shade turns to black
(Figure 6.5).
% Add 1/n to the cell corresponding to each of the n data points
for i = 1:length(P)
    A(P(i,2), P(i,1)) = A(P(i,2), P(i,1)) + 1/length(P);
end
% Normalise so that cell values remain in the range [0, 1]
A = A ./ max(max(A));
Figure 6.4: MATLAB code extract
Figure 6.5: MATLAB figure showing obstacle grid
6.2 Navigation and Exploration Strategies
The robot navigation strategy should provide a thorough search of the environment in an
efficient (but not necessarily optimal) manner. For this robot, and others which rely on
dead-reckoning as the only method of localization, the strategy should also aim to minimize the
quantity and magnitude of turns, as these induce cumulative errors (see Section 7.4.1).
Many tools are available for finding optimum routes through known environments, but
exploring robots have to cope with partial and incomplete models. For this reason, most
robotic exploration relies on a random, or stochastic, strategy [47]. Common exploration
algorithms also greedily maximise information gain, that is they are heavily weighted towards
exploring areas previously unvisited [42]. The Rapidly-Exploring Random Tree method,
discussed in [43] and used effectively in robotic exploration in [44], is an example of both
these features. The Minibot exploration strategy was also designed along these two principles.
Search strategies based on both vector and grid-based data were simulated in MATLAB, to
allow a comparison between the two methods. Figure 6.6 and Figure 6.7 show the vector
method for two differently shaped simulated environments, with the robot path shown in red.
Figure 6.6: Vector based search – Effective
Figure 6.7: Vector based search – Less Effective
In this simulation, the robot always travels in a straight line until in close proximity to a wall
or object, when it performs a turn of a randomly determined angle. The MATLAB function
rand.m was used to provide uniformly distributed random numbers. Figure 6.6 shows that the
strategy was able to perform an effective and efficient search of its environment. From
Figure 6.7, however, it can be seen that complete exploration cannot be guaranteed using this
strategy: in that example the robot completed many loops without exploring the full
environment.
The grid-based navigation simulation utilises the property that at any given time, the robot
can move to one of 8 neighbouring cells (Figure 6.8). Each of the 8 cells is assigned a
probability based on a number of user defined conditions, including the probability that the
cell contains an obstacle, whether the cell has previously been visited by the robot, and
whether visiting that cell would require the robot to turn. These variable probabilities are of
the same matrix form as A defined in section 6.1.3, and are combined through weighted
multiplication.
The probability of the robot moving to cell c_i is given by Equation 6.2:

P(c_i) = (1 − A_ci)^x × B_ci^y × C_ci^z

Equation 6.2
where A is the obstacle matrix (see 6.1.3), B is initially 1 everywhere but B_j is multiplied by
0.5 each time cell j is visited by the robot, and C is 1 for the cell the robot is currently facing
and 0.1 for all other cells. The parameters x, y and z are user adjustable; typical values are
x = 25, y = 100 and z = 50.
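As a minimal MATLAB sketch of Equation 6.2 (our illustration; pA, pB and pC are hypothetical 1-by-8 vectors holding the values of A, B and C at the eight candidate cells):

% Transition probabilities for the eight candidate cells -- sketch only.
x = 25; y = 100; z = 50;                  % typical parameter values
P = (1 - pA).^x .* pB.^y .* pC.^z;        % Equation 6.2, element-wise
P = P / sum(P);                           % normalise to a distribution
move = find(cumsum(P) >= rand, 1);        % draw the next cell at random

The final two lines show one plausible way of drawing the stochastic move; the report does not specify this sampling step, so it is an assumption.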
Figure 6.9 shows how the method was simulated in MATLAB. Solid black lines represent
walls, with other shaded areas representing other obstacles and the path travelled by the robot,
as a combined probability distribution. The robot position is illustrated as a red cross, with its
‘tail’ showing the previous five moves. The simulation clearly shows a preference for travel
in straight lines, avoiding obstacles and previously visited locations. During testing of the
simulation, the exploration strategy consistently performed an efficient and thorough search
of the test environment.
At a given time the robot is at location R; cells c1 to c8 are the 8 possible connected moves.
Figure 6.8: Possible moves in a grid-based navigation system
Figure 6.9: Grid based mapping simulation. Robot is shown as a red cross.
An investigation was performed into the effect of varying parameters x, y and z. Of particular
significance, parameter y affects the robot’s preference for searching new territory, equivalent
to the rate of information gain. The test measured the time taken to simulate a search of 95% of
the environment, and the results are shown in Figure 6.10. The graph shows that y has a
significant effect on reducing the search time, especially at values up to ~100 (note the
logarithmic scale). Above this, although some further improvement occurred, a detrimental
effect could be seen in the robot’s preference for minimising the number of turns, an equally
important behaviour. The simulation was found to perform well in all areas with y = 100.
The grid-based method of exploration and navigation was chosen for implementation on the
robot. The MATLAB simulation showed it to be reliable and consistently more effective than
the vector-based equivalent. Additional benefits of the chosen solution were a less
complicated MATLAB function and the ability to combine probability distributions for an
unlimited number of user-specified factors.
A significant limitation to the simulated strategy, however, was the inability of the robot to
consider environmental factors beyond the immediate eight connecting cells. This can cause
the robot to ignore large areas of unexplored space if a single bounding cell has been visited
already. The strategy was altered to overcome this limitation by forcing the robot to consider
a weighted polygon of cells for each potential move, similar to the method described in [48].
Using this weighting, many occupied cells a small distance from the robot would have the
same effect on the transition probability as would a single occupied cell close to the robot.
Figure 6.11 shows the polygon and weightings used. This solution slightly increased the risk
of potential collisions, but improved the efficiency of the search.
During testing of the actual robot, the number of potential moves was reduced from eight to
four, spaced equally at 90°, for simplicity.
[Graph: exploration time in seconds (0 to 1.6) against the information gain parameter y (0.1 to 10000, logarithmic scale).]
Figure 6.10: Variation in exploration time with varying y.
[Diagram: weighted polygon of cells around the robot R, with weightings of 1, 0.2 and 0.05.]
Figure 6.11: Weighted polygon of cell probabilities
6.3 Robot Control Programme
The robot control programme consists of nine distinct and independent sections, discussed
below. The MATLAB script was therefore designed to be modular, allowing individual
functions, or modules, to be updated and replaced more easily. The script was designed for
MATLAB v7.0 and employs the Image Acquisition and Image Processing toolboxes. A flow
chart of this process can be found in Section 2.1.3. The full MATLAB code can be found on
the accompanying project CD.
6.3.1 Initiation
This header creates the main programme variables, including the initial environment map and
any user-defined parameters. The serial-port object is created, and the random-number
generator re-seeded to ensure stochastic behaviour between runs.
6.3.2 Get Sensor Data
This module addresses the two cameras concurrently and acquires images. The images are
manipulated and the disparity calculated and returned to the control script. The module
requires no variable inputs.
6.3.3 Update Map
The data is manipulated and plotted as described in Section 6.2. The function takes as inputs
information on the current robot position and heading and the disparity values returned in
6.3.2.
6.3.4 Route Selection
This function takes as inputs the robot current location and probability distribution matrix
and selects the next move. Based on the current robot location and the chosen next cell, the
function calculates what, if any, angle the robot must turn and the distance to move forwards.
6.3.5 Output Commands
The movement calculations from 6.3.4 are converted to ASCII characters and sent via a
one-way RS232 link to the hardware controller Module 1 (Section 7.2). Due to the different
processing speeds of the operating PC and hardware controller, a 0.1s pause was introduced
between each of the four distinct commands. The serial port was opened before each
communication and closed afterwards. Figure 6.12 shows the form of the command to set the
left motor speed, stored in the variable Lspeed. The serial port object is stored in the variable
s.
fprintf(s, '%s \r', num2str(Lspeed));  % send the speed as an ASCII string
pause(0.1)                             % allow the PIC to process the command
Figure 6.12: MATLAB serial commands
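The full four-command sequence might therefore look like the sketch below (our illustration; Rspeed, Ldist and Rdist are hypothetical names for the other three command variables, and where the report opens and closes the port around each communication, the sketch opens it once for brevity):

% Send the four movement commands with 0.1s pauses -- sketch only.
fopen(s);                                 % open the serial port
cmds = [Lspeed, Rspeed, Ldist, Rdist];
for k = 1:4
    fprintf(s, '%s \r', num2str(cmds(k)));
    pause(0.1);                           % let the PIC keep pace
end
fclose(s);                                % close the port afterwards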
6.3.6 Update Robot Location
Before the process reiterates the new robot location must be calculated. It was intended that
MATLAB would receive information from the dead-reckoning system (Section 3.1) via the
RS232 link, concerning the movement of the robot. This feature has not currently been
implemented due to time constraints, but it is recommended that the modification be made in
the future.
Instead, the robot location is currently updated based on the commands sent to the motors,
under the assumption that the behaviour of the robot is accurate and predictable.
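A minimal sketch of this open-loop update (our illustration; pos is a hypothetical [x, z] pair, theta the heading in radians, and turn_angle and d the commanded turn and distance):

% Open-loop position update -- assumes the robot moved exactly as commanded.
theta = theta + turn_angle;                  % apply the commanded turn
pos   = pos + d * [cos(theta), sin(theta)];  % advance along the new heading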
6.3.7 Display Current Status
At this stage the script outputs data to inform the user of the robot’s current status and
position. MATLAB figure windows are used to show the current situation of the map, and the
robot’s position on that map. Another figure shows the most recently captured camera still
image and the disparity map calculated from it.
A GUI (Graphical User Interface) concept was explored to display the above information, in
addition to the current robot status. This could include motor speed and direction, total search
time, battery status and any other user-specified data. A two-way GUI could also allow the
user to interact with the robot, prematurely ending the search, or altering other variables and
parameters as necessary. Time constraints have not allowed the GUI to be implemented, but it
is recommended that this enhancement be made in the future. An example of a robot GUI can
be found in [49].
6.3.8 Loop and Exit Conditions
A function is used to calculate whether to continue or terminate the exploration. At present
the exit condition is set to a time limit (for example 30 minutes) although other exit conditions
could include completing 90% exploration of a given environment or looking for stagnation
behaviour – when the robot follows the same movement pattern a number of times. If the exit
condition is not satisfied, a ‘while loop’ returns the programme to stage 6.3.2, and the process
is repeated. On satisfying the exit condition, the programme moves to section 6.3.9.
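As a minimal illustration of the time-limit condition (our sketch; start_time is a hypothetical clock vector captured during initiation):

% Time-limit exit condition -- illustrative sketch only.
elapsed = etime(clock, start_time);       % seconds since the search began
done    = elapsed > 30 * 60;              % terminate after 30 minutes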
6.3.9 Termination Commands
These final commands take place after the exploration has been completed. None are
currently implemented, but commands may include auto-navigation back to the robot start
location, and additional map transformations to further aid the user.
7 Low Level System Design
The low level systems include the control of sensors and movement. They operate in
conjunction with the higher level software systems which are primarily based in MATLAB.
There were three primary low level systems: the motor controllers, the range finder and the
dead reckoning sensors, all controlled and monitored via two PIC microcontrollers
(Module 1 and Module 2).
As simulation packages were not available for compiled PIC-C code (used on Modules 1 and 2),
testing had to be done in stages. To assist in testing, a software monitoring display was
created (its design and mounting are discussed in section 8.6.7) which allowed the user to see
at a glance where in the programme the PIC was operating.
The full circuit diagram for the low level systems (Module 1 & 2) can be found in Appendix
C.
7.1 Requirements of the low level systems
Controlling every system separately does not make for a good modular design approach: if
the high level system failed, it would not be possible to control any low level systems.
Therefore it was decided to centralise all the low level systems and control them through one
master module, Module 1. Any faults with the low level systems could then be analysed
independently of the high level system.
The requirements of the low level systems were:
• Operate independently of the computer main board
• Take, as an input, the bare minimum of information from the main board to prevent undue dependency upon the higher level systems
• Communicate with the main board and all peripheral components using industry standard communication techniques
• Interface with the robot’s four motors using a controller
• Provide a collision detection system that works independently of the robot’s stereo vision system
• Monitor the robot’s movements against the required movements
7.2 Module 1
Module 1 is the ‘master’ module and provides the connection between the computer’s main
board and the robot’s sensors and controllers. Module 1 directly controls the motor controllers
and the dead reckoning sensors, and indirectly (via Module 2) controls the range finder (each
system is discussed in detail later in the report). Figure 7.1 shows how the systems are
connected.
Module 1 is based on the PIC 16F877A. This IC was chosen because of its extensive memory
capacity (368 bytes of data memory and 256 bytes of EEPROM), its large number of
input/output pins (5 ports of 8 pins each) and its ability to handle RS 232 and I2C data links.
This PIC meets the requirements of the project and allows for future developments without
having to change the hardware. PIC ICs can be programmed using a low level assembler
language or in a higher level programming language such as PIC-C. Assembler is primarily
used when processing speed and memory capacity are a high priority. However, since this
project operates well within the limits of the PIC, it was decided to use the higher level
language PIC-C. PIC-C provides built-in libraries of commands, which streamlines the
programming and allows for easy understanding and integration with future project work. An
instructional guide to PIC-C is available on the project CD (ccsc.chm).
[Diagram: the PC connects to Module 1 via RS 232. Module 1 exchanges logic signals with the left and right dead reckoning sensors, drives the left and right motor controllers over its I2C data link, and exchanges logic signals with Module 2, which operates the SRF08 range finder over a second I2C data link.]
Figure 7.1: Integrated low level systems
The programme follows the process laid out in Figure 7.2. Upon power-up, the programme
initialises the motor controllers. As shown in Figure 7.1, this happens using I2C commands,
directly addressing each motor controller in turn (the specific device address is set in
hardware on each motor controller). Once the motor controllers are initialised, the programme
waits for four bytes of data from the main PC. These variables (Left speed, Right speed, Left
distance and Right distance) are stored in memory and used to command the motor
controllers.
From this point, Module 1 operates independently of the main PC until an object has been
detected in its path or it has reached the required distance. When the robot moves, the stereo
vision system is not active and therefore no commands would ever be sent from the main PC
to Module 1 until all motion has ceased. This meets one of the requirements of the low level
systems to minimise interaction between modules and prevent undue dependency between
modules.
As shown in Figure 7.2, Module 1 subsequently monitors the output of the dead reckoning
system together with the output of the collision detection system (Module 2).
It was decided to divide the tasks between Module 1 and Module 2 in the manner described
because it was found that the range finder worked best when it was constantly active during
the forward movement of the robot. Although not complicated, this activity requires the sole
devotion of one microcontroller. It was therefore appropriate to split this task into a separate
module (Module 2), leaving Module 1 with the sole responsibility of checking the status of
the dead reckoning system and the status of the collision detection system (Module 2). Both
modules operate using a 4MHz oscillator, which ensures each module reacts faster than the
changing environment of the robot (i.e. the PIC can monitor all inputs faster than the signals
will change due to a changing condition on the robot, such as its position).
[Flow diagram: on power-up the motor controllers are initialised, then the programme waits for speed and distance data over RS 232. The direction of travel is determined and the motor controllers are activated, together with Module 2 when travelling forward. The programme then loops on the collision detection system (Module 2) and the dead reckoning system, stopping both motors when an object is detected or the required distance has been reached.]
Figure 7.2: Flow diagram of Module 1
7.2.1 Dead Reckoning sensors
Relative positioning techniques suitable for use on a mobile robot fall into two categories:
odometry and inertial navigation. Odometry uses encoders to measure wheel rotation and, if
present, the steering capability of the robot. The advantages of such a method are that it is
totally self-contained, and can always provide the vehicle with an estimate of its position
regardless of the state of external parameters. The disadvantage inherent in odometry is that it
incurs a position error which grows without bound. The error can be reduced by the regular
use of an external reference.
The inertial referencing method uses equipment such as gyroscopes and accelerometers,
integrating the data yielded to determine position. These systems are also self-contained.
Their major weakness is that they tend to suffer from what is known as ‘drift’: any constant
error, once integrated, grows rapidly, making these systems unsuitable for use over an
extended period of time. The equipment for inertial referencing is, in general, very expensive.
Although laser-gyros have reduced greatly in price, it would not be economical to spend so
much money on a secondary navigation system at this stage in the development of the robot.
The photo reflectors are Hamamatsu P5587s (Figure 7.3). Each carries a small IR LED and a
photodiode on its top face. The reflectors are active low: the output idles at 5V and switches
to 0V when light from the LED is reflected back onto the photodiode. The ICs incorporate a
Schmitt trigger, meaning that the output signal is always a clean logic ‘1’ or ‘0’.
Figure 7.3: Hamamatsu Photoreflector IC
This method of dead reckoning was particularly suitable for use on the developing robot: the
encoder discs could easily be redesigned with more or fewer segments to alter the resolution
of the data. Magnetic switches, functioning in a similar capacity, were also considered as an
option. These would have had the advantage of being insensitive to external interference such
as dirt. However, for the purposes of development, it was decided that a simpler system would
be more suitable.
The dead reckoning system utilised in this project used the photo detector to monitor the
change in tone of an optical disc. The disc was made from card with an even number of black
and white segments and was placed on the inside of the front left and right wheels. The photo
detector is mounted less than 0.5cm from the disc (mounting accuracy is not critical provided
it is within the manufacturer’s limits). A balance had to be found whereby the optical disc had
as many segments as possible, to provide the required level of accuracy, while the photo
detector could still comfortably differentiate between the segments (i.e. as the disc rotates, the
sensor must be able to detect the transition from a black to a white segment and vice versa).
The circuit produced a rectified logic signal (logic 1 = 5V or logic 0 = 0V) with no levels in
between. This was important when using a PIC to monitor the logic level of the circuit, since
the PIC has a threshold detector which it uses to determine the difference between logic 1 and
0: intermediate levels would cause uncertainty in the system.
When checking the status of the two dead reckoning systems, the PIC operates in a loop.
Originally, on each loop the PIC would count every occurrence of a positive logic signal.
However, due to the clock speed of the oscillator, the PIC would often detect the same signal
several times and count each occurrence, giving a false result. Similarly, looking for the rising
or falling edge of the signal caused problems: if the PIC was not at the correct location in the
programme when the edge occurred, it would not count the transition. To overcome this
problem, the programme instead looked for a change in the logic level. This relied upon the
clock speed of the PIC being far higher than the rate at which the logic state changes.
// Sample the current logic level of the left-hand sensor
nextstate_left = input(PIN_A0);
// Count a transition whenever the level differs from the stored state
if (nextstate_left != presentstate_left)
{
    i++;                                  // another segment edge detected
    presentstate_left = nextstate_left;   // remember the new state
}
Figure 7.4: Extract of code from Module 1
Figure 7.4 shows how this was achieved. The current logic state of the dead reckoning system
is stored as the ‘present state’. When the programme loops round and checks the status of the
system again, it compares the new state (‘next state’) against the previous one (‘present
state’). If there has been a change (i.e. ‘present state’ does not equal ‘next state’), the
programme increments a counter, indicating that the optical disc has moved.
The circuit diagram for the dead reckoning circuit is shown in Figure 7.5.
Figure 7.5: Dead Reckoning Circuit Diagram
The mounting and testing of the dead reckoning circuit is discussed in section 8.6.3.
7.2.2 Motor Controllers
The four chosen motors can draw up to 5A each depending upon the loading. This exceeds
the available output current of PIC. Additionally, to precisely control the motor’s speed it was
necessary to accurately vary the supply voltage of the motors. This could not easily be
achieved with standard circuitry, therefore two Devantech MD22 Dual Motor controllers were
chosen instead to control the four motors (two motors per controller).
Figure 7.6: MD22 Motor Controller
Figure 7.6 shows the chosen motor controller.
The advantages that the motor controllers offered the project were:
• A simple control interface via I2C
• A buffer between the delicate microcontroller circuitry and the high powered motors
• A custom solution designed specifically to operate the chosen motors accurately
• A constant voltage can be maintained, provided the battery voltage does not drop below the required level
A detailed description of the motor controllers and their operation can be found in section 8.5.
7.2.3 Design Decisions of Module 1
A common default baud setting for RS 232 is 9600 bits per second; this represents the data
transfer speed of the RS 232 link between the main PC and the PIC. In electronics, unless
there is a specific reason to slow down, it is usual to select the highest speed at which no
errors are detected. Through trial and error, it was found that 19200 bits per second was
optimal: although faster speeds exist, they proved unreliable and therefore inappropriate for
this application, as only limited error checking facilities exist to correct the errors.
The watchdog timer is used to cause a hardware reset in the PIC if the software appears to be
stuck. For the timer to work it must first be enabled and given a time parameter. The
watchdog timer can be set at increments between 18ms and 2304ms. Should this time
parameter be reached, a hardware reset follows. Unfortunately, when the PIC is waiting for
data via RS 232, it is often forced to wait longer than the maximum time of 2304ms. Causing
a hardware reset would not be appropriate at this time despite the PIC appearing to be
inactive. To ensure this did not happen the watchdog timer was disabled.
Module 1 receives four bytes of data from the main PC. These variables are stored as ASCII
characters. In order to interpret this data, the ASCII characters need to be converted into
integers. This was achieved using extra commands made available by additional
(non-standard) header libraries. Once the values were converted to a standard integer data type,
the motor controllers were able to interpret the motor speed settings. The additional header
libraries can be seen in the programming code.
Should the stereo vision system fail to locate and avoid an obstacle, the SRF08 range finder
would be utilised. Once an obstacle has been located, the robot ceases any further forward
movement and is commanded to turn. At this point Module 2 (which controls the range
finder) is deactivated to ensure that it does not re-detect the same obstacle whilst turning. To
determine when the range finder should be activated, the robot’s directional movements are
identified. Since the range finder is positioned only on the front of the robot, it is only
appropriate to activate it when moving forward. Forward movement is indicated by a motor
speed setting between 0 and 127, and reverse movement by a setting between 129 and 255
(128 is stop).
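As a minimal illustration of this encoding (our MATLAB sketch, not the Module 1 source):

% Interpreting a one-byte motor speed command -- illustrative sketch.
speed = 70;                % example command byte
if speed < 128
    direction = 'forward'; % the range finder is activated in this case only
elseif speed == 128
    direction = 'stop';
else                       % 129 to 255
    direction = 'reverse';
end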
As can be seen from Figure 7.2, once the robot has begun moving, the main programme of
Module 1 loops around checking the status of the dead reckoning sensors and the output of
Module 2.
With identical speed and distance settings, the left and right sides of the robot will not travel
at the same rate, so the robot will not necessarily travel in a straight line or stop precisely.
This is due to inaccuracies in the electronics (the distance between photo reflector and optical
disc) and the mechanical design (the position of the centre of mass). Additionally, if the left
and right motors do not stop at identical times, one side will drag the other. As a result, the
robot’s movement cannot be monitored accurately.
To tackle this problem, a calibration test was run. It was decided that the distance the wheels
travel should be kept identical on both sides (even during turning, where one side rotates
forward and the other in reverse), and that the speeds of the two sides should be adjusted
instead. This allowed the left and right sides to cover the correct distance over the same time.
7.3 Module 2
Module 2 is a ‘slave’ module to Module 1. It is solely responsible for operating the SRF08
ultrasonic range finder and returning a result to Module 1. Module 2 connects to the range
finder and back to Module 1 as shown in Figure 7.1.
Module 2 receives a logic signal from Module 1 (activation signal). Upon receipt of the
signal, Module 2 commands the range finder (via I2C) to begin ranging. Module 2 continues
to range find while the activation signal remains asserted. Once an object has been detected,
Module 2 sends a signal back to Module 1. Module 1 acknowledges this signal by dropping
the activation signal. This handshaking technique ensures both modules are in full
communication with each other and keeps the timing correct.
7.3.1 Design Decisions of Module 2
The I2C data link was used to communicate with the SRF08 range finder. This allowed the
PIC to instruct the range finder to begin ranging and to return a result in cm. By checking the
range finder’s first register (the software revision number) it was possible to determine
whether the range finder had finished receiving and storing all results, and therefore to start
the next ranging as soon as the distance registers had been read. This maximised the
efficiency of the range finder.
A number of problems were encountered whilst continually reading data from the range
finder. Typically the range finder would crash and fail to work again until the power had been
reset. Upon inspection, this could only happen if the range finder was not receiving the
correct commands. It was found that when reading data from a slave device (such as the
SRF08), acknowledgements must be sent to confirm receipt of the data; however, when
reading multiple registers consecutively, the last register accessed must not be acknowledged.
This instructs the device that no more registers will be accessed.
A regular pattern was established whereby false results were recorded whenever the light
sensor reading (automatically taken with every ranging) returned zero. Discarding these
readings formed the first level of intelligent data sorting.
When dealing with ultrasound devices, many factors can affect the results. The most
significant is the sensor either failing to detect an object that is there, or detecting an object
that is not.
To overcome this problem, a few intelligent checks are made on each range result. Firstly,
only results within a certain limit are accepted (currently the range is set to detect obstacles
between 5cm and 50cm). This accounts for the conical shape of the ultrasound wave hitting
the floor and returning a strong echo: by setting a maximum range, the robot will not detect
the floor as an obstacle. Secondly, to overcome spurious results, the range finder sends out
10 pulses; if 4 of the 10 yield a positive result, it is deemed that an obstacle exists. This ratio
was set by trial and experiment.
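The voting scheme can be summarised by the following sketch (our illustration of the logic; the real implementation runs in PIC-C on Module 2, and ranges is a hypothetical vector of the 10 readings in cm):

% Obstacle confirmation by voting -- illustrative sketch only.
valid    = ranges >= 5 & ranges <= 50;    % accept 5-50cm readings only
obstacle = sum(valid) >= 4;               % 4 of the 10 pulses must agree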
When an obstacle is deemed to exist, Module 2 alerts Module 1 by raising a logic signal to
high. To ensure that Module 1 has actually received this signal, Module 1 acknowledges the
signal to Module 2.
7.3.2 Testing of the range finder
A box of 20cm3 was placed on the floor in front of the range finder which was placed 10cm
above the floor. (This scenario represented the true application of the sensor). The box was
initially placed 4 cm from the sensor and moved away at 2cm increments.
At each distance, nine successive range measurements were taken as quickly as possible. The
results were averaged and plotted in Figure 7.7.
The first line (actual distance) indicates the actual distance measured using a tape measure.
The second line (average distance) shows the distance as determined by the range finder.
[Graph: actual distance and average measured distance (cm, 0–160) against measurement point number (1–57).]
Figure 7.7: Range Results
As can be seen, the range finder is very accurate up to 0.7m, beyond which the range results
become less accurate. These results indicate that it is possible to detect an object’s presence
and its distance accurately up to 0.7m.
Although the ultrasound device is a range finder, it is not used to determine the distance of an
object, only that an obstacle exists in the robot’s path. Any obstacle that moves into the path
will be detected at the furthest point possible from the robot, approximately at the upper limit
set for the range finder by Module 2 (not accounting for the reaction time of the motor
movement).
7.4 Low Level System Testing
This section looks at the timing issues related to the low level systems of the robot. The
calibration of the robot’s movements is covered later in the report.
7.4.1 Testing Dead Reckoning Sensors
The dead reckoning system was tested using the University of Michigan Benchmark System
(UMBmark). When testing a dead reckoning system, two sorts of errors must be taken into
account: systematic and non-systematic errors.
Systematic errors are caused by:
• Unequal wheel diameters
• The average of both wheel diameters differing from the nominal diameter
• Misalignment of wheels
• Uncertainty about the effective wheelbase (due to non-point wheel contact with the floor)
• Limited encoder resolution
• Limited encoder sampling rate
Non-systematic errors are caused by:
• Travel over uneven floors
• Travel over unexpected objects on the floor
• Wheel-slippage due to:
  o slippery floors
  o over-acceleration
  o fast turning (skidding)
  o external forces (interaction with external bodies)
  o internal forces (e.g. castor wheels)
  o non-point wheel contact with the floor
Although the robot may encounter non-systematic errors upon occasion, we must consider the
effects of systematic errors very seriously, as they would have a continual effect on
performance. Borenstein and Feng and others identify the two main systematic errors as
unequal wheel diameters and uncertainty about the wheelbase.
The UMBmark method involves directing the robot to traverse a square path. The robot starts
at an absolute position (x0, y0, θ0). The robot need not be programmed to return to its start
position precisely, but allowing for this adds a level of unnecessary mathematical
complication for the purposes of this experiment; it was therefore expected that the robot
would return precisely to its starting point.
Once the robot has navigated all four legs of the square, it should have returned
approximately to its start position. At this point the robot has completed the “unidirectional
square path”. The ‘positional error’ can now be found. This error is the difference between the
expected position of the robot, according to odometry data, and the actual position of the
robot.
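Expressed as a single magnitude (our notation, for clarity, not necessarily that used by the UMBmark authors): ε = √((x_odo − x_actual)² + (y_odo − y_actual)²), where the subscripts denote the odometry estimate and the measured position respectively.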
In itself, this positional error could be used to identify a single systematic error. However,
were two errors to occur simultaneously, they could, over the four legs, cancel one another
out. The square path is therefore traversed in the opposite direction, such that errors cancelling
in one orientation tend to demonstrate a cumulative effect when the robot travels in the
opposite direction.
The test procedure was conducted on a level, carpeted surface to minimise the effect of
non-systematic errors. A number of preliminary runs were conducted to ensure that the
experiment would run smoothly; the robot gave a consistent performance each time.
Measurements were taken from a fixed point of reference, as specified by the UMBmark
method, to the nearest half-centimetre. The only significant modification to the standard
procedure was that the test was conducted around a 2.08m × 2.08m square path. This was
because, being on a trailing lead, the robot’s range was somewhat limited. Also, due to the
encoder resolution of the robot, it was easier to instruct the robot to travel 2.08m than 2.00m.
The robot had 14-segment encoder discs fitted on its wheels at the time of testing. It was
instructed to travel 56 segments, or 2.08 metres (56 segments is four full wheel rotations at
52cm each, giving 4 × 52cm = 2.08m). The motor speed used on both motors was 72. To
turn, the motor speeds were set to 0 and 255 (full forward and full reverse) and the robot was
instructed to move 18 segments on each wheel.
7.4.2 Dead Reckoning Results
The results of the test are shown graphically below. Figure 7.9 and Figure 7.10 show the
measurements taken at the beginning and end of each of the four legs of the journey; the
supposed path between them has been drawn in for clarity. Positions were recorded before and
after each turn because, in making a 90° turn, the robot shifts position. The recorded average
change in position for a right turn is shown below. The position change for a left turn was
very similar, but the robot moved right instead of left.
Figure 7.8: Turn Position Change
Figure 7.9: Clockwise Path of Robot
Figure 7.10: Anticlockwise Path of Robot
From these two figures we can see that the robot performs fairly well, even though it shifts
position considerably when it turns. The raw path is not very useful for diagnosing other dead
reckoning faults, however, so the turn shifts were removed from the results. The resulting
paths are shown below in Figure 7.11 and Figure 7.12.
Figure 7.11: Modified Clockwise Path of Robot
Figure 7.12: Modified Anticlockwise Path of Robot
7.4.3 Summary of dead reckoning results
It can be seen that, apart from the third leg of the clockwise path, the robot’s dead reckoning
system performs very well. The most likely explanation for the anomalous result is that the
robot turned more than 90° at the beginning of this leg due to a non-systematic error, most
probably wheel-slippage caused by fast turning.
Faults due to systematic errors did not appear to affect the performance of the robot in this
test. It had been expected that, since the robot has air-filled rubber tyres, a slightly curved path
might be exhibited due to uneven loading causing unequal effective wheel diameters. Since
these effects were not present, it can be concluded that the loading distribution on the robot is
adequate and does not detrimentally affect its performance.
It would have been desirable to be able to repeat the UMBmark procedure to obtain multiple
results. Unfortunately, due to time constraints a repeat was not possible. The results of this
test show the dead-reckoning system performs very well, but further testing would be required
to confirm this.
7.4.4 Reaction of Module 2 to an object
An object (A4 in size) was placed 20cm in front of the robot before any movement began.
This test was designed to measure how long it took Module 2 to find the object and indicate
its existence to Module 1.
Figure 7.13: Reaction time to an object by Module 2
Figure 7.13 shows two wave forms. Waveform 2 (bottom) is the signal from Module 1
commanding Module 2 to activate. While this signal is high, Module 2 is operating and
looking for obstacles. Waveform 1 (top) is the response signal from Module 2 to Module 1,
indicating that an obstacle has been found. This signal remains high until waveform 2 goes
low.
The reaction time for detecting an obstacle was 342ms. This test was slightly unrealistic, as
the robot would never begin moving forwards towards an obstacle already in such close
proximity. However, it does show how quickly Module 2 can react.
The experiment was repeated with an object placed at 5cm. The reaction time dropped to
276ms. This shows that the system works better at short distances; due to the conical
spreading of the ultrasound pulse, the sensor becomes less accurate as an obstacle moves
further away, not least because the sound pulse takes longer to make the round trip.
7.4.5 Reaction of Module 1 to an object
This test shows how long it takes Module 1 to respond to the signal from Module 2 indicating
an obstacle exists. It is paramount to minimise this time, as any delay will allow the robot to
continue moving towards the obstacle.
Figure 7.14: Reaction time to an obstacle by Module 1
Figure 7.14 is a zoomed in view of Figure 7.13. Waveform 2 (bottom) is the signal from
Module 1 commanding Module 2 to activate. The section in view in the figure shows Module
1 acknowledging (by going low) the signal from Module 2. Waveform 1 (top) is the response
signal from Module 2 to Module 1, indicating that an obstacle has been found. The section in
view in the figure shows how long Module 2 has to wait before Module 1 acknowledges that
an obstacle exists.
The reaction time was 1ms. It was decided that this time could not be minimised any further
as it was primarily limited by the clock speed of the PIC and was therefore accepted.
7.4.6 Complete response reaction time
This experiment looks at the time it takes between the robot recognising an obstacle exists in
its path, and commanding the motors to stop movement.
The oscilloscope measured from the time the motors began moving to the moment they were
commanded to halt. The experiment was set up with an obstacle placed 20cm in front of the
robot.
The reaction time was 413ms. This shows that it takes almost half a second to react to an
obstacle, which is significant when considering how far the robot could potentially move in
that time. From this result, it was decided to run the motors at approximately half speed when
driving forwards or backwards, increasing the probability of the robot detecting an obstacle
and halting before a collision.
7.4.7 Reaction to Dead Reckoning System
Having analysed the collision detection system’s performance, it is essential to look at the
dead reckoning system. On the whole, the robot will be more reliant upon this system than
any other, as the dead reckoning system controls the robots general movement.
Figure 7.15 shows two waveforms. Waveform 1 (top) shows two spikes to ground. Each spike
is in fact a command to move transmitted through I2C. Without showing a zoomed in view of
each spike it is not clear exactly what information is transmitted but this information is not
relevant at this point. Waveform 2 (bottom) shows the rectified signal from the dead
reckoning system.
Figure 7.15: Motors response time after reaching required distance
The first point to note is that the mark-to-space ratio is fairly even, showing that the speed is
constant throughout. The second is that the plot demonstrates the consistency of the dead
reckoning signal. Unfortunately, the dead reckoning system does not take into account wheel
slip.
The reaction time was measured at 200µs. This figure represents how quickly the motors react
to Module 1 receiving the last signal from the dead reckoning system.
7.4.8 Conclusion of low level system testing
As a unit, Module 1 and Module 2 operate successfully together. They successfully control
the robot’s movement (both in commanding the motors and in monitoring the movements
made) and can avoid colliding with most obstacles.
The dead reckoning system is not entirely reliable: there have been several occasions when
the robot failed to travel the required distance. This is particularly noticeable when trying to
rotate the robot through a specific angle. The sensor is not positioned at the widest point of a
segment on the optical disc. At slow speeds this does not matter, as the sensor will
comfortably detect every edge; at high speeds, however, the sensor may occasionally fail to
detect a segment, which accounts for the occasional error in the distance travelled. To combat
this problem, the sensor should be moved towards the rim of the optical disc, where each
segment presents its largest surface area.
Timing issues between MATLAB and Module 1 were resolved using delays. The PIC
operates on a 4MHz clock, whereas MATLAB runs its code on the main computer’s 3GHz
processor; delays helped to synchronise the two systems.
Module 1 does not return any data to the main PC. This decision was taken to reduce the
complexity of the whole system. In hindsight, data should have been returned, both to allow
the main PC to recognise that all commands had been understood by Module 1 and to allow
the results of the subsequent actions to be reported. This would allow the main PC to perform
a basic level of error correction. Additionally, by knowing how the robot actually moved, it
would be possible to plot accurately where the robot is in relation to a reference point.
8 The Chassis
8.1 Chassis Requirements
Having specified the robot’s functionality, the requirements for the chassis were identified.
The first requirement was that the robot be able to run off-road on most types of terrain, such
as low-cut grass, footpaths and carpet, with the capability to drive over most feasible
obstacles such as small sticks and stones. To accomplish this, a powerful four-wheel drive
design was chosen, allowing the robot to keep moving even if a wheel lost traction with the
ground. Caterpillar tracks would also have met this requirement, but could have been more
expensive and problematic if a track broke or became detached from the wheel, as only a
basic caterpillar system would have been affordable and would not have been of as high a
standard. A further problem with a caterpillar system is that tracks are designed to run on
loose soil and so would slip when running on carpet, making it difficult to keep track of the
robot’s location.
The robot’s second requirement was to hold all the sensors and equipment needed to complete
its assignment. This meant that the robot’s mounting base had to be wide enough that, ideally,
equipment did not have to be stacked, keeping the centre of gravity as low as possible to
minimise the likelihood of the robot toppling over.
As the group’s budget was limited to £700, with an additional £5,000 research budget, the
final criterion for the chassis was low cost, without compromising on the above requirements.
8.2 The Chosen Chassis
The chassis chosen was the ‘All Terrain Robot Kit’ from the company ‘Super Droid Robots’.
Buying the robot as a kit kept the cost low and allowed the design to be customized to the
project’s needs, permitting the requirements to be met.
The base is an aluminium plate, giving high strength, light weight and resistance to corrosion,
and its large size of 350mm × 460mm made it well suited to mounting all the equipment. The
base was laser cut with pre-drilled holes to allow the other components to be mounted, and
also gives the option of a second layer stacked above the first so that additional components
can be carried.
The motors chosen were 24VDC 252rpm geared motors, selected for being powerful enough
to move the robot once all the equipment is mounted. The motors are examined in further
detail later.
Two motor controllers were purchased to allow the motors to be controlled in the required
way of having one controller in charge of the left side and the other controlling the right. This
method would allow the robot to rotate on the spot through 360 degrees to allow increased
manoeuvrability in a constricted space. The motor controller will also be discussed in greater
detail in a later section.
The wheels (Figure 8.1) are 172mm in diameter and are linked to each motor by a
custom-made shaft, bored with an 8mm threaded hole in the end to receive a locking bolt for the
wheel hub. Each shaft comes with a locking collar to fix it to the motor, a large hex nut that
locks into the wheel hub, and a bearing that fits into the motor housing. The shaft is made of
aluminium to be lightweight, but has a steel insert so that the threads will not strip. Because
the wheels are quite large in diameter, the robot runs faster but could have less torque at the
wheel; given the choice of motors, however, this should not be a problem. Their size also
allows the robot to climb over most small obstacles, so these will not prevent the robot from
performing its task.
To calculate the speed of the robot, the formula below was used:
V = RPM × D × π / 60000 (metres per second)

Equation 8.1

Where:
V = velocity of the robot (m/s)
RPM = output speed (rpm)
D = wheel diameter in millimetres
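As an illustrative check (our arithmetic): at the rated speed of 252rpm with the 172mm wheels, V = 252 × 172 × π / 60000 ≈ 2.27 m/s.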
Figure 8.1: Wheel and Shaft Set up
Batteries were chosen to power the robot. For the motors, two 12V 4Ah rechargeable
lead-acid batteries were selected for their low cost and small size; each battery powers two motors
and is mounted on the side of the chassis that it powers. A third battery powers the computer
and all the sensor equipment: a 12V 7Ah battery was chosen to provide the power needed
while remaining small, as space and carrying weight are limited.
Overall, the chassis chosen and the components being used to power the robot are very
suitable for the intended application. The chassis design and size have allowed the robot to
have a low centre of gravity to reduce the chance of the robot toppling over when going over
rough terrain. The chosen motors allow the robot to have a large amount of power giving the
robot the ability to climb most obstacles and inclines.
8.3 The Motors
Figure 8.2: 24VDC 252rpm Motor
As mentioned above, the motors chosen were four 24VDC motors. The motors (Figure 8.2) are geared to allow the robot to produce more torque, with steel gears to prevent the gears stripping or the windings burning out. The motors use a 1:24 gear reduction, producing a stall torque of approximately 75kgf-cm and a rated torque of 10kgf-cm. Table 8.1 shows the characteristics of the motors:
Specification            Units     Magnitude
Gear Reduction Ratio     –         1:24
Rated Torque             kgf-cm    10
Rated Speed              rpm       252
Rated Current            mA        <2300
No Load Speed            rpm       290
No Load Current          mA        <650

Table 8.1: Motor Specification
From Table 8.1 it can be seen that the motors are powerful and will be able to handle the weight of the robot, along with any future additions to the chassis. The high torque also means the robot will be able to travel outdoors and climb inclines with ease.
[Graph: motor speed (rpm, 0-350) plotted against torque (kgf-cm, 0-80) for the 1:24 geared motor]
Figure 8.3: Geared Motor Performance
The above graph shows the characteristics of the 1:24 geared motor. It can be seen that at the required speed of 167rpm the motor delivers a torque of approximately 32kgf-cm.
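For reference, 1kgf-cm is approximately 0.098Nm, so this operating point of 32kgf-cm corresponds to roughly 3.1Nm of torque.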
8.3.1 Motor Housing
Figure 8.4: Motor Housing
The motor housing (Figure 8.4) holds the motor in place and fixes to the chassis plate. The chosen housing is made of laser-cut aluminium, welded together to produce a sturdy, high-quality structure. This strength is needed because of the motor's high torque: the housing must resist the motor's reaction torque, which in extreme cases could shear through the joints and destroy the housing. The housing has pre-mounted screws that match up with the chassis base for ease of construction. It also comes with a support bearing, which fits into a pre-cut hole in the housing and provides extra support for the shaft and wheel set-up. This allows the motor bearing to share the load, giving better performance, longer life and increased weight-handling capacity.
8.3.2 Motor Noise
A motor wiring kit was purchased for each motor to provide protection and to help suppress some of the electrical noise emitted by the motors. The kit includes capacitors, shielded wire, ferrite rings and heat shrink, all of which are used to connect the motors to the motor speed controllers. The electrical noise has to be kept to a minimum because of the significant effect it has on microcontrollers and other electronic components, where it appears as unwanted currents or voltages. In general, noise sets the fundamental limit on the range over which a signal can be transmitted and received with integrity.
The wiring method was provided by the ‘Super Droid Robot’ website and is in Appendix B.
8.3.3 Motor Calibration
The motors are driven with a speed input where 0 = full forward, 128 = stop, and 255 = full reverse (see Section 8.5.1, Programming the Motor Controllers). A speed input of 70 was chosen to move the Minibot in the forward direction, as it produces a relatively low and steady speed. Because the motors do not run at equal speeds for the same input, they had to be calibrated. When the robot was moving on the ground it was visually noticeable that the right-side motors were running slower than the left, causing the robot to veer off to the right. A trial-and-error process was used to produce a visually accurate straight heading, from which speed inputs of 70 for the left-side motors and 74 for the right were obtained.
Although the robot appeared to head in a straight line, the accuracy of the Minibot’s terrain
mapping ability depends on the accuracy of its movement, so precise measurements of the
wheel movements were taken to produce a final calibration.
The distance travelled by each side was measured and used to calibrate the motors' speeds. The accuracy of the photo sensors was also tested by varying the number of segments on the segmented disc, initially four segments and then eight. Keeping the right side at a steady input while varying the left input allowed the speeds to be matched: the speed input for the right-side motors was kept at 74 and the left-side input was varied between 70, 72 and 74.
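This selection can also be expressed numerically. The following MATLAB fragment is an illustrative sketch only (not code used on the robot); it takes the approximate mean left- and right-side distances from Table 8.2 for each candidate speed pair and picks the pair with the smallest mismatch:

leftInputs = [70 72 74];          % candidate left-side speed inputs (right fixed at 74)
leftDist   = [52.7 52.2 53.0];    % mean left-side distance (cm), from Table 8.2
rightDist  = [52.2 52.0 51.8];    % mean right-side distance (cm), from Table 8.2
mismatch   = abs(leftDist - rightDist);
[m, best]  = min(mismatch);       % smallest left/right difference
fprintf('Best left-side input: %d (mismatch %.2f cm)\n', leftInputs(best), m);

This selects a left-side input of 72, matching the chosen 72:74 set-up.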
Table 8.2 shows the distances measured when the photo sensor disc was equipped with four segments. The tests show that the 72:74 motor set-up produced the most accurate and consistent distances, allowing the Minibot to travel in a straight line.
Distance Input                    4                              8
Speed Input             70,74   72,74   74,74        70,74   72,74   74,74
                            Distance (cm)                Distance (cm)
Left Side               52.5    52      53            106     104     105
Right Side              52      52      52            103.5   104     104
Left Side               52.5    52.5    53            105.5   104     105
Right Side              52      52      51.5          103.5   104     104
Left Side               53      52      53            106     104     105.5
Right Side              52.5    52      52            103.5   104     104

Table 8.2: Motor Calibration; Photo Sensor Disc - 4 Segments
The photo sensor uses four colour changes on the inside of the wheel: two quarters white and two black. The colour changes picked up by the photo sensor control the distance travelled. The accuracy of the photo sensor was confirmed by test, as shown in Table 8.2: using a distance input of four, corresponding to one wheel rotation with the four-segment disc, the actual distance travelled was measured to be 52cm. From theory, the circumference of the wheels is:
X = πd = π × 172mm ≈ 540mm = 54cm

Equation 8.2

Allowing for the compression of the wheel under the weight of the Minibot, the measured distance of 52cm per wheel rotation is quite accurate.
Distance Input     2       4       8       16      24      32
                               Distance (cm)
Left Side          13      26.5    51      103.5   155.5   207.5
Right Side         13.5    26      51.5    103.5   154.5   208
Left Side          13      26.5    52.5    103.5   156     205.5
Right Side         12.5    26.5    52      103.5   156     206.5
Left Side          13.5    25.5    52      103.5   156     208
Right Side         13.5    26      52      103.5   156     208.5

Table 8.3: Motor Calibration; 72:74 set-up, Four Segments
Table 8.3 shows the distance travelled by the Minibot over a range of distance inputs. It can be seen that the Minibot's distance repeatability is very good when travelling ½, 1, 2 or 3 wheel revolutions. When travelling ¼ or 4 wheel revolutions the repeatability is not as good, but this will not be a problem, as the Minibot's room-mapping technique does not cover such short or long distances in a single movement.
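The repeatability can also be quantified. The short MATLAB fragment below (illustrative only) computes the spread of the three left-side runs in Table 8.3 at each distance input:

% Left-side distances (cm) from Table 8.3; columns are distance
% inputs 2, 4, 8, 16, 24 and 32, rows are the three test runs
left = [13   26.5 51   103.5 155.5 207.5;
        13   26.5 52.5 103.5 156   205.5;
        13.5 25.5 52   103.5 156   208];
spread = max(left) - min(left);   % range at each distance input (cm)
disp(spread)                      % prints 0.5  1.0  1.5  0  0.5  2.5

The spreads are small relative to the distances travelled, supporting the repeatability claim above.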
Distance Input     1       2       4       8       12      16
                               Distance (cm)
Left Side          13.5    26.5    52      104     156     210.5
Right Side         13.5    26.5    52      104     156     208.5
Left Side          12      26.5    52.5    104     156     205.5
Right Side         12.5    26.5    52      104     156     203.5
Left Side          13.5    26.5    52      104     156     208
Right Side         13      26.5    52      104     156     208.5

Table 8.4: Motor Calibration; 72:74 set-up, Eight Segments
Table 8.4 shows the results when the photo sensor disc was fitted with eight segments. The eight-segment disc gives higher repeatability across the different distances than the four-segment disc. The accuracy therefore increases with the number of segments, which is why a disc with a larger number of segments, such as 14, is used.
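This trend follows from the odometry resolution: each sensor count corresponds to one segment, i.e. a distance of (wheel circumference)/(number of segments). With the measured circumference of 52cm, a four-segment disc resolves 13cm per count, an eight-segment disc 6.5cm, and a 14-segment disc approximately 3.7cm.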
8.3.4 Motor Loading Calculations
To estimate how long the motor batteries will last, the current consumption of the motors needs to be calculated. The calculation below covers the worst-case scenario:
Known Variables:
• Wheel Diameter – 172mm
• Weight of Minibot – 18kg
• Top Speed – 1ms⁻¹
• Distance to reach top speed – 190mm
Assuming linear acceleration, the following equations of motion become applicable:
s = ut + ½at²

Equation 8.3

and

v = u + at

Equation 8.4

Since the initial speed u is zero, Equation 8.3 and Equation 8.4 become

a = 2s / t²

Equation 8.5

and

t = v / a

Equation 8.6
Substituting Equation 8.6 into Equation 8.5 gives

a = 2s / (v/a)²

Equation 8.7

which rearranges to

a = v² / 2s

Equation 8.8
Substituting in the variables gives

a = 1² / (2 × 190 × 10⁻³) = 2.63ms⁻²

Equation 8.9
Using the calculated acceleration allows the force required to propel the robot to be calculated:

F = ma = 18kg × 2.63ms⁻² = 47.34N

Equation 8.10

The torque can now be calculated:

T = F × D/2 = 47.34 × (172 × 10⁻³)/2 = 4.07Nm

Equation 8.11
Calculating for each motor gives

T/motor = T/4 = 4.07/4 = 1.0175Nm

Equation 8.12

for each of the four motors.
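For comparison, 1.0175Nm is approximately 10.4kgf-cm, which is close to the motors' rated torque of 10kgf-cm (Table 8.1): under this worst-case acceleration each motor is therefore working at around its rated load.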
To calculate the power, the rpm has to be known:
rpm = (speed / circumference) × 60 = (1ms⁻¹ / 0.52m) × 60 = 115.4

Equation 8.13
Using the rpm to calculate the power gives

P = (T/motor) × ω = 1.0175 × (115.4 × 2π / 60) = 12.3 watts

Equation 8.14
So the current used by each motor is

I = P / V = 12.3 / 12 = 1.025 Amps

Equation 8.15
so the two motors on each side of the robot will draw 2.05 Amps.
As each battery provides 4Ah, each battery will last approximately two hours at this load. However, as the motors will not be running at full speed but at a lower speed, the batteries are estimated to last between 4 and 6 hours.
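The complete chain of Equations 8.8 to 8.15 can be reproduced with a short MATLAB script; the fragment below is a direct transcription of the calculation above (not code from the robot):

% Known variables (Section 8.3.4)
d     = 0.172;   % wheel diameter (m)
circ  = 0.52;    % measured wheel circumference (m), Section 8.3.3
m     = 18;      % weight of Minibot (kg)
v     = 1;       % top speed (m/s)
s     = 0.190;   % distance to reach top speed (m)
Vbatt = 12;      % motor battery voltage (V)

a      = v^2 / (2*s);             % Equation 8.8:  2.63 m/s^2
F      = m * a;                   % Equation 8.10: 47.34 N
T      = F * d/2;                 % Equation 8.11: 4.07 Nm total
Tmotor = T / 4;                   % Equation 8.12: 1.0175 Nm per motor
rpm    = v / circ * 60;           % Equation 8.13: 115.4 rpm
P      = Tmotor * rpm * 2*pi/60;  % Equation 8.14: 12.3 W per motor
I      = P / Vbatt;               % Equation 8.15: 1.025 A per motor
hours  = 4 / (2*I);               % one 4Ah battery feeding two motors
fprintf('Full-speed battery life: about %.1f hours\n', hours);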
8.4 Battery Selection
8.4.1 Motor Batteries
Two small batteries were used instead of one large battery in order to allow the weight of the
batteries to be evenly spread across the chassis. Using the two batteries meant that one battery
could power the left motors and the other could power the right motors. This allowed the
batteries to be mounted on the very edges of the chassis, ensuring an even weight distribution.
As the motors were 24VDC with a rated current of 2300mA, and each battery had to power two motors, two 12V 4Ah rechargeable lead-acid batteries were chosen. These were suitable due to their relatively cheap price of £21.99 each. 12V was chosen instead of 24V because very few 24V batteries with the required current rating were available within the project's budget. As the motors would not be running at full speed, a slightly lower battery capacity of 4Ah could be used instead of the next size up of 7Ah.
8.4.2 Computer Battery
The computer requires a range of voltages. The micro power supply purchased to provide them runs from a 12V battery. The power consumption of the robot computer was estimated at a little over 100W when running processor-intensive applications such as MATLAB. Using the formula:

P = IV

Equation 8.16

we were able to estimate that the computer would draw around 8A. For a useful length of operation, a battery of at least 4Ah would therefore be required. A 7Ah battery was purchased, as this gives approximately 50 minutes of operation between charges.
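Explicitly, I = P/V ≈ 100W / 12V ≈ 8.3A, so a 7Ah battery gives roughly 7Ah / 8.3A ≈ 0.84 hours, or about 50 minutes, between charges.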
8.5 The Motor Controllers
The motor controllers chosen to run the motors are two Devantech MD22 Dual Motor Controllers (Figure 8.5). The MD22 is very robust and is designed to supply medium power to two independent motors. In this project two controllers are used, one controlling the left-side motors and the other the right-hand side, thus allowing the robot to perform 360-degree turns (Figure 8.6).
Figure 8.5: MD22 Motor Controller
Figure 8.6: Motor Set-up
The MD22 controller is designed to handle a 5A current capacity for 5V – 50V motors, which is suitable for the 24VDC motors being used on the robot's chassis. The controller can be operated in a variety of modes:
• I2C bus, with switch-selectable addresses and 4 modes of operation
• 2 independent 0v-2.5v-5v analogue inputs: 0v full reverse, 2.5v stop, 5v full forward
• 0v-2.5v-5v analogue input for speed, with the other input for steering
• Radio Control
For the robot, the first method, the I2C bus, is used, allowing the computer to control the movement of the robot without human intervention.
8.5.1 Programming the Motor Controllers
To use the motor controllers in the way required for the project, the mode switches have to be set in a particular combination to select the desired mode of operation. These four switches are read only once, when the module is powered up, and cannot be changed in mid-operation.
From the table below it can be seen that the I2C modes use the 0xB- addresses, where bit 0 is the read/write bit, so 0xB0 and 0xB1 are the write and read addresses, respectively, of the same module. As mentioned earlier, the mode selected for control of the motors is one of the I2C bus modes, as this gives the greatest accuracy in the movement and handling of the motors, allowing precise regulation of speed and turning.
Mode                           Switch 1   Switch 2   Switch 3   Switch 4
I2C Bus – address 0xB0         On         On         On         On
I2C Bus – address 0xB2         Off        On         On         On
I2C Bus – address 0xB4         On         Off        On         On
I2C Bus – address 0xB6         Off        Off        On         On
I2C Bus – address 0xB8         On         On         Off        On
I2C Bus – address 0xBA         Off        On         Off        On
I2C Bus – address 0xBC         On         Off        Off        On
I2C Bus – address 0xBE         Off        Off        Off        On
0v-2.5v-5v Analogue            On         On         On         Off
0v-2.5v-5v Analogue + Turn     Off        On         On         Off
RC Servo                       On         Off        On         Off
RC Servo + Turn                Off        Off        On         Off

Table 8.5: Motor Controller Mode of Operation
To operate the motor controller, a start bit is first sent to initialise the line, followed by the module address (with bit 0 selecting write or read), then the register number, the data byte and a stop bit. The sequence is then repeated with the register number for the second motor. An example is shown in the following pseudo code (Figure 8.7):
Send start bit    // initialises the line
Send 0xB0         // module address, 0xB0 being the write address
Send 0x00         // register 0: mode of operation
Send 0x00         // data: select mode 0 (independent motor speeds)
Send stop bit     // ends the transfer

Send start bit    // reinitialises the line
Send 0xB0         // write address again
Send 0x01         // register 1: speed of the first (left) motor
Send 0xFF         // data: sets speed to 255
Send stop bit

Send start bit
Send 0xB0
Send 0x02         // register 2: speed of the second (right) motor
Send 0xFF         // data: sets speed to 255
Send stop bit
Figure 8.7: Pseudo Code for control of MD22
This pseudo code shows how the motor controllers regulate the speed of the motors with ease and precision: varying the value written to the speed registers allows the robot to be controlled accurately.
The MD22 motor controller has a variety of registers (Table 8.6):
Register Address   Name                Read/Write   Description
0                  Mode                R/W          Mode of operation
1                  Speed1              R/W          Left motor speed (mode 0,1) or speed (mode 2,3)
2                  Speed2/Turn         R/W          Right motor speed (mode 0,1) or turn (mode 2,3)
3                  Acceleration        R/W          Acceleration for I2C (mode 0,1)
4                  Unused              Read Only    Read as zero
5                  Unused              Read Only    Read as zero
6                  Unused              Read Only    Read as zero
7                  Software Revision   Read Only    Software revision number

Table 8.6: Motor Register Addresses
For this project only registers 1 and 2 are used, with the speed registers holding literal speeds in the range 0 = full forward, 128 = stop and 255 = full reverse. This has allowed the robot to be controlled with the high accuracy required for terrain mapping, and is the reason the MD22 controllers were chosen.
Writing to the Speed1 register sets the speed of one of the motors: the larger the number written to this register, the more power is applied to the motor. The Speed2 register operates in the same way as Speed1 but controls the second motor. This is what allows the motors to be calibrated: because the motors do not run at exactly the same speed for a given voltage, the difference in motor speeds would otherwise affect the movement of the robot.
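As an illustration of how the higher-level software might drive this interface, the MATLAB fragment below uses the serial port to send a calibrated speed pair to the low-level control hardware, which relays it to the MD22 over I2C. The single-byte-per-motor packet format shown here is hypothetical; the actual RS-232 protocol is defined by the controller firmware (see Section 3.1):

% Open the RS-232 link to the low-level control hardware
s = serial('COM1', 'BaudRate', 9600);   % port and baud rate are assumptions
fopen(s);

% Calibrated speed inputs (0 = full forward, 128 = stop, 255 = full reverse)
leftSpeed  = 72;
rightSpeed = 74;

% Hypothetical two-byte packet: left speed then right speed
fwrite(s, [leftSpeed rightSpeed], 'uint8');

fclose(s);
delete(s);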
8.6 Mounting Components
8.6.1 Chassis Layout
The chassis has been given two layers. The first layer is the plate supplied as part of the 'Super Droid Robot' chassis; the second is a bought piece of 3mm-thick aluminium sheet, cut to the same size as the chassis plate and mounted above the first layer on 8mm-diameter steel threaded rods. The threaded rods allow the chassis layers to be removed with ease and the height to be adjusted, giving the robot room for future additions.
Figure 8.8: Chassis Layout
The chassis (Figure 8.8) was also given a third layer to act as a roof, protecting the components mounted on the second layer in case the robot accidentally drives into a wall and flips over, which could damage vital electrical parts. The roof was made from the same 3mm aluminium sheet as the second layer and also acts as a dust cover.
The components mounted on the first layer are the batteries (two for the motors and one for the motherboard and associated components), the two motor controllers, the SRF08 range finder, the dead reckoning system, the software maintenance display, and the motor and dead reckoning microcontroller (see Appendix D for the layout of layer 1).
The components mounted on the second layer are the hard drive, the motherboard, the two stereovision web cameras and the SRF08 microcontroller (see Appendix E for the layout of layer 2).
The layout of the components was chosen to allow easy access to those that often need attention, such as the batteries for charging or the power switches used to turn on the motherboard. If a component does need removing from one of the layers, in the case of failure, the layers above it must first be removed, as the bolts holding the components down are not accessible from the sides of the chassis. This, however, is relatively quick because of the way each layer is mounted on the threaded rods: only the nuts have to be undone for a layer to be lifted off to allow access underneath. The layout of the components has also allowed the centre of gravity to be kept as low and as central as possible.
8.6.2 Mounting the Cameras
Figure 8.9: Camera Mounts
The requirements for the camera mounts were to allow fine adjustment of the pitch and yaw of the cameras, to give the best results from the stereovision, and to prevent movement of the cameras. Figure 8.9 shows how the stereovision web cameras are mounted.
The mount consists of two metal plates manufactured from 3mm aluminium sheet, giving a strong and sturdy structure. The plates are mounted on four 5mm-diameter threaded rods of 100mm length; this allows the top plate to be raised and lowered, giving access to the cameras and holding them in position. The cameras are mounted in their original rubber holders, which allow them to move in the pitch and yaw directions. As the mounting system allows access to the cameras, their angle can be adjusted to the desired direction. To protect the cameras from the pressure exerted by the mounting system, rubber grommets were used to protect and secure the cameras in the desired position. The grommets were needed because the web cameras have buttons on top which, when pressed, take pictures. These buttons could not have any pressure exerted on them, as the computer would then be unable to capture new pictures for terrain mapping. Each button fits between the sides of its grommet with room to move, even when the cameras are pivoted back and forth to change their angle.
As the cameras' lines of sight need to be kept parallel to produce the most accurate stereovision map, a simple method was designed to achieve this. A metal rod is placed through two pre-made holes in the sides of the camera casings (see Figure 8.9). This rod stops the cameras from rotating in their mounts in the yaw direction, allowing the stereovision map to be created as accurately as possible.
Figure 8.10: Dimensions of Camera Plates
To give the camera mounts further adjustment in pitch and height, the threaded rods connect to the first layer plate to secure their location but have extra length so that the camera mount can be raised above the first layer plate. Each rod can be raised and lowered easily and accurately, changing the height of the bottom plate and hence the pitch angle.
To allow testing of the cameras, the separation distance needed to be varied, so a second, smaller mount was constructed. The new mount allows the cameras to be fixed at the minimum separation, with the cameras actually touching. See Figure 8.11 for the dimensions of the smaller mount.
Figure 8.11: Dimensions of Smaller Camera Mount
8.6.3 Mounting the Dead Reckoning System
Figure 8.12: Dead Reckoning Mounting System (labelled: dead reckoning mount and photo sensor)
The requirement for the dead reckoning mounting system was to allow small adjustments to the distance of the photo sensor from the wheel hub, so that precise distance measurement and control could be obtained. Figure 8.12 shows how the dead reckoning system is mounted. To allow the distance to be adjusted, two 4mm grooves were cut into the 3mm aluminium sheet, letting the mount slide towards and away from the wheel hub on two mounting bolts fixed to the first layer.
Because the four mounting bolts for the motor housing come up from underneath the first layer, the mounting plate had to have a thin 'neck' so that the dead reckoning mount can move forward and backward freely between the motor housing fixing bolts.
Figure 8.13: Dimensions of Dead Reckoning Mount
Figure 8.13 shows the overall dimensions of the mounting system. The bolts used to fix the mounting system in place have a diameter of 4mm and allow small, accurate adjustments to the distance: the nuts are loosened, the mount is slid to the desired position and the nuts are retightened. The 25mm by 25mm square at the end of the thin 'neck' is bent at a right angle so that the photo sensor circuit can be mounted and held at the correct distance from the wheel hub.
8.6.4 Mounting the SRF08 Range Finder
Figure 8.14: Range Finder Mount
For the SRF08 range finder to produce accurate measurements and results, it must face directly forwards relative to the robot's direction of travel, at a height of approximately 120mm. This meant the range finder had to be mounted perpendicular to the first layer.
A mount was originally designed in-house, but research showed that a suitable mount could be bought from the company 'Active Robots' for £9.99. Given this low cost, the decision was made to buy the mount and save time: the man-hours and materials needed to manufacture the component would have cost more than the ready-made part.
Once the sensor was fixed to the mount, it was attached to the front of the chassis on the first layer. It was mounted at the front because of the way the robot explores a room: the programme that controls the robot's movement only allows it to travel forward, so the robot is never in reverse and does not need a sensor to protect its rear from hitting obstacles.
8.6.5 Mounting the Batteries
Because two different types of battery are used, one type for the motors and another to supply the motherboard and other computer components, two different sizes of battery mount were designed. Each mount had to hold its battery securely to prevent movement, while still allowing access to the terminals for recharging without removing the bracket, or the battery, from the chassis layer on which it is mounted. An alternative was to use right-angled section around the base of the battery to prevent sideways movement, but this would still let the battery lift vertically, so a bracket would still have been needed. A bracket was therefore designed to prevent all movement.
Figure 8.15: Motor Battery Mounts
Figure 8.15 shows the design of the bracket used to hold the motor batteries. Two mounts were made, one for the battery on each side, each side comprising two motors. The mount covers the whole battery, preventing movement in any axis and even preventing twist or vibration. The bracket is fixed to the first layer plate by four 4mm bolts, two on each side. On the right side there is a mount for a switch, allowing the power to be turned off without disconnecting the battery wires.
Figure 8.16: Dimensions of Motor Battery Bracket
Figure 8.16 shows the dimensions of the plate used to create the motor battery mount. The shape was formed by making 90-degree bends in the plate at the appropriate points to hold the battery securely. The actual battery dimensions are: height = 70mm, width = 102mm, length = 90mm.
The dimensions in Figure 8.16 show that the plate was made 92mm and 71mm for the length and height respectively. The extra millimetre on each side allows for the bending of the metal: the metal is compressed on the inside of a bend and stretched on the outside, so each bend consumes extra material, and an allowance had to be made for the bracket to finish to size.
The width of the plate stayed the same, at 102mm, because a single bend can easily be produced in the correct place. Along the length of the bracket, however, the bends are produced in succession, so one misplaced bend would have a knock-on effect on the rest, producing an incorrectly sized bracket. The side-securing flaps, which overhang the side of the battery, do not need to be a precise length, so the bends could consume metal from the flaps.
Figure 8.17: Computer Battery Mount
From Figure 8.17 it can be seen that the computer battery and bracket set-up differ from the motor battery bracket, because the battery has a different shape and terminal position. For this reason a different design was produced. The design still covers the majority of the battery, preventing movement in any axis as well as twist and vibration, but the right corner is left uncovered to allow access to the terminals.
Figure 8.18: Dimensions of Computer Battery Mount
The computer battery dimensions are: height = 64mm, width = 94mm, length = 151mm.
Again the plate was made a millimetre larger in each direction to allow for the metal consumed when the plate is bent. The bracket is fixed to the chassis in the same way as the motor battery bracket, with four 4mm bolts.
8.6.6 Mounting of the Motherboard and Hard Drive
The motherboard and the hard drive were mounted on the second layer so that heavier components could be mounted on the first layer, keeping the centre of gravity as low as possible. Because the stereovision web cameras and the SRF08 microcontroller needed to be mounted at the front of the robot, both the motherboard and hard drive were mounted at the back. The motherboard is mounted on the left of the plate, relative to the robot's forward direction, because the USB connections and serial port are on the left side of the board, allowing the leads to connect easily to the layer underneath. The hard drive connections are on the motherboard's right and, because of the limited length of the connection leads, the hard drive is mounted on the right of the plate. Both the motherboard and hard drive are mounted using motherboard mounts secured into 4mm holes.
Figure 8.19: Dimensions of Middle Plate (Version 1)
When the chosen motherboard failed on test, the problem arose of how to mount the replacement motherboard, which was larger, along with its Power Supply Unit (PSU). The new motherboard was mounted in the same position as the original (Figure 8.20), but because of its larger size the SRF08 microcontroller could no longer stay on the second layer. As the PSU is of similar height to the two web cameras, it was mounted on the second layer, making room for the microcontroller on the first layer in the position originally occupied by the first motherboard's battery. The PSU is held down by two sets of cable ties across the length of the unit, and the motherboard and hard drive are mounted on their mounts as before.
Figure 8.20: Dimensions of Middle Plate (Version 2)
As the mounting holes for the original set-up are already pre-drilled, once the repaired original motherboard is returned it will be a quick process to move the components back to their original locations.
8.6.7 Mounting the Software Monitoring Display
The monitoring display (Figure 8.21) was mounted perpendicular to the first layer so that the display is easily visible. It is fixed to the plate by two 4mm bolts, with the display itself attached to the mount by 3mm bolts and motherboard spacers. The display measures 60mm by 34mm, so the face it is mounted on measures 70mm by 44mm, leaving additional room for future upgrades to the circuit.
Figure 8.21: Software Monitoring Display
Figure 8.22 shows the dimensions of the plate used for the mount. The plate is bent at 90 degrees along the line shown to produce the 70mm by 44mm face. It is made from 3mm-thick aluminium plate and has two mounting holes to prevent the display from twisting.
Figure 8.22: Dimensions of Software Monitoring Display Mount
8.7 Chassis and Mounting Conclusion
Overall the chassis performs very well: it has allowed the Minibot to be very powerful and to carry all the equipment and components necessary to perform its mission. The positioning of the components gives the Minibot a low centre of gravity. The chassis is highly manoeuvrable thanks to the four-wheel-drive configuration, and the repeatability of distance travelled is good, allowing the terrain mapping to be measured very accurately.
All the mounts produced have proven to work well and meet their requirements: the camera mounts allow small adjustments to pitch and yaw angles, and the dead reckoning system can be adjusted to improve the accuracy of distance measurement. The materials used for the mounts have kept the weight of the Minibot to a minimum.
9 Robot Computer
In order to achieve autonomy, the robot needed to have some form of processing capability.
Indeed, an original specification of the project was that the robot should have a good degree
of processing power. In selecting a platform from the wide range that is currently
commercially available, a number of requirements were taken into consideration.
9.1 Choice of Platform
9.1.1 Onboard or Remote Computing
An available option for mobile robotics is to transfer the bulk of the processing to a remote
(host) computer. This would reduce the payload and overall power consumption of the robot
itself. However, remote processing would also greatly limit the physical independence of the
robot, as it would always have to be within range of communication of the host computer in
order to function correctly. This would produce a robot potentially capable of processing large
amounts of data, limited mainly by the bandwidth of the communications system.
Nevertheless, it could not be considered truly autonomous.
9.1.2 Expandability and Cost
Although a number of specialised platforms have been developed for use on mobile robots,
such as the Gumstix [20] range of extremely compact computing platforms, there is no
dominant standard or technology in this area. As such, future development of the robot could
be compromised, as compatible devices could be difficult to source. These dedicated
platforms also tend to be slightly more expensive than equivalent standard technologies, due
to the lower turnover relative to development costs. In order to maximise the capabilities of
the robot for the available budget, the decision was taken to use a more standardised
technology.
9.1.3 Level of Abstraction
It became clear that there were three main technologies suitable for consideration: the relatively low-level FPGA (Field Programmable Gate Array) technology; a higher-level PC/104-type embedded system; and finally, standard PC technology.
FPGA technology is widely used in mobile robotics research [20]. It is most useful for application-specific projects where the target system is unlikely to be altered considerably from its original state. This is because, short of designing custom circuitry, an FPGA is a semi-custom design choice: it must be programmed by the user for its purpose. Using an FPGA brings all the advantages of semi-custom design, such as flexibility of design style and low unit cost. The drawback is that customised design incurs large development costs in terms of time and effort. Running an operating system on an FPGA would also be extremely impractical, given the large number of logic resources required to implement a processor. Overall it is clear that a higher level of abstraction was required for this project.
PC/104 is an embedded computing standard derived from PC technology. It was developed
for use in space and power sensitive applications. Systems consist of stackable modules, with
a form factor of 90 by 96 mm. Each module typically consumes only 1-2 Watts [22]. Upon
initial investigation, PC/104 technology seemed a suitable choice for use on the robot. It has
developed to the point where boards can be purchased which support up to Pentium III
(~933MHz) processors. Windows XP is also supported by these boards [23]. Unfortunately
the cost of purchasing such a system would exceed the entire project budget, and as such had
to be ruled out.
Having considered all of the aforementioned factors, the conclusion was reached that standard
PC technology would provide the most appropriate computing platform for the robot.
9.2 Component Selection
There were a number of requirements that the onboard PC needed to fulfil. As mentioned previously, it is important to minimise the size, weight and power consumption of any components on a mobile robot. The decision was therefore taken to custom-select each component of the PC, taking these factors into consideration.
9.2.1 Motherboard
The first component to select was the motherboard. This would determine the overall size and potential for expansion of the system. Motherboards are categorised in terms of basic shape and size into different form factors. The form factors considered were two of the smallest standard sizes: Mini-ITX (170 x 170 mm) and Micro-ATX (244 x 244 mm). Micro-ATX was selected, as it generally provides greater connectivity but is only slightly larger than Mini-ITX.
The connectivity of the motherboard was another important consideration. As previously discussed, a 9-pin D-type connector would be required for the RS-232 interface with the low-level control circuitry (see Section 3.1). This is a widely supported standard and did not restrict the choice of motherboard. If a second RS-232 interface were required for any reason, a simple USB adapter could be built or purchased.
The motherboard chosen was the Micro-Star International (MSI) KM3M-V Socket A VIA KM266 Micro-ATX. The significant features of this board are listed in Table 9.1.
When the motherboard was delivered, it was not functional. The supplier was not willing to refund the cost but agreed to repair and return it within six weeks. This would have caused an unacceptable delay, so the decision was taken to purchase another motherboard to allow the project to continue. The KM3M-V was by then out of stock, so the equivalent and virtually identical KM4M-V was purchased; this is the motherboard now installed on the robot. The only significant difference between the two is that the KM4M-V has an extra connector, which is not used in this project.
Manufacturer                     MSI
Product Type                     Mainboard
Form Factor                      Micro ATX
Width                            24.1 cm
Depth                            19.1 cm
Compatible Processors            Athlon, Duron, Athlon XP
Processor Socket                 Socket A
Maximum Front Side Bus Speed     333 MHz
Maximum Supported Processors     1
Supported RAM Technology         DDR SDRAM
Networking                       Network adapter – Ethernet, Fast Ethernet
Total Expansion Slots            1 x processor – Socket A
                                 2 x RAM (2.5 V) – DIMM 184-pin
                                 1 x AGP (1.5 V)
                                 3 x PCI (3.3 V / 5 V)
Interfaces                       1 x storage – floppy interface
                                 1 x serial – RS-232 – 9-pin D-Sub (DB-9)
                                 1 x display/video – VGA – 15-pin HD D-Sub (HD-15)
                                 1 x keyboard – generic – 6-pin mini-DIN (PS/2 style)
                                 1 x parallel – IEEE 1284 (EPP/ECP) – 25-pin D-Sub (DB-25)
                                 1 x mouse – generic – 6-pin mini-DIN (PS/2 style)
                                 1 x network – Ethernet 10Base-T/100Base-TX – RJ-45
                                 4 x Hi-Speed USB – 4-pin USB Type A
                                 1 x audio – line-out – mini-phone stereo 3.5 mm
                                 1 x microphone – input – mini-phone 3.5 mm
                                 1 x audio – line-in – mini-phone stereo 3.5 mm

Table 9.1: Main Features of Motherboard
Figure 9.1: Diagram of Motherboard Layout
9.2.2 Mass Storage Device
The PC Architecture also requires the presence of an internal storage device, which holds the
operating system, and normally also holds data associated with the operation of the machine.
A typical PC would tend to use a standard 3.5” magnetic hard disk drive (HDD) as its mass
storage device. Currently available alternatives to this include network (remote) storage,
solid-state hard drives and compact flash devices. These must be evaluated against the
specific requirements of a PC on a mobile robot. The main factors to consider are: weight,
power consumption, cost, storage capacity and the ability to withstand the physical shocks
associated with the operation of a mobile robot.
Remote storage would offer a number of advantages. In addition to consuming no onboard
power and adding no weight to the payload, the storage device would not be subject to
physical shocks. However, in order to access and boot up the operating system (OS), the PC
has to be connected to the network. The communication protocol selected for use on the robot
is Wi-Fi, which is activated from within the operating system. In addition to this, remote
storage is subject to the same weakness as remote processing: a robot using this form of
storage could not be considered truly autonomous, as it would have to remain within range of
communication with its storage device in order to function correctly.
Compact Flash (CF) would provide an excellent mass storage solution. It is lightweight, and
typically consumes less than five percent of the power required to operate 2.5" hard disk
drives. CF cards are also extremely robust, capable of withstanding rapid temperature changes
and operating shocks of up to 2000 Gs (equivalent to a 10 foot drop). The CF Specification
can support capacities up to 137GB and CF cards are available in capacities from 16 MB to
12 GB [24]. Unfortunately, in order to function usefully and achieve a good degree of future
proofing, the robot must have at least 5 GB of mass storage. At current prices this would cost
at least £300, making it an extremely costly option relative to the project budget. CF was
therefore ruled out at this time.
A solid-state HDD would provide advantages similar to those offered by a CF card. However,
this technology is also prohibitively expensive for our budget at the current time.
Having evaluated the alternatives, a magnetic HDD was found to be the most suitable mass
storage device. A 2.5” 40 GB HDD was selected. Designed for use in laptop computers, this
HDD is a robust device capable of withstanding operating shocks of up to 225 G. Operating at
5 V, it draws a maximum current of 1.2 A and consumes only 2.4 W of power during
Read/Write operations and 0.36 W during Standby operation. The HDD has dimensions of
100.2 mm x 69.85 mm x 9.5 mm and weighs 99 g. These specifications mean that this device
is perfectly adequate for use on the robot.
Figure 9.2: Seagate 2.5" 40 GB HDD
It can be observed from Figure 9.2 that the connector on this HDD is not the same as that on a standard 3.5" HDD. For use in notebook applications, the connector is smaller and is modified to include a 4-pin 5 V power connection (seen here on the left of the connector). An adapter was therefore necessary to connect the HDD to the IDE cable and the power supply.
Figure 9.3: Generic IDE to 2.5" Adaptor
It should be noted that the team considered the CF card to be the more suitable mass storage device. As the price of this technology is presently falling rapidly, the suggestion is made that the 2.5" HDD could be replaced with a CF card at some future time.
9.2.3 Processor
As can be seen from Table 9.1, there were three types of processor compatible with the selected motherboard: Athlon, Duron and Athlon XP. An Athlon XP was chosen for its relatively high performance rating of 3000+, meaning the processor yields performance roughly equivalent to that of a 3 GHz Intel processor.
This processor will consume a relatively high amount of power; typically 58 W, with a
maximum consumption of 74 W. In order to navigate successfully using MATLAB, the robot
will need to have close to real-time performance in its operation. Also, any future
development of the robot would be likely to build in processor intensive artificial intelligence
functionality. For these reasons, it was decided that processor speed was not an area that
should be compromised in order to conserve power.
9.2.4 RAM (Random Access Memory)
The memory purchased was a 512 MB module. The selected motherboard can hold up to 2
GB of main memory (RAM). Although using this maximum amount of RAM would have
boosted the performance of the PC slightly, 512 MB is a sufficient quantity for this project.
RAM is an expensive component, and the price for 2 GB would have been approximately
£200. If more RAM is required in the future, it can be easily added into the vacant slots.
9.2.5 PC Power Supply
To avoid using a trailing-lead power supply for the computer, options for an onboard power supply were investigated. A 12 V micro power supply, designed to run from a 12 V battery, was sourced along with a suitable battery. Unfortunately, on initial test the power supply proved to be producing incorrect voltage levels.
At this stage of the project, it would not have been prudent to attempt to power the PC from
this supply. Instead, it was decided to power the PC from a standard ATX power supply with
a trailing lead.
The documentation for the micro power supply is included with this project and the supply
itself will be passed on with the rest of the hardware. Had the time constraints been less, the
fault would have been investigated and rectified, but this was not possible within the time
available.
The ATX power supply now installed on the robot is a 200 W supply which was sourced from
ECS.
9.3 Operating System
The two dominant operating systems available for use on PC hardware such as that selected
for this project are Windows and Linux. Either system would have adequately fulfilled the
demands of this project. In general, Linux provides greater hardware access and control
whereas Windows provides more high-level functionality. In many mobile robotics applications a high degree of hardware access and control is indeed required, and Linux is often the more natural choice for such systems. In this project, however, there was no real requirement for any great degree of hardware control. In fact, the excellent support for Wi-Fi provided by Windows XP meant that for development purposes, given that the team had no significant previous experience of the Linux platform, Windows XP was the more suitable system.
If the specific features of Linux were required for future development of the robot, simply
changing the operating system would not be difficult.
9.4 Remote Control
9.4.1 Wireless Communication
The options available for wireless communication were a WLAN (Wi-Fi), Bluetooth, or a custom-built RF link.
Since the robot is running Windows XP on a relatively powerful platform, no advantage
would have been gained from the development of a custom RF packet transmission system. It
is more efficient in terms of time and money to simply use a standard form of WLAN.
The Bluetooth standard incorporates devices capable of forming a Wireless Personal Area Network (WPAN) operating with a bandwidth of less than 1 Mbps. The term WPAN simply indicates a shorter range than a WLAN, generally functioning only within a range of approximately 10 m [25].
A core specification for the robot was that it should be capable of independently mapping an
area. A communications range of only 10 m would therefore have been extremely limiting.
Although the robot was to operate independently, the capacity to observe its operation
remotely was not only a key tool for development and testing but also a useful operational
feature.
The communication technology selected was therefore Wi-Fi. A USB 802.11b dongle was
purchased for the robot. The required connection was between the robot, acting as a host, and
a remote computer acting as a client. Since no access to a wired network was required, and
only two devices needed to be networked, an ad hoc network configuration was set up. See
Section 3.3.2 Wi-Fi Configuration and Use for details.
Details of how to enable Wi-Fi under Windows XP are available on the project CD.
Network Name (SSID)       Test Minibot
Network Authentication    Open
Data Encryption           None
Network Key               (Not Used)
Host IP                   169.254.187.10

Table 9.2: Ad Hoc Network Properties
The only foreseeable weakness of Wi-Fi communication is that, since the 802.11b protocol operates within an unregulated frequency band, it can be subject to interference from equipment such as microwave ovens and portable telephones. This interference is not thought to be great enough to pose a problem in the vast majority of situations.
9.4.2 Remote Access Software
Two main alternatives exist for accessing the host (onboard) computer from the client
(remote) computer. The first of these is to custom build an application to communicate with
the host directly over the network using TCP/IP protocols. The advantage of this method is
that only required information is passed between the two computers. This reduces network
overheads and speeds up the transmission of the requested data. The drawbacks are that a
great deal of software development is required and only partial access to the host computer is
gained. As no members of the team were confident in this type of high-level programming, it
was decided that this option should be avoided if possible.
The alternative method for remotely accessing the host computer is to use a ‘remote desktop’
application. These can tend to incur overheads too large for lower specification PCs, but on
this relatively fast platform the overheads will not pose any considerable problem. The
advantage of these systems is that almost all functions of the host computer can be controlled
from the client. To the user, it is as though one is actually using the host computer.
The first remote desktop application investigated was the native Windows XP software. This was deemed unsuitable due to a paradoxical feature of its operation. To use the Windows XP remote desktop feature, the host computer must have both a username and a password set on the active user account. However, if a username and password are active on the account, they must both be entered into the 'Windows Welcome Screen' every time the operating system is loaded. Since the Wi-Fi link to the client only connects once the operating system has loaded, this would mean that a keyboard, mouse and monitor would have to be connected to the onboard host computer each time it was switched on. Such an impractical method was deemed unsuitable for use.
A popular alternative to the native Windows XP remote desktop application is WinVNC.
RealVNC, a UK company founded in 2002 by a team from the AT&T Laboratories in
Cambridge, provides this software free. VNC software incorporates a number of features
that make it particularly suitable for use in this project. The first of these is that the host
computer need not have a username and password in operation on its user account. The
WinVNC application is simply put in the startup folder, and the computer need only be
manually switched on. Windows XP then loads automatically, connects to the ad hoc
network and the WinVNC host application starts up and waits for the VNC Viewer
software on the client to connect.
Another advantage of using VNC is that it is a cross-platform piece of software. This
means that the host and client computer need not even be running the same operating
system for a connection to be made [26].
The decision was therefore taken to use VNC software to remotely access the onboard
computer. The VNC application software was, as described, installed on the host
computer and linked from the Start-Up menu. The session password was set to ‘Minibot’.
The VNC Viewer software, which is only around 150KB in size, was then loaded onto the
client computer. Access is made by entering the network IP address and session password
of the host into the viewer.
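As a usage example, once the robot has booted and joined the ad hoc network, the operator runs the viewer on the client machine and connects to the host address from Table 9.2:

vncviewer 169.254.187.10

entering the session password when prompted.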
9.5 Complete Onboard PC Specification
Motherboard          KM4M-V Socket A VIA KM400 Micro-ATX
Processor            AMD Athlon XP 3000+ 333FSB 512KB L2 cache
Operating System     Windows XP Professional SP2
RAM                  Samsung 512MB PC2700 DDR333 184-pin memory module
HDD                  Seagate ST94811A 40GB 8MB cache 2.5in 5400rpm
Network Interface    Netgear MA111 USB Wi-Fi dongle
Username             Administrator
Password             (None)
Installed Software   WinVNC (Virtual Network Computing) v3.3.3 R7
                     Netgear MA111 drivers v2.5 beta
                     MATLAB v7

Figure 9.4: PC Specification
9.6 Robot Computer Summary
The computer platform installed on the robot is a powerful custom-built machine. Each component has been selected for the best possible performance in its role. The footprint of the Windows installation on the hard drive has been kept to a minimum by choosing small, reliable pieces of software and not installing extra, unnecessary programs.
A further consideration has been the future development of the system. The design process took into account the fact that the system will almost certainly be expanded in the near future. As such, preference has been given to industry-standard interfaces and recognised manufacturers. The motherboard, for example, can easily be modified to support an extra four USB ports (refer to the manufacturer's handbook); this feature was chosen in deference to the current trend towards the USB standard. It was also ensured that, in addition to supporting the hardware requirements of the current project, the PC has plentiful room for expansion, with spare IDE, PCI and AGP connectors.
The principle of ergonomics has been observed to make the computing system accessible, such that a new user would be able to operate it with a minimum of instruction. The use of a leading operating system and well-known, well-regarded software means that the overall function is reliable.
10 Conclusion
The Minibot III project has been successful in achieving many of the objectives set out in Section 1.2. At the conclusion of the project, the robot platform consists of a high-specification PC with wireless capability mounted on a versatile aluminium and steel chassis. The robot moves on four wheels, each driven by its own motor, which is in turn powered by one of two Sealed Lead-Acid (SLA) batteries. Control is achieved through a combination of off-the-shelf motor controllers and controller hardware developed during the project. This hardware also integrates the electronic sensor inputs with the PC. Two optic sensors provide information on the robot's movement via a dead-reckoning system.
The robot senses its external environment in two ways. A low-cost sonar sensor is utilised as
a collision detector, while dual web-type cameras receive stereo images of their surroundings.
Software has been developed which allows depth calculations to be extracted from these
images, when correctly calibrated. The depth data can be used to construct a grid-based map
of the local environment, and a stochastic greedy strategy is used to explore and navigate
based upon this map. An on-screen graphic display provides current map and supplementary
information to the external user. Written in MATLAB code, the programme communicates
via RS232 to the controller hardware, providing a fully integrated robot system with
autonomous search and mapping capability.
While each separate component module in the project has been shown to operate to a standard sufficient to meet the brief, reliability issues have consistently proved to be a barrier to progress, especially in the latter stages of 'project integration'. In particular, systems which rely on sensor inputs (such as dead-reckoning and stereo vision) have suffered from poor reliability. Although difficult to confirm, installation and calibration issues have been identified as significant factors in system reliability.
Persistent reliability problems, in addition to PC component failure, have meant that only limited testing of the whole robot system was possible, and results have so far been inconclusive. The robot has demonstrated the ability to move and navigate with some accuracy in straight lines and through turns, mostly avoiding collisions. Fully integrated, the stereo vision system is capable of identifying significant obstacles, which can then be plotted on a map. The robot usually attempts to turn away from identified obstacles. At this stage, however, it has not been shown that in a given situation the robot can autonomously generate an accurate and useful map of its immediate environment. The project team is confident that if the reliability issues could be overcome, and with more time for testing, this goal could be achieved with the existing robot hardware and software.
The personal development of all group members has been vast, with team-working, project
management and planning skills advancing in addition to general engineering awareness, and
specific programme related skills. The project has been enjoyable for everyone involved, and
the project team are grateful for the help and support they have received from the University.
11 Recommendations
Following the completion of this project, recommendations are made towards the continuation
and further development of Minibot.
•
Measures should be taken to reduce the reliance on camera-calibration of the stereovision algorithms. This may include modifying the hardware mount enabling more
precise and robust calibration, or developing tools to allow software calibration to be
performed. The resulting software should be made robust to any further calibration
errors.
• The link between MATLAB and the hardware control modules should be extended to allow two-way communication and the feedback of robot position (see the sketch after this list). This would allow more accurate control of robot movement and a more precise map to be created. Furthermore, it would allow data from the sonic rangefinder to be utilised in the mapping process, in addition to its current use in collision avoidance.
• As discussed, a non-mains alternative power supply should be implemented for the robot PC and electronic components to allow fully mobile, untethered operation.
• Further efforts should be made to reduce the reaction time of the sonar obstacle detection, to increase its functionality across the full range of robot speeds and obstacle distances.
• With improved accuracy of the disparity and depth data, further development should be undertaken into mapping using vectors and 3D planes. Obstacle maps based on vectors should subsequently be used to replace or complement the current grid-based system.
• A Graphical User Interface should be developed to allow full user-robot interaction, utilising a comprehensive array of input commands and displaying multiple output figures showing the current map and robot status.
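To sketch the recommended two-way link, MATLAB's serial-port interface already supports reading as well as writing. The port name, baud rate and message bytes below are assumptions for illustration, not the project's actual protocol.

% Two-way MATLAB <-> controller link over RS232 (illustrative sketch;
% port, baud rate and message format are assumed values).
s = serial('COM1', 'BaudRate', 9600, 'Timeout', 1);
fopen(s);
fwrite(s, uint8([1 100 100]));    % e.g. a hypothetical drive command
reply = fread(s, 4);              % e.g. a 4-byte odometry packet back
% ... decode 'reply' into a position estimate for the mapping code ...
fclose(s);
delete(s);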
12 References
[1] Philips Semiconductors. I2C Protocols. http://www.semiconductors.philips.com/markets/mms/protocols/i2c/index.html. Last accessed: 29/12/04.
[2] Bob Grabowski. Small Robot Sensors. http://www.andrew.cmu.edu/user/rjg/websensors/robot_sensors2.html. Last accessed: 07/03/05.
[3] SuperDroid Robots. SRF04 Setup/Information. http://www.superdroidrobots.com/product_info/SRF04.html. Last accessed: 07/03/05.
[4] Acroname. Demystifying the Sharp IR Rangers. http://www.acroname.com/robotics/info/articles/sharp/sharp.html. Last accessed: 07/03/05.
[5] Kevin Dowling. What robot-related products are there? http://www.frc.ri.cmu.edu/robotics-faq/10.html. Last accessed: 07/03/05.
[6] ActivMedia Robotics. Laser Mapping & Navigation. http://www.activrobots.com/ACCESSORIES/pantiltlaser.html. Last accessed: 07/03/05.
[7] Peatman, J.B. (1998) Design with PIC Microcontrollers. Prentice Hall.
[8] Jesse Hoey. Stereo Geometry and Depth Measurement. www.cs.ubc.ca/spider/jhoey/review/node2.html. Last accessed: 17/03/05.
[9] Ahmad Darabiha (2003) Video-Rate Stereo Vision on Reconfigurable Hardware. http://www.eecg.toronto.edu/~jayar/pubs/theses/Darabiha/AhmadDarabiha.pdf.
[10] Tony Jebara, Ali Azarbayejani and Alex Pentland (1999) 3D Structure from 2D Motion. "3D and Stereoscopic Visual Communication", May 1999, Vol. 16, No. 3. http://www1.cs.columbia.edu/~jebara/htmlpapers/SFM/node8.html.
[11] Robyn Owens. Epipolar Geometry. http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/OWENS/LECT10/node3.html. Last accessed: 17/03/05.
[12] Point Grey Research. Stereo Vision Products (hardware). http://www.ptgrey.com/products/index.html. Last accessed: 16/04/05.
[13] Robosoft. OEM Cameras. http://www.robosoft.fr/SHEET/06Vision/index.html. Last accessed: 16/04/05.
[14] Logitech. Product Features. http://www.logitech.com/index.cfm/products/details/GB/EN,CRID=2204,contentid=5041,detail=2. Last accessed: 16/04/05.
[15] Daniel Oram. Rectification for Any Epipolar Geometry. http://www.bmva.ac.uk/bmvc/2001/papers/82/accepted_82.pdf. Last accessed: 18/01/05.
[16] Carnegie Mellon University. Stereo Image Database. http://vasc.ri.cmu.edu/idb/html/stereo/index.html. Last accessed: 20/03/05.
[17] Martin Campel. Stereo Vision. http://www.prip.tuwien.ac.at/Research/3DVision/stereo.html. Last accessed: 10/12/04.
[18] IEEE Standards Association. http://standards.ieee.org/. Last accessed: 15/04/05.
[19] Intel. Wireless Solutions Information. http://www.intel.com/business/bss/infrastracture/wireless/solutions. Last accessed: 15/04/05.
[20] Gumstix. Embedded Computer Systems. www.gumstix.com.
[21] Perez and Sanchez (1999) A Digital Artificial Brain Architecture for Mobile Autonomous Robots. In Proceedings of the Fourth International Symposium on Artificial Life and Robotics (AROB'99), Oita, Japan, January 1999, pp. 240-243.
[22] PC/104 Standard Information. http://www.pc104.org/technology/reg_info.html. Last accessed: 12/04/05.
[23] Ampro (PC/104 supplier). ReadyBoard 700. http://www.ampro.com/html/ReadyBoard_700.html. Last accessed: 12/04/05.
[24] CompactFlash Standard Information. http://www.compactflash.org/faqs/faq.htm#characteristics. Last accessed: 16/04/05.
[25] Microsoft. Bluetooth and WPAN Information. http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/enus/wireless_networking_overview.mspx. Last accessed: 16/04/05.
[26] RealVNC. http://www.realvnc.com. Last accessed: 16/04/05.
[27] True Force. http://trueforce.com/Articles/Robot_History.htm. Last accessed: 18/04/05.
[28] Robotics Research Group. http://www.robotics.utexas.edu/rrg/learn_more/history/. Last accessed: 18/04/05.
[29] Robosapien. http://www.robosapienonline.com/. Last accessed: 18/04/05.
[30] Sony Entertainment Robot Europe. http://www.aibo-europe.com/. Last accessed: 18/04/05.
[31] Wikipedia, the Free Encyclopedia. http://en.wikipedia.org/wiki/Autonomous_robot. Last accessed: 18/04/05.
[32] Spence, Floyd D. (2001) National Defense Authorization Act for Fiscal Year 2001, P.L. 106-398, Sec. 220.
[33] DARPA. http://www.darpa.mil/grandchallenge/index.html. Last accessed: 18/04/05.
[34] Keith Somerville, BBC News Online. http://news.bbc.co.uk/1/hi/world/2404425.stm. Last accessed: 18/04/05.
[35] John Arlidge. Off-Roading. The Sunday Times, 24 October 2004.
[36] Technology Innovation Centre. http://www.tic.ac.uk/micromouse/. Last accessed: 18/04/05.
[37] Trinity College. http://www.trincoll.edu/events/robot/. Last accessed: 18/04/05.
[38] Haralick, R. and Shapiro, L. (1993) Computer and Robot Vision, Volume II. Addison-Wesley Publishing Company, pp. 43-100, 156-158.
[39] Matthew Turk, University of California. http://www.cs.ucsb.edu/~cs281b/spring2004/notes/camera.pdf. Last accessed: 17/04/05.
[40] HowStuffWorks. http://electronics.howstuffworks.com/digital-camera2.htm. Last accessed: 17/04/05.
[41] Volker Gerdes, University of Bonn. http://www-dbv.cs.uni-bonn.de/stereo_data/. Last accessed: 17/04/05.
[42] Thrun, S. (2002) Robotic Mapping: A Survey. School of Computer Science, Carnegie Mellon University.
[43] LaValle, S.M. and Kuffner, J.J., University of Illinois. http://msl.cs.uiuc.edu/rrt/index.html. Last accessed: 11/04/05.
[44] Oriolo, G., Vendittelli, M. et al. (2004) The SRT Method: Randomized Strategies for Exploration. Proceedings of the 2004 IEEE International Conference on Robotics & Automation, New Orleans, LA, April 2004.
[45] Cohen-Or, D. and Kaufman, A. (1995) Fundamentals of Surface Voxelization. Graphical Models and Image Processing, Vol. 57, No. 6, November, pp. 453-461.
[46] Peter Kovesi, University of Western Australia. http://www.csse.uwa.edu.au/~pk/Research/MATLABFns/. Last accessed: 19/04/05.
[47] Matthies, L., Xiong, T. et al. (2002) A Portable, Autonomous Reconnaissance Robot. Robotics and Autonomous Systems, Vol. 40, pp. 163-172.
[48] Borenstein, J. and Koren, Y. (1991) Histogramic In-Motion Mapping for Mobile Robot Obstacle Avoidance. IEEE Transactions on Robotics and Automation, Vol. 7, No. 4, pp. 535-539.
[49] Zalud, L. (2003) RoBrno Awardee Paper. RoboCup Rescue Robot League Competition, Padova, Italy, July 2003.
[50] Tomatis, N., Nourbakhsh, I. and Siegwart, R. (2003) Hybrid Simultaneous Localization and Map Building: A Natural Integration of Topological and Metric. Robotics and Autonomous Systems, Vol. 44, pp. 3-14.
[51] Ohya, A., Nagashima, Y. and Yuta, S. (1994) Exploring Unknown Environment and Map Construction Using Ultrasonic Sensing of Normal Direction of Walls. IEEE.
[52] Hiyama, M., Emura, T. and Kumagai, M. (1997) Sharpening Directivity of Ultrasonic Range Sensor Using Multiple Transmitters by Different Amplitude and Pulse Width Method. IEEE.
155
GDP – Minibot III
Autonomous Terrain Mapping Robot
13 Appendix A: Disparity-depth equations
13.1 Depth estimation
[Figure 13.1: Stereo depth estimation geometry.]
In the following derivation, the subscript 1 applies to the left camera and the subscript 2 to the
right camera. The image of an object at P is assumed to appear at x1 and x2 when viewed by
cameras 1 and 2 respectively.
Z = distance from baseline to object P
D = separation of optical axes
f = focal length of cameras
x1 = position of P on image plane 1
x2 = position of P on image plane 2
d1 = perpendicular distance from optical axis 1 to P
d2 = perpendicular distance from optical axis 2 to P
From the construction of similar triangles,

\[ \frac{d_1}{z} = \frac{x_1}{f} \qquad \text{and} \qquad \frac{d_2}{z} = \frac{x_2}{f} \]

so that

\[ d_1 = \frac{x_1 z}{f} \qquad \text{and} \qquad d_2 = \frac{x_2 z}{f}. \]

Equating through the separation of the optical axes,

\[ D = d_2 - d_1 = \frac{x_2 z}{f} - \frac{x_1 z}{f} = \frac{z}{f}\,(x_2 - x_1) \]

which rearranges to give the depth:

\[ z = \frac{D f}{x_2 - x_1}. \]
13.2 Redefining projected image location
With \( w \) denoting the image width, the projected positions can be re-expressed relative to the image edge as

\[ x_1 = \frac{w}{2} - x_1' \qquad \text{and} \qquad x_2 = \frac{w}{2} - x_2'. \]

Substituting into the depth equation,

\[ z = \frac{D f}{x_2 - x_1} = \frac{D f}{\left(\frac{w}{2} - x_2'\right) - \left(\frac{w}{2} - x_1'\right)} = \frac{D f}{x_1' - x_2'}. \]
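The final expression translates directly into code. As a quick numerical check, the following MATLAB lines (with assumed values for D, f and the matched pixel positions) recover a depth of about 2.3 m:

% Depth from disparity, z = D*f/(x1' - x2') (assumed example values).
D   = 0.10;           % separation of the optical axes (m)
f   = 700;            % focal length expressed in pixels
x1p = 340;            % feature position in the left image (pixels)
x2p = 310;            % feature position in the right image (pixels)
z   = D * f / (x1p - x2p)     % depth in metres (2.33 m here)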
14 Appendix B: Motor Wiring
Step 1: Take two capacitors and twist one lead of each together.
Step 2: Scratch a spot on the motor case between the two terminals as shown. Using a soldering iron, apply some solder, then solder the twisted ends of the two capacitors to this spot on the motor case as shown.
Step 3: Take the other ends of the twisted pair of capacitors and insert one into each motor brush terminal.
Step 4: Place a third capacitor between the two motor brush terminals.
Step 5: Strip back the shielded wire, leaving the two conductors about 2-3" exposed.
Step 6: Slide the large ferrite ring over the shielded wire and cover it with the 3/8" heat shrink.
Step 7: Insert the small ferrite rings into the 1/4" heat shrink (a very tight fit) and then slide one over each wire conductor.
Step 8: Place the wires on the motor brush terminals. Be consistent with which colour wire goes to which terminal, so that all the motors are wired and hooked up the same way; here the white wire is placed on the terminal with a red dot. A high-power soldering gun or torch may be needed to heat up the motor casing and ensure good adhesion.
Step 9: Solder the wires and the capacitors to each motor terminal, being careful not to overheat any of the components.
Step 10: Heat the shrink wrap on the wires, securing the ferrite rings.
15 Appendix C: Module 1 & 2 Circuit Diagram
16 Appendix D: Layout of Chassis – Layer 1
[Figure: plan of chassis layer 1, showing the SRF08 rangefinder, the two dead-reckoning systems, the Module 1 and Module 2 microcontrollers, the two MD22 motor controllers, the left- and right-side batteries, and the software maintenance display.]
17 Appendix E: Layout of Chassis – Layer 2
[Figure: plan of chassis layer 2, showing the stereo-vision cameras, power switch, LED display, hard drive, motherboard and power supply unit.]