FAST-NU Research Journal (FRJ), Volume 1, Issue 1, January 2015
Design and Implementation of an
Autonomous Fire Fighting Robot
Faisal Abbas 1, Omer Saleem Bhatti 2
1 National University of Computer and Emerging Sciences, Lahore, Pakistan; [email protected]
2 National University of Computer and Emerging Sciences, Lahore, Pakistan; [email protected]
Abstract–This paper describes the details of the design and construction of a flexible, reliable and low-cost experimental mobile robot platform. The robot is capable of autonomously detecting and extinguishing a fire source. One benefit of this platform is that it can serve as a teaching aid for explaining basic concepts of robotic paradigms to undergraduate students of engineering. Infrared flame sensors are used to detect the source of the fire, and optical range sensors are used to avoid obstacles. Signal conditioning circuits and filters remove noise and improve the quality of the sensor data. Planning algorithms enable the robot to reach its target while traversing the shortest, safest, collision-free path. The structure of the omni-directional platform and the articulated mechanism used to dispense water and extinguish the fire are also explained. Motor driver and water level monitoring circuits are presented, and features that ensure the safety of the robotic system are also discussed.

Keywords– fire fighting robot, flame sensor, optical range sensor, path planning algorithms, omni-directional platform.

I. INTRODUCTION

The employment of field-assistive robots is gaining momentum with the rapid advancements in science and technology. Their utilization in environments that are hazardous for human intervention is increasing rapidly. Recently, robots have been deployed for fire-fighting, which is among the most dangerous jobs for human beings, since a fire fighter has to reach the fire and extinguish it quickly to prevent further damage. The primary motivation in carrying out this research/design activity is to develop an autonomous, low-cost fire-fighting mobile robot that can act as a reliable machine to prevent fire accidents. The secondary objective is to use this robot as an effective tool to teach and practically demonstrate various essential concepts to undergraduate engineering students in courses such as electronic circuit design, embedded computation, microprocessor-based system design, electronic instrumentation, reactive paradigms of robotics, engineering mechanics and digital signal processing.

II. LITERATURE REVIEW

A lot of work has been done in the past on the construction of autonomous fire-fighting robots, with an aim to use them in undergraduate education and to motivate student teams to participate in mobile robot design activities [1, 2]. Such activities allow students to practically apply, and hence strengthen, their concepts from mathematics, feedback control, computer programming, signals and systems, and basic robotics courses [3]. In some of these activities, a line-following robot is used to track and navigate through a line maze while avoiding obstacles, and to extinguish any fire sources on the basis of feedback from Light Dependent Resistors (LDRs) [4]. A remote-controlled crawler hydraulic excavator has been modified to serve as a fire-fighting robot in [5]. Similarly, a fuzzy inference system has been designed in [6] to monitor and put out fire in road and railway tunnels. A cooperative fire-fighting technique is discussed in [7], where a human leader guides the robot(s) to perform the fire-fighting task according to his commands. An ultrasonic obstacle avoidance scheme is suggested in [8] and its efficacy has been experimentally validated. An adaptive multi-sensor fusion algorithm for fire detection in intelligent buildings is discussed in [9]. A number of technologies that are currently used in fire-fighting robots are explained in [10].
III. DESIGN DETAILS

The overall hardware block diagram of the fire-fighting robot is presented in Figure 1.

Fig. 1: Hardware block diagram

The robot structure has a cylindrical contour. The array of flame sensors and distance-range sensors is installed around the outer periphery of the robot. The robot uses infrared flame sensors to measure the intensity of the fire source (flame) and its distance from the robot's current location. For environmental awareness and obstacle detection, the robot uses optical ranging sensors. The data from these sensors are filtered and conditioned to remove impulsive noise. By using the sensor feedback in conjunction with reactive/heuristic/potential path planning algorithms, the controller learns critical information about its surroundings, plans its movement, and then generates appropriate motor control commands. These commands are provided to an H-bridge motor driver circuit. The driver circuit steers the motorized wheels of the robot so that it may navigate towards the location of the fire while avoiding collisions and bypassing the intervening obstacles in its path. In this way, the robot converges towards its destination by following the shortest and safest path.

The robot uses four motorized omni-directional wheels, installed beneath its base around the periphery. Each pair of wheels is installed co-axially beneath the base, such that the axes of the two pairs are perpendicular to each other. Each pair of wheels uses a differential drive mechanism for locomotion. This arrangement provides stability, flexibility, maneuverability and controllability for the mobile platform. If there are multiple sources of fire in the vicinity, the robot uses a variant of the 'Bug algorithm' [17] to decide the most critical location among the available choices, and moves to extinguish the fire at that location. The remaining locations are scheduled on the basis of their criticality. Once the robot approaches the target location, it processes the sensor data and tends to maintain a minimum distance from the fire source to ensure its safety. At this point, the controller generates pulse-width-modulated signals to control the position of the servo motors installed in the manipulator arm. Using its pan-tilt configuration, the manipulator scans the area to search for the exact location of the flame in its workspace. The scanning is done with the aid of an additional flame sensor attached at its end-effector, alongside the water dispensing hose/unit. When the exact location of the flame is determined, the arm locks its position and the controller electronically actuates the water pump to dispense water and extinguish the fire.

The robot system contains a total of eight modules:

A. Fire Detection
B. Obstacle Avoidance
C. Signal Conditioning
D. Embedded Controller
E. Motor Driver
F. Water Dispensing Manipulator Arm
G. Water Pump Actuator
H. Robot Structure

A dedicated circuit is designed to monitor the water level in the tank placed inside the robot. The entire system is powered by a 12 V, 4.5 Ampere-hour sealed lead-acid (SLA) battery.
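The two perpendicular differential-drive pairs described above can be sketched as a simple velocity mixer; a minimal illustration, assuming one co-axial pair drives the x-axis and the other the y-axis (the function name and unit scaling are hypothetical, not from the paper):

```python
def wheel_speeds(vx, vy):
    """Map a desired platform velocity (vx, vy) to the four omni-wheel
    speeds. Wheels 0/1 are the co-axial pair that drives along x;
    wheels 2/3 drive along y. The passive rollers let each pair slide
    freely along the other axis, so the two commands do not interfere."""
    return (vx, vx, vy, vy)

# Moving diagonally drives both pairs equally.
assert wheel_speeds(0.5, 0.5) == (0.5, 0.5, 0.5, 0.5)
# A pure x-axis move leaves the y-axis pair idle.
assert wheel_speeds(1.0, 0.0) == (1.0, 1.0, 0.0, 0.0)
```

This decoupling is what makes the perpendicular-pair layout attractive: each axis can be commanded independently without trigonometric mixing.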
A. Fire Detection

The primary task of the robot is to detect the location of the fire source. A burning fire emits radiation, which is composed of a very small amount of ultraviolet energy and visible light energy. For this purpose, an array of near-infrared flame detectors has been utilized, as shown in Figure 2 [11]. The sensor array is particularly sensitive to radiation in the spectral range of 900–1100 nm. The cone of vision of each individual sensor in the array is three-dimensional and very sharp (approximately 20°–25°). When all the sensors in the array are used, they provide a "total" cone of vision of 120 degrees. Four of these sensor arrays are placed around the robot's outer surface, one on each wall, as shown in Figure 2. The wide cone of vision is beneficial as it covers a large area on each side of the robot, allowing the robot to accurately scan and sense its environment for any possible fire sources.

Fig. 2: Flame detector array

The lamination on the sensors helps shield the sensor readings from potential background radiation. The sensors offer a detection range of about 2.74 meters. However, they must be placed high enough on the robot's structure that all possible flame locations are in their field of view. Each sensor in the array gives an analog output. The analog output from the sensor(s) follows the square law of flame detection: if a flame detector can detect a fire with an area A at a certain distance d, then a four times bigger flame area is necessary if the distance between the flame detector and the fire is doubled [12], as shown in Figure 3. The relation is given by the following equation:

A = c · d²

where,
d = distance between the flame and the fire detector
A = minimum flame area
c = constant of proportionality

Fig. 3: Square law of flame detection

B. Obstacle Avoidance

For obstacle detection and avoidance, infrared range sensors are used, as shown in Figure 4 [13]. These sensors work on the principle of optical triangulation in one dimension. The infrared LED transmits a focused beam towards the target. After reflection, the beam falls directly onto a position-sensitive device (PSD) [14]. The process of ranging is shown in Figure 5. The distance of an object from the sensor is given by the equation below:

D = f · L / x

where,
D = distance between the sensor and the object (m)
f = focal length of the lens (m)
L = distance between the laser source and the PSD (m)
x = distance of the incident light spot on the PSD (m)

Fig. 4: Range sensor [13]

The sensor output is sufficiently accurate, but due to the geometric limitation in the construction of the sensor, shown in Figure 5, it is generally used to detect objects in close proximity. In the proposed robot, GP-series optical range sensors (by SHARP) [15] have been utilized. They can measure any object with acceptable accuracy between 8 cm and 80 cm. The sensor is shown in Figure 4 and the variation in its analog voltage output is shown in Figure 6.
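The square law above can be checked numerically; a minimal sketch (the constant c is illustrative, not a calibrated value from the paper):

```python
def min_flame_area(d, c=1.0):
    """Square law of flame detection: the minimum detectable flame
    area A grows with the square of the detector distance d,
    i.e. A = c * d**2, where c is the constant of proportionality."""
    return c * d ** 2

# Doubling the distance requires a four-times-larger flame area.
assert min_flame_area(2.0) / min_flame_area(1.0) == 4.0
```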
C. Signal Conditioning

As discussed above, the robot system utilizes four flame sensor arrays (each array having six individual infrared flame-sensing LEDs), thirteen infrared optical range sensors and an additional flame sensor at the tip of the shower. Hence, in total, thirty-nine analog-to-digital conversion (ADC) channels were needed. However, the ATMEGA 2560 microcontroller provides only sixteen such ADC channels. To handle this issue, multiplexing was performed using CD4051 analog multiplexers. Each multiplexer, addressed by three select bits, takes 8 analog inputs from the sensors. The select bits are provided by the digital pins of the ATMEGA 2560 microcontroller. Five multiplexers have been used, as shown in Figure 7. Within 20 ms, each IC reads the data and, depending on the select bits, forwards the corresponding input to its output. A 100 pF non-polar capacitor is also placed at the analog output of each sensor to remove ripple and impulse noise.
Fig. 7: Multiplexer circuit

Fig. 5: Optical Triangulation

Fig. 6: Output behavior of range sensor [15]

D. Motor Driver Circuit

There are four DC metal-geared motors attached beneath the base of the mobile robot. Each motor, shown in Figure 8, has a torque of 18 kg·cm and a speed of 63 rpm. To drive these four motors as per the commands forwarded by the embedded controller, two dual H-bridge motor driver circuits are utilized. Each circuit contains two four-quadrant DC chopper circuits; hence, it can control the speed and direction of rotation of two motors simultaneously. Since each motor draws approximately 600 mA under loaded conditions, an L298 dual H-bridge motor driver is used, as shown in Figure 9. TABLE I shows the sequence of binary logic inputs required to drive the motors. For speed control, each logic '1' input is replaced by an appropriate PWM signal.

TABLE I. LOGIC INPUT TO DRIVE MOTORS

Direction of Rotation | Logic Input 1 | Logic Input 2
Stop                  | 0             | 0
Clockwise             | 0             | 1
Counter-Clockwise     | 1             | 0
Short-Circuit         | 1             | 1

Fig. 8: DC metal-geared motors

E. Water Dispensing Manipulator Arm

Once detected, the fire is extinguished by dispensing water. For this purpose, a simple yet extremely effective mechanism has been employed. Aluminum brackets and rods are used to implement a pan-tilt mechanism. The manipulator arm shown in Figure 10 has two rotational degrees of freedom (DOF). A 13 kg·cm torque RC servo motor is used for the pan rotation, whereas a 10 kg·cm torque servo motor is used for the tilt rotation of the manipulator arm. A flame sensor is also attached at the tip of the shower head. Hence, by scanning the workspace, the mechanism helps the robot detect the exact origin of the flame.
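The two-input coding of each H-bridge channel can be captured in a small lookup; a sketch assuming the conventional L298 convention where each input drives one side of the bridge (the exact pin mapping depends on the wiring):

```python
# Conventional two-input H-bridge coding for one motor channel
# (a sketch; pin assignment and polarity depend on the wiring).
DRIVE = {
    "stop":              (0, 0),
    "clockwise":         (0, 1),
    "counter-clockwise": (1, 0),
    "short-circuit":     (1, 1),  # brake state; avoid holding it
}

def motor_inputs(direction):
    """Return (logic input 1, logic input 2) for one L298 channel."""
    return DRIVE[direction]

assert motor_inputs("clockwise") == (0, 1)
```

For speed control, the '1' in the selected pair is replaced by a PWM waveform, as the text notes.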
F. Water Pump Actuator

To extinguish the fire, a reservoir carrying 1200 ml of water is placed inside the body of the robot. The water level inside the reservoir is monitored constantly, and the user is alerted if the water level recedes below a certain threshold. LEDs of different colors are installed on the outside of the robot as a visual aid for the user, so that the water can be replenished if deemed necessary. Once the fire source is detected, the controller actuates a 12 V water pump via an optically isolated electromechanical relay. This pump transmits the water supply directly to the shower attached at the end-effector of the manipulator arm. While the water is being dispensed, the flame sensor keeps monitoring the infrared radiation emitted by the flame. In this way, the shower keeps dispensing water until the flame is fully extinguished.

G. Robot Structure

The body of the robot is made from Alucobond sheet, since it is lightweight. The flame sensor arrays are installed at the periphery of the robot, and the optical range sensors are installed near the base of the robot around the outer contour. The dimensions of the robot are 35 cm x 35 cm x 50 cm. For flexible locomotion, "omni-wheels" are used, as shown in Figures 11 and 12. In the four-wheel design, the wheels are attached at 90° to each other: one set of two wheels is parallel, and this set is perpendicular to the other two wheels. At any point, there can be only two driving wheels and two free wheels, which makes the two driving wheels completely efficient. This steering methodology provides simplified calculations and greater stability, controllability and maneuverability for the robot. Each of the four wheels is powered by a gear motor, which is controlled through an H-bridge circuit, as discussed above. The overall structure of the robot is shown in Figure 13.

Fig. 9: H-Bridge motor driver circuit

Fig. 10: (a) Shower arm schematic, (b) Shower arm hardware implementation

Fig. 11: Bottom view of the robot

Fig. 12: Robot moving mechanism

Fig. 13: Complete robot structure
IV. SOFTWARE DESIGN
The sequence of steps used to process the sensor data is essential to the robot's expected functionality. In the proposed system, forty (40) analog sensor channels are needed for recording the incoming analog sensor values in the controller, whereas the ATMEGA 2560 offers only sixteen analog channels. To resolve this issue, five analog multiplexers (CD4051) are used. Each multiplexer has 8 analog input channels, so three digital select bits are needed to sequentially select these channels. When the select bits are 000, each of the 5 multiplexers simultaneously passes the data present on its first channel to the controller. Similarly, when the select bits are 001, each of them feeds the data on its next input channel to the controller. In this way, by changing the select bits from 000 to 111, 40 analog sensor readings can be sent sequentially. Since the time interval between select-bit transitions is very small, the process of sensor data acquisition runs very smoothly.
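The select-bit sweep described above can be sketched as a nested scan; a minimal illustration in which `read_adc` is a hypothetical helper standing in for one ADC conversion on the pin of a given multiplexer:

```python
def scan_channels(read_adc, n_mux=5, n_channels=8):
    """Sweep the three CD4051 select bits from 000 to 111. At each
    setting, every multiplexer forwards its same-numbered input, so
    n_mux ADC reads per setting cover n_mux * n_channels sensors.
    read_adc(mux, select) is a stand-in for one hardware ADC read."""
    readings = []
    for select in range(n_channels):   # select bits 000 .. 111
        for mux in range(n_mux):       # one ADC pin per multiplexer
            readings.append(read_adc(mux, select))
    return readings

# With 5 multiplexers of 8 channels each, one sweep yields 40 readings.
assert len(scan_channels(lambda m, s: 0)) == 40
```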
Figure 14 shows the flow chart of the software routine. Upon initialization, the flame sensor data is given to the ATMEGA 2560, on the basis of which it turns on the gear motor(s) and selects a path, while continuously reading values from the IR sensors. When the final destination is reached, the microcontroller turns off the gear motor(s) and turns on the servo motor(s) to spray water until the fire is extinguished.
A. Sensor-data acquisition and conditioning
Conditioning the sensor data is vital for the optimal performance of the robot. The sensor data is received and stored in the microcontroller's memory. Since there are many disturbances in the environment that could induce errors in the result, the incoming signal needs to be filtered and averaged in order to improve the integrity and quality of the sensor data. The processed sensor data is then compared with pre-defined thresholds and reference values, allowing the robot to plan the required actions and generate appropriate actuation commands.
Inputs: 10 values from each sensor via the multiplexer, at an interval of 0.1 ms
Output: filtered and stabilized reading from each sensor

1. START
2. declare and initialize variables n = 0, selection.bit = 000
3. declare 2d-array sensor[5][10] to store 10 readings from each of the 5 sensors
4. declare 1d-array mean[5] to store the filtered and stabilized reading from each sensor
5. while (selection.bit != 111)
       delay of 0.2 ms
       read 10 values for sensor[n] at a sampling interval of 0.1 ms
       sort the 10 values of sensor n in ascending order
       eliminate the extreme readings (the first two and the last two values) from the sorted array
       mean[n] = average of the middle 6 readings
       increment n
       increment selection.bit
6. STOP
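The trimmed-mean conditioning step above can be sketched as a small function; a minimal illustration (sample values are invented for the example):

```python
def filtered_reading(samples):
    """Trimmed-mean conditioning from the routine above: sort the ten
    raw samples, discard the two smallest and two largest, and average
    the middle six to suppress impulse noise."""
    assert len(samples) == 10
    middle = sorted(samples)[2:8]   # drop 2 lowest and 2 highest
    return sum(middle) / 6

# A full-scale spike (1023) and a dropout (0) are both rejected.
assert filtered_reading([500] * 8 + [0, 1023]) == 500.0
```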
Once the data from each sensor is acquired, the next step is to condition it by taking the median and mean of the sample data in order to filter out any noise in the readings. After signal conditioning is complete, the received data is compared with the pre-stored reference values. This helps the system perceive its environment and accordingly generate the actuation commands for motor control. The flame sensor and the optical ranging sensor both operate on a negative-coefficient principle: the larger the flame intensity, or the greater the distance from the obstacle, the smaller the analog voltage output of the flame sensor and the range sensor respectively, and vice versa. There are 24 flame sensors constantly receiving IR radiation from any fire source; similarly, there are 12 ranging sensors. The following routine helps the system identify the flame sensor that is receiving the highest flame intensity. This technique marks the position and direction of the flame with respect to the robot's current location. Using this information, the robot simply moves towards its target (the flame) while avoiding the obstacles. The same routine is also used in conjunction with the range sensors to find obstacles in the close proximity of the robot.

Fig. 14: Software block diagram
C. Path Planning Algorithm

The Bug algorithms are among the simplest yet most effective path planning algorithms. They ensure a collision-free path for the robot's traversal in the presence of unknown obstacles. The main task of the algorithm is to move towards the goal; however, on encountering an obstacle, the robot switches its behavior from motion-to-goal to boundary-following. In this mode it simply bypasses the obstacle by generating a trajectory along the contour of the two-dimensional surface, such that the new path leads directly to the goal. An improved member of the Bug algorithm family is the Dist-Bug algorithm. It uses dedicated data structures to store the hit and leave points, together with some useful information about the followed path [12, 18]. This algorithm helps the robot compute and traverse the shortest yet safest path towards the goal, allowing it to reach the goal in comparatively less time. As discussed above, when the robot encounters an obstacle in its path, it bypasses the obstacle by simply following its two-dimensional boundary. The robot simultaneously computes and continuously compares the stored distances to the destination from its current and next positions. This comparison helps the robot deduce the point, known as the 'leaving point', where it switches its behavior from obstacle-avoidance back to motion-to-goal. This point is selected on the condition that the distance of the destination from the next position is greater than the corresponding distance from the current position. If this condition is not satisfied, the robot continues its obstacle-avoidance behavior. A drawback of this algorithm is that the minimum dimensions of the obstacles in the environment must be known to the robot before it starts navigating the environment. The algorithm is explained in detail below.
Inputs: analog values from the 24 flame sensors
Output: array index of the sensor with the minimum analog voltage value

1. START
2. declare and initialize variable n = 0
3. declare array flame[24] to store analog readings from each of the 24 sensors
4. while (n < 24)
       read value and store in flame[n]
       increment n
5. find the minimum value in the array flame[]
6. index = index of the array element with the minimum value
7. if (flame[index] > threshold reference value)
       go to Step 2 again
8. else
       run the next routine for robot movement
9. STOP
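A minimal sketch of this routine in Python (the readings and the threshold value are illustrative; on the robot the values come from the multiplexed ADC reads):

```python
def strongest_flame_index(flame, threshold=900):
    """Find the sensor seeing the strongest flame. The sensors have a
    negative coefficient: a stronger flame gives a SMALLER analog
    value, so the minimum entry marks the flame direction. Returns
    None when even the minimum is above the no-fire threshold."""
    index = min(range(len(flame)), key=lambda n: flame[n])
    if flame[index] > threshold:
        return None          # no flame strong enough; keep scanning
    return index

# Sensor 7 reads the lowest voltage, i.e. the strongest flame.
readings = [1000] * 24
readings[7] = 120
assert strongest_flame_index(readings) == 7
```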
B. Shower Arm Manipulation Algorithm

There are two joints in the shower arm, each driven by a servo motor. The larger motor is responsible for the pan motion of the arm and can rotate from 0° to 180°. The other servo motor controls the tilt motion; its motion is restricted to 0° to 45° in the direction of the fire. The directions of rotation are illustrated in Figure 12. The pan and tilt rotations are denoted as θ1 and θ2. The algorithm used for scanning is as follows:

1. Initialize θ1 = 0°, θ2 = 0°.
2. Increment θ2 by 5°.
3. Check if fire is detected or θ2 has reached 45°.
   - If fire is detected, then STOP this process.
   - If θ2 has reached 45° but fire is not detected, then increment θ1 by 5° once.
   - If neither, then go to step 2 again.
4. Decrement θ2 by 5°.
5. Check if fire is detected or θ2 has reached 0°.
   - If fire is detected, then STOP this process.
   - If θ2 has reached 0° but fire is not detected, then increment θ1 by 5° once.
   - If neither, then go to step 4 again.
6. Check if fire is detected or θ1 has reached 180°.
   - If fire is detected, then move the robot in the direction of the fire for some distance and extinguish the source of the fire.
   - If θ1 has reached 180° but fire is not detected, then go to step 1.
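The scan pattern produced by these steps can be sketched as a generator of (θ1, θ2) angle pairs; a minimal illustration of the sweep geometry (function name and step defaults are assumptions taken from the 5° increments above):

```python
def scan_positions(pan_step=5, tilt_step=5, pan_max=180, tilt_max=45):
    """Yield (theta1, theta2) pan-tilt angles following the scanning
    routine above: sweep the tilt servo up to 45 degrees, advance the
    pan servo by 5 degrees, sweep the tilt back down, advance the pan
    again, and repeat until the pan axis reaches 180 degrees."""
    theta1 = 0
    while theta1 <= pan_max:
        # tilt up-sweep (steps 2-3), then advance pan
        for theta2 in range(0, tilt_max + 1, tilt_step):
            yield theta1, theta2
        theta1 += pan_step
        if theta1 > pan_max:
            break
        # tilt down-sweep (steps 4-6), then advance pan
        for theta2 in range(tilt_max, -1, -tilt_step):
            yield theta1, theta2
        theta1 += pan_step

positions = list(scan_positions())
assert positions[0] == (0, 0)       # scan starts at the origin
assert positions[9] == (0, 45)      # first up-sweep ends at 45 degrees
```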
1. Initialize i = 0 and the thickness (Th) of the grown obstacles.
2. Increment i and move toward the target until one of the following occurs:
   - The target is reached. Stop.
   - An obstacle is reached. Denote this point Hi. Go to step 3.
3. Turn left and follow the obstacle boundary while continuously updating the minimum value of d(x, T); denote this value dmin(t). Keep doing this until one of the following occurs:
   - The target is visible: d(x, T) – F < 0. Denote this point Li. Go to step 2.
   - The range-based leaving condition holds: d(x, T) – F < dmin(t) – Th. Denote this point Li. Go to step 2.
   - The robot completes a loop and reaches Hi. The target is unreachable. Stop.

The proposed robot performs its path planning tasks using the Dist-Bug algorithm. Obstacle detection is implemented with the aid of the optical ranging sensors; the layout of these sensors on the proposed robot is shown in Figure 14. The goal position is defined and updated constantly by the array of flame sensors installed on the contour of the robot body.

V. RESULTS

In order to test the above-mentioned algorithm, four test cases were developed. Each pattern is represented pictorially here.

A. Path Pattern 1

In pattern 1, shown in Figure 15(a), the target is placed inside a U-shaped wall. The robot is placed outside the U-shaped wall, exactly behind the flame. As the robot detects the flame, it starts tracking it and traverses along the wall to reach the target location. The robot uses the Dist-Bug algorithm, as explained above, plans the safest path and then follows it until it reaches the flame. Picture frames 1 to 9 of Figure 15(b) show the actual test results.

B. Path Pattern 2

In pattern 2, shown in Figure 16(a), the robot correctly chooses the collision-free and shortest possible path towards the destination. The algorithm efficiently computes this path by keeping track of the target location. The omni-directional mechanism helps the robot get to the target easily. The results are shown in the frames of Figure 16(b).

C. Path Pattern 3

Pattern 3, shown in Figure 17(a), is slightly more complicated than its predecessors. The workspace here contains two compartments. While adopting the shortest path towards the target, the robot enters the first compartment. Since there is no fire in this compartment, it moves ahead, simply following the direction of the signals while recognizing the obstacle ahead. The robot intelligently clears itself out of that vicinity and eventually reaches the exact location of the target. Details are shown in the frames of Figure 17(b).

D. Path Pattern 4

The test case shown in Figure 18(a) is slightly different from pattern 3. Like pattern 3, there are two compartments in pattern 4; however, unlike pattern 3, the separation between the two compartments is triangular, which gives the robot a unique challenge. The robot finds the shortest path and follows it, as shown in Figure 18(b). But when it gets to the middle, it realizes that the chosen path is not the safest one. Hence, after colliding with one corner of the obstacle, it pulls itself out, re-evaluates and re-computes a new path, and then eventually reaches the exact target without a collision.

Fig. 15: (a) Pattern-1 for the robot, (b) Path traversal of the robot on pattern 1

Fig. 16: (a) Pattern-2 for the robot, (b) Path traversal of the robot on pattern 2

Fig. 17: (a) Pattern-3 for the robot, (b) Path traversal of the robot on pattern 3

Fig. 18: (a) Pattern-4 for the robot, (b) Path traversal of the robot on pattern 4

VI. CONCLUSIONS

The proposed design and research on the implementation of a fire-fighting robot represents a capable platform for handling different scenarios. This is validated by rigorous experimentation: the robot was tested in different scenarios and the results are very promising. Water is used as the fire extinguisher since it is readily available. However, many enhancements can be made to improve the performance of the robot. Instead of water, foam, powdered dry chemicals, carbon dioxide and other fire-extinguishing materials can be used; since there can be different types of fire sources, the extinguishing material can be chosen accordingly. Furthermore, the sensing technique can also be improved. The infrared-based sensors are susceptible to interference from sources emitting visible light. Hence, instead of IR-radiation-based flame detectors, ultraviolet-radiation-based flame detectors can be used; the UV-TRON flame sensor is a good example. It is expensive, but it is perfectly suited for the job. Moreover, a fire-proof sheet can be used to completely cover the robot without affecting its performance.
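The range-based leaving condition used by the Dist-Bug path planner can be sketched as a single predicate; a minimal illustration with invented distances (d(x, T) is the current goal distance, F the sensor range, dmin(t) the best goal distance recorded so far, Th the obstacle thickness):

```python
def should_leave(d_current, d_min, sensor_range, thickness):
    """Dist-Bug leaving test during boundary-following: leave the
    obstacle when the target is visible (d(x,T) - F < 0) or when free
    space guarantees progress (d(x,T) - F < dmin(t) - Th)."""
    return (d_current - sensor_range < 0) or \
           (d_current - sensor_range < d_min - thickness)

# Goal 3 m away, sensors reach 1 m, best recorded distance 2.5 m,
# obstacle thickness 0.3 m: 3 - 1 = 2.0 < 2.5 - 0.3 = 2.2, so leave.
assert should_leave(3.0, 2.5, 1.0, 0.3) is True
# With a best recorded distance of only 2.0 m, keep wall-following.
assert should_leave(3.0, 2.0, 1.0, 0.3) is False
```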
REFERENCES

[1] D. J. Ahlgren, "Fire-fighting robots and first-year engineering design: Trinity College experience," 31st Annual Frontiers in Education Conference, Oct. 2001.
[2] D. J. Pack, "Fire-fighting mobile robotics and interdisciplinary design-comparative perspectives," IEEE Transactions on Education, 47(3), pp. 369-376, Aug. 2004.
[3] I. M. Verner, "Education Design Experiments in Robotics," World Automation Congress (WAC '06), Budapest, Jul. 2006.
[4] K. Altaf, "Design and Construction of an Autonomous Fire Fighting Robot," International Conference on Information and Emerging Technologies, Jul. 2007.
[5] W. Zhang, "Development of a New Remote Controlled Emergency-Handling and Fire-Fighting Robot," WRI World Congress on Computer Science and Information Engineering, Mar.-Apr. 2009.
[6] A. De Santis, "Fuzzy Trajectory Planning and Redundancy Resolution for a Fire Fighting Robot Operating in Tunnels," Proc. IEEE International Conference on Robotics and Automation, 2005.
[7] E. Martinson, "Fighting fires with human robot teams," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 2012.
[8] T. Feng, "An ultrasonic obstacle avoidance system for firefighting robot," Proc. 4th World Congress on Intelligent Control and Automation, Vol. 2, 2002.
[9] K. L. Su, "Automatic Fire Detection System Using Adaptive Fusion Algorithm for Fire Fighting Robot," IEEE International Conference on Systems, Man and Cybernetics, Vol. 2, 2006.
[10] C. F. Tan, "Fire Fighting Mobile Robot: State of the Art and Recent Development," Australian Journal of Basic and Applied Sciences, 7(10), pp. 220-230, Aug. 2013. Available: http://www.idemployee.id.tue.nl/g.w.m.rauterberg/publications/AJBAS2013journala.pdf
[11] Elecrow, "Flame detector," [Online]. Available: http://www.elecrow.com/5channel-flamedetector-module-p-648.html
[12] Wikipedia, "Flame detection," [Online]. Available: http://en.wikipedia.org/wiki/Flame_detection
[13] Sparkfun, "GP2D120XJ00F," [Online]. Available: https://www.sparkfun.com/datasheets/Sensors/Infrared/GP2D120XJ00F_SS.pdf
[14] R. Siegwart and I. R. Nourbakhsh, Introduction to Autonomous Mobile Robots, Prentice Hall of India, 2005. Available: http://home.deib.polimi.it/gini/robot/docs/siegwart.pdf
[15] SHARP, "GP2Y0A21YK0F," [Online]. Available: http://www.sharpsma.com/webfm_send/1489
[16] Arduino, "CD4051," [Online]. Available: http://playground.arduino.cc/learning/4051
[17] H. Choset, K. M. Lynch, S. Hutchinson, G. A. Kantor, W. Burgard, L. E. Kavraki and S. Thrun, "Bug Algorithms," in Principles of Robot Motion: Theory, Algorithms, and Implementation, The MIT Press, 2005, pp. 17-38.
ABOUT THE AUTHORS
Omer Saleem Bhatti received his B.S. and
M.S. degrees in Electrical Engineering from the
University of Engineering and Technology,
Lahore, Pakistan, with specialization in control
systems. He is currently serving as Assistant
Professor at the Department of Electrical
Engineering, National University of Computer
and Emerging Sciences, Lahore, Pakistan. He
has advised twenty-five undergraduate final-year
design projects in the last four years. Most of his
projects have won prizes in national level
engineering project competitions. Some of his
projects have evolved into entrepreneurial start-ups as well. He mainly teaches instrumentation
and measurements, feedback control systems
and electronic circuit design. He also serves as
the Head of a research group, namely
ConSenSys (Control, Sensing and Systems).
This group focuses on the research and
development in the areas of embedded robotics
and control, distributed and networked control,
control of under-actuated electromechanical
systems and field-assistive wearable devices.
The group also conducts hands-on robotics
workshops for students of computer science and
electrical engineering.
Faisal Abbas received his B.S. in Electrical
Engineering from the National University of
Computer and Emerging Sciences, Lahore,
Pakistan. He is currently serving as Lab
Instructor at the department. His final-year
project won three titles in national-level
engineering project competitions.
His research interests are in the areas of
embedded systems, electronics and industrial
process control. He also serves as the lead
research engineer at the ConSenSys (Control,
Sensing and Systems) Group at the university.