Robots
Department of Electrical & Computer Engineering
August 2, 2006
Collision Avoidance Robotic Infrared Tracker (C.A.R.I.T.)
Team C.A.R.I.T.:
Jason Christensen, Mahmoud Azab, Todd Rosemurgy, Sumit Khatri
Robots can be very useful for completing assignments which humans may find difficult or
unpleasant. Small autonomous robots can be
used to reach places not accessible to humans
or larger robots, such as tiny holes in walls or
through the rubble of a collapsed building. A
small robot becomes much more useful when
equipped with sensors which allow it to interact
with its environment. A sensor-equipped robot
can be used to locate gas leaks, create maps, or
locate survivors. Small robots are not without
limitations. They cannot carry the amount of
hardware or travel over the same difficult terrain
which their larger counterparts can.
Our goal was to build a small autonomous robot
that can perceive its environment, react to unforeseen circumstances and re-plan dynamically
in order to achieve its mission. We addressed
the need for small, useful autonomous robots
with our solution, the Collision Avoidance Robotic Infrared Tracker (C.A.R.I.T.), which navigates
around obstacles while trying to reach an infrared
transmitter. It interacts with its environment through
the use of sensors for input and motors for output. To home in on the intended destination
(the infrared transmitter), a set of infrared detectors
and an integrated digital compass direct the
robot toward the transmitter. To avoid collisions with obstacles, a set of ultrasonic sensors
bounces sound waves off stationary objects. An
analog infrared photodiode produces a voltage
that determines the distance to the destination.
The microcontroller uses these signals to determine the appropriate motor control signals. The
motors are driven by a dual H-bridge circuit,
which sets motor direction (forward or reverse)
and speed. Finally, all decision-making by the microcontroller is governed by the programming
algorithm.
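The decision-making described above — IR detectors and the compass steering toward the transmitter, with the ultrasonic reading able to override for obstacle avoidance — might be sketched as follows. All thresholds, function names, and the motor-command encoding are illustrative assumptions, not the team's actual values:

```python
def decide(ir_left, ir_right, sonar_cm, heading_deg, target_deg):
    """Return a (left_motor, right_motor) command from sensor readings.

    ir_left/ir_right: booleans from the two IR detectors.
    sonar_cm: ultrasonic range; heading/target in compass degrees.
    Thresholds here are hypothetical, chosen only for illustration.
    """
    OBSTACLE_CM = 30
    if sonar_cm < OBSTACLE_CM:            # obstacle override: pivot away
        return ("reverse", "forward")
    if ir_left and not ir_right:          # transmitter seen to the left
        return ("slow", "forward")
    if ir_right and not ir_left:          # transmitter seen to the right
        return ("forward", "slow")
    # No IR fix: fall back on the compass, steering toward the last
    # known bearing; wrap the error into the range (-180, 180].
    error = (target_deg - heading_deg + 180) % 360 - 180
    if error > 10:
        return ("forward", "slow")
    if error < -10:
        return ("slow", "forward")
    return ("forward", "forward")
```

The obstacle check comes first so that collision avoidance always preempts tracking, matching the priority implied by the write-up.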
Wireless Home Monitoring
Team CODS:
Vaibhav Sarihan, Syed Mujtaba Ali, Kamran Khan, Brian Butler
After looking at the current market, we realized
that there are a lot of different sensors for carbon
monoxide, temperature, and humidity, but very few
that integrate all of them, and even fewer that actually
form a sensor network. Our smart home system is
designed to integrate multiple sensors that can be
placed at different nodes and communicate wirelessly.
Such a smart home system can be used extensively
throughout a home or a business, and can be custom
designed to fit the customer's needs.

To demonstrate our sensor network, we have designed
a base station and two nodes. The nodes have
a CO sensor (a temperature sensor is used for practical
and demonstration purposes) and a humidity sensor.
They also have an exhaust fan which increases speed
with increasing levels of carbon monoxide. In addition,
10 LEDs are placed on each node that light up one at
a time with increasing levels of CO. The humidity and
CO sensors each send a signal to the PIC in the node,
which converts the frequency and analog signals to
digital values and sends them to the base station
through the RF module.

The base station consists of an RF module, an LCD
screen, a sound alarm, a phone dialer, and two
buttons. One button controls the display on the LCD
screen, deciding which node to display; the other
button resets the entire system. Any time the base
station receives a CO level of over 50%, the sound
alarm is triggered. Once the alarm is triggered, the
user has 30 seconds to turn it off. If the alarm is not
turned off within that time, the phone dialer is
activated, which begins to place calls to pre-stored
numbers and plays a pre-recorded message. Once the
system goes into phone dialer mode, the phone dialer
alarm is triggered and the system freezes for 3 minutes
to allow the phone dialer to place its calls.

The base station receives the values from the nodes
approximately once every second, giving close to
instantaneous readings for the users.
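The base station's alarm sequence (50% CO threshold, 30-second silence window, 3-minute dialer freeze) can be sketched as a small state machine. The state names and polling structure here are illustrative assumptions, not the team's firmware:

```python
CO_ALARM_PCT = 50       # alarm threshold from the write-up
SILENCE_WINDOW_S = 30   # user has 30 s to silence the alarm
DIALER_FREEZE_S = 180   # system freezes 3 min while calls are placed

def base_station_step(state, co_pct, reset_pressed, now):
    """One polling step of a hypothetical base-station alarm loop.

    state is a (name, entered_at) pair; states cycle
    'idle' -> 'alarm' -> 'dialing' -> 'idle'.
    """
    name, since = state
    if name == "idle":
        if co_pct > CO_ALARM_PCT:
            return ("alarm", now)          # sound the alarm
    elif name == "alarm":
        if reset_pressed:
            return ("idle", now)           # silenced in time
        if now - since >= SILENCE_WINDOW_S:
            return ("dialing", now)        # activate the phone dialer
    elif name == "dialing":
        if now - since >= DIALER_FREEZE_S:
            return ("idle", now)           # calls placed; resume monitoring
    return (name, since)                   # otherwise hold the current state
```

During the `dialing` state no input changes the state until the freeze expires, mirroring the 3-minute lockout described above.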
Travel Guitar Kit
The Guitar Group:
Brandon Loudenburg, Chris Leclair, Edward King, John Green
For anyone who has ever
been to a rock concert, it
becomes quite obvious that
musicians carry slightly more
equipment than just a guitar
or drum set. This is especially
true in the case of a guitarist.
Guitarists carry a multitude of
effects pedals and other devices to distort the sound of their
guitars to that precise timbre
that they are attempting to
create for each and every song. They must also carry
amps and other equipment simply so the sound may
be heard.
The problem with this lies in the fact that they are
carrying more equipment than is necessary and also
spending more money than may be necessary, as
each effect typically requires its own pedal and
therefore its own purchase. The Guitar Group has
designed a product that may be able to remedy that
problem for the traveling musician. We have designed
a DSP kit that is programmed to process audio signals
from a guitar, add various effects (that can be
chosen on the fly), and finally output the signal
over the FM band so that the musician can simply
use any local FM radio to output their riffs.
To achieve this we had to make use of a fast DSP in
order to process the input signal without any noticeable delay between playing a string and hearing the
modified output; we chose a TI TMS320C6713 DSK.
The design works as follows: first, the signal is carried from the electric guitar through a typical ¼”
audio cable to the travel kit. The signal then passes
through a switch where it is carried either to a
digital tuner or to the DSP board itself. If the signal
is sent to the DSP board, it is converted from the
analog to the digital realm using an ADC (an AIC23
codec) so that the DSP can process the information.
The DSP runs algorithms based on the value of the
DIP switches located on the board (each effect has
been assigned a value, and the 4 switches work in a
binary fashion to select the desired effect). The
modified output is then sent through a DAC and
carried to the input of the FM transmitter. The FM
transmitter uses a Colpitts oscillator to oscillate at
a frequency of about 100 MHz (this can be adjusted
so as to use a frequency that is not used by local
radio stations). The analog audio is carried in the
radio wave and can be heard on any commercial FM
radio, thereby eliminating the need for an amp.
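The four DIP switches encode the active effect as a 4-bit binary number. A sketch of that selection, plus one illustrative effect (a simple delay/echo — the write-up does not list which effects the team implemented):

```python
def switch_value(sw3, sw2, sw1, sw0):
    """Read four DIP switches (booleans, MSB first) as a binary
    effect number in the range 0-15."""
    return (sw3 << 3) | (sw2 << 2) | (sw1 << 1) | sw0

def delay_effect(samples, delay, gain=0.5):
    """Echo: add an attenuated copy of the signal `delay` samples back.
    A classic DSP effect, used here purely as an example."""
    out = list(samples)
    for i in range(delay, len(out)):
        out[i] += gain * samples[i - delay]
    return out
```

In the real kit this selection and processing run on the C6713 at the audio sample rate; the Python version only shows the arithmetic.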
The Home Automation Control System (HACS)
Team HACS:
Keenan I. Nichols, Jeremy Stack, Joe Hasty, Nikhil Pillai
Our team came up with the idea of implementing a system wherein household items
could be activated using voice commands.
There is a market for a system that would
enable a consumer to control the activation
of different appliances remotely. One important application could be for disabled people:
a single system enabling them to control a
wide range of appliances in the house, without having to do it manually, could be
invaluable.
Our project is essentially comprised of two satellites and one computer acting as a server. Each
satellite contains a microphone, an FM transmitter,
an RF receiver and two AC plug points leading to
the devices to be controlled. The server consists
of two FM receivers plugged into the audio input
of the computer, a Java GUI and voice interpreter,
and an RF transmitter attached to the computer
via a serial port.

The microphones on the satellites pick up spoken
commands and send them directly to the FM
transmitter placed inside the satellite. For the
purpose of this project, we have set up the satellite
FM transmitters to transmit at frequencies of 88.1
MHz and 88.7 MHz. The FM signals are picked
up by two radios, each tuned to the necessary
frequency. The signals are then sent directly
to the computer, where they are decoded and
interpreted by the voice recognition software.
The Java interface “listens” to the voice commands,
checks the commands for validity and sends the
appropriate signal to the RF transmitter telling it
which of the 4 devices to switch on/off. This enables
a user to turn a light attached to a satellite on or
off by simply saying “system lights” in the vicinity
of either of the two microphones. The Java GUI
also enables manual control of the devices as well
as some general configuration options.

The RF transmitter is able to differentiate between
multiple satellites, up to 255, thus allowing for over
500 controllable devices. Future advancements could
include automatic timers while the home owner is
on vacation, and remote web-based configuration
and activation.

[Block diagram — Home Automation Control System, Joe Hasty, Keenan Nichols, Nikhil Pillai, Jeremy Stack: Mic → FM Transmitter → FM Receiver → x86 Computer (Java Interface) → RF Transmitter → RF Receiver → Relay → 120V Device (Lamp); example command: “System, lamp.”]
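With up to 255 addressable satellites and two plug points each, the server must tell the RF transmitter which device to switch. A sketch of how such a command might be packed — the framing is entirely hypothetical, since the write-up does not describe the actual RF protocol:

```python
def encode_command(satellite_id, plug, turn_on):
    """Pack a hypothetical two-byte command frame.

    satellite_id: 1-255 (255 satellites x 2 plugs = 510 devices,
    matching the 'over 500' figure in the write-up).
    plug: 0 or 1; turn_on: True/False.
    """
    if not 1 <= satellite_id <= 255:
        raise ValueError("satellite_id out of range")
    if plug not in (0, 1):
        raise ValueError("plug must be 0 or 1")
    # byte 0: satellite address; byte 1: plug select bit + on/off bit
    return bytes([satellite_id, (plug << 1) | int(turn_on)])
```

The validated voice command would be mapped to such a frame and written out the serial port to the RF transmitter.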
Positioning System for Robotic Systems
Team Robo-Nav:
Jeff Svatek, Timothy Graham, Nicholaus Kee, Alison Shanley
As a group, our goal was to design a positioning
system for use with robotic systems. Our robot
can determine its position and navigate to a
desired location. The purpose of designing such a
robot is to prove that it is possible to have such a
vehicle plot a course to a desired location autonomously while starting anywhere inside a triangle
of beacons (trilateration). Our experiment is to
prove that an algorithm can be properly embedded in a robot, giving it the ability to navigate a
flat surface accordingly.

Using an infrared and ultrasonic signal scheme,
we are able to implement the trilateration
algorithm. Our research initially implied that we
could execute this technique in a two-dimensional
environment approximately 5 meters by 5 meters.
This was accomplished by sending both an IR and
an ultrasonic signal to the robot from the beacons,
each in rapid succession. The IR signal is used as a
timestamp, and the difference in delay of the
ultrasonic signal provides the distance information.
The speed of light is constant, and the speed of
sound can be assumed to be constant, though
humidity and other factors cause slight variation;
for our small distances, however, these variations
are negligible. The data is processed initially with
a PIC. This digital data is run through a digital-to-analog converter using opto-isolators and a resistor
network. This analog information is then processed
by a microcontroller located within the LEGO RCX
brick, which translates the raw analog value into an
integral distance. Using three distances (one from
each beacon), the microcontroller derives the robot's
coordinates relative to a predetermined set of axes.
This location information can be applied to a myriad
of algorithms to invoke robot movement in the
proper direction.

Sensing: Beacons transmit IR and ultrasonic signals. The robot receives these signals and sends the data to the microcontroller.
Computation: Using the time stamps from the ultrasound, the microcontroller computes location using the standard formula for trilateration.
Actuation: The robot interprets its relative location and moves in the direction it is programmed to go.
Outside World: The robot can maneuver a flat 2D surface up to 5 m without exceeding the range of the transmitting beacons.

The Principle of Trilateration: Standing at B, you want to know your location relative to the reference points P1, P2, and P3. Measuring r1 narrows your position down to a circle. Next, measuring r2 narrows it down to two points, A and B. A third measurement, r3, gives your coordinates at B. A fourth measurement could also be made to reduce error.
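The "standard formula for trilateration" solves for (x, y) from three beacon positions and measured ranges by subtracting pairs of circle equations, which turns the problem into a 2x2 linear system. A sketch of that computation, along with the IR-timestamp range measurement (beacon coordinates in the test are illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed constant per the write-up

def range_from_delay(dt_s):
    """The IR flash arrives effectively instantly, so the ultrasonic
    lag dt_s (seconds) gives the beacon range in meters."""
    return SPEED_OF_SOUND * dt_s

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three beacon positions and ranges.

    Subtracting the circle equation at p1 from those at p2 and p3
    cancels the x^2 + y^2 terms, leaving two linear equations.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1        # zero if the beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

Because the linear system has a unique solution for non-collinear beacons, three ranges suffice; a fourth would only serve to average out measurement error, as the caption notes.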
Smart Autonomous Robot Team
Team SART:
Morgan Hinchcliffe, Dan Maicach, Boone Staples, Steve Torku
As technology progresses and computing power
continues to grow, still roughly following Moore's
law, the need for, and potential of, artificial
intelligence increases. In particular, robotics stands
to gain immensely from these processing power
increases, as more complex algorithms can be run
faster.

SART tried to develop on this idea of artificial
intelligence with an autonomous robot. Having no
previous knowledge of its surroundings, the robot
can navigate around obstacles to a beacon. SART
decided that the quickest, easiest, and most efficient
way was to use a car-like vehicle. The Vex Robotics
Starter Kit accomplished this with several basic parts,
including a chassis and two motors – one controlling
each side of the “car”.

In order to be autonomous, the robot needed
sensory input to understand its surroundings. An
obstacle detector and a directional sensor system
were necessary. An ultrasonic sensor, the MaxSonar
EZ1, was chosen as the obstacle detector due to its
low power intake and long sensing range. The target
beacon was designated as a 1 kHz pulse, and SART
decided to use a system of three microphones to
follow this target. In case the ultrasonic sensor fails,
a bumper sensor was mounted on the front to help
the robot get out of tough situations.

All the sensory information was processed using an
FPGA made by National Instruments. The NI 7831R
RIO board includes onboard DACs as well as multiple
analog and digital inputs and outputs. The instincts
of the robot's “brain” were programmed using
LabVIEW, which directly interfaced with the FPGA.
It was programmed in such a way that the audio
system would detect the beacon and the robot would
head in that direction, while being able to avoid
obstacles and re-detect the target beacon to continue
in that direction.
Project: IRF
Team Tech Gurus:
Daniel Chunkapura, Gaurav Gupta, Anshul Gupta, Karan Garg
Small robots can perform
various duties that would
be impossible for human
beings or even hazardous
to them. For example, small
robots can be used to check
for toxic gases in coal mines,
check for volcanic activity,
and check for land mines,
by having the respective
sensors attached to them.
Our goal in this project
was to build such a small
robot inexpensively, to help demonstrate
how these machines could be used to perform the
above-mentioned tasks.
Our project is divided mainly into two parts. In the
first part the robot will track an infrared (IR) source
placed ten feet away. In the second part the robot
will try to track a radio-frequency (RF) source placed
the same distance away from it. For the IR part of the
demonstration, we have an IR source positioned 10
feet away from the starting point which emits IR
light modulated at 38 kHz. The robot has two IR
sensors which enable it to detect the transmitter.
When the robot does not detect any IR light, it is
programmed to move in a large circle until one of
the detectors detects the IR. When one of the sensors
detects the IR, the robot will orient itself in that
direction and move forward until it reaches the
source. Once it reaches the source, we demonstrate
the second part of the project, where the robot will
try to locate the RF source. The RF transmitter
provides a strong and reliable signal at a frequency
of 418 MHz. On the robot, there is an RF receiver
which receives this signal. The receiver reports
signal strength as an analog voltage which is fed
into the PIC. The robot moves toward the RF
transmitter by following this increasing signal
strength indication. Under ideal conditions the
signal strength varies predictably with distance, but
the RF receiver does pick up noise which may
interfere with its functioning.
Similarly, a small robot could be fitted with a
seismographic sensor to help it detect volcanic
activity, or a carbon monoxide sensor to detect toxic
gases; one could even build a robot that sacrifices
itself checking for land mines in a field.
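The two search behaviors above — circle until an IR detection, then drive toward it; and climb the RF signal-strength gradient — can be sketched as two small decision functions. Command names and the tolerance value are illustrative assumptions:

```python
def ir_step(left_detects, right_detects):
    """IR phase: circle until a 38 kHz detection, then turn toward
    whichever sensor sees the source; drive forward when both do."""
    if not (left_detects or right_detects):
        return "circle"
    if left_detects and not right_detects:
        return "turn_left"
    if right_detects and not left_detects:
        return "turn_right"
    return "forward"

def rf_step(prev_rssi, curr_rssi, tol=0.02):
    """RF phase, gradient-following: keep heading while the signal
    strengthens, otherwise turn to search for a stronger bearing.
    A small tolerance makes the logic robust to receiver noise."""
    if curr_rssi > prev_rssi + tol:
        return "forward"
    return "turn"
```

In the real robot these decisions would run on the PIC, reading the detectors and the receiver's analog voltage each loop iteration.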
Cognitive Autonomous Robot
Team Tiny Robot:
Jonathan Miller, Andrew Hunter, and Chris Holcomb
As society evolves, new technologies are
necessary to assist in even the smallest
tasks. Automation is improving the way
things are done and shedding light on new
possibilities. Our project, an autonomous
robot car, is capable of doing just that. Specifically,
the robot is a mobile target tracker with the ability
to avoid obstacles.

The robot car consists of several systems, as
outlined in the block diagram below: its brain, the
target and target locator, obstacle detection and
avoidance, and motor circuitry. The brain of the
robot, the most important component, is an NI
PCI-7831R FPGA, donated by National Instruments.
This powerful processor is capable of intricately
detailed programming, and handles all of the robot's
decision making and calculations.

The target is a beacon that emits a 1 kHz audio
signal, which is received by electret condenser
microphones on the robot and run through an
audio filter. The filtered signal is relayed to the
FPGA, which rectifies it and translates it into a
distance. The FPGA then delivers logic to the motors
to turn the car in the direction of the beacon.

On its journey to the target, the robot will encounter
obstacles. To avoid them, it utilizes two infrared
sensors that emit an analog signal and measure the
return signal as a voltage. Based on the data
interpreted from each sensor, the FPGA sends
corresponding signals to an H-bridge, which relays
commands to the motors.

While the plan looked good on paper, the
implementation was not so smooth. We ran into a
few problems, mainly dealing with powering the
FPGA. After we resolved that with the introduction
of a motherboard, we discovered that the robot was
pulling a lot of current. However, the rest of the
project was integrated successfully.

In theory, our robot can be effective in a wide
range of applications. It could be used to deliver
objects to people who call out to it. Also, if the
motors are silenced and a camera is placed on the
car, it could spy on its surroundings or act as a
portable security camera. It could even participate
in military warfare as a mobile bomb with the
addition of an explosive. With some modifications,
our autonomous robot has the potential for
numerous innovations.
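The rectify-and-translate step described above — the FPGA turns the filtered 1 kHz tone into an amplitude that stands in for distance — can be shown in software, even though the real implementation runs in FPGA hardware:

```python
def rectified_amplitude(samples):
    """Full-wave rectify a filtered 1 kHz window and average it.

    The mean rectified amplitude is a rough proxy for how loud,
    and therefore how near, the beacon is (louder = nearer).
    """
    return sum(abs(s) for s in samples) / len(samples)
```

Comparing this value across microphones tells the FPGA which way to turn; comparing it against a threshold tells it when the beacon has been reached.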
Autonomous Functioning Robot
Team WSAE:
David Anderson, John Brinkley, Randy Doolittle, Noah Haewoong Yang
Our goal was to build a robot with autonomous
functionality governed by control system theory,
which will navigate a course of obstacles, find an
acoustic target and then stop within a fixed range
of that target. Based on our desire to create a new
paradigm for autonomous robot design, we
challenged ourselves to construct a more biological
and simplistic robot which is as power efficient,
cheap and scalable as possible.

The solution we proposed is a simple self-governing
machine using a state machine (event-driven)
architecture and a few basic directives to emulate
certain search behaviors. We did our best to make a
simple autonomous machine that is scalable,
modular, and cost and power efficient. Our solution
makes use of an FPGA to implement a control
structure, ultimately using ultrasonic and acoustic
sensors to avoid obstacles, traverse a course and
locate a target. This involves sensing, actuation,
power distribution, and brain subsystems, plus a
chassis.

Two MaxSonar-EZ1 ultrasonic sensors are used to
detect obstacles from 0 to 3 meters, alternating
broadcasts every 50 milliseconds. If an obstacle is
detected within 1 meter, the robot will veer left or
right depending on the obstacle's orientation to the
ultrasonic sensors. This control is done by a
comparison between the voltage levels of the
ultrasonic sensors; depending upon which signal is
lower (nearer objects return a lower signal), the
algorithm will execute commands to turn in a
direction that avoids the obstacle.

Target tracking is done with the use of two acoustic
sensors (Kobitone electret unidirectional condenser
microphones). These sensors are utilized by
comparing the output voltage of each microphone
and executing control in order to attempt to equalize
the voltage level between the two microphones. This
signal is measured from an amplified active
band-pass filter, and then compared between the
two microphones. The robot adjusts its course to
achieve equal amplitude between the two sensors
(thus facing the target). Once the robot gets to the
target, the program uses a threshold voltage to
gauge the distance from the target, at which point
the entire robot system is terminated.

Two DC motor drives from Vex Robotics are used
to control the movements of the robot for forward,
backward and stopping capabilities. The FPGA
controls the two motors simultaneously by creating
and varying a pulse-width modulation control signal,
which drives the motors. The machine interacts with
the environment through a tank-tread system, also
purchased from Vex Robotics. The FPGA is
programmed to react to the incoming/outgoing
sensor data and utilizes the state machine logic in
order to send and receive the necessary control
signals.

Power is provided by two separate systems. The
FPGA is powered by a bank of six alkaline AA
batteries providing 3.3 and 5.0 V to power the PCI
bus. The rest of the robot relies on +/- 8 V at 2500
mAh supplied by two separate six-AA battery packs;
that voltage is used directly by the motors and
amplifier circuits, and regulated to 3.3 V for the
microphones and 5.0 V for the ultrasonic sensors
and the bumper sensor.

Our chassis is based on components obtained
through the Vex Robotics Starter Kit. This kit
contains “erector set” style metal and gear
components as well as motors. As the design of the
chassis was not the focus of this project, the use of
the Vex kit was encouraged so that the group could
focus on the sensor and actuation design and logic.
However, due to the amount of surface space
necessitated by our project, we had to fabricate
many more risers and platforms in order to hold
the batteries, the PCI bus, and a secondary power
system.

We used a National Instruments (NI) RIO FPGA
(NI PCI-7831R) for our robot brain. This product
contains onboard ADCs and DACs, which allows for
easy input/output programming through the
LabVIEW 8.0 FPGA and Real-Time modules. Since
all logic operates in hardware at loop rates of up to
40 MHz (25 ns), it is very responsive. NI donated
this technology as a platform for us to use, and we
utilize these on-board capabilities of the RIO unit.

[Block diagram (Team WSAE, 6/15/2006): the two microphones, detailed ultrasonic sensor and bumper feed amplifiers and band-pass filters into ADCs included on the FPGA; the algorithm block drives timers, DACs, motor outputs and I/O; a separate power supply feeds the system.]
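One pass of the event-driven control described above — ultrasonic comparison for avoidance, microphone equalization for tracking, and a threshold voltage for stopping — might look like this. The voltage thresholds are illustrative assumptions, not the team's tuned values:

```python
def avoid_or_track(us_left_v, us_right_v, mic_left_v, mic_right_v,
                   obstacle_v=1.0, stop_v=3.0, balance=0.1):
    """One step of a hypothetical WSAE-style state machine.

    Per the write-up, nearer obstacles return a LOWER ultrasonic
    voltage, and the robot steers to equalize the two mic levels.
    """
    if min(us_left_v, us_right_v) < obstacle_v:   # obstacle within range
        # veer away from the nearer (lower-voltage) side
        return "veer_right" if us_left_v < us_right_v else "veer_left"
    if max(mic_left_v, mic_right_v) > stop_v:     # at the target: halt
        return "stop"
    if mic_left_v - mic_right_v > balance:        # equalize mic levels
        return "turn_left"
    if mic_right_v - mic_left_v > balance:
        return "turn_right"
    return "forward"
```

Obstacle avoidance is checked first so it always preempts target tracking, matching the search behavior the directives emulate.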
Demonstration Day Winners
First Place – C.A.R.I.T.: Jason Christensen, Mahmoud Azab, Todd Rosemurgy, Sumit Khatri
Second Place – Robo-Nav: Jeff Svatek, Timothy Graham, Nicholaus Kee, Alison Shanley
Third Place – Team CODS: Vaibhav Sarihan, Syed Mujtaba Ali, Kamran Khan, Brian Butler