UNIVERSITI TEKNOLOGI MALAYSIA
PSZ 19:16 (Pind. 1/13)
DECLARATION OF THESIS / UNDERGRADUATE PROJECT PAPER
Author’s full name: MUHAMMAD ADIB ZUFAR B. RUSLI
Date of birth: 18 DECEMBER 1992
Title: ROBOTIC HAND CONTROLLED USING MYO SENSOR
Academic Session: 2014/2015
I declare that this project report is classified as:

CONFIDENTIAL   (Contains confidential information under the Official Secret Act 1972)*
RESTRICTED     (Contains restricted information as specified by the organization where research was done)*
OPEN ACCESS    I agree that my thesis be published as online open access (full text)

I acknowledge that Universiti Teknologi Malaysia reserves the right as follows:
1. The thesis is the property of Universiti Teknologi Malaysia.
2. The Library of Universiti Teknologi Malaysia has the right to make copies for the purpose of research only.
Certified by:

SIGNATURE
921218-01-5819 (NEW IC/PASSPORT)
Date: 22 JUNE 2015

SIGNATURE OF SUPERVISOR
IR. DR. AHMAD ‘ATHIF B. MOHD FAUDZI (NAME OF SUPERVISOR)
Date: 22 JUNE 2015

NOTES: * If the thesis is CONFIDENTIAL or RESTRICTED, please attach the letter from the organization stating the period and reasons for confidentiality or restriction.
“I hereby declare that I have read this final year project report and in my opinion, this final year project report is sufficient in terms of scope and quality for the award of the degree of Bachelor of Engineering (Electrical – Instrumentation and Control).”

Signature: ....................................................
Name: Ir. Dr. Ahmad ‘Athif B. Mohd Faudzi
Date: 22 June 2015
ROBOTIC HAND CONTROLLED USING MYO SENSOR
MUHAMMAD ADIB ZUFAR B. RUSLI
A final year project report submitted in partial fulfilment of the
requirements for the award of the degree of Bachelor of
Electrical Engineering (Instrumentation & Control)
Faculty of Electrical Engineering
Universiti Teknologi Malaysia
JUNE 2015
I declare that this final year project report entitled “Robotic Hand Controlled Using Myo Sensor” is the result of my own research except as cited in the references. The report has not been accepted for any degree and is not concurrently submitted in candidature for any other degree.

Signature: ....................................................
Name: Muhammad Adib Zufar B. Rusli
Date: 22 June 2015
Dedicated, in thankful appreciation for support, encouragement and understanding
to my beloved mother, father, brothers and friends.
ACKNOWLEDGEMENT
Firstly, I am very grateful to Allah for giving me the chance to live and continue my studies until the last part of my degree. I am also grateful to Allah for giving me the strength to finish my final year project on time.

Secondly, I would like to give all my appreciation to my parents, because they always gave me the spirit to continue my studies to degree level and to finish my project.

I also want to thank my beloved supervisor, Ir. Dr. Ahmad ‘Athif B. Mohd Faudzi, because he is a really caring person. He always gave me suggestions and guidance until I came out with this project.

Lastly, thanks to everyone not mentioned here who was involved directly or indirectly in my final year project. May Allah bless all of you. Thanks.
ABSTRACT
A robotic hand is a system that can imitate the human hand in multi degree-of-freedom (DOF) motion. It can be used in the medical field, for example as a prosthetic hand, or in the industrial field as an end effector. Currently, robotic hands that can imitate human gestures based on human muscle activity are limited in number, and potentiometers are often used to detect hand movement instead of the Myo sensor. The purpose of this project is to develop a robotic hand with 5 DOF that can imitate human hand gestures by using the Myo sensor to detect the movement of the forearm muscles. A 3D printer is used to print the parts of the robotic hand, with acrylonitrile butadiene styrene (ABS), a common thermoplastic, as the material. The muscle pulse is detected by the Myo sensor and determines the output gesture of the robotic hand. Three gestures are studied, namely fist, spread and pinch. An Arduino microcontroller is used to control the servo motors, with the aid of a computer to process the human muscle signal. The servo motors are placed inside the robotic hand to pull the tendons, hence imitating the gesture of the human hand. The results show that the fist motion can be imitated with a half grip at 31 N.
ABSTRAK
Tangan robot adalah satu sistem yang boleh meniru tangan manusia untuk pelbagai darjah kebebasan bergerak. Ia boleh digunakan dalam bidang perubatan sebagai tangan palsu atau dalam bidang perindustrian sebagai pengesan hujung. Pada masa kini, tangan robot yang boleh meniru pergerakan berdasarkan otot manusia adalah terhad dan kebanyakannya menggunakan perintang boleh laras untuk mengesan pergerakan tangan. Tujuan projek ini adalah untuk membuat tangan robot yang mempunyai 5 darjah kebebasan bergerak yang boleh meniru pergerakan tangan manusia dengan menggunakan sensor Myo untuk mengesan pergerakan tangan. Pencetak 3D digunakan untuk mencetak bahagian-bahagian tangan robot dengan akrilonitril butadiena stirena (ABS), sejenis termoplastik biasa, sebagai bahan. Nadi otot akan dikesan oleh sensor Myo dan ia akan menentukan pergerakan tangan robot. Terdapat 3 pergerakan yang akan dikaji iaitu genggam, membuka, dan mengetap. Arduino digunakan untuk mengawal motor servo dengan bantuan komputer untuk memproses isyarat yang diterima daripada sensor Myo. Motor servo telah diletakkan di dalam tangan robot untuk menarik tendon, justeru tangan robot dapat meniru pergerakan tangan manusia. Hasil uji kaji menunjukkan bahawa pergerakan menggenggam boleh ditiru oleh separuh genggaman pada 31 N.
TABLE OF CONTENTS

CHAPTER   TITLE                                                    PAGE

          DECLARATION OF THESIS                                    i
          ACKNOWLEDGEMENT                                          vi
          ABSTRACT                                                 vii
          ABSTRAK                                                  viii
          TABLE OF CONTENTS                                        ix
          LIST OF TABLES                                           xii
          LIST OF FIGURES                                          xiii
          LIST OF ABBREVIATIONS                                    xv
          LIST OF APPENDICES                                       xvi

1         INTRODUCTION
          1.1   General Introduction                               1
          1.2   Problem Statement                                  2
          1.3   Research Objectives                                2
          1.4   Research Scopes                                    3
          1.5   Final Year Project Report Outline                  3

2         LITERATURE REVIEW
          2.1   Introduction                                       5
          2.2   Robotic Hand Design                                5
          2.3   Mechanism of Actuation                             8
                2.3.1   Motor                                      8
                2.3.2   Soft Actuator                              9
          2.4   Controlling the Robotic Finger                     12

3         METHODOLOGY
          3.1   Introduction                                       14
          3.2   Project Workflow                                   15
                3.2.1   System Overview                            17
          3.3   Robotic Hand System Development                    18
          3.4   Hardware Development                               19
                3.4.1   Robotic Hand Development                   19
                3.4.2   Circuit Design                             21
                3.4.3   Main Component of Robotic Hand             22
          3.5   Software Development                               24
          3.6   Summary                                            25

4         RESULT AND DISCUSSION
          4.1   Introduction                                       27
          4.2   Myo Sensor Raw EMG Data                            28
          4.3   Gesture Imitation Experiment                       30
          4.4   Fist Gesture Recognition Experiment by Varying
                Size of An Object                                  31
          4.5   Fist Gesture Recognition Experiment by Varying
                Weight of An Object                                33
          4.6   Measuring Grasping Force                           34

5         CONCLUSION
          5.1   Introduction                                       38
          5.2   Recommendation for Future Works                    39

6         PROJECT MANAGEMENT
          6.1   Introduction                                       40
          6.2   Project Schedule                                   40
          6.3   Cost Estimation                                    42

          REFERENCES                                               44
          APPENDIX A - B2                                          46-51
LIST OF TABLES

TABLE NO.   TITLE                                                         PAGE
3.1         Specification of Arduino UNO                                  23
3.2         Specification of servo motor                                  24
4.1         Data of repeatability experiment                              31
4.2         Data of relationship between size and imitation experiment    32
4.3         Data of relationship between weight and imitation experiment  33
6.1         1st semester Gantt chart                                      41
6.2         2nd semester Gantt chart                                      42
6.3         Cost estimation to develop a robotic hand                     43
LIST OF FIGURES

FIGURE NO.   TITLE                                                         PAGE
2.1          Otto Bock Hand                                                6
2.2          Design of Shadow hand                                         7
2.3          Bending soft actuator by using two chambers                   10
2.4          Bending soft actuator by using two different braided angle    11
3.1          Flowchart of project progress                                 15
3.2          Flowchart of robotic hand system                              16
3.3          Overview of Robotic Hand System                               18
3.4          Middle finger drawing                                         19
3.5          Palm drawing                                                  20
3.6          Arm drawing                                                   20
3.7          Robotic Hand design assembly                                  20
3.8          Schematics diagram of servo motors                            21
3.9          Myo sensor                                                    22
3.10         Arduino UNO                                                   23
3.11         An example of servo motor                                     24
3.12         Console for Myo data conversion                               25
4.1          Gestures detected by Myo sensor                               28
4.2          Raw EMG data for tap gesture                                  29
4.3          Raw EMG data of finger spread gesture                         29
4.4          Raw EMG data for fist gesture                                 30
4.5          Diameter of a cylinder                                        32
4.6          Measuring the weight of the cylinder                          34
4.7          Hand Dynamometer                                              35
4.8          Graph of grasping force without load                          35
4.9          Graph of grasping force with 1.25 kg load                     36
4.10         Graph of grasping force with 2.5 kg load                      37
LIST OF ABBREVIATIONS

DOF  -  Degree-of-Freedom
DOA  -  Degree-of-Actuation
EMG  -  Electromyography
DC   -  Direct Current
CAD  -  Computer Aided Design
IMU  -  Inertial Measurement Unit
IDE  -  Integrated Development Environment
ABS  -  Acrylonitrile Butadiene Styrene
PEC  -  Parallel Elastic Component
LIST OF APPENDICES

APPENDIX   TITLE             PAGE
A          CAD drawing       46
B1         Arduino coding    48
B2         Myoduino coding   50
CHAPTER 1

INTRODUCTION
1.1 General Introduction

Robotics is a branch of technology that deals with the design, construction, operation and application of robots. It is developed from a combination of electrical engineering, mechanical engineering and computer science. The main functions of a robot are to simplify human work and to assist humans in hazardous environments, for example in defusing bombs, clearing mines and cleaning pipes. One branch of robotics is the robotic hand. Robotic hands can be divided into two categories, namely end effectors and prosthetic hands. End effectors are applied in the industrial field, mainly on assembly lines, while prosthetic hands are used in the medical field as a replacement for the actual human hand. Throughout history, robotic hands have become more similar to the human hand in terms of their design and the way they manage tasks. It is always desirable to develop a robotic hand that can imitate the complexity, function and high degree-of-freedom (DOF) of human hand gestures. Although many sensors have been used to control robotic hands, only a few robotic hands use human muscle activity directly as the input. The sensor used to detect human muscle activity is the electromyography (EMG) sensor. Usually, EMG sensors are used in the medical field to identify neuromuscular diseases or to study human kinetics.
1.2 Problem Statement

To control a robotic hand, a physical measurement needs to be used as the input signal. Currently, potentiometers and limit switches are used to provide the input signal. The disadvantage of both sensors is that they cannot be used to control a complex motion such as making a gesture, and the signal they give is not taken directly from the human. This limits the functionality of the robotic hand compared to the human hand. In addition, the connection between the potentiometer or limit switch and the robotic hand is made through wires, which makes the system complicated for the end user. Therefore, to overcome this problem, the Myo sensor is implemented as the controller for the robotic hand. The Myo sensor, with its built-in EMG sensors, is used to detect muscle activity or pulses. The pulse generated differs from one gesture to another, because different muscles are used to make different gestures. The pulse generated by making a gesture is then used to control the robotic hand.
1.3 Research Objectives

The objectives of this project are:

1. To develop a robotic hand that can imitate human gestures.
2. To implement the Myo sensor as the robotic hand controller.
3. To conduct gesture imitation experiments and study the Myo sensor performance.
1.4 Research Scopes

Several things need to be covered under the project scope. The scope consists of hardware and software elements, which are:

1. Develop a robotic hand with five DOF.
2. Construct a mechanism to move the hand by using five servo motors.
3. Design a control box to control the movement of the servo motors. An Arduino UNO is used to control all movement of the servo motors.
4. Implement the Myo sensor as the robotic hand controller.
5. Convert the signal transmitted by the Myo sensor into a signal that can be read by the Arduino UNO. The signal is converted by using a laptop.
6. Develop a program for the Arduino UNO. The program is developed to control all five servo motors.
7. Limit the gestures that the robotic hand can imitate to four only: fist, spread, rest and tap.
1.5 Final Year Project Report Outline

This report contains six chapters: Chapter 1, Chapter 2, Chapter 3, Chapter 4, Chapter 5 and Chapter 6. Chapter 1 presents the introduction of this project. It discusses the background of the robotic hand system, the problem statement, and the objectives and scopes of the project.

Chapter 2 discusses previous works by others that are related to this project. This chapter mainly discusses the design of robotic hands, the actuators that robotic hands use and the controllers of robotic hands.

Chapter 3 presents the methodology of the project. It describes the steps to build the robotic hand system. This chapter can be divided into several parts, namely hardware development, software development and the list of components used to develop the robotic hand.

Chapter 4 discusses the results of the project. Several experiments were set up to test the reliability of the Myo sensor as the controller of the robotic hand. All data are gathered and analyzed in this chapter.

Chapter 5 is the summary and conclusion of this project. Recommendations for future works are also included in this chapter.

Chapter 6 presents the schedule of this project. The schedule is tabulated in Gantt charts. This chapter also consists of a cost estimation to develop the robotic hand.
CHAPTER 2

LITERATURE REVIEW

2.1 Introduction

This project focuses on the development of a robotic hand, the implementation of the Myo sensor as the robotic hand controller and the testing of the reliability of the Myo sensor as a controller. A literature review was carried out to gain the knowledge and improve the skills needed in order to complete this project. This chapter focuses on previous research that is related to this project.
2.2 Robotic Hand Design

Hand designs can be separated into several classifications. They can be categorized mainly based on the function of the hand itself. For example, a prosthetic hand is designed to be used as a replacement for the user's actual hand. The human hand is a complex system with a high degree-of-freedom and the ability to identify the objects it touches [1]. The design of a robotic hand depends on whether the developer intends to design fully functioning robotic fingers or a limited number of functioning robotic fingers. Among the earlier designs, the Otto Bock Hand or the VASI Hands can be used to demonstrate the function of a prosthetic hand with a low number of DOF. Both hands are easy to use. The power supply comes from batteries, and the activity of the muscles in the remnant limb is detected using myo-sensors. Kinematically, the Otto Bock Hand was designed with a single DOF and uses a simple mechanism with two rigid fingers and a thumb to grasp an object. Because of its simple mechanism, this hand has a limitation: it cannot grasp an object that is not cylindrical in shape. This hand can only grasp an object whose diameter is approximately the same as the inner curvature of its fingers and thumb. However, not all objects have the shape of a cylinder. In order to grasp an object with an irregular shape, more force is needed for the hand to hold the object in position. The user will also have difficulty holding the object because of the single DOF design; the user needs to find the right orientation before grasping an object. The Otto Bock hand has a weight of 540 g and can produce 140 N of output force [2]. Figure 2.1 shows the design of the Otto Bock hand.

Figure 2.1: Otto Bock Hand [2]
Recently, prosthetic hand designs have become more similar to the human hand. The controllability, functionality, DOF and cosmetics have become more human-like. The Cyberhand is a prosthetic hand with 16 DOF and 1 degree of actuation (DOA) [3]. Each finger has three DOF composed of its three phalanxes, giving a total of 15 DOF for five fingers. One DOF is integrated into the thumb for abduction and adduction control, which makes a total of 16 DOF. The hand consists of position sensors and tendon tension sensors to determine the finger positions and measure the force applied when grasping an object. The drawback of this design is that each finger joint cannot be actively and independently controlled. An advanced prosthetic hand design was developed by the Shadow company [4] [5]. Figure 2.2 shows the design of the Shadow Hand and the actuator that the hand uses. The Shadow hand has a high number of DOF and is approximately close to the human hand in terms of shape and size. This hand uses soft pneumatic actuators to move the fingers with minimal use of motors. The soft actuator is composed of two materials, an expandable rubber tube surrounded by plastic braiding. The actuator can produce a pulling force of up to 70 kg at 4 bar with a contraction of 30% from its initial length.

Figure 2.2: Design of Shadow hand and its actuator: a) Shadow hand and, b) Shadow hand soft actuator [4] [5]
The Festo company has developed a hand (the ExoHand) for industrial and medical purposes. The shape of this hand is like an exoskeleton of a human hand. The ExoHand has almost all the physiological degrees of freedom and can be used for industrial and medical (rehabilitation) purposes. Double-acting cylinders are used to move the fingers. The drawbacks of this design are its large size and the large amount of compressed air needed to operate it.

Another robotic hand design, named the RL1 hand, is a robotic hand with three fingers [6]. The hand is designed so that the fingers can adapt to the object they grasp. Movement of the fingers is done by a DC motor. To control the hand, a string of characters is sent through a computer. The weight of this hand is 250 g. This hand was developed to help people with an amputated hand.
2.3 Mechanism of Actuation

There are many ways to actuate a robotic finger. The conventional way to actuate the finger is by using a direct current (DC) motor. DC motors are often used because of their availability in the market and their simple working mechanism. The literature review was carried out to understand the conventional actuators of robotic hands and to improve understanding of all types of actuators.
2.3.1 Motor

Jung, S.-Y., et al. have proposed a motor-driven prosthetic hand. The motor is used to pull a tendon and a spring is used to oppose the pulling force given by the motor [7]. The hand is developed with three fingers, each actuated by a tendon (wire). All fingers possess a DC motor with a 1/192 gear ratio. In addition, a servo is placed at the thumb for the adduction and abduction motion. The Cyberhand [8] is one of the prosthetic hand designs that use motors as actuators. The earlier design was composed of three fingers only, namely the thumb, index and middle fingers. The actuation system is slightly similar to the design proposed by Jung, S.-Y., et al.; the difference is that the Cyberhand uses DC motors for all actuation, including the abduction and adduction of the thumb. A total of four motors is used to actuate the hand: one motor for each finger and a motor at the base of the thumb for the adduction and abduction motion. A later design has a shape more similar to the human hand [3]. It has all five fingers, with one motor for each finger and an additional motor at the base of the thumb for the adduction and abduction movement. A total of six motors is used for this design, and a cable at each finger acts as the human muscle. The cable is pulled by the motor to enable the movement of the finger. A magnetic incremental encoder is implemented to determine the finger position. This hand was designed as an underactuated system. The reasons it is designed as an underactuated system are that it decreases the number of actuators needed, allows torque distribution between joints, enables adaptive grasping and simplifies the design.
2.3.2 Soft Actuator

A soft actuator is an actuator made of rubber or silicone with internal chamber(s). It can be categorized into two types, pneumatically driven and hydraulically driven. A pneumatically driven soft actuator is driven by compressed air, the same as a pneumatic cylinder. In addition, it has the ability to stretch and contract; by manipulating this ability, a new ability, bending, can be produced [9]. A soft actuator can be designed with or without fiber reinforcement. The earliest design is the McKibben type, invented by a physician, Joseph L. McKibben. The McKibben-type soft actuator is a contraction-type soft actuator. The actuator is developed using a hollow cylindrical rubber tube covered with a braided shell and closed at both ends. This design has been studied for decades. Although this design has a high pulling force, it is highly non-linear, and a controller needs to be developed in order to compensate for the high non-linearity of this actuator [10]. For fiber-reinforced soft actuators, the angle at which the fiber is knitted plays a major role in determining the motion of the soft actuator [11] [5].

Many designs have been proposed and studied to determine the effect of the knitted angle. K. Iwata et al. concluded that the effective angles for their actuator are 23.5° for contraction and 66.5° for stretching. Figure 2.3 shows the design of K. Iwata et al., who used two separate soft actuators held together to produce a bending motion. Faudzi, A.M., et al. used almost the same method in their research [9]. Figure 2.4 shows the design used by Faudzi, A.M., et al. The difference between the Faudzi, A.M., et al. and K. Iwata et al. models is that the former uses only one soft actuator. The actuator is divided into two parts of equal size: the first part is knitted with an angle that produces contraction, while the other part is knitted with an angle that produces stretching. The purpose of this study was to produce a bending motion. The study proved that two different angles knitted in the same actuator can produce a bending motion, but the effective angles for contraction and stretching will differ from those of a single actuator with a single knitted fiber angle. This study also proved that the number of knitted fibers does not affect the bending angle of the soft actuator.
Figure 2.3: Bending soft actuator by using two different chambers [11]
Figure 2.4: Bending soft actuator by using two different braided angles [9]
Another design of fiber-reinforced pneumatic soft actuator has the fiber knitted axially or horizontally along the actuator [12] [13]. This design was developed using natural latex rubber as the actuator's tube to make the contraction rate higher and more similar to human muscle. The design without rings shows that the maximum expansion of the tube is larger than for the other parameters. Because of the large maximum expansion, there is a possibility that the actuator might break under high pressure, so ring(s) were inserted into the actuator to make the contraction rate more stable. Next, the developed muscle was analysed based on the biomechanical muscle model established by Hill. The Hill model can be used to predict the force, length and velocity of a muscle. The analysis is done based on the two-element muscle model, because the parallel elastic component (PEC) can be ignored if the muscle is not stretched beyond its physiological range. From the analysis, the characteristics of this model are approximately the same as those of human muscle, especially when 0.1 MPa is applied.
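For reference, the force–velocity relation at the heart of Hill-type muscle models (standard background, not given in the source) is

\[ (F + a)(v + b) = b(F_0 + a), \]

where \(F\) is the muscle force, \(v\) the contraction velocity, \(F_0\) the maximum isometric force, and \(a\), \(b\) empirically fitted constants. In the two-element form used above, this contractile behaviour is combined with a series elastic component, while the parallel elastic component (PEC) is neglected within the physiological range.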
2.4 Controlling the Robotic Finger

There are many ways to control a robotic finger. One of them is by using a camera [14]. The camera captures the scene, and the captured image is processed to extract the user's hand gesture while the background is ignored. The gesture is used as the input to control the movement of the robotic finger. This type of controlling method has a lower precision compared to glove-based controlling methods, but its advantage is that it gives more freedom to the user: the user does not need to wear anything such as a data glove. Coquin, D., et al. share the same camera-based method; the difference is that the user needs to wear a data glove with sensors implemented in it [15]. The sensors act as markers for the camera to detect the gesture of the user's hand. Brethes, L., et al. also proposed a camera to detect the user's hand gesture. Their method differs from the others in that the system detects the user's hand gesture by filtering the captured image based on colour [16].
There are a limited number of prosthetic hands that use EMG sensors to detect movement activity. In the case of amputation, the muscles in the forearm still remain. In able-bodied people these muscles are used to move the fingers, while for people with an amputated hand the activity of these muscles can still be read using an EMG sensor [17]. The input signal from the EMG sensor is used to move the prosthetic hand. Peleg, D., et al. amplified the EMG signal up to 2500 times, because the signal received from the sensor is too small. Bitzer, S. and P. van der Smagt introduced a system to identify opening and closing actions of the human fingers by using surface EMG [18]. The method introduced is not affected by the position of the user's arm. Because of its stability, the method is suitable for use in active prostheses with a high number of degrees of freedom. The method was then tested using a four-finger robotic hand.
Allen, P.K., et al. propose a robotic hand system that uses joint position and force sensing to accurately compute finger contacts and applied forces for grasping tasks [19]. Mainly, this design proposes a robotic hand system that can compute its force by using strain gauges and a visual tracker. The strain gauge sensors detect the grasping force and the visual tracker is used to identify the location of all fingers.
CHAPTER 3

RESEARCH METHODOLOGY
3.1 Introduction

In this chapter, the methodology used to achieve the objectives is discussed. As an overview, the system consists of three main parts, which are the Myo sensor, the control box and the robotic hand. The function of the Myo sensor is to detect the muscle pulses of its user. Different gestures give different sets of muscle pulses. The detected pulse is used as an input for the control box, with the aid of a laptop to convert the pulse into a signal that can be read by the Arduino UNO. The Arduino UNO is placed inside the control box and moves the servos according to the pulse detected. Two tendons are attached between each finger of the robotic hand and its servo, so ten tendons are used in total. The function of the tendons is to move the fingers according to the movement of the servos. As a result, the fingers of the robotic hand move according to the muscle pulses of the Myo user. Four gestures have been set; the gestures that can be imitated by the robotic hand are fist, rest, spread and tap.
3.2 Project Workflow

Figure 3.1 illustrates the workflow of the whole project, while Figure 3.2 shows the flowchart of the robotic hand system. After the literature review is done, the project moves to the next steps, which are developing the robotic hand and implementing the Myo sensor as the robotic hand controller. The workflow of the project is described below.

Figure 3.1: Flowchart of project progress
Figure 3.2: Flowchart of robotic hand system
Figure 3.1 shows the flow of the work that has been done. Firstly, the title “Robotic Hand Controlled Using Myo Sensor” is proposed and a literature review related to the title is carried out. Then, the hardware and software parts are developed separately but almost in parallel. At this stage, the hardware and software parts are tested separately. Next, the hardware and software parts are integrated to become a complete robotic hand. The integrated system is tested and corrections are made. After that, three experiments are set up to determine the reliability of the Myo sensor as the robotic hand controller. Lastly, all data are recorded and documented.

Figure 3.2 shows the flowchart of the robotic hand system. Firstly, the Myo sensor detects the human muscle pulse; different gestures give different sets of muscle pulses. Then, with the aid of a laptop, the pulse detected by the Myo sensor is converted into a signal that is readable by the Arduino. The converted signal is then fed to the Arduino via serial communication. The Arduino controls the movement of the servo motors based on the converted signal. The system loops until there is no gesture to be detected.
3.2.1 System Overview

As shown in Figure 3.2 and Figure 3.3, the system operates by utilizing several components. Firstly, the Myo sensor is used to identify the human muscle pulse; different gestures give different sets of muscle pulses. Then, the muscle pulse is transmitted to a laptop via a Bluetooth connection. The function of the laptop is to convert the data transmitted by the Myo sensor into data that can be read by the Arduino microcontroller. After the data is converted, it is sent to the Arduino via a serial connection. The Arduino receives the converted data, which corresponds to the gesture that the user made, and moves the servos accordingly. Five servos are used to move the robotic hand. Two tendons are attached between each servo motor and its finger; their function is to open and close the finger of the robotic hand. Different gestures turn on different sets of servo motors. The Myo sensor updates the gesture that the user makes from time to time.

Figure 3.3: Overview of robotic hand system
3.3 Robotic Hand System Development

The system development can be divided into two parts, hardware development and software development. Hardware development focuses on developing the robotic hand, designing the circuit of the control box, mounting the actuators and assembling the robotic hand. Meanwhile, software development focuses on the conversion of the muscle pulse and on developing an Arduino program to control the movement of the servo motors.
3.4 Hardware Development

The hardware development can be divided into two parts, developing the robotic hand and designing the circuit of the control box.
3.4.1 Robotic Hand Development

The robotic hand is designed in SolidWorks, a computer-aided design (CAD) software. The hand is divided into several parts: fingers, palm and forearm. Figure 3.4, Figure 3.5 and Figure 3.6 show examples of the CAD drawings of the robotic hand, and Figure 3.7 shows the finished assembly of the robotic hand design. A built-in servo motor mounting inside the forearm, as shown in Figure 3.6, is included in the design. Five servo motors are used for this design. The mounting also consists of ten tracks for the tendons; two tendons are paired with each servo. The function of the tendons is to move the fingers using the pulling force from the servo motors: one tendon is used to close the finger and the other tendon to open it. The finished design is then printed using a 3D printer. The detailed design is shown in APPENDIX A.

Figure 3.4: Middle finger drawing
Figure 3.5: Palm drawing

Figure 3.6: Arm drawing

Figure 3.7: Robotic hand design assembly (showing the control box, the 3D printed hand and the built-in servo motors)
3.4.2 Circuit Design

Fritzing™ software was used to design the circuit of the servo motors. After completing the design, the circuit was tested on a proto-board together with the hand. The working circuit was then soldered onto a donut board, and a box was made as the controller box. Figure 3.8 below shows the schematic diagram of the circuit.

The Arduino UNO is used to control all five servo motors. It controls the servos based on the input received from the Myo sensor; different inputs cause different movements of the servo motors. The programming part is discussed in the software development section and in APPENDIX B1 and APPENDIX B2.

Figure 3.8: Schematic diagram of servo motors
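Before connecting the Myo pipeline, the servo wiring can be verified with a short bench-test sketch. The sketch below is only a hypothetical test aid, not part of the project code; it assumes the same pin assignment as the schematic and APPENDIX B1 (thumb to pinky on digital pins 2–6) and sweeps each servo between a roughly open and a roughly closed angle.

#include <Servo.h>   // Arduino Servo library

// Assumed pin assignment (as in APPENDIX B1): thumb..pinky on pins 2..6
const int servoPins[5] = {2, 3, 4, 5, 6};
Servo fingers[5];

void setup() {
  for (int i = 0; i < 5; i++) {
    fingers[i].attach(servoPins[i]);   // Attach each finger servo to its pin
  }
}

void loop() {
  for (int i = 0; i < 5; i++) {
    fingers[i].write(120);             // Roughly open (spread) position
  }
  delay(1000);
  for (int i = 0; i < 5; i++) {
    fingers[i].write(30);              // Roughly closed (fist) position
  }
  delay(1000);
}

If every finger opens and closes once per second, the wiring and the 6 V servo supply are working before the Myo sensor and the laptop are attached.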
3.4.3 Main Component of Robotic Hand

The robotic hand is developed by combining several electrical components. The main components are:

1. Thalmic Labs Myo sensor
2. Arduino UNO
3. Servo motors
4. 6V battery

In this project, the Thalmic Labs Myo sensor is used as the controller of the robotic hand. Figure 3.9 below shows the sensor used to detect the human hand muscles. The sensor detects the user's gesture based on the muscle pulse. This sensor is built from eight EMG sensors and a nine-axis inertial measurement unit (IMU). The EMG sensors detect human muscle activity and are the main sensors used for this project, while the function of the IMU is to identify position. The IMU can also be used to sense acceleration made by the user, but its usage is not included in the scope of this project.

Figure 3.9: Myo sensor
As shown in Figure 3.10, the Arduino UNO acts as the brain of this system. The Arduino UNO needs the aid of a laptop to convert the signal given by the Myo sensor into a signal that it can read. Different human hand gestures give different signals, and from the Myo sensor signal the Arduino UNO controls the movement of the servo motors. The Arduino UNO was selected because of its small size and because it is more economical compared to other microcontrollers. Table 3.1 shows the specification of the Arduino UNO.
Figure 3.10: Arduino UNO
Table 3.1: Specification of Arduino UNO

Description               Value
Operating voltage         5 V
Input voltage             7-12 V
Digital I/O pins          14
Analog input pins         6
DC current per I/O pin    40 mA
DC current for 3.3V pin   50 mA
Flash memory              32 KB
Clock speed               16 MHz
Five servo motors are assigned to move the robotic hand. Figure 3.11 shows an example of the servo motor used for this project. Each servo motor moves one finger: from Figure 3.8, servo motor J1 moves the thumb, J2 the index finger, J3 the middle finger, J4 the ring finger and J5 the pinky finger. The servo motors move the fingers by pulling the tendons attached between motor and finger in the clockwise and counter-clockwise directions. Table 3.2 shows the specification of the servo motor used for this project.

Figure 3.11: An example of servo motor
Table 3.2: Specification of servo motor

Description      Value
Speed            0.12 sec/60° (no load)
Torque           4.4 kg.cm
Voltage          6 V
Dimension        40.7 x 20.5 x 39.5 mm
Weight           43 g
Rotation angle   180°

3.5 Software Development
MyoDuino is a console application made by a developer to convert the muscle pulse signal received from the Myo sensor into a signal that can be read by the Arduino. Figure 3.12 shows the interface of MyoDuino; it shows the current gesture that the user is making. After converting the muscle pulse signal, MyoDuino sends the signal to the Arduino via serial communication. The detailed coding related to MyoDuino is shown in APPENDIX B2.

Figure 3.12: Console for myo sensor data conversion

Using the data sent by MyoDuino, the Arduino is programmed to move the servo motors according to the current gesture that the user makes. All programming has been done in the Arduino IDE, whose language is basically based on C++. The detailed Arduino coding is shown in APPENDIX B1.
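Because MyoDuino simply sends the pose name over the serial port, the link can be exercised without the Myo armband by typing a pose name into the Arduino IDE Serial Monitor. The sketch below is a minimal test illustration under that assumption, not part of the project code; it only reports which pose name it recognised in the incoming text.

void setup() {
  Serial.begin(9600);                    // Same baud rate as the MyoDuino link
}

void loop() {
  if (Serial.available()) {
    String msg = "";
    while (Serial.available()) {         // Collect the incoming characters
      msg += char(Serial.read());
      delay(1);
    }
    // Report which pose name was found in the message
    if (msg.indexOf("fist") >= 0)               Serial.println("pose: fist");
    else if (msg.indexOf("fingersSpread") >= 0) Serial.println("pose: fingersSpread");
    else if (msg.indexOf("doubleTap") >= 0)     Serial.println("pose: doubleTap");
    else if (msg.indexOf("rest") >= 0)          Serial.println("pose: rest");
    else                                        Serial.println("pose: unknown");
  }
}

Once the expected pose names echo back correctly, the same strings can be mapped to servo angles as in APPENDIX B1.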
3.6 Summary

This chapter has discussed the methodology of this project. The robotic hand was developed using a 3D printer based on the CAD drawings. The finished printed parts were then assembled using glue. Allen key screws were used as the joints for the palm and wrist to reduce friction during movement, while the joints for all fingers were 3D printed because their dimensions are too small to fit a screw. Five servo motors are integrated inside the arm of the robotic hand. Then, the software part was developed. Firstly, the data transmission of the Myo sensor was tested. After the data had been converted into a signal that can be read by the Arduino, the software and hardware parts were integrated and the Arduino code to move the servo motors based on the converted data was developed. The reliability of the Myo sensor as the robotic hand controller was then analyzed; the analysis is discussed in the next chapter.
CHAPTER 4

RESULT AND DISCUSSION

4.1 Introduction

This chapter discusses the experiments that were carried out to test the reliability of the Myo sensor as the robotic hand controller. Three experiments were carried out. The first experiment determines the repeatability of the robotic hand in imitating a gesture. The second experiment studies the relation between the diameter of a cylinder and the imitation of the gesture. The last experiment studies the relation between the weight of a cylinder and the imitation of the gesture. The gestures that can be imitated are fist, finger spread, rest and tap, as shown in Figure 4.1. For the second and third experiments, only the fist gesture is used.
Figure 4.1: Gestures detected by Myo sensor: a) Fist gesture, b) Finger spread gesture, c) Tap gesture, d) Rest gesture
4.2 Myo Sensor Raw EMG Data

Figure 4.2, Figure 4.3 and Figure 4.4 show the Myo sensor raw EMG data. All graphs show the muscle pulse versus the samples taken, with the muscle pulse measured in millivolts. The data is taken from muscle activity: making a gesture produces an electrical pulse, and different gestures produce different sets of electrical pulses. The pulse is then used to control the robotic hand based on the gesture that the user made.
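Raw EMG amplitude is commonly summarised with a short-window root-mean-square (RMS) envelope: the stronger the muscle activation, the larger the envelope. The project itself relies on the Myo's built-in pose classification rather than on raw EMG, so the function below (with the hypothetical helper name emgRms) is only an illustrative sketch of how the activation level of one channel could be quantified from data like that in Figures 4.2–4.4.

#include <math.h>

// Root-mean-square of the last n samples of one EMG channel.
// samples[] holds signed raw EMG values (in mV, as in Figures 4.2-4.4).
float emgRms(const int samples[], int n) {
  if (n <= 0) return 0.0f;            // Guard against an empty window
  float sumSquares = 0.0f;
  for (int i = 0; i < n; i++) {
    sumSquares += (float)samples[i] * (float)samples[i];
  }
  return (float)sqrt(sumSquares / n); // Envelope value for this window
}

A firmer fist gives a larger envelope on the relevant channels, which is consistent with the observation below that the fist gesture is detected more reliably when more force is used.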
Figure 4.2: Raw EMG data for tap gesture (muscle pulse in mV versus sample number, channels emg1–emg8)
Figure 4.3: Raw EMG data of finger spread gesture (muscle pulse in mV versus sample number, channels emg1–emg8)
Figure 4.4: Raw EMG data for fist gesture (muscle pulse in mV versus sample number, channels emg1–emg8)
From Figure 4.2 and Figure 4.3, the pulses generated for the tap and finger spread gestures are almost the same; only the last part of the graph differentiates the two gestures. Because of this slight difference, the Arduino may receive an erroneous reading, but the repeatability experiment shows that this error does not affect the robotic hand system. From Figure 4.4, for the fist gesture, the pulse generated is totally different from the other gestures, but the Myo user needs to use more force so that the Myo sensor can detect the gesture easily. If less force is used, the Myo sensor tends to detect the gesture being made as the rest gesture.
4.3 Gesture Imitation Experiment

This experiment was done to observe the repeatability of the robotic hand in imitating a gesture. The fist, finger spread and tap gestures are used. The user of the Myo sensor makes one of the gestures and the imitation by the robotic hand is recorded. As shown in Table 4.1, the experiment is divided into three sets and the average over all sets is taken.
Table 4.1: Data of repeatability experiment

Gesture         Set 1   Set 2   Set 3   Average
Fist            19/20   18/20   18/20   55/60
Finger spread   20/20   18/20   20/20   58/60
Tap             18/20   17/20   17/20   52/60
This experiment was done to determine the repeatability of the fist, finger spread and tap gestures, with 20 samples taken for each gesture. In Set 1, the finger spread gesture, with 20 imitations out of 20 samples, is the gesture most often imitated by the robotic hand. In Set 2, the fist and finger spread gestures have the same number of imitations; both were imitated 18 times out of the 20 samples taken. In Set 3, finger spread is again the most imitated gesture, with 20 imitations out of 20 samples. On average, finger spread is the gesture most imitated by the robotic hand. The significance of this study is to observe the repeatability of the robotic hand system. From the study, it can be concluded that the finger spread gesture is the gesture most easily imitated by the robotic hand. This is because the muscles that move during the finger spread gesture are almost the same as the muscles used when the hand is at rest, so the Myo sensor detects this gesture easily compared to the other gestures.
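Expressed as percentages (a direct computation from the averages in Table 4.1):

\[ \tfrac{55}{60} \approx 91.7\% \ \text{(fist)}, \qquad \tfrac{58}{60} \approx 96.7\% \ \text{(finger spread)}, \qquad \tfrac{52}{60} \approx 86.7\% \ \text{(tap)}. \]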
4.4 Fist Gesture Recognition Experiment by Varying Size of An Object

This experiment was done to study the relationship between the size of an object and the imitation of the gesture by the robotic hand. The experiment focuses on the fist gesture, which is used to lift a cylinder. Cylinders with different diameters are selected as the objects, and the weight of all cylinders is fixed at 100 grams. Table 4.2 shows the data collected to find the relationship between the size of an object and the ability of the robotic hand to imitate the human gesture.
Table 4.2: Data of relationship between size and imitation experiment

Diameter of cylinder (cm)   Repetition (times)   Gesture imitated (times)
4.5                         10                   0/10
7.5                         10                   0/10
8                           10                   1/10
15                          10                   1/10
The diameters of the cylinders used are 4.5, 7.5, 8 and 15 centimeters. For each size, the gesture was repeated 10 times and the imitation by the robotic hand was observed. For the cylinders with 4.5 and 7.5 cm diameters, the robotic hand did not imitate the gesture; zero gestures were imitated out of the 10 repetitions. For the cylinders with 8 and 15 cm diameters, the robotic hand imitated the gesture once out of the ten samples taken. This is because more force is used at the beginning of lifting the cylinder; after adjusting to the size of the cylinder, the muscle is more relaxed than at the beginning of the lift, and the Myo sensor detects the relaxed muscle and reads it as the rest gesture.

Figure 4.5: Diameter of a cylinder
4.5 Fist Gesture Recognition Experiment by Varying Weight of An Object

This experiment also focuses on the fist gesture. It was done to study the relationship between the weight of an object and the imitation of the gesture by the robotic hand. A cylinder is used as the weight, with its diameter fixed at 7.5 cm.

Table 4.3 below shows the data on the relationship between cylinders of different weights and the ability of the robotic hand to imitate the human gesture. Five different weights were used. Figure 4.6 shows the process of weighing a cylinder. First, the empty cylinder is weighed; its weight is 90.13 grams. Then, water is put inside the cylinder to give the different weights. The weight of the cylinder itself is not included in the manipulated variable; only the weight of the water inside the cylinder is taken into consideration.
Table 4.3: Data of relationship between weight and imitation experiment

Weight of cylinder (gram)   Repetition   Gesture imitated
100                         10           0/10
200                         10           0/10
300                         10           0/10
400                         10           1/10
500                         10           1/10
Ten repetitions were done for each weight. For the cylinders weighing 100, 200 and 300 grams, the robotic hand did not imitate the gesture even once. The robotic hand only imitated the gesture for the cylinders weighing 400 and 500 grams, and even for these weights the hand only imitated the gesture for a few moments. The reason is that the user applies more force before adjusting to the weight; after a few moments a suitable, smaller force is applied compared to the beginning of the grasp, and the Myo sensor interprets this lower force as the rest gesture.

Figure 4.6: Measuring the weight of the cylinder
4.6 Measuring Grasping Force

This experiment also focuses on the fist gesture. It was done to determine the grasping force needed for the fist gesture to be imitated. A hand dynamometer, a strain-gauge-based sensor, was used to measure the force given by the grasping action. This experiment is divided into two parts: grasping without load and grasping with load. Figure 4.7 shows the force sensor used in this experiment. A box is placed on top of the sensor and used as a container for the load.

Figure 4.7: Hand Dynamometer
Figure 4.8 below shows the graph of grasping force without load. The minimum force needed for the robotic hand to imitate the fist gesture is around 31 N; the robotic hand cannot imitate the fist gesture if the applied force is lower than 31 N.

Figure 4.8: Graph of grasping force without load
Figure 4.9 below shows the graph of grasping force when a 1.25 kg load is placed on top of the hand dynamometer sensor. From the figure, the average force needed to lift the hand dynamometer sensor alone is 5 N; when the 1.25 kg load is applied, 14 N of grasping force is needed to lift the sensor with the load. At this point, the robotic hand still does not imitate the fist gesture.

Figure 4.9: Graph of grasping force with 1.25 kg load
Then, the load is doubled to 2.5 kg. Figure 4.10 shows the graph of grasping force with a 2.5 kg load. The grasping force needed to lift the sensor with the 2.5 kg load is 32 N. At this point, the robotic hand can imitate the fist gesture.

Figure 4.10: Graph of grasping force with 2.5 kg load
In conclusion, the robotic hand can imitate the fist gesture through a grasping action, but the grasping force needed is 31 N or higher. The robotic hand cannot imitate the fist gesture if the applied grasping force is lower than 31 N. The minimum load needed to produce a grasping force of 31 N or higher is 2.5 kg.
CHAPTER 5

CONCLUSION

5.1 Introduction

In conclusion, the prototype of the robotic hand has been successfully developed to achieve the three objectives:

1. To develop a robotic hand that can imitate human gestures.
2. To implement the Myo sensor as the robotic hand controller.
3. To conduct gesture imitation experiments and study the Myo sensor performance.
The literature review on robotic hand systems was successfully done by referring to previous projects and research conducted by others. A robotic hand system controlled using the Myo sensor was successfully developed. Based on the experiments done, it can be concluded that the robotic hand can imitate three gestures, namely fist, spread and pinch, with the spread gesture having the highest repeatability among the three. It can also be concluded that the current design is not suitable for grasping actions: Experiment 2 and Experiment 3 show that the robotic hand cannot perform the grasping action, because the Myo sensor cannot detect the muscle activity if a low force is used for the grasping action.
5.2 Recommendation for future works

The robotic hand system was successfully developed, with a few limitations in the hardware and software development. For the hardware development, it is recommended to:

1. Use a molding technique instead of a 3D printer to make the parts of the robotic hand. With a molding technique, the parts produced are more durable compared to 3D-printed parts.
2. Use a string with a higher tensile strength as the tendons. The current string has a tensile strength of 50 lbs. A higher tensile strength is recommended so that the string will not stretch and the repeatability of the system will be increased.

For the software development, it is recommended to:

1. Send the data directly to the microcontroller without the aid of a laptop. The robotic hand system will be more flexible if it is not attached to a laptop; some applications, such as prosthetic hands, need to be flexible for the end user.
2. Use the raw EMG data from the Myo sensor. The current robotic hand system uses pre-set gestures. The advantage of using raw EMG data is that the robotic hand system can make more gestures and the system will be more human-like.
CHAPTER 6

PROJECT MANAGEMENT

6.1 Introduction

Project management is carried out to achieve all project goals through effective planning, organizing and controlling of resources within a specified time period [20]. The main constraints of this project are the research scope and the research time to achieve the required specifications. Based on these constraints, the project schedule has been tabulated in Gantt charts. To ensure a minimal project cost while achieving the required specifications, the cost of all components has been estimated.
6.2 Project schedule

Table 6.1 shows the project schedule for Semester 1, tabulated in a Gantt chart. Most of the work done in the first semester was proposing a title and carrying out the literature review. Studying previous works is important to get a better understanding of the project; it also allows solutions to the problems encountered to be prepared based on previous works. In addition, much effort has been spent on understanding a robotic hand system by referring to previous works by others. Besides that, one of the important things in developing a robotic hand system is the selection of the actuator. For this project, servo motors were selected because they are the most suitable actuators for this specific hand design. Then, all components needed for the hand development were listed.
Table 6.1: 1st semester Gantt chart (activities scheduled over weeks 2-16)

1. Propose a title
2. Do research and literature review
3. Find a suitable hand design
4. Find a suitable actuator
5. List all components
6. Draw flowchart for the system
7. Preparation and presentation
The Gantt chart for the second semester is shown in Table 6.2. Most of the time was spent on the hardware and software implementation, both of which are crucial parts of this project. After that, a fully functioning robotic hand was developed by integrating the hardware and software parts. Then, the reliability of the Myo sensor as the robotic hand controller was tested, and the data gathered from the testing was analysed.
Table 6.2: 2nd semester Gantt chart (activities scheduled over weeks 1-16)

1. Hardware implementation
2. Software implementation
3. Integrate hardware and software
4. Testing and analysis
5. Presentation
6. Thesis preparation
6.3 Cost estimation

Table 6.3 lists the components and the cost estimation to develop the robotic hand. The most expensive component is the Myo sensor, because it integrates eight EMG sensors and a nine-axis IMU.
Table 6.3: Cost estimation to develop a robotic hand

No.  Component                Quantity   Price per unit (RM)   Price (RM)
1.   3D printed hand design   1          -                     -
2.   Myo sensor               1          900.00                900.00
3.   Servo motor              5          37.10                 185.50
4.   Arduino UNO              1          101.80                101.80
5.   6V battery               1          24.40                 24.40
6.   Donut board              1          3.00                  3.00
7.   Acrylic                  1          18.00                 18.00
8.   Single core wire 1.5mm   1 m        1.00                  1.00
9.   Switch                   1          0.40                  0.40
     Total                                                     1234.10
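As a check, the listed prices sum to the stated total:

\[ 900.00 + 185.50 + 101.80 + 24.40 + 3.00 + 18.00 + 1.00 + 0.40 = \text{RM } 1234.10. \]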
There is no cost for the 3D printed hand parts because they were printed at labs in the Faculty of Electrical Engineering (FKE); FKE students are not charged for printing 3D models.
REFERENCES

1. Jones, L.A. and S.J. Lederman, Human hand function. 2006: Oxford University Press.
2. Toledo, C., et al. Upper limb prostheses for amputations above elbow: A review. in Health Care Exchanges, 2009. PAHCE 2009. Pan American. 2009. IEEE.
3. Carrozza, M.C., et al., Design of a cybernetic hand for perception and action. Biological Cybernetics, 2006. 95(6): p. 629-644.
4. Kochan, A., Shadow delivers first hand. Industrial Robot: An International Journal, 2005. 32(1): p. 15-16.
5. Tuffield, P. and H. Elias, The Shadow robot mimics human actions. Industrial Robot: An International Journal, 2003. 30(1): p. 56-60.
6. Cabas, R., L.M. Cabas, and C. Balaguer. Optimized design of the underactuated robotic hand. in Robotics and Automation, 2006. ICRA 2006. Proceedings 2006 IEEE International Conference on. 2006. IEEE.
7. Jung, S.-Y., et al. Design of robotic hand with tendon-driven three fingers. in Control, Automation and Systems, 2007. ICCAS'07. International Conference on. 2007. IEEE.
8. Carrozza, M.C., et al. The CyberHand: on the design of a cybernetic prosthetic hand intended to be interfaced to the peripheral nervous system. in Intelligent Robots and Systems, 2003 (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on. 2003. IEEE.
9. Faudzi, A.M., et al. Development of bending soft actuator with different braided angles. in Advanced Intelligent Mechatronics (AIM), 2012 IEEE/ASME International Conference on. 2012. IEEE.
10. Hildebrandt, A., et al. A flatness based design for tracking control of pneumatic muscle actuators. in Control, Automation, Robotics and Vision, 2002. ICARCV 2002. 7th International Conference on. 2002. IEEE.
11. Iwata, K., K. Suzumori, and S. Wakimoto, Development of Contraction and Extension Artificial Muscles with Different Braid Angles and Their Application to Stiffness Changeable Bending Rubber Mechanism by Their Combination. Journal of Robotics and Mechatronics, 2011. 23(4): p. 582.
12. Nakamura, T., N. Saga, and K. Yaegashi. Development of a pneumatic artificial muscle based on biomechanical characteristics. in Industrial Technology, 2003 IEEE International Conference on. 2003. IEEE.
13. Nakamura, T. Experimental comparisons between McKibben type artificial muscles and straight fibers type artificial muscles. in Smart Materials, Nano- and Micro-Smart Systems. 2006. International Society for Optics and Photonics.
14. Raheja, J.L., et al. Real-time robotic hand control using hand gestures. in Machine Learning and Computing (ICMLC), 2010 Second International Conference on. 2010. IEEE.
15. Coquin, D., et al., Gestures recognition based on the fusion of hand positioning and arm gestures. Journal of Robotics and Mechatronics, 2006. 18(6): p. 751.
16. Brethes, L., et al. Face tracking and hand gesture recognition for human-robot interaction. in Robotics and Automation, 2004. Proceedings. ICRA'04. 2004 IEEE International Conference on. 2004. IEEE.
17. Peleg, D., et al., Classification of finger activation for use in a robotic prosthesis arm. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2002. 10(4): p. 290-293.
18. Bitzer, S. and P. van der Smagt. Learning EMG control of a robotic hand: towards active prostheses. in Robotics and Automation, 2006. ICRA 2006. Proceedings 2006 IEEE International Conference on. 2006. IEEE.
19. Allen, P.K., et al. Using tactile and visual sensing with a robotic hand. in Robotics and Automation, 1997. Proceedings, 1997 IEEE International Conference on. 1997. IEEE.
20. Kerzner, H.R., Project management: a systems approach to planning, scheduling, and controlling. 2013: John Wiley & Sons.
APPENDIX A

CAD DRAWING

(a) Thumb
(b) Index finger
(c) Middle finger
(d) Ring finger
(e) Pinky finger
(f) Connector
(g) Arm1
(h) Arm2
(i) Arm3
(j) Arm4
(k) Tendon lining1
(l) Tendon lining2
(m) Arm cover
(n) Pulleys
(o) Servo mounting
(p) Palm
APPENDIX B1
ARDUINO PROGRAMMING
1. Arduino program to control the movement of servo motors

#include <MyoController.h>  // Use MyoController library
#include <Servo.h>          // Use Servo library

Servo servothumb;       // Define thumb servo
Servo servoindex;       // Define index servo
Servo servomajeure;     // Define middle servo
Servo servoringfinger;  // Define ring servo
Servo servopinky;       // Define pinky servo

MyoController myo = MyoController();

void setup() {
  servothumb.attach(2);       // Set thumb servo to digital pin 2
  servoindex.attach(3);       // Set index servo to digital pin 3
  servomajeure.attach(4);     // Set middle servo to digital pin 4
  servoringfinger.attach(5);  // Set ring servo to digital pin 5
  servopinky.attach(6);       // Set pinky servo to digital pin 6
  myo.initMyo();              // Initialize Myo sensor (opens the serial link)
}

void loop() {
  myo.updatePose();                  // Read the latest pose sent by MyoDuino
  switch (myo.getCurrentPose()) {    // Drive the servos according to the pose
    case rest:
      servothumb.write(85);
      servoindex.write(85);
      servomajeure.write(83);
      servoringfinger.write(90);
      servopinky.write(85);
      break;
    case fist:
      servothumb.write(28);
      servoindex.write(27);
      servomajeure.write(0);
      servoringfinger.write(34);
      servopinky.write(27);
      break;
    case fingersSpread:
      servothumb.write(110);
      servoindex.write(121);
      servomajeure.write(120);
      servoringfinger.write(120);
      servopinky.write(110);
      break;
    case doubleTap:
      servothumb.write(28);   // Only thumb and index close (pinch-like tap)
      servoindex.write(27);
      break;
    default:
      break;                  // Ignore unknown or unhandled poses
  }
  delay(100);
}
APPENDIX B2
CODING OF MYODUINO
1. Coding for myo controller
#include "MyoController.h"
MyoController::MyoController() {
  msgChar = String("");            // Buffer holding the last message received
}

MyoController::~MyoController() {
}

bool MyoController::initMyo() {
  Serial.begin(9600);              // Open the serial link to the MyoDuino console
  return true;
}

bool MyoController::updatePose() {
  if (Serial.available()) {
    storageStr = String("");
    while (Serial.available()) {             // Read the incoming pose string
      storageStr = storageStr + char(Serial.read());
      delay(1);
    }
    msgChar = storageStr;
    Serial.print(msgChar);                   // Echo the message back
  }
  // Map the received string to a pose
  if (msgChar.indexOf("rest") >= 0) {
    current_pose_ = rest;
  } else if (msgChar.indexOf("fist") >= 0) {
    current_pose_ = fist;
  } else if (msgChar.indexOf("waveIn") >= 0) {
    current_pose_ = waveIn;
  } else if (msgChar.indexOf("waveOut") >= 0) {
    current_pose_ = waveOut;
  } else if (msgChar.indexOf("fingersSpread") >= 0) {
    current_pose_ = fingersSpread;
  } else if (msgChar.indexOf("doubleTap") >= 0) {
    current_pose_ = doubleTap;
  } else {
    current_pose_ = unknown;
  }
  return true;                     // Return added: the function is declared bool
}

Poses MyoController::getCurrentPose() {
  return current_pose_;
}