Proceedings of the 2001 IEEE
International Conference on Robotics & Automation
Seoul, Korea • May 21-26, 2001
Design and Implementation of Software Research Platform for Humanoid Robotics: H6

Satoshi Kagami, Koichi Nishiwaki, Tomomichi Sugihara, James J. Kuffner Jr., Masayuki Inaba, Hirochika Inoue
Dept. of Mechano-Informatics, Univ. of Tokyo.
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan.
Email: {kagami,nishi,zhidao,kuffner,inaba,[email protected]
Abstract
A humanoid robot "H6" has been developed as a platform for research on perception-action coupling in the intelligent behavior of humanoid robots. H6 has the following features: 1) a body with enough DOFs and enough torque at each joint for full-body motion; 2) a PC/AT-compatible high-performance onboard computer controlled by RT-Linux, so that everything from low-level to high-level control is handled simultaneously; 3) it is self-contained and connected to a network via radio ethernet; 4) dynamic walking trajectory generation, motion planning, and 3D vision functions are available. H6 is expected to become a common test-bed for experiments and discussion on various aspects of intelligent humanoid robotics.
1 Introduction
Recently, research on humanoid robots has become an active field in the robotics community, and many elemental functions have been proposed. In particular, bipedal dynamic walking, soft skin, motion planning, 3D vision, and other topics are progressing rapidly. However, in order to achieve a humanoid robot that works in the human world together with human beings, not only the elemental functions but also their integration becomes an important problem. At present, many humanoid robots have been developed, but almost all of them are designed for bipedal locomotion experiments. To satisfy both locomotion and high-level behavior by integrating tactile-sensing/3D-vision-based perception and motion, the robot must have good mechanism, hardware, and software functionality. In particular, full-body behavior with many contact points to the environment will be an important problem, and such motion requires a more sophisticated body design.
Figure 1: Left: geometric model in EusLisp; center: DOF arrangement; right: photo of H6.
So far, the child-size full-body humanoid "H5" (height: 1270 mm, weight: 33 kg) has been developed for research on dynamic bipedal locomotion, and several issues concerning dynamically balanced trajectory generation have been addressed [1-3]. However, a humanoid robot that works in the human world requires body, sensor, and computational capability together. As for the body, an appropriate arrangement of DOFs, sufficient joint rotation ranges, sufficient maximum joint torques, and a smooth surface on which to attach tactile skin sensors are required. As for sensing, foot force sensors, a flexible tactile skin, and 3D vision are required. Finally, as for computational capability, high computational performance and sophisticated software are required. Therefore, the humanoid robot H6 has been developed to satisfy these three requirements.
In this paper, the requirements, design, implementation, and experiments of the humanoid robot H6 as a research platform for perception-action integration are described.
2 Design and Concepts
2.1 Requirements
Three key issues are important for a humanoid robot that works in the human world, as mentioned above: a) body mechanism, b) sensor availability, and c) software availability.
As for the body mechanism, in order to achieve not only bipedal walking but also full-body motions such as lying down, supporting the body by hand, and picking itself up, a compact and lightweight body alone is not enough; other elements also become important: the arrangement of the DOFs, a sufficient number of DOFs, the rotation range of the joints, the maximum torque of the joints, self-containedness, ease of maintenance, and a smooth surface on which to attach tactile skin sensors.
As for sensing, foot force sensors for walking, a soft skin with flexible tactile sensors for contacting the world, and a 3D vision system are required.
Finally, as for computational availability, a real-time operating system is required in order to process everything from low-level software, such as the servo loop and sensor processing, to high-level software, such as motion planning and behavior control, simultaneously. In order to achieve full-body behavior from sensor input, 3D vision software, motion planning software, and dynamically stable trajectory generation software are required. Therefore, high computational performance is also required.
2.2 Conceptual Design of H6
The conceptual design of H6 is as follows:
- Compact and lightweight body to make experiments easier,
- Modular structure for ease of maintenance and enhancement,
- Self-contained, including the battery, and connected to the LAN via wireless ethernet,
- Enough DOFs, joint range, torque, and speed so that the robot can stand up when it falls down and can walk,
- Capability of looking down at the feet and of vergence control for 3D vision,
- Smooth surface for a tactile sensor skin made of pressure sensors and air chambers,
- Onbody high-performance PC/AT computer and RT-Linux as the system controller,
- Dynamically stable walking trajectory generation system,
- Motion planning system,
- 3D vision system,
- Voice processing system.
2.3 Specifications
According to the above requirements, the humanoid robot H6 was developed. H6 is 285 (l) × 598 (w) × 1361 (h) [mm] in its default standing posture, and its total weight is about 51.0 [kg]. It has 6-DOF legs, 7-DOF arms with 1-DOF hands, and a 5-DOF head.
2.3.1 Power and Energy Consumption
A 24 V DC power supply is utilized for H6; two lead-acid batteries (12 V, 5.0 Ah, 2 kg each) are mounted inside the torso for both the controllers and the motors, and they can supply power for about 10 to 15 minutes on average (5 minutes when walking). During squatting motion, the maximum motor current is about 29 A and the average current is about 4 A (96 W). The current for the computer is 4.3 A (102 W) on average.
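As a rough consistency check (assuming the two 12 V batteries are connected in series to form the 24 V supply), the nominal capacity and the average load give an upper bound on the runtime:

E = 24 V × 5.0 Ah = 120 Wh,  P_avg ≈ 96 W + 102 W ≈ 198 W,  t_max = E / P_avg ≈ 36 min.

Peak currents of up to 29 A and the voltage sag of lead-acid cells under load explain why the observed runtime (10 to 15 minutes, 5 minutes when walking) is well below this bound.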
2.3.2 Sensors
In order to generate 3D vision information, stereo cameras with vergence control are attached to the head.
Twelve force-sensing resistor (FSR) sheets are attached to each foot, so that the total force along the vertical axis and the ZMP position can be measured. The data are read by the A/D converters of the RIF-01 board (see the sketch at the end of this subsection). An inclinometer and an accelerometer are attached at the center of the body and are also read by the A/D converters of the RIF-01 board.
As for tactile skin sensors, an air-chamber suit with pressure sensors is designed to achieve both tactile sensing and shock absorption.
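As an illustration of how the foot force data mentioned above can be used (a minimal sketch under assumed conventions: the sensor positions, the calibration to Newtons, and the function names are placeholders, not the actual H6 code), the vertical force and the ZMP of one foot follow from the twelve FSR readings as a force-weighted average of the sensor positions:

#include <stddef.h>

#define NUM_FSR 12           /* FSR sheets per foot (assumed layout) */

/* Calibrated sensor position in the foot frame [m]. */
typedef struct { double x, y; } point2d;

/* Compute the total vertical force and the ZMP of one foot as the
 * force-weighted average of the sensor positions.  Returns 0 on
 * success, -1 if the foot carries (almost) no load.               */
int foot_zmp(const double force[NUM_FSR],   /* calibrated FSR forces [N] */
             const point2d pos[NUM_FSR],    /* sensor positions [m]      */
             double *fz, point2d *zmp)
{
    double sum = 0.0, mx = 0.0, my = 0.0;
    for (size_t i = 0; i < NUM_FSR; ++i) {
        sum += force[i];
        mx  += force[i] * pos[i].x;
        my  += force[i] * pos[i].y;
    }
    *fz = sum;
    if (sum < 1.0)            /* foot in the air: ZMP undefined */
        return -1;
    zmp->x = mx / sum;
    zmp->y = my / sum;
    return 0;
}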
3 Onbody PC/AT Clone Controller
3.1 Computer and Peripherals
Processing power is especially important for a humanoid robot, which simultaneously requires dynamic balance calculation, sensor processing including 3D vision, motion planning, and so on. We adopted a PC/AT-compatible board for FA use, since it has high processing power and can be replaced as technology progresses.
Figure 2: H6 hardware components.
It has dual PentiumIII-750 MHz processors (100 MHz FSB), 256 MBytes of SDRAM, two serial ports, one parallel port, two IDE ports, an FDD port, and a PS/2 keyboard port. A 6.4 GByte 2.5-inch IDE hard disk is connected. The board has a PICMG connector to connect to a backplane. The backplane board has one PICMG connector for the CPU board, two ISA connectors, and three PCI connectors (Fig. 2).
The cards connected to the backplane are: the mother board plus backplane board, a wireless ethernet card (PCI), two image capture cards (PCI), and two RIF-01 robot interface cards (ISA).
3.2 Robot I/O Card (Fujitsu RIF-01) [4]
RIF-01 is an I/O board which has 16-channel 12-bit A/D and D/A converters and two UPPs (Hitachi Universal Pulse Processor, HD63140). The UPP has a 10-channel 10-bit A/D converter, 16-channel DIO connected to a UPC (Universal Processor Core), and a watchdog timer. The encoder of each limb joint is read through the UPP digital inputs, and the D/A outputs are used for the motor driver control signals. The FSRs and the accelerometer are connected to the A/D inputs.
3.3 Image Capture Card
The frame grabber is a BT848-based PCI card; it transmits every image into main memory by DMA transfer, and a program can obtain the input image after a 30 [msec] latency. Two cards are connected to the backplane, so that synchronized stereo camera input can be captured simultaneously. Synchronization of the cameras is achieved by a field mix circuit [5].
3.4 Motor Driver
JSK-D00 is adopted as the motor driver. One card-size JSK-D00 can control four motors. The 90 W and 150 W motors require more current than the driver's current limit; therefore the circuits for two motors are connected to each of these motors.
3.5 Head Control System
In order to remove the head and run experiments with it independently, the head has a different control system. The JSK-D01, an intelligent card-size motor driver, has four H-bridges, encoder counters, a PWM pulse generator, and a Hitachi H8 microprocessor. It can control the motors locally from a given control input.
Figure 3: H6 software components.
4 Software Design
4.1 Requirements of Humanoid Robots
A humanoid robot research platform should support many kinds of research and experiments, from low-level quick and smooth motion control to high-level vision/sensor-based behavior research in complex environments. Therefore, a transparent system is required, from real-time control to heavy computation. There are two requirements: 1) software servoing, and 2) high-performance parallel computing with network capability, such as remote resource utilization, an interface for developers, and so on.
At present, several full-body humanoid robots have been developed; however, some are designed for legged locomotion and others are designed for high-level behavior, and no system satisfies both requirements.
In order to satisfy both legged locomotion and high-level behavior by integrating perception and motion, RT-Linux [6] is adopted and the servo loop is implemented as a kernel module. Since Linux is not originally a real-time OS, RT-Linux adds two special mechanisms: a scheduler for real-time processes, and a two-level interrupt handler.
4.2 H6 Software Components
There are six software components in H6 (Fig. 3): i) the real-time servo loop and online ZMP compensation mechanisms for servoing and walking, ii) online footprint planning mechanisms, iii) onbody low-level 3D vision processing and voice processing functions, iv) motion planning functions, v) vision, sound, and other sensor data servers, to make the data available over the network, and vi) high-level 3D vision functions, voice recognition, and other high-level recognition functions, which are placed on the network computers.
4.3 Joint Servo Unit
All 28 joints, except those in the head, are controlled by one RT loop which runs at a 1 msec cycle ("motor servo" in Fig. 3). It is basically PD control.
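The following is a minimal sketch of such a fixed-period PD loop, written here as an ordinary POSIX periodic thread for readability; the actual H6 servo runs inside an RT-Linux kernel module, and the gain arrays and the I/O helpers (read_encoder, write_motor_command) are placeholders for the RIF-01 encoder inputs and D/A outputs.

#include <time.h>

#define NUM_JOINTS 28
#define PERIOD_NS  1000000L               /* 1 ms servo cycle */

/* Placeholder hardware I/O; on H6 these go through the RIF-01 board. */
extern double read_encoder(int joint);             /* joint angle [rad] */
extern void   write_motor_command(int joint, double u);

static double kp[NUM_JOINTS];             /* proportional gains */
static double kd[NUM_JOINTS];             /* derivative gains   */
static double q_ref[NUM_JOINTS];          /* reference angles   */
static double q_prev[NUM_JOINTS];

void servo_loop(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (;;) {
        for (int j = 0; j < NUM_JOINTS; ++j) {
            double q  = read_encoder(j);
            double dq = (q - q_prev[j]) / 1.0e-3;   /* finite-difference velocity */
            double u  = kp[j] * (q_ref[j] - q) - kd[j] * dq;
            write_motor_command(j, u);
            q_prev[j] = q;
        }
        /* sleep until the next 1 ms boundary */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) { next.tv_nsec -= 1000000000L; next.tv_sec++; }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
}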
4.4 Online ZMP Compensation
It is hard for a humanoid robot to "replay" output trajectories correctly in the real world, even if they satisfy the ZMP constraint. Therefore, local compliance control methods have been proposed [7-10]. In this paper, we adopted a torso position compliance method to track a given ZMP trajectory. This method tries to track the given ZMP trajectory by horizontal motion of the torso. It consists of two parts: one is the ZMP tracking mechanism, and the other is an inverted pendulum control that keeps the dynamic balance ("Online ZMP compensation" in Fig. 3) [11].
4.5 Online Footprint Planner
In order to implement interactive behavior, a basic walking pattern generation function is prepared. By enhancing "Dynamically Stable Mixture of Pre-designed Motions" [2], an appropriate body position, posture, and velocity can be generated by mixing pre-calculated candidate motions online ("Footprint planner" in Fig. 3).
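A minimal sketch of such an online mixture (a plain weighted blend of pre-computed candidate joint trajectories; the array layout and the assumption that the weights sum to one are illustrative, not the scheme of [2]):

#define MIX_JOINTS     28
#define NUM_CANDIDATES 8    /* number of pre-designed candidate motions (assumed) */

/* Blend the candidate joint angles at the current time step with the
 * given mixing weights (assumed to sum to 1).                         */
void mix_motions(const double q_cand[NUM_CANDIDATES][MIX_JOINTS],
                 const double weight[NUM_CANDIDATES],
                 double q_out[MIX_JOINTS])
{
    for (int j = 0; j < MIX_JOINTS; ++j) {
        double q = 0.0;
        for (int c = 0; c < NUM_CANDIDATES; ++c)
            q += weight[c] * q_cand[c][j];
        q_out[j] = q;
    }
}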
4.6 Autobalancer
"AutoBalancer" reactively generates a stable motion for a standing humanoid robot online from a given motion pattern [3, 12]. The system consists of two parts: one is a planner for the state transitions derived from the relationship between the legs and the ground, and the other is a dynamic balance compensator which solves the balance problem as a second-order nonlinear programming optimization by introducing several conditions. The latter can compensate the centroid position and the tri-axial moments of any standing motion, using all the joints of the body, in real time. The complexity of AutoBalancer is O((p + c)^3), where p is the number of DOFs and c is the number of condition equations ("AutoBalancer" in Fig. 3).
4.7 3D Vision Processing
So far we have developed real-time 3D vision functions such as 1) depth map generation [13], 2) 3D depth flow generation [14], and 3) a plane segment finder [15]. The real-time depth map generation system, and its applications the human finder and the target finder, run on the onbody PC. These low-level libraries can be used on the onbody PC. Other high-level vision functions, such as the plane finder, face recognizer, and so on, require more computational resources, so they run on the network computers.
4.8 Motion Planning and Dynamically Stable Motion Planning
Since a humanoid robot has many DOFs, it is hard to generate a full-body trajectory, such as reaching to a target object. In order to solve this problem, motion planning algorithms that calculate a collision-free path using the remarkably fast randomized technique RRT [16] are adopted.
Moreover, a dynamically stable motion planner has been developed that searches the configuration space of the robot while simultaneously satisfying the dynamic balance constraints [17].
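A minimal RRT sketch for an n-DOF configuration space is shown below (the joint-space sampling, the collision checker, the step size, and the node limit are placeholders; goal biasing, path extraction, and the dynamic-balance filtering of [17] are omitted):

#include <math.h>

#define DOF       28         /* configuration dimension (assumed) */
#define MAX_NODES 10000
#define STEP      0.05       /* extension step size [rad] */

typedef struct { double q[DOF]; int parent; } rrt_node;

/* User-supplied primitives: joint-limited random sampling and a
 * collision checker for a single configuration (placeholders).  */
extern void sample_random_config(double q[DOF]);
extern int  in_collision(const double q[DOF]);

static double dist(const double a[DOF], const double b[DOF])
{
    double s = 0.0;
    for (int i = 0; i < DOF; ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return sqrt(s);
}

/* Grow an RRT from q_init; returns the number of nodes created.
 * tree must have room for MAX_NODES entries.                     */
int rrt_build(const double q_init[DOF], rrt_node *tree)
{
    int n = 1;
    for (int i = 0; i < DOF; ++i) tree[0].q[i] = q_init[i];
    tree[0].parent = -1;

    while (n < MAX_NODES) {
        double q_rand[DOF];
        sample_random_config(q_rand);

        /* nearest node in the tree */
        int nearest = 0;
        for (int k = 1; k < n; ++k)
            if (dist(tree[k].q, q_rand) < dist(tree[nearest].q, q_rand))
                nearest = k;

        /* take one step from the nearest node toward q_rand */
        double d = dist(tree[nearest].q, q_rand);
        if (d < 1e-9) continue;
        double q_new[DOF];
        for (int i = 0; i < DOF; ++i)
            q_new[i] = tree[nearest].q[i]
                     + STEP * (q_rand[i] - tree[nearest].q[i]) / d;

        if (in_collision(q_new)) continue;       /* reject colliding steps */

        for (int i = 0; i < DOF; ++i) tree[n].q[i] = q_new[i];
        tree[n].parent = nearest;
        ++n;
    }
    return n;
}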
4.9 Sound Processing
Since a humanoid robot has many degrees of freedom, noise is generated while it is working. Therefore, the voice recognition software needs to be robust against this noise. The adopted voice recognition software was developed by Dr. Hayamizu at ETL; its advantages are that it can run on the onbody processor (it runs on Linux) and that the programmer can easily manage its dictionary. Using this advantage, task-based dictionaries which contain only several words are prepared, making recognition robust against noise. The speech synthesis software is a commercial product (Fujitsu) and also runs on Linux. Fig. 4 shows a voice-command-based walking experiment.

5 Conclusion
This paper described the requirements, design, and implementation of H6. H6 is designed as a research platform for humanoid robots that can interact with complex environments by coupling sensing and behavior. It has enough DOFs, joint angle range, and torque to support not only dynamic walking but also full-body behavior. It also has enough CPU power, by adopting RT-Linux on a PC/AT-compatible CPU board, to handle everything from the software servo loop to high-level, sophisticated motion generation and sensor processing functions. H6 is self-contained, including its battery, and is expected to become a common test-bed for experiments and discussion on various aspects of intelligent humanoid robotics.
Acknowledgments
This research has been supported by the Grant-in-Aid for the Research for the Future Program of the Japan Society for the Promotion of Science, "Research on Micro and Soft-Mechanics Integration for Bio-mimetic Machines (JSPS-RFTF96P00801)" project.
Figure 4: H6 experiments. Top row: H6 controlled by joystick (the person on the right holds the joystick); 2nd row: voice control; 3rd row: dynamic motion planning; 4th row: the 3D vision system finding a human being.

References
[1] K. Nagasaka, M. Inaba, and H. Inoue. Walking Pattern Generation for a Humanoid Robot based on Optimal Gradient Method. In Proc. of 1999 IEEE Int. Conf. on Systems, Man, and Cybernetics No. VI, 1999.
[2] K. Nishiwaki, K. Nagasaka, M. Inaba, and H. Inoue. Generation of reactive stepping motion for a humanoid by dynamically stable mixture of pre-designed motions. In Proc. of 1999 IEEE Int. Conf. on Systems, Man, and Cybernetics No. VI, pp. 702-707, 1999.
[3] S. Kagami, F. Kanehiro, Y. Tamiya, M. Inaba, and H. Inoue. AutoBalancer: An online dynamic balance compensation scheme for humanoid robots. In Proc. of Fourth Intl. Workshop on Algorithmic Foundations on Robotics (WAFR'00), pp. SA-79-SA-89, 2000.
[4] Y. Matsumoto, K. Sakai, T. Inamura, M. Inaba, and H. Inoue. PC-based Hypermachine: A Kernel System for Intelligent Robot Application. In Proc. of 15th Annual Conference of the Robotics Society of Japan, Vol. 3, pp. 979-980, 1997.
[5] Y. Matsumoto, T. Shibata, K. Sakai, M. Inaba, and H. Inoue. Real-time Color Stereo Vision System for a Mobile Robot based on Field Multiplexing. In Proc. of IEEE Int. Conf. on Robotics and Automation, pp. 1934-1939, 1998.
[6] V. Yodaiken and M. Barabanov. RT-Linux. http://www.rtlinux.org.
[7] K. Nagasaka, M. Inaba, and H. Inoue. Stabilization of Dynamic Walk on a Humanoid Using Torso Position Compliance Control. In Proc. of 17th Annual Conference of the Robotics Society of Japan, pp. 1193-1194, 1999.
[8] Honda Co. Ltd. Walking Control System for Legged Robot. Japan Patent Office (A) 5-305583, 1993.
[9] Honda Co. Ltd. Walking Control System for Legged Robot. Japan Patent Office (A) 5-200682, 1993.
[10] Honda Co. Ltd. Walking Pattern Generation System for Legged Robot. Japan Patent Office (A) 10-86080, 1998.
[11] S. Kagami, K. Nishiwaki, T. Kitagawa, T. Sugihara, M. Inaba, and H. Inoue. A fast generation method of a dynamically stable humanoid robot trajectory with enhanced ZMP constraint. In Proc. of IEEE International Conference on Humanoid Robotics (Humanoid2000), 2000.
[12] Y. Tamiya, M. Inaba, and H. Inoue. Realtime balance compensation for dynamic motion of full-body humanoid standing on one leg. Journal of the Robotics Society of Japan, Vol. 17, No. 2, pp. 268-274, 1999.
[13] S. Kagami, K. Okada, M. Inaba, and H. Inoue. Design and implementation of onbody real-time depthmap generation system. In Proc. of International Conference on Robotics and Automation (ICRA'00), pp. 1441-1446, 2000.
[14] S. Kagami, K. Okada, M. Inaba, and H. Inoue. Real-time 3D optical flow generation system. In Proc. of International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI'99), pp. 237-242, 1999.
[15] S. Kagami, K. Okada, M. Inaba, and H. Inoue. Plane segment finder. In 5th Robotics Symposia, pp. 381-386, 2000.
[16] S. M. LaValle and J. J. Kuffner Jr. Rapidly-exploring random trees: Progress and prospects. In Proc. of Fourth Intl. Workshop on Algorithmic Foundations on Robotics (WAFR'00), 2000.
[17] J. J. Kuffner, S. Kagami, M. Inaba, and H. Inoue. Dynamically-stable motion planning for humanoid robots. In Proc. of IEEE International Conference on Humanoid Robotics (Humanoid2000), 2000.