D3.1 Sensor network
CORBYS
Cognitive Control Framework for Robotic Systems
(FP7 – 270219)
Deliverable D3.1
Sensor Network
Contractual delivery date: Month 9
Actual submission date: 31st October 2011
Start date of project: 01.02.2011
Duration: 48 months
Lead beneficiary: SINTEF
Responsible person: Frode Strisland
Revision: 1.0
Project co-funded by the European Commission within the Seventh Framework Programme
Dissemination Level
PU  Public
PP  Restricted to other programme participants (including the Commission Services)
RE  Restricted to a group specified by the consortium (including the Commission Services)
CO  Confidential, only for members of the consortium (including the Commission Services)
X
Document History

Author(s)         Revision  Date        Contributions
Frode Strisland   0.1       05-09-2011  Draft document structure
Frode Strisland   0.2       07-09-2011  Version distributed to partners with request for input on sensor network needs
Frode Strisland   0.3       01-10-2011  Version incorporating all information from partners
Steffen Dalgard   0.4       13-10-2011  Restructured document
Steffen Dalgard   0.5       20-10-2011  Version for early internal review
Frode Strisland   0.6       26-10-2011  Version for final internal review
Frode Strisland   1.0       28-10-2011  Release version
CORBYS Definition of Terms

CORBYS Demonstrators
The 1st CORBYS Demonstrator (Demonstrator I): Mobile Robot-assisted Gait Rehabilitation System.
The 2nd CORBYS Demonstrator (Demonstrator II): Reconnaissance Robot for Investigation of Hazardous Environments (RecoRob).

CORBYS Roles
User: Any user interacting with the CORBYS systems; for example, in the case of the gait rehabilitation system, users with the roles patient, therapist or engineer, and in the case of the reconnaissance robot, users with the roles (tele)operator or hazardous area examination officer.
CORBYS End-user: The companies/entities that use/exploit (aspects of) CORBYS technology in their commercial products or services.

Mobile Robotic Gait Rehabilitation System Roles
Patient: The person receiving gait rehabilitation therapy aided by the CORBYS system.
Therapist: The medical professional configuring and assessing rehabilitation therapy aided by the CORBYS system.
Engineer: A professional dealing with the CORBYS system based on a need to do technical maintenance, repairs or system configurations.

Reconnaissance Robot for Investigation of Hazardous Environments Roles
Operator: The person steering the robot by remote control.
Hazardous Area Examination Officer: The person that the robot follows during team work on investigation of hazardous areas.
Engineer: A professional dealing with the CORBYS system based on a need to do technical maintenance, repairs or system configurations.

CORBYS Domain Knowledge
Sensor Fusion: Method used to combine multiple independent sensors to extract and refine information not available through single sensors alone.
Situation Assessment: Estimation and prediction of relations among objects in the context of their environment.
Cognitive Control: Capability to process a variety of stimuli in parallel, to "filter" those that are the most important for a given task to be executed, to create an adequate response in time and to learn new motor actions with minimum assistance (Kawamura et al., 2008).
Human-Robot Interaction: Ability of a robotic system to mutually communicate with humans.
Neural Plasticity: Ability of neural circuits, both in the brain and the spinal cord, to reorganise or change function.
Cognitive Processes: Processes responsible for knowledge and awareness; they include the processing of experience, perception and memory.

CORBYS Technology Components
SAWBB: Situation Awareness Blackboard.
SOIAA: Self-Organising Informational Anticipatory Architecture.
User Interface: User interface designed to meet the needs of the various users in exchanging information between the robot and the human user.
Brain Computer Interface (BCI): The sensor system measuring brain waves using EEG and detecting patterns identifying movement actions.
Human Sensory System (HSS): The sensors measuring aspects of the human physiology and movement patterns.
Low Level Control: Localised control of actuators, usually torque, current or position control. Sensory data is passed to the real-time control, actuation commands are calculated and sent to the actuators.
Smart Actuators: Highly integrated mechatronic units incorporating a motor and the complete motion control electronics in one single unit.

Generic CORBYS Robot Control Components
Cognitive System: Incorporates situation awareness and intention detection to enable optimal man-machine interaction towards achievement of set goals in the specific usage context.
Executive Layer: Responsible for translating the high-level plans (cognitive inputs) into low-level actions, invoking actions at the appropriate times, monitoring the action execution, and handling exceptions. The executive layer can also allocate and monitor resource usage.
Communication Server: Manages subscriptions of sensor data between different control modules. The sensor data to the cognitive modules do not flow through the Communication Server, but are forwarded directly.
Task Manager: Manages the operation modes to be executed by the system and performs specific tasks when the operation mode is changed.
FPGA Reflexive Module: Field Programmable Gate Array (FPGA) based hardware subsystem of the Situation Awareness architecture (SAWBB) for acceleration of robot reflexive behaviour.
Safety Module: Verification that actuator output is in line with the commanded output and that it satisfies safety-related position, velocity, current and/or torque constraints.
Real-Time Data Server: A software module responsible for communicating sensor data from the real-time (RT) bus to other software modules. This excludes the communication of RT modules with sensors and actuators, which takes place directly in order to preserve RT control behaviour.
Real-Time Network: Sensor network for real-time, safety-critical data transmission.
General Purpose Network: Network for robot control and interface to the cognitive modules.

Demonstrator Specific Technology Components
Mobile Robotic Gait Rehabilitation System
Pelvis Link: Mechanical interface between the mobile platform and the powered orthosis, equipped with an appropriate actuation and sensing system.
Powered Orthosis: Exoskeleton system to help the patient in moving his/her legs and receiving an appropriate rehabilitation therapy.
Mobile Platform: The platform for the entire system, including the Pelvis Link, the Powered Orthosis, the necessary computational, storage and power supply modules, as well as motored wheels for movement.

Reconnaissance Robot for Investigation of Hazardous Environments
Vision System: Cameras of the 2nd demonstrator used for environment perception, including human tracking.
Robot Arm: 7-DOF lightweight robot arm mounted on the Mobile Platform, used for object manipulation (for contaminated area sample drawing).
Mobile Platform: Mobile platform of the 2nd demonstrator, which consists of a variable drive system equipped with chains. It is used for mounting the robot arm and the sensors for environment perception, as well as sensors for platform navigation and robot arm control. Containers for samples are also placed on the mobile platform.
Table of Contents

CORBYS DEFINITION OF TERMS
EXECUTIVE SUMMARY
1 INTRODUCTION
  1.1 DOCUMENT SCOPE
  1.2 DOCUMENT STRUCTURE
  1.3 ASSOCIATED DOCUMENTS
2 SENSOR NETWORK IN THE CORBYS SETTING
  2.1 DEFINITION OF THE CORBYS SENSOR NETWORK
  2.2 REQUIREMENTS AND INPUT FROM PROJECT PARTNERS
  2.3 DESIGN CONCERNS FOR SELECTING CORBYS SENSOR NETWORK ARCHITECTURE
    2.3.1 Partitioning of the Sensor Network According to Needs
    2.3.2 Safety
    2.3.3 Modularity
  2.4 CORBYS OVERALL ROBOT CONTROL SYSTEM ARCHITECTURE
3 SENSOR NETWORK REQUIREMENTS FOR THE 1ST CORBYS DEMONSTRATOR - INPUT FROM PARTNERS
  3.1 GENERAL PURPOSE NETWORK REQUIREMENTS FOR THE 1ST CORBYS DEMONSTRATOR
    3.1.1 User Interfaces
    3.1.2 Human Sensory System
    3.1.3 Cognitive Modules
  3.2 REAL-TIME NETWORK REQUIREMENTS FOR THE 1ST CORBYS DEMONSTRATOR
  3.3 EVALUATION OF STATE-OF-THE-ART REAL-TIME SENSOR NETWORK TECHNOLOGIES RELEVANT FOR THE 1ST DEMONSTRATOR
4 IMPLEMENTATION OF THE 1ST CORBYS DEMONSTRATOR SENSOR NETWORK
  4.1 IMPLEMENTATION OF THE GENERAL PURPOSE NETWORK
    4.1.1 Common tools and libraries for the General Purpose Network
      4.1.1.1 CORBYS Command Interface (CCI)
      4.1.1.2 CORBYS Sensor Data Protocol
      4.1.1.3 Logging and Monitoring
      4.1.1.4 Remote Debugging
  4.2 IMPLEMENTATION OF THE REAL-TIME NETWORK
    4.2.1 The Network Cabling
    4.2.2 Outer Loop in the Gait Robot Real-Time Network
    4.2.3 Safety Module
    4.2.4 Real-Time Network Data Format
5 CORBYS 2ND DEMONSTRATOR NEEDS AND IMPLEMENTATION OF SENSOR NETWORK
  5.1 CORBYS 2ND DEMONSTRATOR IMPLEMENTATION
  5.2 SYSTEM SETUP
  5.3 CORBYS 2ND DEMONSTRATOR SOFTWARE ARCHITECTURE
  5.4 INTEGRATION INTO THE GENERAL PURPOSE NETWORK
  5.5 THE REAL-TIME NETWORK
6 CONCLUSIONS
7 REFERENCES
Executive Summary
The CORBYS sensor network will facilitate the input of sensor signals to and the output of actuator steering
signals from the CORBYS cognitive system, as well as lower level robot control signals. As such, the sensor
network is not limited to sensors (devices that measure a quantity and convert the quantity into machine
readable pieces of information), but also involves actuators (devices that perform physical actions triggered by
the control system).
CORBYS Deliverable 3.1 Sensor Network deals with the definition of the CORBYS sensor network which
will be an important ingredient in the generic CORBYS architecture for cognitive robot control. The
development of the CORBYS sensor network will be driven by the 1st demonstrator, a robotic gait
rehabilitation system which will be developed during the project lifetime. The generic characteristics of the
CORBYS architecture will be tested in the 2nd demonstrator, a reconnaissance robot for investigation of
hazardous environments.
Based on an analysis of sensor network needs collected from the CORBYS partners as well as other work to
define the overall CORBYS system architectures, a generic framework for the project cognitive robotic
control structures is outlined. The sensor network is considered to consist of two parts:
- A real-time sensor network in which all robot sensors and actuators are included
- A general purpose network which will allow transmission of data that are not time and safety critical.
The network architecture should allow the cognitive control modules to be used in the two project
demonstrators, as well as in other robotic applications.
Beyond the general sensor network architecture, the sensor network is further described and detailed for the
two demonstrators. For the robotic gait rehabilitation demonstrator, the real-time sensor network will be
based on a sensor network standard that supports real-time control, such as EtherCAT. This standard will
be evaluated and tested in the next stages of the project. For the 2nd demonstrator, an existing CAN network
implementation will be used for the low-level robot network.
1 Introduction
The focus of CORBYS is on robotic systems that have a symbiotic relationship with humans. Such robotic
systems have to cope with highly dynamic environments as humans are demanding, curious and often act
unpredictably. CORBYS will design and implement a cognitive robot control architecture that allows the
integration of 1) high-level cognitive control modules, 2) a semantically-driven self-awareness module, and 3)
a cognitive framework for anticipation of, and synergy with, human behaviour, based on biologically-inspired
information-theoretic principles.
The CORBYS control architecture will be validated within two challenging demonstrators: i) the novel mobile
robot-assisted gait rehabilitation system CORBYS; ii) an existing autonomous robotic system.
Further information about the design challenges of CORBYS can be found on the CORBYS web page
[www.corbys.eu]. Additionally, general information about the field of Cognitive Robotics can be found on
the EU Framework Program 7 web pages on Cognitive Systems and Robotics
[http://cordis.europa.eu/fp7/ict/cognition/].
1.1 Document Scope
The present document corresponds to Deliverable 3.1 in the CORBYS project, and is the outcome of work in
CORBYS Task 3-2 on Sensor network design. The sensor network architectural design description given here
will steer the project in the implementation and integration phases so that the different sensor and actuator
functionalities can be included in a consistent manner.
1.2 Document Structure
This document is structured as follows:
Section 2 discusses the general CORBYS sensor network perspectives and needs that are independent of
application. Section 3 discusses gait rehabilitation system specific requirements and Section 4 outlines how
this sensor network can be implemented. This section also contains a summary of tools and libraries that will
be considered in the development of the CORBYS sensor network.
Finally, Section 5 describes the needs of the 2nd demonstrator and how the corresponding sensor network is
implemented.
1.3 Associated Documents
The following documents give additional perspectives for the present work:
- D2.1 Requirements and Specification: State-of-the-Art, Prioritised End-User Requirements, Ethical Aspects
- D2.2 Detailed Specification of the System: System Architecture Specification with control and data flow, module interdependencies, user scenarios etc.
2 Sensor Network in the CORBYS Setting
2.1 Definition of the CORBYS Sensor Network
CORBYS addresses robotic systems that have a symbiotic relationship with humans. Such robotic systems have
to cope with highly dynamic environments as humans are demanding, curious and often act unpredictably.
The sensor network will be an information backbone of the architecture for the cognitive robot control that
allows the integration of high-level cognitive control modules, a semantically-driven self-awareness module
and a cognitive framework for anticipation of, and synergy with, human behaviour based on biologically-inspired information-theoretic principles.
The CORBYS sensor network will facilitate the input of sensor signals to, and the output of actuator steering
signals from, the CORBYS robot control modules.
As such, the sensor network is not limited to sensors (devices that measure a quantity and convert the quantity
into machine readable pieces of information), but also involves actuators (devices that perform physical
actions triggered by the control system). For simplicity, we will however stick to the term “sensor network”.
The main components of the sensor network will be:
i) The physical network, consisting of hardware components such as cables, interface components, and/or wireless transceivers
ii) The network protocol(s), which specify how signals are transmitted on the physical network.
A sensor network is mainly an infrastructure for acquisition and transmission of information; it is therefore
not responsible for any signal processing beyond packaging and unpacking of data capsules according to the
network protocol. Further, the sensor network should not be considered as a power source for robot sensors or
actuators1.
CORBYS will develop a framework for robot control including cognitive modules which will be
demonstrated in two different scenarios: a robot for gait rehabilitation purposes, and an autonomous robot
system. In order to take advantage of the different CORBYS modules, the sensor network description is
structured so that the generic aspects applying to both demonstrators (and with potential use in other applications
as well) are described in a separate section. The specific aspects of the sensor network architecture for the
two CORBYS demonstrator cases are then described in more detail.
2.2 Requirements and Input from Project Partners
The system requirements for the generic CORBYS control architecture and particularly the two demonstrators
have been given in Deliverable D2.1. The current work on design of the sensor network has therefore been
developed keeping these requirements in mind.
1 Some communication protocols, such as RS-232, RS-485 and USB-type standards, also provide a limited power supply to
run peripheral components, but the scope of the current document is limited to the information transmission perspective.
The current work on establishing the sensor network architecture has also benefited from parallel work taking
place to complete deliverable D2.2 Detailed Specification of the System. In total, the three deliverables D2.1,
D2.2 and D3.1 will therefore provide a cohesive description of the targeted CORBYS demonstrator systems in
terms of different components and how the different components interact with one another.
All CORBYS partners have been requested to provide a detailed description of how they see the transmission
of sensor and actuator control signals within their modules, based on the requirements and system
specification, as well as with respect to other CORBYS modules. Based on these descriptions, it has also been
possible to aggregate the numbers of different sensors and actuators planned in the different demonstrators. It
must however be kept in mind that the actual numbers and configurations might change as the project moves
from design to actual implementation.
In the sensor network architecture design, it is also valuable to combine or bundle related sensors so that they
interface to the sensor network via a shared sensor network node. The reduction in the number of nodes reduces the
complexity of the network in terms of the required number of physical components and cabling. Conclusions of
this work will not be finalized until the detailed design of the individual modules is complete.
2.3 Design Concerns for Selecting CORBYS Sensor Network Architecture
The design of the CORBYS sensor network architecture is guided by the overall CORBYS vision of
developing a flexible architecture for cognitive robot control that can be adapted to several different
applications. In particular, the sensor network architecture should fulfil the needs for the targeted CORBYS
demonstrators.
In the design of CORBYS sensor network, the project will as far as possible employ established network
standards rather than developing new ad hoc solutions. The use of industry standards allows easier access to
components, lowers implementation risk, makes integration and testing easier to plan and carry out, and also
promotes future use of CORBYS components in other settings. To further make the CORBYS effort valuable
in future applications, the project has also taken into consideration the sensor network standards applied by
CORBYS partners in their existing and planned products.
2.3.1 Partitioning of the Sensor Network According to Needs
The different sensor and network components have different requirements with respect to response times.
Most modules in the system can communicate without the need for a real-time guarantee. The amount of data
to be transported per message is in the range 50-1500 bytes, and the network is assumed to have low
utilization. As long as the utilization is low, a best-effort network without prioritization mechanisms can be
used. The types of information flowing will be collections of sensor data (many samples collected into a larger
message), command interface traffic and user interface traffic. This is typically the communication between the cognitive modules,
the executive layer, the human sensors, the real-time control and the user interfaces. This part of the network will be named
the General Purpose Network in CORBYS.
Components such as robot sensor and actuator controls must have sub-millisecond response times in order to
obtain smooth robotic movement. Also from a safety perspective, some components must be able to react
reliably, promptly and firmly to avoid potential damage to humans or the robot. This part of the CORBYS sensor
network will be named the Real-Time Network.
2.3.2 Safety
As the 1st demonstrator shall be tested while attached to humans, the safety aspect is of particular importance in
this application. Safety must be incorporated in many parts of the system. Firstly, the system must not be
able to physically hurt the patient attached to it or other users interacting with the system. The actuators need
to have significant force in order to help the patient to stand upright. This force is too high to be considered
harmless. This leads to the requirement that the system has to be fail-safe.
Seen from the sensor network, this leads to three levels of safety:
- The General Purpose Network is considered open and not safe. All modules receiving commands from this network have to check the commands with respect to safety. If non-safe commands are received, they should be rejected.
- The commands sent to the actuators have to be within safe limits when it comes to position, force and speed. If not, the commands have to be limited to safe values. The Real-Time Network has to ensure that only safe values can be sent to the actuators.
- The actuators have to be monitored. If they are not moving as commanded, it must be detected. If a fault is detected, the power to the actuators shall be cut off. The detection can be done by the actuator drive or by independent safety sensors. The Real-Time Network should be designed so that it inherently offers redundancy, to ensure that communication with the actuators and safety sensors is not lost due to a single failure.
2.3.3 Modularity
All modules have to be able to be tested before integration. This shall be done by simulating input and output
on the CORBYS sensor network. The input and output have to be specified in agreement between the module
owners. By using common libraries for input and output, the integration should be easier. With a common
toolset for debugging and monitoring, test data from several modules can be collected and presented. This will
help when debugging the integrated functionality.
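As a purely illustrative sketch of this approach, the following Python fragment shows a simulated sensor module that emits a deterministic test pattern over UDP so that a consuming module can be tested before integration. The port number, update rate and packet layout are assumptions made for the example, not agreed CORBYS interface definitions.

# Minimal sketch of a simulated sensor module emitting a reproducible test pattern so
# that other modules can be integration-tested without the real hardware. Port, rate
# and payload layout are illustrative assumptions only.
import math, socket, struct, time

TEST_PORT = 31000          # assumed port of the module under test
RATE_HZ = 10               # assumed update rate of the simulated sensor

def run_simulated_sensor(target_host: str = "127.0.0.1") -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 0
    while True:
        value = math.sin(2 * math.pi * 0.5 * time.time())        # known, reproducible signal
        packet = struct.pack("!Idf", seq, time.time(), value)    # sequence, timestamp, value
        sock.sendto(packet, (target_host, TEST_PORT))
        seq += 1
        time.sleep(1.0 / RATE_HZ)

if __name__ == "__main__":
    run_simulated_sensor()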
2.4 CORBYS Overall Robot Control System Architecture
The general architecture to be used for the two CORBYS demonstrators is shown in Figure 1.
Due to differences in their nature, the actual realisation for some of the modules will differ between the two
demonstrators.
[Figure not reproduced: block diagram showing the Cognitive System (SAWBB Machine, SOIAA Machine), User Interface(s), Executive Layer (Task Manager, Communication Server), Human Sensory System, FPGA Reflexive Layer and Real-Time Control (RT Machine, Real-Time Data Server), connected via the General Purpose Network and the Real-Time Network to the robot application sensors and actuators.]
Figure 1. Generalised CORBYS sensor network architecture to be applied in the CORBYS demonstrators.
The Task Manager will coordinate the whole system by initiating different working modes. It is natural that
this module is responsible for self-checking the system at start-up. The Human Sensory System relays sensor
data to the cognitive system. The Cognitive System will use the input from the Human Sensory System as
well as other information extracted from the user of the robot system. The user interface will be defined for
the relevant end users in the two CORBYS demonstrator applications.
The Cognitive System modules shall be used in both demonstrators; hence the General Purpose Network has
to be the same in order to reuse the communication interfaces.
Real-time control is responsible for the control of all actuators. For the 1st demonstrator this means applying
the correct gait pattern, keeping balance, and keeping the Mobile Platform in the correct position. This is safety-critical
functionality that has to be autonomously failsafe regardless of what it is commanded to do by the
non-real-time modules. The Real-Time Data Server is responsible for relaying sensor data to the non-real-time
part in a format suitable for the cognitive system.
The Real-Time Network will use different implementation technology for the two CORBYS demonstrators.
This is due to the fact that the 2nd demonstrator, RecoRob, is a ready-made system that cannot easily be
modified to be used in the 1st demonstrator.
The safety aspect is mainly a concern for the 1st demonstrator. The safety requirements, together with the need
for smooth gait pattern control in 8 axes, impose severe requirements on the data transmission speed of the Real-Time Network for the 1st Demonstrator.
The different components in Figure 1 are only described in sufficient detail in this document to show the
usage of the sensor network. Please see CORBYS Deliverable 2.2 - Detailed Specification of the System, for
more details on the individual components.
3 Sensor Network Requirements for the 1st CORBYS Demonstrator - Input
from Partners
The preliminary physical layout of the CORBYS 1st demonstrator, the mobile robotic gait rehabilitation
system, is shown in Figure 2. The Mobile Platform must be able to carry the total load of the patient, the
Pelvis Link and the Powered Orthosis. In addition, the platform will also be able to support several additional
components, such as a battery pack and various instrumentation and processors. Of particular relevance for
the sensor network design, the setup will allow integration of several processors on the Mobile Platform. The
robot control can therefore be made self-contained.
Figure 2. CORBYS Gait rehabilitation demonstrator; Mobile Platform and Pelvis Link (left) and Mobile Platform with
Powered Orthosis (right).
The requirements on the sensor network capacity from the various CORBYS components have been analysed,
and are summarized in the following sub-sections related to the General Purpose and the Real-Time
Networks, respectively.
3.1 General Purpose Network Requirements for the 1st CORBYS Demonstrator
3.1.1 User Interfaces
The 1st demonstrator, the robotic gait rehabilitation system, is planned so that user interfaces can be offered
for the following three user roles: i) Patient, ii) Therapist, and iii) Engineer. The user interfaces will be
connected to the General Purpose Network. For all users, the outlined needs are summarized in Table 1.
Table 1. Summary of user interface needs for the gait rehabilitation system users.

PATIENT
- System input: Sensor data; Configuration; Status
- User input: Emergency
- Output to system: Signal that user has pressed emergency button
- Output to user: To be analysed, but very limited. Patient should not be distracted during gait rehabilitation therapy
- Number of interfaces: 1
- Message format: CORBYS Command Interface, CORBYS Sensor Data Protocol
- Other issues: Physically attached to the Mobile Platform. May not be a computer interface at all, just an emergency switch.

THERAPIST
- System input: Sensor data; Configuration; Status
- User input: Define working mode
- Output to system: Configurations set by therapist
- Output to user: Therapy information; Sensor and actuator data; Overall system status
- Number of interfaces: 1
- Message format: CORBYS Command Interface, CORBYS Sensor Data Protocol
- Other issues: Possibly remote terminal connected via e.g. Wi-Fi. System should be able to adapt the force needed dependent on the strength of the patient, which changes during the therapy time of 10-60 minutes.

ENGINEER
- System input: All system data available at the General Purpose Network
- User input: Parameter adjustment (to patient and safety levels); Define working mode (configuration, views, debugging alternatives, control settings et cetera)
- Output to system: Parameters and working modes selected by engineer
- Output to user: Input and output data of selected modules; Module status and configuration parameters
- Number of interfaces: 1 (in case of remote debugging there may be additional virtual instances of this interface)
- Message format: CORBYS Command Interface, CORBYS Sensor Data Protocol
- Other issues: Possibly remote terminal connected via e.g. Wi-Fi

3.1.2 Human Sensory System
The CORBYS Human Sensory System consists of the EEG-based Brain-Computer Interface and the human
physiological sensors. The data flows from the individual sensors will therefore to a considerable extent define
the update frequencies needed in each case. The sensor network dimensioning data regarding number of
sensors and sampling frequencies for the Human Sensory System sensors are given in Table 2. The collection
of raw sensor data from the sensors to the processing unit is not considered part of the sensor network. For
processed data from all human related sensors, CORBYS will aim to use the General Purpose Network for
data transmission. The EEG on one side, and the remaining human sensors on the other, will have processors
that will process data and prepare for the transmission of data via the General Purpose Network. The use of
the General Purpose Network will mean that there will be larger latency in these measurements than one
would have obtained in the Real-Time Network. However, the latency, as well as synchronization issues, can
be compensated for by applying time-stamped message formats.
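As an illustration of how time-stamped messages can compensate for latency and synchronization differences, the following Python sketch aligns two sensor streams on their acquisition timestamps rather than on arrival order. The data layout and rates are illustrative assumptions only, not a CORBYS-defined message format.

# Minimal sketch: align samples from two sensor bundles on their acquisition timestamps,
# regardless of when the (later, jittery) messages actually arrived on the network.
from bisect import bisect_left

def align(reference: list, other: list) -> list:
    """For each (timestamp, value) in 'reference', pick the 'other' sample whose
    acquisition timestamp is closest."""
    other_ts = [t for t, _ in other]
    aligned = []
    for t_ref, v_ref in reference:
        i = bisect_left(other_ts, t_ref)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        j = min(candidates, key=lambda k: abs(other_ts[k] - t_ref))
        aligned.append((t_ref, v_ref, other[j][1]))
    return aligned

# Example: EEG decoder output at ~31.25 Hz aligned with a ~1 Hz heart-rate stream.
eeg = [(0.00, 0.1), (0.032, 0.2), (0.064, 0.3)]
heart_rate = [(0.0, 72.0), (1.0, 73.0)]
print(align(eeg, heart_rate))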
Table 2. Summary of dimensioning data streams from human sensing components

Sensor                        Number of devices/channels    Sampling frequency
EEG (BCI)                     To be decided (max 16)        31.25 Hz
EMG                           4                             <100 Hz (processed), <2 kHz (raw)
Heart rate                    1                             ~1 Hz (processed), raw data ~250 Hz
Skin temperature              1                             ~1 Hz
Environment temperature       1                             ~1 Hz
Electrodermal response (GSR)  <5                            ~1 Hz
Humidity                      <5                            ~1 Hz
Inertial Measurement Unit     2 units, each containing a 3-axis accelerometer, 3-axis gyroscope and 3-axis magnetometer    ~1 Hz (processed), raw data ~100 Hz
The EEG module will have its own processing, so that the readings from the individual channels are bundled
and introduced to the General Purpose Network via one specific entry node. The EEG module offers filtered
(artefact noise reduced) data, a decoding flag probability indicating the state derived from the analysis of the
related cognitive processes, an estimate of the decoding accuracy for each BCI decoder, and an indication of
the overall EEG system state (working or not).
The other human sensors will also be able to provide both raw and processed data, as well as flags indicating
sensor state (working or not). The data from this module will be bundled together and time-stamped for
internal synchronization purposes before being introduced in the General Purpose Network.
3.1.3 Cognitive Modules
The two Cognitive Modules (SAWBB and SOIAA) will be interfaced to the control system via the General
Purpose Network. This interface is summarized in Table 3. Whereas the update data rates are quite modest
for these modules, the challenge is rather that these modules subscribe to essentially all data in the CORBYS
system.
Table 3. Summary of sensor network input and output requirements for the cognitive modules.

Situation Awareness Blackboard (SAWBB):
- Input rate: ~10 Hz
- Output rate: ~10 Hz
- Input data: robot control sensor and actuator data; human sensors

Self-Organising Informational Anticipatory Architecture (SOIAA):
- Input rate: ~10 Hz (50 Hz sampling of Real-Time Network sensors)
- Output rate: 20 Hz (for trajectory data to the low level control module)
- Input data: robot control sensor and actuator data; human sensors
3.2 Real-Time Network Requirements for the 1st CORBYS Demonstrator
All sensors and actuators involved in the gait rehabilitation robot control feedback loop are connected to the
Real-Time Network. This includes all sensors and actuators on the Mobile Platform, on the Pelvis Link, and
on the Powered Orthosis. The actual numbers of sensors and actuators have not been finalised at this stage;
the current estimates are given in Table 4. Most of the sensor read-out will be in duplicate, since both the position
reported by the actuator (the cause) and the independently measured position or angle will be
reported back via the Real-Time Network.
Table 4: Summary of sensors and actuated and passive joints in the 1st demonstrator, gait rehabilitation system

Subsystem                Joint/Axis             Actuation                                            Type of sensing
Mobile Platform          Wheel 1..4 (x-axis)    Driving wheels                                       Rotation
                         Steering 1..4          Wheel orientation                                    Orientation
Pelvis Link              Vertical lift          Linear motion to lift or lower the patient's body    Position
                         Horizontal sideways    Linear motion to move the body sideways              Position sideways
                         Rotation               Passive rotational                                   Force and torque in all 3 directions
Powered Orthosis         Hip                    Active extension and flexion                         Angular
(left and right legs)    Hip                    Passive adduction/abduction                          Angular
                         Knee                   Active extension and flexion                         Angular
                         Ankle                  Active plantar flexion and dorsiflexion              Angular
                         Ankle                  Passive eversion and inversion                       Angular
For all the sensor and actuator components, the project targets sensor and actuator update rates of (up to) 1
kHz. Further, the latency, defined as the communication cycle time (outer loop total communication delay),
should not exceed 1-2 milliseconds. Assuming 16 bits per sensor or actuator, with sensors
duplicated, and an update rate of 1 kHz for all devices, the resulting raw data transmission need is more than 1
Mbit/s.
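The following back-of-the-envelope calculation illustrates how such an estimate can be obtained. The device counts used are assumptions made only for the sake of the arithmetic, since the final numbers of sensors and actuators are not yet fixed.

# Illustrative estimate of the raw Real-Time Network load.
# The device counts below are assumptions, not final CORBYS figures.
BITS_PER_VALUE = 16       # bits per sensor or actuator value (from the estimate above)
UPDATE_RATE_HZ = 1000     # 1 kHz update rate for all devices

actuator_commands = 15    # assumed number of actuated joints/axes
sensor_channels = 35      # assumed number of sensor channels
duplication = 2           # sensor read-out duplicated for consistency checks

values_per_cycle = actuator_commands + duplication * sensor_channels
raw_bits_per_second = values_per_cycle * BITS_PER_VALUE * UPDATE_RATE_HZ

print(f"{raw_bits_per_second / 1e6:.2f} Mbit/s raw payload")   # ~1.36 Mbit/s, i.e. above 1 Mbit/s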
The sensors and actuators on the Real-Time Network are safety critical. It is therefore important to be able to
detect failures and that all actuator activities give rise to predictable action. The 1st CORBYS demonstrator
will therefore also implement independent cabling for sensor readings in order to carry out consistency
checks between commanded actuator settings and their actual realisation.
3.3 Evaluation of State-of-the-Art Real-Time Sensor Network Technologies Relevant
for the 1st Demonstrator
The evaluation of relevant technologies has been carried out in several stages, where in particular the
technologies CANbus, FlexRay and EtherCAT have been considered. A summary of the main features of
these network technologies is given in Table 5. In the following, the evaluation process is described further.
Since the RecoRob system to be used as the 2nd CORBYS demonstrator already had implemented the CANbus,
this was a natural starting point. The functional distribution, with detailed gait pattern control done by the real-time
control machine, requires a high update rate of the actuators. Requirements indicated an update rate of
1 kHz with a maximum total loop delay in the range of 1-2 milliseconds. Given this requirement it will not be
possible to use the CANbus because of too high latency. A calculation of the raw data transmission showed
that at least 1 Mbit/s would be needed. This rate is not achievable for CANbus without splitting into
several physical busses.
The next candidate that has been evaluated is FlexRay, as one relevant CORBYS partner already has FlexRay
experience from other projects. FlexRay has higher capacity and lower delays than CANbus technology.
Unfortunately, FlexRay requires significant engineering to configure the advanced dedicated driver
technology required for the routing of a bus at 10 Mbit/s. Further, the FlexRay consortium [www.flexray.com]
operating from 2000 was closed down in 2010, and the technology is now primarily used by a few car
manufacturers. There seems to be a relatively small community using FlexRay, and little information about
FlexRay can be found on the Internet. Xilinx had a FlexRay component for their FPGAs, but this product has
been discontinued. Therefore, it was concluded that there were too high risks associated with using this
technology.
The third technology that has been evaluated is EtherCAT. This technology is promoted by EtherCAT
technology group [www.ethercat.org] and Beckhoff [www.beckhoff.com/], and is a technology much used in
automation. It has many standard components available. It is based on Ethernet technology with a proprietary
frame format. There are IO-components available both as ASIC and FPGA components. It is based on a pass
through processing of the Ethernet frame. The result is a very low delay per node. The bus is routed in a daisy
chain with point-to-point connections between each node. It can be seen as a long shift register where each
node reads and writes data on the fly. A standard 100Mbit/s (FE) Ethernet driver for physical layer is usually
used, but also LVDS (Low Voltage Differential Signalling) is an alternative for custom parts of the network.
LVDS is a smaller and cheaper line driver than the standard Ethernet drivers.
The speed of 100 Mbit/s gives more than a factor-of-ten margin over the raw data transport needs. This margin
gives headroom for changes that may be introduced later in the project.
EtherCAT seems to be flexible, supporting FPGAs, embedded controllers and off-the-shelf standard products.
The SW load is low since most of the protocol is realised in HW. SW is available for development and test
using Windows PCs, National Instruments LabVIEW [www.ni.com/labview] and open source code that can be
used on embedded systems.
Many off-the-shelf standard products are available for configuration and debugging. This makes for an easy start
with this technology.
The implementation of the 1st demonstrator described in this report will therefore be based on usage of
EtherCAT. EtherCAT technology will be tested within the consortium using sample sensor and actuator
devices. The technology will then be evaluated with respect to the extent to which it fulfils the need to support a
significant number of components with 1 kHz update rates and 1-2 millisecond outer loop delay times, as
needed in CORBYS Demonstrator 1. The CORBYS partners will collaborate to define the sensor network
protocols in full detail during the development of the demonstrator components and in the integration activity.
Table 5: Real-Time Network summary of bus characteristics

Background
- EtherCAT: High-speed industrial Ethernet fieldbus system developed for control automation
- CANbus / CAN Open: Bus system developed for automobile manufacturers
- FlexRay: High-speed bus system developed for automobile manufacturers

Physical layer
- EtherCAT: Ethernet 4-wire UTP; LVDS transceivers
- CANbus / CAN Open: 2-wire UTP; dedicated transceivers; termination
- FlexRay: 2 or 4-wire UTP; dedicated transceivers; termination

Speed / timing / arbitration
- EtherCAT: 100 Mbit/s / microseconds / serial on the fly
- CANbus / CAN Open: 1 Mbit/s / milliseconds / bus with dominant levels
- FlexRay: 10 Mbit/s / microseconds / TDMA

Topology
- EtherCAT: Linear, tree, ring; a master is needed
- CANbus / CAN Open: One linear bus; may be hard to route out in the legs of the Powered Orthosis
- FlexRay: Linear passive, passive star, active star; network engineering needed; an active hub may be needed

HW support
- EtherCAT: Controllers from Freescale and TI; FPGA from Xilinx / Beckhoff
- CANbus / CAN Open: Controllers from Freescale and TI; FPGA from Xilinx (only lower data layer)
- FlexRay: Controllers from Infineon, Freescale, TI and Renesas; Xilinx FPGA IP is discontinued, no FPGA support

SW support
- EtherCAT: TwinCAT, open source products
- CANbus / CAN Open: CAN Open
- FlexRay: Standard modules? AUTOSAR

Documentation
- EtherCAT: EtherCAT technology group
- CANbus / CAN Open: Yes
- FlexRay: FlexRay consortium; discontinued

Synchronisation
- EtherCAT: Yes
- CANbus / CAN Open: Yes
- FlexRay: No

Redundancy
- EtherCAT: Ring
- CANbus / CAN Open: No
- FlexRay: Yes

Configuration / debug / monitor
- EtherCAT: Many
- CANbus / CAN Open: Vector.com
- FlexRay: Static configuration; more than 30 parameters need to be consistently configured

Tunnelling
- EtherCAT: Yes, any Ethernet protocol
- CANbus / CAN Open: No
- FlexRay: No
4 Implementation of the 1st CORBYS Demonstrator Sensor Network
4.1 Implementation of the General Purpose Network
Figure 3. General Purpose Network implementation.
The main part of the General Purpose Network will be located on the Mobile Platform chassis as illustrated in
Figure 3.
All modules connected to the General Purpose Network will use 100 Mbit Ethernet (FE) connected with cat5
cables to an Ethernet switch located on the Mobile Platform chassis. The only exception is the user interface
modules, which use Wi-Fi to connect to the General Purpose Network. The modules will be part of the same IP
sub-network. No IP routing is necessary.
It can be considered whether a managed switch should be used; a managed switch allows a specific port to be
disabled remotely. This can be useful during remote debugging when disconnecting a module is wanted.
The computer target for the different SW-modules will be specified in Deliverable 7.1 Design concept of the
demonstrator. Since the number of computers is open, the number of Ethernet switch connections is also
open. A viable assumption is that an 8-port switch is sufficient.
4.1.1 Common tools and libraries for the General Purpose Network
In order to ease integration and reuse, all modules should implement a unified way for command and data
exchange. In the following sub-sections, the need for command control interfaces, protocols, logging and
debugging is described in more detail.
4.1.1.1 CORBYS Command Interface (CCI)
The CORBYS Command Interface (CCI) should be text based. This enables the interface to be used both man-to-machine
and machine-to-machine. The former can be used for sub-module testing and as a debug interface on the User
Interface (Engineer). The latter can be used when setting up subscriptions.
It should be possible to connect to the CCI with a Telnet client over a TCP connection. Some sub-systems
may already have an existing OS console available at the standard telnet port; to avoid conflicts, it is suggested that the
CCI is standardized to use port 30000. If multiple sub-systems share a network address, each sub-system must
be addressable by using individual port numbers or another addressing method.
Multiple sessions must be supported. If a subscription is ordered, the CCI session must be held open. If the
CCI session is closed, the subscription will be cancelled. Some of the services that must be offered are given
below (a minimal sketch of a command session handler is shown after the list):
- Discover sensors
- Reset module device
- Show module information
- Show / set configuration data
- Show / set mode of operation
- Show sensor data
- Create sensor subscription
- Sub-system specific commands for control of its functions.
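The following minimal Python sketch illustrates what a text-based CCI session handler of this kind could look like, assuming plain TCP on the suggested port 30000. The command names and replies are placeholders; the actual CCI command set will be defined by the partners.

# Minimal sketch of a text-based CCI command session handler over TCP/Telnet.
# Command names and reply texts are illustrative only.
import socketserver

CCI_PORT = 30000  # suggested standardized CCI port

class CCISession(socketserver.StreamRequestHandler):
    def handle(self):
        subscriptions = []                       # subscriptions live only as long as the session
        for raw in self.rfile:                   # one text command per line
            cmd = raw.decode("ascii", "replace").strip()
            if cmd.upper() == "SHOW MODULE":
                self.wfile.write(b"module=example-sensor-node rev=0.1\r\n")
            elif cmd.upper().startswith("SUBSCRIBE "):
                subscriptions.append(cmd.split(maxsplit=1)[1])
                self.wfile.write(b"OK subscription created\r\n")
            elif cmd.upper() == "QUIT":
                break                            # closing the session cancels its subscriptions
            else:
                self.wfile.write(b"ERROR unknown command\r\n")

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("", CCI_PORT), CCISession) as srv:
        srv.serve_forever()                      # multiple concurrent sessions are supported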
4.1.1.2 CORBYS Sensor Data Protocol
The sensor data will be transported using a UDP-based protocol. Packet loss should not be an issue unless the
network is overloaded. There is no need for retransmission since the data is periodically refreshed.
A sequence number and a CRC must be supported. This will ensure that lost or garbled data is detected.
One datagram may contain multiple data samples and/or samples from different sensors. A format for data
identification must be supported. A timestamp should be provided in each packet (or possibly for each sample).
It is not decided whether it will be an open or a binary packet format.
Multicast must be considered if the network load gets too high. This can be decided at a later stage as long as
the interfaces to the generator and consumer are kept common. Multicast should be quite easy as long as the
General Purpose Network consists of one subnet; multicast through routers is more complicated.
Multicast will result in datagrams carrying more data than ordered in a single subscription, since the datagram
will end up carrying a superset of the data in all subscriptions. This is an argument for making the subscription
granularity quite coarse.
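A possible datagram layout with sequence number, timestamp and CRC is sketched below in Python. The field order and sizes are assumptions chosen for illustration; the actual CORBYS Sensor Data Protocol format is not yet decided.

# Illustrative sensor datagram with sequence number, timestamp, sensor id and CRC trailer.
import socket, struct, time, zlib

HEADER = struct.Struct("!IdHH")   # sequence number, timestamp (s), sensor id, sample count

def build_datagram(seq, sensor_id, samples):
    header = HEADER.pack(seq, time.time(), sensor_id, len(samples))
    payload = struct.pack(f"!{len(samples)}f", *samples)
    body = header + payload
    return body + struct.pack("!I", zlib.crc32(body))   # CRC32 appended as trailer

def parse_datagram(data):
    body, crc = data[:-4], struct.unpack("!I", data[-4:])[0]
    if zlib.crc32(body) != crc:
        raise ValueError("garbled datagram")             # detected via CRC, no retransmission
    seq, ts, sensor_id, n = HEADER.unpack_from(body)
    samples = struct.unpack_from(f"!{n}f", body, HEADER.size)
    return seq, ts, sensor_id, samples

# Example: send one datagram with three samples to a (hypothetical) subscriber address.
if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(build_datagram(1, 7, [0.1, 0.2, 0.3]), ("127.0.0.1", 31000))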
4.1.1.3 Logging and Monitoring
A logging device needs non-volatile storage. There are many types of data to consider in the CORBYS
setting.
Logging of sensor data
The logging can be considered a distributed service provided by each generator. In this case the generator
needs to have storage, or use a storage device available on the General Purpose Network.
All sensor data will normally be available on the General Purpose Network using the Sensor Data Protocol.
Based on this any module on the General Purpose Network can set up a subscription and store the data to a
file.
In order not to put storage requirements on the different sub-systems, it could be an idea to make a sensor log
function part of the User Interface (Engineer). During unit testing and integration, not all modules will be
available; the logging functionality should then also be possible from a general PC connected to the General Purpose
Network.
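A minimal sketch of such a logging function, runnable on any module or on a general PC connected to the General Purpose Network, is given below. The port number and file layout are illustrative assumptions.

# Minimal sketch of a sensor data logger: receives sensor datagrams on a UDP port and
# appends them to a file together with the arrival time. Port and file name are assumed.
import socket, time

LOG_PORT = 31000
LOG_FILE = "sensor_log.bin"

def run_logger():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LOG_PORT))                        # listen for subscribed sensor datagrams
    with open(LOG_FILE, "ab") as log:
        while True:
            datagram, sender = sock.recvfrom(2048)
            # store arrival timestamp, sender address and length before the raw datagram
            record = f"{time.time():.6f} {sender[0]} {len(datagram)}\n".encode()
            log.write(record + datagram + b"\n")
            log.flush()                              # persist to non-volatile storage promptly

if __name__ == "__main__":
    run_logger()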
Logging of Commands
It should be possible to log commands received over the CCI. This can be combined with the internal log, or a
separate command log can be used.
Debugging of the Real-Time Network will probably be done using dedicated tools. It could be considered whether
commands sent on the Real-Time Network should be possible to log using the Real-Time Data Server in the
real-time machine or a general PC.
Logging of Internals
It is beneficial to have the possibility to store a log outside the sub-system for non-volatile trace of internal
details. This can be a duplication of an internal non-volatile log. In order not to put storage requirements on
the different sub-systems it could be an idea to make a stream-to-file-function part of the User Interface
(Engineer). A possibility to have this log displayed in a window on the User Interface (Engineer) can be
beneficial since not all sub-systems have a local display. If the different logs have timestamps, it should be
possible to merge them into a global log sorted on timestamp or arrival.
In order to avoid overloading the system, it has to be possible to enable different debug levels (verbosity levels)
defining how much information will be put into the log.
4.1.1.4 Remote Debugging
Tools to perform remote debugging should be considered. During integration it will not be feasible to have all
partners available on site all the time. Remote debugging can be enabled by making as much as possible
available via the General Purpose Network. However, the General Purpose Network should never be directly
connected to the Internet or a corporate network due to safety and capacity reasons. A possibility can be to
have a dedicated module providing remote login with similar functionality as the User Interface (Engineer).
This module should not have any functionality needed for running the system and it should only be connected
if remote debugging is needed. Since remote debugging comes in addition to the User Interface (Engineer) all
modules must be designed to accept multiple debug interfaces.
4.2 Implementation of the Real-Time Network
[Figure not reproduced: Real-Time Network on the Mobile Platform chassis connecting the Real-Time Control (RT Machine), the FPGA Reflexive Layer, the Safety Module, and the Mobile Platform, Pelvis Link and Powered Orthosis (left and right leg) actuators, sensors and safe sensors, with a dedicated RTN safe sensor connection to the Safety Module.]
Figure 4: Real-Time Network implementation in the 1st demonstrator, gait rehabilitation system.
The main part of the Real-Time Network components will be located on the Mobile Platform chassis. The
main sensor and actuator components present on the Mobile Platform including the Powered Orthosis are
illustrated in Figure 4. As illustrated, the Real-Time Control loop and the FPGA reflexive layer will also be
physically located on the Mobile Platform. In the following paragraphs, the sensor network
implementations for the individual modules are described further.
Safety Module: All actuator commands will be routed through a safety controller. This will assure correct
operation of the actuators and limit all actuator movements to safe positions that are not harmful for the patient.
Since this module is responsible for keeping the control of the actuators and the sensors safe, it should be the
master on the safe part of the network. This module has to be a safety-compliant module. More details are given
in Section 4.2.3 below.
Powered Orthosis: All actuators for the Powered Orthosis will be located on the chassis and use push-pull
cables to the joints in the hip, knee and ankle for both legs. There will be sensors for both position and torque
on all active joints. It will be considered to have sensors on the passive joints. The final number of sensors
will be decided during implementation. There will be a need for Real-Time Network cabling to both legs.
There will probably be a double set of cabling due to the safe sensor redundancy requirements. The connectors
used for connecting the orthosis have to be specified for the high speed of the Real-Time Network signals.
Mobile Platform: The actuators for the Mobile Platform will probably be located close to, or integrated into
the wheels. All components will be located on the Mobile Platform chassis. The control of the Mobile
Platform has to follow the gait movement of the patient. It is assumed that the control will be integrated into
the Real-time Control machine. If a separate controller is used, these two controllers have to cooperate
closely, and with a clearly defined hierarchical command structure.
Pelvis Link: The actuators for the Pelvis Link will be located directly on the linear unit of the Mobile
Platform. All other components will be located on the Mobile Platform. The control of the Pelvis Link will be
integrated into the Real-time Control machine.
Real-time Control machine: This module will be the main intelligence in the generation of the gait pattern. It
controls all the actuator movements and receives corrective actions from the reflexive layer. This module
should perform safety evaluation of all commands from the General Purpose Network. This is important since
this part of the system is not considered safe. Since this module will be closing the outer control loop for all
the actuators, it should be the master of the system. For more details about the outer loop, see Section 4.2.2.
FPGA reflexive layer: This module will be located on the Mobile Platform chassis. It will listen to the sensor
data and send corrective actions to the Real-time control machine.
4.2.1 The Network Cabling
The current section is written based on the assumption that EtherCAT is chosen as sensor network technology.
The network has to be cabled with cat5 Ethernet cables or equivalent. The use of LVDS (Low Voltage
Differential Signalling) line drivers rather than standard Ethernet line drivers has to be investigated. This driver
is smaller, which may be an advantage out in the Powered Orthosis. EtherCAT is cabled using a daisy chain
going from module to module. If one cable is disconnected the communication is lost for the downstream
modules. In order to avoid this, the cabling should be made as a ring as illustrated in Figure 5. This
configuration will allow a backup path to the master even when one cable is disconnected.
The cabling should be made as a main loop only on the Mobile Platform chassis. The cables out in the legs
should be made as sub-loops. In this way, the system can be operated for test purposes without the legs
connected.
Figure 5. Redundancy obtained by using a ring cable with sub-loops (figure from the EtherCAT technology forum).
4.2.2 Outer Loop in the Gait Robot Real-Time Network
The real-time control module will keep close control of the gait pattern movements. The control is done
through the outer loop consisting of the path from the real-time control, via the network, safety controllers,
drivers, actuators and sensors, and back to the real-time control, as illustrated in Figure 6.
[Figure not reproduced: outer loop from the real-time control over the Real-Time Network and the safety controller to the driver and actuator, and back via the sensor and the real-time network safe command path.]
Figure 6. Real-time outer loop
The total round trip delay in the loop should be kept below 1-2 milliseconds; higher delays may cause
problems with the stability of the outer loop. These delay times are not a problem for EtherCAT as a
technology, but the introduction of the safety controller may increase the delay. This will have to be
investigated further by the partners during the implementation stages.
4.2.3 Safety Module
The Safety Module shall handle the low-level safety levels, designated as safety levels 0 and 1. This can
probably be a standard component that is already safety certified. The fact that the commands to actuators and
the safe sensor readings are routed through the Safety Module will introduce additional delay in the outer
loop. This has to be given attention during implementation. The location of the safety functions is illustrated
in Figure 7.
Figure 7: Safety functions in the Real-Time Network
Safety 0 shall assure that the actuators are moving to the commanded position at the correct speed. This is
checked by reading the safety sensors giving absolute position. This should use standard components that are
already safety certified. The communication between the safety encoders and the safety controller is done
using a safe bus (RTN safe sensor). This will probably be a dedicated bus; details have to be studied further
during implementation. If safety 0 detects that the actuators are not moving as commanded, it shall perform
a safety power shutdown. The power is then cut for all the actuators. Actuators having brakes will then be
locked in their position. For actuators without brakes it should be possible to move them. This has to be
specified during implementation.
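As an illustration of the safety 0 logic described above, the following sketch (hypothetical C, not a certified safety implementation) compares the commanded position and speed with the absolute readings from a safety encoder and triggers a power shutdown when the deviation exceeds configurable tolerances. The tolerance values and function names are assumptions for illustration only.

    /* Hypothetical safety 0 check: commanded motion vs. safety encoder reading.  */
    /* Tolerances, types and the cut_actuator_power() call are illustrative only. */
    #include <math.h>
    #include <stdio.h>

    #define POS_TOLERANCE_RAD   0.05   /* assumed position tolerance */
    #define SPEED_TOLERANCE_RPS 0.10   /* assumed speed tolerance    */

    struct actuator_state {
        double commanded_pos, commanded_speed;  /* from Real-Time Control                */
        double measured_pos,  measured_speed;   /* from safety encoder (RTN safe sensor) */
    };

    static void cut_actuator_power(void)
    {
        /* stub: in the real system this would cut the power for all actuators */
        printf("safety 0: power shutdown triggered\n");
    }

    void safety0_check(const struct actuator_state *s)
    {
        if (fabs(s->commanded_pos   - s->measured_pos)   > POS_TOLERANCE_RAD ||
            fabs(s->commanded_speed - s->measured_speed) > SPEED_TOLERANCE_RPS)
            cut_actuator_power();
    }

    int main(void)
    {
        struct actuator_state s = { 1.00, 0.50, 0.80, 0.50 };  /* position deviates */
        safety0_check(&s);
        return 0;
    }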
Safety 1 shall assure that the actuators are moving within safe positions. A safe operating area for all actuators
has to be defined as part of the design. All commands from Real-Time Control and the Reflexive layer will be routed
to safety 1. If the commands are within the safe operating area, they shall be passed on to the actuators. If they
are outside, they shall be limited to the safe value closest to the commanded value. The command with the safe
value will be sent to the actuator using a safe bus (RTN safe command). The idea with this bus is that only the
safety controller can generate commands. In this way, the other modules cannot bypass the Safety Module.
The Real-Time Control needs to be informed when the Safety Module limits the actuator commands.
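A minimal sketch of the safety 1 limiting behaviour is given below (hypothetical C; the limits and names are assumptions). Commands inside the safe operating area pass through unchanged, while commands outside are limited to the closest safe value, and a flag indicates to Real-Time Control that limiting took place.

    /* Hypothetical safety 1 limiting of actuator commands to the safe operating area. */
    /* The limits and the 'limited' feedback flag are illustrative assumptions.        */
    #include <stdio.h>

    struct safe_operating_area { double min_pos, max_pos; };

    /* Limit a commanded position to the safe area; set *limited when limiting occurs. */
    double safety1_limit(double commanded, struct safe_operating_area soa, int *limited)
    {
        *limited = 0;
        if (commanded < soa.min_pos) { *limited = 1; return soa.min_pos; }
        if (commanded > soa.max_pos) { *limited = 1; return soa.max_pos; }
        return commanded;
    }

    int main(void)
    {
        struct safe_operating_area knee = { -0.1, 1.2 };  /* assumed joint limits [rad] */
        int limited;
        double out = safety1_limit(1.5, knee, &limited);
        printf("command sent on RTN safe command: %.2f (limited=%d)\n", out, limited);
        return 0;
    }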
Safety 2 shall assure that all commands from the General Purpose Network are safe and reasonable, e.g. that the
speed is not too high and the direction is according to the program. This is the main gate for all commands entering the safe
part of the system. All commands have to be specified within safe value ranges.
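In contrast to the limiting done by safety 1, safety 2 acts as a gate. The following hypothetical C fragment (value ranges and field names are assumptions) checks whether a command arriving from the General Purpose Network lies within its specified safe value ranges and rejects it otherwise.

    /* Hypothetical safety 2 gate for commands from the General Purpose Network. */
    /* The value ranges and the command fields are illustrative assumptions.     */
    #include <stdio.h>

    struct gpn_command { double speed; double direction_deg; };

    /* Return 1 if the command is within the specified safe ranges, otherwise 0. */
    int safety2_gate(const struct gpn_command *cmd)
    {
        const double max_speed         = 0.8;   /* assumed maximum speed            */
        const double max_direction_dev = 15.0;  /* assumed allowed direction range  */
        return cmd->speed >= 0.0 && cmd->speed <= max_speed &&
               cmd->direction_deg >= -max_direction_dev &&
               cmd->direction_deg <=  max_direction_dev;
    }

    int main(void)
    {
        struct gpn_command cmd = { 1.2, 5.0 };  /* speed too high */
        printf("command %s\n", safety2_gate(&cmd) ? "accepted" : "rejected");
        return 0;
    }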
Safe bus concept for safety 0: In order to assure that the actuators are in the commanded position, there have
to be two redundant paths for the information. If one path fails completely, there must be an alternative
source of information. To achieve this, there have to be two separate buses: one communicating with the
actuators and another communicating with the safety sensors. One of the buses can be realized by using
Safety over EtherCAT layered on top of the Real-Time Network, as illustrated in Figure 8.
Figure 8. Real-Time Network implementation. Alternative using safety over EtherCAT
Safe bus concept for safety 1: In order to assure that the actuators are never commanded to a position outside
the safe operating area, only the Safety Module must be able to send commands to the actuators. This can
either be a separate bus to which only the Safety Module and the actuators are connected, or a safety layer on an
existing bus assuring that commands to the actuators can only be initiated from the Safety Module.
The latter can be achieved by Safety over EtherCAT.
4.2.4 Real-Time Network Data Format
All objects communicated on the Real-Time Network have to be specified. This is done with a dedicated tool
which stores the complete set of objects in an XML file. This file is deployed throughout the network to all
masters, slaves and test tools. The partners have to agree on a common tool, so that when the different
partners do their module testing, they can use common data object definitions. The result is a consistent
interface description. The data formats will be specified among the partners in the early stages of the
demonstrator implementation.
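Purely as an illustration of what such a shared object definition eventually maps to (the actual object dictionary will be produced with the chosen tool and agreed among the partners), a process data object for one joint actuator could correspond to a fixed binary layout such as the following C sketch. All field names, types and scalings are assumptions.

    /* Hypothetical process data object for one joint actuator on the Real-Time  */
    /* Network. The field names, types and scaling are illustrative assumptions; */
    /* the real layout will come from the common XML object definitions.         */
    #include <stdint.h>
    #include <stdio.h>

    #pragma pack(push, 1)               /* fixed, padding-free layout on the wire */
    struct joint_command_pdo {
        uint16_t object_index;          /* identifies the data object              */
        int32_t  target_position_mdeg;  /* commanded position, millidegrees        */
        int32_t  target_velocity_mdps;  /* commanded velocity, millidegrees/second */
        uint16_t control_word;          /* enable, fault reset, etc.               */
    };
    #pragma pack(pop)

    int main(void)
    {
        struct joint_command_pdo cmd = { 0x6040, 45000, 10000, 0x000F };
        printf("PDO size on the wire: %zu bytes\n", sizeof cmd);
        return 0;
    }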
When working with real-time networks, efficient tools for logging and debugging are essential. Table 6 gives
a summary of software and hardware tools that can be considered.
Table 6. Summary of logging and debugging tools that may be used for EtherCAT real-time networks

Software tools:
- Port EtherCAT Device Monitor: EtherCAT master tool for inspection and configuration of an EtherCAT network. It offers scripting ability to access services and write test applications.
- Ackermann EtherCAT Data Logger: Supports continuous logging of data from EtherCAT slaves; data can be saved to files. GUI view of data. Includes an EtherCAT master.
- Acontis AT-ECLyser – EtherCAT Diagnosis: EtherCAT system 'health' analysis. Linked to the AT-EM EtherCAT Master Stack.
- Wireshark: Open-source Ethernet packet analyser with an EtherCAT module.
- KPA Studio EtherCAT: EtherCAT configuration and diagnostics tool: trigger and trace, measurement of connection quality, event handling.
- KPA EtherCAT Slave Tester: Software toolkit for script-based automated testing of EtherCAT slaves, regression tests etc.

Hardware tools:
- Beckhoff Multi-Channel Probe ET2000: 8-channel time-stamped recording of EtherCAT data; data can be analysed with Wireshark or an EtherCAT analyser.
- National Instruments NI 9144 with LabVIEW Real-Time Module: The NI 9144 is an 8-slot EtherCAT I/O chassis for CompactRIO, PXI, or industrial controller systems. Integrated with LabVIEW software.
- Hilscher NANL-C500-RE netAnalyzer PCI card (Real-Time Ethernet): 4-port PC card capturing two EtherCAT connections bidirectionally with time stamps. Plug-ins for Wireshark are available.
- Iba ibaBM-eCAT: Bus monitor for EtherCAT lines. Data transmitted to ibaPDA.
To ensure that individual sensors and actuators operate correctly on the Real-Time Network, each device must
be tested individually before it is integrated on the Real-Time Network. The topology and configuration of
each branch of the Real-Time Network must be verified and tested using an EtherCAT test tool. A test tool
recommendation will be prepared.
It has to be taken into consideration that there probably will be more than one EtherCAT bus in the
demonstrator due to the safe network for commands and sensors. There may be scenarios where there is a
need to monitor several networks simultaneously. Simulation of input and logging of output will be needed
during module testing.
5 CORBYS 2nd Demonstrator Needs and Implementation of Sensor Network
The evaluation of the generic characteristics of the CORBYS cognitive techniques developed will be done on
the 2nd demonstrator representing the robotic system for investigation of contaminated or hazardous
environments. This robotic system is named RecoRob.
5.1 CORBYS 2nd Demonstrator Implementation
The demonstrator is a mobile outdoor robotic system, which was designed for handling samples in
unstructured hazardous/contaminated environments. The basic system concept is depicted in Figure 9.
Figure 9. RecoRob user interaction (left) and Mobile Platform concept (right).
RecoRob has been developed by UB within a German national project “A Mobile Reconnaissance Robot for
Investigation of Hazardous Environments - RecoRob” (Hericks et al., 2011). The main focus of the German
project is development of robot skills for autonomous sample drawing using dexterous manipulation while the
robot is remotely navigated by a human operator.
5.2 System Setup
The RecoRob system is equipped with various hardware devices as shown in Figure 10.
Figure 10. The RecoRob hardware setup (left) and the mapped Virtual Reality (MVR) used as simulation environment in the RecoRob system (right).
RecoRob is based on a mobile ASENDRO platform from Robowatch (Robowatch Industries Ltd.), which has a
variable drive system equipped with chains. In addition, the chains are supported by swing arms which are
capable of continuous rotation, enabling the base to climb stairs and overcome obstacles. The system
components to be actuated are: a SCHUNK 7DOF lightweight robot arm for object manipulation and a 2DOF
Pan-Tilt-Head (PTH) for steering the vision system. The sensor system consists of an ATI Technologies
Force-Torque sensor in the manipulator's wrist, a PointGrey Bumblebee XB3 stereovision camera for 3D
environment reconstruction, a Samsung SNC Dome Camera for workspace observation and an NEC
ThermoTracer IR camera for thermal inspection of the working area. For computational power, there are two
SPECTRA NISE 3140P2E industrial computers, and for communication a D-Link DAP-2590 Access Point is
installed. Additionally, a NAVILOCK NL-302U GPS receiver is utilized for the intended navigation and
localization purposes.
User (operator) interaction is realized with a Getac M230N-5 rugged notebook, including Wireless LAN and a
touch screen for intuitive user (operator) input.
In order to ensure the power supply of the computers and cameras, two Hellpower 24V/10Ah accumulators are
mounted in addition to the main accumulator inside the mobile base. To guarantee that the system achieves its
main objectives even in the case of low batteries, a power management system has been developed that allows the
energy flow to be monitored. When the battery is low, all hardware devices can be switched off except the computer
for communication and the propulsion system.
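Purely as an illustration of this shutdown policy (the actual RecoRob power management implementation is not described in this document), a simplified version could look like the following C sketch; the voltage threshold and the device list are assumptions.

    /* Hypothetical sketch of the low-battery shutdown policy described above. */
    /* Threshold, device names and the power-off call are illustrative only.   */
    #include <stdio.h>

    #define LOW_BATTERY_THRESHOLD_V 22.0   /* assumed threshold for a 24 V pack */

    struct device { const char *name; int essential; };  /* essential devices stay on */

    void apply_power_policy(double battery_voltage, struct device *devs, int n)
    {
        if (battery_voltage >= LOW_BATTERY_THRESHOLD_V)
            return;                                        /* battery OK: nothing to do */
        for (int i = 0; i < n; ++i)
            if (!devs[i].essential)
                printf("switching off %s\n", devs[i].name);  /* stub power-off */
    }

    int main(void)
    {
        struct device devs[] = {
            { "communication computer", 1 },  /* must stay on */
            { "propulsion system",      1 },  /* must stay on */
            { "stereo camera",          0 },
            { "thermal camera",         0 },
            { "robot arm",              0 },
        };
        apply_power_policy(21.5, devs, 5);
        return 0;
    }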
5.3 CORBYS 2nd Demonstrator Software Architecture
The software architecture of the demonstrator has a hierarchical structure with predefined interfaces for all its
modules.
The system is equipped with two computers on board and one notebook for remote control and user (operator)
interaction. On the system side, one computer uses the Windows operating system (OS) and the other uses a
Unix OS. The different operating systems are presently necessary due to the available drivers for the
hardware. Windows is required for the camera modules and performs the compression of the video streams.
The Unix computer controls the platform, the manipulator and the pan-tilt-head of the camera system as well
as the management of the sample containers. In the long term it is planned to use one OS only.
Communication with the system is performed using a proprietary command format via a TCP/IP connection.
For system control, the user connects and sends the commands to a server on the Windows computer where
they are either processed directly or forwarded to the UNIX computer, depending on the required action.
The command set supports retrieving basic status information of all system components, the transmission of
video streams and the invocation of actions such as controlling the actuators of the system. For identification and
classification, all commands have a header containing a globally unique identifier (GUID) and information about
the sender, the receiver and the actual command.
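The exact wire format is proprietary and not specified in this document; purely as an illustration of the header concept described above, a command header of this kind could be modelled as in the following C sketch. All field names, sizes and example values are assumptions.

    /* Hypothetical illustration of a RecoRob-style command header.          */
    /* The actual proprietary format is not specified here; all field names, */
    /* sizes and the example values are assumptions.                         */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    struct command_header {
        uint8_t  guid[16];        /* globally unique identifier of the command */
        uint16_t sender_id;       /* e.g. operator notebook                    */
        uint16_t receiver_id;     /* e.g. Windows or Unix on-board computer    */
        uint16_t command_code;    /* e.g. status request, video stream, motion */
        uint32_t payload_length;  /* length of the command-specific payload    */
    };

    int main(void)
    {
        struct command_header hdr;
        memset(&hdr, 0, sizeof hdr);
        hdr.sender_id    = 1;       /* assumed ID of the operator notebook */
        hdr.receiver_id  = 2;       /* assumed ID of the Windows server    */
        hdr.command_code = 0x0010;  /* assumed 'get status' code           */
        printf("header size: %zu bytes\n", sizeof hdr);
        return 0;
    }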
Hardware control is handled by separate server software modules for each hardware component. They are
connected to the main system via the CORBA framework (Henning and Vinoski, 1999), which allows the connection
of software elements that may be executed on different computers inside a network. This approach enables
maximum flexibility and scalability concerning the embedding of hardware components. In order to avoid
uncontrolled hardware activities, all hardware servers are supervised so that continuous movements of the
system components are stopped once the connection to the remote computer is interrupted for too long.
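As an illustration of this supervision behaviour (not the actual CORBA server code), the following C sketch shows a simple connection watchdog that stops ongoing movements when no message has been received from the remote computer within a timeout. The timeout value and the function names are assumptions.

    /* Hypothetical connection watchdog for a hardware server (illustrative only). */
    /* stop_movements() and the 500 ms timeout are assumptions, not CORBYS values. */
    #include <stdio.h>
    #include <time.h>

    #define CONNECTION_TIMEOUT_S 0.5     /* assumed maximum silence before stopping */

    static double now_seconds(void)
    {
        struct timespec t;
        clock_gettime(CLOCK_MONOTONIC, &t);
        return t.tv_sec + t.tv_nsec / 1e9;
    }

    static void stop_movements(void)
    {
        /* stub: in the real system this would halt platform, arm and pan-tilt-head */
        printf("watchdog: connection lost, stopping continuous movements\n");
    }

    struct watchdog { double last_message_time; };

    void watchdog_message_received(struct watchdog *w) { w->last_message_time = now_seconds(); }

    void watchdog_poll(struct watchdog *w)
    {
        if (now_seconds() - w->last_message_time > CONNECTION_TIMEOUT_S)
            stop_movements();
    }

    int main(void)
    {
        struct watchdog w = { now_seconds() };
        watchdog_message_received(&w);   /* a command arrived from the notebook  */
        watchdog_poll(&w);               /* within the timeout: nothing happens  */
        return 0;
    }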
5.4 Integration into the General Purpose Network
The Human Sensory System and the Cognitive System are planned to be used for both CORBYS
demonstrators. This is described in further detail in the usage scenarios presented in Deliverable 2.2.
Figure 11 shows a high level sketch of the RecoRob Software architecture linked into the General Purpose
Network with common modules.
Figure 11. General Purpose Network implementation for RecoRob used as CORBYS 2nd demonstrator. The lower switch
interface indicates the link to the RecoRob software architecture.
The original system setup involves three computers: two on the Mobile Platform and one notebook for
user interaction. The communication between them is an important point to consider. It has yet to be decided
whether the new functions shall be executed on the existing computers.
A TCP/IP network is already implemented. The challenge will probably be to get the new functions to
communicate with the existing functions. It also has to be investigated whether the communication system for
RecoRob can be reused for the CORBYS gait rehabilitation system; this would ease the use of the same
SW modules in both demonstrators.
5.5 The Real-Time Network
The 2nd demonstrator is realized using the CAN bus, whereas the 1st CORBYS demonstrator plans to use
EtherCAT. Due to this difference, there will not be any reuse of the Real-Time Network between the two
demonstrators.
6 Conclusions
The sensor network aspects of the CORBYS cognitive robot control framework have been presented. In
CORBYS, the sensor network can be separated into a time-critical, safety-critical real-time network and a
general purpose network that deals with the system interaction with the cognitive modules, the user interfaces
and the Human Sensory System.
For the 1st CORBYS demonstrator, the mobile robotic gait rehabilitation system, EtherCAT is considered to
be the preferred technology for the Real-Time Network. EtherCAT will therefore be analysed further and
test-implemented to verify that this technology is an acceptable alternative that meets the CORBYS
requirements.
For the reconnaissance robot system to be used as the 2nd CORBYS demonstrator, the real-time sensor
network has already been implemented based on CAN technology.
The CORBYS partners will collaborate to define the sensor network protocols in full detail during the
development of the demonstrator components and in the integration activity.
7 References
Beckhoff Automation GmbH, http://www.beckhoff.com/ [Accessed 31 Oct 2011]
EtherCAT technology group. http://www.ethercat.org/ [Accessed 28 Oct 2011]
FlexRay Consortium. http://www.flexray.com/ [Accessed 28 Oct 2011]
Henning, M., Vinoski, S. (1999). Advanced CORBA Programming with C++. Addison-Wesley Professional,
1999. 1120 pp. ISBN 0201379279
Hericks, M., Krebs, U., Kuzmicheva, O., (2011). A Mobile Reconnaissance Robot for Investigation of
Dangerous Sites, the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS
2011), San Francisco, California, 2011.
Kawamura, K., Gordon, S. M., Ratanaswasd, P., Erdemir, E., Hall, J. (2008). Implementation of Cognitive
Control for a Humanoid Robot, International Journal of Humanoid Robotics, Vol. 5, No. 4 (2008) 547-586
National Instruments LabView: http://www.ni.com/labview/ [Accessed 31 Oct 2011]
Robowatch Industries Ltd. http://www.robowatch.de [Accessed 20 July 2011]