Intel® Technology Journal: Enabling Healthcare in the Home


$49.95 US
Copyright © 2009 Intel Corporation. All rights reserved. Intel, and the Intel logo, are trademarks of Intel Corporation in the U.S. and other countries.
SEPTEMBER 2009
Enabling Healthcare in the Home
vol 13 | issue 03 | SEPTEMBER 2009
ISBN 978-1-934053-23-2
More information, including current and past issues of Intel Technology Journal, can be found at:
http://developer.intel.com/technology/itj/index.htm
Intel® Technology Journal
About the Cover
Enabling Healthcare in the Home is the theme of
the Intel Technology Journal, Volume 13, Issue 3.
The physician in the foreground (1) is remotely
performing a house call. The doctor is able to see
the patient (2) and to obtain diagnostic information,
such as blood pressure and pulse vitals. The patient
is able to remain at home, which is helpful and
efficient for those with ambulatory or transportation
issues. Next door (3) is an elderly person who can
safely live independently because the home is outfitted with sensors (the yellow spots) that monitor
motion. Family caregivers located elsewhere can
be aware that this individual is performing routine
tasks. Upstairs is (4) someone sleeping. Sensors
measure nighttime activity, which can be an indicator of health risk. Also
upstairs (5) is a child or an elderly person for whom reading is difficult.
The child or elder can photograph a page and listen to the words. Thus,
technology can enable healthcare in the home for all ages.
Intel® Technology Journal | Volume 13, Issue 3, 2009
Intel Technology Journal
Publisher
Richard Bowles
Managing Editor
David King
Content Architects
Doug Busch
Eric Dishman
Program Manager
Stuart Douglas
Technical Editor
Marian Lacey
Technical Illustrators
Richard Eberly
Margaret Anderson
Technical and Strategic Reviewers
Steve Agritelley
Alan Boucher
Doug Bugia
Doug Busch
Eric Dishman
Farzin Guilak
Kristina Kermanshasche
George Korinsky
Brad Needham
Kevin Rhodes
Intel Technology Journal
Copyright © 2009 Intel Corporation. All rights reserved.
ISBN 978-1-934053-23-2
ISSN 1535-864X
Intel Technology Journal
Volume 13, Issue 3
No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical,
photocopying, recording, scanning or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either
the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center,
222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4744. Requests to the Publisher for permission should be addressed to the
Publisher, Intel Press, Intel Corporation, 2111 NE 25th Avenue, JF3-330, Hillsboro, OR 97124-5961. E-mail: [email protected].
This publication is designed to provide accurate and authoritative information in regard to the subject matter covered. It is sold with the understanding
that the publisher is not engaged in professional services. If professional advice or other expert assistance is required, the services of a competent
professional person should be sought.
Intel Corporation may have patents or pending patent applications, trademarks, copyrights, or other intellectual property rights that relate to the
presented subject matter. The furnishing of documents and other materials and information does not provide any license, express or implied, by estoppel
or otherwise, to any such patents, trademarks, copyrights, or other intellectual property rights.
Intel may make changes to specifications, product descriptions, and plans at any time, without notice.
Third-party vendors, devices, and/or software are listed by Intel as a convenience to Intel’s general customer base, but Intel does not make any
representations or warranties whatsoever regarding quality, reliability, functionality, or compatibility of these devices. This list and/or these devices may be
subject to change without notice.
Fictitious names of companies, products, people, characters, and/or data mentioned herein are not intended to represent any real individual, company,
product, or event.
Intel products are not intended for use in medical, life saving, life sustaining, critical control or safety systems, or in nuclear facility applications.
Intel, the Intel logo, Celeron, Intel Centrino, Intel Core Duo, Intel NetBurst, Intel Xeon, Itanium, Pentium, Pentium D, MMX, and VTune are
trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries.
†Other names and brands may be claimed as the property of others.
This book is printed on acid-free paper.
Publisher: Richard Bowles
Managing Editor: David King
Library of Congress Cataloging in Publication Data:
Printed in the United States
10 9 8 7 6 5 4 3 2 1
First printing September 2009
INTEL® TECHNOLOGY JOURNAL
Improving Healthcare Through Technology
and Innovation
Articles
Foreword
Powering Healthcare Visions: Taking Advantage of Complexity, Connectivity, and Consumerism
From People to Prototypes and Products: Ethnographic Liquidity and the Intel Global Aging Experience Study
Adapting Technology for Personal Healthcare
Healthcare IT Standards and the Standards Development Process: Lessons Learned from Health Level 7
Healthcare Information Integration: Considerations for Remote Patient Monitoring
Personal Health Device Interoperability
A Common Personal Health Research Platform — SHIMMER™ and BioMOBIUS™
Gathering the Evidence: Supporting Large-Scale Research Deployments
Assistive Technology for Reading
Developing Consumer and IT Platforms in a Regulated Medical Device Environment
Foreword
Douglas F. Busch
Vice President
Chief Technology Officer
Digital Health Group
Intel Corporation
Eric Dishman
Intel Fellow
Digital Health Group
Intel Corporation
Fundamental medical science has made enormous strides over the past several
decades, and the length and quality of life in developed countries has improved
remarkably. However, trends in demographics, lifestyles, and treatment
practices are making our healthcare systems unsustainable. Improving access
to healthcare and controlling healthcare costs are two of the most contentious
current issues in United States politics, and are a huge concern in most other
countries also. Better use of technology in healthcare is an obvious strategy to
help solve these problems by applying innovation to how and where we deliver
care.
Science and technology have been applied aggressively to clinical diagnostics
and treatment. Diagnostic imaging, advanced surgical techniques, and
sophisticated pharmaceuticals have all had strong positive benefits. We have not
yet seen the benefits of applying everyday technologies that we take for granted
in other aspects of our lives, though. Much attention is now being given to
applying information technology to record keeping and communication in
clinical settings. The recent American Recovery and Reinvestment Act (part
of the 2009 economic stimulus package) focuses strongly on deployment and
meaningful use of Electronic Medical Records.
This is not enough. To really impact the availability and cost of healthcare, and
improve the quality of our citizens' lives, we need to shift some of the focus of
healthcare from the clinical setting to the home. Just as much of banking has
moved from marble lobbies filled with tellers to a self-service model enabled
by technology, we can use readily-available technologies to move some aspects
of healthcare from clinical settings to the home. We can also use technology to
allow our aging population to stay in their homes longer, living independently.
Both of these approaches have the potential to improve quality of life and
reduce the cost of providing the healthcare and living assistance our population
needs. We need to really understand how to use technology effectively for these
purposes, and we need to adapt common technologies for successful use in
home settings. We need to make home health technology simple, unobtrusive,
and low cost, while protecting the security, privacy, and safety of our most
vulnerable family members.
In this issue of the Intel Technology Journal, we hope to provide a broad
perspective on the use of technology for delivering healthcare at home.
Dr. Mike Magee, a noted physician, speaker, and author, describes a vision for
lifelong healthcare delivered at home. Intel researchers describe their work and
some of the conclusions reached over several years gathering ethnographic data
and conducting pilot studies of home-based sensor networks. The leaders of
two major health standards organizations describe the approaches being taken
to create interoperability of data and devices for healthcare. Lessons learned
about integrating healthcare data systems are shared by a team developing
health data middleware. Intel’s experience entering the medical device
product development industry is summarized by the leaders of our product
development groups. The design team responsible for building an assistive
device for reading disabilities shares their approach and learned lessons.
Intel is both a long-standing member of the aging research community and a
new participant in the medical device community. We hope that by sharing our
insights, and those of our collaborators, we can encourage the development of
powerful, easy-to-use technology for home healthcare and independent living.
We hope that we can also stimulate investment in — and innovation for —
home-based care industries through broad-based R&D collaboration, standards
development, and political advocacy to ignite this important marketplace. Our
future, and that of our children, depends on dramatically improving access
to healthcare and reducing cost, which we believe is achievable if we can “go
home again” by using technology and innovation to reimagine how we deal
with the epidemics of chronic disease and injury that we all face from the
enormous success of our global age wave.
Powering Healthcare Visions: Taking Advantage of
Complexity, Connectivity, and Consumerism
Contributors
Mike Magee, MD
Opinion
Over the past ten years, the World Health Organization and other groups have
been actively engaged in trying to answer the question “What is health?” A
large part of this thought process has involved defining what health is not. It is
not the healthcare system. It is not the reactive elimination of disease. It is not
a simple commodity to be weighed against all other commodities in society. It
is different from these things, and it is more than these things.
Health is universal and common to the people of the world, independent of
geography, race, income, gender, and culture. Health is an active state of well-being that encompasses mind, body, and spirit. It is the capacity to reach one's
full human potential, and, on a larger scale, the key that unlocks a nation’s full
productivity and potential for development.
The Modern Health World: Shaped By Intersecting
Megatrends
To lead and innovate in today’s healthcare world requires a modern vision
that is based on a highly strategic understanding of the major intersecting and
mutually reinforcing elements shaping health worldwide. Among these are
aging demographics, the growth in home-based caregivers, the dual burden
of disease, the Information Revolution, and consumer empowerment and
activism.
For most people involved in healthcare, aging equals numbers (how many
people are over 65, how many are over 85, and so on). But aging is not
about the aging population only. It is more complex than that. Fifty percent
of current sixty-year olds in the United States have a parent alive, and by
2030, there will be over a million United States citizens over 100. Before our
eyes, the American family is growing from three generations to four and five
generations. In 25 percent of American homes, a non-professional, family
caregiver works without pay and without support to manage frail parents and
grandparents on the one hand, and immature children and grandchildren on
the other. (I call these “informal” caregivers, as opposed to “formal” caregivers,
those who are trained and paid for what they do.) Almost all of these caregivers
are third-generation women, ages 40 to 70. When we look at the whole
question of providing healthcare, we must address these family dynamics
efficiently and effectively, and our solutions must bring support and relief to
beleaguered home-based informal caregivers.
Coincident with an aging society is the Internet, an incredible tool. When
combined with ever-expanding computing power, remarkably innovative
processor technologies, broadband, wireless, and converging powerful
applications pushing massive amounts of information at lightning speed, the
Information Revolution gives us a glimpse of what is possible in a modern
world in terms of health and healthcare. The true significance of the Internet,
however, in this context, is that it ignores geography, and in doing so, breaks all
the rules governing communication: who gets what information from whom,
and for what purposes.
Regina Herzlinger at Harvard Business School first suggested a consumer-driven healthcare model nearly 30 years ago. She and others at the time saw
a disconnect between those who paid for the care and those who consumed
it. It was clear to all that consumers of healthcare were often at a disadvantage
because they lacked adequate information to support decisions. Three decades
later, with the support of new information technologies, we see a very different
reality. Doctors, nurses, and hospitals seem to agree that “the best patient is
an educated patient.” They acknowledge that paternalism must give way to
partnership. It must be teams over individuals. There must be mutual decision
making. Most people would agree that these are all good things. However, our
access to health information is leading us into health activism, led mostly by
our informal family caregivers who labor as both providers and consumers of
healthcare in nearly 25 percent of all American homes: they have no formal
support or even acknowledgment from healthcare providers. For them, lack of
healthcare information is not what’s literally killing them. Lack of a support
system is killing them.
Finally, an aging population, the Information Revolution, and the consumer
health movement intersect and reverberate within the context of a global
interconnected world community. High-speed travel, overnight delivery,
cultural integration, the rapid spread of inter-species microbes, and the
mass marketing of products, good and bad, have created a wide range of
health challenges. The developing nations struggle with warfare, migration,
famine, child mortality, and infectious diseases, while the developed world
is overwhelmed with largely preventable chronic diseases. As the developing
world exports the next influenza virus, the developed world returns the favor
by marketing tobacco and the American diet to developing nations. I believe
that together we share a dual burden of disease and a range of environmental
threats to our health from global warming to scarcity of clean, safe water.
Rather than fighting these accelerating trends that are connecting our
world, opening up access to health information, but also making our world
increasingly complex, we must embrace them. Our vision must be sufficiently
powered to take advantage of complexity, connectivity, and consumerism.
The Power is Shifting
Clearly, healthcare reform is in the air. Whether such reform can positively
affect our health, however, remains to be seen. For true reform to take place,
powerful visions must be joined by cross-sector innovation to serve, humanize,
and realize future human potential. I present here five visions sufficiently
powered to guide a true transformation in our health and to encourage cross-sector collaboration among traditional healthcare organizations and technology
leaders, such as Intel, and home health caregivers with existing positions in
American homes.
Vision 1: Home-Centered Healthcare
Nearly 100 percent of the assets we currently include in our definition of the
healthcare system have little to offer us, in their current form, that would
help in the build-out of a truly preventive healthcare system. I am including
assets such as the bricks and mortar of our hospitals and our patient offices;
our human resources as embodied in our training, roles, responsibilities,
and payment incentives; our educational curricula; and our continuously
reengineered processes targeted at in-patient safety and efficiency. These assets
are original, or second or third iterations, of a century-old interventional care
system that stubbornly survives, largely in its original form, because we have
been unable to create or manage a more inclusive and anticipatory healthcare
system.
Prevention of illness is grounded in education and behavioral modification.
It begins before birth and extends beyond death. Let me explain: the parents’
nutritional status as well as their use of alcohol, drugs, and cigarettes affects
fertility, as well as conception and in-utero development. On the other end of
one’s life, friends and family predictably suffer grief and anxiety at the death
of an individual, and these friends and family can often benefit from formal
bereavement support services.
To be successful, therefore, a preventive healthcare system must take advantage
of multi-generational relationships to provide multiple, repetitive inputs in
real-time that allow micro-adjustments in a person’s daily life. Such a system
demands intimately informed, highly motivated, and deeply committed
individuals willing to gently prod those under their charge toward health and
wellness. I believe that the goals of a preventive healthcare system require
guiding hands and a pervasive presence across several sectors: family and
community linkages, and the ability to efficiently lay out lifecycle plans
and execute lifespan management on the one hand, and ensure adherence
to palliative treatment plans for patients with chronic disease on the other.
Realizing this vision of a truly preventive healthcare system will require an
integrated approach to care delivery, supported by cross-sector collaboration.
Beginning with New Technology
In the build-out of this new healthcare system, there is only one location that
is both geographically identifiable and politically viable as a candidate: the
home. Most Americans have a very favorable view of the home and its value
to quality of life. As a result, politicians frequently embrace the imagery of
“home” and are likely to be willing to invest (through legislation and other
means) in the creation of healthier homes. While the home may be where the
heart is, it is most certainly not currently where the health is, and in this way
it is the optimum site for constructive change to take place in our health and
healthcare.
I first began to think about the “moldability” of the home when I saw a
remarkable exhibit at the World's Fair in New York in 1964. As a 16-year-old,
I remember sitting in the revolving theatre and witnessing the marvels of
technology. The audience was transported to what life was like in 1904 and
then brought forward through the decades to 1964, and then 1984, 20 years
into the future, showcasing how technologies and products helped to improve
our lives and would continue to improve our lives in the future. The exhibit
convinced me that toasters and refrigerators had truly improved our lives. Can
we leverage technology today to do the same for our health? America views
homelessness as a social failure, and we have begun to view “healthlessness” as
a social failure also. It doesn’t have to be that way. If we can leverage all kinds
of technology, from diagnostic and imaging to entertainment and financial, to
equitably re-outfit and at least partially improve the health of people in their
homes, I believe that we can efficiently re-center our healthcare system in the
home.
Moving into the Home
The primary health information loop in our current healthcare system extends
from a hospital to a physician’s office and back. The home is an after-thought.
If you wish to obtain healthcare, you must leave your home and enter the
hospital-office loop. What if technology allowed us to change the trajectory of
that loop so that it goes from home to care team and back to home?
In this new approach, informal caregivers would become fully enfranchised
members of physician-led and nurse-directed care teams. These family
caregivers would not only be linked virtually to their multi-generational
families and to their care teams, but also to other informal family caregivers,
thereby effectively addressing the profound sense of isolation that comes with
the informal caregiver role. A wide range of secondary loops would evolve from
generalist to specialist, from clinician’s office to hospital, and from care team to
insurer or pharmacy. However, the primary loop, where data would originate
and from which privacy access would be granted, would be home-centered.
The data flowing out of the home of the future would be rich, varied, real-time, and widely available. It would include vital signs and diagnostic and
imaging results sent wirelessly to care teams. The healthy home would also
have ubiquitous, low-cost sensors that could track motions, actions, and
interactions. The data produced by these sensors would be interpreted by
artificial intelligence software and measured against predicted healthy living
plans. The results would be fed in a continuous stream to the care team. The
feedback loop would consist of a human team partner communicating through
a friendly interface of one’s choice, such as a wristwatch, phone, radio, TV, or
computer: this interface would act as a guide and companion. For example, the
interface might remind the person to bathe if he or she had forgotten to do so;
to increase fluid; to alter diet or to exercise; to take medication or vary dosage
that day; or even to call his or her daughter as promised.
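As an illustrative sketch only (no code appears in the article, and every name here is hypothetical), the feedback loop described above could be modeled as a small rule engine that compares sensor-inferred activity counts against a person's daily plan and emits gentle reminders for anything still undone:

```python
# Hypothetical sketch of the home-care feedback loop: sensors record daily
# activities, and a rule engine compares them to a personal healthy-living
# plan, producing reminder messages for the friendly interface to deliver.
from dataclasses import dataclass, field

@dataclass
class DailyPlan:
    # Target counts of routine activities for one day (illustrative values).
    targets: dict = field(default_factory=lambda: {
        "bathe": 1, "drink_fluids": 6, "take_medication": 2, "call_family": 1,
    })

@dataclass
class ActivityLog:
    # Counts of activities inferred from in-home motion sensors.
    observed: dict = field(default_factory=dict)

    def record(self, activity: str) -> None:
        self.observed[activity] = self.observed.get(activity, 0) + 1

def reminders(plan: DailyPlan, log: ActivityLog) -> list:
    """Return reminder messages for activities still below their daily target."""
    msgs = []
    for activity, target in plan.targets.items():
        done = log.observed.get(activity, 0)
        if done < target:
            msgs.append(f"Reminder: {activity.replace('_', ' ')} "
                        f"({done} of {target} today)")
    return msgs

# Example: the sensors have seen one medication event and one bath today.
log = ActivityLog()
log.record("take_medication")
log.record("bathe")
for msg in reminders(DailyPlan(), log):
    print(msg)
```

In a real deployment the "observed" counts would come from the AI interpretation layer the author describes, and delivery would go through whatever interface the person chose (watch, phone, radio, TV, or computer); this sketch only shows the plan-versus-actual comparison at the loop's core.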
The Ideal Scenario
Beyond home healthcare for our aging population, this model would make it
possible for more healthcare to take place in the home for all the generations
within the home. Ten outcomes can be skillfully integrated into my vision of a
calm, well-organized, and healthy home:
1. A home health manager, previously the informal family caregiver, is
designated for each extended family.
2. Nearly all Americans have health insurance, and a universal, integrated
medical information highway is constructed primarily around the patient,
and integrated with caregivers, rather than the other way around.
3. The majority of prevention, behavioral modification, monitoring, and
treatment of chronic diseases takes place at home.
4. Physician-led, nurse-directed, virtual health networks of home health
managers provide a community-based, 24-hour, seven-day-a-week clinical,
educational, and emotional support team.
5. Healthcare insurance premiums for families are lower due to expert
performance of the home health manager, as reflected in evidence-based
outcome measures of family members.
6. Basic diagnostics, including blood work, imaging, vital signs, and
therapeutics are performed by the home health manager and transmitted
electronically to the physician-led, nurse-directed educational network
which provides feedback, coaching, and treatment options as necessary.
7. Sophisticated behavioral modification tools, age adjusted for each
generation, are present and utilized in the home. These are funded in
part by diagnostic and therapeutic companies that have benefited from
expansion of insurance coverage and health markets, as early diagnosis and
prevention takes hold.
8. Physicians have more time to respond to health issues that truly require
an office visit, as more and more health concerns are increasingly
managed in the home. Monetary compensation for physicians increases in
acknowledgment of their roles in managing clinical and educational teams
and multi-generational complexity. Nursing school enrollment goes up, as
the critical role of educational director of home health manager networks
becomes a major magnet for the profession.
9. Family nutrition is carefully planned and executed; activity levels of all five
generations rise; weight goes down; cognitive functioning goes up; mental
and physical well-being are also up.
10. Hospital size continues to be reduced to appropriately meet the needs
of the community. Scientific advances allow earlier diagnosis and more
effective treatment, making the need for hospitalization increasingly rare.
Hospitals are more specialized and safer: that is, the concentration of
specialized services in Centers with high volume and expertise ensures safer
outcomes from select procedures.
Vision 2: Reconnecting the Family — Unlocking
Human Capital
When virtual health networks first appeared, clinical visionaries at the time
imagined an age of “telemedicine.” Dreamers thought the most pressing
application for these networks would be to connect knowledge and skills in
developed nations with pressing needs in the developing world. In reality,
we now understand that the true purpose of innovative hardware and
software, combined with pervasive broadband and wireless networking, is not
connecting the developed with the developing world, but rather to integrate
knowledge and skills in our own homes.
If we can reconnect the multigenerational American family, we can plan
efficiently for health and reorganize community resources, thereby addressing
both accessibility and affordability of health resources. If the lessons learned
in one generation can benefit other generations downstream, and if we can
efficiently utilize a multi-purposed health workforce, we will be able to afford
wellness, prevention, and scientific and technologic progress in our healthcare
and our health.
Segmentation as a Strategy No Longer Works
The most pressing health demographic in the United States and worldwide
is aging and its associated burden of chronic disease. This is not news. We
have seen it coming and tried to plan for it. For decades our training schools,
hospitals, regulators, insurance industry, housing industry, legal system, long-term care providers, and elder advocacy associations have focused on our aging
populations and their unique needs and vulnerabilities.
Rather than segment the American family into old and young, we must now
use technology and social networking to aggressively reconnect the generations
within the modern family. By doing this, I believe that we can manage the
complexity, mobility, and health data of the modern family, and we can
distribute the responsibility for taking care of our health, as we move toward
the provision of universal healthcare.
Five Strategic Problems with Segmentation
First off, generational segmentation within a family, especially as the number
of people over 65, 85, and 100 continues to grow, becomes increasingly
problematic. For one thing, it pits one generation against another in a “civil
war of health financing” that, absent coordinated efficiencies, creates a
downward cycle with all roads leading to health rationing.
A second problem with generational segmentation is that it traps family
learnings and health traditions in multiple intergenerational divides. For
example, the 58-year-old, third-generation mother caring for her fourth-generation 82-year-old mother with a fragility fracture from osteoporosis,
should ideally not only be an experienced and sympathetic caregiver, but also
a wise source of health information. While she is focusing on her mother’s
independence, dignity, and pain control, she should also be asking “How can
I avoid becoming vulnerable to the same disease as my mother?” Her research,
and the responses to the questions she asks of her doctors and other caregivers, should tell her
that silent osteoporosis exists in 52 percent of third-generation women by age
50, which is to say, she herself is vulnerable and should have a bone scan.
Her research should also make her aware of her own 27-year-old daughter’s
vulnerability, since 98 percent of a woman’s skeleton is formed by age 20.
Further, it should bring her 5-year-old granddaughter’s activities into sharp
relief, since diet, exercise, and health-related behaviors in these early years
determine whether this first-generation family member’s skeleton will be in
good shape 15 years from now. Thus, in an enlightened, connected multi-generational family committed to each other’s health and well-being, health
learnings should continually and seamlessly flow down the inter-generational
ladder, with the goal of constantly shrinking the entire family’s future disease
burden.
A third problem inherent in a segmented family care system is that informal
family caregivers are trapped in a cavern of social isolation. Many ignore
their own health and become ill during this period. Some die before those for
whom they are caring. They are disconnected from the formal care team and
from other informal family caregivers who struggle in isolation themselves.
Finally, they are often disconnected from their siblings and from children and
grandchildren, as the chaos, depression, and complexity of what they are doing
overwhelms them.
A fourth problem that affects the health and healthcare of a generationally
segmented family is that such segmentation works against the adoption of
electronic health records and electronic health planning. These electronic
records and plans are required to move us from a system of intervention to one
of prevention, and they are the tools that will propel us ahead of the disease
curve, by customizing and personalizing individual health planning.
A fifth and final argument for family connectivity is that it allows us to better
define the human resource needs of a modern healthcare team; it emphasizes
individual roles and responsibilities; it better assigns and utilizes these human
resources with an eye toward multi-use for cross-generational assessments and
services; and it enables a more informed application of unique community
resources to fill in the gaps where they exist.
Vision 3: Lifespan Planning Record — An Organizing
Planning Platform
To manage the complexity of our multi-generational families and to make
optimum use of our increasingly connected world requires an organizing
platform with sufficient power and flexibility to adjust in real-time, while
customizing and personalizing strategic health planning. Over the last decade,
the discussion about healthcare records has focused on converting paper-based
patient records of hospitals and doctors into electronic medical records (EMR).
The aim was to improve accuracy and efficiency, a worthy goal. However,
as this conversion process began, it became increasingly clear that what
constituted a “Record” was not only what was in hospitals’ and doctors’ files.
Rather, much of the real data on health resided with the patient; the EMR was
only part of the overall health picture. Thus, the concept of a “Personal Health
Record” (PHR) began to emerge, and this PHR may well someday subsume
the focus on an EMR.
An Integrated Long-Term Plan
I believe that in a truly preventive system, “health” is not a collection of late-stage, reactive interventions. Rather, I believe that health should be defined as
a life fully lived — hopeful, productive, fulfilling, rewarding, and manageable.
The determinants of such a life begin before birth, embedded in the healthful
behaviors of one’s future parents, and are extended beyond death to one’s
survivors.
Figure 1: Life Plan (100 years) — a timeline extending from Birth to Death
Source: Intel Corporation, 2009
Considering this broader view of health, the right concept for our health record
system should be a Lifespan Planning Record (LPR), a virtual online organizing
scaffold for health information and health planning.
The LPR for a single individual born today would extend out at least 100 years.
It would include all of the baseline medical information needed by patients,
and much more. It would consider economic, social, educational, and spiritual
goals and milestones as well as medical and scientific objectives.
“The LPR for a single individual born
today would extend out at least 100
years.”
Born today, the newborn child’s LPR would already contain a great deal
of data. Some reasonable compilation of the health records of parents,
grandparents, and siblings would be represented. Future diagnostic and
preventive therapeutic measures, based on familial information, would be
flagged on the timeline. Print, video, and graphic information from other
accessible intelligence databases would be seamlessly interwoven for easy use by
the people caring for each other and this new global citizen.
As time passes, this “living” record would grow and adjust to assist in
informed decision making, preventive behavior, and full and complete human
development.
Obviously, many issues will need to be sorted out in order for such a vision
to be realized, not the least of which are confidentiality, patient privacy, and
control over records. However, the bottom line is that as quickly as the EMR
is being subsumed by the PHR, the PHR is now being subsumed by the need
for an LPR. The LPR is the best way to move us toward a preventive healthcare
system.
Vision 4: Collapsing Databases — Accessing and
Integrating Knowledge
If the home is the center of our future healthcare universe, the connected
family an integral member of the health delivery team, and the LPR its
organizing framework, where will the health knowledge come from and how
will it be used to the best advantage?
Three enormous health databases are currently in the process of becoming
available electronically and therefore easily accessible. The first of these is the
Clinical Research Database (CRD). Access to the latest information from
research discoveries is critically important as we shift more responsibility
for decision making to consumers’ shoulders. Health product companies
appreciate that anything short of full and immediate disclosure of all study
results could increase future liability and run the risk of collapsing
a successful product (Merck’s Vioxx drug is a case in point). Moreover,
in order for healthcare consumers to have full trust in their physicians’
recommendations, they need to have full disclosure regarding conflict of
interest: if a researcher or physician receives payment for doing drug or product
research and giving talks in support of that product, device, or drug, then
consumers need to know this. As a result of conflict-of-interest
concerns and legitimate health consumer desires for early access to discovery
information, major research databases are moving toward full transparency.
For better or worse, the public will soon have ready access to the vast
majority of positive and negative results at the time of completion of
efficacy and effectiveness studies of new medical products and health
devices. These results will be electronic and therefore capable of being widely
distributed. In addition, the American people, as part of good citizenship, will
soon be able to voluntarily contribute their de-identified health information
to nationally coordinated Clinical Effectiveness Research, effectively becoming
data sources and thus co-contributors in the ongoing effort to improve health
in America.
The second database is the Continuing Medical Education (CME) database,
which will also soon be widely accessible. In fact, nearly 20 percent of the U.S.
CME database is already electronic and has been demonstrated to be effective.
It is likely that within ten years, the vast majority of the CME database will
be accessible and will be applied in real-time rather than in episodic segments.
Hand-held devices are increasingly standard medical equipment in caring
encounters, providing immediate database support to the patient-physician
relationship during the evaluative and joint decision-making process. This
allows experts to quite confidently predict that in a preventive healthcare
system, where information is overwhelmingly the dominant healthcare
product, CME will be seamlessly integrated into the whole doctor-patient
encounter and become an integral part of the diagnosis and outcome.
Which brings us to the third database, one that I envision will be called
Continuing Consumer Education (CCE). As the consumer movement
continues to evolve from educational empowerment to active engagement and
inclusion in the healthcare team, patients and their families will demand access
to the same hand-held hardware and information software that the other care
team members are using. This will help avoid any confusion that might arise
from multi-tracked information and accelerate the need for simple and well-designed educational and organizational products.
Bridging the Translation Gaps
As CRD, CME, and CCE become widely accessible, we are left with two
translation gaps between new medical discoveries and those discoveries being
translated into practice. The first gap is between CRD and CME. For example,
suppose a study reveals that giving an epidural anesthetic at 2 cm rather than
at 5 cm dilation is safer and more effective for mother and child: it does not
increase C-section rates, and it ensures a safer, more comfortable labor and
better Apgar scores for the baby. Under our past system, this knowledge would
have taken many years to transfer into practice. However, with integrated CRD
and CME databases, practice behavior changes would become coincident with
a new discovery.
If CRD and CME were to collapse upon each other, CME and CCE would in
many ways become one and the same. Over the past decade patients and their
caregivers have been gaining expertise and experience with new information
technologies and applications at roughly the same rate. Yet the movement to
organize an individual’s health data electronically was originally conceived as a
provider-controlled function. The EMR was viewed as a hospital- and office-based platform. The EMR concept has more recently broadened into the PHR,
a patient-centric organizing framework. But are separate approaches defensible?
Are these records not, after all, one and the same? Do not all clinical data
originate with the people? Do people not loan these data to the people in
whom they have the greatest trust and confidence: their physicians, nurses, and
other caregivers? Moreover, if our records are one and the same, should we not
also use the same informational resources to support our joint decision making?
Wouldn’t this method be the best way to help us stay on the same page and
avoid miscalculations, misinformation, or mistakes?
The elimination of the translation gap between CME and CCE has enormous
economic implications also. Studies indicate that patients adhere to the plans
they establish with their physicians only 25 percent to 50 percent of the time.
Lack of adherence means costly re-treatment, and often hospitalization, when
conditions predictably worsen. In addition, many of the safety issues that
have been the focus of attention in the healthcare community in recent years
are based on miscommunication between healthcare providers and between
patients and their caregivers.
Finally, the emergence of Clinical Effectiveness Research and the culture of
continuous process improvement are dependent upon evidence-based protocols
and processes shared and managed by patients and their clinicians. Together,
these trends carry with them the promise to transform our health delivery
system and to improve quality and efficiency simultaneously. Common shared
databases, used effectively in real-time and customized to patient need, are
critical to improving healthcare.
Vision 5: Techmanity — Technology as a Humanizing
Force
Back in 1983, Dr. John A. Benson, Jr., then President of the American Board of
Internal Medicine, voiced these words when questioned about technology’s impact
on the patient-physician relationship. “There is a groundswell in American
medicine, this desire to encourage more ethical and humanistic concerns in
physicians. After the technological progress that medicine made in the 60’s and
70’s, this is a swing of the pendulum back to the fact that we are doctors, and
that we can do a lot better than we are doing now.” He accurately described the
mood of clinicians then, and for most of the 20th century, toward technology;
that is, a complex love-hate relationship. While they have rejoiced and cheered
on progress, they have struggled to accept and master change in a manner that
would avoid driving a wedge between them and their patients.
Medical Informatics Meets Consumers
It is fair to say that, as the health consumer movement has matured over
the past 30 years and physicians have moved away from paternalism to
partnerships and team-based approaches to care, outright resistance and
abject fear of technology have subsided, and clinicians have progressed to and
beyond grudging acceptance. People and their caregivers have developed
computer skills together, pursued broadband and wireless connectivity
together, and discovered the value of personalized and customized computer
search engines together.
Alongside this evolution, the specialty of Medical Informatics has risen to
legitimacy within the medical hierarchy, and its leaders have reinforced the
need to take advantage of technology and informatics in support of humanistic
care. One such voice is that of Warner V. Slack, who heads the Center for
Clinical Computing at Harvard Medical School. His first published paper on
Medical Computing appeared in the New England Journal of Medicine in 1966.
His book, Cybermedicine: How Computing Empowers Doctors and Patients for
Better Health Care, is considered a classic, and argues, as Health Informatics
expert Kevin Kawamoto of the University of Washington has said,
“Computers can be mutually beneficial for both the patient and the healthcare
provider.”
If we have managed to move as caregivers from resistance to acceptance of
technology in healthcare, I would suggest we have not moved far enough. As
I stated in a reference paper as chair of the technology sub-committee for the
National Commission for Quality Long Term Care: “In embracing technology
in medicine, we must view it as both assistive and transformational.”
Powering Up The Vision
The revolutionary strength of modern information and scientific technologies
is that they ignore geography. In so doing, they allow us to reorient and
connect beyond a range of barriers, whether they be physical,
social, financial, or political. The danger is not in over-reaching but in under-reaching. Our vision must be sufficiently forward-looking and expansive to
challenge technology innovators. Where are the “killer applications” that would
allow lifespan planning to move us ahead of the disease curve? How can we
target technologic advances in healthcare to first reach our citizens most at
risk? How do we, in powering the health technology revolution, broaden our
social contract to include universal health insurance? How do we unite the
technology, entertainment, and financial sectors (previously locked out of the
healthcare space) with the traditional healthcare power players, and incentivize
them to work together to create a truly preventive and holistic health delivery
system that is equitable, just, efficient, and uniformly reliable? How can each
citizen play a role in ongoing research and innovation, and help define lifelong
learning and behavioral modification as part of good citizenship? What can
corporate America do to advance the health information infrastructure, and in
“doing good,” do well financially, thereby serving Main Street as it serves Wall
Street?
Moving Beyond Acceptance
Health Information leaders of the 21st century need to be more revolutionary.
Were they to perform at their full capacity, our healthcare system would
be transformed. Healthcare would be oriented around relationship-based
care, cementing the people to the people caring for the people. If such
a transformation were to take place, we would see improvement on ten
different fronts simultaneously: access, efficiency, team-care coordination,
multi-generational family linkages, inclusion of informal family caregivers
in healthcare teams, targeted interventions for vulnerable populations,
informed mutual decision making, lifespan health planning, evidence-based
personalized care, and a palpable presence of physicians, nurses, and care team
members in the home.
Further Reading
Kawamoto K. “Computer Technology in Health Care Settings.” The Journal of
Education, Community, and Values. May-June, 2003.
Magee M. Home-Centered Health Care: The Populist Transformation of the
American Health Care System. Spencer Books. NY, NY. 2007. Available at
www.spencerbooks.com.
Magee M. “Connecting Healthy Homes to a Preventative Healthcare System:
Leveraging Technology For All It Is Worth.” Harvard Health Policy Review. Fall,
2007, pp. 44-52. Available at http://www.hcs.harvard.edu.
Magee M. “Health Records of The Future.” Health Politics. November 18,
2006.
Nelson, B. “Can Doctors Learn Warmth?” New York Times. September 13,
1983. Available at query.nytimes.com.
Slack, Warner V. Bibliography. Harvard Medical School. Beth Israel Deaconess
Medical Center: Center for Clinical Computing. February 13, 2008.
Author Biography
Mike Magee, MD, is Editor of HealthCommentary.org. He is a Senior
Fellow in Health Policy at the Center For Aging Services Technologies and a
Commissioner of the National Commission for Quality Long Term Care. He
is a widely recognized public speaker and author of ten books including Positive
Leadership, Health Politics: Power, Populism and Health, and Home Centered
Health Care: The Populist Transformation of the American Health Care System.
He can be reached at www.mikemagee.org.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
From People to Prototypes and Products: Ethnographic
Liquidity and the Intel Global Aging Experience Study
Contributors
Tim Plowman
Intel Corporation
David Prendergast
Intel Corporation
Simon Roberts
Intel Corporation
Index Words
Ethnography
Aging
Product Development
Ethnographic Liquidity
Healthcare
Abstract
This article documents how a large-scale, multi-site, ethnographic research
project into aging populations, the Global Aging Experience Study, led to
the development of concepts, product prototypes, and products for the
independent living market. Successfully leveraging the output of ethnographic
research within large organizations and product groups is often fraught with
challenges. Ethnographic research produced within an industry context can
be difficult for an organization to thoroughly capitalize on. However, careful
research design and sound knowledge transfer activities can produce highly
successful outcomes that can be thoroughly absorbed into an organization,
and the data can lend itself to re-analysis. Our research was conducted by the
Product Research and Innovation Team in the Intel Digital Health Group,
and the work was done in Europe and East Asia, eight countries in all. Using
a mixed methodology, our research examined health and healthcare systems in
order to chart the macro landscape of care provision and delivery. However,
the core of our study was ethnographic research with older people, and their
formal (clinical) and informal (family and friends) caregivers in their own
homes and communities. Data from this study were organized and analyzed to
produce a variety of tools that provide insight into the market for consumption
by teams within the Digital Health Group. As the results of the research
were driven into the Digital Health Group and other groups within Intel, it
became clear that the Global Aging Experience Study possessed what we term
ethnographic liquidity, meaning that the data, tools, and insights developed in
the study have layers of utility, a long shelf life, and lend themselves to repeated
and consistent use within and beyond the Digital Health Group.
Introduction
In 2006, researchers on the Product Research and Innovation Team within
the Digital Health Group launched the Global Aging Experience Study. This
is a multi-year, multi-site ethnographic research project designed to develop
a global, comparative understanding of the practices and meanings associated
with growing older. To date, we have conducted intensive, qualitative research
with older people living in eighty-five households in eight countries. These
data, supplemented by additional interviews with dozens of informal caregivers
and healthcare professionals, resulted in a detailed database comprising field
and analytic notes, thousands of photographs, and five hundred hours of video.
Long-range research design, innovative reporting, and knowledge transfer
activities by the team have resulted in the data and analysis providing ongoing
value. In this article, we explore how this research has affected how we imagine
the new independent living technologies of the future.
Demography: the Age Wave and its Implications
The world’s population is growing older. According to the latest United
Nations biennial population forecasts, 11 percent of the world’s current
population of 6.9 billion are over 60 years old. By 2050, this will be 22 percent
of a total population of 9 billion people. However, the developed world will
be older, with 33 percent of its population over the age of sixty [1]. There are
two key demographic drivers of this change: longer life expectancy and lower
fertility rates.
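The scale of this shift becomes clearer when the quoted percentages are turned into absolute numbers. A minimal illustrative sketch in Python (the code is ours, not part of the study; the only inputs are the UN figures cited above):

```python
# UN figures quoted above: share of the population over 60 years old.
now_total, now_share = 6.9e9, 0.11    # today: 6.9 billion people, 11% over 60
mid_total, mid_share = 9.0e9, 0.22    # 2050: 9.0 billion people, 22% over 60

over_60_now = now_total * now_share   # roughly 760 million people
over_60_2050 = mid_total * mid_share  # roughly 1.98 billion people

# Because both the total population and the over-60 share grow, the
# over-60 population itself roughly two-and-a-half times larger by 2050.
growth_factor = over_60_2050 / over_60_now
print(round(growth_factor, 1))        # 2.6
```

In other words, the over-60 population more than doubles even as overall population growth levels off.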
People are now living longer lives and, in general, are healthier and more
active. During the course of the twentieth century, average life expectancy
within the developed world rose from 50 to 78 years [2]. Populations have
benefited from better basic healthcare and from the decline in infectious
diseases. However, fertility rates have continued to fall in the developed world.
The rate is now 1.6, which is below the rate of 2.1 regarded by demographers
as the minimum for a population to replace itself. This low fertility rate will
mean that global population growth will level out during the middle of the
century. However, it also means that the ratio of people who work versus those
who do not work, the dependency ratio, will continue to fall. For example, in
Britain in the early 1900s when old-age pensions were introduced, there were
22 people of working age for every retired person. In 2024, there will be fewer
than three [3]. Such a ratio has fiscal implications: governments will have less
revenue to support the health and social care needs of their aging populations,
and they will be less able to meet the pension entitlements and expectations of
retired people. Political and policy discourse around the inevitable bankruptcy
of the social security system in the United States is perhaps the most familiar
manifestation of the impending problem to U.S. residents.
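The fiscal squeeze follows directly from this ratio. A hedged sketch of the arithmetic (the function name is ours, not a standard demographic API; the British figures are those quoted above):

```python
def workers_per_retiree(working_age: float, retired: float) -> float:
    """Ratio of working-age people to retired people (the support ratio)."""
    return working_age / retired

# Britain, early 1900s: 22 people of working age per retired person.
early_1900s = workers_per_retiree(22.0, 1.0)

# Projected for 2024: roughly three workers per retired person.
projected_2024 = workers_per_retiree(3.0, 1.0)

# Each worker's share of supporting a retiree grows by the same factor
# that the support ratio shrinks: more than sevenfold.
print(round(early_1900s / projected_2024, 1))  # 7.3
```

The same shrinking ratio drives both sides of the problem: less tax revenue per retiree, and more pension and care cost per worker.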
Figure 1 data:

Country    % over 65 (2006)    % over 65 (2050 proj.)    % over 80 (2003/04)    % over 80 (2050 proj.)
USA        12.5                20.6                      3.35                   7.3
Italy      19.7                35.5                      4.87                   15.2
Ireland    11.6                25.9                      2.65                   7.6
Sweden     17.6                24.7                      5.32                   9.7
Germany    19.4                28.4                      4.24                   12.2
Spain      17.7                34.1                      4.22                   12.3
France     16.4                27.1                      4.28                   10.9
UK         15.8                23.2                      4.32                   8.8

The ways in which different countries’ populations will age in light of the
demographic shifts just described are likely to differ widely, both from a
sociocultural and a material perspective. For example, in the United States, the
shift will be amplified by the aging and retirement of the so-called baby boomer
generation; i.e., the 80 million people born between 1945 and 1965. In Italy,
Spain, and Germany, the continued decline of fertility rates is resulting in
smaller families, which itself results in these countries’ populations growing
older, faster. But whatever the structural constraints (i.e., social, demographic,
cultural, or economic) on the aging experience, the anticipated challenges
presented by aging, both to individuals and to societies, will be profound. See
Figure 1 for the percentage of populations in the developed world who are over
65 and 80, respectively, and also projections of aging as a percentage for those
populations.

Looking at the 27 member states of the European Union alone, the rate at
which Europe’s population is aging relative to the number of younger people
available to provide support for elders is predicted to diverge. While the falling
dependency ratio is also due to falling fertility rates, the net outcome is that a
shrinking workforce threatens the fiscal base of numerous healthcare systems.
As the broad demographic picture changes globally, the health landscape
is being transformed. The decline of deaths due to infectious disease is
accompanied by the rise in the number of people with chronic diseases,
conditions increasingly associated with affluent and sedentary lifestyles. Among
these chronic diseases are, for example, Type 2 diabetes, chronic obstructive
pulmonary disease (COPD), and congestive heart failure (CHF). These diseases
are expensive to manage and represent a significant strain on the healthcare
systems of the developed world. Annual U.S. costs of COPD were $37.2
billion in 2004, which included $20.9 billion in direct healthcare costs, $7.4
billion in indirect morbidity costs, and $8.9 billion in indirect mortality costs
(from the American Lung Association). However, many of the developing
nations are experiencing significant increases in the incidences of such diseases
also. It is also important to note that individuals with these conditions are
not the only impacted parties: caregivers, public health systems, payers, and
employers are also negatively affected by the rise in the incidences of these
diseases.
Figure 1: Percentage of Population Over 65, and
Over 80, and Projections
Source: Intel Corporation, 2009
Health and Welfare Systems
While differences exist in the ways that countries will age, there are also
significant variations in the ways that countries are organized to respond to
the challenges and opportunities that an aging population presents. Of critical
interest to our research team were the ways such systems have evolved — how
they are funded and structured — and how people comprehend, experience,
and behave in relationship to these systems. A base assumption of the project
was that there is a dynamic relationship between health and care systems and
the ways people think about, and respond to, health challenges and the process
of aging in general.
Spain, Sweden, and Italy appear to receive comparatively high dividends for
their levels of investment in healthcare; i.e., the overall measures of national
health relative to the percentage of GDP spent suggest that those euros are
well spent. This suggests that cultural, environmental, and other factors are at
least as important as formal healthcare systems in shaping health outcomes.
However, it is worth remembering also that a high proportion of healthcare
expenditure is directed towards treating ill-health, particularly for those over
the age of 75 [4].
For example, in Europe there is considerable diversity in the ways that welfare
states are organized and how health and social care for older people are
delivered. A range of factors underpin this variety: the historical development
of welfare states; local, regional, and national ideas about the role of the state
versus the individual, family or third-sector (voluntary) in providing care; and
the fiscal and economic frameworks in which development has occurred within
these countries. The interplay between people, practices, policy and politics,
and economics has created a complex and mutable health and healthcare
landscape.
Analysts have created a series of models that seek to describe and categorize
the different welfare systems of Europe. These models provide a good guide
to understanding the fundamental principles or structure of how care is
organized and financed. Notably, Esping-Andersen’s [5] work identified
three distinct social welfare regimes. These are the social democratic model
(Nordic countries), conservative or state corporatist model (Germany or
the Netherlands), and the liberal welfare or Anglo-Saxon model (United
Kingdom). Ferrera [6] subsequently added a fourth category — the Southern
European model including Italy, Greece, Spain, and Portugal. In the United
States, the system can best be characterized as a liberal, capitalist model that
“endorses equality of opportunity but not of outcomes” and is arguably
outperformed by the state corporatist model in terms of social and economic
objectives [7].
These models represent high-level views of systems and so fail to capture the
regional and local variations in delivery that often make care provision deviate
from these ideal-typical models. In short, these models render human and
economic activity in brush strokes too broad to support nuanced insights into
phenomena such as the relationship between policy and the experience of aging
in a given socio-economic structure. Over recent years and in anticipation of
dramatic transitions within the demographic landscapes of Europe, there have
been significant changes in policy planning and rhetoric surrounding how
welfare and care systems operate. European long-term care systems demonstrate
that different attitudes and expectations exist about the relative roles of family,
state, and volunteers.
In response to shifting demographics and varying approaches to healthcare,
many countries are moving away from hegemonic, expensive, institutional,
and medicalized responses towards a more community- and home-centered
approach to care. The propensity of bio-medicine to extend its domain over a
range of real and perceived maladies, often without any empirical basis for
doing so, has been well documented within the social science literature.
Scheper-Hughes and Lock [8] argue that “the funnelling of diffuse but real
complaints into the idiom of sickness has led to the problem of medicalization
and to the overproduction of illness in contemporary advanced industrial
societies.” In addition to the gradual bankrupting of healthcare systems, owing
to shifting workforce ratios, medicalization has also contributed to healthcare
systems being overburdened and dysfunctional, as well as to inadequate
institutional and social responses to the challenge of aging. This shift towards
a more community- and home-centered approach to care is being forged,
with varying degrees of success, through attempts to create the optimal mix
of funding and provision of care by citizens, family, state, and private and
voluntary sectors of society. This multi-faceted approach frequently entails a
repositioning of the state as a financier, rather than as a direct provider of care
to individuals, through the amplified use of market mechanisms, the growth
of the private and commercial care industries, and increased regulation and
funding of the not-for-profit and voluntary sector [9]. Additionally, a central
component of the rethinking that is underway in the way that care for older
people is organized is a commitment to home-based models of care in which
technology plays an enabling role.
The Intel Global Aging Experience Study
The need for innovation today in order to prepare for tomorrow’s demographic
change was the primary context in which the Global Aging Experience
Study was conceived and conducted. The work was designed to support the
Digital Health Group’s commitment to effecting a paradigm shift within
the independent living market. Our research built on a foundation of
earlier work conducted in the United States, and we sought to extend that
work into Europe and East Asia thereby providing concrete local detail,
while simultaneously developing a more thorough global perspective on the
experience of aging. In addition to deepening our understanding of the myriad
social and cultural differences in people’s subjective experiences of aging and
health, our other research objectives were to challenge prevailing assumptions
about what it means to grow old and to identify strategic opportunities
for appropriate technologies and services for older people. Grounded in a
thorough review of the gerontological literature, and utilizing ethnographic
techniques such as open-ended interviews, observations, and multi-day
visits at multiple households in each country, our team gathered thousands
of images, stories, and insights about what it means to grow old from the
perspective of older people themselves. This was the first step in a long process
of understanding real people’s needs over time. Local researchers and
healthcare professionals helped us recruit older people from diverse
socio-economic backgrounds and with a wide range of health conditions. Efforts
were also made to ensure a reasonable distribution of ages was represented,
ranging from 60 to 100 years old. Researchers would typically spend two days
per household. The first day was usually spent getting to know the occupants
and exploring their life and health histories, while the second visit involved an
exploration of the home and the meaning and use of material objects within
it. The ethnographic interview ended with a mapping exercise and, frequently,
with trips around the neighborhood with the senior to meet their
friends, families, and community members. Our style of questioning was
purposively fluid and relaxed, providing encouragement and time for
participants to think through and formulate their responses, often revisiting
sensitive or painful memories. Over the course of the ethnographic encounter
the following domains, among others, were gradually probed and explored:
•• Older people’s relatives, acquaintances, and care networks.
•• Life review and biography.
•• Experiences and expectations of aging.
•• Health and housing histories.
•• Older people’s relationships with their built, material, and social
environments.
•• Perceived and actual availability of formal and informal support networks
following critical health events or along chronic disease pathways.
•• Health information sources used by older people and their carers in
different parts of the world to make informed decisions about both their
treatment and their future.
•• Everyday life, technology, and home.
We provide a vignette to demonstrate the wealth of material and insights that
can be gained from just one of these case studies.
Widowhood and Aging in an Italian Village
Anna (85) shares a house with her unmarried oldest son Dario in a village
about an hour from Milan. She has lived in this area all her life and soberly
recalls the poverty of her youth. “I remember a widow came to my mother
and told her that her children didn’t want to sleep because they were hungry.
And my mother gave her some flour...” Anna developed an abhorrence of debt
after seeing her father, a seed tradesman, in difficulties, and she brought up her
own children to work hard and be financially independent. Life wasn’t easy. To
make a profit from the small café bar she and her husband ran demanded their
attention eighteen hours a day. When her husband suddenly died from acute
kidney failure in 1973, she was able to close their business for only eight days
before the family had to go on working. Times were hard, but they were able to
survive, as her three children were already working. Now well into her eighties,
Anna reflects how she feels differently inside in comparison to how she felt in
her youth, “I am different in the way life is different now. In the past I had to
sacrifice a lot. I managed to get a house to pass onto my children. I worked
hard for it. My husband liked to have a motorcycle but work was secondary for
him. It was always first with me.”
As she ages, Anna feels she has a safety net in her children and especially
her oldest son who has taken on the role of family caregiver. Italian law
stipulates that an elderly parent in need has the legal right to demand either
accommodation or financial help from children with resources. Many would
not press this right however. Anna’s daughter now lives in another part of Italy
and so can only visit for holidays, but they talk a great deal on the telephone.
Anna believes older Italians, though still reliant on the family, live better today
than in the past due to the basic state pension — a welfare benefit the country
may have trouble supporting for future generations. As things stand today,
young Italians entering the workforce will have to pay contributions equivalent
to 127 percent cumulatively of their salaries over the next fifteen years to
guarantee the same level of benefit support received by the current older
generation. Anna is able to supplement her income a little with a small pension
from the days when her husband worked in a factory, but aside from this, like
many women and small business owners, she regrets they were not able to
contribute to personal pensions.
Anna feels her health has seriously started to decline over the last year with
recurrent problems with her knees causing her the greatest concern. In 2002,
her arthritis was causing her so much pain she opted for a knee prosthesis
operation against her doctor’s advice. She was in hospital for three weeks and
then had to spend six weeks at home in bed being cared for by her son and
sister. Her treatment included an hour of physiotherapy per day and a passive
exercise machine to prevent her muscles from atrophying. Her rehabilitation
was paid for by the healthcare system for two months and she feels this was
not long enough. She neglected her exercises, and now problems have begun
to emerge with her other knee due to long periods of overcompensation. This
is partly because she felt it demeaning to use a walker in public and preferred a
single walking stick.
Anna suggests the chronic pain in her legs is helped by massage and creams.
Unfortunately she is unable to get massages on the public health service and
private practitioners charge €40 an hour, a sum she feels guilty about spending.
Anna feels alienated from her doctor, so she avoids visiting her local clinic. Her
shoulder has been troubling her recently, and when she asked for more of the
cream that works on her knee to treat it, the doctor refused on grounds of
expense. Nowadays, Anna just phones in when she needs her prescription and
her son picks it up. She finds it very irritating that she has to do this every six
to twelve days, as she is only allowed one box of twelve blood pressure pills at a
time and would like more pills in each box to save her the trip.
Decreasing mobility is having a significant effect on her quality of life. “I feel
old because I am dependent on other people. The problems with my legs make
me feel old. I don’t have any plans for the future now. I don’t want to go out
dancing but I would love to visit a specialist to be able to move around better.”
For the last eight weeks, Anna has not left her apartment. Instead she spends
her days watching the world go by from her balcony and trying to do the little
she can in terms of housework: her family decided to pay for a home help
worker to visit four hours a week to clean. Her son has taken over the shopping
duties. Anna is not able to conceive of using an electric buggy to get around
as she no longer believes she is safe going out by herself. Having given up on
travel, she now regrets not seeing more of the world during her life. The main
variation in routine occurs when her daughter or sister comes to help look after
her once a year when Dario takes a respite break from his care-giving duties
and visits a health spa. “I would consider a nursing home during these times,
but only for a month, for a change, never for good.”
Anna uses sticks to get around and has a stick located in every strategic position
in the household. She sometimes tests and pushes herself by attempting to walk
between resting points. The main obstacle for her in the house is the slippery
marble staircase from her apartment down to the ground floor.
Anna has heard of alarm pendants but does not have one. She carries a mobile
phone with her in case of emergencies but never takes it into the bathroom
with her, one of the most dangerous rooms in the house. Last year, Anna had
a fall in her bedroom and could not get up. Her son was not able to hear her
cries over the living room TV and her mobile phone battery was flat. She
managed to crawl to her charger but didn’t realize that she could use the phone
while it was plugged in. It was two hours before she was discovered.
As Anna’s mobility has declined and her social circle has narrowed, her phone
has become an increasingly important lifeline. Her son works long hours, and
she receives few visitors: she, therefore, calls her sister as often as she can afford
it. She can still cook a little and loves to sew. She used to make clothes for
her family, but she has stopped as nobody wears what she makes. Reading is
another hobby, as is listening to the radio. The real highlight of her month is a
visit from her grandchild. This occurs only rarely, however, as she dislikes her
second son’s wife and they do not feel she is capable of taking care of the child
on her own. “I don’t think I would want to spend another twenty years living
like this, though I would love to see my grandchild married. Although I am
alone a lot, I am not lonely. I can always watch the street from my balcony.”
Developing Insights and Tools for Knowledge Transfer
As the selected field notes of the interview with Anna demonstrate,
ethnographic fieldwork requires maintaining an appropriate focus without
becoming so fixated on one topic that you lose sight of what is important
to the participants themselves. Ethnography, in this respect, is not like
requirements gathering, usage model development, or design research, where
the goal is to determine the boundary conditions for a particular product.
Instead, we are trying to understand what it is that people want and need
through a holistic exploration of their subjective experience, and the associated
meanings and behaviors that interpellate [10] with wider socio-economic
structures. The requirements come later.
When ethnographers return to the office, they bring with them images, audio
recordings, video recordings, and hundreds of pages of field and analytic notes
to both review and log for later analysis. As depicted in Figure 2, transcribed
voice recordings and field notes are then entered into qualitative data analysis
software to be coded into themed categories that allow for a horizontal view
across the data set to be developed.
Figure 2: Qualitative Software for Coding Raw and Analytic Field Notes
Source: Intel Corporation, 2009
As important as systematic coding of interviews is, group analytic working
sessions, or harvest sessions, are equally valuable. Harvest sessions are where
data from the field, the photos, and other collateral are shared, in an attempt to
inductively identify patterns in the data, as well as to note the important
dimensions of variability within and between countries. These tasks are
accomplished by working through the data and actively listening for recurring
insights, themes, and other aspects of the data. This kind of analysis requires
very active participation and constant critical engagement. Post-it notes are
used to capture ideas as they emerge. After hours, sometimes days, of sharing
stories from the field, the team begins to organize what might be hundreds of
post-it notes into clusters, and robust patterns begin to appear.
Identification of the patterns in the data leads to grounded theory insights about
what is most important to people. For instance, one persistent idea that came
up in story after story in our Global Aging project was simply this: people don’t
want to be treated as if they are no longer competent or valued members of
society. They are human beings, with aspirations, goals, and a sense of identity
and self-worth. And yet, so many healthcare devices and services treat older
people as if they were children, or worse.
[Figure: the opportunity map arranges aging-experience themes along the mind-body continuum:
Supporting cognition: “You have to challenge your mind and brain.”
Feeling safe: “Since I fell I’m a lot more afraid of falling.”
Environments of choice: “Despite my lack of mobility I can turn on garden lights with this remote control.”
Bringing healthcare home: “People think that doctors can cure everything — that’s not true.”
Meaningful & useful life: “It would be great to maybe have an exhibition of my work sometime.”
Enabling social interaction: “I don’t like to be alone.”
Supporting physical activities: “I want to move easily, be independent.”
Help getting care: “We know the pharmacist very well, it’s easy to get my medicine.”]
Figure 3: Initial and Final Versions of an Opportunity Map
Source: Intel Corporation, 2009
As more and more of these needs are noted, the team begins to cluster them
into what we call the opportunity map. The map, featured in Figure 3, helps us
organize the many ideas about needs, technologies, services, and other
interventions that come up during the sharing of field stories.
We repeatedly refer to this map when thinking about products
already on the market, or products in development within the Digital Health
Group. By doing this we can identify gaps or places that need more attention.
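A minimal sketch of this gap-finding use of the map follows. The area names echo Figure 3, but the products and their mappings are purely hypothetical, invented here only to show the mechanics of checking coverage.

```python
# Map known products onto opportunity areas, then list the areas with
# no coverage. Products and their mappings are hypothetical examples.
opportunity_areas = {
    "supporting cognition", "feeling safe", "environments of choice",
    "bringing healthcare home", "meaningful & useful life",
    "enabling social interaction", "supporting physical activities",
    "help getting care",
}

# Product -> opportunity areas it addresses (invented for illustration).
product_coverage = {
    "fall-alert pendant": {"feeling safe"},
    "remote vitals hub": {"bringing healthcare home"},
    "video calling frame": {"enabling social interaction"},
}

# Areas not touched by any product are the gaps needing attention.
covered = set().union(*product_coverage.values())
gaps = sorted(opportunity_areas - covered)
for area in gaps:
    print("needs attention:", area)
```

Mapping existing products onto the areas ahead of a brainstorm makes the uncovered slices of the map immediately visible.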
Beyond the identification of needs and opportunities, when thinking through
the ethnographic data, the study team identifies insights that are neither ideas
for new products nor areas of explicit need; rather, they are insights into the
required characteristics of any device or solution we might develop. We capture
these as design principles (see Figure 4).
Figure 4: Design Principles Derived During the
Analytic Process
Source: Intel Corporation, 2009
With all these materials and insights in hand, we finally have a basis for
actually thinking about products, services, or other interventions. We follow up
harvest sessions with design brainstorms. During these work sessions (see
Figure 5), we use the opportunity maps to identify which slice of the map we
want to address. Keeping in mind the values and design principles from our
data, we begin to explore ways of addressing particular needs. Here is where
mapping other products, technologies, or research onto the map ahead of time
really helps inform our thinking.
Key Aging Experience Themes
A range of important themes emerged from the analysis of the Global Aging
Experience data set. Seven of these are outlined next.
People want to focus on what they CAN do, not what they CAN’T do
As one woman told us, succinctly and unambiguously, “You are sick when you
are lying in bed.” That is why so few people self-identify as either ill or old.
Many people chose not to use canes or assistive devices in the home. This is not
just because these devices are socially stigmatizing in appearance, but because
these devices reinforce a personal identity as someone who is sick. Many people
sought out challenges as ways of keeping themselves sharp: in fact it seemed
that it was people’s energy level, their will to pursue such challenges, that
most correlated with self-identification as ill or old. Still, we cannot avoid the
fact that many people will and do need assistance. The key is to provide this
assistance in ways that people recognize “as helping me do what I want to do,”
not as a constant reminder that I am no longer capable.
Figure 5: GAE Brainstorming Session
Source: Intel Corporation, 2009
We identified a sort of denial, a healthy part of the natural aging process,
that makes some interventions very difficult. We call this adaptive optimism.
Often, people will regard interventions as unnecessary until it is too late,
unless we can find a way to introduce these interventions into their lives earlier.
The clear implication of this is that we need to be thinking about technologies
that adapt and age with a person, changing the mix of offerings as the person
ages.
Healthy aging and independent living mean far more than health
There are many factors that enable a person to remain at home and relatively
independent. These include a wide variety of capabilities, not just health-related
ones. The ability to take care of one’s own home maintenance or gardening, the
ability to get groceries, prepare meals, and clean up afterwards, the ability to
get around the neighborhood or town, are among some of the most important
ones. All of these factors can seriously impact a person’s ability to live the life
they desire in later years. We need to be thinking about all of these factors
when we design interventions. A few of the key areas for enabling services are
these:
•• Mobility and transportation in the community are crucial. Community-based
transportation initiatives are catching fire both in the United States
and in Europe. Different attitudes towards private transportation and
different availability of public transportation will mean very different
systems in different places.
•• Services for the home are becoming increasingly important. This is
particularly true for widowed husbands or wives: the loss of a spouse can
mean a serious adjustment of lifestyle, at a time when the ability to adapt
is not what it once was. Cooking, cleaning, home maintenance, and other
forms of routine care will become increasingly needed. An opportunity
exists to provide an infrastructure to enable a broader, less centralized
marketplace for trusted providers.
•• The need for finding trustworthy providers is going to increase. Technology
can play a major role in helping communities identify and enable trusted
providers. This may be particularly important considering that some forms
of care that one would expect to come from family members will come
from strangers.
Health is not an objective quality; it is defined collaboratively and culturally
Health is defined through interactions and negotiations among various people,
including informal caregivers, family members, hired in-home caregivers,
and medical caregivers. We saw many differences within and between these
networks in their assessments of an elder’s health.
•• Assessments of health can vary from person to person. In Germany, for
instance, an elderly husband insisted that his wife (who had suffered a
mild stroke) needed much more help with driving, shopping, and taking
her medications than the woman herself felt she needed. Both she and her
daughter felt that the husband was being overly protective. In multiple
cases, children had put their parents in nursing homes even though
those parents did not feel they needed to be there. And yet, the parents
acquiesced, mostly because they “did not want to be a burden.”
•• Notions of health vary from place to place in both major and minor ways.
The cultural, social, and political systems in which we are embedded shape
our attitudes and behaviors with regards to health. In England, people
behave in particular ways as a result of the National Health Service’s
guarantee of life-long, free healthcare. In Mediterranean countries,
people do not recognize eating as a health-related practice. They simply eat
what people outside the region recognize to be very healthy foods.
People mark the progression of aging by watershed events
People do not experience aging as a gradual loss of capability, but rather as
a series of threshold events. This is most obvious in the case of falls, where
physical or cognitive decline might go unacknowledged for a long time
before the precipitating event, namely the fall, makes it official.
While the obvious solution might seem to be monitoring and intervening
earlier, we suggest it is not that simple: people who are in this state of adaptive
optimism or denial might not willingly submit themselves to this kind of
intervention.
•• Some watershed events initiate decline. Non-health-related events can have
a serious health impact. For example, the loss of a loved one or a child,
resulting in grief and depression, may have measurable, negative health
consequences.
•• Sometimes external events leave deficits that expose health decline. The
loss of a spouse may entail the loss of a person who has for years helped
one cope with other deficits. A wife who has prepared her husband’s meals,
done his laundry, and helped with other daily life needs over a period of
forty or fifty years will, upon her death or disability, leave a huge gap
that the husband or other informal caregivers will need to fill.
Healthy aging is inextricably linked to social participation
People of all ages aspire to have a sense of belonging, a legitimate role in
the life of their family and community. We are meaning makers all through
life: meaning is made through social interaction and that does not cease in
older age. At the same time, people become more acutely aware that ongoing
engagement with the community or their family might actually increase their
sense of being a burden, not a contributor. Thus, enabling social participation
means lowering the costs associated with social engagement in at least three
ways:
•• Feeling useful or productive, or at least having something to offer and the
sense of identity that comes with that. This could be either in the form
of caring for grandkids or being able to tell one’s story, to put one’s life’s
history in context.
•• Enabling emotionally satisfying contact with the people the elderly care
about, people they share relations or a history with. At the same time, note
that there are costs associated with such social contact, and those costs
seem to rise with age (for example, as you lose your ability to drive, or
when cognitive impairments affect your ability to remember names on the
telephone).
•• Enabling recognition and lightweight engagement with a broader
community. As one of our participants told us, “Sometimes you just want
to look out your window and see what’s happening in your neighborhood.”
In many European communities people placed a lot of value on being
known in a place, of interacting with people with whom they share some
history or interests. This could come from such activities as going regularly
to the local market, or participating in a club.
The lived-in space is crucial to the experience of aging
People’s choice of objects to inhabit their homes, as well as less malleable
aspects of architecture, location, and other aspects of the physical environment
were important in how people experienced aging and health. One interesting
theme to emerge from our work was a deeper discussion of why nursing homes
or institutional residences were generally regarded as bad, while staying at home
is generally regarded as good. As we teased the attributes apart we realized that
these elements characterize differences across many kinds of environments,
including movement from one private residence to another. These elements
suggest interesting possibilities for technology:
•• Houses reflect cultural norms around physical privacy. There is a reason
why bathrooms are hidden in European and American homes. There are
cultural notions of privacy and intimacy associated with care and nurturing
of the body, particularly hygiene. We need to be cognizant of how different
cultural values around bodily intimacy may shape acceptance or resistance
to technologies.
•• Houses are the sites of spiritual and personal expression. Expressions of
spirituality and the emplacement of memories (for example, in shrines to
lost loved-ones) abounded in the houses we visited.
•• Enacting micro-routines. Simple daily activities, from checking the
barometer, to drinking a glass of juice at bedtime (and using the empty glass
as a medication reminder in the morning), to gardening, were crucial for
multiple reasons at once: they fill small slots of time, ground behavior in a
structure that aids memory, offer lightweight ways of finding meaning and
having something to talk about, provide a little physical exertion, keep the
mind sharp, and give each day a small sense of purpose.
•• The home is not just the four walls. It is the physical situation of the house
in the neighborhood, proximity to other services and opportunities for
social engagement. One woman we visited in a nursing facility in Spain was
quite happy there, partly because the home was in her old neighborhood,
near the same market, the same services, and the same social network that
she had always had.
Healthcare networks are large and increasingly complex just about everywhere
People in virtually every country we visited struggled to get the best value out
of their healthcare systems, contending with healthcare bureaucracy and
seeking to bring alternative approaches to care into alignment:
•• Help people figure out their options. In places where coverage is universal,
people still seek out alternatives. In the United Kingdom, despite universal
coverage, people find it necessary to pursue alternative (private) healthcare
systems to speed up access. In some cases, people had to leave their country
to gain access to treatments that were unavailable because of resources or
regulations.
•• Help people help each other. Several people we visited had been forced to
learn a lot about the healthcare system because of a chronic condition
or the need to care for someone who was ill. More than one noted that it
was unfortunate that, when their situation changed (a patient died or got
better), they had no place to share this hard-won knowledge. Simple tools
could enable a collaborative, user-driven system that helps people share that
knowledge and help each other.
The identification of key themes such as these provides broad insight into the
issues, problems, and desires of participants to help guide the development of
effective, culturally sensitive, and age-appropriate technologies. We now outline
some examples of how the global aging experience research program has
stimulated more in-depth studies and has contributed to the cycle of product
incubation and development, constantly ongoing within the Intel Digital
Health Group.
Ethnographic Liquidity
Since the beginning of the twentieth century, the vast majority of ethnographic
research has been conceived and consumed within an academic context. The
structure of ethnographies as academic texts, and the tools and techniques used
to produce them, are well fitted to an audience of scholars that, to a greater
or lesser degree, sees ethnographic research as cumulative and contributing
to anthropological and other related fields of inquiry. Over the past decade, a
somewhat different style of ethnographic research has been developed in the
context of industry and commerce. This work differs in scope and duration.
It does not bear the same implied scholarly and cumulative relationship to
academic ethnographic research.
Within the context of industry, ethnographic research has different demands
placed on it and different gauges are used to assess the relative success or
failure of an ethnographic project. A key demand placed on ethnographic
research in this new context is that of immediate relevance. Equally, multiple
stakeholders expect the research to be actionable. The term actionable refers
to the requirement that the research must be problem-directed and result in
an analysis that produces results that are easily consumed, understood, and
acted upon by other stakeholders in an enterprise (for example, research that
produces engineering requirements leading to product specifications).
For research to have an impact in any organization it needs to circulate so that
the findings can be understood, conclusions discussed, and the implications
fleshed out by relevant stakeholders. It could be argued that research that does
not travel through an organization can only ever have limited impact, since it
is on this journey that audiences can create for themselves a meaning for the
research. However, the impact of ethnographic research in the context of large
organizations, and beyond, can often be impaired by issues of circulation.
Creating outputs that travel well is all-important to developing what we call
ethnographic liquidity, that is, the ability of research to be converted into
something of value by its audiences.
Therefore, another way of thinking about the actionability of ethnographic
research is through the lens of liquidity. If research activity and its output
are liquid, they are readily exchangeable. The research creates debate, is able to inform
existing activities, and creates the basis for new endeavors. Liquid research will
perform different functions for different members of an organization, because
it is multivalent and contains different layers of utility. For example, engineers
and designers might focus more on design principles as a means of guiding
their work. Marketing may take more note of needs and compelling ways of
telling their business story.
In short, the best ethnographic research in an industrial context is research that
has continued resonance and meaning. In essence, there are two competing
demands placed on ethnographic liquidity. First, the research must be
conceived and conducted to meet current organizational objectives. Second,
the research must have prolonged and ongoing relevance for the organization.
When we commenced the design of the Global Aging Experience Study, the
team was new to a recently created business division. The organization to
some degree was still, to borrow the concepts of Tuckman [11], storming and
forming. As a result, the research was conducted as much to shape the activities
of a business division as it was to meet a pre-defined set of research needs. An
explicit goal was to create a piece of liquid research that would endure and have
long-term value.
The Global Aging Experience Study was designed accordingly. A foundation
of the research was an extensive review of gerontological literature and its
dominant theories of aging. This review led to the creation of domains
of experience that were used to create the interview guides and the coding
system later applied to our data. While this grounded theory research approach
is commonplace in the academy, it is less so in industry. (Researchers and
management in industry generally perceive this approach as too academic and
too resource intensive. The members of the Digital Health Group, however,
viewed this approach as a wise, long-term investment.) The domains we created
remain relevant to the aging experience and have enabled other researchers to
access and mine the original data we collected for other projects within the
Digital Health Group, a practice that will continue for the foreseeable future.
Another aspect of the ethnographic liquidity of our research is the multiple
artifacts we have created to disseminate the research. These include video,
PowerPoint* presentations, academic papers, printed internal and external
reports, press releases, conference talks, posters, and an opportunity map. Key
to this strategy of dissemination has been the production of materials in an
analog format.
We have focused on the production of high-quality, printed material in the
form of booklets. Booklets are used to draw together large projects presenting
ethnographic profiles, country overviews, and illumination of the analysis of
the material. The booklets aim to provide content that will become on-going
resources that diverse readers find useful (for example, by including relevant
statistics and demographic and market data). Our experience has been that
focusing on materializing research in this way has allowed us to increase the
liquidity of our work. Perhaps counter-intuitively, moving ethnographic
material out of digital formats into paper formats can make it easier to
circulate. Further, its existence in an organization becomes more durable.
Materials in this format have been crucial to supporting the persistence of our
research in the geographically dispersed, digital workplace. Furthermore, we
have found that producing versions of our documents [12, 13] for external
audiences further enhances the liquidity and durability of the original research.
Because of the ethnographic liquidity characterizing our work, the organization
has made repeated use of it. The following represents a sample of activities to
which the Global Aging Experience Study has contributed, either in part or in
whole:
•• Deeply informed the initial conceptualization and subsequent development
of two major product incubation efforts, one of which has developed into a
key program within the Digital Health Group.
•• Defined a further research agenda examining social health and social
participation in more detail within the context of the European Union and
the United States.
•• Resulted in the submission of 60 or more invention disclosures for the
Digital Health Group.
•• Informed major research efforts in the Technology Research for
Independent Living (TRIL) research partnership between Intel and the
Irish government within the areas of social connection, cognitive decline,
and falls. One specific example is how older people negotiate movement
through their homes and develop pathways in the household. This insight
informed the development of a sensor strategy.
•• Developed an operational blueprint for ethnographic research of this scale
and complexity and identified best practices for knowledge transfer.
•• Launched deeper research efforts in the areas of sleep and sociality,
transportation and mobility, social care, active retirement groups, and the
reinvention of retirement.
Figure 6: Front Covers of Two External-facing
Documents
Source: Intel Corporation, 2009
Conclusion
In this article, we have discussed the Global Aging Experience Study in
the context of ethnographic liquidity and knowledge transfer. The project
afforded our team an opportunity to develop tools and artifacts that have
led to the continued relevance of our research to the Digital Health Group.
One can argue that conducting fieldwork and analyzing data represents the
comparatively easy parts of any research project. The real challenge when
conducting ethnographic research in the context of industry comes in the
practice of sustained knowledge transfer and in the ability to drive the insights
gained and communicated through research into a program for development.
References
[1] Beck, Barbara. "A Slow-Burning Fuse: A Special Report on Aging Populations." The Economist, June 27th, 2009, page 3.
[2] Ibid.
[3] "Aging population: Facts behind the fiction." At news.bbc.co.uk
[4] World Health Organization. At www.who.int
[5] Esping-Andersen, Gøsta. 1990. The Three Worlds of Welfare Capitalism. Princeton University Press.
[6] Ferrera, Maurizio, ed. 2005. Welfare State Reform in Southern Europe: Fighting Poverty and Social Exclusion in Greece, Italy, Spain and Portugal. Routledge: New York.
[7] Goodin, Robert E., Headey, Bruce, and Muffels, Ruud. 1999. The Real Worlds of Welfare Capitalism. Cambridge University Press: Cambridge.
[8] Scheper-Hughes, Nancy and Lock, Margaret. "The Mindful Body: A Prolegomenon to Future Work in Medical Anthropology." Medical Anthropology Quarterly, Volume 1, Issue 1, 1987.
[9] Timonen, Virpi, Doyle, Martha, and Prendergast, David. No Place Like Home: Domiciliary Care Services for Older People in Ireland. Liffey Press: Dublin, Ireland, 2006.
[10] Althusser, Louis. 1969. For Marx. Translated by Ben Brewster. Verso: London.
[11] Tuckman, Bruce. 1965. "Developmental sequence in small groups." Psychological Bulletin 63 (6): 384-399.
[12] Roberts, Simon and Prendergast, David. 2009. "Community Supports for Aging: Care Pathways for Older People in Europe." Digital Health Group, Intel Corporation.
[13] Plowman, Tim, Roberts, Simon, and Prendergast, David. 2007. "The Global Aging Experience Project: Ethnographic Research." Digital Health Group, Intel Corporation.
Author Biographies
Tim Plowman, PhD, is a Senior Research Scientist and a Medical
Anthropologist for Product Development in the Digital Health Group at
Intel. He has been leading innovation from a social science perspective in
Silicon Valley for over a decade. His current work focuses on aging and
possible instrumental and non-instrumental applications of information and
communication technologies in support of successful aging. He holds multiple
patents and has published widely on social science, technology, and design. His
e-mail is tim.plowman at intel.com.
David Prendergast, PhD, is a Social Anthropologist in the Digital Health
Group at Intel and a Principal Investigator of the Social Connection Strand of
the Technology Research for Independent Living (TRIL) Centre. His research
focuses on later-life course transitions, and he has authored a number of
books and articles on aging and health. David utilises ethnographic research
to co-design and iteratively develop culturally appropriate, independent-living
technologies for older people. His e-mail is david.k.prendergast at intel.com.
Simon Roberts, PhD, leads the European social science and design team
for Product Research and Incubation in Europe and is the Intel Principal
Investigator on the ethnographic research strand of the TRIL Centre. He is an
Applied Anthropologist with over ten years of experience translating research
into product, service, and business opportunities. Simon has published widely
on ethnography, the application of anthropology, technology, design, and the
aging experience. His e-mail is simon.w.roberts at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Adapting Technology for Personal Healthcare
Contributors
Terrance (Terry) Dishongh
Intel Corporation
William (Bill) DeLeeuw
Intel Corporation
Mark Francis
Intel Corporation
Index Words
Sensors
Surveys
Social Care
Personal healthcare
Long-term Care
Abstract
The increase in the aging population in the United States, coupled with
healthcare costs that consume nearly 20 percent of US Gross Domestic
Product, is straining the ability of the government to deliver quality care.
One way to ease this strain on our national budget would be to use technology
to enable the aging population to age at home. Technology could be used to
support the independence of the elder through monitoring the activities of
daily living (ADL), tracking cognitive health, proactively managing chronic
health conditions, alerting to falls, and providing a system to promote social
engagement. This alerting, monitoring, and tracking can be done via home-based technologies that provide information to formal and informal caregivers.
Effectively, with the use of technology, the care model shifts from a large
centralized, provider-centric one to a home- and community-based model of
distributed care.
Introduction
In this article, we review a variety of tried and tested technical solutions to
address the care and needs of the elderly in the United States. Our solutions are
broadly characterized into three types: surveying, sensing, and social care.
•• The surveying solution involves periodic interactive dialogues with the
senior to assess functional, physical, and mental health while also providing
a communication platform for health education. Queries may be as
simple as a repeated list of standard questions or as complex as a system of
personalized, condition-specific questions with branching logic.
•• The sensing solution is achieved by using active or passive sensing
technology to collect information about the person’s behaviors, activities,
and functional status.
•• The social care solution involves encouraging seniors and providers to
actively participate in social communities, nurture relationships, and to
break down social isolation as individuals age-in-place alone.
We show how these three technical solutions were used by seniors and
providers to enable personal healthcare, enhance care, and improve quality of
life.
40 | Adapting Technology for Personal Healthcare
A Stressed Healthcare System
In the past, the majority of people in the United States did not live beyond 50
years of age. In 1900, for example, life expectancy was 47 years of age. In 2010,
life expectancy in the United States will exceed 80 years of age [1]. As a result,
there are now more than 39 million Americans over the age of 65. Globally, the
current population of individuals aged 65 years and older exceeds 500 million.
By 2030, it will exceed 1.5 billion. The existence of hundreds of millions of
individuals aged 65 and older is a 21st-century phenomenon [2].
Historically, when the need for long-term care occurred, this need was met by
family members. It was only in the last century that long-term care emerged
as an industry. The 1940s were marked by a surge in the construction of
traditional nursing homes and the institutionalization of care. The 1960s saw
the establishment of Medicare (the government-run, healthcare system for
the elderly), Medicaid (a government-funded, healthcare system for certain
low-income individuals and families), and the establishment of the Older
Americans Act that launched a series of services designed to support aging-in-place and home-based care. Over the past twenty years, we have witnessed the
shift toward home- and community-based care, as seen in the rapid growth of
the senior housing industry and the creation of numerous, innovative models
of aging-in-place services.
Today, long-term care is prevalent and the costs are staggering:
•• Over 40 percent of US households are involved in providing elder care [3].
•• The average out-of-pocket cost to family caregivers exceeds $5,500 annually [3].
•• The cost to employers due to lost productivity and absenteeism exceeds $1,142 per year per employee [4].
•• The average annual cost of nursing homes exceeds $72,000 [5].
•• The average annual cost of assisted living services exceeds $30,000 [6].
As baby boomers (a term used to describe those born during the post WWII
baby boom) move into their 70s, the demand for long-term care services will
rise dramatically. If nothing is done, this additional demand could break the
long-term care system. However, these factors create a tremendous opportunity
for technology to play a role in enabling personal care, enhancing healthcare,
improving quality of care, and relieving some of the stress on the healthcare
system.
Users of a Technology-Enabled Personal Care System: Seniors, Caregivers, and Family

Figure 1 presents the aging services continuum as four stages, ordered by average age of entry:

Independent Living Services (55+)
Demographic: Active Boomer adults in the workforce; sandwich generation; new healthcare consumers. Average length of service: 20 years. Services: cognitive fitness; informational portals; healthy aging and maintaining youth; discounts.

Assisted Living Services (75+)
Demographic: Beginning of functional decline; living with chronic disease ("uh oh moments"). Average length of service: 4-6 years. Services: cognitive fitness; medication management; lite IADL and ADL support; chronic disease management lite (Healthy Heart); meals; PERS/safety.

Skilled Nursing Services (80+)
Demographic: 2+ ADL issues; Alzheimer's; faller. Average length of service: 2-3 years. Services: support of 2+ ADLs; cognitive health; active falls; wandering; post-acute episode care; functional foods/meals; CDM; PERS/safety.

Final stage (85+; stage title not legible in the figure)
Demographic: Seriously ill, end of life; need for constant supervision. Average length of service: 0.5-2 years. Services: active medical care; 24/7 monitoring; active ADL support; meds management; chronic care.

Figure 1: Transitioning through the Aging Services Continuum
Source: Intel Corporation, 2009
As individuals transition through the second half of life, they traditionally
proceed through stages of increasing functional, physical, and cognitive decline
over time, as shown in Figure 1. For some individuals, this is a slow, gradual
process. Other individuals maintain healthy and active lifestyles until the very
end of life. In either case, throughout this process, the senior, caregivers, and
family form a triad to address emerging and evolving needs.
At-home technology can complement the work of caregivers such as physicians,
nurses, wellness coaches, care coordinators, family members, neighbors, and
friends. Technology provides new and innovative ways to monitor the wellbeing
of older people, increase the levels of communication between all concerned,
and enable response to rapidly changing conditions. The new awareness that
long-term care includes clinical and non-clinical caregivers as well as local and
distant family and friends is playing a large role in transforming long-term care
services.
Home-care systems allow people to monitor themselves with devices that
give proactive warnings of illness, thereby enabling patients to turn to their
caregivers earlier, when intervention can be the most effective. For caregivers,
home-care systems mean more efficient and effective healthcare, driven
by seniors who take greater responsibility for their own health. William
Herman, director of the Division of Physical Sciences in the Food and Drug
Administration’s Center for Devices and Radiological Health (CDRH), calls
home-care systems “the fastest growing segment of the medical device
industry” [7].
The Home Environment
In developing technologies to support aging-in-place and to enhance long-term
care, the following key principles should be kept in mind:
•• The focus is healthcare; often a new technology is not what is needed;
instead, the application of an existing technology might be appropriate.
•• At the center of this healthcare issue are the senior and the caregiver, not
the technology. Without cooperation from the users of the system, the
system may never be useful.
•• There is often more than one way to achieve a clinical or care objective: the
first technology solution that appears may not be the best.
•• The simpler the technology, the better.
•• Technology solutions for healthcare are mission-critical; reliability is of
paramount importance.
•• The technology has to work in the home, not just in the lab.
•• Usability by the senior and the caregiver is paramount.
Two of the biggest challenges to the utilization of technology in a home are
usability and installation. The usability issue is critically important in that
compliance often hinges on usability. For what we call our survey and social
care solutions to work, choices about user interfaces, display colors and font
sizes, method of interaction, device type, device size, and portability all
have a significant impact. For a solution to reside in an individual's home, we
must understand where a system will be used and the available space in the
home for that system.
The installation issue is particularly challenging. Often a new technology
requires multiple devices, batteries, cables, cable management, and wireless
data connections. Integrating all of these components and installing them is
often more challenging than anticipated.
Moreover, variability in home construction material, room size, room layout,
and environmental factors (lighting and air flow) can impact the technology
deployed and the reliability of the results. For our sensor solutions, ongoing
maintenance requirements, such as changing batteries, repositioning sensors, or
updating the type of sensor, may also create obstacles to adoption.
Ironically, while technology may feel threatening to those in the home, the
home environment is actually a fairly inhospitable place for technology.
Drops, strikes, water damage, and tampering, among other things, are very
likely to occur in a home and must be anticipated when technology is being
designed. Children, in many cases more technology savvy than their parents or
grandparents, are an active threat and may drive new security requirements that
require biometric authentication and tamper-detection technologies.
Finally, another goal of in-home monitoring technology should be that it
be hidden in plain sight. Being readily available and convenient are critical
elements for fostering regular use, but many people with chronic disease have
no desire to share that part of their life with others. Therefore, a technology
will benefit from a design that enables the parts of the technology to function,
potentially all over a home, but not draw unwanted attention to the resident’s
health. The use of Personal Emergency Response System (PERS) devices is a
classic example of a market reaching a premature plateau because of the stigma
attached to wearing a PERS pendant. PERS market penetration has flattened to
a 16 percent adoption rate due to this usage issue, and 40 percent of PERS users
do not wear their pendants.
We now explore in more detail the three technology solutions that we evaluated
and tested: surveying, sensing, and social care.
Surveying
It is now known and accepted that proactively managing health can lead
to healthier choices, improved accountability, increased quality of life, and
enhanced health. In an era where chronic disease is the leading cause of
hospitalization and the major driver of healthcare expenditures, engaging
individuals to modify behavior and make better lifestyle choices can have a
profound impact on health, wellness, utilization of services, and system costs.
Wellness surveys are one way to approach behavior assessment, encourage
change, and achieve better outcomes.
Wellness Coach and Surveying
The increased awareness of the importance of proactively engaging individuals
in managing their own health has led to a proliferation of health coaching
services. From the Mayo Clinic Embodied Health Services to numerous mobile
health applications, these services run the gamut from clinical to consumer-focused programs.
Through the efforts in the Product Research and Incubation Group, we have
conducted numerous research initiatives around the concepts of wellness
coaching and surveying. We also employed non-clinical wellness coaches to
use technology to improve the cognitive and social health of seniors, as well
as to use surveys to explore how seniors could stay at home and age with as
much dignity, independence, and functional, cognitive, and physical health as
possible.
Our surveying research involved periodic interactive testing or queries of
the elder in their homes to estimate physical, functional, and mental health.
This interactive testing involved using either dedicated testing hardware or
measurable extensions of existing household items or activities.
Our surveying methodology varied. In some cases, we used queries as simple
as a repeated list of questions; in other tests, queries as complex as a system
of branching logic. This flexibility allows the healthcare or eldercare provider
who monitors survey answers to customize future queries or test sequences for
specific patient situations.
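The branching-logic approach can be sketched as a small question graph in which each answer selects the next query. The questions, thresholds, and field names below are hypothetical illustrations, not the content of our deployed surveys:

```python
# Minimal sketch of a branching-logic survey. Each node names a question and a
# rule that picks the next node from the elder's answer. All questions and
# branch rules here are hypothetical examples.
SURVEY = {
    "sleep": {"text": "How many hours did you sleep last night?",
              "next": lambda ans: "fatigue" if float(ans) < 6 else "mood"},
    "fatigue": {"text": "Do you feel tired today? (yes/no)",
                "next": lambda ans: "mood"},
    "mood": {"text": "Rate your mood from 1 to 5.",
             "next": lambda ans: None},  # end of survey
}

def run_survey(answers, start="sleep"):
    """Walk the survey graph and return the (question, answer) pairs asked.
    `answers` maps a question id to the elder's response."""
    asked, node = [], start
    while node is not None:
        ans = answers[node]
        asked.append((node, ans))
        node = SURVEY[node]["next"](ans)
    return asked
```

A provider could customize future sessions simply by swapping in different `next` rules for specific patients.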
Sensing
Any at-home healthcare solution must detect and respond to the activities
and characteristics of the older person. A network of sensors (either worn,
carried, or environmental) is a good platform for detecting and responding to
health-relevant parameters, such as movement, breathing, pulse, heart rate, or
social activity. Sensors strategically placed on the human body create a wireless
body area network (BAN) that can monitor various vital signs while providing
real-time feedback to the user and medical personnel. Further, wireless sensors
can be deployed in a patient’s home environment to provide real-time and
extended monitoring of activity and wellbeing.
When coupled with communications technologies, such as mobile phones and
the Internet, the wireless sensor network (WSN) can keep family, caregivers,
and doctors informed, while also establishing trends and detecting variations
in health. Appropriate technology design can minimize intrusion and protect
privacy, maximize user friendliness, and encourage long-term adherence to
medical regimens.
In some cases the wireless sensor network requires no direct patient interaction
(this is ideal where the patient may be suffering from some level of cognitive
decline). Because WSNs are deployed where the patient spends most time, they
can deliver long-term data sets to assist in diagnostics and patient response to
interventions. The data collected by a WSN can be stored and integrated into
a comprehensive health record of each patient that may allow identification of
subtle changes in a person’s health. For example, changes in gait might indicate
the onset of Parkinson’s disease in advance of visible symptoms [8]. Other
example applications include medicine reminders, social connectivity, and
emergency communications.
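One simple way such longitudinal records can surface subtle changes is to compare a recent window of a measurement against its long-term baseline. The sketch below uses a hypothetical z-score rule on daily gait-speed values; the window sizes and threshold are illustrative, not clinically validated:

```python
from statistics import mean, stdev

def flag_subtle_change(daily_values, baseline_days=30, recent_days=7, z=2.0):
    """Flag when the recent average of a longitudinal measurement (e.g. daily
    gait speed in m/s) drifts more than `z` baseline standard deviations from
    its long-term baseline. All thresholds here are hypothetical."""
    if len(daily_values) < baseline_days + recent_days:
        return False  # not enough history collected yet
    baseline = daily_values[-(baseline_days + recent_days):-recent_days]
    recent = daily_values[-recent_days:]
    spread = stdev(baseline)
    if spread == 0:
        return mean(recent) != mean(baseline)
    return abs(mean(recent) - mean(baseline)) > z * spread
```

The same comparison could run over any parameter the WSN records, feeding a clinician dashboard rather than raising alarms directly.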
BANs can detect small changes in vital signals — such as heart rate and bloodoxygen levels — that are not obvious in a one-off visit to a doctor. Further,
BANs address the issues associated with so-called “white coat syndrome,”
skewed physiological and cognitive test results that are related to the stress
and anxiety associated with a clinical visit. Measurements can be made
unobtrusively over an extended period in a supportive home environment that
more accurately reflects the true values for a given parameter.
As well as offering excellent long-term care benefits, the always-on nature of
WSNs means that they can be used to detect and respond to health crises in a
timely manner. In particular, WSNs can provide notice of significant shifts in
key physiological parameters and may save lives, or initiate preventative care in
order to forestall a health crisis.
The potential value of WSNs for healthcare has not yet been fully explored. A
significant research effort will be required to mine the vast data sets collected by
WSNs and to find patterns that reveal information of interest to the healthcare
professional. Breakthroughs in this area may support semi-automated analysis,
diagnosis, and treatment processes in the future. Doctors will be assisted
by an electronic counterpart, and patients’ health may benefit as a result of
faster diagnosis and treatment of diseases. Other quality-of-life issues, such as
privacy, dignity, and convenience, are supported and enhanced by the ability to
unobtrusively provide services in the patient’s own home [9].
Overview of our Methodology
In this section we present an overview of our general methodology for a WSN
solution. Specifically, here are criteria for developing and deploying a WSN in
healthcare [10].
•• Understand the problem. A clinician may already have analyzed the problem,
and have come up with a technology outline for the engineer to develop.
However, further analysis may pay dividends. At-home healthcare means
that the appropriate environment to formulate any technology solution is
not the clinic or the laboratory, but the home. Ethnographic observation of user behavior, which builds an understanding of how people live and why they live as they do, along with their day-to-day constraints and priorities, can be very beneficial and has informed many of the solutions in which we are involved.
•• Understand the end user. Who will use the solution in the longer term?
What are their constraints or priorities? How do they feel about the type
of solution envisaged? Usage modeling, where multi-disciplinary teams use
personas to explore the user experience from several perspectives, can be
useful here.
•• Understand what data must be collected. In order to test a clinical hypothesis
or to achieve a care objective, some information about the activity or
health state of the patient must be collected. It is important that a clear
understanding of what data will be collected by the sensor network is
shared by the clinician and the engineer, from the start. The information
that the clinician needs is important, not the data that the sensor has the
technological capability to collect. Focus on what is needed, rather than on
what the technology allows.
•• Understand the environment. Many WSN deployments do not achieve their
objectives because they fail to take into account the differences between the
laboratory and clinical environment, and the home environment. Building
methods and materials vary from location to location; these may impact
the technology. Other equipment in the home may also interfere with the
sensor technology.
•• Consider sensor location. Are sensors to be worn, to be held, to be embedded
in walls, floors, beds, or furniture? Keep in mind that the sensor network
must be as unobtrusive as possible — solutions that require the patient to
change his or her day-to-day behavior or which affect the comfort, privacy,
or dignity of the elder person are unlikely to be successful on a long-term
basis. Almost all at-home solutions aim to be long-term.
•• Select sensors and actuators. Take into account the data, the environment,
and the placement. What impacts do these have on power consumption,
size and weight, form factor, radio communications, antenna type
and computational capability? Wherever possible, use off-the-shelf components, and avoid experimental or prototype technology. Remember
this is a healthcare problem, not a technology research project. Aim for a
low-cost, light-touch solution.
•• Specify and build the aggregator. The aggregator receives data from all the
sensors in the network, and may transmit them to an analysis engine or
data visualization system. Alternatively, the aggregator may itself process
the data and trigger responses by actuators or by communications with
caregivers or doctors. In general, aggregators take the form of local PCs or mobile devices, such as PDAs and smart phones. We do not cover
aggregator design in any detail in this article, because one typically uses
mass-market devices such as phones and PCs.
•• Identify and deploy analysis and visualization capability. Ensure that the
technology is in place to convert the data from the WSN to clinical
information, and to allow the clinician to access this information in an
appropriate manner. This may be out of the scope of a purely sensor-network project, but it is important that in such a case the precise nature
of the interface between the network and the back-end data analysis and
clinician system is clearly defined. The technology used for visualization
is not explored in this article; analysis and visualization techniques and
technologies are quite standard across many application domains.
•• Build the solution in the laboratory and verify that all elements work as
planned. Discuss the solution with clinicians and end users.
•• Prototype and pilot test the solution in a controlled, friendly environment.
The solution should be used in the home of a study participant who has
agreed to act as a tester. In these friendly trials, the importance of proper safety testing and precautions cannot be overstated. Before deployment to the home, even for prototyping purposes, any Federal Communications Commission or local radio licenses, UL or CE certifications, and human subjects approvals (Institutional Review Board (IRB) approvals, signed agreements, etc.) must be obtained.
•• Deploy the solution in the real world. Continue to engage with end-users to
detect and correct any issues that may lead to failure to use the solution as
planned.
Overall, we have learned that deploying an effective sensor solution is
a complex endeavor. As such, proactive planning, clear objectives, and
forethought of the end-stage solution are critical to a successful design and
deployment of a WSN solution.
Context Aware Medication Compliance (CAMP) Trial
Medication compliance is a vexing issue at the core of many challenges in
healthcare. In a recent report in the New England Journal of Medicine, Osterberg and Blaschke [11] report that typical adherence to a medication regimen ranged from 43 percent to 78 percent, depending on the age of the population researched and the type of medication taken. In other words, roughly half of all patients typically fail to comply with their medication regimen.
Our team engaged in a context aware medication compliance (CAMP) trial
to develop a sensor network system that is contextually aware. We wanted to
test adherence to a medication regimen over several months, in the hope of improving the rate of adherence over time.
Typical products on the market for medication adherence fall into two categories. The first acts like a timer or alarm clock: the device issues an alarm at the preset time when the person should take the medication. Although these devices are highly accurate in measuring time and delivering an alarm, they often were not co-located with the medication at the time of the alarm. Moreover, these alarms prompted so often that they became a nuisance and were often ignored. The second category stores and dispenses medications. These devices are common in many households and take the form of a system with cells in multiples of seven for the days of the week. Medication is stored in these devices, and the device is often located near the activity center associated with the time of day the medication should be taken. These systems fail when the user's routine is disturbed.
Our system was designed to track the location of users, monitor the sleep
activity of users, monitor the adherence to medication regimens, and drive
prompts to various devices throughout the home, based on the appropriate
time, location, and activity that would ensure the best adherence.
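A context-aware prompting decision of this kind can be reduced to a small predicate over time, location, and activity. The rules and names below are a hypothetical sketch of the idea, not the CAMP trial's actual logic:

```python
from datetime import time

def should_prompt(now, location, activity, dose_window, dose_room, taken):
    """Decide whether to prompt for a medication dose. Prompts only when the
    dose is due, not yet taken, the user is in the room where the medication
    lives, and the user is not asleep. All rules here are hypothetical."""
    start, end = dose_window
    due = start <= now <= end
    return due and not taken and location == dose_room and activity != "sleeping"

# Hypothetical evening dose kept in the kitchen:
window = (time(18, 0), time(21, 0))
print(should_prompt(time(19, 30), "kitchen", "idle", window, "kitchen", False))      # True
print(should_prompt(time(19, 30), "bedroom", "sleeping", window, "kitchen", False))  # False
```

Suppressing prompts outside the right context is what distinguishes this approach from the alarm-clock products described above.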
Planning Sensor Deployment
A key issue for planning is the layout of the deployment environment (the
subject’s home) and the identification of ideal sensor locations. Ideal locations
reflect the range of the radios being used (Zigbee*, Bluetooth*), the size of the
rooms, and the sensitivity of motion sensors. Figure 2 shows an example floor
plan with sensor locations. The sensor locations must be verified and validated
after the installation, to ensure that data are being captured and communicated
as planned.
Figure 2: Floor Plan and Sensor Placement. The plan marks a base station, bed sensor, Imed, pillbox, watch charger, DSFridge, door sensors, activity beacons, motion and ceiling motion sensors, a phone jack, and a cable connection across the living room, bedroom, kitchen, bathroom, and closets.
Source: Intel Corporation, 2009
The choice of radio technologies can have a profound impact on the success of
communications within a project. Similar devices from different suppliers may
perform in dramatically different ways.
Moreover, we learned through trials deployed in homes throughout the Southwestern United States and the Pacific Northwest that construction methods and materials profoundly affect wireless technology. Homes with significant wood-frame construction create a very different radio environment than do homes constructed of stucco, steel frames, and other materials.
Though obvious in hindsight, it was surprising at the time that variation in the materials and methods of home construction had such a significant impact on sensor performance. Understanding the materials used in home construction therefore directly affects both the type of sensors selected and the manner in which a WSN is designed and deployed.
Participant Tests
In the course of the CAMP project, we deployed a pillbox with sensor-equipped doors. Each time a door in the pillbox was opened, a signal was sent
to the aggregator, indicating that a pill had been taken. However, one subject
left the door open, as a reminder to himself to take the second pill; this was not
as envisaged by the team, and it led to unexpected inferences by the system.
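The left-open door shows why raw sensor events need interpretation before they become adherence data. One way to harden the inference, sketched below as a hypothetical rule rather than the fielded CAMP logic, is to count a dose only on a closed-to-open transition:

```python
def doses_from_door_events(events):
    """Infer pill-taking events from pillbox door signals.
    `events` is a time-ordered list of (timestamp, state) pairs with state
    'open' or 'closed'. A dose is counted only on a closed->open transition,
    so a door left open does not generate repeated dose inferences.
    This rule is an illustrative sketch, not the deployed CAMP algorithm."""
    doses, prev = [], "closed"
    for ts, state in events:
        if prev == "closed" and state == "open":
            doses.append(ts)
        prev = state
    return doses
```

Even this rule would still miss the subject's second pill in the anecdote above, which is precisely the kind of behavior only real-world trials reveal.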
Fluorescent Lamps and Infrared Signals
Common fluorescent lighting fixtures driven by electronic ballasts emit
low-frequency infrared signals modulated in the kilohertz range. This signal
interferes to a very significant degree with infrared communications between
devices and aggregators, or among devices. Interference from fluorescent
lighting needs to be taken into account when choosing the technologies to
deploy, and also when positioning devices in the home. Bear in mind that some fluorescent lights, such as those over kitchen counters, are used only occasionally, so the interference may be intermittent.
Parkinson’s Disease At Home Test
One option where technology in the home may improve upon the modern
hospital and clinic framework is in the area of interactive monitoring of chronic
or degenerative diseases. An ecosystem of technologies has grown around a
few individual diseases, such as diabetes, and frequent at-home monitoring of
disease state tends to be far superior to infrequent but highly detailed in-clinic
assessments.
In 2004, we were approached by a not-for-profit Parkinson’s Disease research
support foundation interested in technologies to better monitor the symptoms
of Parkinson’s disease. This interest was fostered by a series of meetings between
the foundation and prominent neurologists from seven different institutions.
Each of the neurologists present had recent experiences with various techniques
to quantitatively measure the symptoms of Parkinson’s disease. Together, the
neurologists expressed interest in a single apparatus, designed around multiple
measurement techniques, and capable of daily or weekly evaluations of the
symptoms of Parkinson’s disease outside of a clinic or lab [12].
Historically, Parkinson's disease testing has consisted of two or three visits per year to a neurologist, where somewhat subjective estimates of the patient's condition were made using the Unified Parkinson's Disease Rating Scale (UPDRS). While the UPDRS has been a gold standard for years, the neurology
team and the foundation felt its typical application lacked frequency and did
not produce truly quantitative measurements.
Intel, the Parkinson’s Disease foundation, and the team of neurologists formed
a set of requirements and a study protocol to (1) establish evidence that
in-home interactive sensing of Parkinson’s disease is viable and would be used
by patients on a regular basis, (2) assess the effectiveness of the various
quantitative measurement techniques brought forward by the team of
neurologists, and (3) establish if those measurement techniques could show
variation in measured symptoms from week to week. Based on this set of
requirements and the new protocol, designers and engineers at Intel created an
At Home Test (AHT) for Parkinson’s disease.
Figure 3: The Parkinson's Disease At Home Test Apparatus
Source: Intel Corporation, 2009
The AHT was designed to measure the following:
•• Large limb motion
•• Fine motor skills
•• Fatigue
•• Ability to maintain rhythm
•• Tremor frequency and duration
•• Reaction time
•• Free speech (voice inflection)
•• Continuous vocal tone
Metrics designed around gait measurement or other ambulatory situations were rejected from the final design, due to the potential for subject injury in an unsupervised environment.
The study consisted of 52 subjects, all in the early stages of Parkinson’s disease,
who each took an AHT apparatus home for six months of once-weekly testing.
Prior to the six months of weekly tests, the subjects each took a test once every
day for a week to ensure they were well practiced with the movements and
to ensure that future study results would not reflect a learning curve. After
the practice sessions, the subjects were evaluated by their clinician to see if
they could actually follow the study directions. All but one subject were fully enrolled after practice; the exception was disqualified for unrelated health reasons. In general, subjects proved capable of using the AHT apparatus.
Due to the relatively late onset of Parkinson’s disease symptoms (average age at
diagnosis is 59) for most subjects, certain design considerations were taken into
account:
•• Failing eyesight.
•• Reduced range of hearing.
•• Discomfort with complex technology.
•• Presence of grandchildren who might tamper with or damage the
equipment.
Furthermore, because a movement disorder was known to afflict the study
subjects, we employed large buttons in a widely spaced configuration for all
navigation and selection tasks. We did not use any mouse or other pointing
device. Instead, subjects used just up, down, and select interaction buttons.
The impact of failing eyesight was mitigated by integrating a very bright and relatively large liquid crystal display (LCD) into the Parkinson's disease AHT apparatus. The LCD had large fonts and provided
a lot of empty space within screens. Furthermore, we provided all instructions
and cues in both text and through a voice reading of the text. Reduced hearing
ability was accommodated by using a large speaker and selecting an instruction
narrator with a lower-toned voice.
Finally, we made the design decision not to use a standard computer, where
booting and peripheral connection could interfere with regular use of the
AHT apparatus. Instead, we prototyped a series of small netbook-like devices
with integrated tests. The form-factor we selected in the end was rugged, small
enough for regular use on a counter or kitchen table, easily transported, and
nondescript so as to encourage subjects to leave it out and plugged in.
The initial intent was to have the Parkinson’s disease AHT apparatus connect to
a home network. After extensive review of the availability of high-speed
Internet, however, together with concerns about integrating the AHT apparatus
with existing home networks, we made the design decision to simply store test
data for later delivery to a central storage facility either physically or through a
computer Internet connection. We later decided to include a very basic and
inexpensive laptop computer equipped with a mobile broadband card to
facilitate more regular uploads of study data.
We anticipated that the AHT apparatus might incur system failures in the
field, so we also made the design decision to store all application software and
study data on an external storage device. This decision allowed us to replace a
defective system in the field without having to modify it for use by a specific
study participant or extract test data from a damaged unit. New systems
could simply be sent to replace defective or damaged ones, while the damaged
systems could simply be discarded. We decided on a USB storage device as the medium, and we developed significant data security measures to ensure that the authenticity of the information could be established and that the health information within the test data was protected from unauthorized viewing.
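Those two requirements, authenticity and protection from unauthorized viewing, map onto standard cryptographic primitives. The sketch below illustrates only the authenticity half with an HMAC tag over each serialized test record; the key, record format, and function names are hypothetical, and a real deployment would also encrypt the payload:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret; in practice this would be provisioned
# securely for each apparatus, never hard-coded.
KEY = b"hypothetical-device-key"

def seal(record):
    """Serialize a test record and append an HMAC-SHA256 tag so the central
    storage facility can verify the data came from the device and was not
    altered in transit. (Encryption of the payload is omitted here.)"""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return payload, tag

def verify(payload, tag):
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

Tampering with a single byte of the stored payload causes verification to fail, which is what lets a damaged or returned USB device be trusted or rejected.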
Because the systems were going to be deployed over much of the United States,
we designed the software system to be easily upgraded. Upgrades were provided
either through issuing new USB storage devices or by updating the software on
the storage devices during the data upload process. We also designed packaging
to protect the apparatus during air or ground freight.
The final AHT apparatus design was very well received [12] and even won a
major design award in 2006. All but one study participant found the device to
be easy to work with.
Although careful consideration and a great deal of organizational effort went
into building the AHT apparatus, we learned many lessons the hard way. As an
initial foray into the healthcare field, these lessons were as valuable, if not more
so, than anything the actual study might tell us.
First and foremost, we learned the value of very careful industrial design and its
impact on end-user acceptance. The system was beautiful and people wanted to
use it. Something less refined would have begged to be put in a drawer.
Figure 4: Parkinson’s Disease At Home Test
Screen Shot
Source: Intel Corporation, 2009
On the study side of things, we found that training was critical, especially for
the clinicians who would oversee the study subjects. They were the first line of
support for subjects and, in many cases, solved problems without our input
(until the end of the study). What seemed excessive at the time, flying all of the
clinicians to our offices at Intel for a day of intense training, turned out to be
money and time very well spent.
Finally, we found the sheer volume of data collected by the AHT apparatus to be intimidating, even to seasoned statisticians. There were so
many dimensions for analysis that forming a coherent plan to correlate
measurements with UPDRS scores was difficult.
In the end, we found that individuals with Parkinson’s disease would likely use
this type of interactive technology on a regular basis in their homes if it was
widely available. We also found that some quantitative measures showed a lot
of promise in predicting disease state [12], but the initial study was not sized
to prove any such assertion. The data, arguably one of the largest collections
of Parkinson’s disease measurements, have proven useful for additional
investigations into the progression of this disease.
We now move on to a discussion of the third aspect of our technology
healthcare solutions, what we call social care.
Social Care
Studies have shown that social isolation can lead to depression and loneliness
and can accelerate the onset and manifestation of illnesses of aging. In May
2009, the New York Times ran a front-page story on the increased use of social
media by senior citizens — and the direct and indirect health benefits of such
social engagement. There have been many other articles and reports discussing
the link between an individual’s health and the size and activity of their social
network [13].
In April 2008, the California HealthCare Foundation released a report titled The Wisdom of Patients: Health Care Meets Online Social Media [14]. The report illustrated the strong growth of Web 2.0 offerings focused on providing information and community to seniors and aging baby boomers, and it indicated that active participation in social networks would be a hallmark of a consumer-centric approach to healthcare. Access to health information, quality data, and provider rankings, coupled with forums, sites, and support groups for rapidly exchanging information, represents a fundamental shift in how healthcare is provided.
Social Health Trials
One of Intel’s earliest research projects focused on examining social networks
and social health. We hypothesized that we could detect and track the social
activity of a patient with Alzheimer’s disease, while also providing feedback
on that activity to the patient and a family caregiver. The desired result was
to keep the patient as socially engaged as possible. To that end, we asked each participant to define a social network of approximately ten friends and family members to be tracked. We built a sensor network in the home of each
participant that was designed to detect and track physical visits and phone calls
by anyone in the predefined social network.
We also built two interfaces for the patient and family caregiver: the first was a caller ID application that rendered a picture and provided contextual notes regarding the last conversation between the patient and the friend or family member; the second was a social networking galaxy model. The galaxy
model provided a picture of the patient in the center of the graphical universe
with pictures of the social network participants orbiting around the patient.
As the sensor network detected more or less social activity, each participant's orbit moved closer to or further from the patient, respectively [15]. This provided a simple model the patient could use to reflect
on their level of social activity with any given participant in their network. For
example, if a patient’s daughter hadn’t called in a few weeks, the patient could
see in the galaxy the orbit becoming more distant. Conversely, the orbits of
those with whom the patient had more frequent contact would become closer.
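The orbit mapping amounts to a simple function from contact recency to display distance. The pixel range, 30-day horizon, and function name below are hypothetical display choices, not the trial's actual rendering parameters:

```python
def orbit_radius(days_since_contact, min_r=40, max_r=300, horizon=30):
    """Map days since last contact to an orbit radius in pixels: recent
    contact orbits close to the patient at the center, and long silence
    drifts toward the outermost orbit. The pixel range and 30-day horizon
    are illustrative display choices only."""
    frac = min(days_since_contact, horizon) / horizon
    return round(min_r + frac * (max_r - min_r))

print(orbit_radius(0))    # 40  -- spoke today: closest orbit
print(orbit_radius(15))   # 170 -- halfway out
print(orbit_radius(60))   # 300 -- capped at the outermost orbit
```

Capping at the horizon keeps long-silent contacts visible at the edge of the galaxy rather than drifting off screen.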
Participants found the interfaces compelling, keeping them more mindful
of changes in their social activity than they otherwise would have been [15].
New models of interaction, particularly those focused on what one might call interactive journaling of patterns and daily activities, are promising solutions for keeping home-bound seniors and patients engaged and connected in their daily lives.
Conclusions
In this article, we have examined the healthcare transformation occurring across many parts of the world. Increased longevity and
the aging of the baby boomer population are creating a sea change in the
demographic composition of populations across the globe. Correspondingly,
societies are shifting from youth-orientated cultures to age-aware cultures.
Healthcare systems are shifting from treating the acute illnesses of youth to
managing the chronic illnesses of aging.
All of this is occurring at a time when healthcare is consuming more than 17
percent of the United States’ GDP, the quality of care is deteriorating, and the
healthcare delivery system is more challenged than ever. Meanwhile, more than
40 million people in the United States lack healthcare insurance. Millions of
others lack access — or cannot afford access due to the cost of care.
One solution to ease some of these strains on the healthcare system may be
to use technology to enable the aging population to age at home. Technology
can support independence of the elderly through monitoring activities of
daily living (ADL), tracking cognitive health, proactively managing chronic
health conditions, and providing a system to promote social engagement. All
of these services can be provided at home with the help of informal caregivers.
Moreover, informal home caregivers help to alleviate the shortage of skilled
nurses in the workforce today. Effectively, this technology shifts the care
model from a large, centralized, provider-centric care model to a home- and
community-based model of distributed care.
We have shown how three technical solutions were used in trials, adapting technology to enhance the quality of life for patients while providing providers with a platform for proactive health and wellness management. The adoption of these solutions can play a major role in changing the playing field, improving quality of care, and improving health outcomes.
References
[1] J.E. Cohen. “Human Population: The Next Half Century.” Science, 14 Nov. 2003, pp. 1172-1175.
[2] J.E. Cohen. “Human Population: The Next Half Century.” Science, 14 Nov. 2003, pp. 1172-1175.
[3] “Family Caregivers — What They Spend, What They Sacrifice. The Personal Financial Toll of Caring for a Loved One.” Evercare and the National Alliance for Caregiving, November 2007, page 9.
[4] “The MetLife Caregiving Cost Study: Productivity Losses to U.S. Business.” MetLife Mature Market Institute and National Alliance for Caregiving, July 2006.
[5] “Genworth 2009 Cost of Care Survey: Home Care Providers, Adult Day Health Care Facilities, Assisted Living Facilities and Nursing Homes.” Genworth Financial, April 2009, page 8.
[6] “Genworth 2009 Cost of Care Survey: Home Care Providers, Adult Day Health Care Facilities, Assisted Living Facilities and Nursing Homes.” Genworth Financial, April 2009, page 18.
[7] C. Lewis. “Emerging Trends in Medical Device Technology: Home Is Where the Heart Monitor Is.” http://www.fda.gov/
[8] B. Schlender. “Intel’s Andy Grove: The Next Battles in Tech.” Fortune, 12 May 2003, pp. 80-81.
[9] J. Takamura and B. Williams. “Informal Caregiving: Compassion in Action.” US Department of Health and Human Services, 1997.
[10] T. Dishongh and M. McGrath. Wireless Sensor Networks for Healthcare Applications. Artech House, 2009.
[11] L. Osterberg and T. Blaschke. “Adherence to Medication.” New England Journal of Medicine, vol. 353, pp. 487-497, August 4, 2005.
[12] B. Deleeuw. “Testing objective measures of motor impairment in early Parkinson’s disease: Feasibility study of an at-home testing device.” Movement Disorders, vol. 24, no. 4, pp. 551-556, March 15, 2009.
[13] S. Clifford. “A Reason to Keep On Going.” New York Times, June 12, 2009, Health Section, page D5.
[14] J. Sarasohn-Kahn. “The Wisdom of Patients: Health Care Meets Online Social Media.” California HealthCare Foundation, April 2008.
[15] M. Morris. “Social Networks as Health Feedback Displays.” IEEE Internet Computing, vol. 9, no. 5, pp. 29-37, Sept./Oct. 2005.
Acknowledgements
We acknowledge with thanks the hard work of our co-workers: Kevin Rhodes,
Brad Needham, Margie Morris, Jay Lundell, Eric Dishman, Tim Plowman,
Steve Agritelley, and Adam Jordan.
Author Biographies
Terrance (Terry) J. Dishongh, Ph.D., is currently a Senior Principal Engineer
in the Intel Digital Health Group. In his twelve years at Intel he has been
awarded 48 patents with an additional 44 pending and has published books
on wireless sensors and electronic packaging. Dr. Dishongh has held faculty
positions at the University of Maryland, College Park and the State University
of New York at Buffalo. His e-mail is terry.dishongh at intel.com.
William (Bill) DeLeeuw is an 18-year veteran of Intel and currently is an
“Innovation Architect” in the Intel Digital Health Group. He is responsible
for creating new health concepts consistent with Intel’s digital health strategy,
prototyping those concepts, and working with study coordinators to establish
feasibility. Bill has worked on PC FAX systems, applications sharing software,
consumer video conferencing, distance learning, consumer electronics,
experimental wireless systems, and user-centered design concepts. He received
a Bachelor of Science degree in Electrical Engineering from the University of
Texas in 1991. His e-mail is bill.deleeuw at intel.com.
Mark Francis, M.P.P., is a Product Marketing Manager in the Intel Digital
Health Group. Prior to joining Intel, Mr. Francis held senior management
or Board positions with Harrison Clinical Research, Health Hero Network,
Age Wave, and the City of Boston. Mark conducted his academic work at
the Harvard Business School, Harvard Kennedy School and the University of
Pittsburgh. His e-mail is mark.r.francis at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Healthcare IT Standards and the Standards Development
Process: Lessons Learned from Health Level 7
Contributors
Charles Jaffe
Health Level 7
W. Edward Hammond
Duke University
John Quinn
Accenture
Robert H. Dolin
Health Level 7
Abstract
Standardization of information in healthcare is critical to the ability of a diverse
community of caregivers to reliably exchange complex data without ambiguity.
Perhaps more importantly, every member of the healthcare team must be able
to re-use the data within increasingly diverse applications and environments.
In order to improve quality and to constrain the escalating costs of delivering
care and preventative medicine, the strategies for developing these standards
have attempted to keep pace with the developments in information technology,
human biology, healthcare policy, social science, and economics. For more than
two decades, Health Level 7 has been at the forefront of those processes for an
international community. New technologies, and new applications of existing
ones, foster standards development against a complex backdrop of social and
political demands.
Healthcare IT Standards: An International Perspective
Index Words
Healthcare Information Standards
Health Level 7
The history of standards has made the creation of standards difficult. The
global community of today did not exist when the healthcare IT community
recognized the necessity for health data standards to reduce the costs of
interfacing systems and to permit the exchange of data. The health-related
community began to be interested in standards during the early 1980s, and
several organizations were created during the late 1980s and early 1990s.
For clinical systems, the needs were primarily within a hospital and included
the domains of patient admission, transfer and discharge; lab test ordering
and result reporting; prescriptions and dispensing; materials management;
images; reimbursement; and other similar domains. New standards-developing
organizations were created, mainly focusing on only one component of the
many recognized needs.
Standards developing organizations (SDOs) that were creating standards in
communications, banking, manufacturing, and other non-health areas were
quite mature by this time. In the United States, ASTM (the American Society
for Testing and Materials) created Committee E31 for health data standards,
focused initially on the reporting of laboratory test data. Health Level Seven
(HL7) was organized in 1987 to create standards to support the development
of best-of-breed hospital information systems.
58 | Healthcare IT Standards and the Standards Development Process: Lessons Learned from Health Level 7
In Europe, the Comité Européen de Normalisation (CEN) formed Technical
Committee 251 in the early 1990s to create health data standards. There
was little competition between CEN and HL7, simply because there was no
apparent common market. In fact, the methodology that was used to create
later HL7 standards was influenced by CEN. In 1995, Germany became
the first international affiliate of HL7, followed shortly thereafter by the
Netherlands. From then on, the competition between HL7 and CEN began.
The relationship between members of both organizations was cordial, but both
HL7 and CEN were vying to have their standards used in Europe.
All countries have a national standards body: the American National Standards
Institute (ANSI) in the United States; the British Standards Institution (BSI)
in the United Kingdom; the German Institute for Standardization (DIN); the
French national organization for standardization, known as the Association
Française de Normalisation (AFNOR); and so on. The International Organization
for Standardization (ISO) was founded just after the end of World War II,
in 1947, and is an international standard-setting body composed of
representatives from these various national standards organizations. ISO
Technical Committee 215 was formed in 1998 through efforts of the U.S. and
the U.K. to create standards in Health Informatics. Since most countries, by law, require the use
of an ISO standard if one exists, ISO became a key player in international
standards. In 1991, recognizing that there were not sufficient expert resources
available for ISO and CEN to conduct their standardization activities
independently, ISO and CEN signed the Vienna Agreement to work together
to produce standards. The impact of this agreement, including the joint
development of standards and the sharing of standards, was felt independently
within each organization.
In 2002, HL7, following the lead of IEEE, signed an agreement with ISO,
through ANSI, that permitted HL7’s work and standards to be brought into
ISO upon approval of ISO TC 215. Although both of these agreements
resulted in a move toward the consolidation of work, standards continued to
be duplicated. In the spirit of harmonization, efforts were put into mapping
one similar standard to another across organizations where multiple standards
exist. An example of one such activity was the effort to create a single standard
for defining data types, by combining similar work from ISO, CEN, and
HL7. After over five years of work, a single standard had still not been defined.
Similarly, approaches to define a single global Reference Information Model
(RIM) failed, because of differences between the CEN standard EN 13606
and the HL7 RIM. This latter issue was further compounded when the HL7
RIM was approved as ISO/HL7 21731:2006-Health Informatics-HL7 version
3-Reference Information Model.
Over the next few years, HL7 submitted several HL7 standards to the ISO
to become joint HL7/ISO standards. These included HL7 version 2.5
Messaging Standards; HL7 version 3.0 Clinical Document Architecture, R2;
Common Terminology Server, R1; HL7 EHR–Functional Model; and Clinical
Genomics-Pedigree. In late 2005, HL7 submitted four regulatory standards to
ISO: Structured Product Labeling, Release 1; Individual Case Safety Report;
Stability Study; and Annotated Electrocardiogram.
None of these standards was accepted by TC 215 (most nations abstained).
In addition, the International Conference on Harmonization (ICH) was very
concerned, because it had standards of its own in some of these areas.
In 2006, at the ICH (International Conference on Harmonization of Technical
Requirements for Registration of Pharmaceuticals for Human Use) meeting in
Yokohama, leaders from ISO TC 215, CEN TC 251, and HL7 made presentations
to the group about their respective organizations. As a result, ICH became a
Class D Liaison with ISO and developed relationships with both CEN and HL7.
There was a real interest among the leaders of ISO, CEN, and HL7 to work
together, largely driven by the limited resources available to produce standards,
as well as by the confusion in the marketplace because of multiple standards.
At the Global Health Information Technology Standards Summit in Geneva in
2006, presentations by Dr. Yun Sik Kwak, Chair, ISO/TC 215; Kees Molenaar,
Chair, CEN TC 251; and Dr. Ed Hammond, Chair-elect, HL7 suggested the
three SDOs might be able to work jointly to produce a single global standard
for a single business purpose. After additional discussions, a charter was written
and was subsequently ratified by all three SDOs. The charter establishes a
Joint Initiative Council (JIC) that includes the chairs of the three participating
organizations plus two additional representatives from each SDO.
Figure 1 shows the global healthcare landscape. The JIC serves as a collaborative
forum for the international standards community. Within the US, the Federal
Health Architecture coordinates 55 Federal agencies. Vocabulary standards are
depicted in the oval boxes.
Figure 1: Global Healthcare Standards Landscape. At the center, the Joint
Initiative Council (JIC) links ISO, CEN, and HL7; around it sit other SDOs
(CDISC, IHTSDO, IEEE, ASTM, ASC X12, NCPDP, DICOM, IHE), national programs
(NHS, Infoway, NeHTA), and US Realm bodies (ANSI, SCO, HITSP, CCHIT, NQF,
FHA, ONC, NIST, caBIG, and HHS agencies including NIH, FDA, VHA, NLM, CDC,
DOD, NCI, CMS, and Home Sec). Vocabulary standards appear in ovals: SNOMED
(IHTSDO), LOINC, MedDRA (ICH), CPT (AMA), and ICD (WHO).
Source: Health Level 7
The term “joint initiative” was chosen specifically to distinguish this activity
from harmonization, since harmonization was generally considered to be
efforts to map one standard to another competing standard. Any joint project
must be agreed to by this group. A Joint Working Group (JWG) was also
created, managed by ISO, which includes members from the participating
SDOs. The purpose of the JWG is to work through the details of any joint
initiative project, identify potential new projects, monitor the work on joint
initiative projects, and support, in general, the work of the JIC. Each JIC
project would be hosted by one of the participating SDOs, with that SDO
providing a project chair or lead for the project. Each of the other SDOs
would provide a co-chair to ensure the resulting work met the needs of each
participating SDO.
The work on any project would be done jointly by volunteer members from
each participating SDO, working as a cohesive unit. The three SDOs would
be considered equal in the work process. The resulting work product would be
balloted simultaneously by each participating SDO, and the comments from
each SDO would be aggregated and processed by the joint project team. The
resulting standard would be the joint property of each participating SDO and
would carry a shared copyright and the logo of each SDO.
The first, now successfully concluded, JIC project was a standard for data types.
This standard, ISO 21090: Healthcare Data Types, was approved in 2009, after
many years of unsuccessful work to bring disparate groups together. Other
projects, listed next, have been accepted as JIC projects:
• Individual Case Safety Report
• Glossary Project
• 13606/HL7 version 3 Implementation Guide
• Identification of Medicinal Products
• Clinical Trial Registration and Result
• BRIDG Data Model
There are a number of other items being proposed as JIC projects. Once the
scope and definition of a project is defined, and the participants begin to
establish a level of trust in one another and can work together, the projects do
move ahead.
The JIC has developed a policy and procedures for defining projects, for
governance, and for bringing other groups into the JIC. Since the creation
of the JIC by the three SDOs, two other SDOs have joined the group: the
Clinical Data Interchange Standards Consortium (CDISC) became a member
in 2008, and the International Health Terminology Standards Development
Organization (IHTSDO) became a member in 2009. Other SDOs are in the
process of joining the JIC.
There are barriers that still must be overcome for the JIC process to work. The
balloting scenario differs among the three SDOs. The length of the balloting
period differs as does the publication process and form. Communication
among the participating SDOs and their membership is very challenging.
Understanding the process has been difficult for all those involved, and
managing the process has also been a challenge.
The success of the JIC offers a lot of promise, however. In the United States, an
organization similar, at least in purpose, to the JIC has been formed: the SDO
Charter Organization (SCO). This organization promises to consolidate efforts
within the United States to produce a single standard for a single purpose.
Those involved are strongly motivated to work together for a number of
reasons. They can share expertise and resources, such as tools and repositories;
and working together makes more efficient use of limited funding. Moreover,
most SDOs recognize that the global market is expanding, and that there
is an urgent need to accelerate the development of standards to avoid both
marketplace and regulatory ambiguity. The more confusion, the less likely
players will be to use standards in the marketplace.
The Role of Architecture in Developing Healthcare
Interoperability Standards
In this section, we look at the role of architecture and an architectural
framework in developing healthcare interoperability standards, and we present
some example environments that use them.
HL7 is a set of standards that supports interoperability of software applications
that are designed to support clinical and administrative processes in a
healthcare organization. While any healthcare organization can use HL7’s
interoperability standards, these standards are, for the most part, designed for
use by healthcare provider organizations.
Simply put, the architecture of an interoperability standard is a high-level view
of the components that make up the standard and the relationships between
those components. An architecture framework, as discussed here, is a reference
context that we use to view each HL7 standard product and its components.
These useful analysis tools are new to HL7.
Over its 22 years, HL7 has both created new and adopted some existing
standards. Because of the separate and distinct origins of these standards, a
different architecture could have been written for the internal components and
their inter-relationships for HL7’s major standards products: these products
include the HL7 version 2.x; HL7 version 3.0; HL7’s EHR standards; CCOW;
and Arden Syntax. To begin this remedial effort of creating a common housing
to hold the architecture definitions of all HL7’s major products, HL7 created
an architectural framework called HL7’s Services Aware Enterprise Architecture
Framework (SAEAF).
We shall now provide a high-level introduction to HL7’s SAEAF. We view
SAEAF as it applies to the HL7 version 3.0 standards, including the major
components that they share (for example, the HL7 RIM), and the somewhat
different purposes that they serve. Our customers, including those working
in the UK, Canada, and now the United States, all begin their efforts with
provider organizations that use the different architectures of the HL7 version
2.x and the HL7 version 3.x products, which, going forward, must co-exist and
interoperate to achieve country-wide interoperability goals.
HL7 and Architecture: a Brief Introduction
In 1987 the HL7 organization was founded to create interoperability standards
for the exchange of electronic information between IT systems within and
between healthcare providers. Not surprisingly, the
technology, workflows, and systems architectures created by HL7 in 1987
were the ones in use by the information technology (IT) industry in 1987.
The standard created at that time (HL7 version 2.1) achieved widespread use,
starting in 1991 and has evolved through a series of major and minor updates
to HL7 2.6 today. No formal architecture was ever planned for any element of
the HL7 version 2.
About five years later, HL7 volunteers expressed an interest in and began
working on a RIM for healthcare. There were no formal expectations at the
time for the scope of the effort, the amount of time it would take, or even
the expected difficulty that we would encounter. This experience was quite
eye-opening, because it forced HL7 developers to document the vastness and
complexity of healthcare. Nevertheless, no formal architecture was defined for
any HL7 standard that uses the RIM, although the RIM itself did assume a
formal internal architecture that addresses its governance, definitions, and the
tools related to testing conformance.
The HL7 RIM was never intended to stand on its own — although as a
reference model of healthcare information, it certainly does. Once the RIM was
recognized as possible, the organization also turned to the effort of developing a
new standard for data interchange that would have these features:
1. It must be model-driven; that is, all data items that can be exchanged or
used to generate messages, documents, or services have a place in, and are
taken from, the RIM, where cardinality and relationship(s) to other data are
clearly defined.
2. It must be formally connected to a methodology that itself could be
instantiated into software tools, giving us the following potential:
a. Using software tools to manage, validate, assemble, and publish the
version 3 Standard’s products, including the eXtensible Markup
Language (XML) schemas that would be needed to hold the metadata of
the information being exchanged.
b. Automating the generation of interchange specifications and creating
an environment where two or more individuals working from the same
set of requirements could generate identical specifications that could be
measured against a similar, software-generated set of conformities.
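A minimal sketch of the model-driven idea in item 1 and the tooling potential in item 2 (illustrative only: the tiny model, element names, and cardinalities below are invented for this example, not taken from the RIM):

```python
# Sketch: a hand-rolled "model" that declares, for each data item, what its
# cardinality is; the XML serialization is then generated from the model
# rather than written by hand. The Observation class and its two fields are
# invented for this illustration.
import xml.etree.ElementTree as ET

MODEL = {
    # element name -> (min occurrences, max occurrences)
    "code":  (1, 1),   # required, exactly once
    "value": (0, 1),   # optional
}

def generate(clazz, items):
    """Validate `items` against the declared cardinalities, then emit XML."""
    root = ET.Element(clazz)
    for name, (lo, hi) in MODEL.items():
        values = items.get(name, [])
        if not (lo <= len(values) <= hi):
            raise ValueError(f"{name}: expected {lo}..{hi} occurrences, got {len(values)}")
        for v in values:
            ET.SubElement(root, name).text = v
    return ET.tostring(root, encoding="unicode")
```

The point of item 2b is that two implementers working from the same model would generate identical serializations: `generate("Observation", {"code": ["8480-6"], "value": ["120"]})` yields `<Observation><code>8480-6</code><value>120</value></Observation>`.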
These features evolved into the goals of HL7 version 3, which was first
published in 2004. Our goals for HL7 version 3 have also evolved into
three distinct delivery mechanisms that can be used for a version 3-based
interchange. These delivery mechanisms are discussed next.
An interchange can be a message similar to an HL7 version 2 message;
version 3 messages today are produced in XML syntax. Messages support
connections between IT systems that are involved in supporting a workflow.
HL7 version 2 describes a workflow through a pre-defined and somewhat
vague descriptor called a “trigger event.” For example, within an institution, a
trigger event could be a physician’s lab order placed at the patient’s bedside to
the hospital pathology laboratory. This workflow may be clearly understood by
everyone involved in every possible action step required, from the composition
of the order to the reporting of the signed final test results to the assigned
individual. However, the same workflow within an ambulatory setting can be
far more difficult to describe, because the actors in the workflow do not have an
integrated understanding of each other’s role.
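As a rough sketch of the mechanics (the segment contents below are invented, and real version 2 messages count MSH-1 as the field separator character itself, so field numbering is offset by one relative to naive splitting):

```python
# Sketch: a minimal HL7 v2-style lab-order message, pipe-delimited segments
# separated by carriage returns. MSH-9 carries the message type and trigger
# event; ORM^O01 is a general order message. All values are illustrative only.
SEG_SEP = "\r"
FIELD_SEP = "|"

def build_lab_order():
    msh = FIELD_SEP.join(["MSH", "^~\\&", "WARD", "HOSP", "LAB", "HOSP",
                          "200904011200", "", "ORM^O01", "MSG0001", "P", "2.6"])
    pid = FIELD_SEP.join(["PID", "1", "", "12345", "", "DOE^JANE"])
    obr = FIELD_SEP.join(["OBR", "1", "ORD778", "", "CBC^COMPLETE BLOOD COUNT"])
    return SEG_SEP.join([msh, pid, obr])

def trigger_event(message):
    """Pull the trigger event (the part after '^') out of the MSH segment."""
    msh_fields = message.split(SEG_SEP)[0].split(FIELD_SEP)
    return msh_fields[8].split("^")[1]   # naive index; see caveat above
```

Here `trigger_event(build_lab_order())` returns `"O01"`, the pre-defined descriptor that tells the receiving system which workflow step the message belongs to.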
Most importantly, we lack a formal behavioral model (i.e., a dynamic or
workflow model); we compensate for this by analyzing the existing workflow
and then programming our interchange steps to accommodate it. Second, we can
only create different interchange patterns as we encounter them. In both
cases, a formal dynamic model could document the supported workflow; assign
a unique identifier to that workflow that could be used by all parties; and
serve as a base platform to be modified and re-used when slightly new or
different workflows are encountered.
Messages have an expected lifetime bound by the instant in time that they are
created and used. A message has no further use once the content of a message
is consumed by the target system and optionally used to change the state of
its database. Hypothetically, a message might be used as a possible means of
assisting in a database recovery action, something that, hopefully, is never
needed.
The next delivery mechanism is Clinical Documents, based on the HL7
Clinical Document Architecture (CDA) standard, which is a part of HL7
version 3, and which shares and uses the same underlying artifacts (for
example, HL7 RIM, R-MIMs, Data Types, etc.). Clinical Document
templates are implementation guides based on the CDA. Clinical Documents
contain structured clinical data that also conform to document requirements
of being signed (electronically), immutable, and so forth. Documents may
indirectly be the product of and/or support a workflow, but they stand on their
own and typically have an indefinite life.
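A heavily simplified sketch of the document idea, narrative for the clinician plus a machine-readable entry in one self-contained XML document (element names only loosely follow CDA; this is not schema-valid CDA R2, and the codes are illustrative):

```python
# Sketch: one document carrying both human-readable narrative and a coded
# entry. Real CDA documents add headers, identifiers, signatures, and
# template conformance on top of this skeleton.
import xml.etree.ElementTree as ET

def make_document(narrative, code, value, unit):
    doc = ET.Element("ClinicalDocument")
    section = ET.SubElement(doc, "section")
    ET.SubElement(section, "text").text = narrative   # read by people
    entry = ET.SubElement(section, "entry")           # read by machines
    obs = ET.SubElement(entry, "observation")
    ET.SubElement(obs, "code", code=code, codeSystem="LOINC")
    ET.SubElement(obs, "value", value=value, unit=unit)
    return ET.tostring(doc, encoding="unicode")
```

Once signed, such a document is immutable and stands on its own, unlike a message, which is consumed and discarded.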
The third delivery mechanism is a service that is based on the SAEAF. This
mechanism (as messages and documents) will be defined by the relevant HL7
Work Group responsible for the interchange’s content (for example, Structured
Documents for CDA). In HL7, services are currently based on the principles of
a Service-Oriented Architecture (SOA) that is, itself, based on the antecedent
concepts developed in the Reference Model of Open Distributed Processing
(RM-ODP).
HL7’s Services-Aware Enterprise Architecture Framework
SAEAF was created by the HL7 Architecture Board (ArB) as a response to
a request from the CTO for an HL7 architecture that is driven by HL7’s
commitment to the following three principles:
1. HL7 produces specifications to enable Computable Semantic
Interoperability (CSI) between users of systems implementing those
specifications.
2. Instances of CSI between two or more HL7-based systems may cross
department, enterprise, and/or national boundaries.
3. An HL7 Enterprise Architecture Specification (EAS) is required, if HL7 is
to produce durable specifications that enable CSI in an effective, efficient,
and scalable manner.
The ArB recognized early on that three critical components were missing from
HL7 and, therefore, they needed to develop SAEAF. These are the missing
components:
1. A Behavioral Framework to express interaction semantics.
2. A layered Enterprise Conformance/Compliance Framework (ECCF) to
support service integration and run-time assessment of CSI.
3. A Governance Framework to oversee the development and implementation
of service (and other HL7 Interoperability Paradigm) specifications.
Computable Semantic Interoperability (CSI)
In order to understand the aforementioned, it is necessary to understand a
little of what HL7 sees as the requirements for achieving CSI. These necessary,
though not exhaustive, requirements include the following:
1. Common static models (for example, the HL7 RIM) across all domains of
interest including:
a. An information model versus a data model.
b. The semantics of common structures.
c. Models based on robust data-type specifications.
2. A mutually understood behavioral (or dynamic) model that enables
sufficient (as defined by the problem space) understanding of the “use
context” of the creation of the data by the producer and its intended use by
the consumer.
3. Methodology for binding to concept-based ontologies that support these
constraints:
a. Domain-specific semantics.
b. Country, regional, or use-domain selection of appropriate ontologies.
c. Rigorous versioning release cycle management to ensure that individual
terminologies are consistently interpreted by both the producer and
consumer of the data.
4. A formally-defined process for specifying structures that contain both
data and defined actions to be exchanged between machines, i.e., a data
exchange (such as a message or a service) or an electronic document or
services specification.
Looking at the first three requirements of CSI just outlined, it is useful
to understand them as the dimensions of variability for any data element
exchanged between two HIT systems. As Figure 2 illustrates, three properties
must be defined to exchange data elements: the data, terminology, and process
or behavior. Taken together, these three dimensions provide a context for
the information to be shared. We need to know data type (e.g., integer, text,
image), terminology (i.e., the specific ontology being used), and process or
behavior (e.g., a diagnostic process).
These three requirements, in addition to the fourth requirement of actually
moving the data, are necessary for CSI. Figure 2 depicts this.
Figure 2: Information Context for Data Elements. A data element is located by
the triple (D, P, T): Data, Process/Behavior, and Terminology.
Source: Health Level 7
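The three dimensions can be sketched as a small record type (the field names and sample values here are illustrative, not taken from any HL7 specification):

```python
# Sketch: the (D, P, T) dimensions of variability for one exchanged data
# element. Terminology carries a version because, per requirement 3c,
# producer and consumer must interpret codes against the same release.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataElement:
    value: str                # D: the datum itself
    data_type: str            # D: e.g. "integer", "text", "image"
    code_system: str          # T: the ontology in use
    code_system_version: str  # T: release the code was drawn from
    process: str              # P: the use context, e.g. "diagnostic"

# An illustrative instance; the code value is a placeholder.
elem = DataElement("271649006", "coded", "SNOMED CT", "2009-07", "diagnostic")
```

Only when all three dimensions (plus the exchange structure of requirement 4) are pinned down can the consumer interpret the element the way the producer intended.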
Summary and Next Steps
HL7 has begun the effort of defining an architectural framework called SAEAF
that is both accommodating of services-based interoperability in an SOA and
capable of providing a framework for holding, comparing, and analyzing the
similarities and differences among HL7’s products. Going forward, SAEAF is
also the basis for an EAS that will be used to apply HL7 to users’ SOA
environments.
In our next section, we move on to describe the HL7 CDA, and we examine its
role in healthcare in the United States today.
The Templated CDA Strategy: Scalable and
Incremental Interoperability
Many people know of HL7 as an organization that creates healthcare messaging
standards. HL7 is also developing standards for the representation of clinical
documents (such as discharge summaries and progress notes). These document
standards make up the HL7 Clinical Document Architecture (CDA). The HL7
CDA, Release 1, became an ANSI-approved HL7 standard in November 2000.
CDA Release 2 became an ANSI-approved HL7 standard in May 2005, and it
is now in widespread use across the globe.
What follows is an introduction and overview of the CDA specification,
along with a description of how and why CDA has emerged as a cornerstone
component of the United States healthcare interoperability strategy.
The need for a clinical document standard stems from the desire to unlock
the considerable clinical content currently stored in free-text clinical notes,
and to enable comparison of that content in documents created on widely
different information systems. Approximately 1.2 billion clinical documents are
produced in the United States each year. Dictated and transcribed documents
make up around 60 percent of all clinical notes. These documents contain the
majority of physician-attested information and are used as the primary source
of information for reimbursement and proof of service.
The challenge, addressed by CDA and the templated CDA strategy, is to
continue to meet the needs of front-line clinicians today, who are heavily
dependent on largely narrative documents, while at the same time providing
a migration pathway toward greater and greater use of discrete data. In other words, the
goal of CDA is not simply to provide yet another format for the exchange of
clinical documents. If that were all we wanted, we could use PDF, MS Word*,
or any other file format. Instead, our goal is to get at the clinical content
within those documents and make it computable, accessible to a computer, so
it can be used for such things as decision support and quality reporting. We
want to do this in a way that fits into real-world clinical workflows so that we
can tackle the problem incrementally, without the need for massive process
redesign.
From a technical perspective, the HL7 CDA is a document markup standard
that specifies the structure and semantics of a clinical document. A CDA
document is a defined and complete information object that can exist outside
of a message, and it can include text, images, sounds, and other multimedia
content. Just as you can create a document in MS Word, in PDF, etc., you can
create a clinical document in CDA format. CDA documents are encoded in
XML and derive their machine-processable meaning from the HL7 RIM.
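As a loose illustration (the element names below are simplified placeholders, not the normative CDA R2 schema), a clinical document pairs an identifying header with a body, and because it is plain XML, any XML parser can read it:

```python
# A simplified, hypothetical sketch of a CDA-like document: a header that
# identifies and classifies the document, plus a narrative body. Element
# names are illustrative placeholders, not the normative CDA R2 schema.
import xml.etree.ElementTree as ET

doc = """\
<ClinicalDocument>
  <title>Discharge Summary</title>
  <effectiveTime value="20090915"/>
  <patient>
    <name>Jane Doe</name>
  </patient>
  <body>
    <section>
      <title>Hospital Course</title>
      <text>The patient was admitted for chest pain and ...</text>
    </section>
  </body>
</ClinicalDocument>
"""

root = ET.fromstring(doc)
# The header fields identify and classify the document; the body
# carries the clinician's narrative.
print(root.findtext("title"))               # Discharge Summary
print(root.findtext("patient/name"))        # Jane Doe
print(root.findtext("body/section/title"))  # Hospital Course
```

Because the document is self-contained XML, it can be stored, exchanged, and parsed outside of any message, which is the property the text describes.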
CDA is based on a principle of incremental interoperability, whereby an
implementer can begin with a simple CDA, and then add structured data
elements over time. CDA R2 consists of a single CDA XML schema, and
the architecture arises from the ability to apply one or more templates that
serve to constrain the richness and flexibility of CDA. Professional society
recommendations, national clinical practice guidelines, and standardized data
sets can be expressed as CDA templates.
From a practical perspective, incremental interoperability means that one can
easily create a minimally-conformant CDA document — not really much
different than creating an HTML document. CDA documents have a header
that identifies and classifies the document and provides information on
authentication, the encounter, the patient, etc. There are a handful of required
fields and a number of optional fields. The body of the CDA can be purely
narrative, by using markup very similar to XHTML. All CDA documents must
use the same narrative markup so that you can receive a CDA document from
anyone in the world, and, following a defined algorithm, render the document
such that the receiving clinician correctly views content that was attested to by
the originating clinician.
In addition to its narrative markup, CDA provides XML markup for formally
representing the clinical statements within the narrative. A complete encoding
of all clinical utterances can be difficult, if not impossible, and there is no model
of healthcare that can fully and formally represent everything a clinician
might say. While the HL7 RIM is a richly expressive model that can represent
much of clinical narrative, the templated CDA strategy shields developers from
the need to learn all the RIM nuances. Developers only need to understand
those templates that have been recommended by the Healthcare Information
Technology Standards Panel (HITSP) or put on the Certification Commission
for Healthcare IT (CCHIT) roadmap. Developers simply map their internal
data stores to the prioritized templates.
Assume that next year, HITSP defines the way in which we should
communicate medical conditions and allergies. Assume that CCHIT takes
these HITSP patterns and creates corresponding certification requirements. For
those using CDA, there is no need to change to a new XML schema, and there
is no need to change the approach to communicating narrative notes. CDA
templates for medical conditions and allergies are created. Application developers map
their internal data stores to these templates and include the corresponding
structured markup for medical conditions and allergies in the CDA, while
making no change to the narrative. On the recipient side, there will continue
to be those who only render the document, whereas there may also be those
who can parse out the formally encoded medical conditions and allergies for
use in decision support, disease management, personalized medicine, and
many other critical healthcare delivery requirements. Figure 3 shows the CDA
constrained by the data elements of the Continuity of Care Record (CCR, in
red) named the Continuity of Care Document (CCD). In this example, the
CDA template includes additional data elements not found in the CCR (such
as Chief Complaint and Discharge Diagnosis).
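The recipient-side behavior described above can be sketched as follows; the section markup, template identifier, and codes are hypothetical simplifications, not actual HITSP or CDA templates:

```python
# Hypothetical sketch of the recipient side of incremental interoperability:
# every receiver can render the attested narrative; a more capable receiver
# can also parse coded entries added under a template. The markup, template
# identifier, and code values are illustrative, not real CDA templates.
import xml.etree.ElementTree as ET

doc = """\
<section>
  <title>Allergies</title>
  <text>Allergic to penicillin (hives).</text>
  <entry templateId="example-allergy-template">
    <code system="RxNorm" code="0000" displayName="Penicillin"/>
  </entry>
</section>
"""

section = ET.fromstring(doc)

# Baseline behavior: render the attested narrative unchanged.
narrative = section.findtext("text")

# Optional behavior: extract structured entries when the template is present.
coded_allergies = [
    e.find("code").get("displayName")
    for e in section.findall("entry")
    if e.get("templateId") == "example-allergy-template"
]

print(narrative)         # Allergic to penicillin (hives).
print(coded_allergies)   # ['Penicillin']
```

A receiver that knows nothing about the template still renders the narrative correctly; a receiver that does know it can feed the coded entries into decision support or disease management without any change to the document format.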
Figure 3: CDA Constrained by Additional Data Elements. (The figure shows a CDA document whose CCD-derived sections, such as Allergies, Problems, Meds, Personal Information, and Insurance, carry narrative and tabular content, alongside additional CDA template elements such as Mode of Transport, Chief Complaint, and Discharge Diagnosis.)
Source: Health Level 7
Next year, HITSP and CCHIT will prioritize new templates; the following year
more templates will be produced, and so it goes — scalability arises from the
fact that one XML schema does it all; incrementalism arises from the fact that
only well-described changes need to be introduced over time, on a defined
roadmap.
CDA is attractive for these reasons:
•• Scope. There is a single XML schema for all CDA documents.
•• Implementation experience. CDA has been a normative standard since 2000,
and it has been balloted through HL7’s consensus process. CDA is also
widely implemented.
•• Gentle on-ramp to information exchange. CDA is straightforward to
implement, and it provides a mechanism for incremental semantic
interoperability. One can begin with a simple CDA narrative document
implementation and add discrete data elements over time, based on
national priorities.
•• Improved patient care. CDA provides a mechanism for inserting evidence-based medicine directly into the process of care (via templates).
•• Lower costs. CDA’s top-down strategy lets you implement CDA once, and
reuse it many times for new scenarios.
These attributes of CDA have led to its adoption as a core healthcare
information technology component in many countries, including the United
States, in particular by HITSP. Interoperability on a use-case by use-case basis
can lead to a disjointed set of standards that do not support re-use or internal
consistency. What is needed is an overarching approach to interoperability, one
that is both widely encompassing and scalable, as new use cases are developed.
A strategy being exploited by HITSP, as well as the Integrating the Healthcare
Enterprise (IHE) and other international HIT initiatives, is to base a growing
number of Interoperability Specifications (ISs) on the CDA, or, more
precisely, on templated CDA. From its inception, CDA has supported the
ability to represent professional society recommendations, national clinical
practice guidelines, and standardized data sets as “templates” or constraints
on the generic CDA XML. Perhaps the best known example of a templated
CDA specification is the ASTM/HL7 Continuity of Care Document (CCD)
specification (see Figure 4), where the standardized data set defined by the
ASTM Continuity of Care Record (CCR) is used to further constrain CDA,
specifically for summary documents.
Subsequent to its adoption of CCD as the basis for HITSP/C32 “Summary
Documents using HL7 CCD,” HITSP recognized that a top-down strategy,
whereby one learns CDA once and then re-uses it in other ISs, leads to greater
cross use-case consistency. HITSP has since created a library of CDA templates
(HITSP/C83 “CDA Modules Component”) that are used within a growing
number of CDA-based specifications (for example, HITSP/C28 “Emergency
Care Summary,” HITSP/C32 “Summary Documents Using HL7 CCD,”
HITSP/C105 “Patient Level Quality Data Document Using HL7 Quality
Reporting Document Architecture (QRDA),” HITSP/C48 “Encounter
Document constructs,” HITSP/C84 “Consult and History & Physical Note
Document,” and HITSP/C78 “Immunization Document”).
Additional templated CDA specifications being developed within IHE and
HL7 include the QRDA, the Public Health Case Report Document, Operative
Reports, Personal Health Monitoring Reports, and Minimum Data Set,
version 3. What is true across the spectrum of specifications is that all the
specifications conform to the underlying CDA XML, and templates are re-used
to the extent possible. CDA, coupled with a library of re-usable templates,
forms the basis for a growing number of HITSP ISs, and CDA represents a
national interoperability strategy that is both widely encompassing and scalable
as new use-cases are developed.
Considerable discussion is taking place now in the United States around
the notion of meaningful use, and about ensuring that our interoperability
strategy supports meaningful use. At the heart of meaningful use is simply a
requirement for data re-use, as in re-using clinical trial data in the construction
of decision support rules, re-using clinician-captured data for quality reporting,
public health reporting, etc. Imagine, for instance, that there were separate
models and XML schemas for immunization data, medication administration,
pharmacy dispensing, lab, and clinical summaries, and that the onus was on
the implementer to reconcile the differences in the data in order to support
data re-use. CDA and templated CDA address the concept of meaningful re-use by maximizing data re-use.
Figure 4: Templated CDA Specification. (The figure shows a CCD-based document: CDA sections such as Allergies, with narrative and tabular content, constrained by the CCD data set, plus additional elements such as Mode of Transport, Chief Complaint, and Discharge Diagnosis.)
Source: Health Level 7
The value of the RIM, and the rationale for its use as the underlying formalism
for encoding clinical statements in a CDA document include the following:
•• Consensus. A consensus process is used in the development of the RIM that
encompasses many years, many vendors, many countries, and many use-cases.
•• Expressivity. This allows for the formal representation of many types of
clinical statements.
•• Data re-use. All version 3 specifications are derived from a common model.
•• Concrete. While not perfect, RIM is here today for us to use.
•• Governance and maintenance. There is a defined consensus process for
revisions.
CDA Release 1 became an ANSI-approved HL7 standard in November 2000.
CDA Release 2 became an ANSI-approved HL7 standard in May 2005.
Balloting on CDA Release 3 is slated to begin towards the end of 2009. Given
that HL7 has three ballot cycles per year, and given the widespread adoption
of CDA Release 2, it is anticipated that several ballot cycles will be required
to work through the list of new requirements, and that CDA Release 3 will
become an ANSI-approved HL7 standard sometime in 2011. CDA Release 3
requirements are cataloged as formal proposals and as suggested enhancements.
The main feature enhancement expected for CDA Release 3 is the inclusion
of much more of the HL7 RIM, thereby enabling a wider range of clinician
narrative to be expressed.
Standards and Standards Development
Organizations: A New Collaboration
Standards development is a political process. As described previously, the
stakeholders are diverse but not every interest is equally represented. In many
countries, the dominant voices in this process are the government agencies that
determine the business requirements and fund the development initiatives.
The other end of the spectrum is represented by the vendor community
that has always made a substantive contribution to the definition of the
standards’ artifacts. In the past, the caregiver community has been significantly
disenfranchised.
That is not to say that credentialed professionals have not been leaders of
standards development organizations and government agencies. Even when
clinicians lead these organizations, however, decision-making processes are
often blunted by layers of administrative bureaucracy and regulatory overhead.
Within the healthcare IT community, the chief medical information officer has
often provided critical guidance and commanded substantive influence.
There is a growing body of critical data and published studies that describe the
successful deployment of large-scale IT solutions within multi-disciplinary
healthcare systems. In fact, the application of eHealth processes has been
shown to be transformative in various large healthcare provider organizations.
Unfortunately, recent exposés in the public media have recounted instances of
dire unintended consequences of new or updated healthcare IT solutions. More
often than not, the problem with these installations has not been technical. The
apparent proximal cause has been the significant failure by system designers
and software developers to understand the workflow and daily business
requirements of the clinical end-users.
At the technical end of the implementation spectrum lies the need to ensure
seamless, unambiguous interchange of data. The inability to achieve this is
rarely the failure of the technical standards (specifications) themselves. Often
there is a lack of formal agreement on vocabulary, the definition of individual
data elements, and the application of the required terminology within the
patient care continuum. This is exacerbated when requests are made or
requirements defined across clinical boundaries. In simple terms, a physical
therapist may not apply the same meaning to a term as the orthopedic surgeon
who first used it. In the in-patient setting, this is repeated daily in a failure to
achieve unambiguous communication between physicians, nurses, pharmacists,
and laboratory staff.
Attempts to rectify these ambiguities have often failed to overcome parochial
and economic hurdles. In a practical sense, there is an ongoing tension among
primary caregivers and the specialist community. The gatekeeper model
of the managed healthcare system failed to achieve either fiscal or clinical
outcome metrics because of the dissension that was exacerbated when practice
variability was confronted with ambiguous, often counter-productive payment
schema. In those managed care systems with a closed practice model and a
single information system (e.g., Kaiser Permanente), the results have been
uniformly more successful.
Outside of the United States, stringent government regulation has often led
to improved outcomes following implementation of enhanced healthcare
information systems. Perhaps this is more related to a single payer system and a
more homogeneous patient population than to the technology itself. At times,
regulatory oversight has been an enormous obstacle to data interchange. For
example, the global regulated research community (chiefly pharmaceutical
and biotech industry) has embraced a structured vocabulary called the
Medical Dictionary for Regulatory Activities (MedDRA). This terminology
is principally required for data encoding in the reporting of adverse events in
clinical trials and in post-approval pharmacovigilance. The system is largely
incompatible with SNOMED CT (Systematized Nomenclature of Medicine
— Clinical Terms), which is in widespread use for patient care worldwide.
Both clinical research and patient care suffer because of the artificial barriers to
effective information exchange.
These problems transcend the functional requirements for interoperability.
For our purposes, it is best to rely on the definition established by IEEE.
Interoperability is the ability of two or more systems or components to
exchange information. Semantic interoperability is the ability to use the
information that has been exchanged. Faxing an EKG tracing between two
professional offices provides a high degree of interoperability, if both clinicians
agree on the parameters to use for interpretation. Like most data that are not
encoded, the EKG cannot be re-used in any meaningful way. Parenthetically,
HL7 has developed a standard for encoding an annotated EKG.
Reducing the ambiguity of the data is increasingly important as healthcare
managers and researchers attempt to measure quality. For our purposes, the
enhanced clinical quality and patient-care outcomes are realized when practice
guidelines are adhered to by individual caregivers and system-wide care
requirements. Usually these guidelines are established by central authorities
(such as the US National Institutes of Health) and by professional societies
(for example, the American College of Cardiology). Globally, quality estimates
(outcomes) are provided by the World Health Organization.
In the United States, these guidelines have historically been written by the
Agency for Healthcare Research and Quality (AHRQ), an agency of the
Department of Health and Human Services. More recently, non-profit
organizations, such as the National Quality Forum, have been created to take a
more proactive role in writing these guidelines. These groups establish priorities
for quality evaluation as well as establishing the parameters for measuring
success. These practice guidelines are predicated on more data being derived
from clinical research and have been referred to as evidence-based medicine.
These guidelines are critical to improving patient outcomes and reducing costs.
Unfortunately, the clinical guideline for one disease may be contraindicated
when complying with the guideline for a co-morbid disease. For example,
the use of a relatively common anti-inflammatory drug for arthritis, such
as ibuprofen, may be contraindicated in the management of hypertension.
These two ailments are often concurrent in the Medicare-aged population, but
practice management algorithms would flag such concurrent treatment as a violation of quality patient
care. The development of treatment algorithms that weigh the relative severity
(importance) of two co-morbid diseases is the foundation for ongoing research.
Other instances of conflicts of standard practice guidelines are less easily
resolved. These conflicts may represent the differences in the interpretation
of medical evidence between two medical societies or the same specialty in
different regions or countries. It is easy to imagine that the recommendations
offered by the American Academy of Orthopedic Surgeons would not coincide
with those of the American Chiropractic Association. HL7 is working closely
with the National Quality Forum and the Agency for Healthcare Research and
Quality to standardize these metrics.
Delivering evidence-based medicine at the point of care (inpatient, ambulatory,
emergent, or chronic home care) is enabled by a technology that is broadly
classified as decision support. Decision support systems are highly complex,
predicated on the integration of a vast amount of patient care and research
data, and reliant upon the harmonization of technical standards. Within
HL7, the Decision Support Working Group has advanced the technical
functionality of these systems. Practical implementation is more complex.
Individual system vendors have implemented decision support by using a vast
array of technologies, alerts, graphical interface representations, and workflow
modification. At the simplest level, this technology may be implemented
in the form of a clinical alert, for example, when two drugs with a potential for
adverse interaction are prescribed concurrently. Another example of decision
support might be a recommendation for an annual Pap smear at the time
of an outpatient visit. The most sophisticated systems guide complex disease
management and require data from lab testing and medication administration,
as in guidance for insulin dosing. HL7 is working closely with several decision
support initiatives, such as the Clinical Decision Support Consortium.
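The simplest form of decision support described above, a clinical alert for a drug pairing with a potential adverse interaction, can be sketched as follows; the interaction table and drug names are purely illustrative, not a clinical knowledge base:

```python
# A minimal, hypothetical sketch of a clinical alert: warn when two drugs
# with a known potential for adverse interaction are prescribed together.
# The interaction table below is illustrative, not clinical guidance.
INTERACTIONS = {
    frozenset({"warfarin", "ibuprofen"}): "increased bleeding risk",
}

def interaction_alerts(active_meds, new_rx):
    """Return alert strings for known interactions between a new
    prescription and the patient's active medication list."""
    alerts = []
    for med in active_meds:
        # frozenset makes the lookup order-independent: (A, B) == (B, A).
        reason = INTERACTIONS.get(frozenset({med, new_rx}))
        if reason:
            alerts.append(f"ALERT: {new_rx} + {med}: {reason}")
    return alerts

print(interaction_alerts(["warfarin", "lisinopril"], "ibuprofen"))
```

Real decision support systems layer far more on top of this (severity grading, patient-specific lab data, workflow integration), which is why the text stresses the harmonization of standards that such systems depend on.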
The most innovative program does not directly involve the development of
standards. It is an initiative in social engineering. Nearly four years ago, HL7
undertook a program for including specific clinical input into the standards
development process. This Work Group recognized the contributions of a
wide range of members of the caregiver community, including physicians,
nurses, pharmacists, and even medical librarians. While important progress
was made towards the identification of critical development pathways, the
Work Group sorely lacked a focus on patient care, public health, and program
management. In 2008, HL7 recognized the immediate need to align with the
caregiver communities. With funding from AHRQ, an experiment was
undertaken under the name Bridging the Chasm.
At first blush, the chasm to be bridged was between the healthcare community
and the organization of healthcare IT professionals. It was most evident to the
clinicians that everyone was speaking a different language, filled with techno-speak, jargon, and acronyms, with which they were embarrassingly unfamiliar.
Moreover, concepts, such as knowledge representation, so commonplace in
IT, were lost on the clinicians. To re-purpose an old adage, “the clinicians
were mad and they weren’t going to take it anymore.” Of course, there was
another, much older, and often more acrimonious divide to bridge: the
divisions erected over hundreds of years between specialists and primary
care physicians, between physicians and nurses (you can substitute almost any
other caregiver function), and between many of the ancillary but critical roles
of pharmacist, dietician, physical therapist, pathologist, and many others.
In April 2009, a conference was convened in Washington to begin the bridge
building. Funded by AHRQ and led by HL7, over 100 professional societies
met with one objective in mind: begin to define the terminology, workflow
processes, business requirements, and the like that everyone had in common.
Of course, standards development was to be a collateral outcome, but only
after the difficult task of bridge building was underway. This was not about
fence mending, since it was hoped that the fences would be torn down
eventually. For two days there was no jargon. Not an acronym was uttered. The
results were unprecedented, because this heterogeneous body moved forward
with unanimity of purpose.
With the creation of a new professional society, the Clinical Information
Interchange Collaborative, development of common terminology seems
possible. Integration of workflow into eHealth systems seems achievable.
Articulating a business case for the harmonization of practice parameters and
clinical guidelines has begun. In the office setting, in the home, and in the
hospital, the patients will be the beneficiaries.
Standards development for healthcare will always be about politics. The
problems to be solved have, and will always have, highly technical solutions.
Interoperability is not a goal, but rather the means to improving quality and
reducing costs. Innovative solutions to the complicated issues of unified
vocabulary and seamless data integration can be realized in an environment of
collaboration.
Further Reading
Blobel, Bernd. “Making hospital IT interoperable.” HITE 2008:1:2: Summer
2008:12-15. Available at www.hospitaliteurope.com
Clinical Data Interchange Standards Consortium (CDISC)
This website contains standards and policy documents cited in this article.
http://cdisc.org/
Hammond, William Edward; Jaffe, Charles; Kush, Rebecca Daniels,
“Healthcare Standards Development: The Value of Nurturing Collaboration”
Journal of AHIMA, July 2009 80/7, pages 44-40.
Health Level Seven, Inc.
www.hl7.org
On the website can be found more detailed supporting material cited in this
article:
CDA Tutorial, 2008.
HL7 version 2.5 and HL7 2.6 Standards.
Dolin RH, Alschuler L, Boyer S, Beebe C, Behlen FM, Biron PV, Shabo A.
“HL7 Clinical Document Architecture, Release 2.” J Am Med Inform Assoc.
2006;13:30–39. Available at www.jamia.org.
HITSP/C32: “Summary Documents Using HL7 Continuity of Care
Document (CCD); Component.” July 8, 2009. Version 2.5.
HITSP/C37: Lab Report Document Component. July 8, 2009. Version 2.3.
HITSP/C75: Healthcare Associated Infection (HAI) Report Component.
July 8, 2009. Version 1.1.
HITSP/C83: CDA Content Modules Component. July 8, 2009. Version 1.1.
HITSP/C105: Patient Level Quality Data Document Using HL7 Quality
Reporting Document Architecture (QRDA). June 30, 2009. Version 0.0.1.
European Committee for Standardization (CEN) Technical Committee 251
(Healthcare)
This site contains standards and policy information cited in this article.
http://www.cen.eu/cenorm/sectors/sectors/isss/index.asp
ISO Technical Committee 215 (Healthcare Informatics)
This workspace contains documents and standards cited in this article:
http://isotc.iso.org/livelink/livelink?func=ll&objId=529137&objAction=browse
Acronyms
Agency for Healthcare Research & Quality (AHRQ)
American National Standards Institute (ANSI)
Association Française de Normalisation (AFNOR)
British Standards Institution (BSI)
Certification Commission for Healthcare IT (CCHIT)
Clinical Data Interchange Standards Consortium (CDISC)
Clinical Document Architecture (CDA)
Comité Européen de Normalisation (CEN)
Computable Semantic Interoperability (CSI)
Continuity of Care Document (CCD)
Continuity of Care Record (CCR)
Enterprise Architecture Specification (EAS)
Enterprise Conformance/Compliance Framework (ECCF)
Extensible Markup Language (XML)
German Institute for Standardization (DIN)
Health Level Seven (HL7)
Healthcare Information Technology Standards Panel (HITSP)
Integrating the Healthcare Enterprise (IHE)
International Health Terminology Standards Development Organization
(IHTSDO)
International Organization for Standardization (ISO)
Interoperability Specifications (ISs)
Joint Initiative Council (JIC)
Quality Reporting Document Architecture (QRDA)
Reference Information Model (RIM)
Reference Model of Open Distributed Processing (RM-ODP)
Services Aware Enterprise Architecture Framework (SAEAF)
Standards Developing Organization (SDO)
SDO Charter Organization (SCO)
Service-Oriented Architecture (SOA)
Author Biographies
Charles Jaffe, MD, PhD, FACP. Since 2007, Dr. Jaffe has served as the Chief
Executive Officer of HL7. He began his career at Intel in 2005 as the Senior
Global Strategist for the Digital Health Group. He has also served as Vice
President of Life Sciences at SAIC and as the Global Director of Medical
Informatics at AstraZeneca Pharmaceuticals. He completed his medical
training at Johns Hopkins and Duke Universities, and he was a post-doctoral
fellow at the National Institutes of Health and at Georgetown University.
Formerly, he was President of InforMed, a consulting firm for research
informatics. Over his career, he has been the principal investigator for more
than 200 clinical trials, and he has served in various leadership roles in the
American Medical Informatics Association. He has been a board member on
leading organizations for information technology standards, and he served
as the Chair of a national institutional review board. Currently, he holds an
appointment in the Department of Medicine at the University of California at
San Diego. He has been a contributing editor for several journals, and he has
published on a range of subjects, including clinical management, informatics
deployment, and healthcare policy.
W. Ed Hammond, PhD, is Professor-emeritus, Department of Community
and Family Medicine and Professor-emeritus, Department of Biomedical
Engineering, Duke University and Adjunct Professor in the Fuqua School
of Business at Duke University. He has served as President of the American
Medical Informatics Association (AMIA), President of the American College
of Medical Informatics, and as Chair of the Computer-based Patient Record
Institute. He is currently serving his third term as the Chair of HL7. He was
Chair of the Data Standards Work Group of the Connecting for Health Public-Private Consortium. Dr. Hammond was a member of the IOM Committee
on Patient Safety Data Standards. He was awarded the Paul Ellwood Lifetime
Achievement Award in 2003 and the ACMI Morris F. Collen Award of
Excellence in November 2003.
John Quinn is the Chief Technology Officer of HL7 and has served in
this full-time position since September 2007. HL7 is one of several ANSI-accredited SDOs operating in the healthcare arena.
Quinn is also a Senior Executive, Chief Technology Officer, and Thought
Leader in Accenture's Health and Life Sciences Provider Practice, focusing on
healthcare information system technologies, architectures, and data and
messaging standards; he serves in his role as HL7's CTO in part through
a significant contribution by Accenture. He has over 32 years of experience in
healthcare IT and 35 years of experience in the computer industry. He
has participated on the HL7 Board of Directors since its inception 21 years ago
and served as the Chair of its Technical Steering Committee from 1989 to 2007.
Robert H. Dolin, MD, MS, serves as the Chair-elect on the HL7 Board of
Directors and has been involved with the organization since 1996. Dr. Dolin
is an internationally renowned and innovative physician expert in healthcare
information technology standards development with more than 20 years
of clinical experience. After receiving his medical degree, he served as chief
resident at UCLA Department of Medicine in 1989, where he developed
Harbor-UCLA's first outpatient electronic medical record system. Dr. Dolin
is a Fellow in the American College of Physicians, and he was elected into
Fellowship in the American College of Medical Informatics in recognition of
his work on standards development. Dr. Dolin currently owns Semantically
Yours, a healthcare standards consulting firm. Previously, he spent 18 years
as a Hospitalist at Kaiser Permanente, Department of Internal Medicine. At
Kaiser, he was the physician lead for the enterprise terminology services team,
responsible for deploying SNOMED in their national electronic health record.
Dr. Dolin also co-chairs the HL7 Structured Documents Work Group, and
is co-editor of the HL7 CDA and the CCD specifications. He has also served
on the SNOMED CT Content Committee, and currently co-chairs the US
HITSP Foundations Committee. Dr. Dolin’s work has also been published in
both technical and clinical peer reviewed journals.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Healthcare Information Integration: Considerations for
Remote Patient Monitoring
Contributors
Kristina M. Kermanshahche
Intel Corporation
Index Words
Remote Patient Monitoring (RPM)
Semantic Interoperability
Service Oriented Architecture (SOA)
Healthcare Informatics
Chronic Disease Management (CDM)
HL7 Clinical Document Architecture (CDA)
Abstract
Modern healthcare is no longer practicable without data integration. Without
standards, integration costs soar and threaten the effectiveness of healthcare
delivery. With standards, information becomes accessible in computable
form, driving new levels of clinical research, patient-empowered healthcare,
and innovative business models. In this article we examine a few of the latest
developments in healthcare informatics standards with respect to remote
patient monitoring, review recent learnings in deploying a remote patient
monitoring solution, and identify key considerations and areas for further
development.
Introduction
Simple interventions can save lives and reduce the cost of healthcare. Home
care for patients with co-morbidities can mean the difference between life and
death. For example, ongoing monitoring of vital signs, patient condition,
and medication levels can detect sudden changes outside of the patient's
established thresholds, which can mean the difference between extended
hospitalization and rapid decline in health, versus maintaining quality of life
in the patient's home, surrounded by family and friends.
Similarly, maintaining tight control over blood glucose levels during gestational
diabetes can mean the difference between premature birth with a whole host
of medical complications for both mother and child, versus a stable healthy
maternity and normal delivery.
RPM can be significant to both examples just cited, and the effectiveness
of RPM depends directly on the availability of standardized information
from a variety of healthcare data sources, including patient health summary,
prescriptions, lab results, daily vital signs collection, and functional
assessments.
We first explore the standardized representation of RPM information by using
the Health Level 7 (HL7) v3 CDA Release 2 [4] and the Personal Health
Monitoring Report (PHMR) Draft Standard for Trial Use (DSTU) [10]. Then,
we consider recent developments with the American Health Information
Community (AHIC) Use Cases and Healthcare Information Technology
Standards Panel (HITSP) Implementation Guides for Remote Patient
Monitoring [6, 7], Consultation and Transfer of Care [5], Long-Term Care
Assessments [13], and the resulting HL7 Implementation Guide for CDA 2:
CDA Framework for Questionnaire Assessments DSTU [8]. We review several
SOA principles and considerations critical to healthcare integration. Next,
we discuss several technical and workflow considerations when designing and
deploying a RPM solution. Along the way we identify multiple areas suitable
for further development.
Founded in 1987, Health Level Seven (HL7)
is a not-for-profit, ANSI-accredited standards
development organization dedicated to
providing a comprehensive framework and
related standards for the exchange, integration,
sharing, and retrieval of electronic health
information that supports clinical practice and
the management, delivery, and evaluation of
health services.
Remote Patient Monitoring Use Case
In this section we discuss a typical RPM use case, in order to set the context for
applying healthcare informatics standards and to explain how subtle nuances
in clinical workflow and system interactions can have a profound impact, both
on the design of the end-to-end information integration as well as on future
enhancements to informatics standards.
The sequence diagram depicted in Figure 1 was adapted from the HITSP
RMON Business Use Case [6], representing a superset of actors, steps, and
functionality across a number of finer-grained but related use cases. We
currently work with clinicians in a variety of regions worldwide but primarily
based in the United States and the United Kingdom. Clinical delivery and
reimbursement models are very different between these two regions. Some
healthcare delivery models rely upon visiting nurses, community outreach,
and clinical call centers to perform primary patient engagement, monitor
patients remotely and within the home, and refer only the most acute cases
for physician or hospital intervention.
Other delivery models rely upon physician practices to monitor patients
directly, at times delegating these tasks to trained staff that operate as part of
the practice under their clinical supervision. These two models exist in both
regions regardless of patient acuity, frequency of monitoring, or reimbursement
models. (In the healthcare context, patient acuity refers to the type and severity
of illness, with acutely ill patients requiring emergency care.) It is important
that use cases and the supporting informatics standards comprehend these
types of variation in order to deliver an effective RPM solution.
AHIC is a federal advisory body, chartered in
2005 to make recommendations to the Secretary
of the U.S. Department of Health and Human
Services on how to accelerate the development
and adoption of health information technology.
AHIC specifies prioritized healthcare use
cases, for which the HITSP then develops
interoperability specifications, leveraging,
harmonizing, and further constraining existing
internationally-recognized standards. HITSP
identifies gaps in coverage and forwards new
development areas to Standards Development
Organizations (SDOs) for consideration
in future revisions. While both AHIC and
HITSP are entirely U.S.-centric in focus,
similar organizations exist in each major region
worldwide and their common goal is to achieve
standards harmonization. The process of
international standards harmonization is covered
separately in this issue by Jaffe, et al.
[Figure 1 is a sequence diagram with three actors: the Patient, the Remote Monitoring Management System, and the Electronic Health Record (EHR) System. The flow: collection of patient session data (vital signs, assessments) according to the prescribed treatment plan; remote monitoring values compared against the reference range; alerts and notifications generated for values outside of the reference range; remote monitoring information transmitted to the clinician's EHR; clinician reviews the remote monitoring information from within the EHR; clinician modifies the treatment plan as necessary; updated treatment plan transmitted to the remote monitoring system; updated treatment plan conveyed to the patient.]
Figure 1: RPM Sequence Diagram
Source: Intel Corporation, 2009. Adapted from HITSP RMON Business Sequence Diagrams [6]
Figure 1 represents three primary actors of concern – the Patient, the Remote
Monitoring Management System (RMMS) that operates the RPM solution,
and an external Electronic Health Record (EHR) system, which might
represent anything from a clinical electronic medical record system to a
purpose-built system developed specifically for chronic disease management
(CDM) or home health monitoring. The primary monitoring of a patient’s
wellbeing by a clinician can occur either within the RMMS itself or by
extension, from within the EHR system. The sequence diagram is triggered by
the event of patient data collection, including vital signs and assessments
defined by a treatment plan, which specifies not only what data to collect but
also the frequency and schedule.
Typical patients are asked to collect vital signs anywhere from one to four times
per day (an average of eight vital sign readings per day), anywhere
from three to seven days per week. Patients work with a variety of peripheral
device types, including blood pressure cuffs, glucose meters, weight scales,
and pulse oximeters. Vital sign measurements are combined with assessment
questions that are geared to assess patient adherence to the treatment plan,
functional status, and coping mechanisms with regard to their specific set of co-morbidities, such as congestive heart failure, diabetes, and chronic obstructive
pulmonary disease (COPD).
Once the RMMS aggregates the vital sign and assessment data, it compares
the values against a pre-established reference range or threshold to identify
those readings that are out of normal range. Optionally, alerts and notifications
may be generated to the EHR system and members of the clinical team,
notifying them of the abnormal readings. Next, both normal and out-of-range
information is transmitted to the EHR system, either in raw or summary
format. The clinician reviews the information, makes clinical notes, and as
needed, generates referrals and modifies the treatment plan. The updated
treatment plan is transmitted to the RMMS and subsequently communicated
to the patient and the patient-local devices.
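The threshold comparison and alert step can be sketched as follows. This is a minimal, hypothetical simplification of RMMS logic; the `Reading` type and the Alert/Normal labels are invented for illustration, though the sample values are taken from the vital signs example later in this article:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    measure: str    # e.g. "systolic"
    value: float    # observed measurement
    low: float      # lower bound of the patient's reference range
    high: float     # upper bound of the patient's reference range

def classify(reading: Reading) -> str:
    """Flag a reading that falls outside its pre-established reference range."""
    if reading.value < reading.low or reading.value > reading.high:
        return "Alert"
    return "Normal"

def alerts(session: list[Reading]) -> list[Reading]:
    """Collect only the out-of-range readings for notification."""
    return [r for r in session if classify(r) == "Alert"]

session = [
    Reading("systolic", 137, 89, 130),   # above range
    Reading("diastolic", 85, 59, 90),    # within range
    Reading("pulse", 89, 59, 100),       # within range
]
```

A real deployment would also carry per-patient, per-measurement ranges and would transmit both normal and out-of-range values downstream, as described above.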
The significance of the use case and sequence diagram just described is as follows:
• There is a high degree of variability in where patient data are maintained.
The clinician and clinical support team review and annotate patient
information in multiple systems of record. The RMMS is generally
deployed in addition to multiple legacy systems used to manage patient
healthcare, and typically, little to no integration exists.
• Some healthcare use cases and informatics standards assume that a clinician
event is responsible for triggering the transfer of information from one
clinical team to the next; however, this use case underscores several
examples where data transmission is triggered instead by system events, and
it is important for informatics standards to comprehend both scenarios.
• The type, frequency, and schedule of the information collected are also
highly variable, which must be comprehended by informatics and system
deployment designs alike. The more standardized the information, the
more it will drive machine computability, advanced analytics, and improved
healthcare at lower costs. However, informatics standards must always
balance the goal of perfect information requiring every data element,
with a pragmatic approach to incremental adoption, relying instead on
implementations to deliver best effort results in populating standard data
elements with carefully designed constraints.
• A consistent, appropriately encoded specification of the treatment plan
is just as important as the data collection itself. Initial focus is rightfully
on capturing and transmitting RPM data, yet to truly measure patient
adherence and prognosis over time, we also need a consistent method to
specify the treatment plan in a semantically meaningful way.
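As a concrete illustration of the last point, a minimal encoding of a treatment plan's collection schedule might look like the sketch below. The field and type names are hypothetical and are not drawn from any HL7 specification; they simply capture the what/how-often/which-days structure described above:

```python
from dataclasses import dataclass, field

@dataclass
class Measurement:
    device: str            # e.g. "blood pressure cuff"
    times_per_day: int     # one to four collections per day is typical
    days_per_week: int     # three to seven days per week is typical

@dataclass
class TreatmentPlan:
    patient_id: str
    measurements: list[Measurement] = field(default_factory=list)

    def weekly_readings(self) -> int:
        """Expected number of readings per week under this plan."""
        return sum(m.times_per_day * m.days_per_week for m in self.measurements)

plan = TreatmentPlan("patient-001", [
    Measurement("blood pressure cuff", times_per_day=2, days_per_week=7),
    Measurement("weight scale", times_per_day=1, days_per_week=7),
])
```

Encoding the plan in a structured, machine-readable form is what makes adherence measurable: observed readings can be compared against the plan's expected schedule.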
[Figure 2 is an interaction diagram. Inside the Remote Patient Monitoring solution boundary, a Personal Health Monitoring Device transmits patient session data to the Remote Monitoring Management System and its data store (Step 1); a Care Manager reviews and annotates from within the RMMS (Steps 2 and 3). Data synchronization (Step 4) links the RMMS to the SOA Integration Services environment (Step 5), which calls cloud services, including an Entity Identity Service, a Controlled Terminology Service, a Patient Consent Service, and Rx and lab results sources (Steps 6 and 7). Data flows on to the Electronic Health Record (EHR) System (Step 8), where the Clinician reviews it (Step 9) and enters treatment plan changes (Step 10); updates return to the RMMS (Step 11) and to the patient-local devices (Step 12).]
Figure 2: RPM Interaction Diagram
Source: Intel Corporation, 2009
Figure 2 reveals several additional deployment considerations with respect to
RPM. Step 1 represents the transmission of patient session data to the RMMS.
The RMMS aggregates trend information, compares vital signs against
reference ranges, and generates alerts and notifications. It has its own data
repository, consisting of historical patient treatment plans, reference ranges,
clinical notes, patient monitoring data, and customer-specific configuration
information. The RMMS is necessarily designed to be an online transaction
processing (OLTP) system, since its primary goal is to drive RPM in a high-performance, transaction-oriented manner.
Optionally, Steps 2 and 3 depict a care manager who reviews the patient's status
from within the RMMS. The care manager may modify the treatment plan
directly, or annotate the record and refer it to a clinician for review and follow-up. The care manager may also play a key role in assessing the viability of the
data collected prior to escalation. For example, in the case of a very low weight
reading generating an alert, the care manager might confirm that the patient’s
grandson had in fact triggered the scale.
Step 4 represents some form of data synchronization between the RMMS and
the Integration Services environment. The primary goal of the Integration
Services environment is rapid query, retrieval, translation, transformation, and
guaranteed delivery (push/pull) of healthcare information between a number of
participating entities (Step 5). Integration Services, therefore, are commonly a
mix of online analytic processing (OLAP) combined with logical mechanisms
of extract, transform, and load (ETL). Trend analysis and summarization may
also be performed at this stage. The Integration Services environment is best
developed according to SOA principles, largely due to the complexity and
number of end points, the variability of legacy and standard interfaces, and
the number of different workflows and external services required. We discuss
the rationale and benefits of applying SOA to RPM in more detail later. The
mechanism of data synchronization depends upon patient acuity, data latency,
and a number of other considerations, also discussed later in this article.
As a part of the transformation that occurs in Step 5, multiple calls may be
made to services across the Internet (Steps 6 and 7) in order to complete the
healthcare dataset, including identity match; terminology encoding; record
locate; patient consent; or data enrichment, incorporating the latest lab results,
prescription history, or drug-to-drug interaction checks.
Finally, the RPM information is transmitted to the EHR (Step 8), where it is
reviewed and annotated by the clinician (Step 9). Note that many providers
will require manual review of the data for clinical accuracy prior to committing
it to the EHR. This review requirement has significant workload and process
implications for the future viability of RPM, discussed in more depth later.
The clinician enters modifications to the treatment plan and generates requests
for consultation (Step 10). Changes to the treatment plan are returned to the
RMMS (Step 11) and transmitted to the patient-local devices (Step 12). The
inbound data flows undergo similar decomposition and transformation as
outbound flows (Steps 5-7), remapping identity and terminology to local code
sets.
System designers should anticipate the need to support a large number of end
points with a high complexity and variability of transforms (e.g., HL7 v2.x,
HL7 v3, proprietary XML, proprietary delimited files) and translations (e.g.,
SNOMED CT, LOINC, RxNORM, ICD9/10) required across an array of
different transport protocols (e.g., SOAP over HTTPS, SFTP, MLLP, and
proprietary sockets). A powerful approach is the implementation of the HL7
v3 CDA [4] in the Integration Services environment (Step 5) as a normalized
view of RPM information, leveraging the full richness of the HL7 v3 Reference
Information Model (RIM). The CDA becomes in effect a Rosetta Stone,
making the subsequent translation to legacy, proprietary, and standards-based systems predictable, reliable, and semantically correct. The incremental
adoption model inherent in CDA ensures the relative ease with which we
can populate optional segments with new information, such as Plan of Care,
while the SDOs work through the optimal encoding scheme. It also ensures
straightforward compliance with new CDA document types, such as the
Continua Health Alliance Personal Health Monitoring (PHM) Report [3], as a
minor transform from the baseline CDA.
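The hub-and-spoke pattern behind this "Rosetta Stone" approach can be sketched in miniature: every endpoint format registers a single transform from the normalized representation, so N endpoints need N transforms rather than N x (N-1) pairwise mappings. The dictionary-based observation and the two renderers below are hypothetical stand-ins for a parsed CDA and real endpoint interfaces:

```python
# Hypothetical normalized observation, standing in for a parsed CDA entry.
normalized = {
    "code": "271649006",   # SNOMED CT: Systolic
    "value": 137,
    "unit": "mm[Hg]",
}

def to_hl7v2(obs: dict) -> str:
    """Render the normalized observation as a pipe-delimited segment
    (loosely modeled on an HL7 v2.x OBX segment; simplified)."""
    return f"OBX|1|NM|{obs['code']}^^SCT||{obs['value']}|{obs['unit']}"

def to_csv(obs: dict) -> str:
    """Render the same observation as a proprietary delimited line."""
    return f"{obs['code']},{obs['value']},{obs['unit']}"

# Adding a new endpoint format means registering one new transform,
# never touching the existing ones.
TRANSFORMS = {"hl7v2": to_hl7v2, "csv": to_csv}

def deliver(obs: dict, endpoint_format: str) -> str:
    return TRANSFORMS[endpoint_format](obs)
```

The design choice mirrors the article's argument: because the hub is normalized and semantically encoded, each spoke transform is small, predictable, and independent.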
The HL7 v3 RIM is an abstract healthcare
informatics model, generated by using Uniform
Modeling Language (UML) and consisting of
a set of classes, attributes, and relationships.
The HL7 CDA and other v3 specifications are
derived from the RIM and encoded by using
XML. The RIM is capable of representing Acts,
Entities, Relationships, Roles, and Participants,
and it is refined by using standard code sets and
controlled medical vocabulary.
Continua Health Alliance is a non-profit, open
industry coalition of more than 200 healthcare
and technology companies collaborating to
improve the quality of personal healthcare.
Continua is dedicated to establishing a system
of interoperable personal health solutions, with
the knowledge that extending those solutions
into the home fosters independence, empowers
individuals, and provides the opportunity for
personalized health and wellness.
Using HL7 to Encode Telehealth Data
The HL7 v3 CDA Release 2 [4] is the ideal vehicle for healthcare data
integration. It is an informatics standard based on years of industry expertise,
and it represents healthcare documents in their purest form, ranging from a
progress report, to a discharge summary, to an MRI study captured as a
standard image set. The HL7 CDA can scale easily from early to advanced stages
of adoption: it can represent anything from a container with basic summary
information (such as basic patient demographics, ordering physician, and a
facsimile of a lab result stored as an attachment) to a container with a
longitudinal health record.
The standard is both precise and adaptable, as demonstrated by the Continua
xHR Implementation Guide [3], a variant of the CDA developed to define the
Personal Health Monitoring Report (PHMR) [10].
The real ingenuity of the CDA is the overlay of a series of templates or
constraints to the v3 RIM, to uniquely identify and encode sections of a
clinical document in a standard and semantically meaningful way. The HL7
v3 Continuity of Care Document (CCD) [9] was one of the first broadly
implemented sets of constraints applied to the HL7 CDA Release 2 [4], and
it was selected by HITSP as the basis of their implementation guides. There
is now a whole complement of implementation guides based on either the
umbrella CDA or on the further constrained CCD, all of which aim to deliver
both human-readable and machine-consumable clinical information in a
systematized fashion.
Assigned Author
The HL7 CDA supports the concept of the Assigned Author being either a
human or a device, as depicted in Code Listing 1.
<!-- when the CDA is compiled/reviewed by a Clinician -->
<author>
  <assignedAuthor>
    <assignedPerson>
      …
    </assignedPerson>
  </assignedAuthor>
</author>

<!-- when the CDA is created by a system or device -->
<author>
  <assignedAuthor>
    <assignedAuthoringDevice>
      …
    </assignedAuthoringDevice>
  </assignedAuthor>
</author>
Code Listing 1: Machine-Computable XML: Assigned Author
Source: Intel Corporation, 2009
In some cases, a summary report of RPM is generated along with clinical
notes, as part of a transfer of care or a request for consultation, and the use of
a human Assigned Author makes perfect sense. In other cases, information is
automatically collected by RPM devices, compiled into an HL7 CDA, and
forwarded to other healthcare systems for subsequent clinical analysis. In this
latter case, there is no human author who can be assigned, as the information
has yet to be reviewed by any clinical personnel. Similarly, while some regions
may require legal authentication of a clinical document, other regions may
delegate legal authentication to a device or system, and still others may opt to
release a clinical document prior to legal authentication. The HL7 PHMR does
an excellent job of both considering and supporting each of these variations in
usage models [10].
Medical Equipment
The HL7 PHMR [10] (based upon the work of the Continua Health Alliance
[3]) does a thorough job of specifying required peripheral manufacturer
information. Table 1 and Code Listing 2 show the Medical Equipment section
from the HL7 PHMR [10], depicting both the XML-rendered table and a
subset of the machine-computable section for a single device, respectively.
Medical Equipment

System Type:           Blood Pressure Monitor
System Model:          Pulse Master 2000
System Manufacturer:   Nonin
System ID:             1F-3E-46-78-9A-BC-DE-F1
Regulated/Unspecified: Regulated
Production Spec:       Serial Number: 584216; Part Number: 69854;
                       Hardware Revision: 2.1; Software Revision: 1.1;
                       Protocol Revision: 1.0; Prod Spec GMDN:
Table 1: XML-rendered Table: Medical Equipment
Source: Health Level Seven, 2009 [10]
<title>Medical Equipment</title>
…
<!-- Note the use of the IEEE EUI-64 unique peripheral DeviceID, used
to establish a link between Medical Equipment peripheral and reference
measurements in the Vital Signs section -->
<id root="1.2.840.10004.1.1.1.0.0.1.0.0.1.2680"
    assigningAuthorityName="EUI-64" extension="1F-3E-46-78-9A-BC-DE-F1"/>
…
<!-- standard device encoding via SNOMED CT -->
<playingDevice>
  <code code="32033000" codeSystem="2.16.840.1.113883.6.96"
        codeSystemName="SNOMED CT" displayName="Arterial pressure monitor">
    <!-- translated device encoding via MDC -->
    <translation code="MDC_DEV_SPEC_PROFILE_BPM"
                 codeSystem="2.16.840.1.113883.6.24" codeSystemName="MDC"
                 displayName="Blood Pressure Monitor"/>
  </code>
  <manufacturerModelName>…</manufacturerModelName>
</playingDevice>
Code Listing 2: Machine-computable XML: Medical Equipment
Source: Health Level Seven, 2009 [10]
Note the significance of the use of standard terminology in Code Listing 2,
which leverages both SNOMED CT and MDC code sets (refer to “code/code
system…translation code/code system”). This semantic encoding is what makes
the information machine-computable and enables interoperability. Systems can
transform the terminology to other standard and proprietary formats, precisely
because the standard is encoded in the first place, just like the Rosetta Stone.
Semantic interoperability is essential to enable clinical research and to facilitate
queries across a wide variety of data sources; however, semantic exchange is
only possible if the terminology has been first normalized to one of a few dozen
international healthcare terminology standards.
Levels of Interoperability
Semantic interoperability is the highest form of data integration, such that
receiving systems can readily and precisely consume information, encoded with
standard terminologies and code sets, with no loss of meaning or context for
abstract terms and concepts. This is the goal of the HL7 v3 RIM.
Syntactic interoperability, the next lower form of data integration, exchanges
information by using agreed-upon syntax. A set of fields are specified along
with their syntax, but nothing is specified as to the possible range of values or
meaning. Prior messaging standards tended to stop short at syntax: they did
not address the crucial last step of semantics, and therefore left an incomplete
data standard wide open to conflicting interpretation and incompatible
implementations. For example, the earlier versions of HL7 v2.x messages
focused almost exclusively on syntax, and hence the expression “every v2.x
interface is a new v2.x interface.” Each implementation added proprietary
interpretations and extensions over time.
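The distinction can be made concrete with a tiny example. The pipe-delimited segment below is a hypothetical, simplified line loosely modeled on an HL7 v2.x observation segment; its syntax fixes only delimiters and field positions, while the local code "BP_SYS" carries no declared code system:

```python
# Syntactic interoperability: both parties agree on delimiters and field
# positions, but the observation identifier "BP_SYS" is a local code with
# no declared code system; a receiver cannot compute on it reliably.
segment = "OBX|1|NM|BP_SYS||137|mmHg"   # hypothetical, simplified segment
fields = segment.split("|")
observation_id, value, unit = fields[3], fields[5], fields[6]

# Semantic interoperability: the same observation bound to a standard
# terminology (SNOMED CT, identified by its OID), so meaning travels
# with the data instead of living in out-of-band agreements.
semantic = {
    "code": "271649006",                     # SNOMED CT: Systolic
    "codeSystem": "2.16.840.1.113883.6.96",  # OID identifying SNOMED CT
    "value": 137,
    "unit": "mm[Hg]",
}
```

Two sites can populate the identifier field of the first form with different local codes and both remain "valid," which is exactly how incompatible v2.x interfaces arose.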
Integration at the lower five levels of the OSI model focuses on standardizing
transport, protocol, and security, which is also the primary focus of such
groups as W3C, Continua, IHE, and IEEE. Technology standards are optimal
to address the exchange on the wire, while healthcare domain expertise is
optimal to address the syntax and semantics of the exchange.
Vital Signs
Table 2 shows a sample of the Vital Signs section of a PHMR, depicting the
XML-rendered table, while Code Listing 3 depicts a subset of the machine-computable section for a single measurement. This example highlights the dual
role of the CDA construct: to provide highly accessible information viewable
by clinicians in a concise, easy-to-read format, along with a fully encoded,
machine-consumable version of the same information, capable of supporting
any degree of advanced analytics and clinical workflow. It is common in fact
for information in the machine-consumable section to be richer than that
depicted in the XML-rendered table. For example, the XML-rendered table
might present only summary or trend information, whereas it is perfectly
acceptable for the machine-consumable segment to include both raw and
summary information.
Vital Signs

Date Captured             Peripheral       Measurement  Value     Condition  Reference Range
2008-01-07 13:55:14.000   Blood Pressure   Systolic     137 mmHg  Alert      89-130
2008-01-07 13:55:14.000   Blood Pressure   Diastolic    85 mmHg   Normal     59-90
2008-01-07 13:55:14.000   Blood Pressure   Pulse        89 BpM    Normal     59-100
Table 2: XML-rendered Table: Vital Signs
Source: Intel Corporation, 2009
<title>Vital Signs</title>
…
<!-- standard encoding for Systolic readings -->
<code code="271649006"codeSystem="2.16.840.1.113883.6.96"
displayName="Systolic"/>
…
<!-- actual Systolic measurement -->
<value xsi:type="PQ" value="137" unit="mm[Hg]"/>
<!-- interpretation code indicates reading is Abnormal, based upon
referenceRange cited below -->
<interpretationCode code="A"
codeSystem="2.16.840.1.113883.5.83"/>
…
<!-- Note the use of the IEEE EUI-64 unique peripheral DeviceID, used
to establish reference between a measurement in Vital Signs to originating
peripheral details in the Medical Equipment section -->
<participant typeCode="DEV">
<participantRole>
<id root="1.2.840.10004.1.1.1.0.0.1.0.0.1.2680"
assigningAuthorityName="EUI-64" extension="1F-3E-46-78-9A-BC-DE-F1"/>
</participantRole>
</participant>
<!-- referenceRange cites the lower and upper measurement thresholds
considered out-of-range or abnormal -->
<referenceRange>
<observationRange>
<value xsi:type="IVL_PQ">
<low value="89" unit="mm[Hg]"/>
<high value="130" unit="mm[Hg]"/>
</value>
</observationRange>
</referenceRange>
Code Listing 3: Machine-computable XML: Vital Signs
Source: Intel Corporation, 2009
Note also the use of Observation/interpretationCode and Observation/
referenceRange in Code Listing 3, to indicate whether a particular reading
is considered within or outside of the reference range for that measurement
type. These data elements are optional, but are strongly encouraged as industry
best practice (i.e., “SHOULD”) by the HL7 PHMR [10]. Inclusion of this
information is crucial to be able to quickly perform patient triage as well as
trending analysis and population management.
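To illustrate how a consuming system might apply the referenceRange when the optional interpretationCode is absent, the following sketch parses a simplified, namespace-free fragment modeled on Code Listing 3. A real CDA document is namespaced and far richer; the element simplifications here are ours.

```python
import xml.etree.ElementTree as ET

# Minimal, namespace-free fragment modeled on Code Listing 3 (illustrative
# only; real CDA uses the HL7 v3 namespace and xsi:type attributes).
FRAGMENT = """
<observation>
  <code code="271649006" codeSystem="2.16.840.1.113883.6.96" displayName="Systolic"/>
  <value type="PQ" value="137" unit="mm[Hg]"/>
  <referenceRange>
    <observationRange>
      <value type="IVL_PQ">
        <low value="89" unit="mm[Hg]"/>
        <high value="130" unit="mm[Hg]"/>
      </value>
    </observationRange>
  </referenceRange>
</observation>
"""

def interpretation(obs):
    """Return 'N' if the reading lies within its referenceRange, else 'A'."""
    value = float(obs.find("value").get("value"))
    low = float(obs.find(".//low").get("value"))
    high = float(obs.find(".//high").get("value"))
    return "N" if low <= value <= high else "A"

obs = ET.fromstring(FRAGMENT)
print(obs.find("code").get("displayName"), interpretation(obs))  # Systolic A
```

The 137 mm[Hg] reading falls above the 89-130 range, so the derived code is "A" (abnormal), matching the interpretationCode in Code Listing 3.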
Functional Status
Remote patient monitoring has a clear need for including questionnaire
assessments as a part of Functional Status. Assessments can range from
standard question-answer sets, recognized regionally as the authoritative
protocol for a given disease condition, to proprietary question-answer sets,
designed by particular institutions along with customized care plans. The
standard assessments are routinely used by payers and providers alike to gauge
individual patient functional status and assess overall population trending.
The HITSP Long Term Care–Assessments AHIC Gap/Extension [13] lists
several assessments applicable to the United States, including the Long-Term
Care Minimum Data Set (MDS), the Continuity Assessment Record and
Evaluation (CARE), the Outcome and Assessment Information Set (OASIS),
and the Inpatient Rehabilitation Facility-Patient Assessment Instrument
(IRF-PAI). There are widely recognized instruments worldwide for managing
chronic disease conditions such as diabetes, congestive heart failure, asthma,
or depression. Assessments can also be utilized as part of pre-qualification and
patient recruitment for clinical trials.
AHIC and HITSP acknowledged the lack of specifications related to
assessments as a gap, and worked with HL7 to develop the CDA Framework
for Questionnaire Assessments DSTU [8]. Standardized information,
templates, guidelines, schedules, and scoring mechanisms are all needed to
fully comprehend assessments within the HL7 CDA. Currently, work is
divided amongst several different teams, including HITSP Consultation and
Transfer of Care; Quality; and Long Term Care Assessments: all are working
in conjunction with the HL7 Patient Care Work Group, the HL7 Structured
Documents Work Group, and a broad array of clinical and technology industry
experts.
Some assessment instruments have the concept of weighted points associated
with particular question responses, which can then be used to triage patients
who require immediate intervention. An extension of this is the concept of a
Scored Care Plan Report, with higher patient acuity associated with higher
points assigned to more significant health indicators. For example, Congestive
Heart Failure patients might score a 0 if they are coping well on a particular
day, versus a 5 or 6 if they suddenly put on additional weight or notice swelling
in their legs. The ability to encode this information by using standard CDA
templates, constraints, and appropriate terminology would be extremely
powerful to drive both analytics and clinical workflow.
The great news is that both the CDA and specializations such as the HL7
PHMR permit the addition of optional sections such as Functional Status
to represent this type of information. However, for a system to accurately
consume and mediate the information, it would still require extensions to the
specification. The work to date from the CDA Framework for Questionnaire
Assessments DSTU [8] is outstanding in this regard, in making such rapid
progress in such a short period of time. The framework is very thorough and
nuanced in its handling of diverse use cases: it enables early adoption
while parts of the specification are yet to be finalized, and it has
demonstrated the capability to instrument very complex assessments, as
evidenced by the derivative work on the Minimum Data Set Version 3
(MDSV3) [8]. We urge
the SDOs to continue to aggressively pursue work in this area, essential to all
forms of RPM, long-term care, and other clinical settings.
In Table 3, we show an example of how we might render assessment responses
in human-readable form.
Functional Status
Session date/time: 2008-01-07 13:53:04.000

Assessment                                                               | Answer                | Score
How are you feeling today compared to yesterday?                         | Worse                 | 5
Have you taken all of your prescribed medication over the past 24 hours? | Yes                   | 0
Did you need to take any extra diuretic medication in the past 24 hours? | Yes                   | 10
Did you need to use any extra oxygen in the past 24 hours?               | Yes, more than normal | 5
Total Score:                                                             |                       | 20
Table 3: XML-rendered Table: Functional Status
Source: Intel Corporation, 2009
High priority should be placed on standardizing the templates to encode
patient assessments, including terminology, along with the concept of
assessment scales or scores where relevant.
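The scoring concept behind Table 3 reduces to a weighted sum over responses. The sketch below takes its per-response scores from Table 3 itself; in an encoded assessment these weights would be carried by standardized templates rather than hard-coded.

```python
# Sketch: totalling a questionnaire session as in Table 3. Only the
# scoring arithmetic is illustrated; the question text and weights come
# straight from the table above.

responses = [
    ("How are you feeling today compared to yesterday?", "Worse", 5),
    ("Have you taken all of your prescribed medication over the past 24 hours?", "Yes", 0),
    ("Did you need to take any extra diuretic medication in the past 24 hours?", "Yes", 10),
    ("Did you need to use any extra oxygen in the past 24 hours?", "Yes, more than normal", 5),
]

total = sum(score for _question, _answer, score in responses)
print(total)  # 20, matching the Total Score row of Table 3
```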
Plan of Care
Multiple SDOs and HITSP working groups are planning to address Plan
of Care as it relates to Consultations and Transfers of Care [5]. This is
another example of the power and flexibility of the CDA architecture.
While a traditional treatment plan might be best characterized by a set of
multidisciplinary protocols and a follow-up planned upon hospital discharge,
in RPM it takes on a different character altogether. Plan of Care includes
functional and other nursing assessments (not to be confused with physician
assessments of patient status, which are located in the CDA “Assessment and
Plan” section).
A Plan of Care is used to capture “What did we ask the patient to do?” whereas
the combination of daily assessments and vital sign collection is the answer to
“What did the patient actually do?” Thus, the combination of Plan of Care,
Functional Status, and Vital Signs can be used to gauge patient adherence
to the agreed-upon treatment plan, and can be used as a basis for long-term
outcomes analysis with respect to RPM. We would like to see the SDOs create
Plan of Care section constraints that cover both definition and scheduling
of standard assessments and vital signs collection. Such constraints would
be machine-consumable by different systems and encoded, by the use of
international terminology standards, to promote advanced analytics.
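As a sketch of the adherence idea, the following compares a hypothetical daily Plan of Care against the activities actually observed in received reports. The activity names and two-day schedule are invented for illustration.

```python
from datetime import date

# Sketch: gauging adherence by comparing what the Plan of Care asked the
# patient to do against what Vital Signs / Functional Status sessions show
# was actually done. Activity names and schedule are illustrative.

planned = {"blood_pressure", "weight", "daily_assessment"}   # per day, per plan
performed = {                                                # from received PHMRs
    date(2008, 1, 7): {"blood_pressure", "daily_assessment"},
    date(2008, 1, 8): {"blood_pressure", "weight", "daily_assessment"},
}

def adherence(planned, performed):
    """Fraction of planned activities actually completed across all days."""
    done = sum(len(planned & activities) for activities in performed.values())
    return done / (len(planned) * len(performed))

print(f"{adherence(planned, performed):.0%}")  # 5 of 6 planned activities: 83%
```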
Service Oriented Architecture (SOA) Principles
Remote patient monitoring truly represents a superset of healthcare integration
use cases, sitting at the hub of a complex coordination of care — across clinical
specialists, home care, lab, pharmacy, hospital, and assisted living facilities.
Chronic disease management adds a further layer of complexity, given that
most home-care and remote monitoring systems were developed as proprietary
systems with only a passing acquaintance with healthcare and technology
standards. Also, it is not uncommon for these CDM systems to be used
alongside one or more of the major commercial EMR systems.
SOA provides the essential grounding, agility, and extensibility to manage
and reduce complexity when integrating healthcare systems, services, and
information. It is adaptable to a challenging and ever-changing business
climate, and represents a proven return on investment [1, 2, 11, 12]. SOA
provides the necessary bridge between legacy and proprietary environments
and moves us towards standards-based deployments that can scale to handle
many network end points and a rich diversity of healthcare data services.
In this section we outline key considerations and SOA principles when
establishing an RPM solution.
Flexible, Scalable Architecture to Support Any Deployment Model
With healthcare integration, it is important to leverage architecture that is
flexible across different deployment models and data-use agreements. Figure 3
depicts three common deployment models, wherein health information is
maintained in a centralized, federated, or hybrid database model. The model
selected largely depends upon data-use agreements in the region — whether to
maintain data centrally to a given region, remotely (federated), or as a hybrid
model of the two. The centralized model is optimum both in terms of
performance and access to a consistent, normalized data set, suitable for both
healthcare delivery and clinical research. However, the centralized model
requires the political will by all participants, public and private alike, to agree
to centralized data sharing and data-use agreements.
[Figure 3 depicts three deployment models, Centralized, Federated, and
Hybrid, each linking a lab, clinic, pharmacy (RX), and hospital.]
Figure 3: Data Origin Flexibility
Source: Intel Corporation, 2009
Due to concerns over data sharing, ownership, and local control, the federated
model is frequently chosen over the centralized or hybrid. When a discharge
summary or lab report is required, a record locator service query is issued from
the center to each of the federated or member organizations. If the federated
service is available and the appropriate agreements are in place, one or more
relevant documents are returned that pertain to the patient in question. Due to
concerns over performance and availability, deployment models frequently shift
over time from federated to hybrid, maintaining a small set of demographic
data and centralized pointers to records maintained at the perimeter.
The federated model can be quite effective for very large, distributed data sets,
but the records must be normalized at the edge by the use of agreed-upon
terminology standards. The political will to use such standards can be even
more challenging to achieve than that required for the centralized model. An
excellent example of the federated model is the caBIG (Cancer Biomedical
Informatics GRID at cabig.nci.nih.gov), which links physicians, patients, and
researchers to clinical, cancer, and genomics repositories distributed worldwide
in a standard normalized fashion.
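A minimal sketch of the record locator pattern described above, with in-memory dictionaries standing in for federated member services. A real deployment would make authenticated remote calls, honor per-member data-use agreements, and tolerate unavailable members.

```python
# Illustrative federated member registries; keys and document names are
# invented. An empty registry stands in for an unavailable member or one
# holding no matching records.
MEMBERS = {
    "lab":      {"patient-42": ["lab-report-2008-01-07"]},
    "clinic":   {"patient-42": ["discharge-summary-2007-12-20"]},
    "hospital": {},
}

def locate_records(patient_id):
    """Fan a record locator query out to each member; merge document pointers."""
    found = []
    for member, registry in MEMBERS.items():
        for doc in registry.get(patient_id, []):
            found.append((member, doc))
    return found

print(locate_records("patient-42"))
# [('lab', 'lab-report-2008-01-07'), ('clinic', 'discharge-summary-2007-12-20')]
```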
Remote patient monitoring requires the integration of health information from
a variety of disparate sources. SOA is well suited to adapt and extend to this
level of service, data origin, and terminology complexity. A key success factor in
deploying RPM is to leverage a flexible architecture which can scale from small
institutions, to regional health information exchanges, to national networks.
Finally, the deployment model and technology selected must readily scale to
processing at the core or at the edge of the network.
Service Extensibility, Virtualization of End Points
Traditional peer-to-peer approaches to integration lead to the N² problem as
depicted in Figure 4, in that each and every application requiring integration
causes a geometric expansion of up-front cost and ongoing maintenance. In
healthcare integration, the N² problem is made manifest by the inconsistent
adoption of healthcare and technology standards by legacy and proprietary
systems. When a new application joins the network, each and every adapter has
to be modified, in addition to the new application.
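The arithmetic behind this growth can be stated directly: point-to-point integration of n applications requires an adapter for every pair, n(n-1)/2, whereas a shared bus needs only one on-ramp per application.

```python
# Geometric versus linear adapter counts for n integrated applications.

def point_to_point(n):
    """Pairwise custom interfaces: one per pair of applications."""
    return n * (n - 1) // 2

def bus(n):
    """Shared-bus model: one on-ramp per application."""
    return n

for n in (2, 3, 6, 10):
    print(n, point_to_point(n), bus(n))
# at n=6 the pairwise model already needs 15 adapters versus 6 on-ramps
```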
[Figure 4 shows networks of n=2, n=3, and n=6 applications, with pairwise
custom interfaces multiplying as n grows.]
Figure 4: N² Problem in Healthcare
Source: Intel Corporation, 2009
94 | Healthcare Information Integration: Considerations for Remote Patient Monitoring
[Chart: number of custom interfaces (0 to 100) plotted against number of
HIT systems (1 to 10).]
Integration Brokers change the cost model from geometric to linear, but have
their own share of challenges, with the risk of establishing heterogeneous
“islands” of integration. Integration Brokers rely upon a hub-and-spoke
architecture, creating a single point of failure by routing all messaging traffic
through a central hub. Only with a standardized information model, service
extensibility, and virtualization of end points, provided by an Enterprise
Service Bus (ESB), can one completely address the N² problem.
The heart of the N² problem lies in the simplistic framing of integration as a
two-dimensional use case. When we frame the exchange of health information
as a simple, bidirectional exchange between a total of two points, we obfuscate
the actual complexity involved. In reality, the RPM exchange requires multiple
data sources, or network end points, in order to complete the CDM view of
the patient, including vital signs, assessment responses, functional status, lab
results, prescriptions, diet, exercise, treatment plan, and clinical notes, to name
just a few. The information needs to be addressable by using a standardized
information model, and over time, each of the data services should be exposed
by using a standard set of query and retrieval methods.
A service network architecture allows for building the “on-ramp” once, with no
adapter maintenance required, as other applications join or leave the network.
Service extensibility serves to virtualize the end points, abstracting the details of
transport protocols and peer-to-peer connections. Services can be dynamically
registered, discovered, and rerouted to scale as performance and reliability
needs dictate.
Network Compute Model
The HL7 v3 CDA Release 2 [4] constrains the v3 RIM and leverages the full
richness of its healthcare informatics model and standardized terminology,
delivering computable, healthcare information as well as human-readable
clinical documents. By first composing all of the telehealth data to the HL7 v3
CDA Release 2 (i.e., the network on-ramp in Figure 5), it becomes a repeatable
exercise then to perform any secondary transforms to legacy or proprietary
messaging protocols and local terminology (i.e., the network off-ramp).
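The on-ramp/off-ramp pattern can be sketched as a pair of transforms through one canonical form. The record fields and formats below are invented stand-ins for HL7 v3 CDA R2 content; the point is that each source and each consumer needs exactly one transform to or from the canonical form.

```python
def onramp_telehealth(raw):
    """Telehealth feed -> canonical record (one transform per source)."""
    return {"patient": raw["pid"], "obs": raw["sys_bp"], "unit": "mm[Hg]"}

def offramp_legacy_csv(canonical):
    """Canonical record -> a legacy flat format (one transform per consumer)."""
    return f'{canonical["patient"]},{canonical["obs"]},{canonical["unit"]}'

# Compose once to the canonical form, then any secondary transform is a
# repeatable exercise against that single representation.
canonical = onramp_telehealth({"pid": "42", "sys_bp": 137})
print(offramp_legacy_csv(canonical))  # 42,137,mm[Hg]
```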
[Figure 5 depicts a Healthcare Service Network: data enters through a
network on-ramp, where a canonical transform renders it as HL7 CDA for the
healthcare services on the network.]
Figure 5: Service Network Architecture
Source: Intel Corporation, 2009
This integration pattern accelerates adoption of the latest healthcare informatics
standards, while lowering the barriers of adoption for smaller organizations that
need to proceed at their own pace.
A robust network informatics exchange model can be used to establish trust
at the network level — a healthcare dial tone. Peer systems may validate,
authenticate, and audit the encrypted XML payloads at any point in the
network. Moreover, the network compute model enables the ability to route,
compose, and decompose fine-grained business objects according to national
and regional, legal and regulatory, privacy, data protection, and patient consent
requirements.
Pluggable Services in the Cloud
SOA provides the ability to leverage services available within the data center or
across the Internet. New services can be brought online and directly utilized by
network participants, without requiring additional modifications to each end
point. Since the service location is virtualized, and the service implementation
is encapsulated, a service can be readily created or replaced without impacting
existing service consumers. The OMG/HL7 Healthcare Services Specification
Project (HSSP at http://hssp.wikispaces.com/) is working to define standard
Web service methods to access critical healthcare infrastructure services,
including entity identity, controlled terminology, record locator, decision
support, and clinical research filtered query services. Similarly, there is a need
for advanced healthcare data services, such as drug interaction checks, adverse
event reporting, clinical trial recruitment, and public health reporting. SOA
design methodology allows for incremental implementation, at first utilizing
simple data match routines and then, when the complexity of exchange
dictates, readily switching to industry-strength identity and terminology
services, all without changing the service interface.
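The encapsulation argument can be sketched as follows: consumers code against a stable identity-service interface, so a simple data-match routine can later be swapped for an industry-strength EIS without touching callers. The interface shown is an assumption for illustration; HSSP defines the actual service contracts.

```python
from typing import Optional, Protocol

class IdentityService(Protocol):
    """Illustrative service interface; callers depend only on this."""
    def resolve(self, name: str, birth_date: str) -> Optional[str]: ...

class SimpleMatch:
    """Initial implementation: exact match on two demographics attributes.
    Could later be replaced by a commercial EIS behind the same interface."""
    def __init__(self, records):
        self.records = records  # {(name, birth_date): patient_id}
    def resolve(self, name, birth_date):
        return self.records.get((name, birth_date))

def lookup(service: IdentityService, name, birth_date):
    # Consumer code: unchanged when the implementation is swapped.
    return service.resolve(name, birth_date)

eis = SimpleMatch({("Jane Doe", "1941-03-05"): "patient-42"})
print(lookup(eis, "Jane Doe", "1941-03-05"))  # patient-42
```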
Deployment Considerations
Effective RPM deployment requires the application of industry best practices
with respect to information technology, data center operations, enterprise
service delivery, and robust security and privacy measures. The technical
challenges in healthcare are not insurmountable; rather, they can be solved by
using well-known solutions and design patterns. What is required, however, is
deep healthcare domain expertise, a keen sense of customer requirements, and
an understanding of the context in which the system will be used. In order to
fully realize the potential of RPM, we must arrive at the right combination of
technology and process.
Patient Acuity and Mode of Healthcare Delivery
Patient acuity and the concomitant mode of healthcare delivery are arguably
the most important determinants of RPM requirements. Patient acuity
determines the level of monitoring and likelihood that intervention will be
required, the frequency of data collection, the criticality of reference ranges
and alerting mechanisms, and the relative intolerance for data latency. The
mode of healthcare delivery — whether it be a wellness coach, a visiting nurse,
community outreach, an assisted living facility, hospital discharge monitoring
to avoid readmission, population management, or patient-empowered
healthcare — tends to be matched to patient acuity and provider service level
agreements. Contact with the patient, and the relative need to process the
clinical findings, will therefore range from occasional, to quarterly, monthly,
daily, or hourly, to perhaps even more frequent when a sudden decline in health
is detected. The duration of RPM deployments could be measured in weeks
in the case of hospital discharge monitoring, months in the case of high-risk
pregnancies, or years when monitoring elderly patients with co-morbidities.
Data Latency and Modes of Transmission
As discussed earlier, tolerance for data latency is largely determined by patient
acuity of the target population. Relatively healthy patients, coupled with
wellness coaches, can tolerate high data latency with summary reports over
periods as much as a few months at a time. High-acuity patients require
more frequent monitoring, with data latency approaching near real-time.
High data latency can readily be accommodated by scheduled, file-oriented,
and store-and-forward processes to perform data integration. In contrast,
low data latency requires end-to-end integration performed via Web services
every few minutes, where the incremental transmission is closer to a single
patient session, containing the latest raw measurements and assessment
responses, rather than a complete trend analysis of the past period. Alerts and
notifications can be generated via real-time, event-driven triggers, whereas
batch operations and monthly summary reports can be scheduled to occur
during off-hour processing.
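The mapping from tolerated latency to integration mode might be sketched as follows; the numeric thresholds are illustrative assumptions, not clinical guidance.

```python
def transmission_mode(max_latency_minutes):
    """Choose an integration mode from the data latency the care model tolerates."""
    if max_latency_minutes <= 15:
        return "web-service per patient session"   # near real-time, high acuity
    if max_latency_minutes <= 24 * 60:
        return "store-and-forward daily batch"
    return "scheduled file-oriented summary"       # low acuity, wellness coaching

print(transmission_mode(5))            # high-acuity patient
print(transmission_mode(7 * 24 * 60))  # wellness-coach summary reporting
```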
Volume and Quality of Data
RPM involves significantly more data than what is typically anticipated for
an EHR, such as a clinical encounter. Patients may be instructed to take vital
sign measurements multiple times per day in addition to responding to various
assessment questions. One of the chief areas of ongoing investigation is the
optimal level of summarization of the PHMR. Different clinicians will likely
want a full range of options, from daily, monthly, or quarterly to a filtered
summary, depending upon patient acuity and the mode of healthcare delivery.
Once the report design is optimized, additional adaptations may be necessary
to the recipient system in order to fully leverage the additional rich data types,
process alerts, and triage patients based on clinical findings. In particular, it
is unlikely that recipient systems are prepared to work with patient-specific
reference ranges, threshold violations, assessment questionnaires, or weighted
scores for industry standard protocols. Some systems are unprepared to process
datetime stamps on individual measurements, because of the prior exclusive
focus on clinical encounters in office settings. In other words, while a given
office medical system might record the date of the office visit, it rarely records
the time of an individual measurement. While the concept of an RPM “patient
session” can be likened to an office visit, the recipient medical system is
unprepared to process the sheer volume of RPM sessions and measurements.
As clinical systems and processes evolve to process data from RPM, more
sophisticated methods of patient triage, alert, and notification will also be
required. For example, a large number of normal readings from a moderately
sized patient population will quickly outpace the most efficient of care-manager
organizations, if workflows require the manual acknowledgement of all
readings, rather than triage based on out-of-range measurements. Conversely,
if an organization becomes overly dependent upon the direct integration of
RPM data without also developing adequate means for systems monitoring,
an undetected outage or transmission failure might inadvertently create a
false impression that patient readings are within normal limits. It is critical,
therefore, to develop adequate systems monitoring and failsafe methods.
For example, reports should be appropriately annotated with synchronized
datetime stamps, indicating both the time of report generation and the time of
the last data transmission. All points along the end-to-end data flow should be
instrumented and monitored for effective operation and patient safety.
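The failsafe can be sketched as a staleness check on the synchronized datetime stamps: silence is flagged as a possible outage rather than read as "all readings normal". The six-hour window is an illustrative assumption.

```python
from datetime import datetime, timedelta

# Illustrative staleness window; a real deployment would tune this per
# patient acuity and transmission schedule.
STALENESS_LIMIT = timedelta(hours=6)

def report_status(generated_at, last_transmission_at):
    """Flag a report whose last data transmission is suspiciously old."""
    if generated_at - last_transmission_at > STALENESS_LIMIT:
        return "SUSPECT: possible outage, no recent transmissions"
    return "OK"

print(report_status(datetime(2008, 1, 7, 19, 0),
                    datetime(2008, 1, 7, 13, 55)))  # OK
print(report_status(datetime(2008, 1, 7, 19, 0),
                    datetime(2008, 1, 6, 13, 55)))  # SUSPECT: possible outage...
```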
Threshold violations, especially life-threatening ones, need to trigger specific
workflows that are customizable by practice, by co-morbidities, by target
populations, and by individual patients. We have identified the need to define
tiers of thresholds to separately drive patient and clinician workflows and
modes of intervention, ranging from patient education, to clinician referral,
to emergency hospital admission. There is also the need to capture both the
trigger event and the clinical intervention as part of analytics. For example, a
patient’s oxygen saturation falls dangerously low, which triggers an alert and
results in some form of patient intervention — whether a phone call, an SMS
text message, a video conference with a clinician, or a house call from a visiting
nurse. The clinical intervention may result in a change in protocol, a lab order,
a medication change, or hospitalization. Each of these events needs to be
associated with a standard measure of outcomes in order to support analytics
for evidence-based medicine and drive further improvements to healthcare.
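Tiered thresholds of the kind described might be sketched as follows for oxygen saturation; the tier boundaries and interventions are invented for illustration and are not clinical guidance.

```python
# Tiers ordered from least to most severe; each entry is
# (lower bound of SpO2 percent, intervention).
TIERS = [
    (94, "none"),
    (90, "patient education / coaching call"),
    (85, "clinician referral"),
    (0,  "emergency escalation"),
]

def intervention(spo2_percent):
    """Map a reading to the intervention tier it falls into."""
    for lower, action in TIERS:
        if spo2_percent >= lower:
            return action
    return "emergency escalation"

print(intervention(96))  # none
print(intervention(88))  # clinician referral
```

A production workflow would make these tiers customizable per practice, per co-morbidity, and per patient, and would log both the trigger event and the resulting intervention for outcomes analytics.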
Clinician Workflow and Reimbursement Model
Another important consideration in the integration of RPM information is
clinician workflow and the reimbursement model. Some jurisdictions require
that a clinician manually review and accept each and every measurement prior
to importing the data into the institutional EHR. This follows standard clinical
practice of signing off when reviewing external lab results, yet again, the
volume of data is fundamentally different when considering RPM.
Some institutions extend the system boundary of the EHR to encompass any
automated data capture but draw the line at information that is patient-reported
or otherwise manually entered, such as a PHR containing diet and exercise
journal entries. Data that are grandfathered in as automatic data capture might
not require the manual approval step, whereas patient-reported data may
be reviewed but perhaps not incorporated into the institution’s legal record.
Annotating the data stream with the source and method of reporting helps to
account for these differences in policies.
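Annotating each datum with its source and capture method enables policy-based routing, as sketched below; the method labels and policy outcomes are illustrative assumptions.

```python
def routing_policy(item):
    """Decide handling from the source/method annotation on the data stream."""
    if item["method"] == "device-automatic":
        return "import to EHR (no manual approval)"
    if item["method"] == "patient-reported":
        return "review only; exclude from legal record"
    return "queue for clinician sign-off"

bp = {"value": "137 mm[Hg]", "source": "home BP cuff", "method": "device-automatic"}
diary = {"value": "30 min walk", "source": "PHR diary", "method": "patient-reported"}
print(routing_policy(bp))     # import to EHR (no manual approval)
print(routing_policy(diary))  # review only; exclude from legal record
```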
Accommodations are required for both the level of summary and raw
information in a given report. Streamlined mechanisms are required to process
messages through the clinical inbox, along with careful consideration as to what
level of clinical staff might be able to process what level of data on behalf of the
doctor, in order to potentially offload this manual step.
Further, the clinical reimbursement model is frequently called into question
with respect to RPM. Some reimbursement models attempt to equate RPM
with an office visit, while others only reimburse when the patient establishes
and maintains good tolerance of pre-established thresholds. Each of these
considerations will have an impact on the rate of adoption of RPM, especially
when combined with additional processing overhead on the part of the
physicians to periodically review the results.
Trans-border Data Flow Considerations
Careful attention to detail will be required for any deployment in which
integration is planned across borders of state/provinces, regional, or national
boundaries. Privacy and data protection laws are rapidly changing worldwide,
with significant penalties for mishandling of data and breach of privacy.
Advanced workflow, transformation, and routing engines will be required
to comply with local data protection regulations and policies. Special
consideration is due when determining where to locate a primary or alternative
data center hosting patient data, since a number of countries require that
protected health information (PHI) not cross national boundaries. To manage
and track patient consent and negotiate appropriate data use, business associate
and data controller/supplier agreements are all essential, regardless of whether
or not information crosses any recognized governmental boundaries.
Identifying Systems of Record
Identifying a single system of record (i.e., a single authoritative source for
each and every data element) is essential to any successful integration project
and frequently overlooked in applications such as RPM. It is typical for even
small organizations to already have multiple systems in place for purposes of
chronic disease management, population management, a primary EHR data
repository, a separate system for lab results, etc. The addition of telehealth data
likely represents the addition of one or more systems to an already complex and
disorderly mix.
A key area of concern is patient and clinician demographics. While clinical data
might be easily segregated between different systems of record, it is highly likely
that every system maintains its own copy of demographic data. Considerations
must be given both from a systems and a workflow perspective to demographics
synchronization, import, and ongoing maintenance. The older systems likely
do not have a method to disable manual edits to demographics, yet one must
ensure that clinicians are always working from an authoritative source of
patient and clinician demographics and contact information.
In an advanced integration deployment, the demographics system of record
updates the recipient systems with the latest information, including translation
of identity to the recipient system, via an entity identity service (EIS).
Provisions are made to disable manual edits in the recipient systems, or at
least ensure that processing detects, logs as an exception, and overrides any
unauthorized changes.
In mixed environments with both legacy and newer systems, a complex
scheme of automated demographics integration along with a carefully designed
business process is required. A central system can be configured to synchronize
demographics to each of the recipient systems. When manual edits in each
system cannot be disabled, they must be controlled via business process,
training, and careful oversight to ensure that changes in demographics are only
entered into the central system. Common identity mismatch errors tend to
require a small staff to resolve and maintain on an ongoing basis.
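The detect, log-as-exception, and override behavior described above might be sketched as follows. This is an illustrative model only; the class names, fields, and in-memory stores are our assumptions, not drawn from any specific product:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Demographics:
    """A minimal demographics record; real systems carry many more fields."""
    patient_id: str
    name: str
    phone: str

class RecipientSystem:
    """A downstream system that keeps its own copy of demographics."""
    def __init__(self, name: str):
        self.name = name
        self.records = {}        # patient_id -> Demographics
        self.exception_log = []  # audit trail of overridden local edits

    def apply_update(self, master: Demographics) -> None:
        """Synchronize from the system of record.

        A local copy that differs from the master is treated as an
        unauthorized manual edit: it is logged as an exception and then
        overridden with the authoritative value.
        """
        local = self.records.get(master.patient_id)
        if local is not None and local != master:
            self.exception_log.append(
                f"{self.name}: local edit to {master.patient_id} overridden")
        self.records[master.patient_id] = master
```

A central synchronization job would then call `apply_update` on every recipient system for each changed master record.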
As healthcare systems integration becomes more complex, encompassing
multiple end points and service providers, each with their own independent
systems of record, it becomes paramount to employ an industry-strength EIS to
accurately address the identity match problem. Industry leaders in EIS leverage
advanced stochastic algorithms for matching against multiple demographics
attributes to disambiguate identity. The OMG/HL7 Healthcare Services
Specification Project (HSSP) is working to define standard Web service interfaces for
common capabilities like EIS, such that different commercial services may be
deployed without requiring a change to the implemented interface.
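A toy version of attribute-weighted identity matching of this kind might look like the sketch below. The attribute weights and threshold are invented purely for illustration; commercial EIS products derive such parameters statistically from their own data:

```python
# Invented attribute weights; a real EIS tunes weights and thresholds
# statistically rather than hard-coding them.
WEIGHTS = {"name": 0.40, "birth_date": 0.35, "postal_code": 0.25}

def match_score(a: dict, b: dict) -> float:
    """Sum the weights of the demographic attributes that agree."""
    return sum(w for attr, w in WEIGHTS.items() if a.get(attr) == b.get(attr))

def disambiguate(candidate: dict, registry: list, threshold: float = 0.7):
    """Return the best-scoring registry record above threshold, else None."""
    if not registry:
        return None
    best = max(registry, key=lambda rec: match_score(candidate, rec))
    return best if match_score(candidate, best) >= threshold else None
```

Records that score below the threshold would fall out as the identity-mismatch exceptions described above, to be resolved by staff.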
Customization Requirements
There are a number of deployment considerations when establishing a remote
patient-monitoring solution. The AHIC use cases are foundational to defining
routine healthcare interactions and processes, required data elements, and
terminology constraints to standardize the exchange. It is also important to
distinguish the areas of local variation and future development, such as those
identified in the RPM sequence and interaction diagrams (Figures 1 and 2,
respectively). It is important to control the level of customization required
for any given solution to something that delivers real value to the customer,
both in the short-term and in the reasonable future, yet is practical enough to
represent low maintenance over time. While anything is technically feasible, it
is not practical for a business to develop a system to be all things to all people.
It is critical to establish up front some of the key business drivers, including
patient acuity, target mode of healthcare delivery, and the relative tolerance for
data latency. From this baseline, customization mechanisms can be established
to allow for local variation, leveraging codeless configuration changes to
metadata, rather than requiring a code recompile and system overhaul for each
deployment.
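One common way to realize such codeless configuration is to externalize the deployment-specific values as metadata that an operator edits without touching code. The sketch below is a minimal illustration under that assumption; the configuration keys are invented:

```python
import json

# Hypothetical per-deployment metadata, as would be loaded from a file
# an operator can edit; changing a value requires no code recompile.
config = json.loads("""
{
    "patient_acuity": "high",
    "delivery_mode": "home",
    "max_data_latency_minutes": 15
}
""")

def reading_is_timely(age_minutes: float, cfg: dict) -> bool:
    """Compare a reading's age to the locally configured latency tolerance."""
    return age_minutes <= cfg["max_data_latency_minutes"]
```

Local variation in acuity, delivery mode, or latency tolerance then becomes a metadata edit rather than a system overhaul.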
Conclusion
Remote patient monitoring represents a critical intersection of healthcare
information integration, embodying the need for healthcare informatics
standards, careful consideration of workflow and system deployment tradeoffs,
and direct engagement with the patients and clinicians who work with the
system. The use of HL7 CDA helps to accelerate adoption of healthcare
standards by properly constraining the rich schema and vocabulary of the
RIM, and lowering the barriers to entry through incremental evolution. The
use of SOA design principles enables us to respond to changes in business
drivers and adapt to the complexity of healthcare integration end points.
Terminology standards drive computable information, which in turn enables
advanced analytics, clinical research, and transformational healthcare delivery.
We are actively working with the SDOs in pursuit of future enhancements to
the existing healthcare informatics standards.
References
[1] Bass, Christy and Lee, J. Michael. 2002. “Building a Business Case for EAI.” eAI Journal, January 2002, pp. 18-20.
[2] Booz Allen Hamilton. 2005. “Canada Health Infoway. Pan-Canadian Electronic Health Record: Projected Costs and Benefits.” March 2005. At www.infoway-inforoute.ca.
[3] Continua Health Alliance. 2009. “Continua Design Guidelines.” V1.0, June 2009. At www.continuaalliance.org.
[4] Dolin, R.H., Alschuler, L., Boyer, S., Beebe, C., Behlen, F.M., Biron, P.V., Shabo, A. (Editors). 2005. “HL7 Clinical Document Architecture, Release 2.0. ANSI-approved HL7 Standard.” May 2005. Ann Arbor, Michigan: Health Level Seven, Inc. At www.hl7.org.
[5] Healthcare Information Technology Standards Panel (HITSP). 2008. “Consultations and Transfers of Care Interoperability Specification.” HITSP/IS09. Released for Implementation. 20081218 V1.0. At www.hitsp.org.
[6] Healthcare Information Technology Standards Panel (HITSP). 2008. “Remote Monitoring Interoperability Specification.” HITSP/IS77. Released for Implementation. 20081218 V1.0. At www.hitsp.org.
[7] Healthcare Information Technology Standards Panel (HITSP). 2009. “Remote Monitoring Observation Document Component.” HITSP/C74. Released for Implementation. 20090708 V1.1. At www.hitsp.org.
[8] Health Level Seven (HL7). 2009. “Implementation Guide for CDA Release 2: CDA Framework for Questionnaire Assessments (Universal Realm) Draft Standard for Trial Use (DSTU) Release 1.0.” April 2009. At www.hl7.org.
[9] Health Level Seven (HL7). 2007. “Implementation Guide: CDA Release 2 – Continuity of Care Document (CCD).” April 01, 2007. At www.hl7.org.
[10] Health Level Seven (HL7). 2009. “Implementation Guide for CDA
Release 2.0 Personal Health Monitoring Report (PHMR) Draft
Standard for Trial Use (DSTU) Release 1.” At www.hl7.org.
[11] OMG SOA Consortium, CIO Magazine. 2008. “OMG SOA
Consortium and CIO Magazine Announce Winners of SOA Case Study
Competition.” At www.soa-consortium.org.
[12] Schindler, Esther. 2008. CIO Magazine. “Service-Oriented Architecture
Pays Off for Synovus Financial.” September 30, 2008. At www.cio.com.
[13] U.S. Dept. of Health and Human Services (HHS), Office of the
National Coordinator for Health Information Technology (ONC).
2008. “Long Term Care–Assessments. AHIC Extension/Gap.”
December 31, 2008. At www.hitsp.org.
Acknowledgements
Special thanks to Dr. Robert H. Dolin, co-chair of the HL7 Structured
Documents Committee, who assisted us in developing our initial CDA
models.
Author Biography
Kristina M. Kermanshahche is Chief Architect for Intel’s Digital Health
Group. Her research interests include SOA as applied to regional and
national healthcare architectures, CDM from home health to clinical trials,
and translational biomedical informatics. She was the Lead Architect for the
Intel SOA Expressway and contributed to the Intel Mobile Clinical Assistant
and the Intel Health Guide; she has a patent pending in healthcare semantic
interoperability. Her twenty-five year career in software development
spans nearly every industry, with an emphasis on complex distributed
systems, multithreading, high availability, networking, and database design.
She received her B.A. degree in 1988 from the University of Oregon in
Political Science and International Studies, magna cum laude. Her e-mail is
Kristina.M.Kermanshahche at intel.com
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Personal Health Device Interoperability
Contributors
Douglas P. Bogia
Intel Corporation
Rick Cnossen
Continua Health Alliance
Chris Gough
Intel Corporation
Abstract
Healthcare costs continue to spiral upwards to the point of prompting
national mandates for change in the whole healthcare system. Technology has
advanced to a point where personal telehealth systems provide viable, cost-effective solutions and represent a very real opportunity to help control costs.
In order for deployment to become widespread, an ecosystem of standards-based interoperable components (starting with the consumer-facing device) is
essential. In this article, we discuss the challenges related to creating such an
ecosystem and examine the efforts of the Continua Health Alliance to address
this need.
Index Words
Continua Health Alliance, Healthcare, Interoperability, Standardization, Communication, Cost Reduction
Introduction
Several factors have converged to provide new opportunities to address user
needs in the healthcare domain:
•• The dramatic rise in costs associated with established healthcare services.
•• A global increase in the number of older adults who are most likely to
require these services.
Technological advances, such as innovations in networking technologies and
a rapid increase in the number of Internet-connected devices, enable the
development of solutions that address user needs in a cost-effective manner.
These technologies also allow people to remain safely in their own homes for a
longer period of time. Personal telehealth systems composed of an ecosystem
of standards-based interoperable components are the building blocks of these
solutions. In this context, we define personal as systems that are used in a home,
and we define telehealth as the way in which communication technologies, such
as a phone line or broadband Internet access, are used in healthcare services.
In this article, we describe in more detail the underlying need for
interoperability between these components, the approach that was used by the
Continua Health Alliance [1] to define and design such a system, and some of
the key challenges that were encountered along the way.
104 | Personal Health Device Interoperability
Need for Device Interoperability
While there are many challenges associated with the successful design,
implementation, and deployment of personal telehealth systems, one of the
more obvious problems present in early telehealth solutions was a lack of device
interoperability or well-defined standards to which technology companies
could develop their products. In the next sections we discuss these issues in
more detail from both the integrator and purchaser perspective as well as from the
product designer perspective.
Integrator and Purchaser
Telehealth stakeholders, both system integrators (companies producing
telehealth solutions made up of components from a number of different
vendors) and system purchasers (healthcare providers that purchase these
solutions and offer them to their patients or members) require a wide variety
of system vendors and components to select from. These stakeholders produce
or consume, respectively, a large number of telehealth systems and need to
protect themselves from undesirable situations, such as a component vendor
going out of business or being unable to deliver the desired volume of system
components in a given timeframe. Furthermore, the presence of many
component vendors competing for business drives down costs, improves the
quality of solutions, and encourages adding differentiating features to these
solutions.
However, the ability for integrators or purchasers to select from a wide variety
of personal health devices offered by multiple vendors is only possible if the
devices are compatible with one another. This compatibility is enabled by well-defined communication interfaces between the various system components.
Product Designer
From the perspective of a product designer, device interoperability is also an
essential need. For instance, without interoperable standards, a system designed
to communicate with a wide range of telehealth peripherals (for example,
weight scales, blood pressure monitors, glucose meters) must implement
proprietary data modeling and communication protocols over a variety of
transport technologies, such as serial, Bluetooth*, and infrared.
Product designers need to create a device-specific software adapter such that
their system can communicate with each of the peripherals. This leads to
inefficiencies in terms of both software development and system validation.
Likewise, designers of peripherals must design data and communication
protocols. Since these are specific to their device, the implementation of device
driver(s) for target operating system(s), user applications with GUIs, and
documentation is necessary.
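The per-device adapter burden can be sketched as follows. The vendor names and frame formats are fabricated; the point is only that, absent interoperable standards, each proprietary device forces its own adapter, developed and validated separately:

```python
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """One adapter per proprietary device protocol. A host supporting
    N devices without interoperable standards needs N of these."""
    @abstractmethod
    def read_measurement(self) -> dict:
        ...

class AcmeScaleAdapter(DeviceAdapter):
    """Hypothetical vendor; parsing of its serial frames is stubbed out."""
    def read_measurement(self) -> dict:
        return {"type": "weight", "value": 80.5, "unit": "kg"}

class ZetaBpAdapter(DeviceAdapter):
    """Hypothetical vendor; parsing of its Bluetooth frames is stubbed out."""
    def read_measurement(self) -> dict:
        return {"type": "blood_pressure", "systolic": 120, "diastolic": 80}
```

Every new peripheral adds another subclass, plus its drivers, GUIs, and documentation, which is exactly the inefficiency a common standard removes.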
The necessary, interoperable solution minimizes cost, improves design and
development efficiencies, and enables separation of concerns for each of the
parties involved. Consequently, device vendors can focus on devices, software
vendors can focus on software development, and service providers can focus
on service delivery, without requiring each party to have intimate knowledge
of every component that makes up the overall solution. Such an interoperable
solution can be achieved by defining communication interfaces in an
unambiguous set of standards and guidelines.
Challenges to Standardization
There are many challenges and barriers to standardization of device interfaces
in the personal health domain. In this section we explore some of the more
difficult ones.
Confluence of Domains
The scope of the personal health domain is quite large and ranges from
regulated clinical disease management devices to unregulated consumer-grade
fitness devices. From an architectural and practical standpoint it is imperative
to converge on a single unified solution yet difficult to define a single solution
that is both efficient and optimum for all options in the spectrum.
Regulated Environment Versus Consumer Environment
Devices that are intended for a regulated environment (for example, US FDA Class II glucose meters used by a medical professional to treat diabetes) and
consumer grade devices (for example, a personal pedometer to track activity
levels) are starkly different. These differences show up in the requirements
(security levels, privacy restrictions, data reliability, and logging) as well as in
the level of testing required (stress testing, off-nominal conditions, etc.). These
differences make the decision-making process for interoperability standards
difficult as there are always tradeoffs.
Diversity of Applications
The current state of the personal health market is such that there are
independent vertical application solutions (for example, health and fitness,
disease management, and aging independently). The broad vision for an
interoperable solution is to eliminate these divisions in order to allow for
innovative solutions that span domains and create optimum value for the
consumer. For example, many consumers of disease-management applications
could benefit from an integrated application that accepts fitness data in order
to evaluate the effectiveness of a particular exercise against a particular disease.
The challenge is in creating a generic framing protocol that can be applied to
all application areas in order to normalize collected data. This can be difficult
as different domain applications have different approaches for some of the
common elements. For example, the timestamp granularity for disease-management applications tends to be higher than for aging-independently applications.
The Nascent Personal Health Industry
The personal health industry is relatively young and lacks mature standards-based solutions for interoperability. This necessitates developing new standards
or profiles in many cases instead of using existing standards that have been used
and validated over time. It takes a notoriously long time for the standards to
be created, accepted by the industry, augmented with tools, and understood by
engineers. Such a lengthy process runs counter to the desire to quickly define
solutions for industry build-out.
International Need
With the ubiquity of the Internet, common communication advances, multinational companies, and global markets, there is an opportunity and a strong
desire to create solutions that can be deployed internationally. International
deployment can capitalize on a company’s initial development investment
and increase overall revenue by expanding market opportunities. Within the international standards community, these solutions require geographic differences among standardization processes to be harmonized. Harmonization complicates and slows down the standardization process. For example, US-centric solutions tend to use US Standards Development Organizations (SDOs) such as IEEE, ANSI, and W3C, while EU-centric solutions tend to use EU SDOs, such as CEN, ETSI, and CENELEC. There are some fundamental
differences in approaches also. For example, the United States approach tends
to be industry-driven, while the European Union approach tends to be more
government-driven. These differing approaches impact the process and must
be reconciled. Additionally, there are regional and demographic differences
that must be overcome. For example, demographics data (address, gender) vary
among countries, and a common structure must be defined to allow for all
system instantiations.
Business Motivation
Despite the best intentions, there are many existing standards that are
incomplete or unused. The bottom line for many companies is that investing in
the development and implementation of standards must benefit them to some
degree economically. For new, nascent markets, such a return on investment
can be difficult to project and demonstrate, which, in turn, can lead to a dearth
of volunteers or to slow uptake in product build-out.
The Continua Health Alliance Solution
This section introduces the Continua Health Alliance and describes how it
addressed the challenges and issues described in the preceding sections.
Approach
The Continua Health Alliance is an open industry coalition of over 200
healthcare and technology companies focused on interoperability in the
personal healthcare domain. Founded in 2006, the Continua Health Alliance
addresses the unique combination of challenges just discussed. The Continua
Health Alliance leveraged examples from other solution domains such as home
networking (for example, Wi-Fi Alliance and Digital Living Network Alliance)
in order to help define its overall approach. The Continua Health Alliance
adopted the following methodology:
•• Select an existing, applicable set of industry standards.
•• Extend these selected standards where required to meet user needs identified
in use-cases and requirements.
•• Eliminate ambiguity in interpretation of these standards through a
collection of interoperability guidelines.
In the following sections we describe the process, architecture, and final
implementation that address the challenge of enabling an ecosystem of
interoperable, standards-based personal-healthcare devices.
Process
The Continua Health Alliance process begins with use-case collection and
refinement. After agreeing upon the use-cases, requirements are elaborated
upon and architecture is defined. Existing standards are then reviewed to
determine if a standard already exists that meets most, if not all, of the
requirements. If so, it is selected. If not, a new one is created. Next, the
Continua Health Alliance creates interoperability guidelines to define profiles
over the standards to serve as a basis for product certification. To assist
companies that make products adhering to the guidelines, the Continua
Health Alliance hosts interoperability plugfest events to ensure products from
multiple vendors work together. The Continua Health Alliance also establishes a certification and testing program that verifies product compliance with the guidelines; products that pass certification may use the Continua Health Alliance logo.
Use-Case Collection
With more than 200 companies in the Continua Health Alliance, converging
on a set of priority use-cases for each iteration can be a challenge.
To deal with this challenge, the Continua Health Alliance maintains a rigorous
process that includes the following steps:
•• Crisply define candidate use-cases by using a template that includes title,
date, version, description, scope, actors, preconditions, flow, and technical
feasibility.
•• Consolidate and generalize use-cases, segmenting work along market
boundaries in order to encourage participation in areas of expertise and
business interest. This eliminates redundancy and brings like-minded
companies together to truly describe a consolidated industry need.
•• Review use-cases to carefully constrain scope (for example, consider product
instantiations, resourcing, schedule) and appropriate focus (for example,
ensuring component interfaces are defined but do not overly address
product capabilities as the interfaces are opportunities for differentiation).
•• Include a review cycle with domain experts and consumers to ensure use-cases represent real and useful solutions.
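The template fields named in the first step map naturally onto a simple record type. The sketch below is our own illustration; the field types and defaults are assumptions, not part of the Continua template itself:

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """Mirrors the candidate use-case template fields named in the text:
    title, date, version, description, scope, actors, preconditions,
    flow, and technical feasibility."""
    title: str
    date: str
    version: str
    description: str
    scope: str
    actors: list[str] = field(default_factory=list)
    preconditions: list[str] = field(default_factory=list)
    flow: list[str] = field(default_factory=list)
    technical_feasibility: str = "unassessed"
```

Capturing candidates in one structured form is what makes the later consolidation and review steps tractable across 200+ member companies.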
Requirements Elaboration
Once the use-cases are complete, they are decomposed into system
requirements. This process entails analyzing the use-case to pull out
actors, actions, and assumptions and converting these into corresponding
requirements.
Standard Selection and Development
Once the requirements are defined, members of the Continua Health Alliance
search for existing SDOs and standards that best satisfy the requirements and
desired characteristics. In cases where there is no standard, the members must
select an SDO for hosting a new standard and apply resources to create the
new standard. The following questions are considered to evaluate the selected
standard:
•• Does the standard adequately address the requirements from the selected
use cases?
•• Does the SDO create international standards or a path to generate them?
•• Is the SDO open for participation from member companies?
•• If a standard exists, how well is it harmonized with related domain
standards? If standards do not exist, how open is the SDO to harmonizing
with other standard organizations?
•• How is the standard accessed and what are the control mechanisms for the
standard from the SDO?
•• What are the intellectual-property (IP) rights for standards from the SDO?
•• What are the costs pertaining to licensing, implementation, and access?
•• Does the SDO provide tool support for creating or testing implementations
of their standards?
•• What is the level of adoption and maturity of standards from the SDO?
Guidelines Development
Once the standards are available, Continua Health Alliance members compare
the standard against the identified system requirements to identify remaining
gaps. The Continua Health Alliance creates interoperability guidelines to
address these gaps, expand on the standards, minimize optionality, and
facilitate tight interoperability.
Test Plans and Tools
After standards and the interoperability guidelines are completed, the Continua
Health Alliance creates test assertions and procedures. The test procedures
verify that a product adheres to the appropriate standards and guidelines.
All procedures can be run manually; however, as many as possible are also
automated to reduce the level of product certification effort.
Certification Lab
When a product is completed, it is tested in a certification lab. The device
is run through various procedures to receive certification. The lab uses an
automated test tool, a system simulator, and mature certified reference devices
to verify interoperability. Once testing in the lab produces a satisfactory result,
a certification can be issued and the product can use the Continua Health
Alliance logo for marketing purposes.
Architecture
One of the key activities that goes hand-in-hand with the collection of
requirements discussed in the preceding Requirements Elaboration section
is the definition of the system architecture. The architecture defines the
subsystems that make up the overall system and describes how they interact.
Just as important, it provides a useful model for organizing the requirements
and offers guidance to engineers who look for applicable standards and
technologies during the subsequent implementation phase.
Early on in the process of designing the Continua Health Alliance Device
Architecture, there were two critical challenges that needed to be addressed:
•• Resolving differing viewpoints from a large number of member companies
and participants.
•• Describing the architecture in a manner that provided enough technical
guidance to contributors working on the follow-on implementation, but
could still be comprehended by non-technical stakeholders such as those in
marketing, and also by usability experts.
The first challenge was solved in large part by following a relatively simple
process: a comprehensive set of standard architecture principles was selected,
and referenced terms were defined (for example, extensibility, portability, time
to market, and scalability). The team came to the realization that the system
architecture could look quite different, depending on how these principles
were prioritized, so each member company furnished a prioritized list that
effectively served as their ballot. The lists were compiled, and the resulting
unified prioritized list of architecture principles served as a useful tool during
architecture sub-team meetings where various design alternatives and tradeoffs
were discussed.
The second challenge primarily arose from the fact that the team needed
to create a logical view of the system architecture composed of logical (or
functional) components, but it was much easier to describe concepts to a non-technical audience in terms of physical components (for example, a weight
scale or a personal computer). The solution in this case was to create a hybrid
view of the architecture that represents both logical and physical entities, as
shown in Figure 1.
In the following subsections we look more specifically at the architecture
depicted in Figure 1.
Personal Area Network Devices
A personal area network (PAN) device is a real-world physical device that a
person interacts with to obtain a measurement, for example. Examples of PAN
devices include blood glucose meters, weight scales, and pedometers. Note that
the term PAN device is synonymous with the term agent introduced later in the
Implementation section.
PAN Service Component
A PAN device must contain a PAN service component. This is a logical
component that offers one or more services over the PAN interface.
Figure 1: Continua Device Architecture
Source: Continua Health Alliance, 2009
Application Hosting Device
An application hosting device is a physical device that communicates with
one or more PAN devices to receive measurements, for example. Examples
of application hosting devices include purpose-built healthcare appliances,
personal computers, and cell phones. Note that the term application hosting
device is synonymous with the term manager introduced further on in the
Implementation section.
PAN Client Component
An application hosting device must contain a PAN client component. This is a
logical component that consumes one or more services over the PAN interface.
Service Virtualization Middleware
Service virtualization middleware is a logical component that normalizes
details associated with a number of PAN client components, such that control
applications can consume PAN services from a wide variety of devices in a
consistent fashion. For example, if a system has two PAN devices, a Bluetooth
blood-pressure monitor and a USB weight scale, control applications would
receive measurements through the unified interface offered by the virtualization
middleware without requiring detailed understanding of the specific transport
technologies involved.
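In code, such middleware amounts to a thin facade over transport-specific client components. Everything below (the class names, the `poll` method, the stubbed readings) is illustrative, not an actual Continua interface:

```python
class BluetoothBpClient:
    """Stand-in for a PAN client component over Bluetooth (stubbed)."""
    def poll(self) -> dict:
        return {"device": "blood_pressure_monitor", "systolic": 118}

class UsbScaleClient:
    """Stand-in for a PAN client component over USB (stubbed)."""
    def poll(self) -> dict:
        return {"device": "weight_scale", "kg": 72.0}

class ServiceVirtualizationMiddleware:
    """Presents one uniform interface over many PAN client components,
    so control applications never see transport-specific details."""
    def __init__(self, clients):
        self._clients = list(clients)

    def collect(self) -> list:
        """Gather measurements from every registered device."""
        return [client.poll() for client in self._clients]
```

A control application calls only `collect`, regardless of whether the underlying device speaks Bluetooth, USB, or something else.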
Control Applications
Control applications consume the interface implemented by the service
virtualization middleware to communicate with a wide variety of PAN service
components.
PAN Interface
The PAN interface is the standards-based communication interface between
PAN service components and PAN client components that enables device
interoperability in the Continua Health Alliance ecosystem.
Figure 2 shows the multiplicity of relationships between the various
components that make up the Continua device architecture. A PAN device
contains one or more PAN service components. An application hosting device
contains one or more PAN client components. There is a one-to-one
relationship between PAN service components and PAN client components.
There is a one-to-many relationship between an application hosting device and
a PAN device.
Figure 2: System Multiplicity
Source: Intel Corporation, 2009
We now move on to describe how the PAN interface was realized during the
implementation phase of the Continua Health Alliance development process.
Implementation
Multiple standards are utilized when implementing the PAN solution. Figure 3
depicts these standards and their relationships.
Figure 3 organizes the standards into layers, all governed by the Continua Design Guidelines:
•• Device Specialization Layer (higher-layer protocols): 04 – Pulse Oximeter; 07 – Blood Pressure; 08 – Thermometer; 15 – Weighing Scale; 17 – Glucose Meter; 19 – Insulin Pump; 21 – Peak Flow; and, in Phase III, 41 – Cardiovascular Fitness & Activity; 42 – Strength Fitness Equipment; 71 – Independent Living Activity Hub; 72 – Medication Monitor.
•• Optimized Data Exchange Protocol Layer: ISO/IEEE Std 11073-20601.
•• Transport Layer (lower-layer protocols): Bluetooth* Health Device Profile; USB Personal Healthcare Device; ZigBee* Personal Healthcare Device.
Figure 3: Standards for PAN Interface
Source: Continua Health Alliance, 2009
The higher-layer protocols are standardized by the International Organization
for Standardization (ISO) and the Institute of Electrical and Electronics
Engineers (IEEE). The higher-layer protocols consist of a common, optimized
data exchange protocol that provides a broad toolbox of interactions in support
of any type of device. The highest-layer protocols are specializations that tailor
the usage of the toolbox to specific device types (such as a weighing scale) in
order to maximize interoperability between the device sensing the health data
and the one receiving the data. Each of the device specializations is designated
ISO/IEEE Std 11073-104zz, where zz is the two-digit number in each of the
boxes in Figure 3.
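The naming scheme can be sketched as a simple lookup over the specialization numbers from Figure 3. The helper function itself is illustrative, not part of any standard.

```python
# Device-specialization numbers from Figure 3 mapped to their names.
SPECIALIZATIONS = {
    4: "Pulse Oximeter",
    7: "Blood Pressure",
    8: "Thermometer",
    15: "Weighing Scale",
    17: "Glucose Meter",
    19: "Insulin Pump",
    21: "Peak Flow",
}

def standard_name(zz):
    """Build the ISO/IEEE standard designation for a specialization number zz."""
    return "ISO/IEEE Std 11073-104%02d (%s)" % (zz, SPECIALIZATIONS[zz])
```

For example, `standard_name(7)` yields the blood-pressure specialization designation, ISO/IEEE Std 11073-10407.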
The lower transport layers are specified by other organizations, such as the
Universal Serial Bus (USB) Implementers Forum, the Bluetooth Special Interest
Group (SIG), and the ZigBee* Alliance. However, the IEEE Personal Health
Device (PHD) work group has worked closely with these organizations to
harmonize the application levels with the transport levels.
As a backdrop to all the standards, the Continua Health Alliance produces
design guidelines that further clarify and, if necessary, narrow optionality to
increase interoperability. Again, there is a close working relationship between
the Continua Health Alliance membership and all the standards organizations
and special interest groups.
Principles of Implementation
When initially creating the IEEE standards, the PHD work group surveyed
existing devices to understand the needs in the market.
One key goal was to support a broad range of device types that fit within one
or more of these categories: health and wellness, disease management, and
independent living. To do this, a common exchange protocol was deemed
necessary to allow a single manager (for example, a health appliance, set-top
box, personal computer, or cell phone) to accept data from a broad range
of device types, such as glucose meters, fitness equipment, and home sensors
(motion sensors, light sensors, and gas sensors). At the same time, since
the sensing agents often run on small batteries, efficient transmission of the
data is also paramount. IEEE Std 11073-20601-2008 balances the architecture
between these two goals, remaining flexible while keeping packet sizes
optimized.
In order to optimize data exchange, IEEE Std 11073-20601 defines three
different event-reporting formats that range from very flexible (but bigger)
to very compact (but less flexible). In the very compact forms, all of the
information about what the bytes mean is extracted and sent once during
configuration. This allows the event reports containing the health data to
carry primarily just the data, removing redundant bytes of metadata.
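A toy sketch of this configuration-time idea follows. The message layout here is invented for illustration; real 11073-20601 reports use a standardized binary encoding.

```python
import struct

# Sent once at configuration time: field names, order, and packing format.
# (Invented layout; real configurations are defined by the standard.)
CONFIG = {"fields": ["systolic_mmHg", "diastolic_mmHg", "pulse_bpm"],
          "pack_format": ">HHH"}  # three big-endian 16-bit values

def encode_report(values):
    """Agent side: emit only the measurement bytes, no per-report metadata."""
    return struct.pack(CONFIG["pack_format"],
                       *(values[f] for f in CONFIG["fields"]))

def decode_report(payload):
    """Manager side: recover field meanings from the stored configuration."""
    return dict(zip(CONFIG["fields"],
                    struct.unpack(CONFIG["pack_format"], payload)))

wire = encode_report({"systolic_mmHg": 118, "diastolic_mmHg": 76,
                      "pulse_bpm": 64})
# Six bytes on the wire instead of repeating names and types in every report.
```

Because the manager already holds `CONFIG`, every subsequent report is pure measurement data, which is what keeps the compact formats small.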
An overarching principle when creating the standards was that when there was
a choice between placing a feature on the sensing agent or on the manager, the
feature should be placed on the manager. This choice helps to keep the cost of
building agents as low as possible. Therefore, in general, the agent is “in charge”
and determines when it has data, when it wants to transmit those data, and
what functionality it selects to expose. The manager must adapt to the agent.
Due to the range of device types, the standards need to support three different
styles of data delivery:
•• Episodic. Data are delivered based on an action, such as stepping on a scale.
•• Periodic. The device sends data at a known frequency, such as updating the blood oxygen saturation every second.
•• Store and forward. The device collects data while not connected to the manager; the data can then be uploaded on a later connection, such as after a person walks with a pedometer for an exercise session.
Thus, there is a range of tools in IEEE 11073-20601 for supporting these modes of data transfer.
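The store-and-forward style, for example, can be sketched as follows; class and method names are invented for illustration.

```python
# Minimal sketch of store-and-forward delivery: readings accumulate while the
# agent is offline and are flushed in one batch on the next connection.

class Pedometer:
    def __init__(self):
        self._stored = []       # readings held on the device
        self.connected = False
    def record(self, steps):
        # Runs even with no manager in range.
        self._stored.append(steps)
    def connect_and_upload(self):
        # On connection, deliver the whole stored batch and clear the store.
        self.connected = True
        batch, self._stored = self._stored, []
        return batch

p = Pedometer()
p.record(100)
p.record(250)
uploaded = p.connect_and_upload()
```

Episodic and periodic delivery differ only in what triggers the send (a user action versus a timer); the same agent-side pattern applies.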
Our design goal from the outset was to keep the higher-layer protocol
independent from the transport layer. There are several reasons for this
independence. First, we recognized that some use-cases were best supported
with a wired transport, and others were better supported with a wireless
transport. Second, transport technology continues to advance, so it is
advantageous to adopt new transports as they become available. Third, by
utilizing a common application layer, the transport details can be abstracted
deep within a software stack, allowing most of the upper portions of software
to be re-used, regardless of which transport is used.
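This layering can be sketched as follows; the transport classes are illustrative stand-ins, not real USB or Bluetooth stacks.

```python
# Sketch of the transport-independence principle: the application layer talks
# to an abstract transport, so upper-layer code is reused unchanged whether
# frames travel over USB, Bluetooth, or ZigBee.

class Transport:
    def send(self, frame: bytes):
        raise NotImplementedError

class LoopbackUsb(Transport):
    def __init__(self):
        self.sent = []
    def send(self, frame):
        self.sent.append(b"USB:" + frame)   # stand-in for a USB PHDC stack

class LoopbackBluetooth(Transport):
    def __init__(self):
        self.sent = []
    def send(self, frame):
        self.sent.append(b"BT:" + frame)    # stand-in for a Bluetooth HDP stack

class ApplicationLayer:
    """Upper layer: identical code regardless of the transport plugged in."""
    def __init__(self, transport):
        self.transport = transport
    def send_measurement(self, payload: bytes):
        self.transport.send(payload)

usb_app = ApplicationLayer(LoopbackUsb())
bt_app = ApplicationLayer(LoopbackBluetooth())
usb_app.send_measurement(b"\x01\x02")
bt_app.send_measurement(b"\x01\x02")
```

Swapping transports changes only the object handed to `ApplicationLayer`, which mirrors how the transport details are abstracted deep within the software stack.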
Another big difference between the PHD standards and other healthcare
standards is that PHD standards focus on personal health in the home and
mobile environments. Such a focus allows for a reduction in complexity. For
instance, when blood pressure is measured in the home, there is a limited set
of ways it is measured (for example, it is measured non-invasively on the arm
or wrist); whereas, in a clinical setting, there are additional ways it could be
measured (for example, invasively across heart valves or by measuring umbilical
pressure). By focusing on the needs of home health devices only, the standards
process is less complex, and implementations are easier to build, test, and make
interoperable. In addition, solutions can meet the price points required by the
consumer market.
Common, Optimized Data-Exchange Protocol
Within IEEE Std 11073-20601 there are five major elements, as shown in
Figure 4. We describe these elements in detail in the following sections.
1. Abstract Syntax Notation One (ASN.1). The language used in the abstract
modeling of data and interactions.
2. Nomenclature. Provides binary codes for terminology.
3. Domain Information Model (DIM). Describes the device and physiological
data.
4. Service Model. Defines interactions with the device and data.
5. Communication Model. Manages the connection state machine and
communication characteristics.
Abstract Syntax Notation
Abstract Syntax Notation One (ASN.1) is an existing standard for describing
concepts in an abstract notation that is independent of any underlying
implementation language.
By using ASN.1 to describe the modeling of the Domain Information and
Service Models, it is possible to generate code automatically for different
computer languages as well as code that converts from internal data structures
into different transmission representations (such as binary or XML).
Figure 4: Components of IEEE Std 11073-20601
Source: IEEE, 2008
Nomenclature
The nomenclature represents binary values that are used to describe,
unambiguously, the concepts being modeled. For example, there are separate
codes for pulse rate, heart rate, systolic blood pressure, and diastolic blood
pressure. By defining specific codes for all the concepts being modeled, two
devices are able to exchange information by using a common language.
By using binary codes rather than strings, the nomenclature also aids in
supporting localization efforts so the concepts can be represented to the
operator in the local language.
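The localization benefit can be sketched as follows. The numeric code values below are invented placeholders, not actual 11073 nomenclature assignments.

```python
# Devices exchange numeric nomenclature codes; each UI maps codes to its own
# language. Code values here are placeholders for illustration only.

HEART_RATE, SYSTOLIC_BP = 0x4182, 0x4A05   # placeholder binary codes

LABELS = {
    "en": {HEART_RATE: "Heart rate",
           SYSTOLIC_BP: "Systolic blood pressure"},
    "de": {HEART_RATE: "Herzfrequenz",
           SYSTOLIC_BP: "Systolischer Blutdruck"},
}

def localize(code, language):
    """The wire carries only the code; the display language is a local choice."""
    return LABELS[language][code]
```

Because the protocol never carries language-specific strings, adding a new display language touches only the local label table, not the devices.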
The Domain Information Model
The Domain Information Model (DIM) contains objects that represent
device-level information (for example, battery status, current time, model and
serial number), physiological information (for example, weight, blood pressure,
number of steps), and environmental information (for example, gas or water
left on, house too cold or too hot). The DIM uses object-oriented concepts
in specifying the model, which eases the maintenance and modeling effort
during standard creation. There is no requirement to implement the model in
an object-oriented language; in fact, there are product implementations built
using Standard C.
The object classes that are available are generic in nature in order to support
a wide range of usages. There are currently classes of objects to describe the
device itself (for example, battery or wall-powered device), numeric objects (for
example, physiological measurements such as temperature), real-time sample
arrays (for example, usually for waveforms such as an ECG), and enumerations
(for example, at this time the door was opened). There are also classes that
support the storing and forwarding of measurements, as well as scanner classes
that assist in compacting data streams.
Each class defines the object type, the standardized attributes (variables)
available in the class, and the actions (methods) that can be executed for
objects of the class. The agent represents its data by creating one or more
objects and then communicating the starting values of the attributes. As data
changes, the agent sends event reports to the manager to inform it of the
changes.
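The agent-side pattern described above, in which objects hold attribute values and event reports announce changes, can be sketched as follows. All names, including the unit code, are invented for illustration.

```python
# Sketch of the DIM pattern: an object's attributes hold current values, and
# attribute changes are pushed to the manager as event reports.

class NumericObject:
    """Stand-in for a numeric measurement object (e.g., weight)."""
    def __init__(self, handle, unit_code, value):
        self.handle = handle
        self.unit_code = unit_code   # placeholder, not a real 11073 unit code
        self.value = value

class Agent:
    def __init__(self):
        self.objects = {}   # handle -> object
        self.outbox = []    # event reports queued for the manager
    def add_object(self, obj):
        self.objects[obj.handle] = obj
    def update(self, handle, value):
        # Changing an attribute triggers an event report to the manager.
        self.objects[handle].value = value
        self.outbox.append({"event": "attribute-changed",
                            "handle": handle, "value": value})

agent = Agent()
agent.add_object(NumericObject(handle=1, unit_code=0, value=70.0))
agent.update(1, 71.2)
```

The manager learns the starting attribute values at configuration time and thereafter tracks the agent purely through such event reports.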
Attributes of an object are set by the agent to represent the current conditions
(for example, there is an observed value attribute that defines the person’s
weight and another that describes the unit-code for pounds or kilograms).
Attributes have qualifiers in the standard that describe whether the attribute
is a) mandatory, that is, it must be present; b) conditional, that is, mandatory
if the condition is met and otherwise optional; or c) optional, that is, the agent
implementers determine whether they want the attribute in their product.
Furthermore, attributes beyond those that are standardized can be added to an
object, allowing vendors to model other items to differentiate their product, if
desired.
Service Model
The service model describes the services that are available for communicating
data to and from the agent. There are three categories of services:
•• Event reporting. This is used by the agent to send updates to the manager
about its configuration or any new measurements that are available.
•• Object access. This is used by the manager to get, set, or invoke actions on
the agent.
•• Association. This is used by agent and manager to create an association
with one another so they are synchronized on their operational and
communication state.
Communication Model
The communication model is the abstraction between the upper application
layers and the transport layers. The key items that are described in the
communication model include the following:
•• Communication characteristics. While it would be nice to be agnostic to the
underlying transport, the application-level protocol places certain requirements
on transports. These are described to make it easy to assess whether a transport
can work natively or whether a shim is required to make the transport
conformant.
•• Connection state machine. This describes the states that a manager and agent
move through during an association. The agent’s state diagram is shown in
Figure 5.
•• Legal interactions in each state. These describe the actions that are and are
not allowed while in each state of the state chart.
•• Conversion service. This describes the method required to convert from
the ASN.1 modeling into a transmission format encoding. The standard
supports three different styles of encoding, but it mandates that the most
basic one shall always be present to ensure the agent and manager can
interoperate at an encoding level.
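A much-simplified sketch of the happy-path transitions through the agent states of Figure 5 follows. Event and state names are paraphrased, and abort and release handling is omitted.

```python
# Simplified agent connection state machine (happy path only). Illegal events
# raise, mirroring the "legal interactions in each state" idea above.

TRANSITIONS = {
    ("disconnected", "transport-connect"): "unassociated",
    ("unassociated", "tx-assoc-req"): "associating",
    ("associating", "rx-assoc-rsp-accepted"): "operating",
    ("associating", "rx-assoc-rsp-accepted-unknown-config"): "configuring",
    ("configuring", "rx-config-rsp-accepted"): "operating",
    ("operating", "transport-disconnect"): "disconnected",
}

def step(state, event):
    """Return the next state, or raise if the event is illegal in this state."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError("event %r not allowed in state %r" % (event, state))

# Walk a new agent from disconnected through configuration to operating.
s = "disconnected"
for e in ("transport-connect", "tx-assoc-req",
          "rx-assoc-rsp-accepted-unknown-config", "rx-config-rsp-accepted"):
    s = step(s, e)
```

The table-driven form makes the legal-interactions rule explicit: an event is permitted in a state only if the pair appears in the table.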
Figure 5 depicts two top-level agent states, Disconnected and Connected. The Connected state is subdivided into Unassociated, Associating, Associated (which contains the Configuring substate, itself comprising Waiting Approval and Sending Config, and the Operating substate), and Disassociating. Transitions are driven by transport connect and disconnect indications and by association request, response, release, and abort messages; for example, RxAssocRsp (accepted) moves the agent from Associating to Operating, and RxConfigEventReportRsp (accepted-config) moves it from Configuring to Operating.
Figure 5: Agent State Diagram
Source: IEEE, 2008
Example Device Specialization (Glucose Meter)
Figure 6 shows the glucose meter modeling. To be conformant, the only
mandatory objects are the glucose reading and the object representing the
device. However, many glucose meter models available today allow the user
to enter additional contextual data that helps when interpreting a reading.
For example, did the person exercise? Did the person eat a meal recently? Was
medication taken near the time of the reading? The optional objects of the glucose meter
standard allow a vendor to model this type of context data. Finally, there is a
persistent metric store that allows the device to store readings for a longer time,
such as the time between doctor visits.
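This conformance rule can be sketched as a set check. The validation helper is invented; the object names follow Figure 6.

```python
# Only the glucose Numeric object and the device (MDS) object are mandatory;
# context objects are optional. Object names follow Figure 6.

MANDATORY = {"Glucose Meter: MDS", "Glucose: Numeric"}
OPTIONAL = {"ContextExercise: Numeric", "ContextMeal: Enumeration",
            "Observations: PM-Store"}   # subset of Figure 6, for illustration

def is_conformant(objects):
    """A model is conformant if it includes every mandatory object."""
    return MANDATORY <= set(objects)

minimal = is_conformant({"Glucose Meter: MDS", "Glucose: Numeric"})
rich = is_conformant({"Glucose Meter: MDS", "Glucose: Numeric",
                      "ContextMeal: Enumeration"})
missing = is_conformant({"Glucose Meter: MDS"})
```

A bare-bones meter and a context-rich meter both pass; a model lacking the glucose reading does not.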
Figure 6 depicts the PHD glucose meter object instances: the Glucose Meter MDS object contains one mandatory Glucose Numeric object, an optional Observations PM-Store object, optional Numeric context objects (ContextExercise, ContextDiet, ContextMedication, ContextHbA1C), and optional Enumeration objects (ContextHealth, ContextMeal, ContextSampleLocation, ContextTester, EventDeviceErr), each optional object with 0…1 multiplicity.
Figure 6: Glucose Meter Modeling
Source: IEEE, 2009
Transport
The Continua architecture separates the transport from the protocol, data, and
nomenclature. This abstraction allows for various transports to be leveraged
as usage requires without impacting the upper-layer applications. As of this
writing, there are three different transport technologies available and the
specification used for each one is explained next.
USB
If a wired transport is desired, then the USB Personal Healthcare Device
Class (PHDC) specification has been selected. This specification takes into
consideration personal health wired requirements and defines a mechanism
to exchange data between devices over USB. The USB Implementers Forum
has an automated test tool in place to provide USB PHDC transport
certification of personal health devices.
Bluetooth
If a wireless transport is desired for point-to-point personal area network
connectivity, then the Bluetooth Health Device Profile specification has been
selected. This specification takes into consideration personal health wireless
PAN requirements and defines a mechanism to exchange data between devices
over Bluetooth. The Bluetooth SIG has an automated test tool in place to
support Bluetooth transport certification of personal health devices.
If a low-power version is necessary (for example, extreme sensitivity to battery
life), then the Bluetooth Low Energy (LE) specification has been selected.
This technology is anticipated to be combined with classic Bluetooth on the
aggregation device in order to accept input from either HDP or LE.
ZigBee
If a low-power wireless transport is desired for local area network
connectivity, then the ZigBee Health Profile specification has been selected.
This specification takes into consideration personal health wireless LAN
requirements and defines a mechanism to exchange data between devices over
ZigBee. The ZigBee Alliance has a tool in place to support ZigBee transport
certification of personal health devices.
Lessons Learned
During the development of the Continua Health Alliance over the past three
years to enable the personal health market, several key lessons have been
learned. We discuss the most important ones here.
•• Start out with market-viable use-cases and requirements. It is easy to let
technology lead the way for a particular use-case, but it is imperative to
ensure that there is broad market support (i.e., more than one company)
and that the purchasers and domain experts are involved in the definition
process such that what is eventually delivered is relevant.
•• Carefully constrain scope and ensure appropriate focus. As the number of
companies in an organization grows, so does the appetite for new, complex
capabilities, and it is easy to become distracted and lost. Since there are limited
resources, it is important that the highest-priority capabilities are identified
and that they become the main focus of the organization.
•• Agree on the philosophical positions of the organization ahead of time in
order to avoid confusion. These include choices such as whether to profile
by developing guidelines or to try to augment the standards, and whether
to focus on simplicity (basic features) instead of complexity (rich features).
Conclusion
As a result of the potential benefits of home-based healthcare, a growing
number of companies have focused their efforts on developing telehealth
solutions in the personal healthcare domain. While there are a number of
challenges that must be addressed for this endeavor to be successful on a large
scale, excellent progress has been made toward the removal of one key obstacle:
personal health device interoperability.
The efforts of the Continua Health Alliance and standard development
organizations, such as the USB IF, Bluetooth SIG, and IEEE, have culminated
in a set of standards that specify the manner in which personal health devices
communicate as well as in design guidelines that profile these standards,
thereby removing ambiguity and ensuring true interoperability.
Armed with these design guidelines and standards, device vendors, software
developers, service providers, integrators, and purchasers are now in the
position to focus on their respective areas of expertise and collectively deliver
on the vision of home healthcare delivery based on systems composed of
interoperable personal health devices.
Reference
[1] Continua Health Alliance. Available at www.continuaalliance.org.
Author Biographies
Doug Bogia is a Standards Architect in Intel’s Digital Health Group. In 1995,
Doug received his Ph.D. from the University of Illinois, Urbana-Champaign
in Computer Supported Collaborative Work. Since joining Intel in 1995,
Doug has held a number of positions ranging from implementing collaborative
business and personal products, to implementing small business support
services, to creating telecommunication products. During 2002, he assisted in
creating the Advanced Telecom Computing Architecture standard enabling the
telecommunications industry to begin interoperable implementations in 2003.
In 2005, he shifted focus to the healthcare industry. Doug led the formation of
the ISO/IEEE 11073 Personal Health Devices Work Group where he currently
serves as the Work Group Chair. He is also an active participant in the
Continua Health Alliance, Bluetooth Medical Devices Work Group, and USB
Personal Health Device Profile. His e-mail is Douglas.P.Bogia at intel.com.
Rick Cnossen is the President and Chairman of the Board for the Continua
Health Alliance. Continua is a large, international, cross-industry consortium,
focused on the establishment of an ecosystem of interoperable, personal
telehealth systems. He is also a Director of Personal Health Enabling for
Intel’s Digital Health Group with a focus on medical device interoperability
standards. Additionally, he serves on the HITSP Board of Directors and is a
participating member of the ISO/IEEE 11073 Personal Health Data Work
Group, the HITSP Consumer Perspectives Technical Committee, the Health
Level Seven (HL7) Structured Document Technical Committee, the USB
Personal Health Device WG, and the Bluetooth Medical Device Profile WG.
Mr. Cnossen holds an MSCS degree from the University of Southern
California (USC). He worked at McDonnell Douglas (Boeing) for 15 years as
a Software Engineering Manager on a variety of aerospace and defense projects
and at Intel Corporation for 9 years as an Engineering Manager and Director.
During this time, he has been involved in many industry technology and
standards development efforts including the Advanced Telecommunications
Architecture (ATCA) manageability specification, Tactical Defense data
collection and analysis, Hierarchical Data Assimilation and Storage, and High
Performance Computing. His e-mail is rick.a.cnossen at intel.com.
Chris Gough is an Enterprise Architect with the Product Development team
in Intel’s Digital Health Group. Prior to his focus on product development,
Chris was closely involved with the Continua Health Alliance Technical Work
Group and had a key role in developing the Continua system architecture.
Chris joined Intel in 1991 and, prior to his tenure with the Digital Health
Group, has held various technical positions in the Digital Enterprise Group
focusing on motherboard design applications and validation tools, emerging
networking technologies, and systems architecture. Chris holds a bachelor's
degree in Computer Science and Engineering from the University of California
at San Diego. His e-mail is chris.s.gough at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
A Common Personal Health Research Platform — SHIMMER™
and BioMOBIUS™
Contributors
Michael J. McGrath, Ph.D.
Intel Corporation
Terrance J. Dishongh, Ph.D.
Intel Corporation
Index Words
Research Platform
BioMOBIUS™
SHIMMER™
Sensing
Extensible
Flexible
Abstract
As the cost of healthcare continues to rise and the aging population in the
Western world continues to grow, the pressure on direct clinical resources is
increasing at an alarming rate. The health of people around the world is
worsening, and the global population is getting older.
With this in mind, the goal of SHIMMER™ (Sensing Health with Intelligence,
Modularity, Mobility and Experimental Reusability) systems is to provide an
extremely compact, extensible platform for body-worn or ambient sensing
applications, based on connected or disconnected data strategies, with proven
system building blocks. Additionally, to aid in the analysis of the data coming
from SHIMMER, it is fully integrated with the software environment of the
BioMOBIUS™ research platform. The combination of the BioMOBIUS and
SHIMMER platforms gives researchers an extensible and flexible system that
can be utilized to rapidly develop prototype research tools that acquire data in
a reliable and timely manner. In this article, we outline how the combined
BioMOBIUS and SHIMMER platforms can be used to develop sophisticated
research tools.
Introduction
Biomedical and clinical research and development projects have a growing
need for technology solutions that are highly flexible, extensible, easy to use,
and provide a comprehensive range of capabilities. The availability of such
resources allows researchers to focus on the R instead of the D in R&D during
the lifecycle of their research project. The technology development overhead
that many biomedical researchers must address can detrimentally impact the
pace and scope of the research projects. This problem has been compounded
in recent years due to the growing interest in moving from the laboratory to
data collection and observation in home environments. A variety of applications
have been reported in the literature including physiological monitoring [1-3],
physical rehabilitation [4], activities of daily living (ADL) monitoring [5, 6],
falls detection [7, 8], cognitive function [9], and social engagement [10, 11].
These applications typically comprise hardware and software components
interconnected to form customized applications to address the needs of a
specific research hypothesis.
122 | A Common Personal Health Research Platform — SHIMMER™ and BioMOBIUS™
Many biomedical applications utilize wireless sensing capabilities in the form
of body-worn sensing or non-contact sensing to acquire data of interest and
to transmit them to an aggregation device. Fusion of both body-worn and
ambient sensors has been reported to improve accuracy of inference based on
activity or behavior identification [12, 13]. Typically, multiple parameters are
of interest including physiological [14, 15], kinematic [16, 17], ambient [18,
19], and environmental measurements [20].
To address the need for multisensing capabilities while minimizing the
complexity of the hardware and software components, platform-based
approaches have emerged where the sensing capability of the sensors can
be modified by changing the sensing element, normally in the form of a
daughterboard that connects to a common baseboard that provides the
computational and communications capabilities. Chen et al [2] report a
sensor node platform for wireless biomedical sensing. They utilize a flexible
expansion connector that supports additional daughterboards, including ECG
with TinyOS* firmware. Dubois-Ferrière et al [21] describe the TinyNode*,
which features two types of add-on boards. The Standard Extension Board
(SEB) includes footprints for two optional sensors: a relative humidity
and temperature sensor and a photodiode light sensor. The MamaBoard
features a variety of external communications options, including LAN, WLAN,
and GPRS, with support for data storage via an SD card. The platform
also features an XE1205* from Semtech with a reported four- to eight-fold
communications range performance improvement over the MICA2 and
TelosB sensor nodes. Nokia has reported a wearable sensor platform called the
Nokia Wrist-Attached Sensor Platform* (NWSP) that is based on a highly
flexible Field Programmable Gate Array (FPGA) [22]. The platform features
an accelerometer, a gyroscope, and magnetometer sensing capabilities in a
wristwatch-like form-factor.
For a body-worn sensor platform, size is a key consideration. IMEC's
Human++ research program has developed a highly miniaturized and
autonomous sensor system for body sensor network applications [23]. The
platform combines wireless ultra-low power communications, 3D integration,
MEMS energy scavenging techniques, and low-power design techniques.
Two-channel Electroencephalography (EEG), two-channel Electrooculography
(EOG), and one-channel Electromyography (EMG) for sleep-monitoring
applications and wireless ECG monitoring have been demonstrated [24].
From a software perspective, platforms for wireless sensor networks have been
reported in the literature [14, 25, 26]. Walker et al describe a Java* and Java
Agent based Development (JADE) platform; it is designed to provide sensor
network integration and application development capabilities [27]. Other Java
platforms include SunSPOT* from Sun Microsystems [28].
The Arduino* open-source electronics prototyping platform has become
popular (www.arduino.cc), especially for undergraduate teaching purposes. A
number of research applications have been reported, including e-Textiles*
based on integrated accelerometers for motion capture, and human motion
capture combined with real-time tactile feedback in response to specific
postures [29, 30].
A number of options exist for high-level application program development,
each with associated benefits and difficulties. Typically there is a choice among
traditional syntactically oriented software languages (such as C/C++, Java,
Python*, etc.); bespoke software development environments (SDKs), e.g.,
Eclipse*; and graphical development environments (GDEs). Both the C-type
programming approach and, to a lesser extent, the SDK approach are complex and
time-consuming. Firstly, they often result in applications that are harder
to debug, integrate, and evolve. Secondly, they restrict researcher access to
the development process. Graphical programming approaches can deliver a
common interface to the sensor/network with which all users will be able to
interact. These interfaces generally take the form of drag and drop development
environments where functional blocks are selected from a palette of tools
and connected on a drawing area to describe data flow through a system or
application. The environment can cater for users with widely varying levels
of technical expertise by enabling design and development work to be carried
out at several levels of abstraction. Using a GDE, a developer can build a
working system graphically by dragging and dropping blocks onto a diagram
(resembling a data-flow diagram). Typically, each block features a number of
input and output pins, with connections made between blocks by clicking and
dragging from a pin on one block to a suitable pin on another block.
LabVIEW* from National Instruments has been popular for many years,
particularly for applications that require data acquisition via an analog-to-digital
converter (ADC) or device actuation via a digital-to-analog converter
(DAC). LabVIEW provides a GDE with a drag and drop capability for
functional blocks wired together to form a functional application. It has
been used for a variety of biomedical-type applications including interfacing
pulse oximeters [31], ECG signal analysis [32], tri-axial accelerometers and
galvanic skin response (GSR) [33], and gyroscopes for gait analysis [34]. Other
graphical programming environments that have been developed for biomedical
applications include Scicos, which can be utilized for signal processing,
systems control, and the study of physical and biological systems [35]. The
BioEra* environment provides a graphical block-based environment that can
process dozens of channels simultaneously in real-time from various devices,
including EEG and heart rate monitors [36]. BrainBay* is a bio- and neurofeedback environment, again based on a block-type development environment
designed to work with the OpenEEG*-hardware platform [37]. Simulink*
from MathWorks provides an interactive graphical environment that has been
applied to a variety of sensor simulation applications [38, 39].
In order to address the issues associated with technology development such
as rapid prototyping, reuse, extensibility, distribution, etc., the TRIL Centre
[40] has developed a modular, extensible, and reusable technology approach
based on an open research platform concept called BioMOBIUS. The TRIL
Centre is a collaboration between Intel Ireland, University College Dublin,
Trinity College Dublin, and the National University of Ireland at Galway, with
support from IDA Ireland. TRIL grounds its research in ethnography and
clinical efficacy, with a common set of research tools based on BioMOBIUS,
and takes the research out of the lab into the homes of older people.
BioMOBIUS enables TRIL researchers to carry out their research effectively
and efficiently and to share their research solutions with minimal logistical
overhead. A modular, abstracted approach reduces the
learning curve and prompts rapid application prototyping and reuse, thereby
lowering development costs and reducing the barriers to technology adoption.
The software components of BioMOBIUS are made freely available to the
biomedical research community by the TRIL Centre.
The BioMOBIUS research platform, comprising both low-level and
high-level software development environments, is fully integrated with the
SHIMMER sensor platform [41, 42]. BioMOBIUS also supports a number
of other third-party sensors and off-the-shelf devices. This integration of
software and hardware provides a unique set of capabilities that enables
developers, engineers, and researchers to rapidly develop research applications
to investigate a variety of research hypotheses and to rapidly modify the
applications based on the evolving needs of the researcher and end user. The
key features of the platform to enable such capabilities are as follows:
•• Ease of hardware integration.
•• Software component reuse.
•• Highly extensible both from a hardware and software perspective.
•• High quality user interfaces (UI).
•• Supports data acquisition rates appropriate for kinematic and physiological
data capture.
•• Provides real-time data processing and data presentation.
•• Data persistence to file and database.
•• Stability and reliability.
•• Supports rapid application prototyping, i.e., short development times on
the order of one to two weeks.
A typical BioMOBIUS application includes SHIMMER sensors, processing
functionality, and a UI. The sensors monitor biomedical indicators such as
gait stability, and the processing functionality converts the sensor data into
meaningful information. The UI allows clinicians to view the information and
adjust the application settings or it allows the home-based participant to view
feedback information on performance; e.g., attention level measurement via
GSR.
BioMOBIUS Software Environment
Drag and drop environments ideally support the concept of rapid application
prototyping through the development of applications, by selecting blocks from
a palette of capabilities categorized by function, such as input/output, data
management, digital signal processing (DSP), etc. Sophisticated applications
can be quickly assembled, including hardware interfacing and real-time data
processing, by using the functional blocks that are connected in sequence to
represent the real-time data flow.
EyesWeb*, developed by the University of Genoa, was selected as the drag
and drop GDE [43, 44]. Originally focused on visual arts applications,
EyesWeb provides fusion of motion and music for emotion capture during
artistic performance [45]. It is also utilized to develop applications based on
multimedia techniques for therapy and rehabilitation; e.g., Parkinson’s, Autism
[46, 47]. The EyesWeb software platform was originally developed in the EU
5th Framework Programme project Multisensory Expressive Gesture
Applications (MEGA, see www.megaproject.org) and in the 6th Framework
Programme project Tangible Acoustic Interfaces for Computer Human
Interaction (TAI-CHI). The EyesWeb environment has been adapted and
enhanced by TRIL for biomedical-oriented applications, as shown in Figure 1.
Figure 1: BioMOBIUS Architecture
Source: The TRIL Centre (www.trilcentre.org)
The EyesWeb environment comprises a GDE that supports drag and drop
application development in the form of patches. BioMOBIUS provides a
Graphical User Interface (GUI) development tool that enables the user
to design an application to control a patch and view its output. The GUI
Designer is built by using the JUCE* C++ class library [48]. This is a
cross-platform, high-level graphical application program interface (API) and
provides the potential to develop BioMOBIUS applications on non-Windows*
platforms.
Use of the GUI development tool allows the user to be abstracted away from
the underlying complexities of the connected patch. The resultant application
is called a GUI application. The runtime environment within BioMOBIUS
comprises two key components:
•• Kernel Runtime Server. The EyesWeb Kernel Runtime is a Windows
component that supports the execution of EyesWeb patches outside of the
EyesWeb GDE. It is used in conjunction with the BioMOBIUS Runtime
to execute GUI applications and patches as if they are native Windows
applications.
•• BioMOBIUS GUI Runtime. This is a Windows console application that
supports the execution of a BioMOBIUS application outside of the GUI
development tool. Typically, a BioMOBIUS application is first developed
and tested within the GUI development tool. Thereafter, the BioMOBIUS
Runtime is used to run the application as a Windows executable in
conjunction with the EyesWeb Kernel Runtime component, via a TCP/IP
connection that supports execution of the patch and GUI on two physically
separated networked systems, if required.
Both patches and GUI applications are defined in XML and interpreted by
their respective runtimes. A number of benefits are achieved by separating these
two frameworks:
•• Runtime efficiency. Applications consume fewer system resources; for example,
output and display rendering is automatically disabled in the runtime
environment.
•• Separation of duty. Patches and UI applications may be developed in
parallel.
•• Distribution. An application and its patch may execute on separate
platforms across a network.
•• Extensibility. Other graphical development environments may be used to
develop feature-rich applications, such as FLASH-based interfaces.
Blocks
Blocks, as shown in Figure 2, are the fundamental building units of the
environment; they are dragged and dropped onto a patch from a palette of
blocks grouped by functionality. A block is coded as a C++ class with inherited
attributes and methods. The number of inputs and outputs for each block is
determined by the developer during the development process. Inputs and
outputs are referred to as pins. There are two categories of inputs:
•• Main inputs (the block receives its primary data for processing).
•• Parameter inputs (configuration details that determine block execution).
Figure 2: BioMOBIUS block with input and output pins
Source: The TRIL Centre (www.trilcentre.org)
Blocks are executed in two ways at runtime. Active blocks initiate their own
execution by requesting that the kernel schedule them; execution is triggered
periodically by kernel polling, i.e., a call-back mechanism. The second type of
execution is used by reactive blocks. These blocks are executed by the kernel
when data arrive at their main inputs. The kernel is alerted to the availability
of these data by a system flag in the block settings.
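The two execution models can be sketched as follows; the class and method names here are hypothetical illustrations, not the actual EyesWeb SDK API:

```python
# Minimal sketch of active (polled) vs. reactive (data-driven) block execution.
# Names are illustrative only; they do not reflect the real EyesWeb kernel API.

class Kernel:
    def __init__(self):
        self.polled = []   # active blocks scheduled for periodic polling

    def register_active(self, block):
        # an active block asks the kernel to schedule it (call-back mechanism)
        self.polled.append(block)

    def tick(self):
        # one polling cycle: execute every registered active block
        for block in self.polled:
            block.execute()

    def deliver(self, block, data):
        # a reactive block runs as soon as data arrive at its main input
        block.main_input = data
        if block.reactive:
            block.execute()

class Block:
    def __init__(self, reactive=False):
        self.reactive = reactive
        self.main_input = None
        self.output = None

class Doubler(Block):
    def execute(self):
        if self.main_input is not None:
            self.output = self.main_input * 2

kernel = Kernel()
b = Doubler(reactive=True)
kernel.deliver(b, 21)
print(b.output)  # -> 42
```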
New blocks can easily be developed for the GDE by using the EyesWeb
software development kit (SDK) and integrated development environment
(IDE) wizards. Typically, a user creates a new block to support a new hardware
device or execute a specific DSP function. In addition, EyesWeb applies
data-type checking on block data pins, and the SDK supports the creation of
user-specific data types. The developer of a new block can choose to distribute
the source and/or the binary to the development community.
Blocks are organized in two ways within the GDE. Firstly, they can be aggregated
into catalogs that are physical collections of blocks within a standard Windows
environment. Secondly, they can be structured into libraries which are logical
collections organized according to functionality/use; e.g., mathematical
operations (Figure 3).
Patches
In order to build a functioning application, blocks are dragged from libraries or
catalogs in the GDE environment and connected together to form a patch. A
patch comprises a series of blocks linked together in order of execution to form
a fully-functional application. Figure 4 shows an example of a patch designed
to allow the user to manually trigger the acquisition of data from a SHIMMER
sensor, process the received data, and to display the acceleration values sensed
by the sensor to the user. A patch is executed via the Kernel Runtime Server.
The EyesWeb Kernel Runtime is a Windows component that supports the
execution of EyesWeb patches both inside and outside the EyesWeb GDE.
Figure 3: Library and catalog views within the
BioMOBIUS GDE
Source: The TRIL Centre (www.trilcentre.org)
Figure 4: Sample patch using SHIMMER in the EyesWeb GDE
Source: Intel Corporation, 2009
As the complexity of an application increases so too does the need to
structure patch development into logical or functional units. Complex
patch development can be managed through the use of the subpatch feature
within the EyesWeb environment. The subpatch supports the integration of
a functional group of interconnected blocks into a single block-like subpatch
structure.
Digital Signal Processing Capabilities for Biosignal Processing
BioMOBIUS features a variety of digital signal-processing capabilities. For
real-time video processing, an extensive range of options exists, including
barycenter/centroid calculations, quantity of motion from a silhouette,
background subtraction, image concatenation, and image splitting. More
advanced techniques are also available, including real-time full-body motion
tracking and blob extraction. Time-domain digital filtering is served by Finite
Impulse Response (FIR) filters. A variety of frequency-domain techniques are
available, including the Fourier Transform (FT), the Hilbert Transform, and
Kalman filtering. Support is included for matrix-handling operations. A
variety of specialist filters, such as Cochleogram generation in both image and
matrix format from an audio buffer, are available.
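The time-domain FIR filtering provided by such blocks can be illustrated with a minimal direct-form sketch; the example below is a 5-tap moving average, a common smoothing filter for noisy biosignals, and is not the BioMOBIUS block code itself:

```python
def fir_filter(signal, coeffs):
    """Direct-form FIR filter: y[n] = sum over k of coeffs[k] * x[n-k]."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:          # treat samples before the start as zero
                acc += c * signal[n - k]
        out.append(acc)
    return out

# 5-tap moving-average low-pass: each output is the mean of the last 5 inputs
taps = [0.2] * 5
noisy = [1.0, 1.2, 0.8, 1.1, 0.9, 1.0, 1.3, 0.7]
smoothed = fir_filter(noisy, taps)
print([round(v, 2) for v in smoothed])
```

Once the filter reaches steady state (from the fifth sample on), the output hovers near the underlying 1.0 level of this noisy example.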
SHIMMER Platform
The SHIMMER wireless sensor platform was developed by the Intel Digital
Health Group over a number of years to support a variety of internal research
projects. The SHIMMER platform comprises a baseboard that provides the
sensor's computational, data storage, communications, and daughterboard-connection
capabilities. The core functionality of the SHIMMER platform is
extended via a range of daughterboards that provide kinematic, physiological,
and ambient sensing capabilities. This range of contact and non-contact sensing
capabilities can be reliably used both in clinical and in home-based scenarios.
SHIMMER Baseboard Design
The core element of the baseboard is the Texas Instruments MSP430* MCU
(Microcontroller Unit) [49], which has been widely used in wireless sensors
[50-52]. The primary advantages of the MCU are its extremely low power
during periods of inactivity and its proven history for medical-sensing
applications. The 10KB of RAM on the MSP430F1611 variant is the largest
memory available within the processor family and offers improved buffering
capability for communication applications.
The MSP430 has eight ADC channels for 12-bit A/D conversions. For the
SHIMMER platform, the external ports are utilized for reading data from the
XYZ accelerometer (three channels), the internal expansion connector (three
channels), and the external expansion connector (two channels). To maintain
the low-power usage capabilities of SHIMMER, the MSP430 ADC core
is disabled when not in use and re-enabled when necessary. The baseboard
has a Freescale Semiconductor 3-axis (XYZ) accelerometer (MMA7260Q*).
The accelerometer is suitable for low-power systems: when active the current
consumption is around 500μA, and it contains a sleep mode, with current
usage of 3μA. It is robust, having high shock survivability, and is suitable for
high-sensitivity applications (800mV/g for 1.5g setting).
A passive tilt/vibration sensor (SQ-SEN-200* – Signal Quest) has been added
to later generations of the SHIMMER baseboard. It is sensitive to both tilt
(static acceleration) and vibration (dynamic acceleration). The sensor is used
to trigger power state transitions. For example, when the user takes an inactive
sleeping sensor out of the charging dock, the firmware will automatically switch
on the necessary components (accelerometer, radio, etc.) to begin capturing
and transmitting motion data.
Communications
One of the key functions of the SHIMMER board is its ability to communicate
as a wireless platform. SHIMMER provides dual radio functionality, with both
802.15.4 and Bluetooth® radio modules.
802.15.4 Radio
For IEEE 802.15.4-compliant wireless communication, the SHIMMER
platform uses a Chipcon CC2420* radio transceiver and a gigaAnt 2.4GHz
Rufa* antenna.
Bluetooth® Radio Module
The SHIMMER platform uses the Roving Networks RN-41* Class 2
Bluetooth module to communicate via an integrated 2.4GHz antenna. This
module contains a full Version 2 Bluetooth Protocol Stack and supports
the Serial Port Profile that facilitates rapid application development. The
Bluetooth module is connected to the MSP430 directly via the USART1 serial
connection. It can also be controlled by ASCII strings over the Bluetooth RF
link.
SHIMMER Daughterboards
For further functionality, the SHIMMER baseboard has a Hirose Electric
DF12 series* connector that allows the user to connect daughterboards to the
baseboard. Currently there are seven daughterboards available for SHIMMER
as shown in Table 1.
Kinematic Sensing    Physiological Sensing    Ambient Sensing
Gyroscope            ECG                      PIR Motion
Magnetometer         GSR                      Temperature
                                              Light
Table 1: BioMOBIUS SHIMMER-based sensors
Source: Intel Corporation 2009
External Connections
The external expansion connector (Hirose™ ST Series*) allows the user to
attach the board to the programming dock or multicharger.
External Hardware Integration
The SHIMMER platform provides two options for external hardware
integration, such as third-party sensors. The General Purpose Expansion
Module (AnEx Board) is an analog expansion module that enables two analog
signals to be plugged into the SHIMMER platform. These analog signals
interface with ADC inputs A0 and A7 of the MSP430.
PRIMMER™ Daughterboard
Prototype Reconfigurable board for SHIMMER (PRIMMER) is a
daughterboard that enables the breakout of SHIMMER’s internal connector.
The purpose of the PRIMMER daughterboard is to enable rapid integration
and prototyping of commercially available and existing physiological front-end
and sensor circuitry. The PRIMMER daughterboard breaks out six ADC lines,
power and ground prototype solder points, two general-purpose input/output
(GPIO) lines, and a Universal Asynchronous Receiver/Transmitter (UART).
In addition, it provides a boosted power supply by way of a power-boosting
circuit (Maxim MAX768EEE* charge pump) that takes in 3V from the
SHIMMER baseboard and increases it to ±5V. The board has been utilized
to interface to an EMG pre-amplifier sensor (MA-411-000*, Motion Lab
Systems) and to stream EMG data at 500 Hz.
GLIMMER™
GLIMMER, the SHIMMER-based display, is a low-cost, wireless display
platform that enables context-aware prompting in the home for activities of
daily living (ADL) monitoring, medication adherence, journaling, social
messaging, and coaching
applications. The GLIMMER display platform is an extension of the standard
SHIMMER platform. Prompts are displayed via a 2.7-inch organic
light-emitting diode (OLED) display (OSRAM Pictiva™) with a resolution of
128x64 pixels and 4-bit greyscale for image display.
SHIMMER Peripherals
A number of peripheral components have been developed for the SHIMMER
platform, and among these are the following:
•• Programming dock. USB-bus-powered cradle using the FTDI FT232*
UART. The dock also has battery-charger status, USB power, and UART
activity indicators.
•• Multicharger. Cradle that can charge six SHIMMERs simultaneously.
•• Advanced Dual UART programming board. For applications where wired
serial communication or enhanced debug capabilities are desirable.
•• Enclosures. The SHIMMER platform has two types of plastic enclosure. The
standard housing holds the SHIMMER baseboard, lithium ion battery, and
kinematic or ambient sensor daughterboards if required. The physiological
enclosure is utilized with the ECG and GSR daughterboards and provides
external connection points for electrodes.
SHIMMER Firmware
Firmware provides the low-level capabilities within the SHIMMER platform
to control the device functions. There are a number of lightweight operating
systems that can be used to run on a sensor node, namely TinyOS [53],
Contiki [54], Nano-RK [55], ERIKA [56], MANTIS [57], and SOS [58].
SHIMMER firmware is primarily developed by using TinyOS and released
into the public domain via SourceForge [59]. Firmware running on the sensor
platform provides local processing of the sensed data, local storage of the data
if required, and communication of these data to a higher-level application for
advanced signal processing, display, and data persistence.
TinyOS is a lightweight, event-driven operating system designed for sensor
network nodes that have very limited resources. It handles task scheduling,
radio communication, timing, I/O processing, etc., and has a very small
footprint.
Real-time Data Streaming to BioMOBIUS
SHIMMER data can be reliably streamed in real time at rates up to 500 Hz to
a BioMOBIUS application via the Bluetooth or 802.15.4 radios. To enhance
reliability beyond that of the standard Bluetooth link, a packet format was
developed that supports various types of sensor data and protects each
transmission with packet sequence numbers and a cyclic redundancy check.
The SHIMMER firmware (implemented in TinyOS version 1) uses the frame
format shown in Figure 5 for BioMOBIUS application integration.
BOF (8 bits) | Sensor ID (8 bits) | Data Type (8 bits) | Frame Sequence
Number (8 bits) | Time Stamp (16 bits) | Data Length (8 bits) | Raw Sensor
Data (1-255 bytes) | CRC (16 bits) | EOF (8 bits)
The BOF, CRC, and EOF fields provide the packet framing; the Sensor ID,
Data Type, Frame Sequence Number, Time Stamp, and Data Length fields
form the packet header; the Raw Sensor Data field carries the packet payload.
Figure 5: SHIMMER frame format
Source: Intel Corporation, 2009
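As an illustration of the Figure 5 layout, the following sketch packs a frame in Python. The multi-byte field byte order and the CRC algorithm are not specified in the text, so both are assumptions here; the CRC shown is a placeholder checksum, not the real polynomial.

```python
import struct

BOF, EOF = 0xC0, 0xC1   # frame delimiters, per the byte-stuffing discussion

def build_frame(sensor_id, data_type, seq, timestamp, payload):
    """Pack a frame per Figure 5 (prior to byte stuffing).
    Big-endian multi-byte fields and the summing 'CRC' are
    illustrative assumptions, not the documented SHIMMER format."""
    header = struct.pack('>BBBHB', sensor_id, data_type, seq & 0xFF,
                         timestamp & 0xFFFF, len(payload))
    crc = sum(header + payload) & 0xFFFF   # placeholder, not the real CRC
    return bytes([BOF]) + header + payload + struct.pack('>H', crc) + bytes([EOF])

frame = build_frame(sensor_id=0x01, data_type=0x02, seq=7,
                    timestamp=1234, payload=b'\x10\x20\x30')
print(frame.hex())
print(len(frame))  # 1 (BOF) + 6 (header) + 3 (payload) + 2 (CRC) + 1 (EOF) = 13
```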
The content of packets is unrestricted binary data. This can lead to problems
with control characters like BOF (Beginning of Frame, hex byte ‘0xC0’) or
EOF (End of Frame, hex byte ‘0xC1’) appearing in the packet header or
payload and being interpreted as control characters. To avoid this issue, a byte
stuffing technique based on ISO’s asynchronous high-level data link control
(HDLC) protocol is utilized.
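A minimal sketch of the byte-stuffing idea follows, assuming an HDLC-style escape byte with XOR transposition; the actual escape value used by the SHIMMER firmware is not given in the text, so 0x7D is a hypothetical choice:

```python
BOF, EOF = 0xC0, 0xC1
ESC = 0x7D   # illustrative escape byte; the real value is not specified here

def stuff(raw):
    """Escape any control bytes occurring inside the frame body."""
    out = bytearray()
    for b in raw:
        if b in (BOF, EOF, ESC):
            out += bytes([ESC, b ^ 0x20])   # escape marker, then transposed byte
        else:
            out.append(b)
    return bytes(out)

def unstuff(stuffed):
    """Reverse the escaping on the receiving side."""
    out, pending = bytearray(), False
    for b in stuffed:
        if pending:
            out.append(b ^ 0x20)
            pending = False
        elif b == ESC:
            pending = True
        else:
            out.append(b)
    return bytes(out)

body = bytes([0x01, 0xC0, 0x7D, 0xC1, 0x42])
assert unstuff(stuff(body)) == body          # lossless round trip
assert BOF not in stuff(body) and EOF not in stuff(body)
```

After stuffing, the delimiter values 0xC0 and 0xC1 can only ever appear at the true frame boundaries, so the receiver can resynchronize unambiguously.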
Logging Data to MicroSD Card
Many biomedical applications require subjects to have complete and
unhindered mobility. To achieve this, support for Secure Digital
(SD) card logging has been developed. The SHIMMER platform utilizes
a MicroSD card slot that supports up to two GBytes of flash memory. For
ambulatory ECG monitoring at 500 Hz from three channels, more than 80
days of data can be stored in uncompressed format. There are a number of
firmware programs available on SourceForge that facilitate recording of data to
the SHIMMER platform’s MicroSD card.
Power Management
A significant difference between most sensor nodes and standard computing
platforms is the heavy focus on power management and consumption. A
large number of applications require battery-powered operation for extended
periods of time [60]. Computational capabilities have far outstripped battery
technologies. SHIMMER battery optimizations have been implemented to
satisfy various project requirements. These optimizations implemented in
firmware have been mostly, but not exclusively, focused on lowering the duty
cycle of the 802.15.4 radio. The goal is to minimize the on-time of all hardware
subsystems on the sensor hardware platform. For example, each gyroscope on
the SHIMMER gyroscope daughterboard requires a supply current of 9.5mA.
With two dual-axis gyroscopes on the board there is a substantial 19mA
requirement that can be saved when the gyroscopes are off. Other auxiliary
battery features have been implemented for integration purposes, such as a
low-battery indication sent by the SHIMMER platform to a BioMOBIUS
software application, which prevents corrupted accelerometer data from being
received. The accelerometer outputs are ratiometric, which means that the
output offset and sensitivity scale linearly with the applied supply voltage.
Therefore, useful data can only be generated if the supply voltage remains
constant. The low-battery indication is a SHIMMER firmware feature that
halts sampling of data when the supply voltage falls below +3V.
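Why a constant supply voltage matters can be seen in a small sketch: if a conversion routine assumes fixed offset and sensitivity constants (as for a 3V supply), a ratiometric sensor's output is misread once the battery sags. The nominal values below (zero-g offset at Vdd/2, 800mV/g at Vdd = 3V on the 1.5g range) follow the accelerometer description earlier in this section; the exact conversion used by the SHIMMER firmware is an assumption.

```python
def accel_output_volts(g, vdd):
    """Idealized ratiometric accelerometer output on the 1.5g range:
    zero-g offset at Vdd/2 (an assumption) and a sensitivity of
    0.8 V/g at Vdd = 3V, both scaling linearly with Vdd."""
    return vdd / 2 + g * 0.8 * (vdd / 3.0)

def volts_to_g_assuming_3v(v):
    """Conversion that hard-codes the 3V constants (offset 1.5V, 0.8 V/g)."""
    return (v - 1.5) / 0.8

# With the supply at its nominal 3V the round trip is exact:
print(round(volts_to_g_assuming_3v(accel_output_volts(0.5, 3.0)), 3))  # -> 0.5
# If the battery sags to 2.8V, the fixed-constant conversion misreads the
# same 0.5g input, which is why firmware halts sampling below +3V:
print(round(volts_to_g_assuming_3v(accel_output_volts(0.5, 2.8)), 3))
```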
Programming/Flashing SHIMMER
An intuitive mechanism to support the download of firmware executable
programs to SHIMMER sensors was highlighted as an important capability
for biomedical researchers. The SHIMMER development kit includes an
easy-to-use graphical application called the “SHIMMER Windows Bootstrap
Loader” to allow users who do not have a TinyOS installation on their system
to program SHIMMER sensors for their application needs.
SHIMMER Sensing Validation
The operation of the SHIMMER baseboard and daughterboards has been
tested through a number of validation processes to determine the accuracy
of the boards’ function and their usefulness for biomedical-oriented research
applications.
Electrocardiography (ECG) Validation
A number of tests were carried out to validate the SHIMMER ECG
daughterboard as a valid tool for acquiring ambulatory ECG. The tests
consisted of validating the ECG amplifier and ADC performance by using
calibrated input signals and a resting ECG from a healthy subject in normal
sinus rhythm. Simulated ECG signals as well as an ECG recording from a
healthy non-resting subject were used to validate the performance of the SHIMMER
ECG daughterboard for use in ambulatory monitoring. Figures 6 and 7 show
a 1mV QRS amplitude (60 BPM) waveform generated by the Fluke MPS450
Patient Simulator* captured by a SHIMMER ECG and a MAC 3500 ECG
Analysis System* (GE Medical Systems). The plotted waveform (Figure 6) was
recognizable by a clinician as normal sinus rhythm. Figure 7 shows the MAC
3500 ECG Analysis System waveforms for comparative signal quality
purposes. Visual examination of the waveforms indicates that they compare
well.
Figure 6: Simulated ECG signal of 1mV QRS amplitude captured by the
SHIMMER ECG
Source: The TRIL Centre (www.trilcentre.org)
Figure 7: Simulated ECG signal of 1mV QRS amplitude captured by a MAC
3500 ECG Analysis System
Source: The TRIL Centre (www.trilcentre.org)
QRS Detection of Normal Sinus Rhythm for a Non-resting Healthy
Subject
A 5.9 minute ECG recording containing 503 heart beats from a non-resting
healthy subject during a moderate walk was captured by the SHIMMER ECG
and also captured by a Medilog Holter monitoring system. The R-R intervals
and instantaneous heart rate (HR) identified by using the Medilog Holter ECG
monitor software were compared against the R-R intervals calculated by using
previously reported QRS detection and R-R interval correction algorithms
[61, 62]. Each automatically detected QRS point on the SHIMMER ECG
was manually verified to ensure correct detection by using the QRS detection
algorithm.
The mean R-R interval for the SHIMMER acquired ECG was 0.7049
seconds, while the mean R-R interval for the Medilog acquired ECG was also
0.7049 seconds. The percentage difference between the R-R intervals for each
acquisition was calculated on a beat-by-beat basis. The mean percentage error
between the R-R intervals calculated by using each acquisition was found to
be 0.0192 percent, which can be considered negligible. These results indicate
that the SHIMMER ECG can be used to acquire ambulatory ECG from
resting and non-resting human subjects for research application purposes.
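The beat-to-beat comparison described above can be sketched as follows; the helper function and the interval values are illustrative stand-ins, not the study's algorithm or data:

```python
def rr_percent_error(rr_ref, rr_test):
    """Mean absolute beat-to-beat percentage difference between two
    R-R interval series (in seconds), aligned beat for beat."""
    if len(rr_ref) != len(rr_test):
        raise ValueError("series must contain the same number of beats")
    diffs = [abs(t - r) / r * 100.0 for r, t in zip(rr_ref, rr_test)]
    return sum(diffs) / len(diffs)

# Illustrative interval values only -- not the recordings from the study.
holter = [0.702, 0.710, 0.698, 0.705]
shimmer = [0.702, 0.709, 0.699, 0.705]
print(f"mean beat-to-beat error: {rr_percent_error(holter, shimmer):.4f}%")
```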
Kinematic Validation
In order to validate the SHIMMER platform for use in studies of human gait
analysis, temporal gait parameters derived from a tri-axial gyroscope on the
SHIMMER platform were compared against those acquired simultaneously, by
using the codamotion* analysis system from Charnwood Dynamics Ltd., UK.
The gait of one normal healthy adult male (age 25) was measured
simultaneously by using two SHIMMER sensors placed on each shank and the
Cartesian Optoelectronic Dynamic Anthropometer (CODA) motion analysis
system. Data were recorded whilst the subject performed multiple over-ground
walking and running trials along a 15-meter walkway in a motion analysis
laboratory. In all, ten walking trials at a self-selected comfortable walking pace
and four running trials at a self-selected jogging pace were completed. Heel
strike and toe-off points were calculated from the medio-lateral angular velocity
derived from the gyroscope signal by using an algorithm reported by Salarian et
al. [63] as shown in Figure 8.
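The event-detection step can be sketched as below. This is a simplified stand-in for the Salarian et al. approach: the peak threshold is an assumed illustrative value, and a production detector would apply the adaptive thresholds and filtering described in [63]:

```python
def gait_events(omega, fs, peak_thresh=50.0):
    """Sketch of a Salarian-style detector: mid-swing appears as a large
    positive peak in the shank medio-lateral angular velocity; toe-off is
    the local minimum before the peak, and heel strike the local minimum
    after it.  peak_thresh is an illustrative assumption."""
    n = len(omega)
    # 1. Candidate mid-swing peaks above the threshold.
    peaks = [i for i in range(1, n - 1)
             if omega[i] > peak_thresh
             and omega[i] >= omega[i - 1] and omega[i] >= omega[i + 1]]
    events = []
    for p in peaks:
        # 2. Walk backwards to the toe-off minimum preceding the peak.
        to = p
        while to > 0 and omega[to - 1] < omega[to]:
            to -= 1
        # 3. Walk forwards to the heel-strike minimum following the peak.
        hs = p
        while hs < n - 1 and omega[hs + 1] < omega[hs]:
            hs += 1
        events.append((to / fs, p / fs, hs / fs))  # times in seconds
    return events  # list of (toe_off, mid_swing, heel_strike) times

# Synthetic single-swing signal, sampled at 1 Hz for clarity.
omega = [0, -20, 0, 60, 100, 60, 0, -30, 0]
print(gait_events(omega, fs=1.0))
```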
The heel strike and toe-off characteristic points derived from the SHIMMER
and CODA systems were used to calculate the three temporal gait parameters
listed below:
•• Stride time
•• Stance time
•• Swing time
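Given the heel-strike and toe-off event times for one foot, the three temporal parameters follow directly. The event times below are illustrative, not study data:

```python
def temporal_gait_parameters(heel_strikes, toe_offs):
    """Stride, stance, and swing times (s) for one foot, given sorted
    heel-strike and toe-off event times.  Assumes each heel strike is
    followed by exactly one toe-off before the next heel strike."""
    strides, stances, swings = [], [], []
    for hs, hs_next in zip(heel_strikes, heel_strikes[1:]):
        to = next(t for t in toe_offs if hs < t < hs_next)
        strides.append(hs_next - hs)   # heel strike to next heel strike
        stances.append(to - hs)        # heel strike to toe-off
        swings.append(hs_next - to)    # toe-off to next heel strike
    return strides, stances, swings

hs = [0.0, 1.1, 2.2]   # illustrative event times (s)
to = [0.7, 1.8]
print(temporal_gait_parameters(hs, to))
```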
[Figure: two panels of shank angular velocity plotted against time (0 to 20 s); the upper panel marks all detected peaks and mid-swing points relative to threshold th2, and the lower panel marks heel-strike and toe-off points relative to thresholds th4 and th5]
Figure 8: Shank angular velocity signal derived from the SHIMMER
gyroscope
Source: Intel Corporation, 2009
Results show an intraclass correlation coefficient (ICC(2,k)) [64] greater than
0.85 in stride, swing, and stance times for ten walking trials and four running
trials. These results suggest that the SHIMMER platform is a versatile, cost-effective tool for use in temporal gait analysis. Full results from this study are
reported elsewhere [65]. Previous studies [66, 67] have validated the CODA
system as a reliable platform for gait measurements, so these results are very
promising.
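For reference, ICC(2,k) (two-way random effects, average measures [64]) can be computed from the standard ANOVA mean squares. This sketch uses illustrative stride times, not the study's measurements:

```python
import numpy as np

def icc_2k(x):
    """ICC(2,k): two-way random effects, average-measures agreement
    (Shrout & Fleiss).  x is an (n subjects/trials x k systems) array."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ssb = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ssj = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between systems
    sse = ((x - grand) ** 2).sum() - ssb - ssj        # residual
    bms = ssb / (n - 1)
    jms = ssj / (k - 1)
    ems = sse / ((n - 1) * (k - 1))
    return (bms - ems) / (bms + (jms - ems) / n)

# Illustrative stride times (s): column 0 system A, column 1 system B.
trials = np.array([[1.10, 1.11], [1.05, 1.06], [1.20, 1.19], [0.98, 0.99]])
print(round(icc_2k(trials), 3))
```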
GSR Validation
The SHIMMER galvanic skin response (GSR) sensor contains an internal
resistor network that works as a potential divider and provides a voltage that
can be converted by the SHIMMER’s ADC to a 12-bit value, used to measure
external skin resistance. All skin resistance values were calculated in the
SHIMMER platform firmware and transmitted to a BioMOBIUS patch for
real-time display and persistence to file. The sensor performance was correlated
with a commercial Nexus-10* system (Mind Media BV) utilizing a series of
known resistors from 10KΩ to 2.2MΩ. The SHIMMER GSR demonstrated a
mean percentage error of 2.3 percent, versus 4.1 percent for the commercial
Nexus-10.
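The divider conversion can be sketched as follows; the divider topology, internal resistance, reference voltage, and example reading are assumptions for illustration, not the SHIMMER schematic:

```python
def adc_to_skin_resistance(adc, r_internal=40_000.0, v_ref=3.0,
                           adc_bits=12):
    """Sketch of recovering external skin resistance (ohms) from a
    potential divider read by a 12-bit ADC.  Assumes the topology
    v_out = v_ref * r_skin / (r_skin + r_internal); r_internal and
    v_ref are illustrative values."""
    v_out = v_ref * adc / (2 ** adc_bits - 1)
    if v_out >= v_ref:
        raise ValueError("reading at rail; skin resistance unmeasurable")
    return r_internal * v_out / (v_ref - v_out)

# A mid-scale reading corresponds to r_skin close to r_internal.
print(adc_to_skin_resistance(2047))
```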
BioMOBIUS Prototype Applications
BioMOBIUS software has been utilized by the TRIL Centre for a number
of prototype applications both in clinical and home environments. A typical
BioMOBIUS-type application comprising the software and SHIMMER
wireless sensors described in this article is shown in Figure 9.
[Figure: block diagram of an application development layer (graphical user interface, data analytics, data storage) linked over low-power radios to a sensor platform with embedded signal processing; physiological (ECG) and kinematic data stream over the radio, with kinematic signals broadcast from sensors on the arms and legs]
Figure 9: Typical BioMOBIUS application
Source: Intel Corporation, 2009
We now look at three TRIL Centre projects that have utilized the BioMOBIUS
research platform.
Gait Analysis Platform
The BioMOBIUS platform has been used to develop a prototype gait analysis
platform for use in a clinical environment and designed for early detection of
gait and postural instability. The system is designed to unobtrusively capture
gait parameters and physiological data. The platform comprises a number of
integrated subsystems that provide a true six-degrees-of-freedom analysis, linked
to footfall data and visual gait data. Body-worn kinematic data are provided
by SHIMMER sensors, and footfall data by a Tactex pressure-sensitive floormat.
Video data of the subject are provided via two Webcams. The hardware
components are integrated into a BioMOBIUS high-level application that
provides real-time visual feedback to clinicians. The system’s user interface
allows clinicians to select and adjust which data are collected and how the data
are processed. The software encapsulates data acquisition and signal processing
modules, and it allows customization of the sensors [68]. The system is being
used in the TRIL Clinic in St James's Hospital, Dublin, Ireland, to scientifically
evaluate non-normal gait changes and the associated risk of falls.
The MuSensor Project
The TRIL Centre's MuSensor project aimed to develop a low-cost, unobtrusive,
home-based wireless sensor network to monitor an elder’s gait velocity
and doorway dwelling times over an eight-week period. The network
was developed using BioMOBIUS software and SHIMMER’s low-power
802.15.4 communications module and Passive Infrared (PIR) motion sensor
daughterboards. Each activation of a PIR sensor was time-stamped and
temporarily stored in the SHIMMER’s RAM, and batches of PIR events
were periodically uploaded via the 802.15.4 radio to a central data aggregator
located in the home. Data were then transferred to a server via a 3G GSM
network. The PIR sensor activation events were used to measure the in-home
walking velocity of fallers and non-fallers, to determine whether gait velocity
measured outside a clinical environment differs between the two groups in
the home.
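One way such time-stamped PIR activations yield a velocity estimate is sketched below; the sensor spacing, gap threshold, and event stream are illustrative assumptions, not the MuSensor implementation:

```python
def passage_velocities(events, spacing=1.0, max_gap=3.0):
    """Sketch of estimating in-home walking velocity from time-stamped
    PIR activations along a corridor.  `events` is a list of
    (timestamp_s, sensor_index) pairs; `spacing` (metres between
    adjacent sensors) and `max_gap` (seconds) are assumed values."""
    velocities = []
    events = sorted(events)
    for (t0, s0), (t1, s1) in zip(events, events[1:]):
        dt = t1 - t0
        # Only count consecutive firings of adjacent sensors as one pass.
        if abs(s1 - s0) == 1 and 0 < dt <= max_gap:
            velocities.append(spacing / dt)  # metres per second
    return velocities

# One pass down a hallway with sensors 1 m apart.
print(passage_velocities([(0.0, 0), (1.25, 1), (2.5, 2)]))
```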
Engineering Alertness
This project focuses on the development and validation of a prototype
BioMOBIUS application that can detect and correct lapses in alertness. The
study tests the viability of a self-alert training (SAT) protocol supported by
a BioMOBIUS software application with SHIMMER electrodermal activity
(EDA) sensors for measuring exosomatic EDA.
A small DC current was passed across two electrodes placed on the skin, and
the change in the resistance of the skin to the current was recorded as a
function of increased sweat gland activity (Figure 10). For DC measurements,
the current is kept constant and skin resistance is measured as changes in
voltage.
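The constant-current conversion is Ohm's law; in this sketch the source current is an assumed illustrative value, not the SHIMMER EDA board's specification:

```python
def skin_conductance_uS(v_measured, i_source=10e-6):
    """Exosomatic DC EDA sketch: with a constant source current
    (i_source, amperes -- an illustrative value), Ohm's law gives skin
    resistance R = V / I, reported here as conductance in microsiemens."""
    if v_measured <= 0:
        raise ValueError("expected a positive electrode voltage")
    return (i_source / v_measured) * 1e6

# 10 uA across 500 kOhm of skin drops 5 V, i.e. about 2 uS.
print(skin_conductance_uS(5.0))
```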
The protocol implemented in the application is designed to assist older people
in increasing attention and alertness levels in their own homes.
Developing the Ecosystem
BioMOBIUS software components can be downloaded for free from the
BioMOBIUS website (www.biomobius.org) under license from the TRIL
Centre. The website features a comprehensive set of documentation to support
both application and block developers. A variety of tutorials and sample
applications to assist new users are available. A user forum is also available
where users can log questions for TRIL engineers. A number of international
workshops have also been delivered to help with knowledge transfer in the
research community.
Figure 10. SHIMMER sensor measuring
electrodermal activity (EDA)
Source: The TRIL Centre (www.trilcentre.org)
The SHIMMER wireless sensor platform was licensed by Intel to Realtime
Technologies in 2007. SHIMMER development kits are now commercially
available from SHIMMER Research [42].
Conclusions and Future Work
BioMOBIUS software remains under active development. Version 2.0 of the
platform was released in late summer 2009. It features improved stability,
support for the X10* family of sensors, improved graphing capabilities in the
GUI Developer, support for SQLite* and MySQL* databases, improvements to
SHIMMER blocks, and other enhancements.
A number of new features are currently in development, including ODBC
support, Continua-compliant device support (www.continuaalliance.org) and
Web services data transports. New features and improvements will be available
to download from the BioMOBIUS website.
The BioMOBIUS research platform represents a set of highly integrated
software and hardware components developed by the TRIL Centre for
prototype biomedical research applications. The software components are
freely available for download to researchers. Hardware components, such
as the SHIMMER wireless sensor platform, can be purchased from third-party
vendors. The platform is open and extensible, allowing users to develop
new blocks for the graphical development environment or to integrate new
hardware components into the platform. The TRIL Centre has demonstrated
the use of the BioMOBIUS platform in a number of research projects
including the development of a prototype gait analysis platform and EDA
monitoring applications. We believe that the BioMOBIUS research platform
will prove to be a useful and versatile tool in enabling the biomedical research
community. The prototype applications we developed with the BioMOBIUS
research platform and described in this article are for research purposes only
and are not commercially available.
Acknowledgements
We acknowledge the contributions of Barry Greene, Adrian Burns, Cliodhna
Ní Scanaill, and the Digital Health Group Product Research and Incubation in
the preparation of this article. We also recognize the TRIL Centre’s researchers
and their research programs, some of which are reported in this article.
References
[1]
R. Hongliang, M. Q. H. Meng, and C. Xijun. “Physiological
Information Acquisition through Wireless Biomedical Sensor
Networks.” In IEEE International Conference on Information Acquisition,
Hong Kong and Macau, China, 2005, pp. 483-488.
[2]
X. Chen, M. Q. H. Meng, and H. Ren. “Design of Sensor Node
Platform for Wireless Biomedical Sensor Networks.” In Proceedings
27th IEEE International Conference of Engineering in Medicine and
Biology Society, Shanghai, China, 2005, pp. 4662-4665.
[3]
G. S. Gupta, S. C. Mukhopadhyay, B. S. Devlin, and S. Demidenko.
“Design of a Low-Cost Physiological Parameter Measurement and
Monitoring Device.” In IEEE Conference on Instrumentation and
Measurement Technology, Warsaw, Poland, 2007, pp. 1-6.
[4]
A. Lymberis. “Smart Wearables for Remote Health Monitoring, from
Prevention to Rehabilitation: Current R&D, Future Challenges.” In 4th
International IEEE Engineering in Medicine and Biology Society Special
Topic Conference on Information Technology Applications in Biomedicine,
Birmingham, UK, 2003, pp. 272-275.
[5]
J. Kaye, T. Hayes, T. Zitzelberger, J. Yeargers, M. Pavel, H. Jimison, N.
Larimer, J. Payne-Murphy, E. Earl, K. Wild, L. Boise, D. Williams, J.
Lundell, and E. Dishman. “Deploying Wide-Scale in-home assessment
technology.” Technology and Aging: Selected papers from the 2007
International Conference on Technology and Aging, Toronto, Canada,
2007, pp. 19-26.
[6]
E. M. Tapia, S. S. Intille, and K. Larson. “Activity Recognition in the
Home Using Simple and Ubiquitous Sensors.” Pervasive Computing, vol.
3001, Berlin/Heidelberg: Springer, 2004, pp. 158-175.
[7]
R. Jafari, W. Li, R. Bajcsy, S. Glaser, and S. Sastry. “Physical Activity
Monitoring for Assisted Living at Home.” In 4th International
Workshop on Wearable and Implantable Body Sensor Networks,
Aachen, Germany, 2007, pp. 213-219.
[8]
T. Degen, H. Jaeckel, M. Rufer, and S. Wyss. “SPEEDY: a Fall Detector
in a Wrist Watch.” In Proceedings 7th IEEE International Symposium on
Wearable Computers, White Plains, New York, 2003, pp. 184-187.
[9]
J. Kaye. “Home-Based Technologies: A New Paradigm for Conducting
Dementia Prevention Trials.” Alzheimers and Dementia, vol. 4, no. 1, pp.
S60-S66, 2008.
[10] M. Morris, J. Lundell, and E. Dishman, “Catalyzing Social Interaction
with Ubiquitous Computing: a Needs Assessment of Elders Coping
with Cognitive Decline.” In Conference on Human Factors in Computing
Systems, Vienna, Austria, 2004, pp. 1511-1514.
[11] P. Bagnall, G. Dewsbury, V. Onditi, and I. Sommerville. “Promoting
Virtual Social Interaction with Older People.” In International Workshop
on Cognitive Prostheses and Assisted Communication, Sydney, Australia,
2006.
[12] D. G. Mcilwraith, J. Pansiot, S. Thiemjarus, B. P. L. Lo, and G. Z.
Yang. “Probabilistic Decision Level Fusion for Real-time Correlation
of Ambient and Wearable Sensors.” In 5th International Summer School
and Symposium on Medical Devices and Biosensors, Hong Kong, China,
2008, pp. 117-120.
[13] L. Atallah, M. ElHelw, J. Pansiot, D. Stoyanov, L. Wang, B. Lo, and G.
Z. Yang. “Behaviour Profiling with Ambient and Wearable Sensing.”
In 4th International Workshop on Wearable and Implantable Body Sensor
Networks, Aachen, Germany, 2007, pp. 133-138.
[14] C. Otto, Aleksandar Milenkovic, Corey Sanders, and E. Jovanov.
“System Architecture of a Wireless Body Area Sensor Network for
Ubiquitous Health Monitoring.” Journal of Mobile Multimedia, vol. 1,
no. 4, pp. 307-326, 2006.
[15] T. Westeyn, P. Presti, and T. Starner. “ActionGSR: A Combination
Galvanic Skin Response-Accelerometer for Physiological Measurements
in Active Environments.” In 10th IEEE International Symposium on
Wearable Computers, Montreux, Switzerland, 2006, pp. 129-130.
[16] D. Roetenberg, P. J. Slycke, and P. H. Veltink. “Ambulatory Position
and Orientation Tracking Fusing Magnetic and Inertial Sensing.” In
IEEE Transactions on Biomedical Engineering, vol. 54, no. 5, pp. 883-890, 2007.
[17] S. Bamberg, A. Y. Benbasat, D. M. Scarborough, D. E. Krebs, and J.
A. Paradiso. “Gait Analysis using a Shoe-Integrated Wireless Sensor
System.” In IEEE Transactions on Information Technology in Biomedicine,
vol. 12, no. 4, pp. 413-423, 2008.
[18] S. J. Bellis, K. Delaney, B. O’Flynn, J. Barton, K. M. Razeeb, and
C. O’Mathuna. “Development of Field Programmable Modular
Wireless Sensor Network Nodes for Ambient Systems.” In Computer
Communications, vol. 28, no. 13, pp. 1531-1544, 2005.
[19] L. Benini, E. Farella, and C. Guiducci. “Wireless Sensor Networks:
Enabling Technology for Ambient Intelligence.” Microelectronics Journal,
vol. 37, no. 12, pp. 1639-1649, 2006.
[20] W.-Y. Chung and S.-J. Oh. “Remote Monitoring System with Wireless
Sensors module for Room Environment.” Sensors and Actuators B:
Chemical, vol. 113, no. 1, pp. 64-70, 2006.
[21] H. Dubois-Ferriere, R. Meier, L. Fabre, and P. Metrailler. “TinyNode:
a Comprehensive Platform for Wireless Sensor Network Applications.”
In Proceedings 5th International Conference on Information Processing in
Sensor Networks, Nashville, Tennessee, 2006, pp. 358-365.
[22] T. Ahola, P. Korpinen, J. Rakkola, T. Ramo, J. Salminen, and J.
Savolainen. “Wearable FPGA based Wireless Sensor Platform.” In
Proceedings 29th IEEE International Conference of Engineering in
Medicine and Biology Society, 2007, pp. 2288-2291.
[23] B. Gyselinckx, C. Van Hoof, J. Ryckaert, R. F. Yazicioglu, P. Fiorini,
and V. Leonov. “Human++: Autonomous Wireless Sensors for Body
Area Networks.” In Proceedings IEEE Conference on Custom Integrated
Circuits, San Jose, California, 2005, pp. 13-19.
[24] J. Penders, B. Gyselinckx, R. Vullers, M. De Nil, V. Nimmala, J. van de
Molengraft, F. Yazicioglu, T. Torfs, V. Leonov, P. Merken, and C. Van
Hoof. “Human++: From Technology to Emerging Health Monitoring
Concepts” In 5th International Summer School and Symposium on
Medical Devices and Biosensors, Hong Kong, China, 2008, pp. 94-98.
[25] L. Girod, N. Ramanathan, J. Elson, T. Stathopoulos, M. Lukac,
and D. Estrin. “Emstar: A Software Environment for Developing
and Deploying Heterogeneous Sensor-Actuator Networks.” In ACM
Transactions on Sensor Networks, vol. 3, no. 3, p. 13, 2007.
[26] S. Hengstler and H. Aghajan. “WiSNAP: A Wireless Image Sensor
Network Application Platform.” In 2nd International Conference on
Testbeds and Research Infrastructures for the Development of Networks and
Communities, Barcelona, Spain, 2006, pp. 6-12.
[27] Z. Walker, M. Moh, and T.-S. Moh. “A Development Platform for
Wireless Sensor Networks with Biomedical Applications.” In 4th IEEE
Consumer Communications and Networking Conference, Las Vegas,
Nevada, 2007, pp. 768-772.
[28] D. Simon, C. Cifuentes, D. Cleal, J. Daniels, and D. White. “Java™
on the Bare Metal of Wireless Sensor Devices.” In Proceedings 2nd
International Conference on Virtual Execution Environments, Ottawa,
Ontario, 2006, pp. 78-88.
[29] R. Slyper and J. Hodgins, “Action Capture with Accelerometers.” In
ACM SIGGRAPH/Eurographics Symposium on Computer Animation,
Dublin, Ireland, 2008, pp. 193-200.
[30] D. Spelmezan, A. Schanowski, and J. Borchers. “Rapid Prototyping
for Wearable Computing.” In International Semantic Web Conference,
Karlsruhe, Germany, 2008.
[31] Y. Jianchu and S. Warren, “Design of a Plug-and-Play Pulse Oximeter.”
In Proceedings Second Joint IEEE 24th Annual Conference on Engineering
in Medicine and Biology and the Annual Fall Meeting of the Biomedical
Engineering Society Conference, Houston, Texas, 2002, pp. 1752-1753.
[32] M. Lascu and D. Lascu. “LabVIEW Electrocardiogram Event and Beat
Detection.” WSEAS Transactions on Computer Research, vol. 3, no. 1, pp.
9-18, 2008.
[33] T. Paing, J. Morroni, A. Dolgov, J. Shin, J. Brannan, R. Zane, and Z.
Popovic. “Wirelessly-Powered Wireless Sensor Platform.” In European
Conference on Wireless Technologies, Munich, Germany, 2007, pp. 241-244.
[34] S. Scapellato, F. Cavallo, C. Martelloni, and A. M. Sabatini. “In-Use
Calibration of Body-Mounted Gyroscopes for Applications in Gait
Analysis.” Sensors and Actuators A: Physical, vol. 123-124, pp. 418-422, 2005.
[35] S. Campbell, J.-P. Chancelier, and R. Nikoukhah. Modeling and
Simulation in Scilab/Scicos, 1st ed. New York: Springer, 2006.
[36] PROATECH BioEra—Visual Designer for BioFeedback webpage.
Available at http://www.bioera.net/
[37] C. Veigl and J. Wikerson. “BrainBay—an OpenSource Biosignal
Project.” Available at brainbay.lo-res.org/
[38] S. A. Jawed, D. Cattin, M. Gottardi, N. Massari, R. Oboe, and A.
Baschirotto. “A Low-Power Interface for the Readout and Motion-Control of a MEMS Capacitive Sensor.” In 10th IEEE International
Workshop on Advanced Motion Control, Trento, Italy, 2008, pp. 122-125.
[39] J. Music, R. Kamnik, and M. Munih. “Model Based Inertial Sensing
of Human Body Motion Kinematics in Sit-to-Stand movement.”
Simulation Modelling Practice and Theory, vol. 16, no. 8, pp. 933-944,
2008.
[40] TRIL Centre, Technology Research for Independent Living webpage
(2009). Available at www.trilcentre.org
[41] SHIMMER wireless sensor platform webpage (2009, January). Available
at http://www.csi.ucd.ie
[42] Sensing Health with Intelligence, Modularity, Mobility and
Experimental Reusability webpage (2009). Available at
www.shimmer‑research.com
[43] A. Camurri, P. Coletta, M. Demurtas, M. Peri, A. Ricci, R. Sagoleo,
M. Simonetti, G. Varni, and G. Volpe, “A Platform for Real-Time
Multimodal Processing.” In 4th Conference in Sounds and Music
Computing, Lefkada, Greece, 2007, pp. 354-358.
[44] A. Camurri, P. Coletta, G. Varni, and S. Ghisio, “Developing
Multimodal Interactive Systems with EyesWeb XMI.” In Conference on
New Interfaces for Musical Expression, New York, 2007, pp. 305-308.
[45] A. Camurri, B. Mazzarino, M. Ricchetti, R. Timmers, and G. Volpe,
“Multimodal Analysis of Expressive Gesture in Music and Dance
Performances.” Gesture-based Communication in Human-Computer
Interaction, vol. 2915, A. Camurri and G. Volpe, Eds. Berlin/
Heidelberg: Springer Verlag, 2004, pp. 357-358.
[46] A. Camurri, B. Mazzarino, G. Volpe, P. Morasso, F. Priano, and C. Re.
“Application of Multimedia Techniques in the Physical Rehabilitation
of Parkinson’s Patients.” Visualization and Computer Animation, vol. 14,
no. 5, pp. 269-278, 2003.
[47] M. L. Rinman, A. Friberg, B. Benediksen, D. Cirotteau, S. Dahl, I.
Kjellmo, B. Mazzarino, and A. Camurri, “Ghost in the Cave—An
Interactive Collaborative Game Using Non-verbal Communication.”
Gesture-based Communication in Human-Computer Interaction, vol.
2915, A. Camurri and G. Volpe, Eds. Berlin/Heidelberg: Springer
Verlag, 2004, pp. 403-404.
[48] Raw Material Software—JUCE webpage (2005). Available at
www.rawmaterialsoftware.com/juce/
[49] MSP430 16-bit Ultra-Low Power MCUs webpage (2009). Available at
focus.ti.com/
[50] B. Lo, S. Thiemjarus, R. King, and G. Z. Yang. “Body Sensor
Network—A Wireless Sensor Platform for Pervasive Healthcare
Monitoring.” In 3rd International Conference on Pervasive Computing,
Munich, Germany, 2005, pp. 77-80.
[51] J. Polastre, R. Szewczyk, and D. Culler. “Telos: Enabling Ultra-Low Power Wireless Research.” In 4th International Symposium on
Information Processing in Sensor Networks, Los Angeles, CA, 2005, pp.
364-369.
[52] G. Werner-Allen, K. Lorincz, M. Ruiz, O. Marcillo, J. Johnson, J. Lees,
and M. Welsh. “Deploying a Wireless Sensor Network on an Active
Volcano.” IEEE Internet Computing, vol. 10, no. 2, pp. 18-25, 2006.
[53] TinyOS Community Forum webpage (2009 June). Available at
http://www.tinyos.net
[54] A. Dunkels, B. Gronvall, and T. Voigt. “Contiki — A Lightweight and
Flexible Operating System for Tiny Networked Sensors.” In 29th Annual
IEEE International Conference on Local Computer Networks, Tampa,
Florida, 2004, pp. 455-462.
[55] A. Eswaran, A. Rowe, and R. Rajkumar. “Nano-RK: An Energy-Aware Resource-Centric RTOS for Sensor Networks.” In 26th IEEE
International Real-Time Systems Symposium, Miami, Florida, 2005, pp.
5-8.
[56] M. Cirinei, A. Mancina, D. Cantini, P. Gai, and L. Palopoli. “An
Educational Open Source Real-Time Kernel for Small Embedded
control systems.” Computer and Information Sciences - ISCIS 2004, vol.
3280, Berlin/Heidelberg: Springer, 2004, pp. 866-875.
[57] S. Bhatti, J. Carlson, H. Dai, J. Deng, J. Rose, A. Sheth, B. Shucker, C.
Gruenwald, A. Torgerson, and R. Han. “MANTIS OS: an embedded
multithreaded operating system for wireless micro sensor platforms.”
Mobile Network Applications, vol. 10, no. 4, pp. 563-579, 2005.
[58] V. Tsetsos, G. Alyfantis, T. Hasiotis, O. Sekkas, and S.
Hadjiefthymiades. “Commercial Wireless Sensor Networks: Technical
and Business Issues.” In 2nd Annual Conference on Wireless On-demand
Network Systems and Services, St Moritz, Switzerland, 2005, pp. 166-173.
[59] SourceForge webpage (2009, June). Available at http://sourceforge.net
[60] J. Hill, M. Horton, R. Kling, and L. Krishnamurthy. “The Platforms
Enabling Wireless Sensor Networks.” Communications of the ACM, vol.
47, no. 6, pp. 41-46, 2004.
[61] D. Benitez, P. A. Gaydecki, A. Zaidi, and A. P. Fitzpatrick. “The Use of
the Hilbert Transform In ECG Signal Analysis.” Computers in Biology
and Medicine, vol. 31, no. 5, pp. 399-406, 2001.
[62] P. de Chazal, C. Heneghan, E. Sheridan, R. Reilly, P. Nolan, and M.
O’Malley. “Automated Processing of the Single-Lead Electrocardiogram
for the Detection of Obstructive Sleep Apnoea.” IEEE Transactions on
Biomedical Engineering, vol. 50, no. 6, pp. 686-696, 2003.
[63] A. Salarian, H. Russmann, F. J. G. Vingerhoets, C. Dehollain, Y. Blanc,
P. R. Burkhard, and K. Aminian. “Gait Assessment In Parkinson’s
Disease: Toward an Ambulatory System for Long-Term Monitoring.”
IEEE Transactions on Biomedical Engineering, vol. 51, no. 8, pp. 1434-1443, 2004.
[64] P. E. Shrout and J. L. Fleiss. “Intraclass Correlations: Uses In Assessing
Rater Reliability.” Psychological Bulletin, vol. 86, no. 2, pp. 420-428, 1979.
[65] K. J. O’Donovan, B. R. Greene, D. McGrath, R. O’Neill, A. Burns, and
B. Caulfield. “SHIMMER: A New Tool for Temporal Gait Analysis.”
In Proceedings 31st IEEE Engineering in Medicine Biology Conference,
Minneapolis, MN, 2009.
[66] K. Monaghan, E. Delahunt, and B. Caulfield, “Increasing the Number
of Gait Trial Recordings Maximises Intra-Rater Reliability of the
CODA Motion Analysis System.” Gait & Posture, vol. 25, no. 2, pp.
303-315, 2007.
[67] V. Maynard, A. M. O. Bakheit, J. Oldham, and J. Freeman. “Intra-Rater and Inter-Rater Reliability of Gait Measurements with CODA
MPX30 Motion Analysis System.” Gait & Posture, vol. 17, no. 1, pp.
59-67, 2003.
[68] K. O’Donovan, T. Dishongh, T. Foran, D. Leahy, and C. Ni Scanaill.
“The Development of a Clinical Gait Analysis System.” In Conference on
Ambulatory Monitoring of Physical Activity and Movement, Amsterdam,
Netherlands, 2008, p. 155.
Author Biographies
Michael McGrath is Senior Technologist in Intel’s Digital Health Group. He
is also a co-Principal Investigator in the TRIL Centre, focusing on the
development of technologies to support independent living research. Michael McGrath joined
Intel in 1999. He has held positions as an automation engineer working in
Ireland Fab Operations and as a researcher in IT Research and Innovation.
He moved to his current position as a technologist in the Product Research
and Incubation group in 2006. His areas of interest include sensors, wireless
communications, assisted living technologies, intelligent user interfaces,
data fusion, and data management techniques. Michael received his B.Sc. in
Analytical Science from Dublin City University in 1992 and a Ph.D. in sensors
and instrumentation from Dublin City University in 1995. His e-mail is
michael.j.mcgrath at intel.com.
Terrance (Terry) J. Dishongh, Ph.D., is currently a Senior Principal Engineer
in Intel’s Digital Health Group. In his twelve years at Intel he has been awarded
45 patents with an additional 44 pending and has published books on wireless
sensors and electronic packaging. Dr. Dishongh has held faculty positions at
the University of Maryland, College Park and the State University of New York
at Buffalo. His e-mail is terry.dishongh at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Gathering the Evidence: Supporting Large-Scale Research
Deployments
Contributors
Tamara L. Hayes
OHSU & ORCATECH
Misha Pavel
OHSU & ORCATECH
Jeffrey A. Kaye
OHSU & ORCATECH
Abstract
Independent living technologies have tremendous potential to facilitate the
study of aging and chronic disease. By collecting medical and behavioral data
continuously as people live their daily lives, we are able to form a much clearer
picture of the changes in people’s health, and the efficacy of treatments over
time, than is possible by using periodic clinical visits. However, deploying these
technologies in large-scale clinical trials poses unique challenges. In this article
we review those areas in which independent living technologies can provide
significantly improved research data, and the challenges that must be overcome
to effectively gather those data. We propose approaches, both technical and
procedural, which enable efficient management of large-scale research studies,
and we discuss scalability issues and their possible solutions.
Introduction
Index Words
Aging
Home Monitoring
Clinical Research
Scalability
People aged 65 and older are the fastest-growing segment of the United States
population: their number is predicted to more than double by the year
2030. Forty-two percent of this age group reported some type of long-lasting
condition or disability [6]. Furthermore, over 20 percent of people 85 and
older have a limited capacity for independent living [5], requiring continuous
monitoring and daily care. Almost 6 percent of private health expenditures
and approximately 13 percent of all public healthcare expenditures in 2005
were spent on home healthcare and nursing-home care for older adults [4]. The
convergence of growing numbers of seniors with attendant chronic illness, the
rising costs and complexity of care, and the inability to effectively develop and
apply increasing knowledge of how to treat or manage these health declines
merge to create a well-recognized crisis of healthcare in America.
A major challenge to research into caring for the aged is the reliable assessment
of behavior and clinical status across multiple related domains (for example,
cognitive, social, physiological, environmental). Clinical research has not
fundamentally changed its tools of conduct and assessment paradigm since the
beginning of the computer age. Thus, clinical research is currently still rooted
largely in traditional clinical practice methodologies, where the basic paradigm
of assessment is a clinical interview or examination, relying on the recall of
the volunteer, performed at a brief moment in time, at a location convenient
for the researcher, not the volunteer. Even when technologies are deployed in
the home, they are often used in a brief or episodic way (for example, 24-hour
heart monitoring or two weeks of actigraphy) and not fully integrated into the
clinical data stream and workflow.
148 | Gathering the Evidence: Supporting Large-Scale Research Deployments
There have been several research efforts that have developed solutions for
monitoring and assessment of individuals in their homes. These approaches use
unobtrusive wireless sensors placed in a subject’s home (such as motion sensors,
door sensors, and environmental sensors), or technologies that the subject
wears or uses (such as medication adherence monitors, telemedicine boxes,
personal computers, and body-worn location and activity sensors). Studies that
use these technologies have found that even simple technologies can be
valuable in assessing behavior in the home [13], and there are a growing
number of researchers developing and evaluating emerging technologies for
in-home assessment and health intervention. These technologies have the
potential to transform the clinical research landscape by enabling the
simultaneous assessment of an individual’s activity, function, and physiologic
status in a particular location across a wide array of time scales (for example,
seconds to seasons), and by creating the potential to detect subtle changes in
that status, by comparing this rich multidimensional data stream with baseline
data from prior observations.
The general approach to this kind of study is to monitor a subject (or patient)
over a period of time to establish a baseline, after which deviation from that
baseline can be used to detect clinically-relevant change. Without this baseline,
patients must instead be compared to a population norm — and what is normal
for one individual may be a cause for alarm in someone else. Furthermore,
by sampling infrequently, true trends may be masked, as a person has good
days and bad days. Figure 1 illustrates how infrequent sampling can lead
to substantial aliasing errors. In the figure, the top panel depicts test
scores taken during standard clinic visits at six-month intervals for two
different patients, while the bottom panel depicts how continuous assessment
could reveal a very different picture. Thus, the
continuous nature of the in-home sensor data collection allows many data
points to be collected, providing better time resolution for looking at changes
and for reducing the overall variance. This continuous collection also allows
changes to be detected in a shorter period of time. The immediacy and
frequency of the collection of these data form the basis for a true
realization of personalized medicine and directly align with new
healthcare delivery trends and mandates, including the wide-scale build-out
of electronic medical or health records [19], the Medical Home [8], and
comparative-effectiveness research (CER) [14].
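The sampling effect described above can be illustrated with a small simulation (a hypothetical sketch, not the authors' data or code): a measure that declines slowly while varying day to day looks very different when sampled at two clinic visits than when averaged over continuous in-home windows.

```python
import random

# Hypothetical simulation (not the authors' data): a measure that declines
# slowly while varying day to day, sampled two different ways.
random.seed(0)
days = 365
true_trend = [1.0 - 0.0005 * d for d in range(days)]          # slow decline
observed = [t + random.gauss(0, 0.08) for t in true_trend]    # good/bad days

# Clinic-style assessment: one noisy sample at each of two six-month visits.
clinic_change = observed[182] - observed[0]

# Continuous in-home assessment: compare 30-day averages, which suppress
# the day-to-day variance and expose the underlying trend.
baseline = sum(observed[:30]) / 30
recent = sum(observed[-30:]) / 30
continuous_change = recent - baseline

print(f"clinic-visit change: {clinic_change:+.3f}")
print(f"continuous change:   {continuous_change:+.3f}")
```

With two isolated samples, the sign of the measured change is at the mercy of which days the visits happen to fall on; the windowed averages of the continuous stream track the true decline.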
Intel® Technology Journal | Volume 13, Issue 3, 2009
[Figure: measured function over time (Months 1, 6, and 12) for two patients; top panel shows six-month clinic samples, bottom panel continuous assessment]
Figure 1: Problems with Infrequent Monitoring
Source: Alzheimer's and Dementia: Hayes TL, Abendroth F, Adami A, Pavel M, Zitzelberger TA, Kaye JA
The Role for In-Home Monitoring in Research
In-home monitoring has wide applicability in routine care and intervention. In
particular, in-home monitoring allows tracking of conditions that evolve slowly
over time or that have poor demarcation of onset; and it allows the tracking
of conditions that transition to new clinical states, such as depression, frailty,
and cognitive impairment. Suitable sensors placed in the home may also allow
immediate detection of acute events that are rare or irregularly occurring, such
as falls, naps, or transient neurological events.
Continuous monitoring would also greatly enhance clinical trials of both
drug and behavioral therapies, by allowing the ongoing assessment of patient
adherence to the trial or therapy regimen, and by providing those administering
these trials the ability to assess side-effects and health changes throughout the
day. Traditionally, both adherence and side-effects in clinical trials are measured
through self-report, which at best corresponds modestly to actual medication
taking and to true physiological changes [17, 20] (see Vignette 1 at the end
of this article). The immediate identification of side-effects accurately aligned
to the time of treatment would provide valuable insights into the efficacy of
treatments. Data about patient behaviors would allow improved coaching of
the patient to ensure better adherence to the treatment protocol. Furthermore,
in studies of recovery (such as from a hip fracture or stroke), instead of
assessing outcomes by using subjective measures, for example, arbitrary scale
numbers at a fixed time point such as 3-6 months after the event, new measures
could be generated from recovery curves that represent the rate of functional
recovery (such as total activity or walking speed). One of the most interesting
benefits of using the continuous home assessment platform is not only the
opportunity to examine the trends in these objective measures, but equally
importantly, to determine their variability over time. In many cases in aging
research, the variance in a measure may be more informative than the absolute
value [12, 16].
The ability to improve on self-report also has the potential to transform
epidemiological studies. Currently, epidemiological survey and assessment
methodologies tend to rely on sparsely spaced questionnaires that depend on
recall of events. This approach makes it difficult to identify data or events with
detailed temporal or spatial precision, and it limits their ecological validity.
A way to improve this state is to bring the locus of assessment into the home
or community by providing a means of recording continuously, in real-time,
events as they occur and where they occur. If full sensor networks and on-line
reporting capabilities were incorporated into thousands of households, there
would be an opportunity to collect changing health outcomes at the point
of occurrence. In this case the metrics of interest would become true rates of
change and not simply binary “present or absent” estimates. Item-level data
would be much less reliant on recall (e.g., “In the last six months how often
did you wake at night?”) and instead reflect actual events and more objective
measures (e.g., the number of bathroom visits, hours in bed, and so on).
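For example, a measure such as nightly bathroom visits could be derived from raw motion events rather than recall. The sketch below is illustrative only — the timestamps and the gap rule for merging nearby firings into one visit are invented, not taken from any deployed system.

```python
from datetime import datetime

# Illustrative sketch: derive an objective count of nighttime bathroom
# visits from motion-sensor firings. Timestamps are hypothetical.
def nightly_bathroom_visits(firings, gap_minutes=10):
    """firings: sorted datetimes of bathroom motion events during the night.
    Firings closer together than gap_minutes are merged into one visit."""
    visits = 0
    last = None
    for t in firings:
        if last is None or (t - last).total_seconds() > gap_minutes * 60:
            visits += 1
        last = t
    return visits

night = [datetime(2009, 7, 1, 1, 12), datetime(2009, 7, 1, 1, 14),
         datetime(2009, 7, 1, 4, 40)]
print(nightly_bathroom_visits(night))  # → 2
```

The two firings at 1:12 and 1:14 merge into a single visit; the 4:40 firing counts as a second one — a number no questionnaire answered six months later could reliably reproduce.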
Achieving Large-Scale Studies
Recent advances in ubiquitous computing and sensor network methodologies
make large-scale, in-home research studies feasible. When coupled with
other convergent technologies and methods, such as ecological momentary
assessment and telemedicine, the potential to transform the practice of clinical
research is staggering. However, in spite of the number of smart homes that are
now being used to develop and evaluate in-home monitoring technologies,
we are a long way from wide-scale deployment. Experience has shown that
solutions that appear to work in a controlled laboratory environment, even
when that environment is intended to simulate a home living situation, never
work as expected in the field. For this reason, we have developed a staged
approach to designing solutions for large-scale, in-home monitoring.
[Figure: four stages — Assess Attitudes and Beliefs (survey); Evaluate in Point-of-Care Lab; Test in Community Living Lab; Deploy on a Large Scale — with Internet-connected homes at each successive stage]
Figure 2: Scaling In-home Assessments from the Lab to Communities
Source: OHSU
Figure 2 illustrates the stages we use from technology inception to wide-scale
deployment. First, we conduct focus groups and interviews to inform the
technology design, and in particular to understand attitudes towards the
technology and to identify key usability and acceptance issues. In the second
stage, technologies are tested in a smart-home environment (the Point of Care
Laboratory) by using convenience samples of participants. Data collection can
be done quickly and easily, under controlled experimental conditions, and
therefore this approach is appropriate for initial evaluations of technologies.
However, these tests will not typically allow the collection of outcome measures
that are representative of particular patient populations. Therefore, part of this
stage includes bringing patients or subjects from the target cohort into the
smart home, to collect additional data about how the population of interest
will interact with the sensor systems. This is useful for improving our
understanding of the limitations and capabilities of the sensor technologies.
Clearly, subjects brought into a smart home will not behave as they would in
their own home, and therefore data collected in such an environment has less
ecological validity than those collected in the individual’s home. Hence, the
third stage entails taking the technology into the field — that is, instrumenting
a small number (10-20) of patients’ homes — to provide much better
outcomes data. We use the ORCATECH Living Lab for this deployment. The
Living Lab is a group of senior volunteers who have agreed to allow
technologies to be evaluated in their homes on an ongoing basis. A core set of
technologies is continually maintained in the homes, and clinical and
neurological assessments are conducted on this cohort on a semi-annual basis.
This deployment helps to reveal problems that may arise when the technology
is used with different home construction materials, infrastructure support,
living arrangements, or patient populations. Once a technology has been
field-hardened in the Living Lab, we are ready to begin to scale it to larger
studies.
In spite of the challenges of scaling emerging technologies, there is a clear need
to do so. The high inter-individual variability in behavior and health outcomes
drives the need for large studies. For example, while small randomized
clinical trials of cholinesterase inhibitors have shown modest improvements
in cognitive function, trials with treatment and control groups of 200-300
people are typically necessary to have sufficient power to show improvements
from baseline [18]. Population-based studies require a much larger sample size,
since these studies typically control less for co-morbidities, and therefore the
variance in those samples is much greater.
In the process of deploying various in-home monitoring and intervention
technologies in more than 400 homes around the United States, we have
learned a great deal about the challenges of scaling the technologies for
research. These challenges relate in part to the complex, heterogeneous nature
of the data that are collected, in part to the challenges of adding the use of
technology into the already difficult problem of recruiting and retaining patient
populations (who may be technophobic), and in part to the fact that the global
infrastructure and sensors designed specifically for collecting behavioral health
data do not yet exist. In the remainder of this article we review each of these
challenges and propose possible solutions.
The Nature of the Data
Continuous, in-home monitoring studies generate a vast amount of data
pertaining to the daily activity of subjects. These datasets may contain mixtures
of complex data, ranging from rare and random events to continuous signals.
Uniquely, these data inherently describe multiple data domains simultaneously:
space or location, time, activity, physiological function, neuropsychological
function, etc. These data provide more accurate and detailed information
than can be captured by traditional methods of self-report, and they hold
information about patterns of activity in the home that could be used to
examine a multitude of hypotheses about behaviors, such as restlessness at
night, nocturia, patterns of use of the kitchen (and refrigerator), outings from
the home, and acute events such as falls or stroke. However, the very strengths
of these data — such as the high sampling frequency and the many different
types of information gathered — make them challenging to manage, study,
monitor, and interpret.
There are three key types of data that are needed to support large-scale research
studies. The first type of data that is of interest is the raw data from the
sensors themselves. These data reflect physical responses of the sensors to the
behaviors of the research participants. For example, the instantaneous signal
strength from a body-worn radio frequency identification (RFID) sensor as
an individual walks through the home, the time at which a motion sensor
fires when somebody passes by, and the force measured by a load cell under
the support of a bed as a person breathes, are all signals that must be captured
for subsequent processing and inference. Some of these (signal strength, load
cell force) are sampled at regular intervals and form a time series sampled at a
relatively high frequency. Others (sensor firing time) are event-driven, and the
amount of data collected therefore depends upon the activity being sensed.
Furthermore, these data may also be tied to location, and therefore they may
require a spatio-temporal representation.
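As a rough illustration of these two raw-data shapes, regularly sampled signals and event-driven firings can be modeled as follows. The types and field names here are our own invention for exposition, not ORCATECH's actual schema.

```python
# Minimal sketch (hypothetical schema) of the two raw-data shapes described
# above: regularly sampled time series and event-driven records.
from dataclasses import dataclass

@dataclass
class Sample:
    """One reading from a regularly sampled sensor (e.g., a bed load cell)."""
    sensor_id: str
    timestamp: float      # seconds since epoch
    value: float          # e.g., force in newtons, or RSSI in dBm

@dataclass
class Event:
    """One event-driven firing (e.g., a passive infrared motion sensor)."""
    sensor_id: str
    timestamp: float
    location: str         # room label: the spatio-temporal dimension

# Regular sampling: the spacing is fixed by the sampling rate (here 10 Hz)...
load_cell = [Sample("bed-1", t * 0.1, 700.0) for t in range(5)]
# ...while an event stream is as dense or sparse as the behavior itself.
motions = [Event("pir-3", 12.0, "kitchen"), Event("pir-7", 95.5, "hallway")]

print(len(load_cell), len(motions))  # → 5 2
```

The distinction matters for storage and analysis: time series can be indexed by sample number, while event streams must carry their own timestamps and, here, a location.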
The second type of data is metadata about the sensors, the state of the sensors,
and participants themselves. If one is relying on signal strength to estimate
location, for example, then the battery level of the sensor becomes important
to determining the likely variance and accuracy of the estimate. Similarly, if a
motion sensor appears to have stopped firing, it is important to know if there is
a hardware or communications problem or if the person being monitored has
simply not passed near the sensor.
The third type of data is derived data, which come from inference algorithms
that are used to extract information from the raw data. Often these derived
data actually arise from sophisticated statistical models that fuse data from
multiple sensors to make inferences about behaviors and health outcomes.
Derived data could be as basic as an estimate of respiration or walking speed,
or as complex as a determination of how well a person was performing a
particular activity of daily living (ADL). The derivation of such measures and
metrics from in-home sensors is an area of active research, and at this point,
only a small number of measures have been developed and validated.
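A simplified sketch of one such derived measure follows, assuming a hypothetical line of motion sensors at known positions along a hallway; walking speed is then the slope of a least-squares fit of position against firing time, which is more robust to timing jitter than using only the first and last firing.

```python
# Illustrative sketch of a derived measure: walking speed estimated from a
# hypothetical line of motion sensors at known positions along a hallway.
def walking_speed(firings, positions):
    """firings: list of (sensor_id, timestamp) in firing order.
    positions: dict mapping sensor_id -> distance (m) along the hallway.
    Returns speed in m/s from a least-squares fit of position vs. time."""
    times = [t for _, t in firings]
    dists = [positions[s] for s, _ in firings]
    n = len(firings)
    t_bar = sum(times) / n
    d_bar = sum(dists) / n
    num = sum((t - t_bar) * (d - d_bar) for t, d in zip(times, dists))
    den = sum((t - t_bar) ** 2 for t in times)
    return num / den

# Four sensors 0.61 m apart; one pass through the hallway (invented times).
positions = {"s1": 0.0, "s2": 0.61, "s3": 1.22, "s4": 1.83}
pass_events = [("s1", 10.00), ("s2", 10.76), ("s3", 11.49), ("s4", 12.26)]
speed = walking_speed(pass_events, positions)
print(f"estimated walking speed: {speed:.2f} m/s")
```

For the pass above the fit gives roughly 0.8 m/s; in a real deployment the sensor layout, pass segmentation, and validation against gait-mat measurements would all need to be established first.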
The applicability of each of these types of data typically relates to the maturity
of the sensor technology. For example, a pulsed ultra-wideband location sensor
could provide data in the form of the time-difference-of-arrival measures, or
could provide an estimate of the individual’s location, based on a sophisticated
algorithm that integrates data from multiple nodes. If the sensor were
well-tested and its behavior in home environments well understood, then
the expense of storing the raw data might not be warranted. Instead, the
location estimates would be sufficient. Thus, as purposed sensors become more
common, what was once derived data will become raw data, which in turn will
be used to develop more sophisticated and complex measures.
Managing Large-Scale Studies
Conducting a large-scale, in-home trial differs from conducting a small pilot
study as well as from conducting standard clinical trials. First, in large studies,
the recruitment of subjects is often done across multiple sites in different cities
or states, which raises issues of staff training, forming relationships with local
infrastructure providers at each location, dealing with regional differences in
attitudes towards the technologies, and cross-site coordination. Furthermore,
unlike standard clinical trials, in-home studies require central tracking and
remote systems management capabilities. Second, the innumerable possible
variations on home configurations and environmental conditions mean that
in-home technology must be adaptable to many environments, which is not
always straightforward. Third, short installation time and high equipment
reliability become essential in a large study. Fourth, because in-home
technologies can potentially produce a large amount of raw data, centralized
data management and a means of visualizing and reviewing the data are
essential as well.
In our studies, the data and the remote installations are managed by using
the ORCATECH Management Console (OMC), a multi-tiered remote
monitoring application built from Open Source tools. This software tool
was designed to support the activities of both clinical and technical research
assistants. OMC provides a central view into the data and has the following
capabilities:
• View subject and home information (floor plans, sensor placements, demographic info, etc.).
• Track recruitment and subject status, including generating alerts for upcoming scheduled visits for assessments or equipment installation.
• Track contact with subjects, for assessing total cost of installations (for example, time and frequency of support phone calls and visits).
• Roll up the health of the installed systems (and raise alerts if sensors fail to fire, or data fails to transfer).
• Provide summary views of the data to enable data validation on an ongoing basis. Figure 3 shows an example in which computer use is displayed for the two residents in a home. In this graph, we also plot in-home activity, since we should not expect any computer use if nobody is home.
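A validation check of this kind could be automated along the following lines. This is a hypothetical sketch — the data layout and function name are ours, not the OMC's — flagging days with computer logins but no in-home activity, which would suggest a clock, attribution, or sensor problem rather than real use.

```python
# Hypothetical sketch of a cross-sensor validation check: computer use
# recorded on a day with no in-home activity is suspect.
def suspicious_days(logins_per_day, activity_per_day):
    """Both arguments: dict mapping ISO date -> count for one home.
    Returns dates with logins but zero in-home activity events."""
    return sorted(
        day for day, logins in logins_per_day.items()
        if logins > 0 and activity_per_day.get(day, 0) == 0
    )

logins = {"2009-07-01": 3, "2009-07-02": 0, "2009-07-03": 2}
activity = {"2009-07-01": 412, "2009-07-02": 380, "2009-07-03": 0}
print(suspicious_days(logins, activity))  # → ['2009-07-03']
```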
[Figure: OMC graph view for one home over July 2009, plotting daily computer logins for the two residents against in-home activity (X10 event counts)]
Figure 3: Checking Data Integrity with the ORCATECH Management Console
Source: OHSU
In the following sections we discuss the core issues this study management tool
was designed to support — and those issues that still need to be addressed to
make this tool ultimately scalable to studies that include thousands of subjects.
Cohort Management
One of the greatest challenges for any large-scale study is the recruitment and
retention of the participant cohort. The challenges are exacerbated when the
participant population is asked to use or interact with unfamiliar technology.
In aging studies, many participants have never used a computer or cell phone
and are highly intimidated by the thought of learning to do so. In one of our
longitudinal studies, we require the participants to use a personal computer
on a weekly basis. Approximately 32 percent of the participants we recruited
already owned their own computer. The majority of the remainder, 169
subjects, were provided with a computer and trained to use it. A significant
effort was spent in developing a training program that included skills ranging
from the use of the mouse and how to double-click, to sending and receiving
e-mail (a definition of computer literacy). The training program comprised
six hour-long lessons that were taught to small groups. Clearly, this is not an
approach that scales to tens of thousands of subjects. However, many computer
classes for seniors are now available through libraries and senior centers, and
through partnerships with such programs, it is possible that large-scale studies
requiring frequent computer use may be viable even in studies of older adults,
such as our studies (where the mean age is over 80 years old).
Many of the technologies available commercially, such as cell phones and mice,
are difficult to use for older adults. For an individual with a tremor, pressing
a single key on a cell phone or moving a mouse to a target on a screen may
be literally impossible to do with accuracy. For many elderly, the text on a
cell phone is too small to see, even with glasses. Recent research has provided
considerable insight into the usability needs of the aging population [2, 3, 15,
21], and large-scale studies must be prepared to adapt their technologies to
accommodate the wide variety of needs in this population.
This difficulty with the technology translates into a retention problem for
research studies: participants that become frustrated are more likely to drop
out of the study. Therefore, in technology-based studies, even more than in
traditional clinical studies, it is vital to maintain active personal relationships
with the participants. In our own studies, this is done by maintaining
continuity of contact as much as possible. Each participant has an assigned
research assistant that they know well and with whom they are used to
interacting. In addition, we maintain a help-line for the participants to call
with any questions they may have about the technology. By knowing that they
will receive a response about their problem within a day, and that somebody
wants to help them through it, the participants are able to give voice to their
frustrations, which is often all that is needed to defuse the situation. Of course,
not all participants are frustrated by the technology.
In aging studies, transitions are part of life. Many participants will move to
smaller homes, to assisted living, or they will move in with family over the
course of a study. Even more participants will become ill, have an exacerbation
of a chronic disease, or undergo surgery. In a study of behaviors or health
outcomes, these transitions will have a significant impact on the data. It is
important to track these changes, and in some cases to collect additional
data that will help in the interpretation of changes in the data that result
from the life transition. For example, in our Intelligent Systems for Assessing
Aging Changes (ISAAC) project, we are interested in changes in walking
speed over time. In this project, 6.8 percent of our participants have moved
in the last eight months. Often, their new homes are smaller, and therefore
the participants do not need to move around as much to go from room to
room. Alternatively (or in addition), the participants may have made the
change because of changing health needs that themselves impact their daily
activity. Therefore, in order to understand how the home move impacts our
measurements of activity, it is important to study the time periods around
these transitions carefully, and to understand what health changes may have
happened at the same time. To help us understand these data, we ask our
participants to complete weekly online questionnaires about any health
changes, medication changes, or emergency room visits that may have occurred
in the previous week.
One final area of cohort management is important to consider. Many
technology-based studies place expensive equipment into the homes of the
participants. Depending on the neighborhood, this technology can make
the participant a target for break-in and theft. Furthermore, in studies with
cognitively impaired subjects, quite a few of the subjects will forget what the
technology is for or even that it is in their home. In these cases, participants
may pack up the equipment or even give it away. While these situations are
rare, their resolution takes time and money.
Technology Management
There are three keys to successful technology management in a large-scale,
in‑home study: a stream-lined and well-supported installation process,
continuous remote tracking of system status, and the ability to do remote
management and maintenance of installed technologies. Our approach in each
of these areas has changed over the years, as we have partnered with study sites
around the country and have learned many lessons.
The installation of in-home technology often involves sensors and devices that
are unfamiliar to many study assistants. Even setting up a computer may be
outside the experience of the typical research assistant (RA). Therefore, the
technology installation must be both orderly and as idiot-proof as possible. In
addition, a live help line is essential to provide support for inexperienced RAs.
For example, in one study in which medication tracking devices and computer
kiosks for administering monthly or quarterly neuropsychological evaluations
were to be installed in 200 homes in 25 cities around the country, we created
a laminated step-by-step installation instruction sheet that was packed in the
box with each technology set. All connectors between the equipment were
color-coded, and the instructions included photographs of how those color-coded connectors should be connected. A set of pre-install steps was clearly
specified, including forming a working relationship with the local broadband
provider for billing and technical support prior to commencing installs. Study
coordinators from all sites converged locally for a one-day training session in
which we attempted to cover possible problems with the installation. In spite
of these preparations, the help line assistant received 55 calls during the first
57 installations. While the need for support tapered off sharply by the time
each site had installed two systems, about two-thirds of sites still called for
additional assistance at least once in the first year. Clearly, an effective help line
with well-trained staff is one of the most essential components of ensuring a
successful multi-site, large-scale, study deployment.
The continuous monitoring of equipment status is another important
component of ensuring good data collection in a large-scale study. In our
studies we do this through the ORCATECH Management Console, which
queries the status of numerous system variables in each home once per day.
Figure 4 shows an example in which an alert screen provides visual indicators
of the status of the sensors. The system checks for problem indicators daily and
displays them as a red bar. Technicians can then reclassify the problems when
they have determined the cause. Colors reflect the problem status. Multiple
days can be displayed to assist in tracking problem status over time.
[Figure: OMC alert screen showing a grid of homes and apartments with colored status cells; daily problem indicators appear as red bars that technicians reclassify once the cause is diagnosed]
Figure 4: Screenshot of the ORCATECH Management Console Alert Screen
Source: OHSU
A great deal of system problem triaging can be done programmatically.
Communication status between the routers and the in-home computers, the
status of services running on the sensor computers in the home, remaining
device battery life, failure to receive sensor heartbeats, sensor data values out
of expected range, and even failure of the participant to complete a weekly
form, can all be flagged as possible issues and brought to the attention of the
study staff. Serious problems, which may result in the loss of data, can then
be identified quickly and addressed immediately. By tracking the time taken
to triage and fix problems, we are able to assess the overall quality of the study
support.
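A sketch of such programmatic triage might look like the following; the field names and thresholds are hypothetical, not the OMC's actual schema.

```python
# Hypothetical triage rules over a daily per-sensor status snapshot.
def triage(snapshot, now):
    """snapshot: list of per-sensor status dicts polled once per day.
    Returns (sensor_id, problem) pairs to bring to the staff's attention."""
    alerts = []
    for s in snapshot:
        if now - s["last_heartbeat"] > 36 * 3600:      # silent > 1.5 days
            alerts.append((s["id"], "missed heartbeat"))
        if s["battery_pct"] < 15:
            alerts.append((s["id"], "low battery"))
        lo, hi = s["expected_range"]
        if not lo <= s["last_value"] <= hi:
            alerts.append((s["id"], "value out of range"))
    return alerts

now = 1_250_000_000  # example poll time, seconds since epoch
snapshot = [
    {"id": "pir-12", "last_heartbeat": now - 2 * 86400, "battery_pct": 80,
     "last_value": 1, "expected_range": (0, 1)},
    {"id": "load-3", "last_heartbeat": now - 3600, "battery_pct": 9,
     "last_value": 655.0, "expected_range": (0.0, 1200.0)},
]
problems = triage(snapshot, now)
for sensor_id, problem in problems:
    print(sensor_id, problem)
```

Here pir-12 is flagged for a missed heartbeat and load-3 for a low battery; a serious problem surfaces the day it appears rather than at the next data review.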
Finally, remote maintenance of installed technologies is an extremely important
component of conducting in-home trials, both scientifically and financially.
Scientifically, it is important not to visit the participants more often
than the study protocol requires, because that contact may itself influence
the outcome of the study. For example, in one of our studies we are interested
in changes in walking speed and motor activity over time. Each time we visit
a participant’s home to repair activity sensors, they become aware of their
own activity and may subtly alter their behavior for a short period of time as
a result. Furthermore, frequent visits to the participants can be an irritant to
them, which puts study retention at risk. Financially, it is expensive to travel
to participants’ homes to make repairs. Clearly, some repairs require in-person
visits (replacing batteries or broken equipment), but many things can be fixed
remotely (restarting services, scheduling tasks, pushing new software versions,
helping participants with computer programs, etc.). Wake-on-LAN capability
has proven to be an extremely valuable feature of the in-home computer. In our
Figures 5 and 6 show the repair history for one of our studies, the ISAAC
project [9]. In this project participants have passive infrared sensors distributed
throughout their home to detect activity and walking speed and a personal
computer that we use to track inter-keystroke intervals (a surrogate for finger
tapping speed) and their interaction with a series of games. This study has two
technical RAs who are responsible for supporting the 248 participants and 190
homes in the study. In Figure 5, you can see the clear pattern of install/bug
fix that we engaged in during ramp-up of the study. Once the project reached
steady state, the number of fixes trended down, reflecting the fact that issues
that spanned multiple homes had been identified and addressed. At this point
in the project, most fixes are maintenance issues, such as battery changes and
issues with broadband.
200
Total number of
Installed homes
In-home fixes
Phone support
Remote fixes
Number of installs/fixes
180
160
140
120
100
80
60
40
20
“Once the project reached steady state,
the number of fixes trended down.”
0.13
Average number of fixes per month per home
studies, we used a laptop to collect sensor data that did not have wake-on-LAN
capability in its wireless LAN card. As a result, in those homes in which we
used 802.11g to connect to the router, we were unable to remotely boot many
of the sensor computers that went off-line.
In-home fixes
Phone support
Remote fixes
0.11
0.09
0.07
0.05
0
09
9
9
l–0
Ju
y–
Ma
09
08
08
8
8
08
r–0
Ma
n–
Ja
v–
No
p–
Se
l–0
Ju
y–
Ma
r–0
Ma
07
07
7
07
08
n–
Ja
v–
No
p–
Se
l–0
Ju
y–
Ma
0.03
Router
Sensors
Sensor PC
User PC
Figure 5: Patterns of Installation and Fixes on the ISAAC Project
Figure 6: Types of Fixes for Each of the Four
Source: OHSU
Major Technologies Installed in the ISAAC Project
Source: OHSU
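The Wake-on-LAN capability described above works by broadcasting a "magic packet" (six bytes of 0xFF followed by the target's MAC address repeated 16 times) on the local network. The sketch below illustrates the idea in plain Python; the MAC address is hypothetical, and this is not the tooling used in the study.

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by
    the target MAC address repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must contain exactly 6 octets")
    return b"\xff" * 6 + mac_bytes * 16

def send_magic_packet(mac: str, broadcast: str = "255.255.255.255",
                      port: int = 9) -> None:
    """Broadcast the magic packet as UDP; the sleeping machine's NIC
    recognizes the pattern and powers the machine on."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(build_magic_packet(mac), (broadcast, port))
```

Note that this only works when the network card itself supports wake-on-LAN, which is exactly the limitation we hit with the laptops' 802.11g wireless cards.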
Figure 6 shows, for each of the main problem areas, the ratio of in-home visits
to support calls with the participant to remote fixes. Because participants in this study
are required to use their computer weekly, we have put into place a help line
(e-mail and phone) that they can call for support. This is reflected in the large
number of remote fixes done on the PC, which often take the form of
undeleting a subject’s file that they cannot seem to locate, restoring e-mail
settings that they have changed by mistake, or deleting spyware that was
inadvertently downloaded by their grandchildren. In spite of our best efforts to
provide FAQs, handouts, and in-person training on basic computer use issues,
many relatively simple tasks remain a challenge to the individuals who have
never used computers prior to our studies.
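The inter-keystroke-interval measure mentioned earlier can be computed directly from the key-press timestamps the in-home PC records. A minimal sketch follows (the input format is an assumption; this is not the study's actual analysis code):

```python
from statistics import median

def inter_keystroke_intervals(timestamps):
    """Time gaps between successive key presses, given timestamps
    in seconds."""
    ts = sorted(timestamps)
    return [later - earlier for earlier, later in zip(ts, ts[1:])]

def median_iki(timestamps):
    """Median inter-keystroke interval, a surrogate for finger-tapping
    speed; None if fewer than two key presses were observed."""
    intervals = inter_keystroke_intervals(timestamps)
    return median(intervals) if intervals else None
```

Tracking a robust summary statistic such as the median, rather than raw intervals, makes the measure less sensitive to pauses while the participant thinks or reads.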
Surprisingly, most computer problems unrelated to participant use have
required on-site visits. This is a major barrier to scalability at this time. In
part, this reflects the limited capabilities of the in-home systems used in this
study. We chose inexpensive laptop computers running Windows* XP* for
our study, in part because of the need to keep costs under control, and in part
because some of the commercial sensor software we were using runs only under
Windows. However, these laptops turned out to have unreliable network cards
as well as some obscure device driver issues. Furthermore, at the time the study
started, Intel® vPro™ technology was not yet available. Had we been able to
use such technology, we would have been able to remotely diagnose a number
of sensor computer issues rather than visit the home. Overall, our considered
opinion at this time is that a Linux*-based system would perform more reliably
and require significantly less memory.
Data Management
The management of the data from in-home technology studies raises some
unique issues. Of primary concern is the security of the data and the privacy
of the participants. At all levels of data management (acquisition, transmission
to a central site, ongoing storage, and retrieval for analysis), the data must
be protected. Many organizations, such as the Veterans Administration,
have highly regulated restrictions on where the data may be stored, when
they need to be encrypted, and who may access them. However, even in
research organizations with fewer regulations, the data must be appropriately
anonymized to protect the subjects’ privacy, and they must be stored securely.
Data acquisition poses interesting challenges to maintaining this privacy, since
often the data are not encrypted before being sent to a local receiving device.
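One common way to anonymize subject identifiers before data leave the home is keyed hashing, so that the pseudonym is stable across transmissions but cannot be reversed without the secret key held by the research site. The following is an illustrative sketch, not the scheme used in these studies:

```python
import hashlib
import hmac

def pseudonymize(subject_id: str, secret_key: bytes) -> str:
    """Replace a subject identifier with a stable pseudonym using
    HMAC-SHA256; without secret_key the mapping cannot be inverted,
    but the same subject always maps to the same pseudonym."""
    digest = hmac.new(secret_key, subject_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]
```

Because the same key always yields the same pseudonym, longitudinal records can still be linked for analysis without ever storing the real identifier alongside the sensor data.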
As described in the section on the Nature of the Data, raw data, metadata, and
derived data all have different properties. As a result, they also have different
storage needs. Large-scale, longitudinal studies that follow hundreds or
thousands of subjects over years need to consider options for data compression,
data storage, and data retrieval. As derived measures are developed, are the raw
data still needed? Is it appropriate to archive these data, or does it make more
sense to create a hierarchical storage model in which older data are downsampled or summarized over longer time periods? Do metadata need to be
kept beyond the period in which they are used to address technology issues? If
they are discarded, what historical data should be kept to allow for evaluation
of the study’s efficiency? As derived measures evolve over time, how should
the versions be documented and stored? For large datasets, where processing
the data to derive the measures may take hours or days, it may make the most
sense to store all of the derived datasets. In other cases, where storage is at a
premium, it may make more sense to store the algorithms used to derive the
measures. While none of these issues are unsolvable, some choices may be
irreversible. As a result, it is important to have a clear data management policy
in place at the start of any large, in-home study.
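The hierarchical storage model mentioned above can be made concrete with a small sketch (the data shapes are assumptions, not the studies' actual pipeline): raw sensor events older than some cutoff are reduced to hourly counts, which an older-data tier can keep after the full-resolution records are archived or discarded.

```python
from collections import Counter
from datetime import datetime

def summarize_hourly(events):
    """Reduce raw events, given as (timestamp, sensor_id) tuples, to
    counts per (hour, sensor) pair, the kind of down-sampled summary
    a hierarchical storage tier might retain for older data."""
    counts = Counter()
    for ts, sensor_id in events:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        counts[(hour, sensor_id)] += 1
    return dict(counts)
```

The tradeoff is exactly the one raised above: once the raw events are gone, measures that need sub-hour timing (walking speed, for instance) can no longer be re-derived, which is why the downsampling policy must be fixed before data are discarded.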
Scalability Issues
There are many issues that need to be addressed if large-scale deployments of
in-home research studies are to be successful. One issue centers on the diversity
of technology experience and acceptance in the aging population. A second
issue is the lack of available tools for interpreting the complex data that result
from such studies, and the need to enable data-sharing through visualization
tools. However, the largest scalability issue is the lack of technology
infrastructure, specifically focused on supporting large-scale research studies.
We discuss these issues in more detail next.
Managing Diverse Populations
Differences in living environments, in attitudes towards technology, and
even in the relative extent of in-home versus out-of-home activity across
different groups may require modifications in the platform and approach for
continuous in-home assessment. For example, a recent ORCATECH study
[1] suggested that rural seniors may have several technological challenges.
Of 128 seniors aged 80 or older who were surveyed, only 5 percent reported
using a computer frequently, or sending or receiving e-mail daily. This cohort
is a good example of the kind of research and environmental challenges that
could be encountered during wider deployment in terms of subject training,
and in terms of collecting and transferring data. Another group that may
provide unique challenges is low-socioeconomic-status (SES) elderly adults. SES is a
combined measure of an individual’s economic and social position relative to
others. SES is often assessed based on a method proposed by Hollingshead,
which considers level of education and occupation. Krousel-Wood [11] found
that level of education adversely affected patients’ willingness to participate in
a telemedicine study. In addition, due to economic constraints, this cohort is
less likely to use a computer on a regular basis. Therefore, a particular challenge
in this group could be recruiting and training on computer use. Finally,
cognitively impaired elders are often less able to interact with technology.
Another interesting issue that arises when a study is scaled nationally is
regional differences in home construction. For example, in our national
studies using in-home technologies, we have found that the wiring in many
of the older homes in New York City results in high data loss with DSL
broadband. For commercial applications, such issues may mean loss of market
opportunity and disappointment in some populations that the technologies
do not work in their home. For a research study, such differences could result
in loss of access to a key demographic or cohort. For example, participants
with low socioeconomic status often live in older homes. This is a population
notoriously underserved by the research community because they are less likely
to volunteer for studies and must therefore be actively recruited. However,
these individuals may benefit greatly from aging-in-place technologies, and
therefore they are an important cohort for inclusion in research studies.
The Need for Visual Analytics
As mentioned previously, sensor-based, in-home studies generate large amounts
of complex data. It is well accepted that sharing research data increases
collaboration, increases research activity, and reduces the cost of research [10].
However, in the field of unobtrusive in-home monitoring, there has been no
coordinated effort to share data from past and ongoing studies. Even if the data
were available, analysis of those data requires significant domain knowledge and
an understanding of the sensors, their deployment, and the clinical and sensor
datasets that have been collected during the study. Therefore, there is a clear
need for extensible, modular tools that allow researchers to focus on the data,
without being mired in the details of the sensor technologies.
Specifically, new data analytic tools must be created. Data analytics is the use of
visualization and descriptive statistical techniques for initial data exploration.
Visualization is the display of multiple dimensions of a dataset graphically,
grouped by some feature or category of interest. Visualization is meant to
take advantage of the human capacity to process visual information quickly
and efficiently. In the case of complex data such as in-home monitoring
data, visualization tools must allow exploration of the spatial and temporal
properties of the sensor data, as well as fuse information from multiple sensor
types to examine interactions between different behavioral measures. These
tools will enable collaboration between researchers already focused on this
emerging paradigm, and they will facilitate entry into this area of research
by investigators who may lack the resources to undertake the necessary data
collection, the access to key patient populations, or both, thereby increasing the
opportunities for discovery and formation of new hypotheses.
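As one concrete illustration of the fusion step such tools need, streams from different sensor types, which arrive with unrelated timestamps, must first be aligned onto a common time grid before they can be displayed together. The sketch below uses assumed data shapes and is not an existing tool:

```python
from datetime import datetime

def fuse_hourly(streams):
    """streams: dict mapping a measure name (e.g. 'motion',
    'keystrokes') to a list of event timestamps. Returns a dict
    {hour: {measure: count, ...}} so that several behavioral measures
    can be examined on one shared time axis."""
    fused = {}
    for measure, timestamps in streams.items():
        for ts in timestamps:
            hour = ts.replace(minute=0, second=0, microsecond=0)
            fused.setdefault(hour, {m: 0 for m in streams})[measure] += 1
    return fused
```

A visualization front end can then plot each measure of the fused grid as a separate series, letting a researcher eyeball interactions between, say, nighttime motion and next-day computer use without touching the raw sensor formats.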
The Need for Global Technology Solutions
One of the primary barriers to wide-scale deployment of in-home, technology-based research studies is the lack of universally available and supported Internet
infrastructure. In the one multi-state study in which we have used in-home
technologies, more than 50 percent of technology issues were a result of
failed broadband, ranging from a poor signal due to old wiring in the home,
to difficulty on the part of the Internet service provider (ISP) in getting the
broadband up and working in the home, and to the ISP discontinuing service
inadvertently or because of billing errors on their part. There are no low-cost
broadband options available universally across the country, and multi-site
studies therefore require establishing a separate ISP relationship and billing
agreement with each city, and sometimes each county, where the study is to be
conducted. This is a significant administrative barrier to large-scale deployment
for research.
Another significant barrier to wide-scale, in-home research studies is the lack of
a platform supporting purposed sensors designed for gathering behavioral and
physiological data in the home. In our current studies, we have been forced to
repurpose sensors developed for inventory tracking and security applications for
our activity monitoring. In those cases where we have developed a new sensor
(for example, with medication tracking [7]), it was done with a particular
162 | Gathering the Evidence: Supporting Large-Scale Research Deployments
Intel® Technology Journal | Volume 13, Issue 3, 2009
project in mind and did not grow out of a principled architecture intended
to support the growing needs for different types of sensors. Ultimately, what
is needed is the integration of sensors designed to optimize the collection of
specific behavioral and physiological data with a platform built on an open,
standards-based architecture. It is our hope that such an architecture may
ultimately be derived from the combined efforts of organizations such as
Continua Alliance, industry leaders promoting interoperable standards for
personal health; Integrating the Healthcare Enterprise (IHE), an initiative
sponsored by professional societies to provide a common framework for
multi-vendor system integration; and the International Organization for
Standardization (ISO), which has specified numerous standards for health
devices and is one of the sponsoring organizations of Health Level 7 (HL7).
The research community has an additional need: standards-based data sharing
and storage models, such as those developed by the Clinical Data Interchange
Standards Consortium (CDISC) that have been endorsed by the FDA for use
with clinical drug trial data. With the advent of large-scale, in-home studies
that use technology to aid in the assessment and detection of health and
behavior changes, there will be increasing amounts of temporal and spatial
sensor data, as well as health outcome data, to be gathered, stored, analyzed,
and archived for efficient later retrieval. Sharing of these research data would
allow investigators to develop new algorithms for deriving health status and
interpreting behavior from these data, without the considerable costs and
resources that would be incurred if each investigator had to collect his or her
own data. Data sharing is most effective when a standard data format and set of
tools are available to facilitate data sharing. Unfortunately, current, continuous
monitoring studies all use different methodologies, and thus far, none have
made their data widely available to the research community for further
analysis of health outcomes. No tools currently exist to facilitate the sharing of
continuous monitoring data. Thus, the development of a common architecture
for sharing and visualizing sensor-based study data is another key requirement
for true scalability.
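To make the data-sharing point concrete, a shared sensor-data archive would need, at minimum, a self-describing record format that tools from different groups can parse without study-specific code. The following is an illustrative sketch only; CDISC and HL7 define the real standards in this space:

```python
import json
from datetime import datetime, timezone

def to_shared_record(study_id, home_id, sensor_type, event_time, value,
                     schema_version="0.1"):
    """Serialize one sensor event as a self-describing JSON record.
    Field names and the schema_version value are hypothetical."""
    record = {
        "schema_version": schema_version,
        "study_id": study_id,
        "home_id": home_id,
        "sensor_type": sensor_type,
        "event_time": event_time.isoformat(),
        "value": value,
    }
    return json.dumps(record, sort_keys=True)
```

Carrying an explicit schema version in every record is what allows derived measures to be recomputed years later, even after the format has evolved, which is one of the versioning questions raised in the Data Management section above.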
Conclusions
There is tremendous hope that independent living technologies may lead to a
reduction in the cost of healthcare delivery in this country. These technologies
have the potential not only to support aging-in-place, but to enhance our
ability to assess and remediate health changes in the elderly. Although a
number of mid-sized research studies using such technologies are underway,
there are still significant barriers to their deployment in large-scale clinical
trials. However, we are beginning to understand these barriers and to find
ways to overcome them, which will some day soon lead to a significant
evolution in the practice of clinical research.
Vignette
“I never miss a pill.”
After taking care of herself for 23 years since her husband died, 85-year-old Ana is pretty sure she has things under control. Although she takes five
medications each day, three in the morning and two at night, she maintains
strict control of her medication regimen and is able to cheerfully confirm that
she has taken her meds when she talks to her daughter in their nightly phone
call. When she enrolled in our medication adherence study, we asked her to add
a low-dose vitamin C to her regimen once in the morning and once at night
when she took her regular meds. She put these into our Medtracker pillbox
and agreed to take them twice a day. Ana scored very well on her cognitive test,
achieving a score that was a little lower than some but still considered normal
for her age.
After a couple of days, we noticed that she had stopped taking the vitamins and
gave her a call to remind her to take them. She was adamant that she’d been
taking them when she took her own meds, and so we went out to her house to
replace the Medtracker, which clearly wasn’t working properly. However, when
we got there, we found that the pillbox was still full of pills. Ana was sure she
had taken them, and thought that somebody else must have put more pills into
her box. She wanted to continue in the study so we reviewed how to fill up the
pillbox each week. After five weeks in the study, Ana’s overall adherence was 38
percent. To this day she maintains that she “only missed a couple of pills.”
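An overall adherence figure like Ana's 38 percent can be derived from the pillbox's event log roughly as follows. This is a sketch with an assumed log format, not the MedTracker's actual algorithm: each scheduled dose counts as taken if an opening event falls within a time window around it.

```python
from datetime import datetime, timedelta

def adherence_percent(scheduled, openings, window=timedelta(hours=2)):
    """Percentage of scheduled dose times for which a pillbox-opening
    event was recorded within +/- window of the dose time."""
    if not scheduled:
        return 0.0
    taken = sum(
        1 for dose in scheduled
        if any(abs(opening - dose) <= window for opening in openings)
    )
    return 100.0 * taken / len(scheduled)
```

The window accounts for the fact that people rarely take medications at the exact scheduled minute; widening it trades false negatives for false positives in the adherence estimate.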
References
[1] Calvert, J., Kaye, J., Leahy, M., Hexem, K., and Carlson, N. 2009. "Technology use by rural and urban oldest old." Journal of Medicine and Technology. In press.
[2] Chadwick-Dias, A., McNulty, M., and Tullis, T. 2003. "Web usability and age: How design changes can improve performance." In Proceedings of the 2003 Conference on Universal Usability, Vancouver, B.C., Canada, pp. 30-37.
[3] Chadwick-Dias, A., Tedesco, D., and Tullis, T. 2004. "Older adults and web usability: Is web experience the same as web expertise?" In Aging by Design Workshop, Bentley, MA.
[4] Centers for Medicaid and Medicare Services. 2005. National Health Expenditures: web tables. Available at www.cms.hhs.gov.
[5] Erickson, P., Wilson, R., and Shannon, I. 1995. Years of Healthy Life. US Department of Health and Human Services, CDC, National Center for Health Statistics, Statistical Notes 7, Hyattsville, Maryland.
[6] Gist, Y. J. and Hetzel, L. I. 2004. We the People: Aging in the United States (CENSR-19). Available at www.census.gov.
[7] Hayes, T. L., Hunt, J. M., Adami, A., and Kaye, J. 2006. "An electronic pillbox for continuous monitoring of medication adherence." In 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY.
[8] Iglehart, J. K. 2008. "No place like home—testing a new model of care delivery." New England Journal of Medicine, 359(12), pp. 1200-2.
[9] Kaye, J. A., Hayes, T. L., Zitzelberger, T. A., Yeargers, J., Pavel, M., Jimison, H. B., Larimer, N., Payne-Murphy, J., Earl, E., Wild, K., Boise, L., Williams, D., Lundell, J., and Dishman, E. 2008. "Deploying wide-scale in-home assessment technology." In Technology and Aging, vol. 21 of Assistive Technology Research Series. A. Mihailidis, J. Boger, H. Kautz, and L. Normie, Eds. IOS Press, pp. 19-26.
[10] Koslow, S. H. 2005. "Discovery and integrative neuroscience." Clinical EEG and Neuroscience, 36(2), pp. 55-63.
[11] Krousel-Wood, M. A., Re, R. N., Abdoh, A., Chambers, R., Altobello, C., Ginther, B., Bradford, D., and Kleit, A. 2001. "The effect of education on patients' willingness to participate in a telemedicine study." Journal of Telemedicine and Telecare, 7(5), pp. 281-7.
[12] Li, S., Aggen, S. H., Nesselroade, J. R., and Baltes, P. B. 2001. "Short-term fluctuations in elderly people's sensorimotor functioning predict text and spatial memory performance: The MacArthur Successful Aging Studies." Gerontology, 47(2), pp. 100-16.
[13] Logan, B., Healey, J., Philipose, M., Tapia, E. M., and Intille, S. 2007. "A long-term evaluation of sensing modalities for activity recognition." In Lecture Notes in Computer Science, vol. 4717. Innsbruck, Austria: Springer Verlag, pp. 483-500.
[14] Mitka, M. 2009. "Studies comparing treatments ramp up." Journal of the American Medical Association, 301(19), p. 1975.
[15] Murata, A., Nakamura, H., and Okada, Y. 2002. "Comparison of efficiency in key entry among young, middle and elderly age groups—effects of aging and character size of a keyboard on work efficiency in an entry task." In 2002 IEEE International Conference on Systems, Man and Cybernetics, Oct. 6-9, 2002, Yasmine Hammamet, Tunisia, 2, pp. 96-101.
[16] Salthouse, T. A., Nesselroade, J. R., and Berish, D. E. 2006. "Short-term variability in cognitive performance and the calibration of longitudinal change." Journal of Gerontology Series B: Psychological Sciences and Social Sciences, 61(3), pp. 144-151.
[17] Stone, A. A., Pennebaker, J. W., Keefe, F. J., and Barsky, A. J. 2000. "Part VII: Self-Report of Physical Symptoms." In The Science of Self-Report: Implications for Research and Practice. A. A. Stone, Ed. Mahwah, NJ: Lawrence Erlbaum Associates, pp. 297-362.
[18] Takeda, A., Loveman, E., Clegg, A., Kirby, J., Picot, J., Payne, E., and Green, C. 2006. "A systematic review of the clinical effectiveness of Donepezil, Rivastigmine and Galantamine on cognition, quality of life and adverse events in Alzheimer's disease." International Journal of Geriatric Psychiatry, 21(1), pp. 17-28.
[19] Tang, P. C. and Lee, T. H. 2009. "Your doctor's office or the Internet? Two paths to personal health records." New England Journal of Medicine, 360(13), pp. 1276-8.
[20] Tourangeau, R. 2000. "Remembering What Happened: Memory Errors and Survey Reports." In The Science of Self-Report: Implications for Research and Practice. A. A. Stone, Ed. Mahwah, NJ: Lawrence Erlbaum Associates, pp. 29-48.
[21] Ziefle, M. and Bay, S. 2005. "How older adults meet complexity: Aging effects on the usability of different mobile phones." Behaviour and Information Technology, 24(5), pp. 375-389.
Acknowledgements
The research described in this article was funded through grants from the
National Institutes of Health (NIA grants P30AG024978, P30AG08017, and
R01AG024059), by Intel Corporation, and by the Kinetics Foundation.
Author Biographies
Tamara Hayes, PhD, is Assistant Professor and Associate Division Head of
Biomedical Engineering at Oregon Health and Science University (OHSU).
She received her MS degree in Electrical Engineering at the University of
Toronto and her PhD in Behavioral Neuroscience from the University of
Pittsburgh. She has worked in both industry and academia, on projects
ranging from creating tools for remote management of distributed systems to
delivering teledermatology care to rural Oregon. Dr. Hayes’ current research
interests include the use of technology to deliver healthcare in the home, with
the goal of changing the current paradigm of clinic-centered healthcare to a
model that is less costly, more effective, and allows individuals to participate
more fully in their own healthcare. This research entails the use of low-cost,
unobtrusive sensors in the home for collecting behavioral data related to acute
and chronic motor and cognitive changes, and meaningful analysis of these
data to assist and inform the patient.
166 | Gathering the Evidence: Supporting Large-Scale Research Deployments
Intel® Technology Journal | Volume 13, Issue 3, 2009
Misha Pavel, PhD, is Division Head and Professor of Biomedical Engineering
at Oregon Health and Science University (OHSU). He is also Director of
the Point of Care Laboratory, which focuses on unobtrusive monitoring
and neurobehavioral assessment and modeling. He received his PhD in
Experimental Psychology from New York University and his MS degree in
Electrical Engineering from Stanford University. His research interests include
analysis and modeling of complex behaviors of biological systems, including
perceptual and cognitive information processing, pattern recognition,
information fusion, and decision making in healthy and cognitively impaired
individuals.
Jeffrey A. Kaye, M.D., is a Professor of Neurology and Biomedical
Engineering and the Director of the Layton Aging and Alzheimer’s Disease
Center, as well as Director of the Oregon Center for Aging and Technology
(ORCATECH) at Oregon Health and Sciences University. Dr. Kaye received
his medical degree from New York Medical College and trained in neurology
at Boston University. His Alzheimer’s research is focused on understanding
differing rates of progression and cognitive decline as compared to optimal
cognitive health in the elderly. Through his work with ORCATECH, he seeks
to facilitate successful aging and reduce the cost of healthcare through the use
of in-home technologies for assessment and remediation.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Assistive Technology for Reading
Contributors
Selena Chan
Intel Corporation
Ben Foss
Intel Corporation
David Poisner
Intel Corporation
Index Words
Assistive Technology
Independent Living
Dyslexia
Visual Impairment
Blindness
Specific Learning Disability
Abstract
Over 82 million people in the developed world face challenges each day when
attempting to read a variety of printed materials. The challenges may be due
to blindness, partial vision loss, or reading-based learning disabilities, such as
dyslexia. Regardless of the cause, people of all ages suffer a huge loss in their
independence and productivity when they must rely on others to read material
on their behalf or struggle to read at a rate so slow that they become frustrated
and discouraged. This article focuses on the design and technical challenges of
addressing this user need and the more general complexities of designing an
independent-living product for a user group that ranges from teens to seniors
and from those who are conversant with technology to those who downright
fear it.
The Intel® Reader is the output of years of research into end-users who
experience difficulty reading text for a variety of reasons. The Intel Reader is
a mobile device that combines a high-resolution camera with an Intel Atom™
processor to take pictures of text and read that text aloud to the user on the
spot.
Designing the Intel Reader presented challenges in many areas: user interface,
mechanical and electrical, and usage flows and specification of the system-level
requirements. In this article, we first review the diverse end-user population
for this technology, and we identify unique subpopulation requirements that
were distilled into a set of common parameters. We explore many of the
design challenges, tradeoffs, and unexpected discoveries that led us to the final
definition for the Intel Reader.
Introduction
168 | Assistive Technology for Reading
Printed text is a default form of communication in the developed world.
Beginning in early elementary school, reading is a central component of
education and is the foundation for building independent, analytical skills. As
students grow and begin to enter the workforce, the ability to read memos,
purchase orders, and industry-related articles is critical to promotions and
successful performance reviews. As the working adult matures, the prevalence
of vision impairment increases and poses a challenge to those who desire to live
an independent lifestyle. And for a large number of people, reading printed
text is a barrier. This group of people includes those with specific learning
disabilities, those with partial vision loss, and those who are completely blind.
Figure 1: The Intel Reader
Source: Intel Corporation, 2009
The Intel Reader is a mobile device that takes pictures of text, translates
the image into digital text by using optical character recognition (OCR)
technology, and reads the text aloud by using text-to-speech (TTS) software.
Figure 1 shows a photograph of the Intel Reader, after many rounds of
prototype design iterations and user-focused group discussions. It is roughly
6.5” by 5.5” by 1.1”, about the size of a paperback book. The Intel Reader
contains a high-resolution camera, designed to capture crisp images of text.
To improve the accuracy of OCR, it is critical that the text images are of high
resolution and as sharp as possible. When designing the user interface for the
Reader, we took into account the findings from our ethnographic research
and we incorporated many human factors into our design to provide print
accessibility to the target audience: we included features such as image rotation
and automatic correction to increase the accuracy of the conversion. The Intel
Reader utilizes the Intel Atom processor and a Linux* operating system.
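The capture, correction, OCR, and text-to-speech flow described above can be sketched as a simple pipeline. Everything here is illustrative: the four callables are stand-ins for the Reader's actual camera, image-correction, OCR, and TTS components, whose interfaces are not public.

```python
def read_aloud(capture, ocr, speak, correct_image=None):
    """Run the Reader's flow: capture an image of text, optionally apply
    rotation/automatic correction, convert the image to digital text via
    OCR, and speak the result aloud. All callables are injected
    hypothetical stand-ins for the real components."""
    image = capture()
    if correct_image is not None:
        image = correct_image(image)
    text = ocr(image)
    speak(text)
    return text
```

Structuring the device this way keeps the correction stage independent of the OCR engine, which matters because, as noted above, OCR accuracy depends heavily on how sharp and well-oriented the captured image is.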
Disabilities and the Challenges of Reading
Reading can be challenging for a variety of reasons ranging from learning
disabilities to partial or complete blindness.
Reading-Based Learning Disability
Reading can be a frustrating experience for people who have specific learning
disabilities. For example, dyslexia is a neurological disability, often characterized
by difficulties with word recognition, word spelling, and word decoding.
One participant in our research described reading as “having a bad cell phone
connection with the paper — words drop out or come through garbled even
though I can see just fine.” Dyslexia, like most learning disabilities, is an
inherited condition, and not the result of lack of effort, sensory impairment,
or of inadequate education. Roughly 10 to 15 percent of any given population
has specific learning disabilities, with 80 percent exhibiting reading-based
learning disabilities consistent with dyslexia. In the United States, there are
55 million students in the public schools, 8 million of whom are students under the age of 18 who have dyslexia. Although
studies may vary, it is estimated that 30 – 45 million people are in this target
population overall in the United States alone.
The majority of students first learn to read by phonemic awareness (how
speech sounds make up words) and by connecting those sounds to letters
of the alphabet (phonics). Next, they learn to blend the sounds into words,
recognize words, and comprehend what they read. People with specific learning
disabilities, however, have difficulty with phonemic awareness and phonics.
Research has shown that dyslexia occurs because information is processed
differently in the brain, and for this reason, reading becomes an uphill battle
and remains slow and laborious. When students struggle with these early
stages of reading, they are often made to feel inferior. Frustration and isolation
resulting from this lack of independence can be intense, leading to high dropout rates and associated social problems.
With intensive and early help, students with dyslexia are often able to learn
the basic skills of reading and develop strategies that allow them to stay in
the conventional classroom. Generally, however, they are not able to achieve
a level of reading fluency commensurate with their level of intelligence and
aptitude. While they may be able to get words off a page with great effort, they
are generally at a great disadvantage, relative to the majority of their peers,
especially when it comes to comprehension. However, the comprehension of
spoken words is not affected by dyslexia, and as a result, audio processing of
information is highly desirable for individuals with dyslexia.
Blindness and Visual Impairments
For people with vision loss, reading poses a real challenge. Anyone with non-correctable reduced vision is considered to be visually impaired. Cataracts,
glaucoma, macular degeneration, corneal opacity, diabetic retinopathy, and
trachoma are examples of pathologies that can cause vision acuity loss. Vision
impairment can range from severe low vision (uncorrectable beyond 20/200
to 20/400) to near total blindness (less than 20/1000). At the extreme end,
individuals who completely lack light perception are considered totally blind.
Visual impairment is unequally distributed across age groups. More than 82
percent of all people who are blind are over 50 years of age, although they
represent only 19 percent of the world’s population. There are an estimated
1.4 million blind children below age 15 in the world. In the next 30 years, an
ever-increasing number of people will develop some form of visual impairment
due to the aging of our population. Potentially blinding eye conditions, such as
age-related macular degeneration (AMD), diabetic retinopathy, and glaucoma
are increasing as the world’s population ages.
In recent years, as technology has become more prevalent, a number of assistive
tools have emerged for people with disabilities. There are text readers that help
students build reading comprehension skills, a number of computer programs
are available to help students with literacy, and audio books have emerged as an
alternative format to help students read by listening. Audio books are typically
available for common textbooks, selected novels, and popular magazines
and newspapers. Additionally, for content that is in electronic format, such
as websites and e-mails, text-to-speech software can be used to convert it to
audio. However, there are documents that exist primarily in printed format,
such as store receipts, coupons, business memos, personal mail, and specialty
publications, all of which require an alternative means of access. Alternative
access means reliance on a third party, be it a scanning service to get access to
text, or a special educator to provide on-going support. The Intel Reader was
developed to address these needs, offering an evolution of the existing assistive
technologies to a more portable and intuitive usage model, as well as access to
text at a time and place of the user’s choosing.
End-User Personas
When we started developing the Intel Reader we talked with end users who
allowed us to gain insight into their needs, and who gave us the opportunity
to hear how they use current technology in their everyday lives. To obtain a
deeper understanding of end-user needs, we began with usage research. As
part of this phase, we interviewed targeted users, followed them through a day
in their lives, observed their daily challenges at home and at work, and
asked for their insights on the challenges of daily living. Over the course of this
project, we have interviewed and tested the product at various stages with over
200 individuals.
The process began by interviewing 20 end users from various target groups and
developing a broad set of model personas to guide our design choices along the
way. Initially, over 30 composite characters were generated, each representing
key elements of the user population. The characteristics shown in Figure 2
articulate the key attributes of our user population and their needs.
Key Characteristics
•• Vision (affects VUI/GUI, tasks/usage): blind, VI, LV, sighted
•• Age (affects dexterity, tasks, aesthetics): kid, teen, adult, senior, old
•• Settings (affects context, tasks/usage): home, street, work, school
•• Tech tolerance (affects controls/UI, complexity): low, medium, high
•• Coping (affects tasks/usage, settings, adoption): struggling, coping, conquering
•• Attitude (affects tasks/usage, adoption, aesthetics): ashamed, ambivalent, proud
•• Channel (affects who we target; how they encounter this): medicare, insurance, grants, OOP
Figure 2: End-user Key Characteristics
Source: Intel Corporation, 2009
Five potential target personas emerged from our research and findings, and two
personas, described next, defined the boundary conditions for a successful
product.
Persona 1
James is 67 years old and lives with his wife in St. Louis, MO, in the United
States. He is a diabetic with health complications and deteriorating vision.
His hands are a bit unsteady and he is losing sensation in them. His vision is
getting less and less clear and he can no longer read unless he uses very strong
magnification; even then some common fonts are too small for him to read.
James struggles daily with controlling his diabetes. He tends to go through
cycles of being in control followed by periods where he doesn’t take care of
himself. The guilt he feels about his health is contributing to depression.
James wants a device to read the newspaper so he can stay well-informed on
current events. He likes to go out to dinner with his family, and he would like
to pay for the meal without asking his children to read the total on the receipt,
thereby regaining his head of the house role. In addition, James would like to
use coupons, distributed in the mail or newspaper, when he goes shopping. If
he had a reading device, he would use it also to read a variety of print material,
such as mail, so he could sort junk mail from bills, read greeting cards from his
grandchildren, read street signs, and read prescription instructions from the
pharmacy.
Persona 2
Ethan is 14 and lives in Swindon in the UK. He is a student, living with his
grandmother, and he was recently identified as dyslexic. He was doing pretty
well in school until the reading level increased when he turned 13. Now that
he has started high school, he is frustrated that he is in special education classes
for a portion of the day, separated from his friends.
Ethan values discretion and does not want everyone to know that he is
dyslexic. Ethan wants a device that looks modern and similar to other gaming
devices that his friends have so that he would not draw attention to himself
when he uses the device at school. He has a lot of reading to do for school, so
he needs a tool to help him read books — including text books, popular books,
and magazines. In addition, he would like a device that would also allow him
to listen to music.
For both James and Ethan, having a portable device is important so that
they have the freedom to take it with them wherever they go and use it in a
variety of environments, such as in a restaurant or on the school bus. Having
a device that is light-weight, has a long battery life, and large storage capacity
are all basic features that James and Ethan care about. In James’s case, he
wants something sturdy and solid, and not so small that it becomes hard
to operate. Ethan, on the other hand, wants something with a cool factor that
looks high-tech and is small enough to fit into his backpack.
Our ethnographic research therefore highlighted for us the three key elements
for a reading device: accuracy, convenience, and discretion. The device needs to
accurately capture information; it needs to be easy to use anywhere, anytime;
and it must allow users to maintain privacy in their everyday tasks. We,
therefore, had our mission: design and develop an assistive technology product
to meet the needs of both James and Ethan.
Development Stages of our Product
As with the development of any product, the Intel Reader went through a
number of design phases, with feedback loops influencing and often forcing a
rewrite of earlier specifications. Outlined here are the stages we went through
in the development of the Reader by Intel and our collaborators:
Fundamental R&D. Intel has been researching imaging technology for
decades, based largely on the need to develop visual inspection tools for silicon
microprocessor manufacturing. Image improvement, rotation alignment, the
integration of multiple images into higher resolution images, and fundamental
research in optics have all been part of Intel’s broad research efforts. Much of
this work aided in the design, testing, and debugging of the Intel Reader.
Exploration. Intel engineers began experimenting with image and optical
technology, by using lower-resolution cameras and existing OCR solutions, to
test the feasibility of capturing images of text taken with hand-held devices and
converting them into speech. Initially the team explored the use of cell phones,
and then they moved on to off-the-shelf digital cameras. The team mocked
up early prototypes and demonstrated the concept for approval. The inclusion
of targeted users as members of the design team was critical, thereby allowing
deeper insight into usage models.
Usage Research. The team expanded to include behavioral scientists who worked
with end users in their homes and workplaces to learn about their daily lives.
In this phase, as the diversity of the population was better understood, the core
usages were updated and refined.
Human Factors. Following this phase, human factors engineers pulled together
a number of potential designs that were modeled on cameras, books, gaming
devices, and other hand-held units. Flows of software user interfaces, as well as
audio cues and button placements, were proposed. These were then subjected
to user testing, whereby additional end users were allowed to hold, touch,
and provide input to plastic models and mock-ups. It was at this phase that
the insight regarding a downward-pointing camera, intended for use with
documents placed on a table (see Figure 4) rather than wall posters, came to
light, fundamentally changing the team’s understanding of how a reading
device needs to be designed. Similarly, the decision as to where to place the key
pad and other buttons for easy reach by adults and teen hands came out of this
phase of development.
Industrial and mechanical design. Following the human factors work, industrial
and mechanical designers defined the shape, color, and materials, working
through the stack-up of major subsystems and the placement of key features.
Electrical and thermal engineering. With emphasis on achieving a good end-user
experience of holding and utilizing the device, the system designers crafted
board layout and airflows to fit with the mechanical and industrial design.
Software architecture. Research into third-party applications, as well as internal
experience drove the team to use Moblin (see moblin.org) as the Linux platform
and begin working on an architecture that could include both new and existing
modules.
Alpha testing. Following six months of development, the team tested the first
integrated system, molded in plastic, using an older-generation Intel processor (as
the Intel Atom platform was still in development at that time). These systems
were tested with another set of users, refining each of the key design elements
while changes were more feasible.
Manufacturing. The team engaged with the manufacturing group to develop
the high-volume capacity and necessary test infrastructure.
Future development. After launch, we will continue to improve the device
further and provide updates to the product.
Technical Design Challenges
Three fundamental steps must be followed in order for the user to hear the text
that is printed on a piece of paper: the image must be captured, the image must
be converted to text, and finally the text must be rendered as audible speech.
In some cases, one or two of the above steps can be skipped. For example, if
the user already has the text in electronic form, then the capture stage can be
skipped. This is common for e-book files. Even the text-to-speech step can be
skipped if the text has already been recorded by a human. These are commonly
known as books on tape or audio books, with MP3 being the most popular audio
format.
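These three steps, and the cases in which some can be skipped, can be sketched as follows; the stage functions here are simplified stand-ins (assumptions), not the Intel Reader's actual code.

```python
# Minimal sketch of the print-to-speech pipeline; the stage functions are
# stand-in stubs, not the Intel Reader's actual implementation.

def capture(page):
    """Stage 1: photograph the printed page (stubbed as a wrapper)."""
    return {"image_of": page}

def ocr(image):
    """Stage 2: optical character recognition (stubbed as unwrapping)."""
    return image["image_of"]

def text_to_speech(text):
    """Stage 3: synthesize audio from text (stubbed as a tagged value)."""
    return ("audio", text)

def render_audio(kind, data):
    """Run only the stages a given source actually needs."""
    if kind == "audio":               # audio book / MP3: already speech
        return ("audio", data)
    if kind == "print":               # printed page: all three stages
        data = ocr(capture(data))
    # kind == "text": electronic text (e.g. an e-book) skips capture and OCR
    return text_to_speech(data)
```

The structure makes the skip logic explicit: electronic text enters the pipeline at stage three, and pre-recorded audio bypasses it entirely.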
Camera Placement
The first design and usability challenge centered on camera placement. Initial
prototypes placed the camera on the back of the device (as shown in Figure 3)
similar to the design of most general-purpose cameras.
Figure 3: Camera Placement on the Back of the Device
Source: Intel Corporation, 2009
When placed in the hands of end users, it was interesting to observe their
intuitive response: they stood up in order to capture pages in a book placed on
a desk. Standing was the only way for them to see what was displayed on the
device screen, as the camera was pointed at the text lying flat on the table.
When asked how they would use the device when sitting down, we observed
a very awkward twist in their wrist, in order to position the camera above the
text, and a strain in their neck to peer above the device to view the display.
We knew we needed to develop a different solution to make the product
ergonomically friendly for all end users of various age groups. Interaction
designers and engineers worked together to implement a solution to the
problem: design a vertically-mounted camera placed at the bottom of the
device. With the camera in this location, users can wrap both hands around the
device and view the display that is positioned at user eye-level when seated —
with the camera aimed directly above the printed text (see Figure 4).
Figure 4: Camera Placement at the Bottom of the Device
Source: Intel Corporation, 2009
Unanticipated benefits followed from this design choice. For most users,
placing their elbows on a table assured them that the unit was level to the
surface, and such a posture provided stability while holding the unit,
important for accurate imaging. For seniors, this was especially important,
enabling them to rest while holding the device. For blind users, the physical
location of the document between their elbows and directly in front of them
while seated allowed for more accurate framing of the image.
Form-Factor Considerations
In the midst of the human factors and usage research, we evaluated a number
of end-user requirements. One significant challenge concerned the size of the
device. Teens wanted the thinnest, lightest device we could design, while low-vision seniors needed buttons to be placed far enough apart to allow easy use.
Blind users had limited use for the screen on the device, though it would turn
out that most blind users wanted to have a visual display so they could share
information with sighted people. Low-vision users wanted to have as large
a screen as possible. Each of these needs pushed industrial design in various
different directions.
Placement of Buttons
There are many considerations in the placement of buttons, among which are
these:
•• Placement of the buttons influences the user’s experience and learning
curve. The buttons must be close enough together such that a user does
not have to stretch uncomfortably; yet, they must be large enough and far
enough apart to accommodate users with large fingers and limited dexterity,
including users with limited range of motion due to arthritis and other
rheumatoid conditions.
•• The main function of the device, capturing images, pointed to making
the image-capturing button the most prominent button on the device.
The large button, at the cut corner on the right, also oriented the device
for a blind person, pointing the camera downward. For the teen or the
senior, pushing the button while aiming it with one or two hands was an
important consideration. This placement also was found to reduce motion
blur induced by the user’s finger movement during the image capture.
•• The team tested two-handed and one-handed usage models, determining
that having one hand free to hold content or keep a book page flat
would be instrumental to success. Following from this, the main keypad
on the device (see Figure 5) was designed to be within a thumb’s reach
while holding the device with the right hand, as almost 90 percent of the
population is right-handed.
•• Human factors research also pointed to the importance of being able to
move up or down in menu structures easily. A four-way directional pad
with variable function depending on context met this need. Consumer
electronic devices typically include a fifth navigation button for Select or
OK, so we located this button at the center of the four-way directional pad.
•• For advanced usages, these five navigation buttons have multiple functions
depending on context. While playing a document, the Enter button
functions as a play/pause button. This functionality, not originally planned,
was added after user testing showed that many users expected this behavior.
Similarly, the right and left buttons have multiple functions depending
on context and usage. Holding the right button down while paused in a
document allows the user to jump forward one page. While playing back
text, the right and left buttons accelerate playback progressively or jump
back to recently read text.
•• For the Intel Reader, a sixth button, called the back button was added to
allow users to jump from lower-level menus to higher-level menus. In
subsequent software versions, buttons would take on additional functions.
For example, the back button, when held down for an extended duration,
would return the user all the way to the Home menu, the top level of the
user interface, rather than just one level up.
•• Grouping of the other buttons on the device by function improved
discoverability.
•• Low-vision-related buttons, of no use to a blind person, and moderately
useful to a dyslexic user, were placed together on the lower left. These
buttons allowed for font magnification and for toggling between image view
and text view.
•• Voice speed changes were important to power users, those who are very
comfortable with using assistive technology: these users listen to content at
speeds of over 400 words per minute (four times common speech) and slow
down for complex content. Similarly, all users wanted to have Favorites and
Help easily accessible, which made dedicated buttons critical.
•• A tactile adjuster for volume allowed one-handed use, with a rocker-type
button on the right side.
•• Design considerations also included responsiveness of buttons, durability,
and tactile feedback. For example, the button on the lower left for
decreasing font size on the screen is indented, while the one for increasing
font size is raised, signaling zoom out and zoom in by touch. Similarly,
each of the buttons is shaped uniquely by function, allowing users to
distinguish one button from another.
•• In some cases, buttons were eliminated. For example, initial designs
included a voice recorder for audio notes, which did not survive into the
final design.
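The context-dependent button behavior described above can be sketched as a state-to-action table; the state and action names here are illustrative assumptions, not the Reader's actual firmware.

```python
# Sketch of context-dependent button handling: one physical button maps to
# different actions depending on the current UI state. State and action
# names are illustrative, not the Intel Reader's actual firmware.

BUTTON_MAP = {
    "menu": {
        "ok": "select_item",
        "up": "previous_item",
        "down": "next_item",
    },
    "playing": {
        "ok": "pause",            # Enter doubles as play/pause in playback
        "right": "speed_up",      # accelerates progressively while playing
        "left": "jump_back",      # returns to recently read text
    },
    "paused": {
        "ok": "play",
        "right": "next_page",     # jump one page while paused
    },
}

def dispatch(state, button, held_long=False):
    """Resolve a button press to an action for the current UI state."""
    if button == "back":
        # Long-press Back goes all the way to Home; a tap goes up one level.
        return "go_home" if held_long else "up_one_level"
    return BUTTON_MAP.get(state, {}).get(button, "ignore")
```

A table-driven dispatcher of this kind keeps the multi-function behavior easy to audit and to extend in subsequent software versions.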
The resulting placement of the buttons is shown in Figure 5.
[Figure labels: Power, Hold, Shoot, Play, Back, OK, Navigation Buttons, Volume, View Toggle, Zoom In, Zoom Out, Favorites, Voice Speed, Options, Guide]
Figure 5: Placement of Intel Reader Buttons
Source: Intel Corporation, 2009
Image Capture
The first major step is to capture the image of the target, which in this case
is printed text. As with any camera, the target must be in focus, properly
illuminated, and the capture time must be fast enough to prevent motion
blur. The Intel Reader’s camera subsystem was designed to be fully automatic,
including auto-focus, auto-exposure, and auto-flash mechanisms. The major
considerations in image capture are discussed next.
Illumination
The image sensor has minimum illumination requirements. In general, a sensor
that has higher sensitivity will be more expensive, and a sensor that can operate
at very low-light conditions will not have sufficient dynamic range to function
at very high illumination. For the usages we considered, ambient illumination
will vary across a tremendous range, from as low as 100 lux in a typical
residential setting, to over 10,000 lux outside on a sunny day.
The standard industry solution has been to use an integrated Xenon* flash
strobe, with an exposure time of a few tens of microseconds. The Xenon strobe
is effective, because it can produce a large amount of illumination over a short
time, with a reasonably uniform light distribution. For the Intel Reader’s usage
models, the target is generally 10 cm to 1 meter from the sensor, so the flash
strobe can be significantly lower in power than what might be found on a
standard camera that might be two to three meters from the target.
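The power claim follows from the inverse-square fall-off of illumination with distance. A quick check, treating the strobe as a point source and ignoring reflector and lens effects (the specific distances below are illustrative, taken from the ranges quoted above):

```python
# Illustrative inverse-square check of the flash-power claim: the energy
# needed to light a target grows with the square of its distance.

def relative_flash_power(d_near_m, d_far_m):
    """How many times more flash power the far target needs than the near one."""
    return (d_far_m / d_near_m) ** 2

# A page at 0.5 m versus a typical photo subject at 2.5 m:
power_ratio = relative_flash_power(0.5, 2.5)   # 25x less power needed up close
```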
In selecting the strobe, we also had to consider the useful life of the product.
A consistently used Intel Reader could easily work through fifty pages a day of
text, requiring 17,000+ strobe flashes a year. Given a useful life of three years
for the device, 50,000+ would be a likely usage scenario. This drove designers
to consider long-life strobes to ensure product durability.
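The strobe-life arithmetic above can be reproduced directly:

```python
# Strobe-life figures from the text: fifty pages a day implies 17,000+
# flashes a year and 50,000+ over a three-year useful life.

pages_per_day = 50
flashes_per_year = pages_per_day * 365       # 18,250 flashes a year
flashes_lifetime = flashes_per_year * 3      # 54,750 over three years
```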
Focus
The lens in front of the sensor determines the field of view. This field of view
can be considered as a pyramidal cone, as shown in Figure 6. Often this is
thought of as a stack of planes, where the area of the planes increases with the
distance from the sensor. However, not all possible positions within the field of
view are in sharpest focus. For any given distance between the lens and the
sensor, there will be a plane of ideal focus, as shown in Figure 6.
Since the distance between the Intel Reader and the target is adjusted by the
user, the lens must be moved relative to the sensor in order to have the target
in the sharpest focus. The auto-focus algorithm calculates the required lens
position. This algorithm must also account for slight hand movements by the
user, especially for our low-vision senior users.
Figure 6: Mockup of Field of View and Plane of
Focus
Source: Intel Corporation, 2009
While the region of sharpest focus is shaped as a plane, the corresponding
planes immediately in front of and behind that plane are still in reasonably
sharp focus. The distance between the sharpest plane and the plane of just-acceptable focus is known as the depth of field. For usage models associated
with the Intel Reader, the target material will often not be perfectly flat.
For example, with a thick book opened to a page near the beginning, the
distance from the sensor to the left side of the left-facing page may be several
centimeters greater than the right side of the right page. The lens must thus
have enough depth of field to account for that difference.
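A rough depth-of-field check using the standard thin-lens approximation illustrates the constraint; the lens parameters below are hypothetical, not the Intel Reader's actual optics.

```python
# Depth-of-field estimate via the standard thin-lens approximation.
# Lens parameters are hypothetical, not the Intel Reader's actual optics.

def depth_of_field(f, N, c, s):
    """Near and far limits of acceptable focus; all lengths in mm.

    f: focal length, N: f-number, c: circle-of-confusion diameter,
    s: focus distance from the lens to the subject.
    """
    near = s * f * f / (f * f + N * c * (s - f))
    far_denom = f * f - N * c * (s - f)
    far = s * f * f / far_denom if far_denom > 0 else float("inf")
    return near, far

# Focused at 30 cm with a 6 mm f/2.8 lens and c = 0.005 mm, the zone of
# acceptable focus spans roughly 27 cm to 34 cm -- enough to cover a book
# page whose far edge sits a few centimeters deeper than its near edge.
near_mm, far_mm = depth_of_field(f=6.0, N=2.8, c=0.005, s=300.0)
```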
Preventing Motion Blur
As the user holds the device, some natural motion will be imparted to the
camera. The complete image must be captured while the image is stationary
on the sensor; therefore, the exposure time must be set correctly relative to the
potential motion of the camera. For some users, especially elderly users with
tremors, the motion may be significant. If the exposure time is too long, the
image will be blurred, resulting in poor accuracy when the optical character
recognition is attempted.
To prevent this blurring, the exposure must be fast. Of course, as the exposure
time is reduced, the amount of light hitting the sensor will also be reduced.
This drives sensitivity requirements on the image sensor, as well as performance
requirements on the illumination source.
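The trade-off can be stated as a simple bound: blur stays acceptable if the exposure ends before hand motion shifts the image by about one pixel. The motion and resolution figures below are illustrative assumptions, not measured values.

```python
# A simple exposure-time bound for the motion-blur constraint: keep blur
# under about one pixel. The numbers here are illustrative assumptions.

def max_exposure_s(motion_mm_per_s, pixels_per_mm, max_blur_px=1.0):
    """Longest exposure (seconds) keeping blur below max_blur_px."""
    return max_blur_px / (motion_mm_per_s * pixels_per_mm)

# Hand tremor sweeping the page image at 5 mm/s, imaged at 8 pixels/mm:
exposure = max_exposure_s(5.0, 8.0)   # 0.025 s, i.e. 1/40 s or faster
```

Halving the allowed exposure for a user with stronger tremor doubles the light the sensor must gather, which is exactly the coupling to sensor sensitivity and illumination noted above.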
Even with the relatively low weight of the device, when capturing a large
number of images, such as an entire magazine, newspaper, or book, the user
may become fatigued. In addition, the book might not naturally stay flat on its
own. The user might need some significant dexterity to hold the Intel Reader
in one hand while holding the target open with the other hand, and at the
same time not obscure the text of the target with his or her hand.
To address the user requirement of avoiding fatigue, especially for our low-vision senior group, and to allow the bound printed material to be held open,
the team developed an accessory called the Intel® Portable Capture Station, as
shown in Figure 7. The Intel Portable Capture Station has a tray that is large
enough to keep magazines or books open, with two pages exposed in the view
angle of the Intel Reader’s camera. The Intel Reader is placed into a holder that
determines the exact height above the tray and positions the camera’s field of
view.
Design of the Intel Portable Capture Station included a number of
considerations. User focus groups identified durability as a key requirement,
since this device is likely to be used in a public school setting. Portability is
another key requirement, since the user may choose to bring it to work, school,
or home.
Figure 7: The Intel Portable Capture Station
Source: Intel Corporation, 2009
A key element of the design of the Intel Portable Capture Station and the Intel
Reader included the ability to transition from mobile to fixed usages. When
the Intel Reader is used in conjunction with the Intel Portable Capture Station,
bulk capturing of text becomes easy. However, for single worksheets and news
handouts, capturing the text by using the Intel Reader in hand-held mode is
quicker and more convenient.
Initial design included the possibility of an illumination system in the Intel
Portable Capture Station itself. The additional complexity and associated
cost of the system, including the electrical subsystems and the cost of the
illumination elements, made using the lighting system on the Intel Reader a
preferable choice.
Other user design requirements included an indented tray on the Intel Portable
Capture Station to give users a physical guide to the Intel Reader’s viewable
area when docked in the Intel Portable Capture Station. This allowed both
sighted and non-sighted users to orient the target material and trust that
all text falls within the camera’s field of view.
The Intel Portable Capture Station was designed to have a separate button to
initiate image capture by using the Intel Reader. Because the capture button
is placed on the bottom of the tray, the user does not have to reach up high
to activate the shoot button on the Intel Reader that is placed in the holder,
about two feet above the text. Users with reduced mobility in their arms find
this feature valuable. In addition, seniors often have difficulty raising their arms
above their shoulders, and the decision to place the capture button on the tray
base was based on end-user feedback.
Optical Character Recognition (OCR)
Once the image has been captured, the next major step is to convert the
raw image to text. This process is known as Optical Character Recognition.
OCR software packages are commercially available and one was selected for
incorporation into the Intel Reader.
The OCR algorithm must examine the pixels associated with the image
and derive the corresponding text. A key step involves finding the natural
boundaries: the spaces between columns, paragraphs, words, and characters.
Individual characters may only be a few pixels apart.
Key challenges to a highly accurate OCR process include having a sufficient
number of pixels per character, having clear focus on the characters, and
minimizing interference such as glare. The differences in individual
characters, such as the lower-case letter i and the lower-case letter l may be very
subtle; therefore, acquiring a very accurate image of each character is critically
important.
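One common way to locate such boundaries, though not necessarily the method used by the commercial OCR engine in the product, is a projection profile: sum the ink in each column and treat sufficiently wide runs of empty columns as gaps.

```python
# Projection-profile sketch of the boundary-finding step: sum the ink in
# each column and report wide runs of empty columns as gaps. A simplified
# illustration, not the commercial OCR engine used in the product.

def column_gaps(binary_rows, min_gap=2):
    """Return (start, end) ranges of ink-free columns at least min_gap wide.

    binary_rows: equal-length rows of 0 (blank) / 1 (ink) pixels.
    """
    width = len(binary_rows[0])
    profile = [sum(row[x] for row in binary_rows) for x in range(width)]
    gaps, start = [], None
    for x, ink in enumerate(profile + [1]):    # sentinel closes a final gap
        if ink == 0 and start is None:
            start = x                          # a gap begins
        elif ink != 0 and start is not None:
            if x - start >= min_gap:
                gaps.append((start, x))        # a wide-enough gap ended
            start = None
    return gaps

# Two tiny "characters" separated by a three-column gap:
strokes = [
    [1, 1, 0, 0, 0, 1, 1],
    [1, 0, 0, 0, 0, 0, 1],
]
```

The same profile idea, applied at different scales, separates columns, words, and individual characters.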
When capturing multiple pages, such as chapters of a book, the software is
designed to perform background OCR processing and allow users to listen
to the first few pages that have been processed, if they choose to. Typically,
by the time a user captures three or four pages of a multi-page document,
the first page is available to be read aloud. This was a design tradeoff between
optimizing for time to first sentence versus optimizing for capturing multi-page
documents; we chose the latter based on our usage models.
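This capture-ahead behavior is essentially a producer-consumer queue with a background OCR worker. A minimal sketch, in which the threading layout and the stubbed OCR call are assumptions rather than the product's implementation:

```python
# Capture-ahead sketch: captured images feed a queue, and a background
# worker OCRs them in order, so early pages become readable while later
# pages are still being shot. Threading layout and OCR stub are assumptions.

import queue
import threading

def start_background_ocr(ocr_fn, done_pages):
    """Start a daemon worker that OCRs queued images into done_pages."""
    pending = queue.Queue()

    def worker():
        while True:
            image = pending.get()
            done_pages.append(ocr_fn(image))   # this page is now readable
            pending.task_done()

    threading.Thread(target=worker, daemon=True).start()
    return pending

ready_pages = []
q = start_background_ocr(lambda img: f"text of {img}", ready_pages)
for page in ("page1", "page2", "page3"):
    q.put(page)        # the user keeps shooting pages without waiting
q.join()               # in the real flow, playback starts before this point
```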
Text to Speech
After capturing the image of a page of text, the final stage is reading it aloud to
the user, often called text to speech. This requirement also drives a wide variety
of design decisions. Naturally, as users start to listen to more and more content,
their ability to comprehend that content will also improve. For example, a new
user may start out being comfortable with a speed of 110 words per minute,
roughly the rate of common human speech, but will later find that speed far
too slow, even frustrating. However,
even users that are comfortable with 300 words per minute for simple or
familiar content may need to go much slower for complex or unfamiliar
content, such as a technical journal. Thus, the ability to scale to much higher
rates will be of great importance to the long-term appeal of any device that
reads aloud to users. The Intel Reader operates at rates as slow as 70 words per
minute and allows playback of up to 500 words per minute, with increments at
appropriate rates in between.
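A rate control honoring those limits might look like the following sketch; the 10-words-per-minute step is an assumption, since the article does not specify the Reader's actual increments.

```python
MIN_WPM, MAX_WPM = 70, 500   # playback limits stated for the Intel Reader
RATE_STEP = 10               # assumed increment, for illustration only

def adjust_rate(current_wpm, faster):
    """Step the speech rate up or down, clamped to the supported range."""
    new_wpm = current_wpm + RATE_STEP if faster else current_wpm - RATE_STEP
    return max(MIN_WPM, min(MAX_WPM, new_wpm))
```

A new user might step up from 110 words per minute as comprehension improves, then step back down for unfamiliar material such as a technical journal.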
Rendering Existing Electronic Content
In designing the Intel Reader, we focused on the ecosystem of other consumer
electronic devices within which the device would function, as well as the
environment of electronic files for reading materials, such as plain text files or
DAISY format (described later). Connecting to a personal computer, playing
MP3 audio content, or even generating MP3 to play on other devices were
critical to the usage model. To enable this transfer, the designs included the
ability to access files on the system and transfer them on or off, by using a USB
connection to a PC or by using USB flash drives.
Imported text files can be played in the same manner as text that has been
derived by using the OCR capabilities. However, files in MP3 and other digital
audio formats require an entirely different playback mechanism. Instead of
using the text-to-speech engine, a digital audio decode engine is required.
“Instead of using the text-to-speech
engine, a digital audio decode engine
is required.”
While MP3 is a popular format for audio files, it is not the default for
accessible text. A consortium known as the Digital Accessible Information
System (DAISY) has established standards for Digital Talking Books (see
www.daisy.org). The DAISY format includes both digital audio files with
human narration and corresponding text of that file. The Intel Reader was
designed to play both text only and audio versions of these files.
Organizing Files
Over time, the user may accumulate a very large number of files. A familiar
usage model is a filing cabinet, where related files are stored in the same drawer
or folder within a drawer. As a user captures files, the files must be identified.
To simplify this identification process, we set a default where the title of a
document is identified, based on text size and placement on the page. The first
25 characters of this title become the title of the book. In a case where there
is no clear title, the first 25 characters identified become the default title. The
user may rename the file and place it in one of several categories. To distinguish
between documents with similar content, the file created also records the date
and time.
“To simplify this identification process,
we set a default where the title of a
document is identified, based on text
size and placement on the page.”
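The defaulting rule can be sketched as follows; the (text, is_title) pairs stand in for the real layout-analysis output, whose details are not described in the article.

```python
TITLE_CHARS = 25

def default_title(lines, captured_at):
    """Derive the default book title as described: the line layout
    analysis flagged as the title if one exists, otherwise the first
    text identified; either way, keep the first 25 characters and
    record the capture date and time to distinguish documents with
    similar content."""
    flagged = [text for text, is_title in lines if is_title]
    source = flagged[0] if flagged else lines[0][0]
    return source[:TITLE_CHARS], captured_at

title, stamp = default_title(
    [("Assistive Technology for Reading", True),
     ("Once the image has been captured...", False)],
    "2009-09-04 08:28",
)
# title holds the first 25 characters of the detected title line
```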
Sorting and managing content can also be done through a computer-based
architecture, by attaching the device to a computer via a USB. Using a standard
Windows* or Mac* interface to do this allows users with preferred screen-reading software to navigate the device content through the tools they use on
a daily basis. Our team made a specific decision not to develop a separate user
interface for the computer, allowing users to navigate the Intel Reader through
the tools they already use.
Display Issues
For users with vision, even limited vision, an electronic display, such as an
LCD, can provide significant information. Some uses are obvious, such as
display of functional menus or a preview of the text to be captured. A less
obvious but very important use is to display the text as it is being read aloud by
the device. For users with limited vision, seeing the text along with hearing the
words yields a significantly improved reading experience. The same holds true
for users with specific learning disabilities: multiple forms of input increase
comprehension. Given that users have differing levels of sight, the font size for
the displayed text must be adjustable, even up to the point where a single word
covers the entire displayed area.
“For users with limited vision, seeing
the text along with hearing the words
yields a significantly improved reading
experience.”
“For users with specific learning
disabilities, stochastic reading, placing
one word at a time on the screen, can
be useful.”
A number of factors drive the decision on the size of the LCD panel to include.
First, the LCD panel must be large enough to be useful. Users with limited
vision will require very large fonts. For some users, reading is easier when many
lines of text are on the screen; low-vision users may have enough sight to shift
between words while reading, thereby gaining context as they go. For users
with specific learning disabilities, stochastic reading, placing one word at a time
on the screen, can be useful as this allows them to avoid scanning across a line
of text, a common challenge for readers with specific learning disabilities.
Overall, having a larger display incurs many burdens on the system: the larger
the display, the higher the cost, weight, and power consumption. One also
needs to consider the business implications. Settling on a size that conforms to
industry norms, for example, those commonly used for gaming systems, allows
the design to benefit from the economies of scale for these adjacent markets.
“The LCD panels must include
integrated backlighting, with
LED‑based illumination being the
most power efficient.”
The display will be viewed under a wide variety of lighting conditions, ranging
from dark rooms to bright sunlight. Given this requirement, the LCD panels
must include integrated backlighting, with LED-based illumination being the
most power efficient. In order to have sufficient intensity, the voltage must be
raised beyond the 12 volts provided by the main power supply, and pulse-width modulation must be applied to allow for variation in the intensity.
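Dimming by pulse-width modulation simply varies the fraction of each period the LEDs are driven; the sketch below illustrates the idea (the period and resolution are assumptions, not the Reader's actual backlight controller parameters).

```python
def pwm_times(duty_cycle, period_us=1000):
    """Split one PWM period (microseconds) into LED on/off times for
    the given duty cycle; perceived brightness tracks the duty cycle
    while the boosted LED supply voltage stays fixed."""
    duty_cycle = max(0.0, min(1.0, duty_cycle))  # clamp to [0, 1]
    on_us = round(period_us * duty_cycle)
    return on_us, period_us - on_us
```

For example, pwm_times(0.5) drives the LEDs for half of each period, yielding roughly half brightness.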
A wide viewable angle is also important: it allows users to hold the device in a
lap or observe it in the Intel Portable Capture Station while seated or standing
30+ degrees above or below the screen.
For our blind users, having a screen is not critical. Their typical usage is to turn
the screen off to save battery power and predominantly use menu voicing to
determine device status; however, a number of blind users indicated that they
would prefer to have a viewable screen to share content with sighted people
when necessary.
“Placing the Intel Reader in the center
of the document to be imaged and
raising it directly above the document
also proved useful for blind and
low‑vision users.”
Framing printed text is a challenge without audio cues to indicate whether
the text is within the field of view of the camera. When specific usage tips are
provided, blind users have more success framing the document. For example,
a tip could be advising users to hold the Intel Reader a distance from the
paper equal to the largest dimension of the page being imaged: that is, for an
8×11-inch piece of paper, the user is advised to hold the device 11 inches above the
document to ensure it is captured. Placing the Intel Reader in the center of the
document to be imaged and raising it directly above the document also proved
useful for blind and low-vision users in testing. When capturing documents
at home or in the office, having the Intel Reader docked in the Intel Portable
Capture Station removes the height variable, and the user is confident that
anything placed within the defined borders of the base will be captured by the
Intel Reader’s camera.
Storage Requirements
Storage requirements were defined based upon usage. Raw images from a
multi-megapixel color sensor may be over 10 megabytes each. Text is a very
compact format, with one character generally requiring only one byte of memory.
That same 10 megabytes can store over 5,000 pages of text, and if the text is
compressed, storage can typically be increased by 2 to 4 times more. For text
capture, therefore, the memory has to be large enough to handle capture of a
few dozen raw images.
“For text capture, therefore, the
memory has to be large enough to
handle capture of a few dozen raw
images.”
Some of the storage space must be allocated for audio files and text files.
Audio data can be tightly compressed due to gaps between words, the low base
frequency, and the relatively small dynamic range of the human voice. MP3-type compression can yield files as small as 500 kbytes per minute. By far, the
least amount of storage is required for text, as one minute of text at 160 words
per minute might require only 800 bytes.
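These figures follow directly from the stated rates; the page length and characters-per-word values below are our own assumptions, used only to reproduce the arithmetic.

```python
MB = 1024 * 1024

raw_image_bytes = 10 * MB       # one multi-megapixel raw capture
chars_per_page = 2000           # assumption: a typical page of text

pages_in_raw_image = raw_image_bytes // chars_per_page  # over 5,000 pages
audio_bytes_per_min = 500 * 1024                        # MP3-type, ~500 KB/min
text_bytes_per_min = 160 * 5    # 160 words/min * assumed 5 chars/word
```

With the 2 to 4 times text compression mentioned above, the same 10 megabytes would hold the text of tens of thousands of pages.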
Testing showed that power users were likely to move content on and off the
device to a computer’s external drive. This helped the team weigh cost when
specifying the size of the internal drive. Compression of images was also a
critical element, allowing the user to get good OCR accuracy from full-resolution
images and then store the content, once read, in a low-resolution
JPEG format.
Creating a Complementary Environment
To assist in moving content into and out of the Intel Reader, a USB interface
allows direct connection to any modern computer. The Intel Reader appears
to the computer as any storage-class device would, much as digital cameras
or external disk drives do. This allows direct re-use of existing tools
for movement of data files. Users can thus download content from remote
servers to their computer, and then move it to the Intel Reader. Conversely,
text captured by using the Intel Reader can be archived on the computer or
sent to other systems.
In some cases, the user may not have access to a computer. In examining
alternative data transfer methods, the USB flash drive appeared to be a great
choice. No cable is required, as the flash drive is plugged directly into the
corresponding USB host port on the Intel Reader. In addition to data transfers,
the USB host port also allows some useful USB peripherals to be connected,
such as a keyboard. This proved important in user testing for people who use
alternative input devices, such as pointer or breath-activated keyboards. The
keyboards generally function according to standard USB interfacing, allowing
people with limited mobility and special-needs computing devices to use the
Intel Reader. As an extension of this functionality, the Intel Portable Capture
Station also allows for USB connectivity, enabling the user to drive the device
while docked in the station.
“A USB interface allows direct
connection to any modern computer.”
“In addition to data transfers, the
USB host port also allows some useful
USB peripherals to be connected.”
Selecting the Computing Environment
“The Intel Reader needs a highly
flexible computing environment.”
From our architectural exploration, it became clear that the Intel Reader
must be based on a highly flexible platform. Several different vendors have
developed OCR and text-to-speech algorithms for different languages, and
additional languages are being added each year. In order to take advantage of
this continuous improvement in the OCR and text-to-speech environments,
the Intel Reader needs a highly flexible computing environment.
In addition to flexibility, the processor must provide sufficient performance to
complete the OCR algorithm within a reasonable delay. However, the size and
power consumption must be low enough to fit with the required hand-held
form-factor.
The Intel Atom processor and its associated chipsets, running a general-purpose
Linux operating system, provide this combination of flexibility, performance,
power consumption, and size. The Intel Atom processor is an x86-compatible
processor with operating speeds above 1 GHz.
Conclusion
When developing technology to help people with disabilities become more
independent, developers face a dilemma: if they tailor the use too tightly,
the number of potential users is reduced to the point that the device is not
economically sustainable. On the other hand, if they keep the feature set too
broad, it fails to address the needs of any one community. In short, narrow
designs fail for lack of a market, and broad designs fail for lack of use. The Intel
Reader is an attempt to navigate this design challenge. The core usage model,
reading text from a page with a camera and converting it to speech, addresses
a need shared by blind, low-vision, and dyslexic users. In this way, we are
attempting to reframe the market from a subpopulation of people with one of
these disabilities to a larger population, renamed as those with Print Disability.
“We are attempting to reframe the
market from a subpopulation of people
with one of these disabilities to a
larger population, renamed as those
with Print Disability.”
We had to make choices, focusing on users with a basic set of technology
literacy skills; for example, someone who might use a cell phone or
communicate via e-mail. We also balanced the competing needs of seniors and
teens and of the sighted and those with limited vision.
The Intel Reader can also be seen as a step on the road to the future of reading
and writing. By 2020, we should expect to see a range of devices that will move
content seamlessly between print, digital, and audio. Moreover, we should
expect the cost of such technology to plummet. The key inflection point
in the cost curve will be when this technology moves from a necessity for a
disability population to a convenience for the general population. Crossing this
line will drive economies of scale similar to other common communication
and entertainment technologies. Where DVD players once cost over $1,000,
they now cost under $50. Cell phones, once the toy of the very rich, are now
included for free with a service contract. We expect that in the future, the
evolution of the Intel Reader technology will follow a similar path. Indeed, we
share a vision of this technology becoming a very low cost device, one that will
serve the disabled population and the general population. It is in this moment
of cross-over that we will see the true goal of the device come to fruition —
the establishment of an equal playing field for all people who want access to
knowledge and increased independence.
“By 2020, we should expect to see a
range of devices that will move content
seamlessly between print, digital, and
audio.”
Author Biographies
Selena Chan is a Principal Engineer and has been at Intel since 2001. She
currently manages end-user trials and market trials for the Intel Reader. Prior
work involved delivering Intel technology platforms to the medical research
community and leading external technical collaborations. She holds Ph.D.
and M.S. degrees in Electrical and Computer Engineering and a B.S. degree in
Chemical Engineering. Her e-mail is selena.chan at intel.com.
Ben Foss is Product Development Manager for the Intel Reader in the Digital
Health Group, leading a team of industrial, mechanical, electrical, and software
engineers in the development of new products to help people with disabilities
be more independent. He has been at Intel since 2003. Previously, he worked
in the National Economic Council in the Clinton White House and was a
Marshall Scholar. He holds a JD/MBA degree from Stanford. Ben is dyslexic:
he volunteers his time as a board member of the civil rights organization,
Disability Rights Advocates, and Headstrong, a non-profit group focused on
issues regarding learning disabilities. His e-mail is ben.foss at intel.com.
David Poisner is a Senior Principal Engineer and has been at Intel since 1986.
He is currently the chief hardware architect in Intel’s Digital Health Group.
Prior work involved architecting nine generations of Intel chipsets and various
platform technology initiatives in the realm of power management, security,
audio, USB, storage interfaces, and LANs. He holds a BSEE degree from the
University of Kansas. His e-mail is david.i.poisner at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.
Developing Consumer and IT Platforms in a Regulated
Medical Device Environment
Contributors
Barbara Fullmer
Intel Corporation
Alan Boucher
Intel Corporation
Index Words
Consumer Device
Medical Device
Regulatory
Design Controls
Product Development
Validation
Abstract
The Intel Digital Health Group, formed in 2005, is climbing a steep learning
curve as we develop our capabilities as a medical device manufacturer. Applying
the development practices used in the semiconductor industry to complete
regulated medical devices revealed a number of important process gaps that
needed to be closed.
Some of these differences result from the very different customers served by
the Digital Health Group; others from the different usage models and potential
risks in healthcare; still others from the conventions, practices, and
requirements of standards organizations and regulatory bodies.
This led to a series of complex inside-out structural modifications to the Digital
Health Group’s Product Development Lifecycle, which we discuss in detail
in this article. We also examine the effects of these changes on our group and
the challenges we faced during the development of Intel’s first FDA Class II
regulated medical device.
Introduction
The Intel Digital Health Group was formed in 2005 with the goal of applying
Intel’s technical and business strengths to meet important needs in the
healthcare industry. This was largely a new domain for Intel. We had provided
semiconductor components to healthcare customers for many years and
we had previously experimented with offering information security services
for healthcare. We also had active research programs in home health and
biomedical/life sciences. But we had never before designed and delivered fully
integrated products, directly to users, which were subject to medical regulatory
processes.
“The highest quality of life and
the lowest cost of care occur when
individuals are living independently
in their own home.”
The Digital Health Group is engaged in building products which enable
individuals to receive the care and assistance they need at home. This focus is
grounded in a philosophy we refer to as “Shift Left.” The highest quality of life
and the lowest cost of care occur when individuals are living independently
in their own home, as illustrated in Figure 1. We have focused on developing
products and services which allow individuals to receive care and support in
their own home that might otherwise be delivered in a clinical setting or an
assisted living facility.
188 | Developing Consumer and IT Platforms in a Regulated Medical Device Environment
[Figure 1 plots quality of life (100% down to 0%) against cost of care per day
($1 to $10,000) across the continuum of care: home care (healthy, independent
living; chronic disease management via the community clinic and doctor’s
office), residential care (assisted living; skilled nursing facility), and acute
care (specialty clinic; community hospital; ICU).]
Figure 1: Shift Left, Stay Left — the Continuum of Care
Source: Digital Health Group, Intel Corporation
A large proportion of healthcare spending is dedicated to caring for older
patients who have one or more chronic diseases, such as diabetes, congestive
heart failure, or chronic obstructive pulmonary diseases (such as emphysema).
The first regulated medical device the Digital Health Group developed was
a system designed to help such patients manage their chronic diseases. Our
system was to enable self-management by the patient with the assistance of
clinicians working remotely, often as part of a dedicated disease management
team.
This first product, the Intel Health Guide, needed to be designed for use
by elders who are not familiar or comfortable with computers or other
complicated technology, and for use by skilled nurses and physicians who
provide care for these patients. The system allows the clinical care team to plan
a series of health sessions for the patient, in which the patient checks their own
vital signs, answers survey questions about their current condition, and may be
presented with educational material (text, photos, audio, and recorded video).
The patient is reminded proactively to complete their health sessions. The
clinical care team can also use video conferencing to interact with the patient,
in order to coach them and assess their condition.
In order to simplify the use of the system and ensure accuracy, data needed to
be collected automatically from medical devices such as blood pressure cuffs,
glucose meters and pulse oximeters. Collecting data from these regulated
medical devices results in classification of the Intel Health Guide as a regulated
medical device in the US, the UK, and most other countries.
“The first regulated medical device the
Digital Health Group developed was
a system designed to help such patients
manage their chronic diseases.”
“Collecting data from these regulated
medical devices results in classification
of the Intel Health Guide as a
regulated medical device.”
The Intel Health Guide system would include a purpose-built PC with a
specially designed user interface (for use by the elder in their home), a server-based data collection system with a management application used by the
clinician care team, and a set of tools to manage the deployment and
maintenance of the system. Intel would act as an Application Service Provider,
operating the server infrastructure and back-end applications. Figure 2
illustrates the components of the complete solution.
All aspects of the system needed to be designed for the intended users, and the
entire system would need to meet the requirements for medical devices.
The needs and characteristics of several different user types would have to be
considered. Patients would often be elderly, unfamiliar with technology, and
living in homes with poor communications and electrical power systems. Care
managers (often nurses with varying computer skills) might be working in
central offices or from their own homes. Care manager administrators would
be assigning care managers to patients, and performing other administrative
duties. System Administrators would be responsible for technical maintenance
of databases, network connections and other infrastructure. Figure 3 illustrates
the relationships between system components and these user types.
[Figure 2 shows the layers of the system, each marked as Intel developed or
Intel validated: Solution Integration (database ETL, custom export); Content
and Applications (disease management and RPM, administration and remote
management, education content, care plans); Network (broadband over
Ethernet, TCP/IP (TLS), VPN, USB, Bluetooth™); OS, Drivers and System
Software (Windows XPe, device drivers, VC engine, VPN); IHG Hardware
Core (Intel MB, CPU, chipset, BIOS; LCD with touch panel; plastics and
HIDs, e.g., Gloworm, buttons); and Telehealth Peripherals (weight scale,
blood pressure cuff, pulse oximeter, glucose meter, peak flow meter).]
Figure 2: The Intel Health Guide System, Version 1.0
Source: Digital Health Group, Intel Corporation
“We found that we faced a steeper
learning curve than we had
anticipated.”
We knew we had much to learn, but we believed the strong capabilities of
Intel’s product divisions and manufacturing organization prepared us well for
the regulated medical device industry. The development team had extensive
experience in designing PCs, and in designing, building and operating
high-reliability distributed IT systems. We had experienced software developers
who had built complex software systems for business, engineering and
manufacturing. Our corporate culture placed a strong emphasis on discipline
and quality. The complexity of Intel’s semiconductor products, the tight process
controls required in our manufacturing processes, the reliability and correctness
required of our products, and the high volumes of products we ship would
surely give us the necessary capabilities for development of the relatively simple
medical devices we planned. We found that we faced a steeper learning curve
than we had anticipated.
The direct user of Intel products in our core semiconductor business is a design
engineer or an original equipment manufacturer (OEM) who develops and
builds computer systems from Intel’s silicon and subsystem building blocks.
As such, these architects, engineers and manufacturing personnel work within
the culture of the electronics industry. They have specific expectations of
technical design criteria, product specifications, product errata, supporting
documentation, etc.
[Figure 3 shows the patient home, where the Intel® Health Guide connects to
peripherals over Bluetooth and USB, linked via the Internet to the application
service provider data center (hosting the Lighthouse provisioning application
and the Intel® Health Care Management Suite) and to the care manager
office, with user interfaces for the care manager (Intel® Care Manager), the
care manager administrator, work orders, and the system administrator.]
Figure 3: The Intel Health Care Management Suite v1.0
Source: Digital Health Group, Intel Corporation
The typical Digital Health Group user, however, is not a design engineer or
manufacturing team. The users of a Digital Health Group product may be a
healthcare provider organization, a clinician, a patient, or a family member.
The other critical stakeholders in the Digital Health Group environment
are the regulatory bodies that oversee medical device development and
manufacturing. The product characteristics and the development and
manufacturing processes for medical devices must conform to the requirements
of these customers and stakeholders.
As this became clear, we learned that we needed a new set of core capabilities in
the Digital Health Group.
Digital Health Platform Design and Development
The Intel Health Guide system was initially conceived and developed under
Intel’s corporate Product Life Cycle (PLC), which has been proven over many
years of successful semiconductor design and development.
The corporate PLC extends across the business landscape at Intel – from early
market segment analysis to conceptualization of product families, through
selection, development, product delivery and finally, product discontinuance or
End-of-Life (EOL).
“The corporate PLC extends across the
business landscape at Intel.”
“Consistency and visibility improve
Intel’s product development and
roadmap stability.”
The corporate PLC is designed (as described in training material) to “drive
shared expectations and resources such as structured systems, specifications,
documentation and procedures as well as data systems and tools. This
consistency and visibility improve Intel’s product development and roadmap
stability.”
The corporate PLC provides a pathway to accomplish several major goals:
increasing development efficiencies, producing stable and synchronized
platform roadmaps, improving time-to-market, and enabling organizations to
make better decisions.
[Figure 4 is a funnel diagram of the four PLC phases (Exploration, Planning,
Development, Production), annotated with its approval and milestone gates,
including OIA/OSA, CA, PSA, DIA, PFA, FA, (P)POPL1-3, PD, PDC, PCA,
IPA, PIE, Alpha, Beta, A0 tape-out, A1/A2, PPS, design complete, hardware
deploy, PRA/PLA, SRA, and PRQ.]
Figure 4: The Intel Corporate Product Life Cycle
Source: Intel Corporation, 2009
“The PLC funnel diagram depicts
the 4 phases that comprise the Intel
corporate PLC.”
The PLC funnel diagram (see Figure 4) depicts the 4 phases that comprise
the Intel corporate PLC. Slanted lines between the phases, such as between
planning and development, represent an overlap of activities between cross-functional teams and in some cases, even organizations.
Intel’s corporate PLC has evolved over the years to support not just
semiconductor development, but all aspects of platform development
from silicon design, to subsystem building block development, to software
subsystems, and eventually, to complete client and enterprise platform design
and delivery.
“These documents enable all
functional areas in the organization
to perform the planning needed to
establish the program Plan of Record.”
Each phase carries with it a unique set of boundaries and expectations in the
life cycle. For instance, in the initial Exploration phase of the PLC, teams
focus on basic or applied research, which identifies and tests new product
concepts, architectures, features or design approaches. Uncertainty is gradually
reduced over the course of the exploration process, leading to a decision to
commit larger amounts of resources to plan a product or system.
During the Planning phase, the Product Development Team (PDT)
incorporates business analysis, user modeling and architectural requirements to
create a product definition. This takes the form of product requirements, design
documents, and quality planning. These documents enable all functional areas
in the organization to perform the planning needed to establish the program
Plan of Record (POR).
The Program POR defines the PDT’s commitments regarding product features
to be delivered at various releases, delivery dates for interim and final product
releases, product quality levels, product and program cost and pre-release
design wins.
Ultimately a decision is made within the organization to proceed to the
Development phase of the program. Here the product is engineered,
developed and evaluated against its initial requirements and other program
criteria. This phase usually involves several pre-production release gates and
formal product testing activities. Testing is conducted both internally by Intel
and in many cases, externally as our customers perform validation activities.
“This phase usually involves several
pre-production release gates and
formal product testing activities.”
Once product development is complete and a release candidate is ready for
the market, the design transfer activities begin under the PLC. Production line
qualifications, integrated system testing and yield analysis and improvement
all culminate in a series of phase gate handoffs known as Product Release
Qualification (PRQ) and Ship Release Authorization (SRA). These activities are
components of the final design transfer process, which moves the product or
platform from development into the Production phase.
The corporate PLC includes all of the steps necessary for developing complex,
high quality products. However, we learned quickly that the details of processes
that had evolved for semiconductor and computer building block development
were not always well-suited for developing complete medical devices, and were
not aligned well with regulatory requirements. Our most significant issues
occurred in the Planning and Development phases. It is here that Digital
Health encountered its greatest challenges.
Bringing the FDA into Intel required us to change our historical application
of the corporate PLC and rethink how we operated within the Digital Health
Group.
Regulated Product Development
The design, development, manufacturing and support of any product classified
as a medical device is closely regulated everywhere in the world. In the US,
the Food and Drug Administration (FDA) is charged with this responsibility.
The quality system regulations the FDA applies are documented in the Code of
Federal Regulations (CFR), 21 CFR, Part 820 (21 CFR 820).
Outside the US, most medical device regulations are based on ISO
13485:2007, which is closely aligned with 21 CFR 820. In both of these
documents, each medical device manufacturer is required to implement
a Quality Management System (QMS) suitable for their business, which
complies with the requirements. Historically, the FDA's interpretation has been
much stricter than ISO's, even for requirements worded in much the same way.
The FDA's Quality System Regulation (QSR) uses a total systems approach
designed to satisfy the specific safety and performance needs of each
manufacturer, product, and user market. It is intended to have a positive
impact on the quality, safety, and efficacy of the device. The QSR requires
medical device manufacturers to consider quality at the earliest stages of all
areas of product design and development. The FDA expects management and
employees of each medical device manufacturer to fully commit to the firm’s
quality management system (QMS). This commitment is vital, if the quality
system is to be effective.
There were three paradigm shifts that needed to occur when the Digital Health
Group began development of regulated medical products.
The first paradigm shift was that we needed to deeply understand the
regulations that were now applicable to us. A critical document, 21 CFR
Part 820–Title 21 Food and Drugs, Chapter 1, Food and Drug Administration,
CFR (Code of Federal Regulations), Department of Health and Human Services,
Subchapter H Medical Devices, Part 820, Quality System Regulation [2], “Good
Manufacturing Practice for Medical Devices” quickly became standard issue to
all Digital Health Group employees. This document has 15 subparts, each
with a different topic area, such as Quality System Requirements, Design
Controls, and Labeling and Packaging Controls. In addition to the regulations
themselves, there is a classification system for medical devices that, when
layered on top of the regulations, determines the level of scrutiny and the
process and objective evidence that is needed to manufacture a given product.
(Medical devices are classified as Class I, II, or III, based on the level of control
required to ensure the safety and effectiveness of the device). While there
are many requirements defined in the regulations, there is little prescriptive
direction (e.g., “Do it this way.”). This lack of specific direction caused
confusion and frustration, as employees felt the goal posts kept moving, while
the game was still in play.
The second paradigm shift was the realization that the regulations apply, like it
or not. Intel employees are known for being detail oriented and for challenging
all authority, real or imagined. While the specific application of how we
implement the regulations is subject to interpretation, the fact that we were
fundamentally subject to the regulations was not up for debate or discussion.
The third paradigm shift was the notion of objective evidence: what proof do
you have that you did what you said you were going to do? Intel employees are
accustomed to pointing to an e-mail, or hallway conversation, or a meeting,
as their reason for a given design or development decision. We had to learn
that anything that is not documented correctly and archived in our controlled
document repository simply didn’t happen, according to the FDA.
The Inside-Out Analysis
Faced with the paradigm shifts listed earlier, we followed the Intel standard
practice and dedicated some key individuals to own the establishment of
our QMS, with help from outside consultants. The result was an avalanche
of documentation, starting with a Quality Manual, which was supported
by a plethora of Standard Operating Procedures and Standard Operating
Instructions (SOPs and SOIs, respectively). We had to create this library
of governance and process definitions at the same time that design and
development work was being done, generally by different employees. The
exhaustive detail in the SOPs and SOIs gave us a false sense of being in control
of design and development. In practice, our maiden attempt at a QMS
resulted in an impossibly complex system that was not sustainable, nor was it
appropriate to support our business model. Our attempt at design controls was
viewed as overthought, overpriced, and overwhelming by our engineers, who
simply wanted to design a good product.
As we became aware of the burden we had inappropriately placed on ourselves,
we began to detangle the web, section by section, and to figure out what design
controls really meant for the Digital Health Group. We started with the basics,
reinforcing one of the key phrases in the CFR — “establish and implement.”
We needed to establish how we were going to do our work; we needed to
implement those processes that we defined, and we needed objective evidence
to document that we did what we said we were going to do. There was a lot of
education about the significance of writing down the plan for the work — the
how for design and development. We had to figure out the answers to lots of
questions: What did the plan need to say? Who was subject to the plan, and
why? What was covered in the plan, and what was excluded? Where were the
excluded areas addressed, if anywhere? How do we ensure that each of the
individual plans made sense, when combined with the rest of the project? We
needed more education on the importance of creating and approving the plan
before any work was started, so that the implementation of the approved plan
was clean and clear. The initial impression within the group was that the plan
creation was simply paperwork; in other words, someone else’s issue, while the
engineers did the real work, and that the plan could be done later. It took a
long time to dispel this notion. The third paradigm shift, objective evidence,
was even harder than the “establish and implement” phase of our process.
What was the right level of proof to confirm that the activities had occurred?
What level of approval was necessary for the proof? How did we ensure that
the evidence really was evidence, and not just opinion?
Overview of Intel’s Standard Product Life Cycle Versus the Code of
Federal Regulations
Initially, we set about our detangling in a piece-meal fashion, going to the areas
of the most pain first. While this approach staunched the bleeding, it still did
not address the fundamental problem: what did the Digital Health Group need
to do to satisfy the CFR, while designing and developing products inside of
Intel? To get to the bottom of the issue, we set about decomposing the standard
Intel PLC while at the same time decomposing the CFR. We had to see where
we needed to make changes, and to see how we could leverage what we already
knew and were doing, to satisfy the CFR. In short, we began bringing Intel to
the FDA.
One of the first distinctions we recognized was that we were not obligated
to satisfy strict design controls while still in the Exploration phase. This was
amazingly freeing for our incubation teams — they could spend their energies
thinking about new opportunities and getting detailed ecosystem and user
understandings, without having to check they were following the regulations
each step of the way.
Once we internalized the separation of exploration from design and
development, we realized that there were other clear separations within the
CFR which were not reflected in Intel’s standard PLC. We knew that we
confused our regulatory inspectors when we spoke, because we used words
that were familiar to the FDA and to ISO; but our words meant something
completely different. We used the word planning for what the FDA considered
design and we used the word design when the FDA meant development.
Another new concept for our group was formal “traceability.” This is the
process by which design inputs are matched to design outputs, and then
verified (ensuring outputs match inputs) and validated (ensuring the final
device satisfies the intended use) for completeness. Risks are identified and
categorized for potential severity during the input gathering stage; they are
traced to outputs, and the risk mitigations are verified. All of this linking —
inputs, outputs, risks, verification, and validation needs to be documented,
and exceptions need to be identified and dispositioned. Nothing is thrown
out without justification, and you have to have evidence of that justification.
Coming from an industry in which it is normal practice to delay features to
satisfy a schedule imperative, this concept of accountability for all recorded
inputs was a big shift for us. It meant we had to deliver wholly on the
requirements we defined for ourselves: a new meaning for us for the word
verification. Cutting back on product features was not permitted without
justification and evidence. This meant we had to validate that our product
satisfied its intended use by the customer, through user test results documented
with objective evidence.
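The linking discipline described above can be pictured as a small traceability record. The sketch below is purely illustrative — the identifiers, field names, and file path are invented for this example and are not Intel's actual tooling:

```python
from dataclasses import dataclass, field

@dataclass
class TraceItem:
    """One row of a hypothetical design traceability matrix."""
    item_id: str                # e.g., "REQ-001" or "SPEC-010" (invented ids)
    text: str
    links: list = field(default_factory=list)  # ids this item traces to
    evidence: str = ""          # pointer to archived objective evidence

def untraced(items):
    """Ids with neither a traced link nor documented evidence.
    Under design controls these must be dispositioned, not silently dropped."""
    return [i.item_id for i in items if not i.links and not i.evidence]

inputs = [
    TraceItem("REQ-001", "Device shall display vitals on the LCD",
              links=["SPEC-010"], evidence="DHF/verification-report-7"),
    TraceItem("REQ-002", "Device shall store 30 days of readings"),
]
print(untraced(inputs))  # -> ['REQ-002']: needs a linked output or a disposition
```

The point of the structure is that every input either traces to an output or carries a documented justification — nothing simply disappears.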
Finally, and perhaps most importantly, we had to establish and document the
objective evidence that we had accomplished all of these tasks prior to release
of product to customers. Intel has always done excellent engineering work, but
we never had to prove each element of the process to an auditor or other third
party, as is needed for medical device manufacture. The evidence had to be
reviewed, agreed to, and approved by senior staff, and anything that was not
satisfied had to be identified and then dispositioned. The disposition had to be
documented as well, and it had to be readily available to an inspector.
A Walk Through the Code of Federal Regulations
As a medical device manufacturer who is making devices in the US but
distributing them in Europe, we are subject to both ISO and the CFR.
The two sets of regulations have many similarities, but often differ in their
emphasis. For our purposes, we had to blend these two sets of regulations to
ensure we were meeting all the imperatives we needed to. In many cases, the
CFR takes the lead, so we began our analysis by walking through the section
of the regulation that applies to design and development: 21 CFR Part 820.30:
Design Controls for Medical Devices. There are 15 subsections that we needed
to internalize and then institutionalize. We needed to tell our employees
the why, as well as the how. Deciphering the CFR was much of the why,
along with some hard-learned lessons from our employees who had previous
experience designing medical devices outside of Intel. We also learned that
institutionalization of regulations was much more successful if we provided our
engineering audience with templates and comprehensive examples of how to
satisfy the imperative.
Each section of 21 CFR 820.30 was analyzed carefully.
§820.30 (a) General
This is the section that defines who is subject to meeting the design control
section of CFR, and who is not — the biggest why we had to satisfy. There
are three classes of devices, as mentioned previously, as well as some special
allocation of design controls in specific classes. The most significant learning
for our group was that because we automate our device with software, we will
always be subject to design controls. Once we were able to help people realize
this, teams more or less quieted down on their protests against design controls.
§820.30 (b) Design and Development Planning
“Each manufacturer shall establish and maintain plans that describe or
reference the design and development activities and define responsibility for
implementation.” This is where it began to hit home that we had to define
our plans in advance, we had to assign ownership for the elements of the plan,
and we had to maintain the plan so that it continued to drive design and
development activities. We internalized this into our Design and Development
Quality Plan, a template that ensured the key elements of the regulations
were consistently maintained across multiple programs. The development
activities matrix is the real meat of the artifact, a table that lists each design
and development activity, its owner, what the owner is responsible for, a list of
applicable SOPs or SOIs for the activity, change acceptance criteria, and with
whom the owner has to interact in order to ensure successful delivery of the
element. Once we established the Design and Development plan as part of the
process, there was the continuing education for the engineers: this was now
what they were adhering to, not an e-mail or PDT conversation.
§820.30 (c) Design Input
“Each manufacturer shall establish and maintain procedures to ensure that
the design requirements relating to a device are appropriate and address the
intended use of the device, including the needs of the user and patient. The
procedures shall include a mechanism for addressing incomplete, ambiguous,
or conflicting requirements.”
This is where the real essence of “What is a requirement?” started to hit home.
There are keywords in the CFR that mandate that we realign our traditional
Intel perspectives. First and foremost, CFR’s focus is on “needs of user and
patient.” We therefore had to ask ourselves who our user was. We had several,
ranging from the patient whose care is monitored by the device, to the
installer who places the device in the home, to the IT administrator, who is
responsible for the backend support of the device. These are very different
from the standard user for Intel’s core business, the design engineer or systems
OEM. Once we had the users defined, we could start to get specific on the
requirements, what we came to speak of as the what. We had a version of
a Product Requirements Document, and we had a Market Requirements
Document, standard Intel PLC artifacts. Our issue quickly became that we
had too many requirements, in too many different places, and with too many
interpretations. We compounded this problem by the way we used the word
requirement: an all-encompassing term for anything we wanted the device to
do, versus the FDA’s much more narrow definition. In order to help our teams
grasp the FDA’s meaning of requirement we had to first help them identify
where requirements came from.
[Figure 5 is a diagram showing how MDRs, complaints, user studies, market data, and clinical data feed into defined user needs, which in turn drive the product specification.]
Figure 5: FDA Requirements for a Medical Device
Source: Digital Health Group, Intel Corporation
FDA guidance is that requirements come from a variety of sources, such as
user needs’ studies, marketplace needs, customer complaints, and clinical
data. There may be more than one user for a product — for our first product,
we needed to consider not only the patient in the home environment, but
also the clinician and anyone else who came in contact with the product.
Requirements can be derived from initial hazard analysis; they are functional
expectations. Once the PDT understood where requirements came from, we
had to simplify our requirements and put them into a single document, to
address “incomplete, ambiguous, or conflicting requirements.” We needed to
use strong, declarative statements, and we needed to define when a statement
was a design intent (nice to have) or a requirement (device does not ship
until we satisfy it). “The device SHOULD do something,” versus “The device
SHALL do something.” When the word should is used, it can be interpreted in
many ways and that can strangle a design team and paralyze Verification and
Validation (V&V) later on. If a device should be capable of running 24 hours
a day, but it is actually only capable of running 23 hours a day, does the device
fail the requirement? Sorting through all the words of the combined MRD-PRD took a very long time, and so we engaged outside consultants to assist
us with this task. We needed people with clear heads, who were familiar with
the endgame of medical device manufacture, but who were not invested in
the device, to ask the tough questions. We needed to frame our requirements
with the idea that they should be answerable with a yes or a no. An example
requirement: “The Care Management Suite functions shall be presented in
English.” This can be answered with a yes or a no, versus other items that felt
like requirements (but really were design intentions) such as “Menu structures
on the PHS will be simplified.” We started out by just cleaning up our new
PRD, tagging requirements, and making them testable. This experience was
painful and time consuming, but was essential to our success. When we had
completed the work on the Intel Health Guide, we moved onto creating a
template for future programs, so that others could start off on a good footing.
Our template had clear examples, based on what we had learned, so that the
future authors could understand the difference between a good requirement
and a poor one. We defined some mandates, to simplify life further on
in development, and during the creation of the PRD: requirements to be
numbered sequentially, no deleted requirements without justification, no reuse
of identification numbers, proper use of the word shall, no use of the word
should, and most importantly, user requirements to live ONLY in our PRD.
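Several of these mandates are mechanical enough to check automatically. The sketch below is a hypothetical lint pass over requirement records — the id scheme and the checks are assumptions for illustration, not the group's actual process:

```python
import re

def lint_requirements(reqs):
    """Check (rid, text) pairs against a few of the mandates described
    above: sequential ids, no use of 'should', and presence of 'shall'
    as a rough proxy for a testable statement. Illustrative only."""
    problems = []
    for n, (rid, text) in enumerate(reqs, start=1):
        if rid != f"PRD-{n:03d}":
            problems.append(f"{rid}: ids must be sequential; expected PRD-{n:03d}")
        if re.search(r"\bshould\b", text, re.IGNORECASE):
            problems.append(f"{rid}: 'should' is ambiguous; use 'shall' or demote to design intent")
        if not re.search(r"\bshall\b", text, re.IGNORECASE):
            problems.append(f"{rid}: no 'shall'; statement is not a testable requirement")
    return problems

reqs = [
    ("PRD-001", "The Care Management Suite functions shall be presented in English."),
    ("PRD-002", "Menu structures on the PHS should be simplified."),
]
for p in lint_requirements(reqs):
    print(p)
```

The second record fails twice — it uses "should" and carries no "shall" — which is exactly the kind of design intent masquerading as a requirement that paralyzed V&V.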
§820.30 (d) Design Output
“Each manufacturer shall establish and maintain procedures for defining and
documenting design output in terms that allow an adequate evaluation of
conformance to design input requirements.”
There were two new concepts here: documenting outputs and evaluating
conformance of those outputs to the inputs. Now that we had clear
requirements, how did our design satisfy them? This was where we redefined
specification to help Digital Health Group employees use the word in the
same way that the FDA did, so that the result of the specification could be
answered in a measurement (versus the yes or no of a requirement). The
final specification, for font size on the device, read like this: “The PHS
computer related interfaces shall use a font visually equivalent in size to 14
point Verdana.” This is an important specification; although at first it might
not appear so. Our users for the device are patients who may have visual
impairments, limited mobility, and multiple disease states. We could not
presume their vision to be 20/20, or that their dexterity was equivalent to an
athletically fit person. We had to make it easy for our user (the patient) to read
the information presented on the device. The first version of the specification
was much less precise and not measurable: there was no specific mention of
font size or type, nor any definition of where this would apply.
The second area, conformance of outputs to inputs, meant that we had to get
skilled at translating a requirement into a specification. In the example just
cited, the final version of the requirement read: “PHS user interface, PHS
education content, and PHS training material shall be suitable for use by
patients with corrected vision adequate to read a typical large print magazine,
such as the Readers Digest large print edition.” In other words, this is the what.
Now, the specification made sense: the how of satisfying the requirement was
14 point Verdana font on the PHS interface.
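The what/how split can be expressed as a measurable check: the requirement is answered yes or no, while the specification yields a measurement against a threshold. The function below is a hypothetical sketch; the threshold and font name come from the example above, and the simplification is ours:

```python
def meets_font_spec(font_name: str, point_size: float) -> bool:
    """Specification check (the 'how'): measurable pass/fail.
    'Visually equivalent in size to 14 point Verdana' is approximated
    here as Verdana at 14 pt or larger -- an illustrative simplification."""
    return font_name == "Verdana" and point_size >= 14.0

# The requirement (the 'what') -- readable by users of large-print
# material -- is then verified by applying this check to each PHS screen.
assert meets_font_spec("Verdana", 14.0)
assert not meets_font_spec("Verdana", 10.0)
```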
§820.30 (e) Design Review
“Each manufacturer shall establish and maintain procedures to ensure that
formal documented reviews of the design results are planned and conducted at
appropriate stages of the device’s design development.”
The CFR goes on to provide additional clarity on who needs to attend these
reviews, and to stipulate that the results need to be archived in the Design
History File (DHF). As an engineering company, Intel was familiar with
performing design reviews during the course of product development, but the
premise of a formal review, with specific attendees who could not be omitted,
and which was documented and archived, was a new twist. E-mail meeting
summaries, which our group was well accustomed to, were no longer sufficient.
These often had insufficient information on the attendees (what roles they were
fulfilling), too much information on the dialogue, not enough information
on any decisions made during the review, and were not archived in our
controlled document repository. “If it isn’t documented, it didn’t happen”
became a familiar refrain, particularly when the subject of design reviews
was discussed. Our initial SOP was overly burdensome, people were not
effectively trained on it, and we were haphazard in the archiving of the review
and the materials covered in the review. When we did archive material, we
were inconsistent in tracking our Action Required items (ARs). Were the ARs really
related to the form, fit, or function of the product? If so, then we had to have
objective evidence that they were addressed and closed. If they were related
to general process or activities (such as “Send Alan the e-mail on the new SW
code.”), then it did not need to be archived in a design review. This distinction
has taken some time to become clear in many people’s minds. In addition to
the mechanics of design reviews, we had to clearly define types of review, and
we had to start to use the review terms the way the FDA meant them. Design
Input, Design Output, and Design Transfer reviews hold a particular meaning
for the FDA, and these happen only once in a product’s development. The
employees in the Digital Health Group felt that many design reviews were
design input reviews, so again, our training programs had to carefully clarify
the new meanings we were embracing, as a medical device manufacturer.
Design Review is an area we continue to focus upon, balancing between
simplification and detail, with the goal of realizing full business value.
§820.30 (f) Design Verification
“Each manufacturer shall establish and maintain procedures for verifying the
device design.”
Verification means to the FDA that our outputs clearly match our inputs:
did we build what we said we were going to build? It was during our initial
attempts at verification that we really started to understand the cost of a poorly
written specification. If we had a specification that could not be measured,
then how could we verify that the design met the specification? Therefore,
we started to get our verification team involved in the refinement of the
specifications themselves: we engaged them in the upfront work, so that they
could better perform their roles later on.
We also realized we could perform verification more efficiently. We previously
had the perception that verification was something we did only at the end of
the project, and we did not realize that there was much valuable verification
work happening upstream of the final verification. This work met the intent
as well as the requirement for thorough verification, but we had not been
taking credit for it by documenting the right objective evidence. For example,
engineering verification tests, code walk-throughs, stack-up analyses and other
engineering tasks all could play a role in substantiating verification. We did not
realize this at first, and so did some redundant tests. Recognizing that work
we were already doing would help us substantially in the end was helpful in
getting engineers to embrace the QMS. This was a clear case where we were
already doing good work, but not giving ourselves proper credit (the objective
evidence) for it.
§820.30 (g) Design Validation
“Each manufacturer shall establish and maintain procedures for validating the
device design.”
Validation, as practiced in Intel’s core businesses, involves thousands of hours
of laboratory testing with commercial software, and the use of engineering
samples by validation partners. Validation, to the FDA, means that our
final product satisfies the intended use we defined. Does our product “allow
caregivers to remotely access vital sign measurements of patients at home”?
Does it “collect and store information and display it accurately on the LCD
screen”? We were so occupied with the design and development activities that
we did not internalize that the only way to validate our design, in the FDA
meaning of the word, was to test our device with real users or at least with
people who fit the profile of our real users. How do we set up those tests? How
do we recruit users? How do we document the results, and what do we do
with all the feedback received? Sorting through the information derived from
these tests, and determining which feedback meant the device did not meet the
intended use, and which feedback was a potential future enhancement, took
tremendous time that had never been planned into the project schedule.
§820.30 (h) Design Transfer
“Each manufacturer shall establish and maintain procedures to ensure that the
device design is correctly translated into production specifications.”
Design transfer simply means that product designs must be converted
into production specifications that allow the product to be manufactured
repeatably and reliably, and that have been validated to meet product requirements.
This requires a correct and complete product design, clear specification of
required manufacturing process capabilities, appropriate procedures for
moving the design from the development team to the production team, and
well documented testing processes. It is imperative that device design be
transferred to the manufacturing floor accurately, so that the device quality is
not compromised. The detail necessary to accomplish this objective can vary
considerably, depending on the type and class of device, intended use, and the
knowledge transfer activities between the design and manufacturing teams. The
Digital Health Group engineering team now had to learn how to effectively
move the design into a High Volume Manufacturing environment. Further
complicating the transition was our prototyping strategy, which involved
transferring several revisions of the design to the manufacturing team in
order to support field tests. Although the engineering team often felt we were
constantly doing design transfer, we had to improve our processes to fully meet
FDA requirements.
§820.30 (i) Design Changes
“Each manufacturer shall establish and maintain procedures for the
identification, documentation, validation, or where appropriate, verification,
review, and approval of design changes before their implementation.”
We had to learn when to start tracking changes, as mandated by the CFR,
versus when changes were simply the natural evolution of the design and not
subject to inspection. We had to establish a formal change control process, so
that we could ensure that all the steps mandated were satisfied, and that we had
the right objective evidence to support the decision to invoke the change. The
change control process we defined had attributes that we were becoming quite
familiar with, such as the definition of the right roles to be represented during
the decision making, the definition of the evidence necessary to support the
decision, and the appropriate archiving of the decision as well as the evidence.
We held twice-weekly change-control board meetings, and sometimes
emergency meetings, to start to indoctrinate the team in this process. Our
biggest problem, prior to the change-control process improvement, was deciding
when to put a design element under formal design controls. Like many start-up
medical device manufacturers, we put everything under formal control, right
out of the gate. This added a significant amount of overhead and, frankly, pain
to our design teams. It made the design and development work cumbersome
and difficult for the PDT. We streamlined the process by giving the PDT two
forms of change control: Program Revision Control, leaving change control in
the hands of the PDT where it belongs, and Formal Revision Control, used for
those design elements that need full control early in product design
and development activities (see Figure 6). As the product evolves, so do the
design elements that support the design and development activities. This
graduated approach, in which formal control expands as the design matures,
has significantly reduced the amount of overhead associated with design changes.
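As a sketch, the two-tier scheme can be expressed as a simple rule. The phase names follow Figure 6; the idea that certain elements are flagged for formal control from the start is an assumption for illustration, not a description of the actual Digital Health procedure.

```python
# Illustrative sketch of the two-tier change-control scheme shown in
# Figure 6. The phase names follow the figure; the rule that flagged
# elements need formal control from the start is an assumption here.

PHASES = ("design", "development", "transfer")

def control_tier(needs_early_formal_control, phase):
    """Return the revision-control tier that governs a change.

    Elements flagged for early formal control, and everything at the
    transfer phase, fall under Formal Revision Control; the rest stay
    under the PDT's Program Revision Control.
    """
    if phase not in PHASES:
        raise ValueError("unknown phase: " + phase)
    if needs_early_formal_control or phase == "transfer":
        return "Formal Revision Control"
    return "Program Revision Control"
```

In this sketch, most design elements stay under the lighter-weight tier until transfer, which is the point of the streamlining described above.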
Figure 6 plots the two tiers over the Design, Development, and Transfer
phases: most elements sit under Program Revision Control early on, with
Formal Revision Control expanding as the product approaches transfer.

Figure 6: Change Control Process
Source: Digital Health Group, Intel Corporation
§820.30 (j) Design History File (DHF)
“Each manufacturer shall establish and maintain a DHF for each type of
device.”
This mandate is the ultimate test of “if you didn’t document it, it didn’t
happen.”
The intent of a DHF is to create a repository for all design and development
work: one place where all design inputs, design outputs, tests, test outputs, etc.
are gathered. No longer was this to be stored on someone's computer; the
information needed to be available to the team. The contents of a DHF
provide objective evidence that we followed our design and development plan.
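As an illustration of what a single gathering place enables, consider a minimal DHF index. The category names loosely follow the subsections of §820.30, but the record format and the function itself are hypothetical.

```python
# A minimal sketch of a DHF index: one record per artifact, grouped by
# design-control category, so a reviewer can see at a glance whether a
# category is empty. Categories loosely follow 820.30; the record
# format is an assumption for illustration.

from collections import defaultdict

CATEGORIES = ("design_input", "design_output", "verification",
              "validation", "design_review", "design_transfer")

def dhf_gaps(artifacts):
    """Return the design-control categories with no artifacts filed.

    `artifacts` is a list of (name, category) pairs for items in the DHF.
    """
    filed = defaultdict(list)
    for name, category in artifacts:
        filed[category].append(name)
    return [c for c in CATEGORIES if not filed[c]]
```

A check like this makes "one place for everything" auditable: an empty category is immediately visible, rather than hidden on someone's computer.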
Our early DHFs were haphazard. They were complicated by the fact that
we had multiple document repositories, due to transitions within Intel for
document storage, as well as the reality that each engineer had his or her own
favorite repository or Web site. We had a controlled document repository that
was difficult to use, and that accepted only physical signatures for approved
documents, a true anomaly in an IT-based, e-mail-centric corporation. The first
task we faced with the DHF was corralling anything that could be identified
as a DHF item. We cast a wide net, too wide as it later turned out. The second
task was to get the right level of physical approvals on the artifacts — not
an easy task in a business group with employees in many separate physical
locations. The third task was actually loading the items into our controlled
repository and appropriately storing the physical copies. Finally, we could go
about the task of ensuring that the records actually demonstrated that “the
design was developed in accordance with the approved design plan and the
requirements of this part.”
New Product Life Cycle, Expanded and Aligned with the Code of
Federal Regulations
Based on our new understanding of both the CFR and ISO, it became obvious
that we needed our own unique PLC, separate from the rest of Intel, to ensure
that we were following the CFR and the appropriate design controls (see
Figure 7).
The PLC phases run from Research through Explore, Assess Feasibility,
Market Planning, Design and Develop Product, Market/Sell Product,
Manufacture/Distribute Product, and Support Product. The decision gates
are Concept Approval (CA); Product Overview Proposal Levels 1-3 (POPL1,
POPL2, POPL3); Production Release Authorization (PRA); Product Ship
Candidate (PPC); Ship Release Authorization (SRA); Regulatory Clearance;
Product Discontinuance Authorization (PDA); and End of Life (EOL).

Figure 7: The Digital Health Product Life Cycle
Source: Digital Health Group, Intel Corporation
We spent several weeks decomposing the inputs and outputs for every major
activity in the Digital Health PLC, ensuring a clear mapping to the entire
Quality System Regulation, 21 CFR 820, and laying out dependencies and
deliverables. Once we had a good handle on those aspects,
we were ready to create our own PLC. We started with the Intel corporate
PLC, because we knew that our engineers had been well trained on it and
because we knew we would have to correlate back to it when we did the
training on the Digital Health Group’s version. We renamed major blocks of
activity to match FDA and ISO terminology.
We defined the phases as follows:
•• Research includes the study of user needs, ecosystems, and new
technologies.
•• Explore includes the activities that deliver the initial investigation of a
product or platform opportunity.
•• Assess Feasibility includes the activities that analyze the business, user, and
technology feasibilities of a concept.
•• Market Planning includes the activities that define the marketing approach
and sales strategy for the product.
•• Market/Sell Product includes the activities that execute the marketing and
sales plan.
•• Design and Develop Product includes the activities that lead to the
creation of the product.
•• Manufacture/Distribute Product includes the activities that create the
final product for sale, and that transfer the product from the manufacturer
to the customer.
•• Support Product includes the activities that enable the successful use of the
product, once it is received by the customer.
We created separations between the blocks, so that the decisions, handoffs,
and design control expectations were clear. Marketing a medical device, either
in the United States or the European Union, is quite different than Intel’s
standard marketing activity. We had new restrictions on what we could say, to
whom we could say it, and at what stage in the product development life cycle
we could say it. Therefore, we also created a separation between marketing and
design and development, as well as a separation between the market planning
and marketing and selling activities.
From Feasibility to Design and Development
Now that we had a clear distinction between the research, exploration, and
design/development spaces, we needed to decide how to capture the early work
simply, in a manner that supported management decision-making, but also
supported the brainstorming and innovation we needed. We learned that there
are business artifacts that are not required by (or reviewable by) the FDA or
ISO, but we knew our group needed the business analyses contained in those
business artifacts to make solid resource allocation decisions. Intel’s approach to
these decisions focuses on the three vectors of Technology, User, and Business,
so we used those vectors as the basis of our Concept Requirements Document,
the template and artifact for capturing the early thinking. The template has
six major sections: overview, marketing strategy, users, concept functionality,
geographies, and customer engagements. We provided many examples, and
we also established the mechanism (relying on our now trusted friend, Design
Review) for moving the information across the separations between feasibility
and design and development planning, and market planning.
Quarantining Work Products
One of the fundamental challenges early in the PLC is properly transitioning
the artifacts developed outside of design controls (during Explore and Assess
Feasibility phases) into the Design and Develop Product phase. Work
products of the early phases include lists of possible requirements, design
alternatives, potential features, mockups, prototypes, etc. Because they were
developed outside of design controls, they usually do not have carefully
documented plans, objective evidence, and traceability. In other product groups
at Intel, it is perfectly natural to continue the development and maturation
of product prototypes through downstream PLC phases. In regulated medical
development, however, there must be a logical firewall at the divide between
the Assess Feasibility phase and the Design and Develop Product phase.
We can’t just carry artifacts forward into the design and development process.
Yet we need to capture the value of the work done during the early phases.
We needed a mechanism to prevent excursions from design controls, without
throwing away useful work.
In the Digital Health Group, we created a landing zone for those artifacts
under a set of internal systems and processes using Subversion* (a software
version control repository), which is a validated tool under Intel’s design
control and quality management system.
As concepts are promoted from Assess Feasibility into the Design and
Develop Product phase, we perform a series of reviews with the originating
research teams, the user experience team, the marketing team, the quality
and regulatory team, and any external vendors who may have participated
in analysis or prototyping efforts. All of these reviews are done in an effort
to assess which assets might be transferable to our quarantine area within
Subversion. Once in the quarantine area, they undergo further examination,
modification, adaptation, and evaluation for reuse, as we develop the functional
requirements, engineering specifications, patterns, and designs for the product.
206 | Developing Consumer and IT Platforms in a Regulated Medical Device Environment
Intel® Technology Journal | Volume 13, Issue 3, 2009
Much like any technology, vendor, or software-of-unknown-provenance
(SOUP) component that might be selected for use in the design and
development process, a work product crossing from Assess Feasibility to
Design and Develop Product must be carefully selected and vetted to ensure
that the proper hazard and risk analysis have been done and that the work is
thoroughly documented and reviewed. Moreover, the V&V processes must
provide for the correct level of objective evidence and traceability to show
that the eventual released product has been properly developed under the
appropriate levels of design control.
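The gate this implies can be sketched as a simple eligibility check. The review names mirror the teams listed above, but the function, its rule, and the idea of encoding it this way are hypothetical illustrations, not the group's actual tooling.

```python
# Hypothetical sketch of the quarantine gate: a feasibility-phase asset
# may enter the quarantine area only after the required team reviews
# and a hazard/risk analysis are complete. Review names mirror the
# teams mentioned in the text; the rule itself is illustrative.

REQUIRED_REVIEWS = {"research", "user_experience", "marketing",
                    "quality_regulatory"}

def may_quarantine(completed_reviews, hazard_analysis_done):
    """True if an asset is eligible for the quarantine area.

    `completed_reviews` is the set of review names already performed.
    """
    return hazard_analysis_done and REQUIRED_REVIEWS <= completed_reviews
```

The point of the check is the same as the firewall described above: nothing crosses from Assess Feasibility into design controls by accident.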
Risk Management — Do it Early, Do it Often
Risk Management is a prime example of a term that means one thing to Intel
at large, and something very different to the FDA and ISO. To Intel, it means
the risk to our brand identity, the risk that the CPU may malfunction, or the
risk that the notebook may get too hot, because of our circuit design. To the
FDA and ISO, it means the risk to the user(s) of the product we’re designing
and manufacturing: will it harm them? How are we making sure that we have
thought about the product safety risk, as we design and develop our product?
The ISO places more emphasis on risk than does the CFR; we therefore work
to the ISO standard.
The purpose of implementing a Product Risk Management Plan is to establish
and maintain a documented process for identifying hazards associated with a
device (the potential source of harm) and safety risks (combining the probability
of occurrence of harm and the severity of that harm); estimating, evaluating, and
controlling them; determining the acceptability of residual risk; and then
monitoring the effectiveness of the controls during the PLC (see Figure 8). We
learned that the FDA and ISO expect product risk management to be
implemented early and maintained throughout the entire PLC. We also
realized that we could derive benefits from embedding product risk
management into our thinking. By establishing a metric for the severity of a
given risk, and establishing the likelihood of occurrence of the risk, we could
quickly determine if the risk was acceptable, that is, as low as reasonably
practicable (ALARP), or intolerable. Intolerable risks clearly had to be addressed
and mitigated, but acceptable risks required no further action. By integrating this
thinking into our SOPs and templates, design teams were able to work more
effectively, putting their efforts into those areas where risks were intolerable
rather than trying to mitigate risks that were acceptable.

(A) Planning: scope and timing; risk analysis team selection; communication/reporting
(B) Risk Analysis: intended use identification; hazard identification; risk estimation
(C) Risk Evaluation: risk acceptability decisions
(D) Risk Mitigation: option analysis; implementation; residual risk evaluation; overall risk acceptance
(E) Post-SRA Risk Monitoring: new hazards/failure modes; existing risks no longer acceptable

Figure 8: Risk Management Flow
Source: ANSI/AAMI/ISO 14971:2007
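A hypothetical sketch of that severity-and-likelihood screen, with invented 1-3 scales and an invented threshold, purely to illustrate the mechanism; a real risk management plan defines and justifies its own criteria.

```python
# Hypothetical severity x likelihood screen for risk acceptability.
# The 1-3 scales and the cut-off are invented for illustration; they
# are not the Digital Health Group's actual criteria.

INTOLERABLE_THRESHOLD = 6  # assumed cut-off on the 1-9 product scale

def risk_acceptability(severity, likelihood):
    """Rate a risk 'acceptable' (ALARP) or 'intolerable', scales 1-3."""
    if not (1 <= severity <= 3 and 1 <= likelihood <= 3):
        raise ValueError("severity and likelihood must be rated 1-3")
    if severity * likelihood >= INTOLERABLE_THRESHOLD:
        return "intolerable"  # must be mitigated before release
    return "acceptable"       # as low as reasonably practicable
```

Embedding a lookup like this in templates is what let design teams spend mitigation effort only where the screen came back intolerable.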
Defined User Need: device and labeling must be easily read by a visually impaired user
Specification: PHS interfaces shall be visually equivalent to 14 pt Verdana
Output (Component/Product): screens and labeling materials in 14 pt font

Figure 9: Verification
Source: Digital Health Group, Intel Corporation
Validation and Verification — Not at the Same Time
Verification and Validation (V&V) were not new concepts to our group,
nor to Intel, but the execution of them in a medical device manufacturing
environment was wholly different. Initially, our thinking was that they were
similar in intent, and could easily be run in parallel: we were wrong.
What we had to learn was that validation, ensuring that our device conformed
to the user needs, was insufficient if verification (ensuring that our design
output met our design input requirements) was not completed. Therefore, if we
did not verify and validate our design in sequence, we would end up redoing
a lot of work. Granted, the more often we did V&V, the better we became,
but it was an unnecessary step that we imposed upon ourselves, due to a lack
of full comprehension of what was needed. We initially had a bad habit of
pushing engineering testing “over the wall” to the verification teams. Instead
of verifying that our design outputs met our inputs, the verification teams
would do code analysis, mechanical debugging, and whatever else they could
think of to measure. In truth, correctness to specification was what we needed
to be measuring. This is when we understood the value of a clear, measurable
specification, just as in the readability versus 14 pt font case we referred to
earlier. We needed to shift our focus, from testing everything we thought of,
to improving the specification so that it was measurable, and then defining the
test that would substantiate the verification. Once defined, we had to execute
the test and provide the objective evidence that it had been satisfied. If there
were exceptions to the result, we needed to clearly understand and explain the
reasons for the exceptions. We needed to assess whether we needed a different
test, or whether the exception was acceptable, based on our specifications and
our design. All of this needed to be clearly documented and approved, before
we could consider the design to be verified.
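As a sketch of what "measurable, then testable" looks like, using the 14 pt example: the function and the screen data below are hypothetical stand-ins for a real UI inspection harness, not the group's actual verification tooling.

```python
# Sketch of turning a measurable specification into an executable
# verification check, using the 14 pt font example from the text.
# verify_font_spec() and its inputs are hypothetical stand-ins for a
# real UI inspection harness.

SPEC_MIN_FONT_PT = 14  # "visually equivalent to 14 pt Verdana"

def verify_font_spec(measured_screens):
    """Return the names of screens that fail the specification.

    `measured_screens` maps a screen or label name to its measured
    font size in points; an empty result is the objective evidence
    that this design output meets its design input.
    """
    return [name for name, pt in measured_screens.items()
            if pt < SPEC_MIN_FONT_PT]
```

Note that the test verifies correctness to the specification, not "whatever the tester can think of to measure", which is the shift described above.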
Defined User Need: device and labeling must be easily read by a visually impaired user
Specification: PHS interfaces shall be visually equivalent to 14 pt Verdana
Output (Component/Product): screens and labeling materials in 14 pt font

Figure 10: Validation
Source: Digital Health Group, Intel Corporation
We also had to adopt a definition of validation that required a user (or someone
who represented our target user group) to sit down with the device and use it.
The question we had to ask ourselves was “does our finished device correctly
satisfy the intended use?”
We had to accept that our users were more concerned about the usability of the
user interface than they were about disk access speed. Our validation efforts
needed to be focused on the real need, not our comfort zone.
Were the instructions in the manuals sufficient to enable the user to actually
understand and use the device? Or, had we assumed too much, due to our own
intimate familiarity with the device? This was the true test of all our research,
design, and development: did we really understand our users and had we met
their needs? We needed to go beyond simulation and get true feedback: can
our device be effective with our users? Validation testing is about closing the
loop. This kind of validation process was substantially more involved than what
we had initially planned and scheduled.
Complete Traceability — No Requirement Left Behind
Design traceability, to the FDA, means that inputs correlate to user needs
and risks, outputs match inputs, and test cases support the V&V of a device.
Perhaps most importantly, traceability means that objective evidence exists
for all of these linkages, and it has been appropriately created or obtained
and properly reviewed and dispositioned. We have discussed traceability in
previous sections — how it starts with clear, unambiguous and non-conflicting
requirements. We addressed the need to define our outputs clearly, the
importance of writing a good specification, and of crafting appropriate plans
for the execution and development of the design. Even with all the work we
had done on inputs and outputs, we really did not completely understand
the concept of traceability until we needed to document it. We found that
even with all our testing, there were still some things that had been missed,
things we needed to resolve. We discovered that typos can stymie the best of
teams. We discovered that requirements or specifications that we thought had
been changed were not changed in our system of record. We discovered that
any manual processes for tracking were always subject to human error — not
exactly an earth-shattering revelation — but one that cost us time and resources
to correct.
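The manual tracking that bit us can be contrasted with an automated gap check. A minimal sketch, in which the record shapes (requirement IDs mapped to design outputs and to test results) are assumptions for illustration:

```python
# Minimal sketch of a traceability gap check: every requirement must
# trace forward to at least one design output and one passed test with
# objective evidence. The record shapes here are illustrative only.

def untraced(requirement_ids, outputs_by_req, results_by_req):
    """Return requirement IDs lacking a design output or a passing test.

    `outputs_by_req` maps requirement IDs to lists of output IDs;
    `results_by_req` maps requirement IDs to lists of test results.
    """
    gaps = []
    for req in requirement_ids:
        has_output = bool(outputs_by_req.get(req))
        has_evidence = "pass" in results_by_req.get(req, [])
        if not (has_output and has_evidence):
            gaps.append(req)
    return gaps
```

A check like this surfaces exactly the failures described above: requirements everyone thought were covered, but for which the system of record holds no evidence.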
Objective Evidence Standard
21 CFR 820, Subpart D pertains to Document Controls. “Each manufacturer
shall establish and maintain procedures to control all documents that are
required by this part.” It speaks to document approval and distribution, as well
as to document changes.
Our challenges in this area were twofold: implementing the right
documentation and the right archival process. Our first task was to ask
ourselves whether we were creating the right documentation. We were
certainly creating a lot of documentation, but getting the teams aligned on
the appropriate documentation was challenging. Where was the objective
evidence that we had the right documentation? E-mails, meeting minutes, or
off-line conversations were not acceptable: these could not be presented to an
auditor. Helping people to understand "if it's not documented correctly, it
didn't happen" was a big shift in our engineers' thinking. Again, our templates
and team training sessions (some formal, some ad hoc) were the key to getting
this message out. Our second challenge was the archival process. Were we
following the right procedures when seeking approval of a document and were
we effectively archiving it? We had taken the time to establish a controlled
document repository, which had been validated for its intended use (software
tool validation is a different topic, and is not covered in this article). However,
we did not have an electronic signature tool that had been specifically validated
for that purpose: we were initially able to use hard-copy signatures only. This
was a real mind shift for a geographically diverse business group that thrives
on e-mails. We learned how to delegate to administrative assistants the tasks of
obtaining physical signatures and sending faxes. We learned that any employee,
no matter what his or her title, can always carry a few signed documents back
to the document control office for archiving. We learned that no one goes out
of town without putting a signature delegation authority in place, so as not to
impede our progress. We learned most of these best-known-methods (BKMs)
the hard way, as we balanced project deadlines with QSR imperatives. We also
learned we really wanted electronic signatures and that we needed to make
the investment in money, time, and people to establish a validated system that
supported this capability.
Adapting to New Ways of Working
From a software engineering perspective, there are many different
methodologies for designing and developing new software platforms. Under
regulatory requirements, the medical device industry has operated for the most
part by using a waterfall or modified waterfall development model — also
known as the Big Design Up Front model.
In a waterfall model, things happen in a cascaded and sequential fashion.
Progress may be slower, as finish-to-start relationships pace the movement from
phase release to phase release. Modified models allow for sequential overlap
between design and implementation (the Sashimi model). Sashimi models
add some additional risk, because they may require further levels of branch
prediction with regard to which phases of the design can be coded first and
to how forgiving the program plan is of mid-stream changes downstream.
Regardless of the development model, regulated design and development
requires clear design inputs (product requirements, product architecture,
engineering specifications, lower-level subsystem design, functional engineering
specifications, and V&V plans) which lead to a series of formal design output
reviews, in order to provide actual design requirements for downstream code
development. Code development works from a planned release process through
a series of software milestone releases and formal V&V testing gates, until a
final Release Candidate can be produced. Completely documenting all the
objective evidence against 100 percent of the originating design inputs is a
complex and resource-consuming process.
Our software development process in the Digital Health Group improved
dramatically when we developed a clear understanding of what is necessary for
the initial Design Input review:
1. A clear understanding of the product or platform architecture, roles, actors,
constraints, assumptions, risks, hazards, and security threat matrices.
2. A clear set of product-level requirements, defined and approved.
3. A thorough, completed analysis of each of the subsystem designs, including
the unit tests, code inspections, and functional engineering specifications
needed to properly verify the product.
4. All design artifacts, decomposed in such a way that the V&V teams can
create viable test plans to present the clear and objective test evidence
necessary to support the original product requirements.
If any link in this chain breaks, then you run the risk of a process loop. Some
process loops are worse than others, and in the first development cycles of the
Intel Health Guide, we had several.
Early in the V&V of the Alpha and Beta releases, it became clear that we had
missed requirements. We also had many interpretive requirements, in which
testers felt that the application should or should not function a specific way,
regardless of how we had designed it. Yet there were still other cases where we
misinterpreted the intent of a requirement.
All of these program interrupts caused a series of redesign efforts, as well as
re-planning and re-engineering of the platform software subsystems and UI,
which produced small program loops and resets. The Software Engineering
organization clearly understood the end-goal, and we approached each setback
with that final goal in mind, which was to produce a high-quality product that
was reliable, safe, and in compliance with all regulations.
We left that initial platform development experience with a new attitude
towards our own ability to control design specifications and a commitment to
drive deeper clarity into the processes that fed into our work.
Lessons Learned and New Directions
Fortunately, engineers are not generally gluttons for punishment. After the
initial release of version 1.0 of the Intel Health Guide, we stepped back and
analyzed several elements of the program that we believed we could directly
control and could significantly improve. This would inevitably produce higher-quality software in a more controlled and predictable fashion. Our analysis of
the process also drove us in several additional directions.
First, we had to further define the methods by which we would operate as a
software development organization:
•• Formalize our software development plan structures.
•• Develop our software subsystem designs and output processes.
•• Perform more detailed task-level, peer code reviews.
•• Implement more rigorous code version and release control.
•• Follow secure coding standards.
•• Schedule design effort by task, loaded by risk and by developer, in order
to produce better schedule quality and predictability.
•• Rigorously adhere to the quality management system SOPs and SOIs that
control the software engineering process.
Second, we needed to separate the higher-level product requirements from the
functional engineering specifications of the platform subsystems. The PRD, as
stated earlier, is a component of the “What are we designing?” section of the
design control process. Along with the initial hazard analysis and the Platform
Architecture Document, these three pillars define what the platform will be.
Below the what level of design control, we realized that we needed to define
the subsystem and functional engineering specification outputs in such a way
that no part of the design and development of the platform was left open to
interpretation. This next-level design output would drive absolute traceability
and transparency into the how portion of the design control process. In order
to do this, we asked that each Platform Architecture Document specify the
subsystems in the platform design which would need to be thoroughly vetted
as individual lower-level design artifacts.
As a result of this change in the PLC, the Software Engineering organization
created both a Software Subsystem Design (SSD) artifact and a Systems
Engineering Specification (SES) artifact.
Platform architecture for client-server enterprise applications, such as the
Intel HCMS, might comprise a dozen or more major subsystems at the
architecture and platform design level. These might be represented as a Security
SSD, a Manageability SSD, a UI SSD, a Data Modeling SSD, a Peripheral
Connectivity SSD, a Communications SSD, or an Infrastructure SSD.
Each SSD is a specific how implementation design document for a given
subsystem. Not only does it deal with the next-layer design definitions, it also
represents the constraints, assumptions, standards, and implementations of
that subsystem. The SSD addresses the overall subsystem design: it defines
library interfaces, code classes, algorithms, service methods, attributes and
implementations, SOUP, component interfaces, and other functional elements
of the subsystem design.
A table in each of the SSD documents specifies both the functional and
non-functional engineering specifications for that subsystem. It also defines
the test methods for each specification: whether they will be covered by unit
test cases, code inspections, or functional tests.
Any functional engineering specification requiring some level of functional test
coverage is automatically promoted to the SES, once the subsystem design is
completed and approved. The SES provides for a traceability link between the
PRD, the Initial Hazard Analysis, and the lower-level subsystem design itself.
It also forms the basis for the Verification Test Plan portion of the overall V&V
plan.
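The promotion rule above, in which any functional engineering specification that needs functional test coverage moves into the SES once the subsystem design is approved, could be sketched as a simple filter. Everything here (field names, record shape, IDs) is our own illustration, not the actual artifact format:

```python
def promote_to_ses(specs, design_approved):
    """Return the specs that belong in the SES once the design is approved."""
    if not design_approved:
        return []  # promotion happens only after design review sign-off
    return [
        s for s in specs
        if s["functional"] and s["test_method"] == "functional test"
    ]

# Hypothetical SSD rows; prd_ref / hazard_ref carry the traceability
# links back to the PRD and the Initial Hazard Analysis.
ssd_specs = [
    {"spec_id": "UI-003", "functional": True,
     "test_method": "functional test",
     "prd_ref": "PRD-4.2", "hazard_ref": "HAZ-17"},
    {"spec_id": "UI-004", "functional": False,
     "test_method": "code inspection",
     "prd_ref": "PRD-4.3", "hazard_ref": None},
]

ses = promote_to_ses(ssd_specs, design_approved=True)
```

Because each promoted entry keeps its PRD and hazard-analysis references, the SES can double as the traceability record feeding the Verification Test Plan.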
The SES is, of course, iterative through the design and development phase of the
PLC. Different subsystem designs are completed at different stages prior to the
final design output review.
During that design and development process, the SES matures — along
with the other design artifacts — providing the verification test engineers
with a clear understanding of the design outputs they will use to build
their automated and/or session-based tests. It is equally important for the
verification test engineers to work closely with the design review teams for
each subsystem design. Each SSD becomes an artifact that Test Engineering
can refer back to for the development and execution of the final test plans for
V&V.
Developing Consumer and IT Platforms in a Regulated Medical Device Environment | 213
Intel® Technology Journal | Volume 13, Issue 3, 2009
Improving Quality, Predictability, and Design
Assurance
In version 1.0 of the Intel Healthcare Management Suite’s development, we
often heard our engineers ask, “When is done, done?” This question came out
of many frustrations, as we collectively ground down through several internal
release cycles and subsequent field trials of the product.
Engineering reviews of test data output with the platform V&V test teams
exposed individual tester interpretations of poorly worded system requirements,
which in turn led to escalated Product Design and Systems Validation review
meetings. In addition, implied requirements and even boundary and edge-case
functional tests were being performed in formal V&V sessions, because of a
lack of shared understanding between the development and test engineering teams.
In the absence of a clear requirement, the test teams continued to push the
envelope until the product ceased to function, and then it became a product
defect. It became increasingly clear that we could not operate as an effective
and efficient team if we did not have better control over all aspects of our
design, development, and V&V processes.
After the 1.0 release, we made a number of significant improvements in our
ability to gather, define, and prioritize requirements and specifications in the
hope that this would improve the overall design methodology, test framework
implementation, test integration, and predictive control of the testing process.
All of this led to a clearer definition of the intended development work by each
engineer and a very clear expectation of what was to be tested and how.
Other improvements came in the area of release milestone task
definition (what elements were to be developed, and when). This drove a
better understanding of development across the organization, including which
requirements and which known defects would be addressed in each release phase.
It is essential that organizations gain agreement up front and align on formal
testing plans under a phased development release model, so that anyone testing
any release phase of a platform clearly understands what to expect at that
point in the code development process.
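One lightweight way to record such an agreement is a per-phase manifest that anyone testing a release can consult. The phase names follow the text; the requirement and defect identifiers are invented for illustration:

```python
# Per-phase release manifest: which requirements are expected to be
# satisfied, and which known defects are deliberately deferred.
release_plan = {
    "Alpha": {"requirements": ["REQ-001", "REQ-002"],
              "deferred_defects": ["DH-055", "DH-060"]},
    "Beta":  {"requirements": ["REQ-001", "REQ-002", "REQ-003"],
              "deferred_defects": ["DH-060"]},
    "RC":    {"requirements": ["REQ-001", "REQ-002", "REQ-003", "REQ-004"],
              "deferred_defects": []},
}

def expected_at(phase, requirement):
    """Should a tester expect this requirement to be implemented at this phase?"""
    return requirement in release_plan[phase]["requirements"]
```

With a manifest like this, a tester exercising an Alpha build knows not to log missing REQ-003 behavior as a defect, because it is not expected until Beta.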
With better design definition, cross-organizational design reviews, more
consistent code reviews, clearer software project plans, and release management
phase controls, we became much better at predicting the actual schedule of
each of the development phase gates and the overall quality of the software we
delivered at each milestone.
Another major improvement area for us was the way we performed V&V
testing cycles at certain release milestones. In version 1.0, every software release
phase immediately entered a formal test cycle. This caused lots of internal
congestion and confusion, because testers running formal session-based testing
on incomplete release code (Alpha, for instance) expected to see feature
implementations and therefore logged many application defects.
In redefining our test processes for the next version of the software, we
defined two additional test cycles in the Alpha and Beta phases of development,
known as the Developer Test (DT) cycle and the Characterization Test (CT)
cycle. Developer Tests are run by developer peers within the team, checking
each other’s code in an informal, integrated release runtime environment.
The developers run the verification test cases on their own code as well, but
in an informal manner, in order to find defects, issues, or implementation
problems with the code. After the software engineers run the DT and complete
their cleanup of known defects, the Alpha release (for example) is handed to
Verification, Validation, Technical Marketing, and other teams in order to run
CTs against that release.
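The DT-to-CT hand-off described above, where a release reaches Characterization Test only after the developer test cycle finishes and its known defects are cleaned up, can be pictured with a small phase model. The phase and gate names follow the text; the class itself and the defect IDs are our own invention:

```python
class ReleasePhase:
    """A development release phase (e.g. Alpha) moving through DT toward CT."""

    def __init__(self, name):
        self.name = name
        self.dt_complete = False   # has the Developer Test cycle finished?
        self.known_defects = []    # defects logged during DT, awaiting cleanup

    def finish_dt(self, open_defects):
        self.dt_complete = True
        self.known_defects = list(open_defects)

    def resolve(self, defect_id):
        self.known_defects.remove(defect_id)

    def ready_for_ct(self):
        # Hand off to Characterization Test only after DT is done and
        # the known defects from DT have been cleaned up.
        return self.dt_complete and not self.known_defects

alpha = ReleasePhase("Alpha")
alpha.finish_dt(open_defects=["DH-101"])   # DT found one defect
alpha.resolve("DH-101")                    # developers clean it up
# alpha.ready_for_ct() is now True: the build goes to the CT teams
# while development continues into Beta.
```

The gate makes the informal/formal split explicit: defects found before the gate are part of normal development, not formal V&V findings.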
While the CTs are being run, the developers are free to continue on to the
next development phase. As defects arise from the CTs, they are addressed as
a part of the Beta development phase (in this example). The benefit here is
that we make multiple informal test passes through the code-base during its
development lifecycle so that by the time we reach actual Release Candidate
status and perform formal V&V under our plan, all of the groups involved in
the development of the platform clearly understand the expected features and
the expected test-plan execution suite. The result is a much higher
confidence level in the expected outcomes.
All of these steps led to fewer Release Candidate builds. Our development
schedules improved dramatically, and the quality of our software improved
four-fold over the previous process with regard to the number and severity of
the defects found.
As in building a house, the foundational elements are the most important.
As the work progresses, checks and balances between the architect, the builder,
various subcontractors, and building inspectors ensure a quality outcome.
Software development is no different. If you understand the elements of
each phase of design and review, and plan for them accordingly, your
developers will always know that they have successfully completed the product
requirements with a high degree of design quality and assurance, and will, in
fact, know when done is actually done.
Conclusion
Implementing a regulated product development process was, to say the least,
challenging. Early assumptions and confidence in our own PLC and our technical
and manufacturing capabilities gave way to uncertainty and frustration. Andrew
S. Grove, the former CEO and Chairman of Intel Corporation, wrote a book
on the subject of organizational tension and reconstruction entitled Only
the Paranoid Survive: How to Exploit the Crisis Points That Challenge Every
Company, in which he observes that “inflection points offer
promises as well as threats (to your business survival). It is at such times of
fundamental change that the cliché adapt or die takes on its true meaning.”
The Digital Health Group had a deep commitment to product quality from
its inception. During our first product development cycle we pushed ahead,
resetting our process several times, focusing on doing the right things right, and
in the end, producing a high-quality, safe, and effective product that adhered to
the regulatory requirements set forth in the CFR and by other regulatory bodies.
It became clear to all of us that we had no choice but to adapt our development
processes if we were to be an effective medical device manufacturer. Designing
processes to meet the requirements of the healthcare industry presented a set
of new challenges; we had to learn to operate within a much more constrained
and formalized development process, while still developing innovative products
and working efficiently.
Before beginning our next product development cycle, we made a number
of significant decisions and structural changes to our PLC, our phased
development methodologies, our SOPs, SOIs, and development artifacts.
We completely restructured the way we capture, measure, and implement
requirements and specifications. We also integrated other structural
development process improvements, such as Security Design and Development,
Risk Management Design Assurance, and Labeling and Documentation Workflows,
into the overall process, instead of making them parallel breakpoint activities.
We continue to practice user-centered design as we develop next generation
products for home health. Our design and development processes have
converged during our first development cycles to fully meet regulatory
requirements. We are now adding increased focus on predictability and
efficiency in our design and development activities. Our constant goal is the
delivery of products that help to improve the health and quality of life of the
people who use them, while lowering cost and improving access to healthcare.
References
[1] The Intel Corporate Product Life Cycle (PLC), Intel Corporate Program Management Office, Intel Corporation.
[2] Debbie Singh. “How can chronic disease management programmes operate across care settings and providers?” World Health Organization, WHO European Ministerial Conference on Health Systems, Tallinn, Estonia, 2008. Available at www.euro.who.int
[3] 21 CFR Part 820, Title 21, Food and Drugs, Chapter I, Food and Drug Administration, Department of Health and Human Services, Subchapter H, Medical Devices, Part 820, Quality System Regulation. Search for CFR Part 820 at www.accessdata.fda.gov
Authors’ Biographies
Barbara Fullmer is Director of Product Development Controls at Intel
and has been with the company for eight years. She has worked in the
semiconductor industry for over 21 years, spending 16 years in
various planning functions, from wafer fab planning to multi-national
capacity commits for fab, wafer sort, assembly, and test. Prior to joining
the Digital Health Group in 2005, Barbara worked in the Technology and
Manufacturing Group as the Strategic Capacity Planning Demand Manager.
She has received various awards at Intel for process improvement, including
re-engineering aspects of the long-range planning process. Before joining
Intel, Barbara worked at Affymetrix, a bio-medical device manufacturer, as the
Manufacturing Planning Manager. Her e-mail is barbara.h.fullmer at intel.com.
Alan Boucher is the Director for Software Architecture and Engineering
in the Digital Health Group. He is a 21-year veteran of Intel, where he has
focused on the healthcare market since 1999. Throughout his career, Alan has
led numerous software engineering and architecture teams for various Intel
platform groups and has managed field systems and network engineering teams
worldwide for more than half his career. He received the Intel Achievement
Award (IAA) twice (2002 and 2005) for leading the Technical Solution
Blueprint team and for technical contributions to the Mobilized Software
Initiative. His e-mail is alan.boucher at intel.com.
Copyright
Copyright © 2009 Intel Corporation. All rights reserved.
Intel, the Intel logo, and Intel Atom are trademarks of Intel Corporation in the U.S. and other
countries.
*Other names and brands may be claimed as the property of others.