The Role of Physicality in the Design Process

Transcription

Steve Gill and Alan Dix
Is there more to pressing a physical switch to turn on the light than we're
consciously aware of? Is modern interface technology in cars really designed
to improve the safety of driving, or does it rather distract the driver? Why
does food taste different depending on the ambient light? Interacting with
physical objects is in many ways much richer than interacting with digital
interfaces such as touchscreens. Humanity has grown up interacting with the
physical world and we bring this experience with us when interacting with
the modern world and its artifacts. We're often not conscious of these
differences and, hence, software engineers tend to overlook them and fail to
make use of the advantages of the physical. Steve Gill and Alan Dix look into
how human behavior is influenced by the physical world and point out how
design could be improved if designers took physicality's importance more seriously.
Steve Gill is a product designer with 18 years' experience in industry and
academia. He is Professor of Interactive Product Design, Director of Research of
the Cardiff School of Art & Design and Director of the Programme for Advanced
Interactive Prototype Research (PAIPR). Gill has a range of research interests
related to product design, including the role of physicality, the rapid design of
information appliances and the implications for product design of testing early
prototypes within their context of use.
Alan Dix is a Professor in the School of Computer Science, The University of
Birmingham in the UK and a researcher at Talis. He was a mathematician by
training, and mathematics is still his first love, but he has worked in
Human-Computer Interaction since 1984, and is author of a key textbook in the area.
He has worked in several universities, agricultural engineering research,
local government and hi-tech start-ups. His interests are eclectic: formalisation
and design, physicality and digitality, the economics of information, structure
and creativity, and the modelling of dreams.

Contents: 1. Interacting in the Physical World | 2. Case Study – Physicality
in the Saab 9-3 | 3. Designing for Physicality | 4. Physicality in Design
In this chapter we discuss physicality's influence in design, an influence
that permeates the process at many levels. Before embarking on that, however,
it might be sensible to define physicality. In other writings we have
employed a broad definition that encompasses all non-visual senses. However,
in this context and for the sake of clarity and focus, we define physicality
as the mechanical forces at work on designed artefacts in the physical
world, be they through our own physical interactions or through natural
forces such as gravity. Evolutionarily speaking we are still stone-age humans,
physically and mentally evolved to live in a physical world of animals,
plants, water, air and earth. We bring the social and cognitive baggage of
that history with us when we interact with the modern world and the
artificial artefacts within it, some of which appear to break the rules of the
physical world understood by our stone-age psyche.
1. Interacting in the Physical World
Figure 1: Taste testing booths in UWIC's Food Industry Centre
1.1 How we experience the world
It was once thought that we experienced the world by assembling, at the
last stage, data coming into our heads from our various senses through separate channels connected to ›sensors‹ (e.g. eyes and ears). It was thought
that the eyes gathered light projected onto the back of the retina like a movie camera projecting onto film and that these pictures were sent to the
brain which saw them, more or less, like a projected moving image. Meanwhile, it was thought, our eardrums were vibrated by sound waves and
the resulting patterns were processed as sounds that were then added to
the moving pictures in an independent stream. While there are elements
of truth to this (our eyes really do gather light on the retina, our ear drums
really do gather sound when they are vibrated) it has long been proven that
what then happens with the information is much more complex. We simply
don't have the processing power to make sense of the amount of information
bombarding us. Instead, we understand the world by taking in information and
processing it very efficiently to produce a compressed but information-rich
picture, built from the stream of interrelated information sets coming in,
which we experience as a whole.
Evidence for this mixed information model lies in a number of scientific
studies, such as those carried out by McGurk and MacDonald (McGurk
and MacDonald 1976), which proved that in certain instances we are unable to
tell the difference between what we see and what we hear. The so-called
McGurk Effect was found when participants were shown two film clips of a
woman talking to camera. In one clip she mouthed the sound ›ga‹ over which
the sound ›ba‹ had been dubbed. In the other she mouthed the sound ›ba‹
but the sound ›ga‹ was dubbed. Most participants reported hearing the
sound being mouthed rather than the sound being played. Without visual
input participants were able to correctly identify the sound.
But it's not only sound and sight that we mix. Researchers in the food
industry use coloured lights in taste-based food trials to control for the
known effect of colour on taste.
Meanwhile Norman has demonstrated that if we find the products we
interact with more beautiful, if we ›love‹ them, then we also think them
easier to use, even if, objectively speaking, they aren’t (Norman 2005).
We don’t necessarily have to look to science however. We know from
our own experience that our senses are influenced in all kinds of ways;
triggered by social pressures, past experience and deep animal instinct.
We know, for example, that the more a wine costs the better we think it
tastes, that the more a product weighs the higher its perceived quality, and
that a bag by Gucci will look better than the same bag designed by an unknown
designer. It doesn't therefore take a big imaginative leap to believe that
physicality deeply affects our interactions with the world, and, as we will
see, this is indeed the case.

Figure 2: Equinox prototypes (from left to right): low fidelity (Sketch), high
fidelity (IE Unit), real product (Equinox) and touchscreen (Software)
In 2008 and 2009 we published the findings of some research on
physicality's effects on interactions with hand-held objects (Gill et al. 2008;
Hare et al. 2009). We set out to discover:
1. whether a tangible prototype, even one with limitations (e.g. a discrete
screen displayed on a PC monitor), was more similar to a final product than
the monitor-based Flash prototypes often used in industry, and
2. the level of fidelity required to obtain an acceptable degree of tangible
accuracy.
A real phone was ›reverse engineered‹ to mimic a high fidelity tangible
prototype. A keyboard chip embedded in the prototype translated button
presses into ASCII codes, triggering a Flash mock-up of the phone's interface
on a PC monitor. The same Flash interface was used as the basis
of a full on-screen prototype which was interacted with via a touchscreen.
We compared the performance of the real phone with the high fidelity,
keyboard-chip-equipped prototype and the software prototype (Figure 2).
Tasks included common functions (ranging from simple to complex), unusual
functions (such as the Equinox's SMS button), and functions that involved
more than straightforward transitions between the product's states.
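To make the set-up concrete, the sketch below is our own illustration, not the
project's actual software; the key codes, state names and transitions are
hypothetical. It shows how ASCII characters emitted by a keyboard chip can
drive the state of an on-screen interface mock-up with a few lines of event
handling:

# Illustrative sketch only: ASCII characters emitted by a keyboard chip
# embedded in a physical prototype drive the state of an on-screen mock-up.
# Key codes, state names and transitions are hypothetical.
KEY_MAP = {"a": "up", "b": "down", "c": "select", "d": "back"}

TRANSITIONS = {
    ("home", "down"): "menu",
    ("menu", "down"): "messages",
    ("menu", "select"): "contacts",
    ("contacts", "back"): "menu",
    ("messages", "back"): "menu",
    ("menu", "back"): "home",
}

def handle_keypress(state, ascii_char):
    """Translate a physical button press (arriving as an ASCII character)
    into a transition of the on-screen interface state."""
    action = KEY_MAP.get(ascii_char)
    if action is None:
        return state                           # unmapped key: no change
    return TRANSITIONS.get((state, action), state)

if __name__ == "__main__":
    state = "home"
    for ch in ["b", "c", "d"]:                 # simulated presses from the chip
        state = handle_keypress(state, ch)
        print(ch, "->", state)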
We found that the physical prototype performed in a more similar fashion
to the real phone than the software alone (for good and for bad), despite
the fact that both prototypes drove the same Flash interface. This
was especially visible in the time taken to complete each task, but at no
point did the software outperform the tangible prototype (and it was sometimes
much poorer). In Norman's theorizing (Norman 1988), the system
image created by the physical prototype was a better fit to the user's mental
model of a phone than a purely software simulation. This result
is all the more significant because this phone had an all-push-button
interface mounted on the top surface. This allowed the software prototype to
compete on favourable terms, since the picture on the touchscreen showed
all the controls on the same face as the display, making it ideal for a
screen-based prototype.
Having got this far we wanted to know how much of a role fidelity played.
Specifically we asked: could a very low fidelity physical prototype and interface
give useful results in the same or less time than an entirely screen-based
prototype? We constructed an ultra-low fidelity physical prototype married to a
low fidelity GUI (see the left-hand prototype in Figure 2). The low fidelity GUI
was created using sketch work produced in Flash via the mouse. The new
prototype used the same keyboard-chip method as before. Further tests were
carried out in line with the first set but with fewer conditions. We found
that the low fidelity prototype performed similarly to the real phone, thus
demonstrating the importance of the tangible prototype: more accurate
results were produced by a quicker, ›dirtier‹ tangible prototype built in
80% less time than the high fidelity touchscreen prototype.
Figure 3: Physical Feedback Loops. The device (physical aspects: knobs, dials, buttons, location, orientation; virtual aspects: screens, lights, buzzers) is coupled to the logical system through physical-logical mappings. Labelled elements: (a) physical manipulation (turn knob, press button); (b) perceive state; (c) close feedback (resistance, sounds); (d) electronic feedback (see message on screen); (e) physical feedback (notice light on, kettle boils); (i) sensed inputs; (ii) physical effects (motors, effectors); (iii) virtual effects (show message, turn light on); (iv) physical effects (controlling external things); feedback loops A-D.
We tried to lower the fidelity of the prototype further, reducing the physical
prototype to an oblong block of the size and rough shape of the phone,
with a picture of the phone printed on paper on the top surface through
which the physical buttons could still be activated. The effectiveness was
significantly reduced. We concluded that the prototype was increasing the error
rate in two major ways. In the context of this chapter the most important
was that the user could no longer easily feel the buttons' edges. Feedback is
a critical aspect of interaction, both with digital entities and with the physical
world, and plays a major role in the theory and practice of usability:
effective feedback was one of Shneiderman's principles of direct manipulation
(Shneiderman 1983) and one of Nielsen's heuristics (Nielsen and Mack
1994). What precisely is happening during this feedback process though?
In 2009 we looked at this matter in some detail (Dix et al. 2009):
Once we think of the physical device and the digital effects separately, we
can look at different ways in which users get feedback from their actions.
Consider a mouse button: you feel the button go down, but also see an icon
highlight on screen. Figure 3 shows some of these feedback loops. Unless
the user is implanted with a brain-reading device, all interactions with the
machine start with some physical action (a). This could include making
sounds, but here we will focus on bodily actions such as turning a knob,
pressing a button, dragging a mouse. In many cases this physical action
will have an effect on the device: the mouse button goes down, or the knob
rotates and this gives rise to the most direct physical feedback loop (A) where you feel the movement (c) or see the effect on the physical device (b).
In order for there to be any digital effect on the underlying logical system the changes effected on the device through the user’s physical actions must be sensed (i). For example, a key press causes an electrical connection detected by the keyboard controller. This may give rise to a very
immediate feedback associated with the device; for example, a simulated
key click or an indicator light on an on/off switch (ii). In some cases this
immediate loop (B) may be indistinguishable from actual physical feedback from the device (e.g. force feedback as in the BMW iDrive); in other
cases, such as the on/off indicator light, it is clearly not a physical effect,
but still proximity in space and immediacy of effect may make it feel like
part of the device.
Where the user is not aware of the difference between the feedback intrinsic to the physical device and simulated feedback, we may regard this
aspect of loop (B) as part of ›the device‹ and indistinguishable from (A).
However, one has to be careful that this really is both instantaneous and
reliable. For example, one of the authors often mistypes on his multi-tap
mobile phone hitting four instead of three taps for letters such as ›c‹ or
›i‹. After some experimentation it became obvious this was because there
was a short delay (a fraction of a second) between pressing a key and the
simulated key click. Rather like the visual/aural interference in the McGurk
Effect, the delayed aural feedback was clearly more salient than the felt
physical feedback and so interfered with the typing: he was effectively counting
clicks rather than presses. Switching the phone to silent significantly reduced typing errors!
The sensed input (i) will also cause internal effects on the logical system, changing internal state of logical objects; for a GUI interface this
may be changed text, for an MP3 player a new track or increased volume.
This change to the logical state then often causes a virtual effect on a visual or audible display; for example an LCD showing the track number (iii).
When the user perceives these changes (d) we get a semantic feedback loop
(C). In direct manipulation systems the aim is to make this loop so rapid
that it feels just like a physical action on the virtual objects. Note that the
Equinox experiments described earlier showed that it is not sufficient to
simply emulate loop (C) during early product testing; adding loop (A)
through the physical hand-held device significantly increased the fidelity
of the user testing.
Finally, some systems affect the physical environment in more radical
ways than changing screen content. For example, a washing machine
starts to fill with water, or a light goes on. In addition there may be unintended physical feedback, for example, a disk starting up. These physical
effects (iv) may then be perceived by the user (e) giving additional semantic feedback and so setting up a fourth feedback loop (D).
The physical effects themselves often carry more information than a casual consideration will reveal: One of the simplest examples of a physical
device is a simple on/off light switch. In this case the switch has exactly
two states (up and down) and pressing the switch changes the state. Actually even this is not that simple, as the kind of press you give the switch
depends on whether it is up and you want to press it down or down and you
want to press it up. For most switches you will not even be aware of this
difference because it is obvious which way to press the switch. It is obvious
because the current state of the switch is immediately visible. The logical
system being controlled by the device also has states and Figure 4 shows
these in the case of the light bulb: simply on or off.
Of course in the case of a simple light switch, the states of the physical
device are in a one-to-one mapping with those of the logical system being
controlled. In previous work we have used the term exposed state (Ghazali
and Dix 2003; Ghazali and Dix 2005) to refer to the way that the perceivable
state of the device becomes a surrogate for the logical state and makes it
also immediately perceivable. In the case of turning on an incandescent
light bulb in the same room as the light switch, this is a moot point as the
semantic feedback itself is immediate and direct. However, in some cases
there may be a delay in the semantic response (e.g. neon lights starting
up, or a kettle when first turned on) or it may be hidden (e.g. external security
lighting); in these cases the feedback inherent in the device is not just the
most obvious, but may be the only immediate feedback.

Figure 4: Switch states (up/down) and light states (off/on) as the user pushes the switch up and down or presses a key
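To make the idea of exposed state concrete, here is a minimal sketch (ours,
not from the chapter; the class and function names are hypothetical) in which
the visible position of a rocker switch is, by a one-to-one mapping, a
surrogate for the logical state of the light it controls:

# Illustrative sketch: "exposed state" means the perceivable state of the
# physical device is a one-to-one surrogate for the logical state it controls.
# Class and function names are hypothetical.
from dataclasses import dataclass

@dataclass
class RockerSwitch:
    """Physical device: its position (up/down) is always visible."""
    up: bool = False

    def push(self):
        # the kind of press needed depends on the currently visible position
        self.up = not self.up

@dataclass
class Light:
    """Logical system being controlled: simply on or off."""
    on: bool = False

def sense_and_update(switch, light):
    """Physical-logical mapping: one-to-one, so the visible switch position
    stands in for the (possibly less visible) logical state."""
    light.on = switch.up

if __name__ == "__main__":
    switch, light = RockerSwitch(), Light()
    for _ in range(3):
        switch.push()
        sense_and_update(switch, light)
        print(f"switch up={switch.up} -> light on={light.on}")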
1.2 Information in the world

That the physical world is a rich carrier of information can be seen in
everyday activities. The following case study was observed in a busy coffee bar: Two staff work together to serve coffee. One takes the order and
the money; the other makes and serves the coffee. There are a number of
coffee-based variables: The size of the coffee (there are 4 possible sizes);
4 basic types of coffee order (espresso – latte); 1 or 2 coffee shots; no milk
or one of 3 types of milk and 12 types of optional flavoured syrups. Then
there is the sequence in which the order is to be fulfilled: Orders are taken
quicker than coffee can be made and so there is a backlog. The staff devised
the following solution: The person taking the order selects the appropriate cup (large, medium, small or espresso). They turn it upside down and
write the order on the base. This creates a physical association between the
cup and the order that simultaneously takes care of the size of the coffee
and the exact type of coffee to go in it. This leaves the order in which each
coffee is to be made. This is dealt with by queuing the cups in the same
sequence as the orders are to be fulfilled, with the most immediate order
closest to the staff member making the coffee. The end result is that
a great deal of information is efficiently dealt with through physical associations and interactions.
Coffee orders are one thing, but how can designers actually exploit this
type of information in the world? The Room Wizard by Steelcase exploits
both the digital and the physical. Rooms can be booked online from a
user’s desktop computer. Each room has its own, networked Room Wizard
which communicates with the room booking system. Each has a light to
show whether it is booked so that users can see from a distance whether a
room is available. If a series of rooms are located next to each other it becomes easy to see if one is unoccupied. People can then book an un-booked
room by using the appliance physically located in the room. Disputes over
who has a room booked are also easier to resolve because the appliance is
located on the spot of the dispute.
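A rough sketch of the idea (ours, not Steelcase's software; the names and
behaviour are hypothetical) shows how a door-mounted appliance can expose the
same booking state both to the network and as a physically visible light:

# Illustrative sketch (not Steelcase's software): a door-mounted booking
# appliance whose light physically exposes the digital booking state.
# All names and behaviour are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RoomAppliance:
    room: str
    booked_hours: set = field(default_factory=set)   # hour slots, 0-23

    def light(self, hour):
        """Indicator visible from a distance along the corridor."""
        return "red (booked)" if hour in self.booked_hours else "green (free)"

    def book(self, hour, source):
        """Book from a desktop PC or from the unit on the room itself."""
        if hour in self.booked_hours:
            return False
        self.booked_hours.add(hour)
        print(f"{self.room} booked for {hour}:00 via {source}")
        return True

if __name__ == "__main__":
    unit = RoomAppliance("Meeting Room 3")
    unit.book(10, "desktop browser")       # booked online
    print(unit.light(10), "/", unit.light(11))
    unit.book(11, "door unit")             # walk-up booking at the room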
Figure 5: Steelcase Roombooker (left); early Hermes prototype (right)

The Hermes office door screens (Cheverst et al. 2007) were inspired partly
by early versions of room displays such as the Room Wizard. However, instead of just having displays outside a few public rooms, Hermes was designed to see what would happen if large numbers of individual personal
offices had their own displays. The early Hermes units were simply PDAs
modified to fit on walls, and the software was based on the paper Post-it notes often left by the office owners when they left (e.g. »be back in 5
mins«) or those visiting the office when the owner was out. Crucially here
the existing physical system was used to drive the development, but also
the physical prototype system was not regarded as the final solution, but
as a form of ›technology probe‹ (Hutchinson et al. 2003). That is the physical
prototype served as part of the design process to help users make sense of
new technology and to elicit understanding and requirements for further
iterations of the design.
The potential for physicality could probably be exploited for more serious applications, for example in reducing the error rate in life critical applications where the various control inputs into a product directly relate to
one another and to the output. User error with these types of machines can
mean serious injury and even death.
In 2008 Thimbleby (Thimbleby 2008) noted that the annual death toll
from medical errors was roughly similar to the combined annual toll of
car accidents, breast cancer and AIDS. He went on to describe a
case study of an infusion pump that had caused at least one patient death.
Thimbleby concentrates mostly on programming factors, but we would
argue (and indeed Thimbleby suggests) that the pump’s physical design is
an equally important factor. The accident in question involved a chemotherapy drug: diluted fluorouracil. The bag’s label described the contents,
the size of the dose and so on. In this case the 5,250 mg of fluorouracil was
to be diluted to 45.57 mg per mL and delivered over a four day period. It was
delivered 24 times too fast and the patient died.
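A rough reconstruction of the arithmetic from the figures quoted above (our
own calculation, not taken from Thimbleby's analysis) shows just how large the
discrepancy was:

# Rough reconstruction (our arithmetic) of the figures quoted above:
# 5,250 mg of fluorouracil at 45.57 mg/mL, intended to run over four days.
total_mg = 5250.0
concentration_mg_per_ml = 45.57
intended_hours = 4 * 24

volume_ml = total_mg / concentration_mg_per_ml      # about 115.2 mL in the bag
intended_rate = volume_ml / intended_hours          # about 1.2 mL per hour
actual_rate = intended_rate * 24                    # delivered 24 times too fast

print(f"volume = {volume_ml:.1f} mL")
print(f"intended = {intended_rate:.1f} mL/h, actual = {actual_rate:.1f} mL/h")
print(f"the whole four-day dose ran in about {volume_ml / actual_rate:.0f} hours")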
A catalogue of issues leading to the patient's death was exposed, from
the failure to employ guidelines on how quantities should be notated to
the problems with employing calculators to work out dosages. One should
also bear in mind that the context here is a hospital ward where the nurse
may well be interrupted and where there may also be a lot of background
activity. Several issues were also identified with the design of the pump
itself (which would have had to pass a series of very strict ›due diligence‹ exercises to be allowed into production), amongst them the ease with which a
button that changed doses by single units could be confused with one that
changed the dose by tens of units. Thimbleby also notes that computer
based medical devices such as this infusion pump are frequently rebooted
when problems arise, at which point they lose previously stored data (another source of potential error).
Following his analysis of the errors in the case Thimbleby designed an
iPhone app that might avoid them in the future. There is no reason however why the device itself couldn’t be designed to reduce the potential for
error. Could storing information via a device’s physicality offer potential
design solutions in cases such as this? Consider the following:
If dials and sliders had been used instead of buttons then the GUI could
have been removed altogether. Printed graphics could have been used
rather than an LCD panel. Since they have a higher resolution than all but
the most expensive screens this would mean that the numbers, symbols
and quantities would be more clearly displayed which might have reduced
confusion. The controls could be placed in the order that the nurse would
need to input them (e.g. from left to right or from top to bottom) because
there would be only one control device per task (slider, dial, switch etc.). A
slider that gave an approximate size of the patient (high to low) would allow the machine to warn the operator of gross dosage errors (such as those
that occur when a mistake is made over the position of a decimal point)
and the size or location of the controls could denote their relative importance. A door that covered all the physical controls but allowed the nurse
to see the dosage being administered could double as the power switch,
simplifying the display and preventing accidental changes while the machine
was in use. The position of dials (high to low) would give a simple visual
indication whether a high dose was being inputted without the nurse
needing to think about the numbers and their relationship to one another.
Lastly a machine designed in this way would calculate the dosage, removing a risk, and the fact that its controls were physical would mean that it
would have the information to re-programme itself to its last setting in the
event of a power cut or a reboot.
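As one possible reading of the gross-error warning suggested above, the sketch
below is our own illustration; the size bands, thresholds and names are
entirely hypothetical and not a real medical specification. It simply checks a
dialled-in rate against a coarse patient-size setting:

# Illustrative sketch only: a coarse, physically set patient-size band bounds
# the plausible infusion rate before the pump will run. The bands and
# thresholds below are invented for illustration, not medical guidance.
SIZE_BANDS = {                 # slider position -> plausible rate range, mL/h
    "small":  (0.5, 5.0),
    "medium": (0.5, 8.0),
    "large":  (0.5, 12.0),
}

def check_rate(rate_ml_per_h, size_band):
    low, high = SIZE_BANDS[size_band]
    if not (low <= rate_ml_per_h <= high):
        return f"WARNING: {rate_ml_per_h} mL/h is implausible for a {size_band} patient"
    return f"rate {rate_ml_per_h} mL/h accepted"

if __name__ == "__main__":
    print(check_rate(1.2, "medium"))     # the intended rate from the case above
    print(check_rate(28.8, "medium"))    # the erroneous rate is flagged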
Unfortunately we see a lot of design in the world today that doesn't
make good use of physicality, even when it would seem the most obvious
approach.
2 Case Study – Physicality in the Saab 9-3
Car driving is in many ways a physical activity. Although we respond to
what we see on the other side of the windscreen, the driving controls
themselves are actually experienced through touch (because our visual senses
should be concentrated on what is on the other side of the windscreen!).
In fact driving is sometimes described as a cyborg activity (Dix 2002)
because to a surprising extent we treat the car as a literal physical
extension of ourselves. Car designers have understood this for decades.
The basic car controls are standardised (e.g. clutch on the left, brake
in the middle, accelerator on the right) and we use them without looking
at them. Later additions such as indicator controls were eventually designed
to be primarily experienced through physicality. They are mounted
concentrically with the steering wheel so that we can locate them by moving
our fingers into position using the physical reference of the wheel as
a guide. Steadily, however, cars have ›evolved‹ to have more and more controls,
and not all of these are well designed to be operated by the driver while
their eyes are focused on the road. The table below is an analysis of
the controls in a 2006 Saab 9-3 (which happens to be the car driven by one
of the authors).
Saab 9-3 driver operated car controls

Figure 6: Saab 9-3 controls

Mostly or exclusively physicality oriented: Steering wheel; Horn; Foot pedals (3); Gear stick (8); Handbrake; Steering wheel-mounted phone and stereo controls (8); Window controls (5); Boot catch; Door catch; Air vent direction and opening (6); Indicators and wipers (11). TOTAL: 46 (43 operated while driving)

Mostly or exclusively visually oriented: Wing mirror adjustments (6); Door lock (3); Light controls (7); Hazard lights; Ventilation controls (6); Computer and miscellaneous buttons (9); Computer controls via 3 dials (36); Dashboard-mounted stereo controls (23). TOTAL: 91

Grand total of controls available to the driver while driving: 134
Percentage that can be operated without (much) visual input: 32%
There is a distinct pattern visible in the controls that fail to exploit physicality as a primary interaction method. Firstly, around 75% of them are
mediated by computer and secondly, they are relatively recent arrivals on
the dashboard. Of the remainder, door locks, hazard lights and wing mirror adjustments would not generally be used while driving. This makes the
computer-mediated controls even more conspicuous for their lack of physicality bias. However there are other car controls found in many vehicles
but not developed by the designers of the car. The table below includes
two of the more common peripherals: a Griffin iTrip that allows the user
to listen to music from their iPod or iPhone through the car radio and a
TomTom satellite navigation device. Both devices are specifically designed
for in-car use.
Saab 9-3 driver operated car controls, continued (expanded list including some common computer peripherals)

Mostly or exclusively physicality oriented: (none). TOTAL: 0

Mostly or exclusively visually oriented: TomTom – touchscreen control options (≈ 200) and physical controls (1); iTrip (6). TOTAL ≈ 207

Approximate grand total of controls available to the driver while driving ≈ 341
Percentage that can be operated without (much) visual input ≈ 14%
The addition of these extra computer-enabled devices accentuates the
issue significantly, and it is difficult to miss the fact that previously
well-understood lessons on the importance of physicality as an interaction
method for car controls have been forgotten. This happens with a lot of
products, particularly products with computers in them. However,
computer-embedded products are not the only designs that have a misleading
or confusing relationship with physicality.

3 Designing for Physicality

Sometimes designers with a good grasp of force, structure and materials
go to some lengths to create designs that appear to defy physicality. The
Bauhaus designer Ludwig Mies van der Rohe is a good example of this.
His 1927 chair, the MR10, uses the strength of steel to create a cantilevered
structure that employs a minimum of material and creates an impression
of a ›floating‹ structure.

Figure 7: Bauhaus designs appeared to defy physicality (MR10, left, and Barcelona Pavilion, right)
He used a similar trick in his architecture where the apparent physical
structure presented to the viewer (in this case a heavy concrete roof that
almost appears to float) is deliberately misleading.
A comparison between two recent bridge designs reveals
two contrasting approaches to the issues surrounding physicality in design. The Millennium Bridge, York offers an apparent physical, aesthetic
and structural confluence while actually separating structure from the
other two. The bridge is actually held up by its curvature and the large
girders that make up its main span. The arch, which at first appears to be a
part of the structure, not only contributes little to the strength, it actually
places a twisting moment on the rest of the structure!
The Millennium Bridge, Gateshead, on the other hand, offers a true
confluence between the three elements. Here the physicality and aesthetics
of the structure are neatly combined, the balance of forces is as presented,
and this arguably makes it a more elegant solution.

Figure 8: The Millennium Bridge, York (left) ignores physicality, while the Millennium Bridge, Gateshead embodies the confluence of aesthetics and physicality

Figure 9: Successive generations of iPod Shuffles, clockwise from top right: with buttons, without, with

Another example of physicality's relative importance not being appreciated
can be seen at an altogether different scale: Figure 9 shows three
successive generations of iPod Shuffles. At some stage Apple designed a
button-less Shuffle in keeping with the brand’s clean and minimal aesthetic. We have been unable to uncover any information as to why they
revised their approach (Apple’s design processes being a very well kept
secret) but a little consideration regarding the Shuffle’s strengths gives a
guide to some possible reasons: The Shuffle is small enough to be kept in
the pocket or clipped to clothing. It has no screen and so there’s no need to
see it to operate it (e.g. when it is stored in a pocket) - as long as it has buttons. It can also be operated through clothing (e.g. if it were being used
below waterproofs while walking or cycling on a rainy day) – as long as it
has buttons.
In other words the Shuffle’s physicality may well have been critical to
its previous success. The switch back to physical controls certainly suggests this is so. This brings us back to the problems with physicality and
computers. There are reasons that products with computers in them are
particularly poorly designed from a physicality perspective. Complexity is
certainly one of these because it typically takes a lot of people from a number of disciplines to design a computer embedded product. Another major
factor is the fact that computers appear to break many of the laws of the
natural (physical) world where many of our gut level understandings and
›instincts‹ are rooted. Below are some rules of thumb that generally apply
to physical objects in the natural world:
Directness of effort – Small effort produces small effects, large effort produces large effects. If you push a pebble a little, it moves a little; if you push
it a lot, it moves a lot – simple Newtonian dynamics.
Locality of effect – The effects of actions occur where and when you physically initiate the action. If you push something and it then moves later,
you are surprised; only a magician would try to move something without touching it.
Visibility of state – Physical objects have complex shape and texture, but
this is largely static. The dynamic aspects of state are very simple: location,
orientation, velocity and rate of angular rotation.
All of these rules are systematically broken by human technology, and
in particular digital technology. Consider a mobile phone:
No directness of effort – Dial one digit wrong and you may ring someone
in a different country, not just next door.
No locality of effect – The whole purpose of a phone is to ring people
up – spatial nonlocality; the alarm you set at night and then rings in the
morning – temporal nonlocality; and text messages break both spatial and
temporal locality!
No visibility of state – The phone is full of hidden state, from the address
book in the phone itself to the whole internet (which is not ›in‹ the phone,
but can appear on the screen and therefore appears to be part of it).
As noted above, it is not just computer-embedded products that break
these rules; the power of even the most basic technology is often in the way
that it gives us supernatural power. For example, a simple saw means that
a small amount of effort allows one to cut through a large piece of wood
that would be impossible to break by hand (breaking directness of effort),
and a bow and arrow allows action at a distance. However the very lack of
visibility in electronic interactions tends to accentuate the break with our
natural, physicality based senses.
Figure 10: The Nokia kinetic device concept
With careful thought and understanding some of the missing physical
connectedness can be designed into computer embedded products. Designers started to explore how this might be achieved some time ago. Durrell
Bishop's Marble Answering Machine concept (as described by Quintanilha
(Quintanilha 2007)) is a brilliant example of a way that the physical and
digital can interact. The machine contains marbles that it releases to represent
messages. Users know how many messages await them by looking
at the retrieving tray and counting the marbles. Messages can be played
back in any order and each marble is also digitally tied to the message it
represents: placing a marble in an indentation on the machine triggers
playback of that specific message, and to return a call the
user drops the marble into another indentation. Undeleted messages can
be kept outside the machine in a separate receptacle. In another example,
Kyffin and Feijs (Kyffin and Feijs 2003) describe a physicality-based digital
camera concept by Joep Frens where »all interactions are natural«. To copy
a picture to the memory card, for example, the user physically moves the
display of the picture towards the physical memory card. To zoom, they
physically move the lens backwards and forwards.
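Returning to the marble answering machine, a minimal sketch (ours, not
Bishop's implementation; all names are hypothetical) captures the binding
between physical tokens and digital messages:

# Illustrative sketch (not Bishop's implementation): each marble is a physical
# token digitally tied to one recorded message; the indentation it is placed
# in selects the action. Names are hypothetical.
PLAY_SLOT, CALLBACK_SLOT = "play", "callback"

class MarbleAnsweringMachine:
    def __init__(self):
        self._next_id = 0
        self.tray = {}                    # marble id -> (caller, recording)

    def record_message(self, caller, recording):
        """A new message releases a marble into the tray."""
        marble = self._next_id
        self._next_id += 1
        self.tray[marble] = (caller, recording)
        return marble

    def place_marble(self, marble, slot):
        caller, recording = self.tray[marble]
        if slot == PLAY_SLOT:
            return f"playing message from {caller}: {recording}"
        if slot == CALLBACK_SLOT:
            return f"dialling {caller}"
        return "unknown indentation"

if __name__ == "__main__":
    machine = MarbleAnsweringMachine()
    m = machine.record_message("Alice", "running late, back at five")
    print(len(machine.tray), "marble(s) waiting in the tray")
    print(machine.place_marble(m, PLAY_SLOT))
    print(machine.place_marble(m, CALLBACK_SLOT))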
A more recent example is the Nokia kinetic device concept (CNET 2011)
which has a flexible OLED display. The user can interact with the digital content by bending, bowing and twisting the device to browse, zoom or scroll.
An example of physicality-digital interaction that has actually made it to
market is the Nintendo Wii: When the Wii was launched in 2006 it brought
with it a new interaction paradigm. Like the iPhone, it allowed us to bring
learned gestures and associations from our day to day physical world into
our physical-digital interactions (see below). In doing so it opened up new
digital gaming interaction possibilities by removing the barriers between
the physical controls and their digital responses. The Wii brings with it a
lot of interesting case material on how we perceive physical space. For example two players stand side by side playing Wii Sports Tennis. One »serves« to the other »diagonally« across the court. If we assume that the player
»receiving« is right-handed and on the left-hand side of the court (from
their perspective), they will be forced to receive the ball »backhanded«. No
physical movement has taken place, but the digital world has forced a perceptual change that has required a physical response.
An equally revolutionary and highly successful product is the Apple
iPhone, which has a conflicted relationship with physicality in that it both hides and emulates it. Almost all interactions with the iPhone occur through its
multi touch screen which has almost no physical feedback at all. On the
other hand it makes it possible to input using physical gestures that map
well onto our natural sense of how the physical world works. Thus we are
able to zoom in on a picture by placing our thumb and forefinger on the
screen and expanding the space between them in a signal we might physically use in say a game of Charades to indicate expansion (in the Charades
case of a word). The iPhone’s ability to accept the inputs of more than one
touch at a time, and to translate the physical movement between touches into meaningful digital inputs, transforms our ability to interact with
the product in a physical sense, despite the lack of many of the tactile qualities we would usually associate with a satisfactory physical interaction
with an artificial device. In the games world the Kinect is perhaps doing
something similar, in this case being highly physical in terms of body movement, but, like the iPhone, with no tactile feedback.
4 Physicality in Design
So much for physicality in designed artifacts and their effects on the user, but
does physicality have an influence during the process itself? Naturally. The
importance of physicality-based design methods for physicality-based solutions is nicely illustrated by an exercise called The Marshmallow Challenge.
Figure 11 (left): TiVo remote controller
Figure 12 (right): Waterfall (1961) by Escher
In 2010 Tom Wujec analysed the performance of teams completing The
Marshmallow Challenge (Wujec 2010). The rules allow teams of four eighteen minutes to build the tallest freestanding structure they can from
20 sticks of spaghetti, 1 yard of tape and 1 yard of string. The structure
must support a marshmallow at its top. He found that recent graduates of
business school were amongst the worst performers and that ›recent graduates of kindergarten‹ were among the best performers. Why? The business school graduates spent too much time competing, discussing and
arriving at a consensus, only trying their ideas right at the end and only
having time for a single iteration. On average, they managed a spaghetti
tower height of 10”. The kindergarten ›graduates‹ went straight to work
developing a prototype, trying out several iterations before arriving at a
well-tried solution. Their average tower height was around 25”. The best
performing group (architects and engineers) used the same method. The
physical world is more complex than we can generally effectively simulate in our heads and so the iterative approach is often the most practical
solution, even for apparently simple problems such as The Marshmallow
Challenge.
An actual design case study example of physicality’s employment can
be found in the design of the TiVo remote controller. The controller was designed in 14 weeks by employing an iterative, prototype-led methodology
that involved potential consumers early on in every decision from »the feel
of the device in the hand to the best place for the batteries«. According to
Paul Newby, TiVo’s director of consumer design, producing physical proto-
types »early, ugly and often« was the key to the process. »Three-dimensional models were carved from rigid foam in the shape of spoons, slabs and
paddles«. The eventual shape resembles an elongated peanut. »The shape
is comfortable in your hand, it’s friendly and disarming. It’s designed for
simplicity, and it stands apart from the crowd of remotes on the coffee table.« The rubber buttons were chosen for the tactile quality of both their
physical feel and the »slight snap« as the control is activated. As Newby
says, »These are the devilish details that often get overlooked«.
There are a number of ways we can regard the way physicality-based
externalisations (Dix and Gongora 2011) are used in the design process:
1. As Product
Designers have a number of ways to approach the process. All have some elements of physicality about them but perhaps the most ubiquitous
is also the least physical: the sketch. The reason for its ubiquity is its flexibility. Virtually anything can be drawn and herein lies the strength of
the sketch. It is quick, cheap, flexible and usable for the design of any
product however complex. Unfortunately, its greatest strength can also
be a crucial fault if inappropriately used: Sketches can allow designers to
mislead themselves and others. It is surprisingly easy for even experienced designers to sketch convincing solutions that are literally physically
impossible (think of Escher’s drawings of impossible structures, Figure
12). Models and prototypes on the other hand exist in the real world and
so carry information that is much more difficult to fake, intentionally or
unintentionally.
2. As Representations
The functions of these prototypes are multifarious, including communicating to clients, and, critically, the trialling of ideas in practice. Some
external artefacts are physical and in some sense isomorphic with at least
aspects of the things being designed. Thus the blue foam ›soft‹ model represents the form of a design but little else of its physical nature. In a different
modality the hummed notes of the composer fulfil much the same function.
Some are more schematic or representative items such as rigs which might
only explore given elements of a final design and which may look entirely
different, be at a different scale or be in some other way distanced from it.
In the case of communicating to others, the design may be close to final form. However, a key role of the prototype is experimental. Designers
actually use the physicality of the making process to develop understanding of the problems they are dealing with in order to impose an elegant
solution. Both Schön (1984) and Alexander (1964) use scientific language
when talking of this: the concrete design as an ›experiment‹ or ›hypothesis‹. Certain limitations are only realised when an exact scale (or real size)
model is acted out in a real scenario. For example one of the authors once
had a discussion based around the concept of an internet-enabled Swiss
army knife. The idea was that useful tips could be shared via a web site and
step-by-step instructions for using different blades would be displayed on
a small screen on the side of the knife, using the toothpick as a stylus. Verbally this sounded fine (as it most probably would if sketched) but when
acted out with a scale model it quickly became apparent that at a critical
moment the fingers holding the knife would obscure the display.
In this sense physicality is formational. – Most writers have noticed the
common yet strange phenomenon that they know more after they have
written than they did before. This is only strange if one regards externalisation solely as an act of communication. The act of writing demands a
particular word, the need to sketch demands that the location of a door is
specified, the act of prototyping requires the components’ interactions to
be closely understood; what had been vague or fuzzy thoughts becomes
specific and concrete; the very process of elaboration of thoughts changes
the thoughts. Rather than pre-existing ideas being re-presented in an external form, the idea is itself formed in the process of presentation.
The nature of the materials and tools the designer uses at this point of
concept formation through externalisation can have a profound impact on
the kinds of designs produced. In studies we conducted of group design
using different materials, it was noticeable that those with plasticine or
cardboard and glue tended to explore the design space by way of example,
whereas those with paper and pencil tended to create more abstract lists
of properties. In a third example, those with card tended to create designs
around flat or cylindrical (rolled) shapes (Ramduny-Ellis et al. 2010;
Ramduny-Ellis et al. 2008).

Figure 13: Internet-enabled Swiss army knife – oops, thumb on the display
Physicality can also be transformational (Dix and Gongora 2011). – The external representation has properties that can be used to help in understanding or planning the eventual outcome. We may sit on the rig of a seat and
find that an element sticks uncomfortably in our back, we may find that
a particular material does not have the right degree of ›give‹ or softness,
or simply run our hands over the planned shape of the wing of a car. Sennett (2008) talks about the relationship between craftsman and material
as a form of conversation and Schön (1984) refers to the »back talk« of the
situation, part of knowing in action. In problem solving research it is well
known that changes of representation can offer obvious solutions to what
appeared to be intractable problems, and perhaps this move from internal
to external is the most radical transformation of all. It is this function of
externalisation as an augmentation of cognitive activity that is critical in
distributed cognition accounts and in those studying embodiment and by
extension physicality.
Formational (left) – vague ideas becoming clearer by the process of externalisation; Transformational (right) – thinking using materials
Conclusions
Physicality affects our daily life because it is a big part of what makes us
human. It integrates with our other senses to give us a whole experience of
our world. Physicality is crucial in the design of products themselves, influencing use patterns and potentially improving usability, aesthetics and
safety. However, as the Equinox and Hermes examples demonstrate, physicality is also crucial in the process of design, allowing designers to better
conceive their product ideas and allowing clients and test users to better
experience and conceptualise future products. In some senses designers
have long acknowledged this, but our relationship with physicality is still
not well understood, either by designers or users. A better understanding
and more consistent application of physicality as a design tool and an interaction mode will give us a richer, more satisfying and potentially safer
user experience.
References

Alexander, C. (1964) Notes on the Synthesis of Form. Harvard University Press, Cambridge, Mass.

Cheverst, K., A. Dix, C. Graham, D. Fitton and M. Rouncefield (2007) Exploring Awareness Related Messaging through Two Situated Display based Systems. Human-Computer Interaction, Volume 22, Numbers 1-2 (Special Issue on Awareness Systems Design: Theory, Methodology, and Applications), pp 173-220.

CNET Australia (2011) Nokia's New Interface is Seriously Twisted. http://www.cnet.com.au/nokias-new-interface-is-seriously-twisted-339325050.htm?feed=rss Accessed 2nd Nov 2011, 13:35.

Dix, A. (2002) Driving as a Cyborg Experience. http://www.hcibook.com/alan/papers/cyborg-driver-2002/ Accessed Dec 2012.

Dix, A., M. Ghazali, S. Gill, J. Hare and D. Ramduny-Ellis (2009) Physigrams: Modelling Devices for Natural Interaction. Formal Aspects of Computing Journal, Springer. doi: 10.1007/s00165-008-0099-y

Dix, A. and L. Gongora (2011) Externalisation and Design. In: DESIRE 2011, the Second International Conference on Creativity and Innovation in Design, pp 31-42.

Ghazali, M. and A. Dix (2003) Aladdin's lamp: understanding new from old. In: Proc. of 1st UK-UbiNet Workshop, Imperial College London. http://www.hcibook.com/alan/papers/ubinet-2003/

Ghazali, M. and A. Dix (2005) Visceral interaction. In: Proc. of the 10th British HCI Conf., vol 2, pp 68-72. http://www.hcibook.com/alan/papers/visceral-2005/

Gill, S., D. Walker, G. Loudon, A. Dix, A. Woolley, D. Ramduny-Ellis and J. Hare (2008) Rapid Development of Tangible Interactive Appliances: Achieving the Fidelity/Time Balance. In: Hornecker, E., A. Schmidt and B. Ullmer (eds) Special Issue on Tangible and Embedded Interaction, International Journal of Arts and Technology, Volume 1, No 3/4, pp 309-331.

Hare, J., S. Gill, G. Loudon, D. Ramduny-Ellis and A. Dix (2009) Physical Fidelity: Exploring the Importance of Physicality on Physical-Digital Conceptual Prototyping. In: Proceedings of INTERACT 2009.

Hutchinson, H., W. Mackay, B. Westerlund, A. Druin, C. Plaisant, M. Beaudouin-Lafon, S. Conversy, H. Evans, H. Hansen, N. Roussel and B. Eiderback (2003) Technology probes: inspiring design for and with families. In: Proc. CHI '03, ACM Press, pp 17-24.

Kyffin, S. and L. Feijs (2003) The New Industrial Design Program and Faculty in Eindhoven. In: Designing Designers – International Convention of University Courses in Design, Milan.

McGurk, H. and J. MacDonald (1976) Hearing lips and seeing speech. Nature, 264, pp 746-748.

Nielsen, J. and R. Mack (1994) Usability Inspection Methods. John Wiley & Sons, New York.

Norman, D. A. (1988) The Psychology of Everyday Things. Basic Books, New York.

Norman, D. (2005) Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books.

Quintanilha, M. (2007) http://interactionthesis.wordpress.com/2007/02/01/marble-answering-machine/ Accessed 28th November 2011.

Ramduny-Ellis, D., J. Hare, A. Dix, M. Evans and S. Gill (2010) Physicality in Design: An Exploration. The Design Journal, Volume 13, Issue 1, January 2010, pp 48-76.

Ramduny-Ellis, D., J. Hare, A. Dix and S. Gill (2008) Exploring Physicality in the Design Process. In: Undisciplined! Proceedings of the 2008 Design Research Society International Conference (16-19 Jul. 2008), Sheffield, UK.

Schön, D. A. (1984) The Reflective Practitioner. Basic Books, London.

Sennett, R. (2008) The Craftsman. Allen Lane, London.

Shneiderman, B. (1983) Direct Manipulation: a Step Beyond Programming Languages. IEEE Computer 16:8, pp 57-69.

Thimbleby, H. (2008) Ignorance of interaction programming is killing people. ACM Interactions, Volume 15, Issue 5, September + October 2008. ACM, New York.

Wujec, T. (2010) Build a Tower, Build a Team. TED Talks 2010. http://www.ted.com/talks/tom_wujec_build_a_tower.html Accessed 16:05, 2nd November 2011.