Alesis QS 8.1 Electronic Synthesizer
Interface Analysis and Design Recommendations
Based on Principles of Cognitive Ergonomics

IOE 536 – Cognitive Ergonomics
December 13, 2004

Written by:
Raul Baez-Toro
Jason Clark
Tom Ferris
Shameem Hameed
Il-hwan Kim
1.0 Introduction
The lights are bright and the sound is almost deafening. As the keyboardist in the
band, your solo is coming up soon. Right before the solo, you attempt to punch in
the button combination of the one specific sound patch you will need. Somewhere
in the middle of the five-button sequence, you forget that you also need to
deactivate the transpose function (which has shifted the pitch of the sound output).
With so many changes to be done in a short period of time, you almost miss your
cue. Then you wish you had missed it when the first notes of the solo are output in
the wrong musical key.
Our team used a systematic approach to evaluate the Alesis QS8.1 64-Voice Expandable
Synthesizer interface in a representative user domain. Real-life experiences similar to the
previous example motivated us to analyze this model and suggest improvements that could be
made to the interface. Following the Cognitive Triad, we evaluated the user (agent) and the
synthesizer system in a live performance environment (world), through the interface
(representation), and examined how the interactions between these elements affected
performance. The functionality of this model is outstanding, with a large variety of complex
sound output modulations, but the model fails to provide a user-centered interface suitable
for a domain where real-time performance is expected and errors are discouraged (e.g., a live
performance in a bar).
The first step in our analysis was to interview users who were familiar with a live
musical performance domain and had varying levels of skill and experience using electronic
synthesizers. We used this information to define a persona and target user scenario, in order
to narrow the scope of our heuristic evaluation. We defined a set of phenotypic errors and
problems (domain-specific descriptions, e.g., “the transpose icon is easily missed”) that we
could then translate to genotypic issues (cognitive language, e.g., a visibility issue) to be
analyzed in our heuristic evaluation.
Through the use of Jakob Nielsen’s heuristic evaluation and severity ratings, we
narrowed our analysis to focus on the most critical issues of the interface for the user in the
target environment. We took the genotypic descriptions of these critical issues and derived
cognitive-language solutions (e.g., “increase the visibility of the system status”). Then, by
shifting to a top-down abstraction hierarchy (translating functional requirements to physical
form), we proposed physical design solutions.
1.1 Brief Description of the System
A music synthesizer is an electronic device that can reproduce sounds from many
musical instruments, and modulate those sounds in multiple dimensions. It may be thought of
as the “Swiss Army Knife” of musical instruments. The process of condensing a multitude of
functions into a single instrument necessitates a complex interface. The following section
describes key characteristics of the interface that are relevant to our evaluation. For a more
comprehensive system description, see Appendix A.
LCD Display
This display is the only reliable source of visual feedback available to the synthesizer
operator. Figure 1 indicates the primary areas of interest for our analysis.
Figure 1: Areas of Interest on the LCD Display
Table 1: Description of LCD Display Components Highlighted in Figure 1

1) The large numbers on the left side of the LCD display the Program or Mix code. The first
(two) digit(s) represent the instrument group (in this case <00>), and the last digit
represents the instrument subgroup (in this case <1>).
2) The top line of the display shows the name of the particular sound patch that is active.
This patch is unique to the current Bank, numerical code (the large numbers), and Mode (see
the Sound Patch Selection Buttons explanation below).
3) TRN: An up or down arrow will appear in this area of the LCD if transpose is active.
   ABCD: The four vertical bar graphs represent the levels of the Multi-Function Controllers A–D.
4) This displays the Bank number. There are five Banks for each Program or Mix code.
5) This displays the current mode (Mix, Program, or Edit).
* Refer to Appendix A, Figure A2 for illustrations of the following buttons/controls
Sound Patch Selection Buttons
The process of sound patch selection is characterized by four menu choices, illustrated
by the flowchart in Figure 2.
Figure 2: Sound Patch Selection
1. First select either Program or Mix play mode by pressing the corresponding button. Program
Mode is characterized by sound patches that replicate single instruments or synthesized
sounds. Mix Mode combines multiple Program Mode patches into single selections by either
overlaying the sounds or dedicating separate key ranges to output the different sounds.
2. Next, make choices from the following three menus (Instrument Group, Instrument Subgroup,
and Bank). These choices may be made in any order, and changing one menu does not affect the
constancy of the other two.
   A) Instrument Group. Press one of the thirteen buttons in the top row of the code selection
   area. Each button corresponds to sound patches that are based on representations of a
   certain instrument; the buttons range from [00] (Piano) to [120] (Drums/Percussion). Note
   that in Mix Mode, only buttons [00] to [90] are active.
   B) Instrument Subgroup. Press one of the ten buttons ([0] through [9]). The subgrouping is
   arbitrary, as this value does not convey any information about the modulation of the sound.
   The Instrument Group and Subgroup make up the numerical code displayed on the LCD.
   C) Bank. Each of the five Banks for a numerical code represents a separate sound patch.
   Pressing either [◄ Bank] or [Bank ►] will cycle through Preset1 through Preset4 and the
   User Bank, which is defined through Edit Mode or defaults to a fifth unique preset.

With 13 Instrument Groups in Program Mode, 10 in Mix Mode, and different sound patches for
every combination of Instrument Subgroup and Bank, there are over 1,000 possible sound
patches accessible on the QS8.1. Switching between patches that have none of the above menu
choices in common can require up to five button presses (Mode, Instrument Group, Instrument
Subgroup, and cycling two Banks away).
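The selection structure above can be sketched in code. The following Python model is purely illustrative (the QS8.1’s internal firmware is not public); the class and method names are our own invention, but the menu dimensions and the button-press counts follow the description above:

```python
# Illustrative model of QS8.1 patch selection (hypothetical, not Alesis
# firmware): a patch is identified by Mode, Instrument Group, Instrument
# Subgroup, and Bank. Counts follow the report's description.
from dataclasses import dataclass

BANKS = ["Preset1", "Preset2", "Preset3", "Preset4", "User"]

@dataclass
class PatchSelection:
    mode: str        # "Program" or "Mix"
    group: int       # 0..12 -> buttons [00]..[120] (Mix Mode: 0..9 only)
    subgroup: int    # 0..9  -> buttons [0]..[9]
    bank: int        # index into BANKS

    def presses_to(self, other: "PatchSelection") -> int:
        """Minimum button presses to reach `other`: one press for each of
        Mode/Group/Subgroup that differs, plus cyclic Bank steps (the Bank
        buttons only move one position at a time)."""
        presses = 0
        presses += self.mode != other.mode
        presses += self.group != other.group
        presses += self.subgroup != other.subgroup
        diff = abs(self.bank - other.bank)
        presses += min(diff, len(BANKS) - diff)  # cycle forward or backward
        return presses

# Patch count per the text: 13 groups x 10 subgroups x 5 banks (Program)
# plus 10 x 10 x 5 (Mix) = 1150, i.e. "over 1000" accessible patches.
```

A worst-case switch (different Mode, Group, and Subgroup, Bank two positions away) costs five presses, matching the figure quoted above.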
Volume and Multi-Function Controllers
The use of sliders allows naturalistically mapped analog level control (sliding up
increases the level) that can be assessed at a glance. Volume is controlled with the leftmost
slider in Figure A2, and the other four sliders represent Controllers A, B, C, and D, which
modify different non-volume parameters (e.g., sound “fullness”, reverberation, distortion) of
a sound patch. A user must refer to the LCD display to observe the actual output levels of the
Controllers; with any change in mode or patch, the controller parameters are reset to zero and
placed in different states of activation.
Transpose Function
The transpose function shifts the musical key of the output sound. When the interface
is transposed, pressing a white or black key on the keyboard will output the pitch that is
normally associated with a key that is a certain number of half-steps up or down from the
pressed key. The LCD will display an up or down arrow to indicate that this function is active,
and the direction of the transposition. The transpose function is activated by holding down the
transpose button in Figure A2 and pressing the key of the desired output (this key’s pitch will
be output when middle C (C3) is pressed), and deactivated by holding down the button and
pressing C3.
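The transpose semantics just described can be modeled compactly. This Python sketch is hypothetical (names and the use of MIDI note numbers are our own assumptions; the QS8.1 manual describes only the button behavior), but it captures the rule that the key pressed while holding Transpose becomes the pitch output for middle C, and that pressing C3 resets:

```python
# Hypothetical model of the QS8.1 transpose behavior described above.
# Pitches are MIDI note numbers; middle C (C3 on the QS8.1) is note 60.

MIDDLE_C = 60

class Transpose:
    def __init__(self):
        self.offset = 0  # semitones; 0 means transpose is inactive

    def set_from_key(self, pressed_key: int) -> None:
        """Hold Transpose + press a key: that key's pitch will sound when
        middle C is played. Pressing C3 itself resets the offset to 0."""
        self.offset = pressed_key - MIDDLE_C

    def output_pitch(self, pressed_key: int) -> int:
        """Every key is shifted by the stored offset."""
        return pressed_key + self.offset

    def lcd_icon(self) -> str:
        """The LCD shows only direction, not magnitude -- the visibility
        issue raised later in this report."""
        return "↑" if self.offset > 0 else "↓" if self.offset < 0 else ""
```

Note that `lcd_icon` deliberately discards the magnitude, mirroring the interface limitation analyzed in Sections 2 and 3.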
2.0 Analysis Methods and Results
Our team incorporated a number of analysis methods to evaluate the QS8.1
synthesizer. First, we conducted interviews with experienced users to determine common
problems with the interface and to gain some insight into the user environment. Using the
information from the interviews, we were able to develop a user environment scenario and a
representative persona. Then, we utilized this persona in the scenario to conduct a heuristic
evaluation, based on Jakob Nielsen’s Usability Principles. Finally, we isolated the most
critical issues from the heuristic evaluation using Nielsen’s severity ratings.
2.1 Open-ended interviews
The first step was to conduct open-ended interviews with users that were experienced
with this interface or the interfaces of similar synthesizer models. Through these interviews,
we were able to compile a list of common difficulties with operating the QS8.1 interface. We
were also able to understand the live performance environments where these users interacted
with the synthesizer. For a summary of interview questions and representative responses,
refer to Appendix B.
2.2 User Environment and Persona
The information from the interviews was used to define a scenario that represents the
user environment and to define a persona, which is an artificial identity that represents a
target user in that environment. In the subsequent evaluation, team members attempted to
analyze the interface from the perspective of the persona in the defined environment.
For our heuristic evaluation, the design principles are evaluated within the context of
live performance and preparation for the performance, of a keyboardist in a rock band (the
target user persona). The following assumptions summarize the target user persona and the
environment:
• Scenario Ambience: In the live performance scenario, the auditory modality is saturated by
the music produced from the various instruments and the noise of the crowd. The ambient
illumination is low, and there are intermittent flashing lights, leading to a compromised
visual modality. Any work done in preparation for the performance is done with negligible
environmental distraction.

• User Expertise: The keyboardist is considered to be an expert in this domain; therefore the
manipulation of the piano keys may be done in a skill-based manner. The cognitive processing
required for this is assumed to be negligible.

• Distributed Attention: The user’s attentional resources are distributed between the
continuous skill-based task of playing music and coordination with the other band members.
This coordination is done with visual and auditory cues from the band members and their
instruments. The attentional resources required to operate the interface will compete with
the attention on the band members.

• Time Pressure and Error Criticality: The performance scenario is real-time, so there is
little opportunity for recovery from errors. The user interacts with the interface to change
sound patches or parameters of the sound at different instances throughout the performance.
Each instance may be characterized by time pressure, because such changes commonly need to be
made without disturbing the musical flow or rhythm of the piece. Also, errors in button-press
combinations can have serious consequences, in that a completely unexpected sound may be
output, interrupting the entire music sequence.
2.3 Heuristic Evaluation and Severity Ratings
Our team next conducted a basic heuristic evaluation of the QS8.1 interface based on
Jakob Nielsen’s Usability Principles. We primarily focused on the gulfs of execution and
evaluation for the persona in the live performance environment. After analyzing each usability
principle, we assigned severity ratings according to Nielsen’s method. Severity ratings range
from zero to four, with four being reserved for the issues that require the most urgent
attention. Table 2 includes a brief description of the usability issues that received the highest
severity ratings. For the complete heuristic evaluation and more detailed explanation of each
issue, refer to Appendix C.
Table 2: Summary of Usability Principles and Issues
Usability Principle: Visibility of the System Status (Severity Rating: 4)
1. The display is spatially displaced from the majority of interface controls, forcing the
user to shift visual attention from the controls to get action feedback.
2. The active status of the multi-function controllers for each sound patch is not clearly
communicated, and can only be discovered by trial and error. Also, the particular sound
parameter that is modified by each controller is not displayed.
3. The transpose function icon on the LCD is not salient, and serious errors can result if
the user is unaware that the transpose function is active.

Usability Principle: Match Between System and Real World (Severity Rating: 3)
1. Sound patch codes are arbitrary and convey no information about patches with similar
codes. Incremental changes within Instrument Subgroups or cycling through Banks represent
substitutive, not additive, changes.
2. The physical state of the multi-function controllers does not always match the actual
level of the represented sound parameters. The particular sound parameter that is modified is
not constant across sound patches; therefore expected modulation may not match output.
3. The transpose function representation is an up/down arrow on the display, which is a
mismatch to the corresponding left/right shift in key output.

Usability Principle: Support Recognition Rather than Recall (Severity Rating: 3)
1. The user is forced to recall button sequences for specific sound patches. The user may
have great difficulty remembering multiple patch button sequences that must be entered within
short periods of time.
2. The transpose function display does not indicate the particular musical key of the
transposition. The user must remember this information.

Usability Principle: Recognition, Diagnosis, and Recovery from Errors (Severity Rating: 3)
1. The QS8.1 does not have an online help system. The only source of information about how to
handle/avoid particular errors is the Alesis manual.
2. There are few available escape routines or other error management implementations. A user
is not able to “undo” an accidental function command or return to a previous state.
3.0 Design Improvement Suggestions
This section is concerned with design improvements that address each of the most
severe usability issues. This is not a comprehensive list; given more time we could report
many more improvement suggestions. For illustrations of many of these design suggestions,
refer to Appendix D.
3.1 Visibility of the System Status
Illuminate Sound Patch Buttons (Figure D1)
When entering a sound patch sequence, the user is unable to tell which patch is active
without shifting visual attention to the LCD display. The user will encounter difficulty in
bridging the gulf of execution, as he will not immediately be informed that the system
accepted his command. To address this problem, a general solution is to co-locate visual
feedback with the control. Therefore, we recommend illuminating the sound patch selection
buttons as they are pressed. This will allow the user closure in confirming that buttons were
selected. One problem that may result from this is that feedback is now distributed in different
locations, instead of being located solely on the LCD screen.
Show Controller Status and Label Functions (Figure D2)
It is difficult to determine the status (active/inactive) of the multifunction controllers in
the current interface. This is a problem with perceived affordances, and again, in bridging of
the gulf of evaluation. To address these problems, the system should be modeled as a reliable
direct manipulation interface (DMI), where the affordances are clearly communicated.
Therefore, we suggest light indicators below each controller to show which controllers are
active for the selected sound patch.
Another problem is that the controller labels (A,B,C,D) are not informative, as they do
not indicate the underlying function, which changes depending on the active sound patch.
Symbols on interfaces should allow a user to intuitively understand represented variables, and
these variables should be consistent. To address this, appropriately label the DMI controls, and
force them to manipulate consistent variables. We suggest limiting each controller to adjust
only one specific sound parameter, and labeling the controllers with the particular parameters
they modify. Integrating these changes will communicate system affordances to the user
through a direct manipulation interface in a clear and consistent manner. One issue that will
need to be further addressed is where to position the labels so that they are not visually obstructed.
Salient Transpose Representation (Figure D3)
The current transpose function icon is not salient on the LCD display. Since a user’s
attention is divided across modalities, the ability to focus visual attention on displays
representing specific functions is hindered, and this can result in errors of omission. To
address this, increase the salience of feedback and display it in such a way to accommodate
pre-attentive reference. We suggest illuminating the transpose button when the function is
active. By displaying a salient light signal, the user can pre-attentively realize the activity of
the transpose function, and therefore will be less likely to commit errors of omission without
distracting visual attention. In addition, by illuminating this feature, the operator can bridge
the gulf of evaluation with simple binary feedback.
3.2 Match Between System and Real World
Force System Level to Match Controller Level
There is a possibility of mismatch between the physical controller level and the internal
system level of a sound parameter, in that the system level resets when switching sound
patches. Multiple inconsistent representations can result in user models that do not match
system models. Having one consistent direct manipulation interface will simplify such a
system. We suggest that the system level be forced to match the physical controller level at
all times. The current LCD indication system for these controllers should be removed to
maintain consistency. This change would preserve the naturalness principle and provide a
better match between the user mental model and system model. One problem with this
change is that now each controller must be manually reset when changing sound patches if
the current parameter modifications are not desired.
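A minimal sketch of this “slider is the truth” suggestion follows. It is hypothetical (class and method names are ours, and 0–127 slider travel is an assumed convention), but it shows the key behavioral change: on a patch change, the system re-reads the physical slider instead of resetting the internal level to zero:

```python
# Hypothetical sketch of the proposed behavior: the internal system level
# always mirrors the physical slider position, including immediately
# after a patch change (instead of resetting to zero as the QS8.1 does).

class Controller:
    def __init__(self, physical_position: int = 0):
        self.physical_position = physical_position  # 0..127 slider travel (assumed range)
        self.system_level = physical_position       # always kept in sync

    def move_slider(self, position: int) -> None:
        self.physical_position = position
        self.system_level = position

    def on_patch_change(self) -> None:
        # Re-read the physical slider so the user's mental model
        # ("slider position = parameter level") stays valid.
        self.system_level = self.physical_position
```

The cost noted above falls out of this design: if the carried-over level is unwanted, the user must physically move the slider back down.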
Additive Sound Patch Changes
Within an Instrument Group, the Instrument Subgroup and Bank of a particular sound
patch seem to be arbitrary, in that there is no information conveyed from these entries. This
requires a user to discover system state by data-driven procedures (trial and error). By
accommodating knowledge-driven search, a desired system state may be found more readily.
We suggest that incremental changes in either the Subgroup or Bank should represent sound
parameter values on a continuum. For example, if code 50 represented the piano group then
code 51 may be made to represent a piano sound with an incremented amount of
reverberation, and the amount of reverberation could be even more finely tuned by cycling
through the Bank positions at this code. Allowing the representation to be made more additive
as opposed to substitutive, and providing the opportunity to develop the appropriate mental
model will reduce the requirement for exploration by the user in searching for a desired sound
patch. It should be noted that there is no universal standard to quantify parameter increase
for some instruments, and this could be an issue in implementing this design suggestion.
Display Transpose Output Key (Figure D4)
The transpose function display icon mismatches the keyboard layout in that the
up/down arrow represents a right/left shift on the keyboard. This violates the naturalness
principle. Changes should be made to reflect the actual state of the system with clear symbols
understood by all users. We suggest that the pitch (the particular key on the keyboard, for
example, Bb3) of the transposed output be displayed in place of an arrow to avoid this
mismatch.
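The proposed display text can be computed directly from the transpose offset. This helper is illustrative only (the note-name spellings and the QS8.1’s octave numbering for middle C are our assumptions), showing how “Bb3”-style feedback could replace the arrow:

```python
# Illustrative helper (hypothetical, not QS8.1 firmware) rendering the
# display text this report proposes: the actual pitch heard when middle C
# is pressed (e.g. "Bb3"), instead of a bare up/down arrow.

NOTE_NAMES = ["C", "C#", "D", "Eb", "E", "F", "F#", "G", "Ab", "A", "Bb", "B"]

def transpose_display(offset_semitones: int, base_octave: int = 3) -> str:
    """Name of the pitch output for middle C (C3 on the QS8.1, an assumed
    numbering), given the transpose offset in semitones."""
    index = offset_semitones % 12               # note within the octave
    octave = base_octave + (offset_semitones - index) // 12
    return f"{NOTE_NAMES[index]}{octave}"
```

For example, a two-semitone downward transposition would read “Bb2” rather than a bare down arrow, giving both direction and magnitude at a glance.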
3.3 Support Recognition Rather than Recall
Hotkeys (Figure D5)
It is difficult for the user to remember a number of sound patch button sequences and to
enter them quickly within a small window of time during a performance. Since the
operator bears ultimate responsibility for the operation of the system, building intelligent
system elements to allow the user to use shortcuts will decrease errors of commission and
remove the mental workload burden of recalling some procedures. Our suggestion is to add
shortcut buttons (hotkeys) to the interface. These will be pre-programmable buttons which will
provide one-touch access to specific sound patches and reduce the burden of memory recall
and the shift in attention to focus on the interface. The user should be aware that the addition
of this automated function allows for latent errors to be introduced, if the hotkeys are not
programmed properly before the performance.
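The hotkey suggestion can be sketched as follows. The sketch is hypothetical (slot count and naming are ours); it shows both the benefit (one press replaces a five-button sequence) and the latent-error risk flagged above (an unprogrammed slot fails at performance time):

```python
# Hypothetical sketch of the proposed hotkey (shortcut) buttons: each
# slot stores a complete patch selection programmed before the show.

class Hotkeys:
    def __init__(self, slots: int = 4):
        self.slots = [None] * slots

    def program(self, slot: int, mode: str, group: int, subgroup: int, bank: int) -> None:
        """Done during preparation, with negligible distraction."""
        self.slots[slot] = (mode, group, subgroup, bank)

    def press(self, slot: int):
        """One-touch recall during the performance."""
        patch = self.slots[slot]
        if patch is None:
            # The latent error: forgetting to program a slot beforehand
            # surfaces only when the hotkey is pressed on stage.
            raise ValueError(f"hotkey {slot} was never programmed")
        return patch
```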
Display Transpose Output Key (Figure D4)
Since the current transpose display shows the direction, but not the magnitude of
transposition, the user is forced to remember the particular key of the output. This is an
example of the keyhole effect, where only a restricted amount of information about the
system is available. The user is not allowed any additional peripheral information. In order to
bridge the gulf of evaluation, the display should provide more information about the system
state. Our solution is to display the output key in place of the arrow symbol, which is more
informative to the user as it displays the magnitude of transposition.
3.4 Recognition, Diagnosis, and Recovery from Errors
Help Button (Figure D6)
Online help is not available through the QS8.1 interface. The number of mistakes can
be reduced by supporting robust user mental models. We suggest the provision of a help
button, which when pressed in conjunction with any other button would display a short
description about the function of that button. Pressing help again would return to the normal
display. This function assists in knowledge-based behavior mode and may help the user to
generate more robust mental models.
Undo Button (Figure D6)
The QS8.1 interface does not support error management. A necessity in supporting
error management is an escape routine, which allows a user to mitigate the consequences of
erroneous actions. We suggest implementing an “undo” button that allows a user to revert
back to the state of the system before the last function/step performed. This would allow an
escape routine for when the user makes an error or is confused by interactions with the
interface.
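The escape-routine idea reduces to a simple history stack. This minimal sketch is hypothetical (state keys and names are illustrative): every state-changing action pushes the prior state, and Undo pops back one step:

```python
# Minimal hypothetical sketch of the proposed "undo" button: each
# function/step pushes the previous state onto a history stack, and
# pressing Undo reverts to the state before the last action.

class SynthState:
    def __init__(self):
        self.state = {"patch": "Program 001", "transpose": 0}  # illustrative fields
        self._history = []

    def apply(self, **changes) -> None:
        """Perform an action, remembering the prior state for undo."""
        self._history.append(dict(self.state))
        self.state.update(changes)

    def undo(self) -> None:
        """Escape routine: revert to the state before the last action."""
        if self._history:
            self.state = self._history.pop()
```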
4.0 Next Steps
Our analysis and proposed solutions were limited by time and resources. Given more of
either, we would continue the effort in the following sequence:
1. Perform an iterative design process that includes building prototypes of suggested
designs, then critically evaluating them within our group.
2. Simulate the environment and perform user testing with musicians to measure performance
improvements of the redesign and identify potential problems.
3. Implement and test the most successful redesign solutions in the domain scenario of a live
performance.
4. Propose improvements for production by performing various analyses (e.g., cost/benefit
analysis).
Appendix A: Comprehensive System Description

The synthesizer can be divided into two main areas based on functionality:
1) Device Controls (buttons that allow functional control of the synthesizer)
2) Music Interface (the black & white keys that control musical output)

Figure A1: Top view of Alesis QS 8.1

Our evaluation was limited to analysis of the Device Control Interface, which is detailed in
Figure A2.
Figure A2: Device Control Interface Components

Power Switch: Power plug and power on/off.

I/O Interface: The output devices are connected to the keyboard via this interface. Hardware
add-ons, such as preamplifiers and various foot pedals, are also connected here.

Memory & Function Expansion: This interface allows for expansion of system memory, additional
sound libraries, etc.

Pitch Bend/Modulation Wheels: The Pitch Bend Wheel is used to expressively bend the pitch of
the synthesized sound. The Modulation Wheel is used to create interesting sonic changes in the
current Program or Mix. [Alesis Manual]

Volume & Multi-Function Controllers: The volume control slider is used to adjust the sound
output level of the synthesizer. The controllers A, B, C, and D are programmable and can be
used to obtain hands-on control of many different parameters. [Alesis Manual]

Edit Mode Controls:
   [▲ Value] / [▼ Value] – In Edit Mode, these buttons increment/decrement the selected value.
   [Edit Select] – This button activates Edit Mode. Pressing either <Mix> or <Program> cancels
   Edit Mode to return to normal play.
   [Store] – This button is involved in saving and loading both User and Card Banks, in
   copying sound patches, and in initializing individual patches within Program Mode.
   [Page ◄] / [Page ►] – In Edit Mode, these buttons cycle backward/forward through the
   available “pages” for the current parameter (page indication is in the upper right of the
   LCD).

Seq Select & Transpose:
   Seq Select – Pressing this button activates Sequence Playback Mode.
   Transpose – This activates the transpose mode for the synthesizer output. Holding the
   Transpose button and pressing the third C key from the left (also known as C3) resets this
   mode.

Sound Patch Controls: These 27 buttons, grouped together at the right side of the front panel,
are used to select a particular sound patch within the current mode.
LCD Display
Figure A3: LCD Display Components
1) The large numbers on the left side of the LCD display the Program or Mix code. The
first (two) digit(s) represent the instrument group (the top row of buttons in the sound
patch controls – in this case <00>), and the last digit represents the instrument
subgroup (the bottom row in the sound patch controls – in this case <1>).
2) The top line of the display shows the name of the particular sound patch that is active.
This patch is unique to the current Bank, numerical code (the large numbers), and
Mode. This location shows the name of the selected Function in Edit Mode.
3) The icons displayed at these labels represent the activity of several functions.
[CLP] In Program or Mix Modes, an exclamation point (!) will appear in this area of the
LCD if the signal clips internally.
[SEQ] A blinking arrow will appear in this area of the LCD if a card sequence is
triggered.
[TRN] An up or down arrow will appear in this area of the LCD if transpose is active.
[ABCD] The four vertical bar graphs represent the Controller A-D slider positions.
4) In Play Mode the middle line displays the bank number. In Edit Mode it displays the
name of the parameter being edited. In Mix Program Select mode, it displays the
Program and its MIDI channel.
5) This displays the current mode (Mix, Program, or Edit).
6) The bottom line displays the MIDI channel numbers.
Appendix B: Interview Summary and Representative Responses
Our team conducted unstructured interviews with three keyboard musicians who had
varying levels of experience with the QS8.1 interface. The following are representative
questions and responses during those interviews.
Q: Describe a scenario that you have been in where you have played this or a similar
keyboard instrument and used some of the specialized functionality of that instrument.
R: Playing in a bar, with a band. I am paying attention to the other band members to know
when to take cues, either from the music or from visual communication with the other
members. It’s very loud and I can’t see some things because of the lights. I may get nervous
and mess up. Also I might be a little intoxicated if there’s free beer for the band.
Q: What are some common problems you have within these scenarios when interacting with
the interface of the QS8.1 or a similar instrument?
R: I might want to switch between two or three sound patches that require long sequences of
button presses. This is difficult to do if I only have a few measures to switch. Also, switching
between some sound patches drastically changes the output volume, so I have to adjust that
with each switch.
R: Sometimes I use the transpose function for a particular song, and then I start the next
song and I forget to turn off transpose. This causes a horrible dissonance with the other
instruments until it can be fixed. It’s easy to forget that transpose is on because the
synthesizer does not clearly display that the function is active.
R: I really have no idea what’s going on with the edit functions for this. Whenever I
accidentally or purposely enter Edit Mode, I can’t ever get out of it and so I have to either
overwrite something or unplug the synthesizer.
Q: What are some other improvements you might suggest for the QS8.1 interface?
R: I never know what the controller sliders change specifically. They should either always
change the same parameters, or have a way to label them.
R: I could really use some kind of shortcut buttons so I don’t have to always remember which
codes I need to enter and try to enter them quickly in a very small amount of time.
Appendix C: Complete Heuristic Evaluation
Heuristic Category: Visibility of the System Status
Focus: The system should always keep users informed about what is going on, through
appropriate feedback within reasonable time.
Severity: 4
The selection of features on the QS8.1 is dependent upon layers of options and menus
that provide feedback through a 1.5x3 inch LCD display. Due to the limited space available on
the display, symbols displayed can have multiple meanings dependent upon the current mode.
The lack of description for these symbols can lead to errors in the user environment. For
example:
1. The symbol for the transpose function is a small up/down arrow ( ↑ ↓ ), which indicates transpose activation and direction (to a higher or lower pitch). However, it does not indicate the magnitude of the transposition, and therefore the particular musical key of the output cannot be inferred directly. Also, due to the diminutive size of the indicator on the LCD, it is not salient.
2. The display is spatially displaced from the majority of interface controls. This necessitates a reorientation of visual attention for closure/feedback after performing an action with the interface controls.
3. The physical state of the multi-function controllers does not always match the actual level of the represented sound parameters. Also, the particular controllers that are active depend on the particular sound patch. To determine which controllers are active and at what level, one must find the representation of the level on the LCD display (see **figure A3** of Appendix A). Although the LCD indicates which controllers are active (for the current sound patch), it will not identify the particular sound parameter they modify, nor the parameter modified by the modulation wheel control. The modulation wheel is also inconsistent with the other controllers in that its modulation level is determined solely by the physical position of the wheel.
4. The status of I/O devices, such as the three control pedals and the amplifier, is not indicated on the LCD display. Therefore, appropriate settings for these devices can only be found through trial and error. This can be a problem if, for example, no sound is heard when a key is pressed: the cause may be sound patch or volume settings, or any number of settings or malfunctions in the peripheral devices or their connections.
5. Volume settings do not produce consistent sound output volume. As the operator changes sound patches, the intensity of the sound can increase or decrease without adjustment of the main volume. There is neither an indicator nor a control for this condition.
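One way to address the inconsistency in item 5 would be a per-patch gain trim applied behind the main volume control. The sketch below is purely illustrative: the patch names, peak values, and function are our own assumptions, not part of the QS8.1 firmware.

```python
# Sketch: per-patch gain trim so that switching sound patches keeps output
# level consistent at a given master volume. All values are illustrative.
patch_peak_level = {"Piano 1": 0.9, "Organ 2": 0.5}  # hypothetical measured peaks

def output_gain(patch: str, master_volume: float, reference: float = 0.9) -> float:
    """Scale the master volume so every patch peaks at the same reference level."""
    trim = reference / patch_peak_level[patch]
    return master_volume * trim

# The quieter patch gets a proportionally larger gain at the same volume setting.
print(round(output_gain("Organ 2", 0.8), 2))  # 1.44
print(round(output_gain("Piano 1", 0.8), 2))  # 0.8
```

With such a trim table, switching from a loud patch to a quiet one would no longer require the performer to ride the main volume slider mid-song.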
The overall lack of representation of system settings can have implications in a domain where
error tolerance is imperative. Since the system is “dumb” it cannot determine the difference
between error and desired performance. The system operator then becomes the only
measure of defense. Without the appropriate information, errors can occur, even for an
experienced user.
Heuristic Category: Match between System and Real World
Focus: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
Severity: 3
The QS8.1 displays settings with a complex array of symbols, and the user is forced to
decode/interpret them. There are multiple examples of potential display/real world
mismatches that may occur when attempting to interpret the system settings:
1. The numerical value of the subgroup within an instrument type seems to be completely arbitrary, in that it does not convey any information about similarity to sounds with nearby values. There are also substitutive changes between sound patches in Banks of the same subgroup. The user is then forced to randomly select sounds until they find the desired output. Desired sound patches could be found much more quickly if numerical codes and incremental changes of Bank represented additive changes in the sound parameters.
2. The modulator and multifunction controllers give no indication as to the particular sound parameters they address, and therefore the expected modulation may not match what is output when switching sound patches. Trial and error is the only user approach for discovering represented controller parameters.
3. The transpose function representation is an arrow pointing either up or down on the display, but the transposed output key is physically located left or right of the activated key (for example, if the output is transposed to the key of D, the arrow should point to the right instead of up, because the key of D is located to the right of middle C, which is the normal output key).
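The key-naming problem in item 3 is simple modular arithmetic that the display could perform for the user. As a hedged illustration (the function and note-name table below are our own, not QS8.1 firmware):

```python
# Sketch: derive the output key name from a transpose offset in semitones.
# The twelve pitch classes, starting from C (the untransposed key).
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def transposed_key(semitone_offset: int) -> str:
    """Return the musical key produced by transposing from C by the given offset."""
    return PITCH_CLASSES[semitone_offset % 12]

# A +2 semitone transpose from C yields the key of D; -1 semitone yields B.
print(transposed_key(2))   # D
print(transposed_key(-1))  # B
```

Showing the computed key name on the LCD, instead of only an arrow, would remove any need for the performer to do this arithmetic mentally.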
When the user is in a live performance scenario, these mismatches can result in errors.
Heuristic Category: User Control and Freedom
Focus: Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
Severity: 2
In the defined user environment, monitoring occurs primarily through audition. Changes in pitch, key, tone, sound type, etc. are typically realized by the user without any need for visual indication. However, if a change is unsolicited or unplanned, the user’s mental model is disrupted. They must then diagnose the change and adjust the settings to meet operation preference. Issues that illustrate limitations of the QS8.1 in terms of user control and freedom are:
1. The interface lacks an expedient method to reverse a change (i.e., an ‘undo’ button), often forcing the operator into a taxing knowledge-driven environment that detracts from their ability to maintain contiguous play.
2. The complexity of manipulating controls for certain functions can limit the types of changes that can occur during a live performance. Situations exist where a user must avoid switching to certain sound patches because the button sequence is too long to be completed in a small window of time.
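The ‘undo’ button suggested in item 1 amounts to keeping a stack of prior setting states. A minimal sketch of the idea, with illustrative names of our own choosing (this is not Alesis firmware):

```python
# Sketch of a one-button undo: every settings change pushes the prior
# state onto a stack, and undo pops back to it.
class SettingsHistory:
    def __init__(self, initial: dict):
        self.current = dict(initial)
        self._stack = []

    def change(self, **updates):
        self._stack.append(dict(self.current))  # save prior state
        self.current.update(updates)

    def undo(self) -> bool:
        if not self._stack:
            return False  # nothing to undo
        self.current = self._stack.pop()
        return True

# An accidental transpose can be reversed with a single action.
h = SettingsHistory({"patch": "Piano 1", "transpose": 0})
h.change(transpose=2)
h.undo()
print(h.current["transpose"])  # 0
```

The appeal of this design for live performance is that the recovery action is constant (one press) regardless of how complex the mistaken change was.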
Heuristic Category: Consistency and Standards
Focus: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
Severity: 4
The QS8.1 has multiple modes, Mixes, Programs, I/O options, etc. Depending on the
configuration, different buttons, sliding controllers, modulators, and even keys can have
different functions. With respect to the LCD display, the only features that are relatively
consistent are the large numbers indicating instrument group and subgroup. The meaning of
the remaining display elements will vary depending on the current mode. With this in mind, we have identified the following violations:
1. The <Mix> and <Program> buttons have inconsistent functionality between operation modes. In addition to switching to either Mix or Program mode in normal operation, these buttons perform the following actions:
• <Program> or <Mix> must be pressed to exit Edit mode.
• <Mix> must be pressed simultaneously with an instrument subgroup button to enter Demo mode, and <Mix> must be pressed to exit this mode.
• Confirmation of an edited sound patch in Program mode is done with the <Program> button, and cancellation is achieved with <Mix>.
These examples illustrate the wide range of non-intuitive actions that are completed
through these controls.
2. The level of the multi-function controllers resets each time the user changes sound banks, even though the physical state of the controllers remains constant. This forces the user to re-tune the modulation by manipulating each controller prior to playing, or the desired sound may not be achieved.
3. The transpose icon indicator on the LCD does not provide information about the transposition magnitude, resulting in possible error generation.
Heuristic Category: Error Prevention
Focus: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
Severity: 2
The QS8.1 system is “dumb” in that it cannot determine the difference between error and desired performance. The system operator then becomes the only measure of defense. There are no safeguards to promote error management, such as an “undo” button. For the most part, this is not a critical issue in the user scenario, since error correction does not require an overly complicated process. However:
1. Having to key in long sequences of buttons to reach desired sound patches increases the likelihood of an error in live performance.
2. If Edit mode is purposely or inadvertently entered, it is very difficult to exit without turning off the power. Since Edit mode has the potential to make permanent changes to the sound patches, there should be a more readily available exit/undo routine.
3. The states of peripheral devices are not displayed. Therefore, if there is an errant setting in any of these devices, the user will not be aware of it until sound feedback makes the error apparent. For example, the sustain pedal settings may be inverted depending on the pedal’s state when the power switch is turned on (a depressed pedal should sustain the note and releasing it should stop the note, but the reverse occurs). This is not communicated to the user until sound is produced.
Heuristic Category: Recognition rather than Recall
Focus: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
Severity: 3
Given the extensive functionality of the QS8.1, it is nearly impossible for a user to keep track of all active sound patches, settings, and active devices. Therefore, it is imperative that all state settings are displayed clearly and efficiently to promote user recognition at a glance. Some examples of the interface not effectively supporting recognition are:
1. The LCD display on the synthesizer is too small to clearly communicate the large amount of information about the state of the system. Due to the size limitations, some modes will require the user to page through multiple displays to achieve the desired goal.
2. The user is forced to recall button sequences for specific sound patches. If multiple patches are to be utilized in a short period of time during performance, the user may have great difficulty remembering all of the patch button sequences. A pre-programmable shortcut button is desired.
3. The transpose function display does not indicate the particular musical key of the transposition. Therefore, the user must remember this information, or discover it through trial and error.
Heuristic Category: Flexibility and Efficiency of Use
Focus: Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
Severity: 1
The QS8.1 is quite flexible in the enormous range of output sounds, and modulations
on the sounds that it may produce. However, in the live performance scenario, the efficiency
of the interface has much room for improvement.
1. Frequent button sequences or combinations that include more than two or three button presses could be simplified by allowing programmable macro buttons. This would increase efficiency by representing a complicated action with a single control.
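A programmable macro button of this kind would simply replay a stored sequence of control actions. The following sketch uses hypothetical names (there is no such API on the QS8.1) to show the mapping:

```python
# Sketch: one macro button replays a recorded multi-button sequence.
macros = {}  # macro button id -> list of (control, value) actions

def record_macro(button_id, actions):
    """Assign a sequence of interface actions to a single macro button."""
    macros[button_id] = list(actions)

def press_macro(button_id, send):
    """Replay the stored sequence through a send(control, value) callback."""
    for control, value in macros.get(button_id, []):
        send(control, value)

# Example: one press selects bank 3, patch 47, and clears the transpose.
record_macro("M1", [("bank", 3), ("patch", 47), ("transpose", 0)])
log = []
press_macro("M1", lambda control, value: log.append((control, value)))
print(log)  # [('bank', 3), ('patch', 47), ('transpose', 0)]
```

The performer's five-button patch change from the introduction would collapse to a single, rehearsable press.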
Heuristic Category: Aesthetic and Minimalist Design
Focus: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
Severity: 0
A satisfactory product is one that not only serves function, but has a desirable form.
The aesthetic value of an interface is largely subjective, and so it is difficult to define aesthetic
issues for the QS8.1. Assuming the perspective of the user persona for this analysis, we are
able to define the following issues:
1. The desired size of the LCD display, in aesthetic terms, depends on the environment. A smaller display creates an overall look closer to a traditional piano. In the live performance scenario, the user prefers a larger screen to assist visibility, and also to communicate to the audience the large amount of functionality of the synthesizer.
2. The information labels surrounding the LCD display are too small and thus difficult to read, especially in the low lighting of the environment.
3. The physical weight of the synthesizer can be overwhelming when regularly moving it for performance setup. The QS8.1 is a particularly heavy model.
4. The controls visible in the interface are numerous and, in some cases, poorly grouped according to Gestalt principles (like controls should be co-located and oriented appropriately). It is difficult to achieve a minimalist design in the controls while still maintaining the functionality and level of efficiency of this model.
Heuristic Category: Help Users Recognize, Diagnose, and Recover from Errors
Focus: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
Severity: 4
Help should be accessible from any point in the system, whether in response to an error or to explain functionality. However, the QS8.1 does not have an online help system. The only way for users to find the information they seek is to refer to the Alesis manual. This can cause serious problems for users, especially during live performance.
Heuristic Category: Help and Documentation
Focus: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Severity: 3
Documentation should clearly explain the functions of a system and how to use them
in common tasks, and should do so in an organized manner that facilitates efficient search.
This can be achieved through a good keyword index and through tabs, headers, and a table of
contents that also shows how functions are grouped. Overall, the Alesis manual for the QS8.1
meets these specifications. It provides detailed step-by-step help focused on familiarizing the
user with the functionality of the synthesizer. However:
1. The manual seems to be written for a target user who is using the synthesizer in a highly technological studio environment. This includes complex descriptions of sound output characteristics and interface capability with other studio devices. Therefore, the user who wishes only to use high-level functions for purposes of live performance must consciously filter the manual content to find the documentation that is of use.
2. Some of the language is not plain enough for the target user in our evaluation. Again, the manual seems to primarily address studio professionals.
Appendix D: Redesign Suggestion Figures
Figure D1: Illuminate Sound Patch Buttons
Figure D2: Show Controller Status and Label Functions
Figure D3: Salient Transpose Representation
Figure D4: Display Transpose Output Key
Figure D5: Hotkeys
Figure D6: Help Button and Undo Button