PERFORMANCE MANAGEMENT
Balanced Scorecard Report
How to Write Performance
Analysis That Truly Enhances
Decision Making
By David McMillan, Consultant; with Barnaby Donlon, Principal,
Palladium Group, Inc.
“You can’t manage what you can’t measure.” What performance management adherent doesn’t know that mantra? But
just presenting measurement data doesn’t give decision
makers enough information to manage with. Sadly, most
qualitative analysis in performance reporting falls short.
Business leaders depend on performance reporting to tell
them what’s going on at the front lines. Effective qualitative
analysis contextualizes and interprets performance data to
give managers a solid basis for sound decision making.
In the aftermath of the 2003 space
shuttle Columbia disaster, which
took the lives of seven astronauts,
investigators initially focused on
a small piece of foam insulation
that broke off the ship during
launch. But later, as they peered
deeper into the organizational and
technical questions, it became
clear that NASA’s risk-assessment
and decision-making processes
were sufficiently flawed to compromise the safety of the mission.
Though human life is rarely at
stake when leadership teams meet
to review their scorecard results,
they often suffer from a problem
similar to NASA’s: faulty analysis,
compounded by a less-than-objective (often politically driven)
decision-making process. The
consequences of flawed analysis—
the resulting decisions—may not
literally be a life-or-death matter,
but they very well might spell life
or death for an initiative, product,
or even an organization.
In our work with dozens of
organizations, we’ve observed
that analysis is often the weak or
missing link in the performance
reporting chain. As data collection
gets easier, many businesses
mistakenly believe that more
frequently available or better-quality data equals better decision
making. Wrong. Without contextualizing and interpreting performance data, organizations undermine
the potential effectiveness of
the reporting process—in effect,
making decisions with blinders on.
Most organizations generate
reports that simply display raw
data values for their performance
measures, perhaps including
the color-coded traffic lights that
indicate the level of success in
achieving targeted performance.
What commentary is provided
typically reiterates the data displayed in the graph, with little
explanation or context-setting.
The whole point of qualitative
analysis is to contextualize and
interpret performance facts and
figures to provide a basis for
sound decision making and action.
In reporting the quantity of widgets manufactured in Q2, say, you
would show the same period’s
output last year. Depicting year-over-year performance is a no-brainer in indicating performance
against targeted objectives. But
suppose one piece of equipment
was offline for eight days in May,
causing a slight dip in production.
Most likely that information was
not known by all decision makers.
But without it, they could draw
a false conclusion. Sound qualitative analysis, on the other hand,
would qualify the result by citing
this circumstance, even suggesting
that adding a shift temporarily
could recover the lost output.
Managers depend on performance
reporting to monitor operations
and progress toward goals. Senior
managers, removed from the
front lines—whether the customer
interface, the assembly line, or
the supplier relationship—depend
on performance reporting to
show them what they can’t see
firsthand. Effective qualitative
analysis puts the information into
perspective, helping decision
makers identify trends, looming
problems, improvements, potential risks, or shortfalls—enabling
them to assess their options and
take action promptly.
Luckily, it doesn’t take special
expertise to compose insightful,
actionable analysis. By following
a simple “reporter’s” approach
(asking “what?” “why?” and “what
next?”), organizations can convert
data into information, information
into insight, and insight into
actionable recommendations—
recommendations that will help
decision makers keep performance, and strategy, on track.
What Happened?
To write effective analysis, you
need accurate data, and enough
of it to get a sense of a trend, and
of whether that trend is normal
or abnormal. If seasonality matters,
you may need to adjust for it by
reporting year-over-year results;
comparing April this year with
April of last year will yield a
clearer picture than comparing
April to March. In reality, you
won’t always have the luxury
of several years of historical data.
But whether you have one or 10
years’ worth, you can still create
insightful analysis.
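To make the year-over-year comparison concrete, here is a minimal Python sketch (using pandas); the monthly output figures are invented for illustration, with the May dip echoing the equipment-outage example above:

```python
import pandas as pd

# Hypothetical monthly widget output (units); all figures are invented for
# illustration. The May 2008 value reflects an eight-day equipment outage
# like the one described in the text.
months = pd.date_range("2007-01-01", periods=24, freq="MS")
output = pd.Series(
    [410, 395, 430, 442, 388, 451, 460, 455, 470, 480, 465, 490,   # 2007
     425, 405, 445, 450, 362, 468, 475, 470, 488, 495, 480, 505],  # 2008
    index=months,
)

# Compare each month with the same month a year earlier. This adjusts for
# seasonality better than comparing, say, April with March.
yoy_change_pct = output.pct_change(periods=12) * 100

# May 2008 vs. May 2007 shows the dip caused by the outage (about -6.7%).
print(round(yoy_change_pct["2008-05-01"], 1))
```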
Graph the measure and target
data values so they can be easily
compared, and it will be obvious
whether performance results were
favorable or unfavorable. Some
targets may be expressed as a
range, by necessity or at the
manager’s discretion. A single
target, however, is preferable for
comparison. With a range, it’s
more difficult to discern whether
the performance was favorable
or unfavorable—and to gain
management’s attention to focus
on improvements.
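One way to make the actual-versus-target comparison mechanical is to compute the familiar traffic-light status from a single-point target, as in this small Python sketch; the 95% amber threshold is an assumption for illustration, not a standard:

```python
def traffic_light(actual: float, target: float, amber_ratio: float = 0.95) -> str:
    """Classify performance against a single-point target.

    The 95% amber threshold is an illustrative assumption; organizations
    set their own tolerance bands.
    """
    if actual >= target:
        return "green"   # met or exceeded target
    if actual >= amber_ratio * target:
        return "amber"   # close to target, worth watching
    return "red"         # clearly unfavorable, needs attention

# Example: quarterly revenue of $4.9M against a $5.1M target.
print(traffic_light(4.9, 5.1))  # amber
```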
Effective visual representation
takes practice, and amateur
graphic designers can easily and
unwittingly create misleading
graphics. So take the time to
create a graph that clearly communicates your message, starting
with the title. The x-axis should
show the time series, so that a
trend will be visible, and the
y-axis should display the units
of measurement along a clearly
marked scale. And be sure to
select the right chart type. A
stacked bar chart is much more
useful than a pie chart since it
can show both a breakdown
and a trend. Avoid including too
many variables. This will make
the graph too complicated and
confusing—or worse, obscure
the pertinent message.
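As a rough illustration of these charting guidelines, the following Python (matplotlib) sketch plots actual against target values over a time series, with a message-bearing title and labeled axes; the data values are invented:

```python
import matplotlib.pyplot as plt

# Invented quarterly figures (in U.S. $ millions), loosely echoing Figure 1;
# they are illustrative only.
quarters = ["Q1 '08", "Q2 '08", "Q3 '08", "Q4 '08"]
actual = [5.2, 4.9, 4.5, 4.1]   # reported performance
target = [5.0, 5.1, 5.2, 5.3]   # single-point targets, easier to compare than a range

fig, ax = plt.subplots()
ax.plot(quarters, actual, marker="o", label="Actual")
ax.plot(quarters, target, marker="s", linestyle="--", label="Target")

# The title carries the message; the x-axis shows the time series so the
# trend is visible; the y-axis shows the units on a clearly marked scale.
ax.set_title("Revenue from Repeat Customer Sales (in U.S. $ millions)")
ax.set_xlabel("Quarter")
ax.set_ylabel("Revenue ($M)")
ax.legend()
plt.show()
```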
The commentary that accompanies
your charts should ground the
audience in the topic by stating
up front, clearly and concisely,
exactly what the reader should
be seeing when looking at the
graphic. This helps prevent
misreading or misinterpretation.
For the chart shown in Figure 1,
you might say, “Our revenue
from repeat customers has been
decreasing steadily over the
past year, despite a target of mild
revenue growth.” With this single
sentence, you communicate the
trend in performance compared
to a target value over time. From
this introduction it will be easy
to transition to the more substantive discussion of performance
in the context of the internal and
external environment, risks, and
opportunities that helps generate
insights.
The past may not always be
prologue, but analyzing what
happened can provide clues to
future performance. How does
performance compare to last
quarter’s? Last year’s at this time?
Is the trendline rising or falling?
What’s the outlook? Written
commentary should complement
and expand on the illustrated
data. Aim for high-value, high-impact content—and brevity.
Some experts believe that analysis
should address only the measure
in question. Others (myself
included) believe that it should
address the entire objective. The
scope of a business objective
often extends beyond the performance that can be tracked
by its associated measures. For
example, for the objective “Build
customer loyalty” (the objective
encompassing our Figure 1
measure) you may want to know
the long-term potential of smaller
customers because it has implications for your marketing strategy.
No metric underlies it, so it’s
harder to analyze, but additional
research may support your speculation. Therein lies the value of
contextualizing.

Figure 1. Revenue from Repeat Customer Sales (in U.S. millions)
[Line chart comparing Actual results with Target values by quarter, Q1 ’08 through Q4 ’09, on a scale from $3.0 to $5.5 million.]
Note that targets are shown for the upcoming four quarters. We discourage organizations from reporting target data values for the first time when they report actual performance data. It’s important to present a forward-looking view of where the organization expects to take performance.
Why Did It Happen?
Identifying the causes of performance is the heart of valuable
analysis—it’s the critical information that enables substantive,
worthwhile discussion and action.
Without a plausible explanation of the data, a decision
maker is like a carpenter with
a hammer but no nails. Yet
determining those causes can be
challenging. It involves studying
both the internal environment
(e.g., activities and changes within
the organization) and the
external environment (e.g., events,
competitors’ moves, and trends in
the industry and broader business
environment).
To illustrate, let’s look at our
objective, “Build customer loyalty,”
and its component measure,
“revenue from repeat customer
sales.” A decline in sales last quarter could be the result of a number
of factors, internal or external.
What do the measure’s supporting
drivers tell you about the reasons
for the performance? Figure 2
(next page) shows a simplified
driver model for this measure.
Two primary drivers, “customer
purchase intent” and “product
value,” break down further into
quantifiable drivers. An uptick
in repeat customer sales, for
example, might be caused by a
new technology need (perhaps
triggered by a new regulation).
It might be driven by heightened
buying power from major customers in key segments, something that an industry survey on
customer spending trends might
indicate. Customer loyalty could
also be a factor, which customer
satisfaction surveys would reveal.
Product value breaks down into
two component drivers: functionality and pricing, both of which
could be examined through third-party surveys, product reviews,
or customer satisfaction surveys.

Figure 2. Driver Model for “Revenue from Repeat Customer Sales”
[Driver tree: “Revenue from repeat customers” is driven by “Customer purchase intent” (fed by loyalty, buying power, and technology need) and “Product value” (fed by functionality and pricing).]
A driver tree offers a logical, systematic approach to analyzing the factors driving
performance—information that’s crucial to the “why?” portion of the qualitative analysis.
While driver models are useful in
performance analysis, you needn’t
develop detailed driver trees for
every measure. But you do need
to identify the most significant
causes of performance because
over time they will prove to be
useful guideposts for gathering
supporting information—and for
raising questions that might help
those who use the analysis.
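For measures where a driver model is worth the effort, it can help to capture the tree in a simple structure and walk it when gathering supporting information. A minimal Python sketch, with driver names taken from Figure 2 and the evidence sources assumed for illustration:

```python
# A driver tree captured as a nested dict. Driver names follow Figure 2;
# the evidence sources are illustrative assumptions, not prescriptions.
driver_tree = {
    "Revenue from repeat customers": {
        "Customer purchase intent": {
            "Loyalty": "customer satisfaction surveys",
            "Buying power": "industry surveys on customer spending trends",
            "Technology need": "regulatory and technology news",
        },
        "Product value": {
            "Functionality": "third-party surveys and product reviews",
            "Pricing": "competitive pricing comparisons",
        },
    },
}

def list_evidence(tree: dict, depth: int = 0) -> None:
    """Walk the driver tree, printing each driver and where to look for evidence."""
    for driver, detail in tree.items():
        if isinstance(detail, dict):
            print("  " * depth + driver)
            list_evidence(detail, depth + 1)
        else:
            print("  " * depth + f"{driver}: check {detail}")

list_evidence(driver_tree)
```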
What internal changes might be
affecting performance? Examine
the internal factors first; these
are the most knowable and most
controllable in terms of making
improvements. Business drivers
related to the measure may
reflect specific challenges, such
as declining employee morale,
budget cuts, the fact that a key
account person left, or that the
company’s marketing budget
was slashed across product lines.
Initiatives and other projects are
also constantly altering the landscape of performance. When
in doubt, consult subject matter
experts within your organization
(including functional area managers) for their insights. You can
also refer to internal communications (such as memos and operational reports) for additional clues
that might explain performance.
What external factors are at play?
A competitor may have gained
traction by releasing a new
product addressing the same
need. That may explain a
dip in sales. Maybe your product
just received a great review in
an industry magazine, which
would augur improved sales
next quarter. Product growth is
not necessarily a zero-sum game;
perhaps the market space is
growing, allowing for continued
growth for you and your competitors. Having a clearer idea of
the most likely factors is essential
before any action, remedial or
otherwise, can be taken.
To understand external forces—
competition, overall market
trends, and other factors—and
their impact on performance,
seek resources such as industry
publications, customer feedback,
general market trend reports,
and news reports. Understanding
the impact of the external environment will also lead to a better
understanding of the risks and
opportunities your organization
faces; for example, “Despite
positive performance this past
quarter, we may see a dent in
North American sales in the next
quarter owing to the cancellation
of a major national conference
that has served as an important
sales and marketing opportunity.”
Once again, tap your subject
matter experts; they can help
prioritize the reference sources
to consult. A marketing person
could help identify key surveys
that might shed light on repeat
customer sales trends. An R&D
person would know which industry publications or blogs would
provide reliable information
about new ancillary products.
It is not always possible to know
unequivocally why a measure
performs one way versus another.
The point of analysis, however,
isn’t to seek the one answer—it’s
to provide informed hypotheses
that will aid in discussion. Don’t
try to represent hypotheses as
fact; rather, state them as possible
explanations for performance
and justify them appropriately.
What Are We Going to
Do About It?
Good performance analysis
should inspire appropriate action.
So be prepared to offer suggestions about how best to improve
performance or perpetuate success.
This requires not just an understanding of the current situation
and performance gaps, but also of
risks and opportunities. Building
on this knowledge involves identifying options and weighing them
based on relative constraints,
costs, and benefits as you develop
the most viable recommendation.
Recommendations can take many
forms, from a minor follow-up
task to a major initiative designed
to close a performance gap—or
simply maintaining the status quo.
Suppose a survey of your online
customers revealed a 10% drop
in satisfaction in the shopping
experience in the past quarter.
Digging deeper, you discover
two likely reasons: slow site
navigation (noted by 42% of
the “unsatisfieds”) and a confusing checkout process (cited by
38% of the “unsatisfieds”). You
might recommend that the web
design team brainstorm to devise
improvements and that marketing
and tech conduct a focus group
to identify opportunities for
improvement in both problem
areas. Include a proposed timeline
to ensure decisions are followed
through: “Both the brainstorming
session and the focus group
should be held within the next
six weeks.”
All recommendations should be
action-focused. Recommending
further analysis is not wise.
For one thing, it’s not an action.
In addition, it not only delays
decision making but also suggests that
you didn’t do your job the first
time. And remember that recommendations are not meant to
carry authority or finality. We’ve
known performance analysts who
felt they had no right to make
recommendations—or whose
managers expressed discomfort
over allowing them to make any.
One in particular decided the
senior management team should
simply come up with recommendations during review meetings.
Our response: “How much more
productive might your meetings
be if the analysts provided a
springboard to help stimulate
your thinking and decision making—and to use your limited
time more efficiently?”
The analysis should communicate
the risks, both internal and external, to future performance, as well
as what the organization is doing
to mitigate those risks. It’s equally
important to discuss current or
future opportunities and what the
organization is doing—or could
do—to take advantage of them.
Example: noting that a competitor
with a strong customer base is
owned by a man who is looking
to sell his company or retire.
A common external risk for
retailers, especially in a difficult
economy, is customer price
sensitivity, where customers place
value above brand. To mitigate
that risk, the analysis might recommend creating special offers
for repeat customers. Consider a
software producer that must, for
manufacturing reasons, delay the
release of its latest update (an
internal risk). The analysis might
propose that the sales division
issue a communication to repeat
customers along these lines:
“We have postponed our release
until [date] because we wanted
to incorporate the added functionality that you, our customers,
recently requested.” Here, you’ve
converted risk into opportunity.
Cultural Resistance and
Due Diligence
In some corporate cultures,
communicating bad news is
verboten. Yet although no
manager wants bad news, failure
without warning is far worse. It’s
not easy, but every organization
must strive to encourage candor—
objective assessment, based on
relevant facts, and done not as
finger-pointing but in the spirit
of common cause. Reality checks,
through good analysis, enable
proactive decision making that
can mitigate or arrest failure—
or yield important improvement.
Develop a network of information
sources, both internal and external,
to provide the important information that colors and shapes
performance data. Don’t go overboard, but gather a good cross-section from reliable sources to
help you establish cause and
effect. And since performance
measures are tracked frequently,
the analysis models you develop
can be reused every reporting
cycle.
An Element of Competitive
Advantage
Measurements must be more than
a “nice to know” commodity if
performance reporting is to have
value. A combination of clearly
represented graphical data and
qualitative analysis that provides
context, plausible explanations,
outlooks, and recommendations
will foster the discussion needed
to drive strategic performance—
helping assess progress and drive
healthy decision making. It’s a
critical element of your business
intelligence—and competitive
advantage.
TO LEARN MORE
For more on performance reporting, see “The How-to’s of BSC
Reporting,” Parts I and II, BSR
July–August 2003 (Reprint
#B0307E) and BSR November–
December 2003 (Reprint #0311E),
respectively.
Many other relevant articles on
performance review and reporting
have appeared in BSR. Consult
the BSR Index, available free via
download at www.executionpremium.org.
Boost your visual design skills with
Say It With Charts: The Executive’s
Guide to Visual Communication,
by Gene Zelazny, Director of Visual
Communications at McKinsey &
Company. Or consider Envisioning
Information or The Visual Display
of Quantitative Information, by
Edward R. Tufte. Tufte, a statistician and political scientist, taught
courses at Yale in statistical
evidence, information design,
and interface design.
Reprint #B0811C