
Transcription

Implementing a Petroleum Data Governance
Program Using the CMMI Model
Alan Rezazadeh
Digital Oil Consulting
Data Governance Analyst
Organization Overview
Study Subject is an Oil and Gas Producer Consisting of:
• Exploration, Drilling, Operations and Production
Scope of Engagement
• SAP (Finance, Plant Maintenance and HR Modules)
• Defining and Implementing Data Governance
• Operational Sustainment
• Capability and Maturity Assessment
• Roadmap for Increasing Maturity Level
Business Challenge
Newly Implemented SAP Requires Data
Governance to:
• Clarify Roles & Responsibilities and Decision Rights
• Standardize Data Procedures
Organizational Challenges
• Multiple Departments at Different Data Management
Maturity Levels
• Inconsistent Cross-Domain Processes for Data Entry and
Use
The Plan
Data Governance Design and Implementation
• Create Data Procedures and a Governance Portal
• Train Personnel using the new Data Procedures
Data Governance Sustainment
• Assess Data Governance Maturity Level (CMMI model)
• Design KPIs and Metrics for Maturity Level Monitoring (see the sketch after this list)
• Plan Roadmap for Reaching Objective Capability and
Maturity Levels
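A minimal sketch of how such KPIs might roll up into a maturity-monitoring indicator. The KPI names, targets, and the pass-fraction scoring are illustrative assumptions, not the program's actual metrics:

```python
# Hypothetical rollup of data governance KPIs into a maturity indicator.
# KPI names, targets, and weights are illustrative assumptions.

KPI_TARGETS = {
    "procedure_adoption_pct": 90.0,     # % of staff trained on the data procedures
    "data_entry_compliance_pct": 95.0,  # % of records entered per standard
    "issue_resolution_days": 10.0,      # avg days to close a data issue (lower is better)
}

def maturity_score(measurements: dict) -> float:
    """Return the fraction of KPIs currently meeting their targets."""
    met = 0
    for kpi, target in KPI_TARGETS.items():
        value = measurements[kpi]
        # For the duration KPI, "meeting target" means at or below it.
        ok = value <= target if kpi.endswith("_days") else value >= target
        met += ok
    return met / len(KPI_TARGETS)

print(maturity_score({"procedure_adoption_pct": 92,
                      "data_entry_compliance_pct": 88,
                      "issue_resolution_days": 7}))  # 2 of 3 met -> 0.666...
```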
The Results
• Successful Adoption of Data Governance
Processes by Business
• Maturity Assessment Presented a Clear Picture of
Current Status and Fostered Cooperation Among
Participating Departments for Improvement
• Maturity Assessment and Improvement Roadmap
Presented a Clear Path Forward
Lessons Learned
• Consider Maturity Assessment and Improvement
from Early Stage of Project
• Design Organization and Framework Considering
the KPIs and Metrics
• Design the KPIs and Metrics to Support Maturity
Assessment
• Use the Maturity Assessment and Improvement
Roadmap for Gaining Management and
Stakeholder Support and Showing Value
Metrics Musings at All Maturity Levels
Chris Laney, CGI Executive Consultant
Julie Stein, CGI US Emerging Markets
Director of PMO/Quality/Security,
SCAMPI(SM) Lead Appraiser
“Statistical thinking will one day
be as necessary for efficient
citizenship as the ability to read
and write.”
– H.G. Wells
CGI Overview
• CGI Overview
• Corporate Performance
• Quality, Risk Management
• US Enterprise Markets PMO/Quality/Security
• US Federal Healthcare BU
Agenda
• It’s all Greek to us
• It’s a profession, not a hat!
• We have to make money?
• People love belts
• Size matters, but, oh, it’s impossible
• Understanding variation and stats
• A picture is worth a thousand Metrics
It’s all Greek to us
Staffing the Metrics role
• Quality is not a ‘dumping ground’; it is vital to the
organization
• Metrics Role = Estimator, Risk Manager, Deputy MM,
PPQA’s Helper, Statistician, Accountant, Change Manager,
Database Administrator, Tool Developer, Process
Consultant, etc.
• Some good backgrounds (mix them up):
• Software Developer, DBA, Analyst
• Process Consultant
• Industrial Engineer
• Screen Actor?
It’s a profession, not a hat!
Required knowledge and training
• IT Metrics, Estimation Methods & Models
• Statistical Methods, Lean Six Sigma
• Data Extraction, Manipulation (e.g., MSOffice trickery)
• Graphical Methods
• Benchmarking
• Psychology
• CMMI, Company Quality Policies
• Software Development (e.g., lifecycles, methods)
We have to make money?
Engaging everyone from top to bottom
• Metrics professionals must:
• Make Senior Managers care
• Make Project Managers care
• Make Team Members care
• Make Clients care!
The only thing I want to do is stuff with people who care about
what they’re doing, which sounds obvious, but it’s really not.
~ Joseph Gordon-Levitt
People love belts
Tying the Metrics Program to Process Improvement
• Lean Six Sigma Program at CGI Federal
• Boosting Resumes
• Everyone has improvement efforts
Size matters, but, oh, it’s impossible
How to get Size Metrics right
• Background: Unique problem to software
• Functional vs. physical sizing
• Nonfunctional measures – COSMIC, SNAP
• Will we ever get it right?
• Practical experiences
Understanding variation and stats
What methods work and why
• Statistical Process Control (SPC) in the IT Industry
• “I found the trend!”
• Do you always need the perfect statistical test?
Do you always need one at all? (Next Slide)
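As one concrete example of SPC applied to IT data, here is a minimal XmR (individuals and moving range) sketch; the cycle-time values are made up, and 2.66 is the standard XmR scaling constant:

```python
import statistics

def xmr_limits(values):
    """Individuals (X) chart limits from the average moving range.
    UCL/LCL = mean +/- 2.66 * mean moving range (standard XmR constant)."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.fmean(mr)
    x_bar = statistics.fmean(values)
    return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

cycle_times = [12, 14, 11, 15, 13, 29, 12, 14, 13, 12]  # illustrative data
lcl, center, ucl = xmr_limits(cycle_times)
signals = [v for v in cycle_times if not lcl <= v <= ucl]
print(f"center={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}), signals={signals}")
```

Points outside the computed limits are signals worth investigating; everything inside is routine variation, not "a trend."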
A picture is worth a thousand Metrics
The importance of Graphical Techniques
• Whenever possible, replace numbers with:
• Time-series graphs
• Bar graphs
• Pareto Diagrams
• Histograms
• Scatterplots
• Boxplots
[Example charts on the slide: "Cycle Time per Volume (F2A-H)", "Defect Rate by Methodology" (Expedited / OOR / Standard), "DCRs per KEDH", a cycle-time histogram, and "Guided Volume".]
• A few principles:
• No clutter; just substance
• Proportional scale
• Clear, thorough labels
• Show history, context
• Great Visualization – Tuftean Movement – “Whatever is needed”
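A small matplotlib sketch of those principles applied to a time-series chart; the data values are illustrative only:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
cycle_time = [140, 128, 122, 118, 121, 110]   # illustrative values

fig, ax = plt.subplots()
ax.plot(months, cycle_time, marker="o", color="black")
ax.set_title("Cycle Time by Month")        # clear, thorough labels
ax.set_ylabel("Cycle time (hours)")
ax.set_ylim(bottom=0)                      # proportional scale, no truncated axis
for side in ("top", "right"):              # no clutter; just substance
    ax.spines[side].set_visible(False)
plt.show()
```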
Questions?
No Money, No Time, Need Results: Now What?
Bootstrapping Your CMMI Program
Richard Bechtold, PhD
Abridge Technology
President
Organization Overview
Standard “Small Business” Client Organization
• 25 or fewer FTEs
• Contractor to the government (or more typically, as a
subcontractor)
• Usually 1 or 2 very strong personnel in one or more of
• Management
• Technical / Services
• Support Areas (HR, Accounting, etc.)
• Business Development
• Considerable amount of ‘client-side’ constraints (such as
staff augmentation work)
• Highly competitive contract environment
Business Challenge
Typical Challenges
• Very limited budget
• Need results in 3 months (or less)
• General rank-and-file resistance to top-down change
• Considerable “corporate knowledge”, but very little of it
has been captured
• Significant risk of single points of failure
• Little or no experience w/ framework compliance
• Ambiguity and confusion regarding distinctions between
policy, requirements, standards, guidelines, etc.
The Plan
Implementation Strategies
• Don’t start over—build from what you have
• Remove (or relocate) what isn’t essential
• Leverage reliable sources of best practices (CMMI, ISO,
Agile, Scrum, IEEE, COBIT, etc.)
• Implement the easiest areas first
• Prioritize by “highest likelihood of success”
• See results w/in 30 days
• Avoid “great claims” -- only acknowledge measurable victories
The Results
Typical Outcomes
• Rapidly functional “pocket PAs” (RSKM, DAR, MA)
capturing and retaining corporate knowledge
• Improved clarity of required vs. optional
• Improved processes, AND improved attitudes
• Reduced (eliminated?) false dichotomy between process
improvement and business development
• Visible and objective data for marketing process-related
efforts (and victories) as competitive advantage
• Increasing intrinsic understanding of benefits—even when
no longer a contract requirement
Lessons Learned
Essential Principles
1. Start with what you have; augment w/ published best practices
2. Seek very brief “version 1” development and deployment
3. If it looks like it’ll cost too much, you are approaching it wrong
4. Trust your instincts: if it makes no sense, there’s another
approach
5. Collect, analyze, and report clear “victory data”
6. Connect/blend/merge process efforts with business
development efforts
7. For real: focus on improving each person’s work-life
NOTE: Full presentation available at: www.abridge-tech.com
Why Are Two Constellations Better Than One?
(CMMI-DEV ML5 + CMMI-SVC ML5)
María Julia Orozco Mendoza
Ultrasist, Chief Operations Officer
Alejandro A. Ramírez Ramos
Ultrasist, Governance, Architecture & QA Leader
Organization Overview
• Ultrasist is a Mexican Enterprise with more than 20 years of
successful experience in IT Consulting and Software
Development.
• As a result, Ultrasist has operated and been appraised as a high maturity organization for more than ten years (CMMI-DEV ML4 since 2004, CMMI-DEV ML5 since 2009 & CMMI-SVC ML5 since 2015).
• Ultrasist is a medium sized company, focused on the delivery of
quality products, based on quality processes, with the flexibility,
adaptability and scalability to tackle the biggest challenges
within the software and IT industries.
Business Challenge
• We signed a big multi-year contract for our software factory, based on our organizational performance baseline (a program with several incremental projects in it)
• Early in the projects, we identified process capability issues
• Some of our most specialized phases (architecture, security
development, transition) started to take longer than estimated
• Our process performance models predicted that we were not going
to meet our goals
• We had a national security contract with little room for negotiation, and without clear visibility into or control of the target environment.
• We urgently needed to improve our process performance
Business Challenge
Capability issues …
[Minitab output: "Process capability of Development (deviation in effort)" with an individuals chart (UCL = 2.085, mean = 0.865, LCL = -0.355), a moving range chart (UCL = 1.499, mean MR = 0.459), a capability histogram, and a normal probability plot (AD: 0.353, P: 0.448). Specifications: target 1.0, USL 1.1. Within: StDev 0.4066, Cpk 0.19, PPM 281,863.59. Overall: StDev 0.4914, Ppk 0.16, Cpm 0.07, PPM 316,409.49.]
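For readers without Minitab, a rough sketch of how the within-capability index shown above can be reproduced: sigma-within is estimated from the mean moving range (d2 = 1.128 for subgroups of two), and Cpk is one-sided here because the chart defines only an upper specification limit. The function and sample data are our own, not the project's:

```python
import statistics

def capability(values, usl, lsl=None):
    """Within-capability index from individuals (XmR) data.
    sigma_within = mean moving range / d2, with d2 = 1.128 for n=2."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = statistics.fmean(mr) / 1.128
    mean = statistics.fmean(values)
    cpu = (usl - mean) / (3 * sigma)
    if lsl is None:
        return cpu                        # one-sided Cpk, as in the chart above
    cpl = (mean - lsl) / (3 * sigma)
    return min(cpu, cpl)

effort_dev = [0.4, 1.2, 0.9, 1.5, 0.3, 1.1, 0.8]   # illustrative effort deviations
print(round(capability(effort_dev, usl=1.1), 2))    # -> 0.12
```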
The Plan
1. Perform a root cause analysis to identify the underlying
cause
2. Identify and analyze alternative solutions
3. Execute the selected solution in the program / remove
root causes
4. Evaluate the results
5. Implement the solution at organizational level
1.The Analysis (CAR)
PROBLEM: We were not meeting our process performance objectives for some kinds of projects
1 - Why?
The specialized phases (architecture, security, non-functional testing,
transition) were taking more time than estimated
2- Why?
The agile development teams were not enough to tackle this challenge
3 - Why?
We didn't have enough specialists assigned to the program
4 - Why?
It was not cost-effective to hire too many specialists
5 - Why?
• We hadn’t identified the minimum, maximum and average use of the specialists
• The specialists were more expensive than regular developers and there was too much idle time
1.The Analysis (CAR)
PROBLEM: We were not meeting our process performance objectives for some kinds of projects
Conclusion
Those specialized phases
(architecture, security, non-functional testing, transition)
were actually behaving like a service
2.The Solution
PROBLEM: We were not meeting our process performance objectives for some kinds of projects
We decided to use the CMMI for Services constellation
3.The Implementation
SD, CAM,
WP & WMC
CMMI SVC
We implemented a pilot with the following:
• Encapsulated the specialized phases as a service
• Analyzed the demand, resources, capacity and performance needed for our services
• Established the SLAs
• Updated our service request system to include those services
• Updated our estimation techniques
• Extended our planning, monitoring and control capabilities from our dev projects to our services
3.The Implementation
Optimizing the resource usage, based on the value of the
expected demand …
3.The Implementation
SCON, SST,
IWM, IRP
SD, CAM,
WP & WMC
CMMI SVC
• We identified the essential functions and the resources that support them
• We defined a strategy and a plan based on that analysis to ensure the continuity of the service
• We tailored our process to integrate the appropriate use of the services
• We planned, developed and deployed the transition of our systems
• We extended our incident management system to include the services’ incidents
3.The Implementation
QWM, STSM
SCON, SST,
IWM, IRP
SD, CAM,
WP & WMC
CMMI SVC
• We quantitatively evaluated the results of the services’ implementation
• Based on the execution results, we decided to include these services in our catalog of organizational standard services
• We formalized the attributes and SLA of each service
4.The Evaluation
QWM, STSM
SCON, SST,
IWM
SD, CAM,
WP & WMC
• Through a statistical experiment we found that the process performance mean had changed, showing a true benefit
CMMI SVC
5. Deploy at Organizational Level
QWM, STSM
SCON, SST,
IWM
PP, PMC, IPM,
QPM,
RD, TS, PI,
VER, VAL
SD, CAM,
WP & WMC
• CMMI DEV & CMMI SVC co-existence
CMMI SVC
CMMI DEV
5. Deploy at Organizational Level
CMMI-SVC PAs
CMMI-DEV PAs
• CMMI DEV & CMMI SVC co-existence
CMMI SVC
CMMI DEV
The Results
[Minitab output: "2-sample standard deviation test (effort) for (DEV) vs (DEV + SVC), summary report." Is the StDev of "Desv Horas A" (before) greater than that of "Desv Horas D" (after)? n = 37 in each sample. Desv Horas A: mean 0.86721, StDev 0.48767, 90% CI (0.4215, 0.5905). Desv Horas D: mean 1.0002, StDev 0.025479, 90% CI (0.0215, 0.0316). P = 0.000: the standard deviation of Desv Horas A is significantly greater than that of Desv Horas D (p < 0.05). Comments: the red intervals in the comparison chart indicate that the standard deviations differ; consider the size of the difference to determine whether it has practical implications, and check the data distributions for unusual data before interpreting the test results.]
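A hedged sketch of a comparable test: a one-sided F-test on the variance ratio. Minitab's 2-sample standard deviation test uses a more robust procedure, so p-values will not match it exactly, but the conclusion pattern is the same. The data below are invented to mimic a wide "before" and narrow "after" spread:

```python
import numpy as np
from scipy import stats

def sd_greater_test(a, b):
    """One-sided F-test of H1: stdev(a) > stdev(b). Assumes near-normal data."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    f = a.var(ddof=1) / b.var(ddof=1)            # ratio of sample variances
    p = stats.f.sf(f, a.size - 1, b.size - 1)    # upper-tail p-value
    return f, p

before = [0.3, 1.4, 0.6, 1.9, 0.2, 1.1]          # illustrative, wide spread
after = [0.98, 1.02, 1.00, 0.99, 1.01, 1.00]     # illustrative, narrow spread
f, p = sd_greater_test(before, after)
print(f"F = {f:.1f}, p = {p:.6f}")               # tiny p: spread clearly reduced
```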
The Results
[Minitab output: "Process capability of Development with Services (deviation in effort)" with an individuals chart (UCL = 1.0938, mean = 1.0015, LCL = 0.9091), a moving range chart (UCL = 0.1134, mean MR = 0.0347), a capability histogram, and a normal probability plot (AD: 0.260, P: 0.704). Specifications: LSL 0.9, target 1.0, USL 1.1. Within: StDev 0.03078, Cp 1.08, Cpk 1.07, PPM 1172.04. Overall: StDev 0.02894, Pp 1.15, Ppk 1.14, Cpm 1.15, PPM 558.78.]
The Results
• We successfully integrated the CMMI-SVC constellation to
our established process, solving our process capability
issues
• We improved the customer satisfaction for our national
security projects
• The process performance improvement boosted the ROI
Lessons Learned
• The high maturity practices helped us identify that CMMI-DEV was not enough for some kinds of projects
• The use of CMMI-SVC helped us to focus on the missing
pieces
• Capacity and Availability Management
• Service Continuity
• Strategic view of our services
• Accuracy of project and service estimation
• The CMMI-DEV maturity at Ultrasist provided a solid
foundation for the SVC adoption
• As an outsourced software factory, SVC provided a new strategic way to deliver our work, to the benefit of our clients
Budgeting, Estimation, Planning,
#NoEstimates and the Agile Planning Onion
Thomas M Cagley Jr
DCG Software Value
VP Consulting
All Budgets, Estimation and Plans Are . . .
Fantasies
. . . more or less
Why Do We Budget, Estimate and Plan
Why centers on answering four very basic questions:
– When will “it” be done?
– How much will “it” cost?
– What is “it” that I will actually get?
– What can I afford?
Budgeting v Estimation v Planning
Budgeting:
• How much money should I allocate?
• Which projects or products should we fund?
• Which projects will return the greatest amount of value?
Estimation:
• What can be delivered?
• When can we deliver?
• How should teams be allocated?
Planning:
• What tasks need to be completed?
• Who needs to complete specific tasks and when?
The Cone of Uncertainty
[Figure: the cone of uncertainty across the project timeline. Estimates range from 4x to 0.25x of the actual at budgeting time, narrowing through 2x/0.5x, 1.5x/0.67x, and 1.25x/0.8x toward the actual ("good guess") as work moves from budgeting through estimation to planning, phases P1 through P5.]
All work has a cone of uncertainty
– Sales, Testing, Messaging
All: different steps and widths!
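A toy illustration of reading the cone: multiply a point estimate by phase-dependent low/high factors. The factor table uses the commonly published cone values shown in the figure; real widths vary by domain, as the slide notes:

```python
# Illustrative cone-of-uncertainty factors (low, high) by phase.
CONE = {
    "P1": (0.25, 4.00),   # budgeting: widest uncertainty
    "P2": (0.50, 2.00),
    "P3": (0.67, 1.50),
    "P4": (0.80, 1.25),
    "P5": (1.00, 1.00),   # actuals: the cone has closed
}

def estimate_range(point_estimate, phase):
    low, high = CONE[phase]
    return point_estimate * low, point_estimate * high

print(estimate_range(1000, "P1"))  # (250.0, 4000.0) hours at budgeting time
print(estimate_range(1000, "P4"))  # (800.0, 1250.0) hours during planning
```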
Budgeting
• Primary Stakeholders:
Executives, Finance, Deal Makers
Estimation
• Primary Stakeholders:
Project Managers, Team Members, Middle
Management
Planning
• Primary Stakeholders:
Team Members, Project Managers
Planning Flow In Agile
Mike Cohn’s Planning Onion
Classic Budgeting, Estimation and Planning
[Figure: the same cone of uncertainty, spanning the budgeting, estimation, and planning phases.]
– The budgeting process is a forecast that helps make decisions about which pieces of work are to be done.
– The estimate (believed to be more accurate) makes a projection about when the work will be completed (addresses the four questions).
– Planning provides tactical, task-level guidance.
Budgeting Answers
How much money should I allocate for software development,
enhancements and maintenance?
Which projects or products should we fund?
Which projects will return the greatest amount of value?
Most organizations have a portfolio of work that is larger than they can
accomplish, therefore they need a mechanism to prioritize.
Defining An Estimate
Targets
– Statement of a desirable business objective
– Example: Taxes must be paid by April 15th
Commitments
– A promise to deliver
– Example: I promise not to leave my taxes until the last minute
Estimates
– A prediction
– Example: Preparation of my taxes will require many pots of coffee, a large
part of a bottle of aspirin, an internet connection and around two
weekends to complete.
Impact of Ineffective Estimating
Impact of major schedule slippage is often
dramatic:
1. Unrecoverable revenue losses
2. Not first to market
3. Public failure
4. Possible legal repercussions
Corporations are more significantly impacted by schedule pressures than
any other factor
Does The Past Predict The Future?
A story of childish hijinks, black paint, a truck and straight roads.
Sometimes prediction depends on the context.
Estimation Pathologies Via Jim Benson
The three -level process described above, if misused, can cause several
team and organizational issues. Proponents of the #NoEstimates
movement often classify these issues as estimation pathologies. Jim
Benson, author of Personal Kanban, established a taxonomy of
estimation pathologies that includes:
– Guarantism – a belief that an estimate is actually correct.
– Swami-itis – a belief that an estimate is a basis for sound decision
making.
– Craftosis – an assumption that estimates can be done better.
– Reality Blindness – an insistence that estimates are prima facie
implementable.
– Promosoriality – a belief that estimates are possible (planning
facility)
#NoEstimates Reflects A Continuum of Thought
Woody Zuill (#NoProjects end of the continuum):
• Break work down into small chunks
• Assemble a minimum viable product (MVP) for feedback
• Generate continuous feedback and re-planning
Vasco Duarte (throughput end of the continuum):
• Break work down into small chunks
• Continuously measure throughput
• Average throughput used for forecasting (see the sketch below)
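A minimal sketch of the throughput-based forecasting Duarte describes: resample observed weekly throughput (Monte Carlo) until the backlog is exhausted, then read off percentile completion weeks. The backlog size and throughput history are invented:

```python
import random

def forecast_completion(backlog, weekly_throughput_history, trials=10000):
    """Monte Carlo forecast in the #NoEstimates style: resample observed
    weekly story throughput until the backlog is exhausted."""
    weeks_needed = []
    for _ in range(trials):
        remaining, weeks = backlog, 0
        while remaining > 0:
            remaining -= random.choice(weekly_throughput_history)
            weeks += 1
        weeks_needed.append(weeks)
    weeks_needed.sort()
    return weeks_needed[int(trials * 0.5)], weeks_needed[int(trials * 0.85)]

median, p85 = forecast_completion(60, [4, 6, 5, 7, 3, 6, 5, 8])
print(f"50% confidence: {median} weeks, 85% confidence: {p85} weeks")
```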
When Does #NoEstimates Work?
The idea of #NoEstimates can be applied at the level of planning and
estimation IF the right conditions are met. Conditions include:
– Stable teams
– Adoption of an Agile mindset (both team and organizational levels)
– A backlog of well-groomed stories
For a sprint, a team can easily answer:
– When will “it” be done?
– How much will “it” cost?
– What is “it” that I will actually get?
– What can I afford?
Planning Onion and Timing
Budgeting: Analogy, #NoEstimates (Type I)
Estimating: Parametric - Light, #NoEstimates Integration (Type II)
Planning: Planning, Standup Meetings
(Mapped onto Mike Cohn’s Planning Onion)
Case Study Context
Large software development firm, moderately hierarchical culture, and several very large projects and many smaller ones
Mix of Scrum/XP, Kanban, SAFe for some large programs, and plan-based projects
Strenuous budgeting process with tax accruals
Entrenched Program Office provides administrative functions
Mixture of internal projects and outsourced work.
Evolution
High Level Estimation (product and release):
Release plans and product road maps were easily built from forecasts for all external products and internal applications with 500 users or more.
Agile teams with a track record of delivering value on a regular basis were allowed to leverage #NoEstimates for planning. Other conditions include:
– Stable Teams
– Agile Mindset (both team and organizational levels)
– Well-groomed stories
All projects and products are required to find a way to answer the classic questions of when, what, and how much the work will cost, whether the work is done by single teams or by scaled Agile programs.
Technique Palette
Budgeting Techniques
– Analogy (Macro)
– Business Case
– Road Mapping
Estimation Level Techniques
– Parametric Estimation
– Analogy
– Planning Poker and Rate
(form of release planning)
– #NoEstimates (Flow)
Planning Techniques
– Work Breakdown
Structures (Plan Based)
– Points (various) and Sprint
Planning
– Points (various) and
Continuous Flow (Kanban)
– Stand Up Meetings
– #NoEstimates (Flow)
Not All Happiness - Contractual Agile
All outsourced contracts are being transitioned to fixed-cost, fixed-date contracts leveraging a hybrid Scrum / plan-based project management solution. The PMO actively tracks these vehicles. #NoEstimates techniques are not allowed in this organization’s contract vehicles.
– Raja Bavani, Senior Director at Cognizant Technology Solutions, stated in a recent conversation that he thought #NoEstimates was a non-starter in a contractual environment.
Final Thoughts
All budgets, estimates and plans are by definition imprecise
They can only be accurate within a range of confidence
The single-number contract generates the anger and frustration fueling the #NoEstimates movement
#NoEstimates and classic estimation are tools to generate feedback and create guidance
The goal is usually the same; it is just that the mechanisms are very different
Questions?
Data Quality and Process Performance
Baselines and Models
Ryan Bays
Booz Allen Hamilton
Process Engineer
Organization Overview
Booz Allen Hamilton
• Founded in 1914
• Headquarters: McLean, Virginia
• 22,000+ employees
• Leading provider of management and technology consulting
services to the US government
• Internal process improvement program & Corporate Quality
Office (integrated teams)
• CMMI-DEV Maturity Level 3 (ML3), originally achieved in
September 2005 (CMM appraisals began in 1998), Quality Office
achieved CMMI-SVC Capability Level 2 in February 2015
• ISO 9001-2008 and related standards registration since 1997 for
multiple sites/business units
• Website: www.boozallen.com
Business Challenge
• CMMI Level 4 and 5 process areas build on the
measurement data collected at Levels 2 and 3 by using
that data to create predictive models of process behavior
• Question: What happens to models when data quality is
poor?
The Assessment
• First, you must recognize that you have an issue with data (i.e., if your
Mars Climate Orbiter disintegrates while entering the Martian
atmosphere, you’ve discovered the error a bit too late)1
• We discovered our data quality issues when we started looking at regression equations: data that we felt should have some correlation appeared to have none (screened in the sketch below)
• We initiated an Advanced Causal Analysis on Data Quality which focused on Coverage, Completeness, and Accuracy
• We didn’t need more measures; we needed more accurate measures,
and more analysis against them
1 Mishap Investigation Board Phase 1 Report: http://sunnyday.mit.edu/accidents/MCO_report.pdf
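A small sketch of the kind of screening that surfaced the issue: compute the correlation for a metric pair that should be related, and flag it when the relationship is unexpectedly weak. The threshold and data are illustrative assumptions:

```python
import numpy as np

def correlation_screen(x, y, threshold=0.5):
    """Flag metric pairs whose expected relationship is missing: a weak
    Pearson r between, say, size and effort often signals bad data rather
    than a truly absent relationship."""
    r = np.corrcoef(x, y)[0, 1]
    return r, abs(r) < threshold

size = [10, 25, 40, 55, 70, 85]          # hypothetical size measure
effort = [300, 90, 410, 120, 560, 150]   # noisy effort data
r, suspicious = correlation_screen(size, effort)
print(f"r = {r:.2f}; investigate data quality: {suspicious}")
```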
The Impact of Data Quality
[Before and After charts]
Lessons Learned
• Drive data quality to the source
• Make data definitions operational, simple and unambiguous;
litmus test, 2 different analysts should be able to come to the
same basic understanding looking at the definition
• Consider creating an organizational QPPO to manage data quality to the desired levels; this is typically a needed first step
• Automate where possible
• Where automation isn’t possible, institute strong training and
internal process-based checks and controls (e.g., conduct
periodic data quality assessments)
• Create Process Performance Baselines & Model maintenance
procedures that allow for updates when data improves
Contact Information
Ryan L. Bays, PMP, LSSBB
1349 West Peachtree St NW #1400
Atlanta, GA 30309
[email protected]
678-360-3274
How We Quit Complying and Just Did What Made Sense
Seeing the Business Value in High Maturity
Suzanne Lawson, MBA, PMP, CSM
General Dynamics Health Solutions
Sr. Manager, Quality and Standards
General Dynamics Information Technology
Overview
• Systems integrator for more than 50 years
• General Dynamics Information Technology, a business unit of
General Dynamics, provides systems engineering, professional
services and enterprise IT solutions to customers in the
defense, federal civilian government, health, homeland security,
intelligence, state and local government, and commercial
sectors
• General Dynamics Health Solutions is a sub-brand
• Working under CMM/CMMI for more than 15 years
Health Capabilities
● Health Systems Development & Integration
● Medical Facility Outfitting & Transition
● Health Information Exchange
● Medical Logistics & Supply Chain Management
● Subsidy, Reimbursement & Claims Processing Systems – Development & Support
● Implementation of Health IT & Support Services
● Medical Record Review
● ICD-10 Transition
● Eligibility Determination & Enrollment
● Multi-Channel Contact Center Management
● Medical Research Support
● Data & Infrastructure Management
● On-Demand Inbound & Outbound Communications & Print Fulfillment
● Clinical Support Staff
● Networking & Communications
● Data Warehousing and Analytics
● Population Health Management
● Quality Measurement, Management, Reporting & Payment Solutions
● Program Integrity Solutions
● Big Data
● System Development, Integration & Process Design
● Virtual Desktop Environment
● Cyber Security
● Cloud
Themes: managing project complexity and ensuring the continuity of care; reducing operational burden, ensuring accurate payments and streamlining compliance; expanding insight, ensuring value, advancing outcomes; engaging and connecting patients, providers and the public; advancing today’s research for tomorrow’s care; delivering secure, cost-effective and responsive health IT.
Background
• Mature CMMI Level 3 organization
• Three distinct programs
• System development, operations, and maintenance
• Different processes
• Client required that we be rated Level 4 by September 2012
• High Maturity journey began in late 2010
• Hired outside consultants
[Sidebar: “Recognize that you don’t know what you don’t know and seek input from organizations that have trod this path. Know who you can trust to have your best interest at heart and know what you can trust.”]
Training
• Understanding CMMI High Maturity Practices (UCHMP)
• Improving Process Performance Using Six Sigma (IPPSS)
• Designing Products and Processes Using Six Sigma (DPPSS)
Tools: JMP, Minitab, Decision Optimizer, Process Model, Crystal Ball, @Risk
But….did we understand the concepts of High Maturity?
[Sidebar: “Really understanding High Maturity is a process in itself. Be wise about the time and money you spend on training. If you are confused, speak up!”]
Approach # 1:
“We have to do this so let’s just dig in and get it done”
• We found out what measurement data we had and what quantitative management tools we could build from that data…
• …as opposed to defining our business needs and identifying where quantitative management tools would improve our business
• Organizational level vs. program level models and baselines
“The gap between where we were as an organization in terms of measurement & analysis and having the right data for the goal was often deep and wide.”
[Sidebar: “Ensure you have a strong and useful Measurement and Analysis program that supports and provides value to your business. Be clear about why you are embarking on the journey to High Maturity. Share that vision. Keep sharing it. Be wary of ‘one size fits all’.”]
OPP and QPM
• Complied with each practice, step by step: OPP SP1.1 through SP1.5, QPM SP1.1 through SP1.4, QPM SP2.1 through SP2.4
“We spent more time in the first appraisal documenting OPP and QPM and diagramming (and selling) our modeling process than we did actually deciding how to use the output.”
[Sidebar: “Where’s the business value? Focus on business value, rather than trying to comply with the CMMI model practice by practice.”]
“Try…”
“Try, Try Again…”
• June 2011: SCAMPI-C
• November 2011: SCAMPI-C2
• May 2012: SCAMPI-B
• August 2012: SCAMPI-A
What should we do next???
Circumstances change…
• Engaged a statistician
• Level 4 requirement rescinded
• Retained one program in the Level 4 organization
Approach # 2:
Really make High Maturity work for the organization
• Teamed the statistician with a senior analyst
• Re-evaluated existing baselines and models
• Reconvened cross-functional modeling team
Plan ahead for
special resources
or skills.
Engage team
members with the
right skills.
Circumstances change…again
• Lead Appraiser announced retirement
• Brought on a new (to us) Lead Appraiser
“SCAMPI stands for Standard CMMI Appraisal Method for Process Improvement.”
“The approach to understanding the organization and its goals, and trying to see the elements of the CMMI model in our work, rather than the previous approach of rigorously ‘define the process, follow the process, create the artifact,’ was very helpful.”
“If you take away process improvement, what do you have?” SCAM
[Sidebar: “Work with a Lead Appraiser who encourages you to focus on the business value of the CMMI model.”]
The intervening years
• Decision Analysis & Resolution
• It’s not just for multi-million dollar decisions
• It’s what we do every day
The intervening years
• Model has been evolving over time
• Final Test Execution model
• Baselines:
• Test Case Completion Chart
• Problem Density
• Idealized Testing Curve (sketched below)
• Idealized Development Curve
Keep it simple!
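As a purely hypothetical illustration of an "idealized testing curve" baseline (the slides do not disclose the actual model), a logistic S-curve for cumulative test-case completion might look like this; the shape and steepness k are illustrative assumptions:

```python
import math

def idealized_testing_curve(total_cases, total_days, day, k=10.0):
    """Hypothetical logistic curve: cumulative test cases expected to be
    complete by a given day of the test window."""
    progress = day / total_days
    return total_cases / (1 + math.exp(-k * (progress - 0.5)))

for day in (5, 10, 15, 20):
    print(day, round(idealized_testing_curve(200, 20, day)))  # 15, 100, 185, 199
```

Comparing actual completion counts against such a baseline gives an early signal when testing is running behind.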
Current state
• Passed our L4 appraisal with no negative findings, a multitude of strengths, and suggestions where we can further improve
• We have quantitative management tools that are actually used by the business
• The Modeling Team continues to function, asking business value questions, building quantitative management tools to answer those questions, and growing in capability
• The Modeling Team enjoys the full support of senior management
[Sidebar: “Understand your business needs and what provides value to your organization. Ensure you have senior management support. Balance your activities in the context of adding value to the organization and don’t get carried away with models/statistics. Make sure whatever you do makes sense for and provides value to your business.”]
What has come out of this journey?
“There isn’t really a ‘magic model’. Let the model mature and grow. It’s more about the journey and our learning during the journey than the actual final model.”
“Modeling is only as good as the underlying data…before you set out on that journey, you should ensure your fundamental data collection and tools are solid.”
“Convey to the staff why they are collecting data and the value of data collection, with attention to accuracy.”
“Mindset shift from something that we need to do in order to achieve a rating to something that we want to do.”
“We have been able to improve and put new processes in place.”
“I’m hopeful that we’ll see greater familiarity and appreciation by the staff.”
“Greater appreciation of the value quantitative tools may have…these tools can reinforce and add value.”
“People are beginning to recognize value.”
What’s next?
• Continue to identify opportunities to engage quantitative
management in our daily practice
• Address the issue of work culture
• Team members are starting to think about how these capabilities
could be used to improve processes
• At the conclusion of the SCAMPI-A last year, our Lead Appraiser
stated that we were already engaging in many L5 practices
• Planning on a successful Level 5 appraisal in 2018
Lessons Learned… Before you even start
• Really understanding High Maturity is a process in itself
• Ensure you have a strong and useful Measurement and
Analysis program that supports and provides value to your
business
• Recognize that you don’t know what you don’t know and seek
input from organizations that have trod this path
• Know who you can trust to have your best interest at heart and
know what you can trust
Lessons Learned… Before you even start (cont’d.)
• Be clear about why you are embarking on the journey to High
Maturity. Share that vision. Keep sharing it.
• Understand your business needs and what provides value to
your organization
• Ensure you have senior management support
• Plan ahead for special resources or skills
Lessons Learned… While you’re ramping up
• If you are confused, speak up
• Engage team members with the right skills
• Be wise about the time and money you spend on training
• Be wary of “one size fits all”
Lessons Learned… When you’re underway
• Focus on business value, rather than trying to comply with the
CMMI model practice by practice
• Balance your activities in the context of adding value to the
organization and don’t get carried away with models/statistics
• Keep your models and baselines simple
• Work with a Lead Appraiser who encourages you to focus on
the business value of the CMMI model
• Make sure whatever you do makes sense for and provides value to your business
In conclusion…
“The 2012 effort was about applying High Maturity practices and tools into our work, and the 2015 effort was about what we could do from a High Maturity point of view to better our business and improve our processes.”
“High Maturity thinking is becoming the ‘norm’ of everyday work.”
“The journey to achieve the Maturity Level 4 rating has been an uphill battle, but occasionally the view from up here is incredible.”
Leveraging Your ISO-based Compliance with
CMMI-based Improvement
Michael West
Natural Systems Process Improvement
Principal
Organization Overview
About NSPI:
• Founded 9/11/2001
• CMMI Institute Partner, Certified Lead Appraisers
• AS9100C Internal Auditor
• Process system design and development for CMMI, ISO/AS/TL Standards, DO-178, ITIL, PMBoK, SOX
• Delivering value-adding performance improvement results to clients in all sectors of the economy
• Value-based CI3: Courage, Initiative, Intelligence, Integrity
• Author of two process improvement books and multiple articles; has delivered dozens of conference presentations, tutorials, and keynotes
Business Challenge
Many organizations achieved and maintained ISO/AS compliance for many
years before adopting the CMMI. When they adopted the CMMI, here’s what
went wrong:
• The organization didn’t take time to learn what the CMMI really is and what it’s used for, and just considered it another “quality standard.”
• Uninformed leadership pushed CMMI adoption to Operations, QA, or Mission Assurance, failing to realize that it is a model for engineering and product development
• Operations and QA tried to patch or add onto their existing QMS to accommodate engineering processes, usually failing, or even worse …
• Management wanted to believe that their QMS already accommodated the CMMI
• CMMI appraisals were perceived to be the same as ISO/AS audits
The Plan
Transform myths, misbeliefs, assumptions, and
inaccurate interpretations into fact-based
information.
Research, publish, publicize, and educate the
uninformed so that they can understand the
difference between standards and the CMMI, and
realistically leverage the synergy between their ISO
investment and CMMI investment.
The Results
Know the differences between AS9100C (ISO 9001:2010)
and the CMMI in terms of:
• Purpose, structure, and intended use
• Content focus and product realization life cycle applicability
• Implementation
• Appraisals and audits
Know the leverage points
The Results
Scope and Scale
CMMI-DEV: 688 pages, 22 process areas, 421 practices
AS9100C: 33 pages, 5 process areas, 72 clauses
The Results
Purpose, Structure, and Intended Use
CMMI-DEV
AS 9100C / ISO 9001:2010
Guidelines for improving software and
systems engineering
Standards (requirements) primarily for
production and manufacturing
Informative, but not prescriptive
Prescriptive, but not informative
Provides practice interpretation and
implementation guidance
Does not provide interpretation and
implementation guidance
Project and organizational performance
excellence is the implicit goal of continual
improvement via the CMMI.
The implicit goal of continual improvement
in an AS9100 organization is maintaining
compliance with the Standard.
The Results
Content Focus and Product Realization Life Cycle Applicability
[Diagram: CMMI-DEV covers product design and development (PP, RD, REQM, TS, PI, VER, VAL) up to a “Transition to Production Gap”; AS9100 covers production. PM and support mappings across the life cycle: CMMI-DEV PMC, MA, IPM, RSKM, GP 2.8s, and GP 2.10s map to AS9100 7.1.1, 7.1.2, 8.2; CM and GP 2.6s map to 4.2.3, 4.2.4, 7.1.3; PPQA and GP 2.9s map to 8.5.2, 8.5.3.]
The Results
Content Focus and Product Realization Life Cycle Applicability
CMMI-DEV
AS 9100C / ISO 9001:2010
Focus on product engineering from
requirements to transition to production
Primary focus is on production and
manufacturing
Unit of work is based on project or program
Based on operations; does not incorporate
units of work
Accommodates multiple life cycle models
Presumes waterfall life cycle model
6 process areas defining 45 practices in
engineering disciplines
Allocates 5 paragraphs to product design and development, and these do not have to be a documented procedure
5 process areas define 54 practices in project
management
Provides minimal standards for project
management and risk management
5 process areas define 38 practices related to process development, process management, and continuous process improvement
Very little focus on QMS development and
management; only 6 standards-based
procedures are required to be documented
The Results
Content Focus and Product Realization Life Cycle Applicability
CMMI-DEV
AS 9100C / ISO 9001:2010
Via a Glossary, definitions are provided for
226 terms and phrases
Glossary provides definition for 4 terms,
but does not define “manual” or
“procedure”
ML 4 and ML 5 provide 4 process areas for
quantitative process and performance
improvement
Does not define approaches, practices or
methods for quantitative process or
performance improvement
Provides institutionalization goals and
practices and information for establishing a
managed process and a defined process
In general, the concept of
institutionalization is not addressed; there
are some leverage points
Provides components for process tailoring
Does not provide information for tailoring
QMS performance
Provides process areas and practices for
establishing and maintaining a process
definition and management capability
Does not address establishing an
organizational function to maintain the
QMS
The Results
Content Focus and Product Realization Life Cycle Applicability
CMMI-DEV
Provides PAs and practices for four
product realization support areas:
• Configuration Management
• Decision Analysis and Resolution
• Measurement and Analysis
• Process and Product QA
AS 9100C / ISO 9001:2010
Provides standards for two support areas:
• Configuration Management
• Measurement and Analysis
The Results
Typical Implementation
CMMI-DEV
AS 9100C / ISO 9001:2010
Process focus group formed to develop
and deploy organization’s standard
processes and process assets
QMS group formed to write the Quality
Manual + procedures
Process group conducts process modeling
workshops to first define the as-performed,
and then determine CMMI gaps
QMS group writes a Quality Manual and
procedures that mimic the Standard almost
verbatim, and then expect the operations
and production personnel to perform those
procedures, often irrespective of what
people actually do
Based on project or program
characteristics, process tailoring criteria
and guidelines are developed and used to
tailor process implementation.
Because the QMS is viewed as inviolate
“requirements,” programs and projects are
usually prohibited from tailoring their
implementation of the QMS.
The Results
Typical Implementation
CMMI-DEV
AS 9100C / ISO 9001:2010
Via institutionalization practices GP 2.2 and
GP 2.8 in all PAs, process performance is
planned and monitored.
AS9100C does not provide for the
institutionalization of Standards-based
process performance, and there is no
requirement to plan and monitor process
performance.
Focus of PPQA implementation is to gain
insight into fidelity of performed process with
defined process, to improve performed
process.
Focus of QA is two-fold:
• Be the QMS “police” to ensure people
perform what is defined, and
• Conduct product, component, and
material physical inspections (QE)
Via OPF SP 1.3, OPF SP 3.4, IPM SP 1.7
and GP 3.2 in all PAs, the organization
implements and institutionalizes feedback
and learning processes and systems for
continuous improvement.
“Continual Improvement” (8.5) in an
AS9100C organization is based on
corrective and preventive action, which are
oriented toward addressing process and
product defects.
The Results
Appraisals and Audits
CMMI-DEV
AS 9100C / ISO 9001:2010
SCAMPI MDD is 258 pages and defines
three appraisal phases, within which it
defines 13 appraisal processes, within which
it defines 43 appraisal activities.
Guidelines for Auditing Management
Systems is 44 pages and defines guidelines
for conducting ISO-based audits – in three
areas, within which it defines 19 audit
activities.
The bases for strength and weakness
findings are a comparison of the appraised
organization’s implementation of processes
in relation to the reference model – the
CMMI.
The bases for a conformance or
nonconformance are a comparison of the
auditee’s:
• QMS against the Standard
• Examples of the auditee’s
implementation of its QMS/quality
manual
• Examples of the auditee’s products and
work products against customer
requirements
The Results
Appraisals and Audits
CMMI-DEV
AS 9100C / ISO 9001:2010
The aggregation of instantiation-level
weaknesses may or may not result in a
weakness finding.
One instantiation of a nonconformance
yields a nonconformance finding (equivalent
to a weakness).
Appraisal (SCAMPI Class A) results in
process capability and or organizational
maturity level ratings.
Audits result in conformance or
nonconformance to the Standard.
In appraisal planning, scope and sampling is
Model-based, and based on the organization
using defined scoping and sampling factors.
Appraisal sampling is also based on the
organization’s basic units (i.e., projects,
programs, releases, builds).
Audit scope is based on the Standard, and
whether the audit is a surveillance audit or a
full-system recertification audit. The concept
of a product realization “project” is not
inherent in the Standard, and audit sampling
does not require there to be a “project.”
The Results
Appraisals and Audits
CMMI-DEV
AS 9100C / ISO 9001:2010
Five defined characterizations are used to
determine the extent to which practices are
satisfied:
• Fully implemented
• Largely implemented
• Partially implemented
• Not implemented
• Not yet implemented
There are two determinations about the
implementation of a standard: conformant or
non-conformant.
Mini-team or full team consensus (not
“majority rule”) decisions are made on the
extent to which the organization satisfies the
intent of Model components.
Lead Auditor makes judgments about nonconformances and their categories.
The Results
Appraisals and Audits
CMMI-DEV
CMMI goal, capability, and maturity level
ratings are based on the extent to which
weaknesses, in aggregate, affect the
satisfaction of a CMMI goal.
AS 9100C / ISO 9001:2010
There is no threshold or even guideline for
minor versus major nonconformities that can
be used by a lead auditor to determine
revocation or suspension of an
organization’s registration; such decisions
are solely at the discretion of the lead
auditor.
The Results
ISO 9001:2010 and CMMI-DEV Leverage Points
Process focus, definition, and improvement:
• The CMMI-DEV provides substantially more guidance for establishing and institutionalizing a persistent organizational process focus. Combine the QMS group with the EPG/SEPG to coordinate an integrated process focus and capability.
• Recognize the product realization life cycle scope of the CMMI-DEV (design and development) and AS9100 (production and manufacturing). Build one process system for end-to-end, using both the Model and the Standard as appropriate.
• Clearly define the interfaces and hand-offs of the processes and systems used for reporting and managing corrective and preventive actions (AS9100), and those used for process change or improvement requests (CMMI-DEV).
The Results
ISO 9001:2010 and CMMI-DEV Leverage Points
Process and work product quality assurance:
• Integrate processes, systems, and checklists for conducting AS9100-based audits (internal and external) with CMMI-based engineering process and work product audits. For example, establish CAR categories: process, work product, material, component.
• Define and apply the same QA effectivity measures to both process and product quality assurance.
• Link or otherwise establish traceability between CARs and process change requests
The Results
ISO 9001:2010 and CMMI-DEV Leverage Points
Institutionalization touch-points:
• Standard 5.3, Policy, relates to CMMI-DEV Generic Practice GP 2.1
• Standard 5.4.2, Quality Management System Planning, relates to CMMI-DEV Generic Practice GP 2.2
• Standard 4.2.3, Control of Documents, and 4.2.4, Control of Records, relate to CMMI-DEV Generic Practice 2.6
• Standard 5.6, Management Review, relates to CMMI-DEV Generic Practices 2.8 and 2.10. However, AS9100C-oriented management review is a review of the QMS procedures, whereas GP 2.8 and GP 2.10 in the CMMI provide management with visibility into process performance against the defined standard processes
• Standard 6, Resource Management, relates to CMMI-DEV Generic Practice 2.3
• Standard 6.2.2, Competence, Training, Awareness, relates to the CMMI-DEV Process Area Organizational Training (OT) and Generic Practice 2.5
The Results
ISO 9001:2010 and CMMI-DEV Leverage Points
AS9100C Clause 7 Product Realization expansion to CMMI-DEV
coverage:
• Replace these 1.5 pages in the standard with your product development OSSP, based on the 688 pages of the CMMI-DEV.
• Replace AS9100 7.1.1 Project Management with processes based on PP, PMC, IPM, RSKM, and QPM and, better yet, PMBoK practices.
• Replace AS9100 7.1.3 Configuration Management with processes based on CM and GP 2.6s in all PAs; also cover standard EIA-649.
Lessons Learned
Do:
• If you’re responsible for model adoption or standards compliance, and you don’t know something, go learn
• Question your colleagues’ and bosses’ statements (“What’s your source?”)
• Learn vicariously from others’ experiences; be open to ideas other than your own
• Realize the universe of process improvement is ALWAYS bigger than your knowledge and experience
Lessons Learned
Don’t:
• Assume that what you know about process or the CMMI or a Standard is all there is to know
• Assume that what you’ve done in the past is the “best” way to do things
• Believe that the CMMI and an ISO-based standard are equivalent
• Think that just because someone calls himself an “expert” that he is
Michael West
Natural SPI
Author and Consultant
Executive Certification, Notre Dame
CMMI Institute-Certified Lead Appraiser
SEI CERT-RMM Partner
AS9100C Internal Auditor
435-649-3593
An approach to organize the
jungle of models, methods and
appraisals
Winfried Russwurm
Siemens AG, Munich, Germany
Principal Key Expert
Business Challenge
• Many process related models, methods, techniques,
assessments and others have been published over the
last decades
• They compete with each other, are mapped to each other,
but finally this “jungle” causes a lot of confusion,
frustration and waste of effort, time and money
• What helps to go beyond just buzzwords and create a
deeper understanding?
• How to decide which are needed, useful and beneficial?
The Plan in General
• A “sorting” or “structure” scheme is created
• Each process model, method etc. (existing and new ones)
is analyzed against characteristics and assigned a “place”
in the scheme.
• The scheme currently comprises three levels
• Domains (Content-related, Rating-related)
• Sub-Domains (e.g., Process Models, Frameworks,
Appraisal Methods, Quality Assurance)
• Classes (e.g., rigidity of appraisal methods, process
model application classes, working method application
classes)
The Plan: Overview of the Scheme
[Diagram: the full set divides into Domains (Model-Processes-Objects, Examinations), which divide into Sub-Domains (e.g., Process Models, Methods & Tools; Assessments/Appraisals, Diagnoses), which divide into Classes (e.g., SCAMPI Class A and Class B appraisal methods; development models; product lifecycle models).]
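A minimal sketch of the scheme as a lookup structure, using example placements from the slides; the catalog is deliberately incomplete and the function name is our own:

```python
# Partial encoding of the sorting scheme: domain -> sub-domain -> entries.
SCHEME = {
    "Model-Processes-Objects": {
        "Process Models": ["ISO 9001", "CMMI-DEV", "ISO 15504-5"],
        "Process Frameworks": ["RUP", "PMBOK", "TSP", "ITIL"],
        "Methods & Tools": ["SCRUM", "FMEA", "MISRA"],
        "Objects": ["code", "architecture", "projects"],
    },
    "Examinations": {
        "Assessment/Appraisal Methods": ["SCAMPI", "SPA"],
        "Quality Assurance": ["process audits", "desk reviews"],
        "Diagnoses": ["tests", "ATAM", "code assessment"],
    },
}

def place(item):
    """Answer 'what is it, do we have it?' by locating an item in the scheme."""
    for domain, subdomains in SCHEME.items():
        for subdomain, items in subdomains.items():
            if item in items:
                return domain, subdomain
    return None

print(place("SCAMPI"))  # ('Examinations', 'Assessment/Appraisal Methods')
```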
The Plan: Example Models Processes Objects
Models-Processes-Objects (e.g., ISO 9001, PMBOK, CMMI-DEV, ISO 15504, RUP, code, project):
• Requirements for Process Models (e.g., ISO 15504-2, ISO 9126)
• Process Frameworks (e.g., RUP, PMBOK, TSP, ITIL)
• Process Models (e.g., ISO 9001, CMMI-DEV, ISO 15504-5, industry-specific standards); also home for “maturity models”
• Methods & Tools (e.g., SCRUM, FMEA, MISRA)
• Objects (e.g., code, architecture, projects); also home for some “quality models”
From abstract/characteristics through process to work objects.
View: from strategic through tactical to operational.
The Plan: Example Examinations
Examinations (e.g., SCAMPI, ATAM, code assessment, process audit, project audit):
• Requirements for Assessments (e.g., ARC, ISO 15504-2)
• Compliance Audit Methods (e.g., ISO 19011)
• Assessment/Appraisal Methods (e.g., SCAMPI, SPA); also creating “maturity levels” etc.
• Investigations (e.g., project audits, project recovery reviews)
• Quality Assurance (e.g., process audits, desk reviews)
• Diagnoses (e.g., tests, ATAM, code assessment, security assessment, industry-specific standards)
From characteristics of examination to measuring work products.
View: from strategic through tactical to operational.
The Results
• The “sorting” or “structure” gives fast control over
discussions and buzzword use (what is it, do we have it,
do we need it)
• Facilitates adoption at the right place, less comparison of
apples and oranges
• Structures the “jungle” and enables more informed
decisions
• Benefits and overlaps are determined much more easily.
• Avoids waste, reveals gaps, reduces frustration
• Promotes organizational success
Synesthesia, Team Performance,
Ethnic Rhythms, and You
(High Performing Teams)
John Ryskowski
JFR Consulting
President
Synesthesia
A neurological phenomenon that causes some people to mix
their senses, associating color with certain letters, or
sounds, or smells.
A study published in PLOS One confirmed findings that
color-odor connections are fairly stable within a culture.
Together we shall initiate connections between world
rhythms and levels of team sophistication.
Key: State | Looks Like | Feels Like
States: Chaos, Oompah, Rock, Calypso, Songo
Tools
Drum set
Clap Master JD
Your participation
Rhythmic portrayals
Rhythms from the world
The Results from this Session
Over the years, observant lead appraisers have developed a “feel” for
the 5 levels of organizational maturity. This is due to experience and
the abundance of usable language furnished by the CMMI. “Let me
wander around an organization and I can tell you the maturity level in
one hour.” Conference attendees will invariably have team experiences
ranging from good to bad, but most likely lack the language and
associated framework to fully articulate team capability. This
presentation will supply attendees with a “feel-based” framework to
describe a team’s capability ranging from chaotic to relaxed and
sophisticated.
This talk will actually begin to “wire” into attendees’ brains a “feel” for
team sophistication based on rhythms. It will be experiential learning
with group participation and lots of fun.
Working harder and faster is not the answer. A team whose energy is
mindfully invested is able to “breathe” as a single living organism and
deal with problems without losing their “rhythm.”
You will definitely have fun!
Lessons Learned
Thanks for your interest
For the most recent slide set go to jfr-consulting.com
(presentations and papers)
Walking the Walk, Talking the Talk: CAR in
Action at the Organization Level
Jennifer Attanasi
Lead Associate
Ryan Bays
Quantitative Management
SME
Shannon Campbell
Quantitative Management
SME
Agenda
• Background
• Case Study
• Identify the Problem
• Research and Analysis
• Develop Implementation and Control Plan
• Evaluate Results
• Key Best Practices
• Lessons Learned
Background
Booz Allen Hamilton
• Founded in 1914
• Headquarters: McLean, Virginia
• 22,000+ employees
• Leading provider of management and technology consulting
services to the US government
• Internal process improvement program & Corporate Quality
Office (integrated teams)
• CMMI-DEV Maturity Level 3 (ML3), originally achieved in
September 2005 (CMM appraisals began in 1998), Quality Office
achieved CMMI-SVC Capability Level 2 in February 2015
• ISO 9001-2008 and related standards registration since 1997 for
multiple sites/business units
• Website: www.boozallen.com
Business Challenge
• New Advanced Causal Analysis process in standard set of
organizational processes was untested
• Process was designed to be used at the project level and
at the support organization level
• Implementation of the processes for the case study was at
the support organization level
• Concerns related to data quality provided an opportunity
to implement the new process
The Plan: Charter
Problem Statement
• Data quality issues limit the data that is available to be aggregated and
included in the organizational baselines and models
• Not able to combine full range of projects in existing baselines and models
• Not able to create standardized baselines and models for additional Quality and Process Performance Objectives (QPPOs)
Business Case
• Expect that greater data quality will contribute to the organization’s ability to
meet its QPPOs (through more robust baselines and models)
• Expect that greater data quality will result in greater range of data from
projects being available for inclusion in the organizational baselines and
models
Research and Analysis
• Survey
• Root Cause: Fishbone
• Industry Research: Data Quality
• Industry Research: Standard Metric Definitions
[Diagram: Survey, Root Cause analysis, Data Quality Definitions, and Standard Metric Definitions feeding into the analysis.]
Key Aspects of Data Quality
• Coverage – Number of required base measures that are
being reported divided by total number of required base
measures
• Completeness – Number of empty data elements that
should have information filled out divided by total number
of data elements that should have information filled out
• Accuracy – Number of accurate data points from data
accuracy audit divided by total number of data points in
data accuracy audit
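These three definitions translate directly into ratios. A minimal sketch (the function and field names are our own, not the organization's tooling):

```python
def coverage(reported, required):
    """Required base measures being reported / total required base measures."""
    return len(set(required) & set(reported)) / len(required)

def completeness(records, fields):
    """Non-empty data elements / data elements that should be filled out."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields
                 if r.get(f) not in (None, ""))
    return filled / total

def accuracy(audited_points, accurate_points):
    """Accurate data points from the audit / total audited data points."""
    return accurate_points / audited_points

# Illustrative usage:
records = [{"effort": 12, "defects": 3}, {"effort": None, "defects": 0}]
print(coverage(["effort", "defects"], ["effort", "defects", "size"]))  # 0.666...
print(completeness(records, ["effort", "defects"]))                    # 0.75
print(accuracy(audited_points=200, accurate_points=184))               # 0.92
```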
Implementation Plan and Control Plan
• Implementation Plan
• Training
• Job Aids
• References
• Updated Metric Definitions
• Control Plan
• Data Quality dimensions
• Regular updates to Engineering Process Group on the current status and results
Implementation Task Examples
• RASCI* Activity
• Job Aids
• Role-specific changes: Before and After Reference (ML3
vs. High Maturity)
• Detailed Metric Definitions Reference
• QPPOs/PPBs/PPMs Reference
• Specific Metric Definition updates made to Organizational
Construct Definition Document with ripple impacts to
Project Level Construct Definition Documents
* Clearly defined Roles and Responsibilities through independently facilitated RASCI method: Responsible,
Accountable, Supporting, Consulting, Informed as assigned to Stakeholders
Control Plan: Data Quality Assessments
• Monthly Data Quality Assessments conducted
Independently by Organization for Projects
• Coverage, Completeness, Accuracy measurements
• Accuracy assessments looked at accuracy of some recent
historical data and data coming in on a monthly basis
• As a result of the assessments, the team identified a need to
continue to evaluate and track data quality. A Data Quality QPPO was
developed and approved, and Data Quality Assessments continue on an
ongoing basis.
Evaluating Outcomes
• Some baselines were impacted more than others
• Baselines and Models were updated
• Added additional QPPO to improve the Accuracy,
Coverage, and Completeness based on results of Data
Quality Assessments
• Data Quality QPPO is reviewed with leadership and
Engineering Process Group on a regular basis
• Projects are now more attentive to data quality as it
relates to their data
Key Best Practices
• Addressing data quality through the advanced causal analysis
process (including tools/templates), monitoring the results with
quantitative management, and taking a systematic approach ultimately
leads to improved data quality being built into the processes
• Identification of root causes leading to the problem statement is
necessary
• Regular updates to all stakeholders ensure buy-in from all levels
• Prioritization of Implementation Plan and Control Plan tasks is critical
• Continuing to keep the Problem Statement in mind is necessary to
make sure that the ACA does not get off track
Lessons Learned
• Implementation Plan should be scoped to manageable
chunks
• Attention to Data Quality must be incorporated into
the process at every level and in every role
• Include Project Measurement Analysts on updates of
the Support Organization Causal Analysis Activities at
milestone points so that project implications are well
understood and buy-in is received
Contact Information
Jennifer Attanasi, Lead Associate
Phone 703-377-1418
Ryan Bays, Quantitative Management SME
Phone 678-360-3274
Shannon Campbell, Quantitative Management SME
Phone 585-802-8731
Data Quality Resources
• "Data Quality Measurement Information Model Based on ISO/IEC 15939" – research paper by Ismael Caballero, Eugenio Verbo, Coral Calero, and Mario Piattini
• "Measuring Data Quality" – research paper by Mónica Bobrowski, Martina Marré, and Daniel Yankelevich
• "Five Fundamental Data Quality Practices" – data quality and data integration white paper, Pitney Bowes
• "The Six Primary Dimensions for Data Quality Assessment" – white paper, DAMA (Data Management Association), http://www.damauk.org/
• "3-2-1 Start Measuring Data Quality" – Experian Data Quality blog post supporting "The Six Primary Dimensions for Data Quality Assessment"
• ISO standards
• "Data Quality: Theory vs Practice" – Sydney DAMA presentation by Intraversed
• "Data Quality Fundamentals" – presentation by David Loshin of Knowledge Integrity Inc. (found online, open source)
We got a CMMI ML5 certification…
what's next?
Gabriela Da Cunha Arnold & Thiago Falbo Cardoso
IBM Latin America Client Innovation Center
Process, Methods & Tools Leaders
Organization Overview
[Map/timeline: IBM Global Client Innovation Centers – Brasil, Mexico, Spanish South America (SSA), Egypt, RoCeB (Romania), China, India, and the Philippines. 2013 scope: Brazil and Mexico. 2015 scope: Brazil, Mexico, and SSA (Argentina, Chile, Colombia, Ecuador, Peru, Uruguay, Venezuela).]
Business Challenge
There are several challenges PM&T (Process, Methods and Tools) faces when
implementing CMMI Level 5, some of them are:
Multi-country / cultural organizations
Different languages
Local business needs
How to leverage PM&T capabilities in Latin
America during integration and reinforce CMMI
Level 5 implementation?
The Plan
The Plan
• PM&T organization assessments to recognize the starting point:
  1st: Brazil & Mexico → LA
  2nd: LA & SSA
• Definition of the common Operational Model.
• Roles and responsibilities integration.
• Establishment of the management system, including common RTC /
workflows to control the team activities, plans and actuals.
• Definition of KPIs to monitor team performance.
• Language training.
• Gradual deployment → waves and retrospectives.
The Results
• Harmonization of practices across the shared centers and support for business alignment.
• Lean process.
• Asset integration.
• Better utilization of resource bandwidth.
• Cross usage of best practices across the delivery centers.
• Focused implementation leading to better effectiveness.
• Improved skills of the teams involved in providing the respective services, as they get the opportunity to work across Delivery Centers.
• Vertical integration into the Global structure.
• LA team influencing and being recognized globally.
Lessons Learned
01 PM&T team to report at a higher level of the organization and to share responsibilities regarding business results
02 Exchange resources between PM&T and Delivery teams
03 Consulting skills in PM&T team
04 Knowledge sharing
05 Process simplification
Expecting the Unexpected –
Managing Change while Advancing to High Maturity
Kileen Harrison
Chief Technologist
Shannon Campbell
Quantitative Mgmt SME
Agenda
• Background
• Challenges and Approaches
– People
– Process
– Training/Tools
– Culture
• Critical Success Factors/Lessons Learned
• Results
Background
Booz Allen Hamilton
• Founded in 1914
• Headquarters: McLean, Virginia
• 22,000+ employees
• Leading provider of management and technology consulting services
to the US government
• Internal process improvement program & Corporate Quality Office
(integrated teams)
– CMMI-DEV Maturity Level 3 (ML3), originally achieved in September
2005 (CMM appraisals began in 1998); Quality Office achieved CMMI-SVC Capability Level 2 in February 2015
– ISO 9001:2008 and related standards registration since 1997 for multiple
sites/business units
• Website: www.boozallen.com
Challenges and Approaches
• Embarked upon our High Maturity journey with two key change
management principles:
– Sense of Urgency (i.e. burning platform)
– Guiding Coalition
• Developed an initial plan to arrive at High Maturity in 17 months
• Began developing High Maturity processes and realized traction was
too slow:
– ML3-related issues surfaced
– Tasks were taking longer than planned (e.g., defining organizational
baselines associated with the organizational QPPOs)
• Obstacles were a mix of:
– People
– Process
– Training/Tools
– Culture
People
Challenge
• Staffing a High Maturity Initiative
(HMI) when there is a limited
number of resources with High
Maturity experience across the
industry
Landscape
• Two certified and experienced
High Maturity Lead Appraisers
(primary experience base)
• One practitioner with High
Maturity experience (also one of
the HM Lead Appraisers)
• Very tight timeline to accomplish
this goal
Approach
Established an HMI Project Team:
• Core HMI Team included a balance of individuals with data/statistical experience and individuals with SDLC experience
• One of the HM Lead Appraisers functioned as the Project Manager
• Second HM Lead Appraiser functioned in the role of Technical Advisor
• Two dedicated HMI Analysts (experienced with data/statistics) were primarily responsible for supporting Measurement Analysts on the project teams
• Remainder of the team comprised part-time individuals straddling roles within the existing process improvement organization (e.g., Performance Measurement Lead)
People—HMI Project Org. Chart
[Org chart: Project Manager, supported by an HM Technical Advisor; reporting roles include two HM Analysts, a Deputy Project Manager, PI Operations Lead, Performance Measurement Team Lead, Org. Training Team Lead, and OSP Process Engineer.]
• Project Manager – Certified High Maturity Lead Appraiser
• HM Technical Advisor – Certified High Maturity Lead Appraiser from the firm's Corporate Quality Office
• HM Analysts – Full-time support with experience in data/statistics
• Process Improvement Organization provided part-time team members in functional areas and enabled surge support for critical activities.
Process
Challenge
• Realization that the new
processes for High Maturity
have to actually integrate with
the processes in our current
OSP, which were based on
lower maturity compliance
Landscape
• Existing OSP in place for more
than 10 years
• Significant challenges with
current tool housing OSP
• Projects operating at varying
levels of maturity
Approach
Distinguished High Maturity processes as
“advanced” project and process
management:
• Conducted a “process party” for initial process definition; paired process experts with SMEs
• Had multiple releases of the OSP based on changes needed for High Maturity
• Reworked some ML3-based processes (e.g., causal analysis, measurement analysis)
• Created a new format for “advanced processes”
Process—
Integration: Measurement and Analysis
New Items Created to Support Existing Measurement
and Analysis Process
• Procedures
– Establish and Maintain QPPOs
– Perform Advanced Process Performance Analysis
– Establish and Maintain Process Performance Baselines
– Establish and Maintain Process Performance Models
• Templates
– QPPO Subprocess Traceability Matrix
• Addresses common pitfall with not linking the HM analysis back to critical
business objectives
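Since a Process Performance Model relates controllable subprocess factors to a QPPO-linked outcome, a minimal sketch may help; the factor, outcome, and data below are illustrative only, not the firm's actual model (requires Python 3.10+ for statistics.linear_regression):

# Minimal sketch of a simple process performance model (PPM):
# predict escaped defects from peer-review preparation rate.
from statistics import linear_regression

# Illustrative history, not real project data
prep_rate = [150, 200, 250, 300, 350]       # LOC reviewed per hour (x)
escaped   = [0.40, 0.55, 0.60, 0.80, 0.90]  # escaped defects per KLOC (y)

slope, intercept = linear_regression(prep_rate, escaped)

def predict_escaped(rate):
    # What-if analysis: predicted outcome for a planned prep rate
    return intercept + slope * rate

print(f"Predicted escaped defects/KLOC at 275 LOC/hr: {predict_escaped(275):.2f}")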
Process—
Integration: Measurement and Analysis
Revised Items to Support Existing Measurement and
Analysis Process
• Procedures
– Conduct Measurement and Analysis
– Create Measurement Plan
– Establish Measurable Objective
– Analyze Measures and Indicators
– Perform Measurement
• Templates
– Measurement and Analysis Plan
– Measurement Construct Definition Document
– Integrated Analysis Results Template
• Referenced items required by High Maturity projects as “advanced
process” or “advanced project” management
Training/Tools
Challenge
• Developing right-sized training
for the organization when
individuals within the
organization are at dramatically
different levels of understanding
Landscape
• Individuals with varying comfort
levels of using data
• Some people not familiar with
Process Improvement principles
• Some Measurement Analysts
already established in the role,
while others were “voluntold”
• All comfortable with life at ML3
Approach
Multitiered approach to training and tools:
• Developed a Technical Reference Guide
• Created a 4-hour training course
• Developed an internal tool, the Analytical Techniques Workbook
• Established monthly (and then bimonthly) Measurement Information Meetings
• Provided heavy one-on-one mentoring with Measurement Analysts
• Conducted High Maturity Workshops
• Generated self-assessments and learning plans for individuals
• Collected feedback from process users and implemented changes quickly
Training/Tools
TRAINING: 4-hour training course for targeted audiences
- Interactive, multiple instructors
- Virtual: Skype meeting
- PowerPoint with reference materials
REFERENCES FOR MEASUREMENT ANALYSTS:
- Technical Reference Guide, used in conjunction with the LSS Pocket Toolbook
- Analytical Techniques Workbook: plug and chug with project-specific data; preferred graphs for quantitative analysis documentation; upgrades based on user feedback
APPLICATION:
- Mentoring sessions with HMI Analyst and Project Team
CONTINUOUS FEEDBACK LOOPS:
- Quarterly Measurement Workshops with specific agenda items; bimonthly Measurement Information Meetings targeted to the most pressing issues/concepts
- Lessons learned collected on new tools, OSP, and the PIP process
Culture
Challenge
• “Are we there yet?”
• “What do you mean you need
more time, more resources, and
more budget?”
• “Are we right?”
Landscape
• Difficult for key stakeholders to
fully grasp the depth and
complexity of High Maturity
• Difficult to explain High Maturity
concepts in the beginning
• Checkbox mentality in some
situations
Approach
Considered all levels of stakeholders,
supporters, and “doers” of the work:
• Embraced the critical role of communications and the various stakeholder groups’ tailored messages
• Accepted that some messages would require repetition (e.g., multiple EPG briefs, webinars, weekly check-in points)
• Continually collected and reevaluated lessons learned
• Conducted workshops to pull key roles together in person and allow learning from peers
• Created safe environments for asking questions (Measurement Information Meeting)
Culture
Communications
• Partnered with Corporate Quality Office
• Targeted communications as part of the overall schedule
• Marketing with webinars and presence at various conferences
• Pinpointed emails with very clear direction on next steps
Walk the Walk
• Organizational Advanced Causal Analysis
• Leveraged the OSP and reference materials
• Process Improvement Proposal (PIP) submissions
SCAMPI
• Mini-Team alignment
• Multiple SCAMPIs to prepare for A
• Brought in external people to be a part of the team
Safe Environment
• Practice walkthrough of quantitative management activities (PPB, PPM, and QPPO)
• EPG – repetitive training, EPG Chair engagement (inclusion in HMI Workshop –
previously not included)
• Quality Management Working Group – determining how best to present information
• Meeting minutes available to all
Critical Success Factors/Lessons Learned
• Ability to flex the HMI Project Team during
periods of surge activities (e.g., OSP releases)
• Diversity of HMI Core Team Members’
experience and backgrounds
• Continual emphasis on improvement by
collecting feedback and lessons learned
• Project teams’ candidness in challenges so
support could be provided
• Successful failures
Results
• We have successfully completed two SCAMPI Bs.
• SCAMPI A scheduled for June!
• Projects are proactive in asking for help.
• The organization has become more proactive in thinking through the timeline and next steps.
• We revisited the OSP to address user-friendliness of process.
• Additional teams embraced leveraging CARs for improvements.
• Process allowed for integration with firm quality initiatives.
• Process provided for professional growth for many people, as they were stretched to meet High Maturity challenges.
• Process provided many tangible success stories from our projects and the organization.
Contact Information
• Kileen Harrison
– [email protected]
– (619) 278-4931
• Shannon Campbell
– [email protected]
– (585) 802-8731
Building the Perfect Metric
How we quantitatively defined product quality for our
organization and why it mattered…
Kathleen Mullen
Keymind, A Division of Axiom Resource Management Inc.,
Performance Improvement Director
Keymind Overview
Business Challenge
The Plan
Involve critical stakeholders and brainstorm and discuss
ways to measure product quality
Define the metric
Identify projects with enough data to utilize
Collect and analyze data
Publish the metric and analysis on the wiki
Set organizational and project-level goals to monitor and
manage
The Results – Our Metric Defined
The Results – Our Newest PPB
[Chart: Defect Density process performance baseline showing mean and standard deviation for 2014 and 2015; plotted values include 0.672, 0.388, and 0.355.]
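A minimal sketch of how a defect-density PPB like the one charted above can be computed; the per-project values are hypothetical, and the size normalization Keymind used is not given in the slides:

# Minimal sketch: derive a defect-density process performance baseline
# (mean and standard deviation). All per-project values are hypothetical.
from statistics import mean, stdev

densities_2015 = [0.31, 0.42, 0.35, 0.46, 0.40]  # defects per size unit

ppb_mean = mean(densities_2015)
ppb_sd = stdev(densities_2015)

# Projects falling outside mean +/- 2*stdev would trigger investigation
lower = max(0.0, ppb_mean - 2 * ppb_sd)
upper = ppb_mean + 2 * ppb_sd
print(f"PPB mean={ppb_mean:.3f}, stdev={ppb_sd:.3f}, "
      f"expected range [{lower:.3f}, {upper:.3f}]")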
Lessons Learned
Results and feedback from appraisal-related events
(SCAMPIs/ARC-compliant events) offer an outside
perspective and help an organization see something
differently, leading to new improvements
Metrics must be based on YOUR organization, its culture,
and how you do things (i.e., define metrics that are
meaningful to your people)
Start simple (don’t try to get it perfect the first time around)
Setting up a
Compliance & Risk Management Office (CRMO)
David Farr
Jaguar Land Rover
Software Process Manager
Sripathy Ramachandran
KPIT Technologies Ltd.
Principal Consultant
Organization Overview
About JLR
Jaguar Land Rover is the UK’s largest automotive
manufacturing business, built around two iconic
British car brands: Land Rover, the world’s leading
manufacturer of premium all-terrain vehicles, and
Jaguar, one of the world’s premier luxury sports
saloon and sports car marques.
About KPIT
KPIT is a global technology company specializing in
providing IT Consulting and Product Engineering
solutions and services to Automotive, Manufacturing,
Energy & Utilities and Life Sciences companies. KPIT
is a partner of choice for the Automotive Industry,
delivering best-in-class solutions and products across
automotive subsystems and making mobility Smarter,
Greener, Safer and Affordable.
Business Challenge
Industry Standards: CMMI, ASPICE, ISO 15288, ISO 26262, ISO/TS 16949, SAE J3061, etc.
Internal Standards: PCDS, RMDV, etc.
Business Challenge
[Diagram: the maturity of JLR projects versus the maturity of supplier projects.]
Team Formation
Creation of an independent Compliance & Risk Management Office, familiar with all Disciplines.
[Diagram: Industry Standards (CMMI, ASPICE, ISO 15288, ISO 26262, ISO/TS 16949, SAE J3061, etc.) and Disciplines (CMMI, ASPICE, Systems Eng, Func Safety, Supplier SWQA, Security, FOSS, MBPE, etc.) feed the Compliance & Risk Management Office, which produces Exemplar Processes.]
Operating Model
[Flowchart: Compliance checks are driven by the Compliance & Risk Strategy, policies & processes, an annual calendar, and requests from the business and teams. The office assesses compliance to standards; if noncompliant, it assesses the impact to quality/cost/schedule and issues a Compliance & Risk Report. If any practice is Partially Implemented (PI) or Not Implemented (NI), it performs a risk assessment and assesses effectiveness; if the risk is high, it defines remedial action and issues a Compliance & Risk Assessment Report; otherwise the check ends.]
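The decision flow above reduces to a small procedure; a sketch of that logic, with the PI/NI codes and report names taken from the diagram and everything else (data shapes, risk values) assumed for illustration:

# Sketch of the operating-model decision flow. Data shapes are illustrative.
def run_compliance_check(findings):
    # findings: e.g. [{"practice": "ENG4.BP1", "status": "PI", "risk": "high"}]
    # status: FI (fully), PI (partially) or NI (not implemented)
    gaps = [f for f in findings if f["status"] in ("PI", "NI")]
    if not gaps:
        return "Compliance & Risk Report: compliant"
    # Assess impact to quality/cost/schedule, then risk for each PI/NI gap
    high_risk = [f["practice"] for f in gaps if f["risk"] == "high"]
    if high_risk:
        return ("Compliance & Risk Assessment Report: "
                f"define remedial action for {high_risk}")
    return "Compliance & Risk Assessment Report: no remedial action required"

print(run_compliance_check(
    [{"practice": "ENG4.BP1", "status": "PI", "risk": "high"}]))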
Compliance & Risk Assessment Approach
1. Prepare self-assessment questionnaire
2. Obtain self-assessment response
3. Prepare process profile matrix
4. Process walkthrough + risk assessment
5. Determine improvement actions
Process profile matrix excerpt — columns: Work Product; Work Product Characteristics/Attributes (not limited to); Supplier Work Product; Notes; Implementation Level.

Work Product: Scope Document / Statement of Work
This document covers:
- Project scope
- Customer requirements
- Project acceptance criteria
- Statement of work describing the requirements, dependencies, time frame and milestones
- Where SW functional safety aspects are involved, a Development Interface Agreement (DIA) detailing the expected functional-safety-related life cycle activities, deliverables and confirmation reviews; this serves as an agreement between the involved parties for defining the functional safety scope, execution and interfaces

Work Product: Project Management Plan
This document defines:
- Project-specific processes
- Deliverables to customer
- Internal deliverables
- Project & product life cycle
- Assumptions
- Dependencies
- Acceptance criteria
- Roles and responsibilities
- Project governance (team structure and distribution across locations; common tools; joint processes)
- Stakeholder management plan
- Resource management plan
Process Gaps & Proposed Improvements (assessment of 5/24/2017)

GAP ID 1 – ENG4.BP1: Identify software requirements
Finding: Requirements are captured in eLog, which has a placeholder for the requirement type; however, requirements are not classified as functional / non-functional. (Classification: PINE, Minor)
Proposed process improvement: A Functional Specification document is available for software requirements, with classifications for functional requirements, but there is not adequate detail on non-functional requirements. Include the different classes of non-functional requirements, such as usability, security and maintainability.

GAP ID 2 – ENG4.BP2: Analyze software requirements
Finding: Requirements are analyzed; however, the testability of requirements is not assessed. (Classification: PEND, Minor)
Proposed process improvement: Functional Specification – include coverage of requirements analysis as part of the process description. Requirements Review Checklist – include checks during requirements review and verify that the analysis checks include testability of requirements.

GAP ID 3 – ENG4.BP3: Determine the impact on the operating environment
Finding: Interface requirements are not explicitly captured in eLog; this is currently assessed based on experience. (Classification: PEND, Minor)
Proposed process improvement: eLog – impact analysis details are not captured in eLog nor documented elsewhere; maintain a placeholder in eLog to capture the impact analysis details/description.

Approach completed for ASPICE and Systems Engineering; extending to other disciplines.
Lessons Learnt
• Every team wants to pilot and evaluate benefits before
standardization
• Limited bandwidth of SMEs to perform compliance & risk
assessment
• Need for a quantitative dashboard to present to the executive team
to support meaningful decisions
• Risk assessment process needs improvement to determine
business & operational risk
• Automation is needed to create dashboards
Thanks for your Attention !!
David Farr
Jaguar Land Rover
Software Process Manager
[email protected]
Sripathy Ramachandran
KPIT Technologies Ltd.
Principal Consultant
[email protected]
Top 10 Discoveries
From a team of industry experts who didn’t set
out to be lead appraisers…
Becky Fitzgerald
Two Harbors Consulting LLC
President
Organization Overview
Two Harbors Consulting LLC established 4/14/2016
– Becky Fitzgerald
– Tom Klein
– Jim Shaver
– George Zack
Working with a healthcare services and information technology
company:
– Revenues of $190.9 billion for the full year (11th, Fortune 500)
– #1 Medical Necessity Utilization Management
– #1 Capacity and Workforce Analytics
– #1 Provider Revenue Cycle
– #1 Pharmacy Network
– #2 US PACS
– #2 Physician Services
Business Challenge
Fortune 500 company with growth through acquisition.
Newly purchased companies are typically start-ups or large
strategic purchases, and each brings its own set of
processes, culture, staff, and geographic locations.
The challenges?
• Improve predictability, quality, and gain efficiencies without
imposing common processes across disparate entities.
• Build a culture of quality that extends across the
organization while retaining the unique and successful
properties of each business unit.
The Plan
1. Announce company-wide mission for improved quality
2. Select guru, and support infant program definition
• Pick diverse R&D experts with CMMI exposure
• Identify most accessible engagement model
• Connect with early “burning platform” organizations
• Publicize successes
• Gain additional sponsorship (“mandate”)
• Leverage relationships to share successes, intentionally extend beyond mandate
• Build evidenced financial and political benefit
Engage, Learn, Evolve, Repeat
The Results
A high-performing corporate team, each plucked from
different areas of the company.
A highly efficient collection of Lead Appraisers who
developed a team approach with minimal entry hurdle for
business units.
A successful program achieving business performance,
credibility, and demand for services from the business units.
Goal Setting? You bet your measurable objectives!
Goal Setting?
Low capability organization? Frazzled, distracted sponsor.
[Slides contrasted the sponsor’s goal statements “from this” to “to this”; images not preserved.]
Abdicating Responsibility: Outsourcing accountability
just doesn’t work
Accountability
Lead Appraiser:
“Why are we talking today, what are you hoping to
gain…your objectives?”
Sponsor:
“I’ve hired a consultant to get me to CMMI ML2
by September, I want you to schedule our
baseline…”
Waterfall to Agile ML5 Medical Device
Agile High Maturity
• 2009 – Failed Class A
• 2011 – Team engagement and baseline
• 2011 – Benchmark ML3; Waterfall
• 2013 – ML4/5 gap analysis
• 2014 – Baseline; Benchmark ML5; Agile Scrum
Using Value Stream Mapping (VSM) to kickstart integration and quality transformation
Leverage VSM:
• Build common understanding of current state
• Use metrics to identify areas of opportunity
• Capture ideas & apply criteria
• Analyze ideas
Performing work management activities:
True stories of self-destruction and redemption
(with signals of each)
Two Groups – Different Outcomes
2010: Initial baselines for Group 1 and Group 2
Group 1 – too busy to improve; growth was their focus
• 2013: Group 1 repeat baseline
• 2015: Group 1 repeat baseline
Group 2 – desired a culture of quality, in preparation for growth
• 2011: Group 2 ML2
• 2013: Group 2 ML3
• 2016: Group 2 ML5
They like me, they really like me! Samples of
employee engagement through CMMI
Increase Employee Engagement
You like me, you really like me!
Data – it’s what’s for dinner: Early wins with DMM
introduction
Concepts Made Consumable
Linking Six Sigma: Clarifying the “what” and the “how”
“We already have Six Sigma”
CMMI (what)
Six Sigma (how)
CMMI in healthcare? Only if quality in a
healthcare system is important
Healthcare CMMI Synergy
Operating with regulatory compliance (e.g., FDA, CE, etc.)
Before CMMI:
• Large, compliant medical device business
• Global presence
Changes through CMMI:
• On-time completion increased by 60%
• Met 100% of critical milestones (post-ML, 5 yrs and counting)
• 50% reduction in support FTEs required (priority defects reduced to 0%; defects reduced by >40%)
• Disciplined framework for adoption of new tools and new development methodology while retaining full regulatory compliance
• Audits became “non-events”
Would I work here? {Awkward silence} to an
award-winning culture
Shifting Organizational Culture Through CMMI
Sample Business Unit Outcomes
• 100% deliverable targets met, inclusive of a 10% reduction in FTEs
• 82% on-time completion improvement in first year
• Annual roadmap forecasting within 5% across geographically distributed agile Kanban teams
• 17 billion transactions with 99.999% uptime
• Reallocation of 23,000 hrs to strategic project pipeline due to reduced rework
• 100% critical milestones met every year for the last four years
• Project budget variance reduced by 8 percentage points (shrunk standard deviation)
• Post-release maintenance costs reduced from 108% to 12%
Two Harbors Consulting
Experience, Integrity, Creativity
Becky Fitzgerald: [email protected]
Tom Klein: [email protected]
Jim Shaver: [email protected]
George Zack: [email protected]
Practical Measurements
for Greater Visibility
Satish Kumar Tumu
Concept QA Labs
SCAMPI Lead Appraiser
Organization Overview
Concept QA Labs is a CMMI partner for providing:
• CMMI SCAMPI Appraisal Services
• CMMI Introduction Trainings
11 Years of operation with offices in USA, India & Singapore:
• Spanning 23 countries
• 150+ SCAMPI A appraisals
• 100+ CMM Trainings
Concept is also an ISO certification body
• Certification audits: ISO 9001, ISO 27001 & ISO 20000
Concept develops and manages Project & Process Management
tools:
• Provision, PAL & Time Track
Business Challenge
What are good measurements to be adopted?
Who is responsible?
How to make sure that measurements add value?
What effort is needed?
How many measurements to capture and analyze?
Do we need tools for measurements?
How much budget is needed for measurements?
Attributes of a good measure:
• Simple to measure
• Easy to analyze
• Robust to use
• Must be related to our work, product or process
• Must be accurate
• Shall be granular
Plan
Plan shall be formal and shall specify:
• Why we need measures?
• What to measure?
• How and when to capture?
• How and when to analyze?
• Whom to communicate results to?
Plan shall be at all three levels:
• Business level
• Process level (Organization)
• Project level
Hierarchy of Measurements:
[Diagram: base measurements collected by the project team feed project metrics and quality metrics; these roll up into process metrics for the process team and business metrics for business managers.]
Project Measurements
Provide visibility into:
• Progress of tasks
• Budget consumed
• Quality being built
Simple & Robust:
• Planned tasks vs. actual
• Planned effort vs. actual
• Actual performance/quality vs. plan
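A minimal sketch of these planned-vs-actual checks; all numbers are illustrative:

# Simple planned-vs-actual project indicators. Numbers are illustrative.
planned_tasks, actual_tasks = 40, 34       # tasks planned vs. completed to date
planned_effort, actual_effort = 800, 905   # person-hours

task_completion = actual_tasks / planned_tasks                       # 85% done
effort_variance = (actual_effort - planned_effort) / planned_effort  # +13% over

print(f"Task completion: {task_completion:.0%}, "
      f"effort variance: {effort_variance:+.0%}")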
Process Measurements
Provide visibility into:
• Cost of process
• Performance of process
Simple & Robust:
• Effort / cost consumed
• Performance of process
• Is the intent of process met
• Are desired results achieved
• Leading indicators
• People, process, technology & inputs measures
Business Measurements
Provide visibility into:
• Planned vs. actual business results
Simple & Robust:
• Growth
• Profit
• On-time, Within Budget & Quality Measures
Summary: Simple and Practical Measurements
Shall assist to manage project success:
• Complete on time
• Complete within Budget
• Complete with quality
Shall assist to gauge the process effectiveness and efficiency:
• Cost (effort) of process
• Effectiveness and efficiency of process
Shall be able to gauge the business performance and results:
• Net profit
• Increase in business and market share
• Timeliness and deliverable quality
• Customer satisfaction
The Results: Improving Visibility
Project, Product, Process and Business Management:
• Qualitative discussions: approximate, error-prone, and lacking visibility
• Shall be blended with measurements: accurate, factual, better visibility
• Communicate the progress with measurements
“Blend Status/Progress reports with Measurement
reports for better Practical Usage of Measurements”
Lessons Learned
• A project or organization with separate status reports and
metrics reports has misunderstood the usage of measurements
• Measurement is everyone’s responsibility
• A measure not collected at the right time is a measure lost
forever
• Do not collect measurements for the sake of models and
standards
• Tools cannot generate measurements; they can help ease
the collation and analysis
• Without a proper understanding of basic measurement
concepts, the usage of measurements is lost or minimized