Tutorial: Software Cost Estimation with COCOMO 2.0

Transcription

University of Southern California
Center for Software Engineering

Tutorial:
Software Cost Estimation with
COCOMO 2.0

Barry Boehm, USC
COCOMO/SCM Forum 1996
October 9, 1996
Outline
+ Steps in Software Cost Estimation
- 10 years of TRW experience
+ Integrated Estimation, Measurement, and Management
- Getting to CMM levels 4 and 5
- Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
EXAMPLE SOFTWARE PROJECT COST HISTORY

[Chart (Devenny, 1976): project cost vs. number of months after contract award]
Steps in Software Cost Estimation
+ Establish Objectives
- Rough sizing
- Make-or-Buy
- Detailed Planning
+ Allocate Enough Time, Dollars, Talent
+ Pin Down Software Requirements
- Documents Definition, Assumptions
+ Work out as much detail as Objectives permit
+ Use several independent techniques + sources
- Top-Down vs. Bottom-Up
- Algorithm vs. Expert Judgement
+ Compare and iterate estimates
- Pin down and resolve inconsistencies
- Be Conservative
+ Follow Up
Cost Estimating Objectives
+ Relative vs. Absolute estimates
- Selection vs. Resource Allocation Decisions
+ Balanced Estimates
- Degree of uncertainty vs. Size
+ Generous vs. Conservative estimates
+ Key objectives to decision needs
+ Re-examine and iterate objectives as appropriate
SOFTWARE ESTIMATION ACCURACY VS. PHASE

[Chart: relative cost range of size (DSI) and cost ($) estimates, narrowing from 4x/0.25x at Feasibility through 2x, 1.5x, and 1.25x across the phases Plans and Rqts., Product Design, Detail Design, and Devel. and Test]
Table 21-1 Software Cost-Estimating Miniproject Plan
1. Purpose. Why is the estimate being made?
2. Products and Schedules. What is going to be furnished by when?
3. Responsibilities. Who is responsible for each product? Where are they going to do the job (organizationally? geographically?).
4. Procedures. How is the job going to be done? Which cost-estimation tools and techniques will be used (see Chapter 22)?
5. Required Resources. How much data, time, money, effort, etc. is needed to do the job?
6. Assumptions. Under what conditions are we promising to deliver the above estimates, given the above resources (availability of key personnel, computer time, user data)?
Software Cost Estimation Plan:
Zenith Rapid Transit System
+ Purpose.
  To help determine the feasibility of a computer-controlled rapid-transit system for the Zenith metropolitan area.
+ Products and Schedules.
  2/1/97   Cost Estimation Plan
  2/15/97  First cost model run
  2/22/97  Definitive cost model run; Expert estimates complete
  2/29/97  Final cost estimate report, incorporating model/expert iterations. Accuracy to within factor of 2.
+ Responsibilities.
  Cost estimation study:   Z.B. Zimmerman
  Cost model support:      Applications Software Department
  Expert estimators (2):   Systems Analysis Department
Software Cost Estimation Plan:
Zenith Rapid Transit System (Contd.)
+ Procedures. Project will use SOFCOST model, with
sensitivity analysis on high-leverage cost driver attributes.
Experts will contact BART personnel in San Francisco and
Metro personnel in Washington, D.C. for comparative data.
+ Required Resources.
Z.B. Zimmerman: 2 man-weeks
Expert estimators: 3 man-days each
Computer: $200
+ Assumptions.
No major changes to system specification dated 15
January 1997. Authors of specification available to
answer sizing questions.
WHAT DOES A SOFTWARE PRODUCT DO? WHAT DOES A SOFTWARE PROJECT DO?

[Chart: summary of two teams developing a product to the same basic specification (small, interactive software cost model), including meeting and miscellaneous activities]

WHAT DO PROGRAMMERS DO?

[Chart: Bell Labs time and motion study]
Strengths and Weaknesses of Software Cost-Estimation Methods

Method             Strengths                                   Weaknesses
Algorithmic model  Objective, repeatable, analyzable formula;  Subjective inputs; Assessment of
                   Efficient, good for sensitivity analysis;   exceptional circumstances;
                   Objectively calibrated to experience        Calibrated to past, not future
Expert judgment    Assessment of representativeness,           No better than participants;
                   interactions, exceptional circumstances     Biases, incomplete recall
Analogy            Based on representative experience          Representativeness of experience
Parkinson          Correlates with some experience             Reinforces poor practice
Price to win       Often wins                                  Generally produces large overruns
Top-down           System level focus; Efficient               Less detailed basis; Less stable
Bottom-up          More detailed basis; More stable;           May overlook system level costs;
                   Fosters individual commitment               Requires more effort
Comparing and Iterating Estimates
+ Complementarity of techniques
+ Missing items
+ The "tall pole in the tent"
- Pareto (80-20) principle
+ The optimist/pessimist phenomenon
SOFTWARE COST ESTIMATION ACCURACY VS. PHASE

[Chart: relative cost range narrowing from 4x to 1.25x across the milestones Concept of Operation, Rqts. Spec., Product Design Spec., Detail Design Spec., and Accepted Software, over the phases Feasibility, Plans and Rqts., Product Design, Detail Design, and Devel. and Test. Example sources of uncertainty: classes of people and data sources to support (man-machine interface software); response times; internal data structure and buffer handling techniques; detailed scheduling algorithms and error handling; programmer understanding of spec]
Judging the Soundness of Cost Estimates: Some Review Questions
+ How would the estimated cost change if:
- There was a big change in the ____ interface?
- We had to supply a more extensive ____ capability?
- Our workload estimates were off by 50%?
- We had to do the job on a ____ hardware configuration?
+ Suppose our budget was 20% (less, greater). How would this affect the product?
+ How does the cost of this item break down by component?
Judging the Optimism of Cost Estimates: Some Review Questions
+ How do the following compare with previous experience:
- Sizing of subsystems?
- Cost driver ratings?
- Cost per instruction?
- Assumptions about hardware? GFE? Subcontractors? Vendor software?
+ Have you done a cost-risk analysis?
- What are the big-risk items?
+ Suppose we had enough money to really do this job right. How much would we need?
Software Cost Estimation: Need for
+ Estimating inputs are imperfect
+ Estimating techniques are imperfect
+ Project may not fit model
- Incremental development
+ Software projects are volatile
+ Software is an evolving field
- COTS, objects, cyberspace, new processes
Management-Relevant, Hypothesis-Driven Data: TRW Data Collection Policy
+ Identify major project milestones
+ At each milestone:
- Re-do cost model inputs
- Re-run cost model
- Collect actuals corresponding to cost model outputs
- Compare results
Outline
+ Steps in Software Cost Estimation
- 10 years of TRW experience
+ Integrated Estimation, Measurement, and Management
- Getting to CMM levels 4 and 5
- Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
Key Process Areas by Level
Level 4 (Managed)
+ Process Measurement and Analysis
- Statistical process control
- Cost/schedule/quality tradeoffs understood
- Causes of variation identified and controlled
+ Quality Management
- Measurable quality goals and priorities
- Quality goals drive product and process plans
- Measurements used to manage project
Key Process Areas by Level
Level 5 (Optimizing)
+ Defect Prevention
- Defect sources identified and eliminated
+ Technology Innovation
- Processes for technology tracking, evaluation,
assimilation
- Assimilation tied to process quality and productivity
improvements
+ Process Change Management
- Commitment to process improvement goals
- Evidence of measured improvements
- Improvements propagate across the organization
Role of COCOMO 2.0
+ Supports modern application of TRW data
collection policy
- Objects, GUI builders, reuse, new
processes
- COTS integration, quality estimation
underway
+ Use data to recalibrate model
+ Use model to optimize new technology
investments
COCOMO 2.0 Long Term Vision

[Flowchart: System objectives (fcn'y, perf., quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, which produces cost, schedule, and risk estimates. If not OK, rescope; if OK, execute project to next milestone. Milestone results (cost, schedule, quality drivers, resources, milestone expectations) lead to revising milestones, plans, and resources. Accumulated COCOMO 2.0 calibration data are used to recalibrate COCOMO 2.0 (improved parameters) and to evaluate corporate SW improvement strategies.]
Outline
+ Steps in Software Cost Estimation
- 10 years of TRW experience
+ Integrated Estimation, Measurement, and
Management
- Getting to CMM levels 4 and 5
- Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
COCOMO Baseline Overview I

[Diagram: COCOMO takes as inputs a software product size estimate; software product, process, computer and personnel attributes; software reuse, maintenance, and increment parameters; and the software organization's project data. It produces software development and maintenance cost and schedule estimates; cost and schedule distribution by phase, activity, and increment; and COCOMO recalibrated to the organization's data.]
COCOMO Baseline Overview II
+ Open interfaces and internals
- Published in Software Engineering Economics, Boehm, 1981
+ Numerous implementations, calibrations,
extensions
- Incremental development, Ada, new
environment technology
- Arguably the most frequently-used
software cost model worldwide.
Partial List of COCOMO Packages
- From STSC Report, Mar. 1993
+ CB COCOMO
+ GECOMO Plus
+ COCOMOID
+ COCOMO1
+ CoCoPro
+ COSTAR
+ COSTMODL
+ GHL COCOMO
+ REVIC
+ SECOMO
+ SWAN
University of Southern California
Center for Software Engineering
COCOMO 2.0 Objectives
+ Retain COCOMO internal, external openness
+ Develop a 1990-2000's software cost model
-Addressing full range of market sectors
- Addressing new processes and practices
+ Develop database, tool support for continuous
model improvement
+ Develop analytic framework for economic analysis
- E.g., software product line return-on-investment
analysis
+ Integrate with process, architecture research
The Future of the Software Practices Marketplace

[Diagram: User programming (55M performers in US in year 2005); Application generators (0.6M); Application composition (0.7M); Infrastructure (0.75M); System integration (0.7M)]
COCOMO 2.0 Coverage of Future SW Practices Sectors
+ User Programming: No need for cost model
+ Applications Composition: Use object counts or object points
- Count (weight) screens, reports, 3GL routines
+ System Integration; development of applications generators and infrastructure software
- Prototyping: Applications composition model
- Early design: Function Points and 7 cost drivers
- Post-architecture: Source Statements or Function Points and 17 cost drivers
- Stronger reuse/reengineering model
Applications Composition
+ Challenge:
- Modeling rapid applications composition with graphical user interface (GUI) builders, client-server support, object libraries, etc.
+ Response:
- Object-Point sizing and costing model
- Reporting estimate ranges rather than point estimates
Step 1: Assess Object-Counts: estimate the number of screens, reports, and 3GL components that will comprise this application. Assume the standard definitions of these objects in your ICASE environment.

Step 2: Classify each object instance into simple, medium and difficult complexity levels depending on values of characteristic dimensions. Use the following scheme:

For Screens
                       Number and source of data tables
Number of Views   Total < 4            Total < 8             Total 8+
Contained         (<2 srvr, <3 clnt)   (2-3 srvr, 3-5 clnt)  (>3 srvr, >5 clnt)
< 3               simple               simple                medium
3 - 7             simple               medium                difficult
8+                medium               difficult             difficult

For Reports
                       Number and source of data tables
Number of         Total < 4            Total < 8             Total 8+
Sections          (<2 srvr, <3 clnt)   (2-3 srvr, 3-5 clnt)  (>3 srvr, >5 clnt)
Contained
0 or 1            simple               simple                medium
2 or 3            simple               medium                difficult
4+                medium               difficult             difficult

Step 3: Weight the number in each cell using the following scheme. The weights reflect the relative effort required to implement an instance of that complexity level.

Object Type       Simple   Medium   Difficult
Screen            1        2        3
Report            2        5        8
3GL Component     -        -        10

Step 4: Determine Object-Points: add all the weighted object instances to get one number, the Object-Point count.

Step 5: Estimate percentage of reuse you expect to be achieved in this project. Compute the New Object Points to be developed, NOP = (Object-Points)(100 - %reuse)/100.

Step 6: Determine a productivity rate, PROD = NOP/person-month, from the following scheme:

Developer's experience and capability   Very Low   Low   Nominal   High   Very High
ICASE maturity and capability           Very Low   Low   Nominal   High   Very High
PROD                                    4          7     13        25     50

Step 7: Compute the estimated person-months: PM = NOP/PROD.
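As an illustrative sketch (not an official COCOMO 2.0 tool), the seven steps above can be collapsed into a small calculation. The weights and productivity rates are those from the tables; the example object mix is hypothetical.

```python
# Sketch of the Object-Point estimation procedure (Steps 1-7).

# Step 3 weights: relative effort per object instance, by type and complexity.
WEIGHTS = {
    "screen": {"simple": 1, "medium": 2, "difficult": 3},
    "report": {"simple": 2, "medium": 5, "difficult": 8},
    "3gl":    {"difficult": 10},   # 3GL components are always rated difficult
}

# Step 6 productivity rates (NOP per person-month) for the composite
# rating of developer capability and ICASE maturity.
PROD = {"very low": 4, "low": 7, "nominal": 13, "high": 25, "very high": 50}

def object_point_pm(objects, pct_reuse, rating):
    """objects: list of (type, complexity) pairs for the classified
    screens, reports, and 3GL components (Steps 1-2)."""
    op = sum(WEIGHTS[t][c] for t, c in objects)   # Step 4: Object-Point count
    nop = op * (100 - pct_reuse) / 100            # Step 5: New Object Points
    return nop / PROD[rating]                     # Step 7: person-months

# Hypothetical application: 10 simple screens, 4 medium reports,
# 2 difficult 3GL components; 20% reuse; nominal productivity.
objs = ([("screen", "simple")] * 10 + [("report", "medium")] * 4
        + [("3gl", "difficult")] * 2)
print(round(object_point_pm(objs, 20, "nominal"), 2))
```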
SW Costing and Sizing Accuracy vs. Phase

[Chart: relative size range for size (DSI) and cost ($) estimates, with data points from completed programs and USAF/ESD proposals, narrowing to 1.25x across the milestones Concept of Operation, Rqts. Spec., Product Design Spec., Detail Design Spec., and Accepted Software, over the phases Feasibility, Plans and Rqts., Product Design, Detail Design, and Devel. and Test]
New Processes
+ Challenges
- Cost and schedule estimation of composable mixes of prototyping, spiral, evolutionary, incremental development, other models
+ Responses
- New milestone definitions
  >> Basic requirements consensus, life-cycle architecture
- Replace development modes by exponent drivers
- Post-Initial Operational Capability (IOC) evolution
  >> Drop maintenance/reuse distinction
Process/Anchor Point Examples

[Table: anchor points (LCO, LCA, IOC) for example process models: Rapid App. Devel. (spiral-type); Ev. Dev., Spiral (spiral-type); Waterfall; W'fall, IncDev, EvDev, Spiral, Design-to-Cost, etc.]
Life-Cycle Architecture as Critical Milestone

Candidate Critical Milestones          Concerns
Complete Requirements Specifications   Premature decisions, e.g. user interface features;
                                       Inflexible point solutions; Gold plating;
                                       High-risk downstream functions
Beta-Test Code                         Inflexible point solutions;
                                       Capabilities too far from user needs
Life-Cycle Architecture                Provides flexibility to address concerns above.
                                       Further concerns: Predicting directions of growth,
                                       evolution; High-risk architectural elements
Early Design and Post-Arch Model:
+ Effort: PM = A*(Size)**B * (product of EMs)
+ Size
- KSLOC (Thousands of Source Lines of Code)
- UFP (Unadjusted Function Points)
- EKSLOC (Equivalent KSLOC) used for adaptation
+ SF: Scale Factors (5)
+ EM: Effort Multipliers (7 for ED, 17 for PA)
IMPORTANCE OF CLEAR DEFINITIONS
SCOPE
  Include Conversion? Training? Documentation?
ENDPOINTS
  Include requirements analysis? Full validation?
INSTRUCTIONS
  Source? Object? Executable? Comments?
PRODUCT
  Application? Support S/W? Test Drivers?
"DEVELOPMENT"
  Include reused software? Furnished Utility S/W?
"MAN MONTHS"
  Include Clerical and Support Personnel? Vacation?

These can bias results by factors of 2-10
Unadjusted Function Points
Step 1: Determine function counts by type. The unadjusted function counts should be counted by a lead technical person based on information in the software requirements and design documents. The number of each of the five user function types should be counted (Internal Logical File (ILF), External Interface File (EIF), External Input (EI), External Output (EO), and External Inquiry (EQ)).

Step 2: Determine complexity-level function counts. Classify each function count into Low, Average, and High complexity levels depending on the number of data element types contained and the number of file types referenced. Use the following scheme:

For ILF and EIF
Record Elements    Data Elements: 1-19   20-50     51+
1                  Low                   Low       Average
2 - 5              Low                   Average   High
6+                 Average               High      High

For EO and EQ
File Types         Data Elements: 1-5    6-19      20+
0 or 1             Low                   Low       Average
2 - 3              Low                   Average   High
4+                 Average               High      High

For EI
File Types         Data Elements: 1-4    5-15      16+
0 or 1             Low                   Low       Average
2                  Low                   Average   High
3+                 Average               High      High

Step 3: Apply complexity weights. Weight the number in each cell using the following scheme. The weights reflect the relative value of the function to the user.

Function Type              Low   Average   High
Internal Logical Files     7     10        15
External Interface Files   5     7         10
External Inputs            3     4         6
External Outputs           4     5         7
External Inquiries         3     4         6

Step 4: Compute Unadjusted Function Points. Add all the weighted function counts to get one number, the Unadjusted Function Points.
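A minimal sketch of Steps 1-4, using the complexity weights from the table above; the function counts in the example are hypothetical.

```python
# Sketch of the Unadjusted Function Point count (Steps 1-4).

WEIGHTS = {                # (Low, Average, High) weights per function type
    "ILF": (7, 10, 15),    # Internal Logical Files
    "EIF": (5, 7, 10),     # External Interface Files
    "EI":  (3, 4, 6),      # External Inputs
    "EO":  (4, 5, 7),      # External Outputs
    "EQ":  (3, 4, 6),      # External Inquiries
}

def unadjusted_fp(counts):
    """counts: {function type: (n_low, n_average, n_high)} from Step 2."""
    return sum(n * WEIGHTS[ftype][lvl]
               for ftype, by_level in counts.items()
               for lvl, n in enumerate(by_level))

# Hypothetical system: 3 low-complexity ILFs, 2 average EIs, 1 high EO.
counts = {"ILF": (3, 0, 0), "EIF": (0, 0, 0),
          "EI": (0, 2, 0), "EO": (0, 0, 1), "EQ": (0, 0, 0)}
print(unadjusted_fp(counts))   # 3*7 + 2*4 + 1*7 = 36
```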
Source Lines of Code:

Measurement Unit: Physical source lines

Statement Type (when a line or statement contains more than one type, classify it as the type with the highest precedence; order of precedence):
1. Executable
2. Nonexecutable:
3.   Declarations
4.   Compiler directives
5.   Comments:
6.     On their own lines
7.     On lines with source code
8.     Banners and nonblank spacers
9.     Blank (empty) comments
10.    Blank lines

How Produced:
1. Programmed
2. Generated with source code generators
3. Converted with automated translators
4. Copied or reused without change
5. Modified

Origin:
1. New work: no prior existence
2. Prior work: taken or adapted from:
3.   A previous version, build, or release
4.   Commercial, off-the-shelf software (COTS), other than libraries
5.   Government furnished software (GFS), other than reuse libraries
6.   Another product
7.   A vendor-supplied language support library (unmodified)
8.   A vendor-supplied operating system or utility (unmodified)
9.   A local or modified language support library or operating system
10.  Other commercial library
11.  A reuse library (software designed for reuse)
12.  Other software component or library

[In the full checklist, each category above carries a mark indicating whether it is included in or excluded from the COCOMO line count.]
Size Adjustment
+ Breakage
+ Nonlinear effects of reuse
    AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
    ESLOC(adjusted) = ASLOC [AA + AAF(1 + .02(SU)(UNFM))] / 100 , AAF <= .5
    ESLOC(adjusted) = ASLOC [AA + AAF + (SU)(UNFM)] / 100 , AAF > .5
+ Automatic translation of code
    PM(estimated) = ASLOC (AT/100) / ATPROD
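The automatic-translation term can be sketched as a one-line calculation: AT is the percentage of the adapted code handled by automatic translation, and ATPROD the translation productivity in SLOC per person-month. The numeric values below are hypothetical illustrations, not calibrated figures.

```python
# Sketch of the automatic-translation effort term:
# PM = ASLOC * (AT/100) / ATPROD

def auto_translation_pm(asloc, at_pct, atprod):
    """Person-months for the automatically translated portion of the
    adapted code (asloc in SLOC, at_pct in percent, atprod in SLOC/PM)."""
    return asloc * (at_pct / 100) / atprod

# Hypothetical: 50,000 adapted SLOC, 30% auto-translated,
# at a translation productivity of 2400 SLOC per person-month.
print(round(auto_translation_pm(50_000, 30, 2400), 2))
```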
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
Project Scale Factors (for estimated PM):

Factor  Very Low          Low               Nominal           High              Very High         Extra High
PREC    thoroughly        largely           somewhat          generally         largely           thoroughly
        unprecedented     unprecedented     unprecedented     familiar          familiar          familiar
FLEX    rigorous          occasional        some              general           some              general
                          relaxation        relaxation        conformity        conformity        goals
RESL    little (20%)      some (40%)        often (60%)       generally (75%)   mostly (90%)      full (100%)
TEAM    very difficult    some difficult    basically         largely           highly            seamless
        interactions      interactions      cooperative       cooperative       cooperative       interactions
                                            interactions
PMAT    weighted sum of KPA compliance levels
Nature of Life-Cycle Architecture
+ Definition of life-cycle requirements envelope
+ Definition of software components and relationships
- physical and logical
- data, control, timing relationships
- shared assumptions
- developer, operator, user, manager views
+ Evidence that architecture will accommodate requirements envelope
+ Resolution of all life-cycle-critical COTS, reuse, and risk issues
New Scaling Exponent Approach
+ Nominal person-months = A*(size)**B
+ B = 1.01 + 0.01 * (sum of exponent driver ratings)
- B ranges from 1.01 to 1.26
- 5 drivers; ratings from 0 to 5
+ Exponent drivers:
- Precedentedness
- Development flexibility
- Architecture/risk resolution
- Team cohesion
- Process maturity (being derived from SEI CMM)
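The exponent computation above is simple enough to sketch directly; the coefficient A and the example ratings below are hypothetical, and the convention assumed here is that a rating of 0 is best and 5 is worst for each driver.

```python
# Sketch of the COCOMO 2.0 scaling-exponent approach:
# B = 1.01 + 0.01 * sum(driver ratings); nominal PM = A * size**B.

def scale_exponent(ratings):
    """ratings: five driver values (PREC, FLEX, RESL, TEAM, PMAT), each 0-5."""
    assert len(ratings) == 5 and all(0 <= r <= 5 for r in ratings)
    return 1.01 + 0.01 * sum(ratings)

def nominal_pm(a, ksloc, ratings):
    """Nominal person-months before effort-multiplier adjustment."""
    return a * ksloc ** scale_exponent(ratings)

# All drivers at their worst (5) give the maximum exponent, 1.26;
# all at their best (0) give the minimum, 1.01.
print(round(scale_exponent([5, 5, 5, 5, 5]), 2))
print(round(scale_exponent([0, 0, 0, 0, 0]), 2))
```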
Reuse and Product Line Management
+ Challenges
- Estimate costs of both reusing software and developing software for future reuse
- Estimate extra effects on schedule (if any)
+ Responses
- New nonlinear reuse model
- Cost of developing reusable software expanded from Ada COCOMO
- Gathering schedule data
Nonlinear Reuse Effects

[Chart: relative cost (0 to 1) vs. amount of software modified, from data on 2954 NASA modules; the observed cost curve lies above the usual linear assumption]
Reuse and Reengineering Effects
+ Add Assessment & Assimilation increment (AA)
- Similar to conversion planning increment
+ Add Software Understanding increment (SU)
- To cover nonlinear software understanding effects
- Apply only if reused software is modified
+ Results in revised Adaptation Adjustment Factor (AAF)
- Equivalent DSI = (Adapted DSI) * (AAF/100)
- AAF = AA + SU + 0.4(DM) + 0.3(CM) + 0.3(IM)
Prototype of Software Understanding Rating and Increment

                 Very Low          Low               Nominal           High              Very High
Structure        Very low          Moderately low    Reasonably        High cohesion,    Strong modularity,
                 cohesion, high    cohesion, high    well-structured;  low coupling.     information hiding
                 coupling,         coupling.         some weak areas.                    in data/control
                 spaghetti code.                                                         structures.
Application      No match between  Some correlation  Moderate          Good correlation  Clear match between
Clarity          program and       between program   correlation       between program   program and
                 application       and application.  between program   and application.  application world
                 world views.                        and application.                    views.
Self-            Obscure code;     Some code         Moderate level    Good code         Self-descriptive
Descriptiveness  documentation     commentary and    of code           commentary and    code; documentation
                 missing, obscure  headers; some     commentary,       headers; useful   up-to-date,
                 or obsolete.      useful            headers,          documentation;    well-organized,
                                   documentation.    documentation.    some weak areas.  with design
                                                                                         rationale.
SU Increment     50                40                30                20                10
to ESLOC
Proposed Reuse / Maintenance Model (Change <= 20%)

AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
ESLOC = ASLOC [AA + AAF(1 + .02(SU)(UNFM))] / 100 , AAF <= 0.5
ESLOC = ASLOC [AA + AAF + (SU)(UNFM)] / 100 , AAF > 0.5

UNFM: Unfamiliarity with Software modifier
0.0 - Completely familiar
0.2 - Mostly familiar
0.4 - Somewhat unfamiliar
0.6 - Considerably unfamiliar
0.8 - Mostly unfamiliar
1.0 - Completely unfamiliar
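A sketch of the reuse sizing computation, assuming DM, CM, IM, AA, and SU are expressed in percent (so the AAF <= 0.5 breakpoint corresponds to 50 on that scale). All example values are hypothetical.

```python
# Sketch of the reuse/maintenance sizing model: DM, CM, IM are the
# percentages of design, code, and integration modified; AA is the
# assessment-and-assimilation increment, SU the software understanding
# increment (both in percent); UNFM is the 0.0-1.0 unfamiliarity modifier.

def reuse_esloc(asloc, dm, cm, im, aa, su, unfm):
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im        # Adaptation Adjustment Factor, %
    if aaf <= 50:                                # i.e. AAF <= 0.5 as a fraction
        factor = aa + aaf * (1 + 0.02 * su * unfm)
    else:
        factor = aa + aaf + su * unfm
    return asloc * factor / 100

# Hypothetical: 10,000 adapted SLOC; 10% design, 20% code, 20% integration
# modified; AA = 4; SU = 30; modifier somewhat unfamiliar (UNFM = 0.4).
print(round(reuse_esloc(10_000, 10, 20, 20, 4, 30, 0.4), 1))
```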
Maintenance Model (Change > 20%)

Obtained in one of two ways depending on what is known:
ESLOC = (BaseCodeSize)(MCF)(MAF)
ESLOC = (SizeAdded + SizeModified)(MAF)
where
MCF = (SizeAdded + SizeModified) / BaseCodeSize
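The two forms above are algebraically equivalent; a small sketch makes that concrete. MAF is the Maintenance Adjustment Factor, and the example sizes (in KSLOC) are hypothetical.

```python
# Sketch of the maintenance sizing model for change > 20%.

def mcf(size_added, size_modified, base_code_size):
    """Maintenance Change Factor: fraction of the base code changed."""
    return (size_added + size_modified) / base_code_size

def esloc_from_base(base_code_size, mcf_value, maf):
    """Use this form when the base size and change fraction are known."""
    return base_code_size * mcf_value * maf

def esloc_from_deltas(size_added, size_modified, maf):
    """Use this form when added/modified sizes are known directly."""
    return (size_added + size_modified) * maf

# Hypothetical: 100 KSLOC base, 15 KSLOC added, 10 KSLOC modified, MAF = 1.2.
m = mcf(15, 10, 100)                     # 25% change, so the >20% model applies
print(round(esloc_from_base(100, m, 1.2), 1))
print(round(esloc_from_deltas(15, 10, 1.2), 1))
```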
Legacy Software Reengineering
+ Challenge:
- Varying reengineering efficiencies based on tool applicability, source-target system differences
+ Response:
- Exploratory model based on NIST IRS data
NIST Reengineering Case Study [Ruhl-Gunn, 1991]
+ IRS Centralized Scheduling Program
+ Mix of batch, database query/update, on-line processing
+ 50 KSLOC of COBOL74, 2.4 KSLOC of assembly code
+ Unisys 1100 with DMS 1100 network database
+ Reengineer to SQL database, open systems
+ Tools for program and data reengineering
NIST Reengineering Automation Data

[Table: share of reengineering performed automatically vs. manually, by program group: Batch; Batch with SORT; Batch with DBMS; Batch, SORT, DBMS; Interactive]
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
Other Major COCOMO 2.0 Changes
+ Range versus point estimates
+ Requirements Volatility replaced by Breakage %
+ Multiplicative cost driver changes
- Product CD's
- Platform CD's
- Personnel CD's
- Project CD's
+ Maintenance model and reuse model identical
Post-Architecture EMs - Product:

Driver                Very Low         Low              Nominal           High              Very High        Extra High
Required Reliability  slight           low, easily      moderate, easily  high financial    risk to
(RELY)                inconvenience    recoverable      recoverable       loss              human life
                                       losses           losses
Database Size                          DB bytes/        10 <= D/P < 100   100 <= D/P        D/P >= 1000
(DATA)                                 Pgm SLOC < 10                      < 1000
Complexity (CPLX)     see Complexity Table
Required Reuse                         none             across project    across program    across product   across multiple
(RUSE)                                                                                      line             product lines
Documentation Match   Many life-cycle  Some life-cycle  Right-sized to    Excessive for     Very excessive
to Life-Cycle Needs   needs uncovered  needs uncovered  life-cycle needs  life-cycle needs  for life-cycle
(DOCU)                                                                                      needs
Post-Architecture Complexity (excerpts):

[Table excerpt from the complexity (CPLX) rating scale: Control operations: mostly simple nesting, some intermodule control, decision tables; simple callbacks or message passing, including middleware-supported distributed processing. Computational operations: use of standard math and statistical routines, basic matrix/vector operations. Device-dependent operations: I/O processing includes device selection, status checking and error processing. Data management operations: multi-file input and single-file output, simple structural changes, simple edits; complex COTS-DB queries, updates. User interface management: simple use of widget set.]
Post-Architecture EMs - Platform:

Driver             Low                 Nominal             High            Very High        Extra High
Execution Time                         <= 50% use of       70%             85%              95%
Constraint (TIME)                      available
                                       execution time
Main Storage                           <= 50% use of       70%             85%              95%
Constraint (STOR)                      available storage
Platform           major change every  major: 6 mo.;       major: 2 mo.;   major: 2 wk.;
Volatility (PVOL)  12 mo.; minor:      minor: 2 wk.        minor: 1 wk.    minor: 2 days
                   every 1 mo.
Post-Architecture EMs - Personnel:

Driver                 Very Low         Low              Nominal          High             Very High
Analyst Capability     15th percentile  35th percentile  55th percentile  75th percentile  90th percentile
(ACAP)
Programmer Capability  15th percentile  35th percentile  55th percentile  75th percentile  90th percentile
(PCAP)
Personnel Continuity   [rated by annual personnel turnover]
(PCON)
Application            <= 2 months      6 months         1 year           3 years          6 years
Experience (AEXP)
Platform Experience    <= 2 months      6 months         1 year           3 years          6 years
(PEXP)
Language and Tool      <= 2 months      6 months         1 year           3 years          6 years
Experience (LTEX)
Post-Architecture EMs - Project:

Driver             Very Low        Low              Nominal          High             Very High        Extra High
Use of Software    edit, code,     simple,          basic lifecycle  strong, mature   strong, mature,
Tools (TOOL)       debug           frontend,        tools,           lifecycle        proactive
                                   backend CASE,    moderately       tools,           lifecycle tools,
                                   little           integrated       moderately       well integrated
                                   integration                       integrated       with processes,
                                                                                      methods, reuse
Multisite          International   Multi-city and   Multi-city or    Same city or     Same building    Fully
Development:                       multi-company    multi-company    metro. area      or complex       collocated
Collocation (SITE)
Multisite          Some phone,     Individual       Narrowband       Wideband         Wideband elect.  Interactive
Development:       mail            phone, FAX       email            electronic       comm.,           multimedia
Communications                                                       communication    occasional
(SITE)                                                                                video conf.
Required           75% of nominal  85%              100%             130%             160%
Development
Schedule (SCED)
COCOMO Model Comparisons
(Columns: COCOMO 81 | Ada COCOMO | COCOMO 2.0: Application Composition | COCOMO 2.0: Early Design | COCOMO 2.0: Post-Architecture)

Size: Delivered Source Instructions (DSI) or Source Lines of Code (SLOC) | DSI or SLOC | Object Points | Function Points (FP) and Language | FP and Language or SLOC

Reuse: Equivalent SLOC = linear f(DM,CM,IM) | Equivalent SLOC = linear f(DM,CM,IM) | Implicit in model | Equivalent SLOC = nonlinear f(AA,SU,DM,CM,IM) | %unmodified reuse: SR; %modified reuse: nonlinear f(AA,SU,DM,CM,IM)

Breakage: Requirements Volatility rating (RVOL) | RVOL rating | Implicit in model | Breakage %: BRAK | BRAK

Maintenance: Annual Change Traffic (ACT) = %added + %modified | ACT | Object Point ACT | Reuse model | Reuse model

Scale (b) in MM = a(Size)^b: Organic: 1.05; Semidetached: 1.12; Embedded: 1.20 | Embedded: 1.04-1.24, depending on degree of: early risk elimination; solid architecture; stable requirements; Ada process maturity | 1.0 | 1.01-1.26, depending on degree of: precedentedness; conformity; early architecture, risk resolution; team cohesion; process maturity (SEI) | 1.01-1.26, depending on the same factors

Product Cost Drivers: RELY, DATA, CPLX | RELY*, DATA, CPLX*, RUSE | None | RCPX*†, RUSE*† | RELY, DATA, DOCU*†, CPLX*, RUSE*†

Platform Cost Drivers: TIME, STOR, VIRT, TURN | TIME, STOR, VMVH, VMVT, TURN | None | Platform difficulty: PDIF*† | TIME, STOR, PVOL (=VIRT)

Personnel Cost Drivers: ACAP, AEXP, PCAP, VEXP, LEXP | ACAP*, AEXP, PCAP*, VEXP, LEXP* | None | Personnel capability and experience: PERS*†, PREX*† | ACAP*†, AEXP*†, PCAP*†, PEXP*†, LTEX*†, PCON*†

Project Cost Drivers: MODP, TOOL, SCED | MODP*, TOOL*, SCED | None | SCED, FCIL*† | TOOL*†, SCED, SITE*†

* Different multipliers    † Different rating scale
Other Model Refinements:
+ Initial Schedule Estimation

  TDEV = [3.0 * (PM)^(0.33 + 0.2(B - 1.01))] * SCED% / 100

  where PM = estimated person-months excluding SCED (schedule) multiplier effects
+ Output Ranges

  Model                    Optimistic   Pessimistic
  Application Composition  0.50 E       2.0 E
  Early Design             0.67 E       1.5 E
  Post-Architecture        0.80 E       1.25 E

  - One standard deviation optimistic and pessimistic estimates
  - Reflect sources of uncertainty in model inputs
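Given a most likely estimate E, each model stage reports one-standard-deviation optimistic and pessimistic values using the multipliers from the table above; a minimal sketch, with a hypothetical most likely estimate:

```python
# Sketch of the output-range computation: (optimistic, pessimistic)
# multipliers applied to a most likely estimate E for each model stage.

RANGES = {
    "application composition": (0.50, 2.0),
    "early design":            (0.67, 1.5),
    "post-architecture":       (0.80, 1.25),
}

def estimate_range(e, stage):
    """Return (optimistic, most likely, pessimistic) estimates."""
    opt, pess = RANGES[stage]
    return (opt * e, e, pess * e)

# Hypothetical most likely estimate of 100 person-months:
print(estimate_range(100, "early design"))
```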
COCOMO 2.0 Model Overview
+ COCOMO Baseline Overview
+ COCOMO 2.0 Objectives
+ Coverage of Future Market Sectors
+ Hierarchical Sizing Model
+ Modes Replaced by Exponent Drivers
+ Stronger Reuse/Reengineering Model
+ Other Model Improvements
+ Activity Sequence
COCOMO 2.0 Activity Sequence
+ Cost model review
+ Hypothesis formulation (prepared, iterated)
+ Data definitions (prepared, iterated)
+ Monthly electronic data collection
  - Amadeus data collection support
+ Data archiving
+ Incremental model development and refinement
  - New USC COCOMO (baseline version prepared)
+ Related research
  - COTS integration cost/risk estimation
Data Collection
+ Quantitative collection
  - Project sizing
  - Software metrics
+ Qualitative collection
  - Scale Factors
  - Effort Multipliers
Amadeus System
+ Automated system for software
  - metric collection
  - metric analysis
  - metric reporting, graphing
  - metric prediction
+ Open architecture for extensibility
+ Supported metrics
Preliminary Analysis
+ Calibration to original COCOMO project database
  - Trend analysis is positive
  - Coefficient, A, calibrated
  - Correlation between Cost Drivers
+ Continuing analysis from other sources
Results of COCOMO 81 Database
Calibration
Outline
+ Steps in Software Cost Estimation
- 10 years of TRW experience
+ Integrated Estimation, Measurement, and
Management
- Getting to CMM levels 4 and 5
- Role of COCOMO 2.0
+ COCOMO 2.0 Model Overview
+ COCOMO 2.0 Status and Plans
+ Information Sources
COCOMO 2.0 Status and Plans
+ Current Status
- Affiliates' program
+ Future plans
- Public and commercial versions
- Model extensions
+ Long term vision
COCOMO 2.0 Current Status
+ Model structure iterated with Affiliates
  - Largely converged
  - Recent function point backfiring refinements being considered
+ Model being calibrated to Affiliates', other data
  - Currently around 65 solid data points
+ Model being beta-tested by Affiliates
  - Reasonable results so far
[Chart: error before regression; error after stratification: Std. Dev. = .37, Mean = .00, N = 65]
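The reported calibration statistics can be reproduced with a simple residual computation. This is a hypothetical formulation, since the slide does not define its error metric; here the residual is taken as the natural log of actual over estimated effort, so a mean near 0 with a standard deviation around .37 would match the slide's summary:

```python
import math

def error_stats(actuals, estimates):
    """Mean and (population) standard deviation of the log residuals
    ln(actual / estimated) across the calibration projects.
    The choice of log residuals is an assumption, not from the slide."""
    errs = [math.log(a / e) for a, e in zip(actuals, estimates)]
    n = len(errs)
    mean = sum(errs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in errs) / n)
    return mean, sd
```

A perfectly calibrated model would give a mean of 0; the standard deviation then measures the remaining scatter in the estimates.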
Current COCOMO 2.0 Affiliates (27)
+ Commercial Industry (9)
  - AT&T, Bellcore, CSC, EDS, Motorola, Rational, Sun, TI, Xerox
+ Aerospace Industry (10)
  - E-Systems, Hughes, Litton, Lockheed Martin, Loral, MDAC, Northrop Grumman, Rockwell, SAIC, TRW
+ Government (3)
  - AFCAA, USAF Rome Lab, US Army Research Labs
+ FFRDC's and Consortia (5)
  - Aerospace, IDA, MCC, SEI, SPC
COCOMO 2.0 Affiliates' Program
+ Affiliates, Research Sponsors (SEI, others) provide:
  - funding support
  - sanitized data
  - feedback on models, tools
  - special project definition, support
  - visiting researchers
+ COCOMO 2.0 Project (USC-CSE, USC-IOM, UCI-ICS) provides:
  - COCOMO 2.0 models, tools
  - Amadeus metrics package
  - Tutorials, reports, related research
  - Special project results
  - Graduating students
+ Metric, Process Definitions
+ COCOMO 2.0 Database (project data)
+ Workshops, joint projects
COCOMO 2.0: Public and Commercial Versions
+ Available once Affiliates' calibration and beta-testing stabilizes
  - Final beta-test version by COCOMO/SCM Forum
  - Full public version January 1997
+ Public version: USC COCOMO 2.0 tool
  - will do basic estimation and calibration functions
  - won't include extensive amenities, planning aids
+ Commercial COCOMO vendors being provided with pre-release COCOMO 2.0 parameters, source code
COCOMO 2.0 Plans: Model Extensions
+ COTS integration costs
+ Cost/schedule/quality tradeoffs
+ Activity distribution
+ Domain-tailored models
+ Specialized life-cycle stage estimation
  - Hardware-software integration, operational testing
+ Sizing improvements
+ Project dynamics modeling
+ Reuse, tool, process return-on-investment models
Long Term Vision
+ Produce model covering trade-offs among software cost, schedule, functionality, performance, quality
+ Enable the use of the model to scope projects and monitor project progress
+ Use the data from COCOMO-assisted management to calibrate improved versions of COCOMO
+ Use the data and improved COCOMO cost drivers to enable affiliates to formulate and manage software improvement investments
Information Sources
+ Anonymous FTP: usc.edu
  - in directory: /pub/soft_engineering/cocomo2
+ WWW: http://sunset.usc.edu/COCOMO2.0/Cocomo.html
  - Upcoming COCOMO events, Research Group
  - COCOMO 81 calculator
  - COCOMO 2.0 User's Manual