Caminao & CMMI (V1.3)

Scope

See The Pagoda Playbook for an Enterprise Architecture implementation of the CMMI.

Capability Maturity Model Integration (CMMI) is a process improvement approach that helps organizations improve their performance. CMMI can be used to guide process improvement across a project, a division, or an entire organization. CMMI is a trademark owned by the Software Engineering Institute of Carnegie Mellon University.

René Magritte’s Not to Be Reproduced (La Reproduction Interdite).

As such, CMMI provides a framework against which the proposed approach can be mapped.

For that purpose three areas (four with support) are considered:

  • Products: models and software components to be developed.
  • Projects: resources and deliverables to be managed.
  • Processes: roles, tasks, and workflows used by projects.

For those areas, the CMMI defines five maturity levels:

  1. Initial: No process. Each project is managed on an ad hoc basis.
  2. Managed: Processes are specific to projects.
  3. Defined: Processes are set for the whole organization and shared across projects.
  4. Quantitatively Managed: Processes are measured and controlled.
  5. Optimizing: Processes are assessed and improved.
Process Maturity Levels

Measure for Measure

Whatever the volume of data and the statistical tools employed, the relevance of process assessment fully depends on (1) objective and unbiased indicators measuring project performance, and (2) a transparent mapping between organizational alternatives and process outcomes. Unless those conditions are satisfied, maturity assessments will remain self-referencing.
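By way of illustration, here is a minimal sketch (in Python) of such an indicator: productivity measured as delivered function points per person-day, compared to an organizational baseline. The field names and figures are assumptions for illustration, not prescriptions.

```python
# Minimal sketch of an unbiased performance indicator: delivered function
# points per person-day, compared to an organizational baseline.
# Field names (fp_delivered, effort_pd) and values are illustrative assumptions.
from statistics import mean

projects = [
    {"name": "billing", "fp_delivered": 420, "effort_pd": 310},
    {"name": "portal",  "fp_delivered": 180, "effort_pd": 150},
]

def productivity(p):
    """Function points delivered per person-day for one project."""
    return p["fp_delivered"] / p["effort_pd"]

baseline = mean(productivity(p) for p in projects)
for p in projects:
    # Expressing the deviation as a ratio to the baseline keeps the
    # comparison independent of project size.
    print(p["name"], round(productivity(p) / baseline, 2))
```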

Self appraisal

If circular appraisals are to be avoided, the building blocks of development process assessment must be clearly defined for products, projects, and processes:

  • Traceability: from requirements to deliverables (products), work units (projects), and tasks (processes); see the sketch after this list.
  • Measurements: functional (products), effort (projects), performance (processes).
  • Quality: verification & validation (products), test plans and risk management (projects), quality assurance (processes).
  • Reuse: artifacts and patterns (products), profiles (projects), organization (processes).
  • Management: model driven engineering (products), planning and monitoring (projects), maturity assessment (processes).
Building Blocks of Development Process Assessment
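As a rough illustration of that differentiated traceability, the following sketch (an assumption for illustration, not part of CMMI) traces a requirement separately to product artifacts, project work units, and process tasks, so that each assessment dimension can be queried on its own:

```python
# Illustrative data structure for differentiated traceability: one
# requirement traced to products (artifacts), projects (work units),
# and processes (tasks). Identifiers are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    artifacts: list[str] = field(default_factory=list)   # products
    work_units: list[str] = field(default_factory=list)  # projects
    tasks: list[str] = field(default_factory=list)       # processes

req = Requirement(
    rid="REQ-042",
    artifacts=["use case UC-7", "class model CM-3"],
    work_units=["WU-12 (analysis)", "WU-18 (design)"],
    tasks=["elicitation", "functional design"],
)

# Each dimension can be assessed separately, e.g. product coverage is the
# share of requirements with at least one traced artifact.
print(bool(req.artifacts), bool(req.work_units), bool(req.tasks))
```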

The objective is to consolidate those criteria with CMMI’s process areas.

Software Engineering Process Areas

Process assessment is based upon a number of process areas for which measurements and best practices are proposed.

Causal Analysis and Resolution (CAR): identifies causes of selected outcomes and takes action to improve process performance.

Configuration Management (CM): establishes and maintains the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.

Decision Analysis and Resolution (DAR): analyzes possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

Integrated Project Management (IPM): establishes and manages the project and the involvement of relevant stakeholders according to an integrated and defined process that is tailored from the organization’s set of standard processes.

Measurement and Analysis (MA): develops and sustains a measurement capability used to support management information needs.

Organizational Process Definition (OPD): establishes and maintains a usable set of organizational process assets, work environment standards, and rules and guidelines for teams.

Organizational Process Focus (OPF): plans, implements, and deploys organizational process improvements based on a thorough understanding of current strengths and weaknesses of the organization’s processes and process assets.

Organizational Performance Management (OPM): proactively manages the organization’s performance to meet its business objectives.

Organizational Process Performance (OPP): establishes and maintains a quantitative understanding of the performance of selected processes in the organization’s set of standard processes in support of achieving quality and process performance objectives, and provides process performance data, baselines, and models to quantitatively manage the organization’s projects.

Organizational Training (OT): develops skills and knowledge of people so they can perform their roles effectively and efficiently.

Product Integration (PI): assembles the product from the product components, ensures that the product, as integrated, behaves properly (i.e., possesses the required functionality and quality attributes), and delivers the product.

Project Monitoring and Control (PMC): provides an understanding of the project’s progress so that appropriate corrective actions can be taken when the project’s performance deviates significantly from the plan.

Project Planning (PP): establishes and maintains plans that define project activities.

Process and Product Quality Assurance (PPQA): provides staff and management with objective insight into processes and associated work products.

Quantitative Project Management (QPM): quantitatively manages the project to achieve the project’s established quality and process performance objectives.

Requirements Development (RD): elicits, analyzes, and establishes customer, product, and product component requirements.

Requirements Management (REQM): manages requirements of the project’s products and product components and ensures alignment between those requirements and the project’s plans and work products.

Risk Management (RSKM): identifies potential problems before they occur so that risk handling activities can be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives.

Supplier Agreement Management (SAM): manages the acquisition of products from suppliers.

Technical Solution (TS): selects, designs, and implements solutions to requirements. Solutions, designs, and implementations encompass products, product components, and product related life-cycle processes either singly or in combination as appropriate.

Validation (VAL): demonstrates that a product or product component fulfills its intended use when placed in its intended environment.

Verification (VER): ensures that selected work products meet their specified requirements.

Continuous vs Staged Assessment

Progress can be assessed from two different perspectives, staged or continuous.

The staged representation uses a top-down approach and assesses the maturity of the whole organization according to process area achievements.

The continuous representation uses a bottom-up approach and assesses the capability of each process area before assessing maturity levels.

While both representations deal with process area achievements, they don’t address actual contents, as illustrated by the nondescript initial level. That gap can be corrected with Model Driven Engineering, which provides a sound foundation for the continuous approach by establishing a mapping between problems to be solved, as defined by system architecture and functionalities, and tasks to be performed. On that basis traceability can be defined across models and measurements managed accordingly. Quality and reuse can also be managed selectively for business and development contents.

Staged assessment of Maturity Levels with a model driven basis
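To make the two representations concrete, the sketch below consolidates a continuous profile of per-area capability levels into a staged maturity level. The assignment of process areas to levels and the consolidation rule are deliberately simplified assumptions, not CMMI’s official equivalent staging.

```python
# Simplified consolidation of a continuous capability profile into a staged
# maturity level. The area-to-level assignment and the rule are assumptions
# for illustration; only levels 2 and 3 are modeled.
AREA_LEVEL = {
    "REQM": 2, "PP": 2, "PMC": 2, "MA": 2, "CM": 2, "PPQA": 2,
    "RD": 3, "TS": 3, "VER": 3, "VAL": 3, "OPD": 3, "OPF": 3,
}

def staged_level(capability: dict[str, int]) -> int:
    """Highest maturity level whose assigned areas all reach that capability."""
    level = 1
    for target in (2, 3):
        areas = [a for a, ml in AREA_LEVEL.items() if ml <= target]
        if all(capability.get(a, 0) >= target for a in areas):
            level = target
    return level

# A profile covering only the level-2 areas at capability 2 yields maturity 2.
print(staged_level({"REQM": 2, "PP": 2, "PMC": 2, "MA": 2, "CM": 2, "PPQA": 2}))
```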

Process areas can be subject to qualitative or quantitative assessment, or a mix of both; hence the benefits of setting them on measurable grounds whenever some are available. That is especially the case for:

  • Causal Analysis and Resolution (CAR) and Decision Analysis and Resolution (DAR). Those areas are to be supported by models organized along architecture layers, combined with differentiated traceability between artifacts, work units, and development patterns and strategies.
  • Integrated Project Management (IPM) and Configuration Management (CM). Those will benefit from work units being defined according to development flows, with technical constraints built into delivery schedules.
  • Measurement and Analysis (MA). That will be supported by the integration of functional metrics and effort measurements on the one hand, and the correspondence between work units and development flows on the other (see the sketch after this list).
  • Organizational Process Definition (OPD) and Organizational Process Focus (OPF). The building blocks of process design are work units directly defined from development flows; that will ensure that technical and development constraints can be clearly identified and taken into account at process level. At a higher level, the focus should be put on the reuse of development assets.
  • Project Planning (PP), Project Monitoring and Control (PMC), and Quantitative Project Management (QPM). The mapping between work units and development flows puts product and project metrics under a common roof. Projects can be set on foot directly from objectives and sequencing constraints.
  • Verification (VER), Validation (VAL), and Process and Product Quality Assurance (PPQA). Differentiated traceability between artifacts, tasks, and processes is pivotal for preventive measures, accurate diagnosis, and targeted intervention. More generally, quality must be managed separately for models, tasks, and organization. And if process improvement is to be considered, quality assurance should make room for contracted tasks.
  • Requirements Development (RD) and Requirements Management (REQM). Those areas should benefit from requirements stereotypes and analysis standards.
  • Organizational Process Performance (OPP) and Organizational Performance Management (OPM). Those areas should benefit from requirements metrics combined with reasoned development patterns and strategies.
  • Risk Management (RSKM). In that area the main challenge is to manage risks according to their source; hence the importance of differentiating external risks, rooted in business contexts or technologies, from internal ones sourced in projects.
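As a sketch of the Measurement and Analysis point above, effort for a work unit could be estimated from its functional size and the historical productivity of the development flow it belongs to; the flow names and rates below are assumptions.

```python
# Effort estimation for a work unit from its functional size and the
# historical productivity of its development flow. Flow names and rates
# (person-days per function point) are illustrative assumptions.
HISTORICAL_PD_PER_FP = {
    "business-logic": 0.8,
    "user-interface": 0.5,
    "persistence":    0.3,
}

def estimate_effort(work_unit_fp: float, flow: str) -> float:
    """Effort estimate (person-days) for a work unit of a given flow."""
    return work_unit_fp * HISTORICAL_PD_PER_FP[flow]

print(estimate_effort(35, "business-logic"))  # -> 28.0 person-days
```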

A Continuous Growth of Capabilities

Process areas are characterized by generic and specific goals, to be achieved by corresponding practices, the former applying to multiple process areas, the latter focusing on single ones. A continuous representation of capabilities will clearly be improved if based on common measurable grounds; for instance:

At level 1 the generic goal should be to achieve a standard description of models, artifacts, and tasks, with corresponding practices defined accordingly.

At Level 2 the generic goal is to institutionalize a Managed Process. Practices that would directly benefit from the proposed approach include:

  • GP 2.2 Plan the Process: if work units are directly defined from development flows, sequencing constraints can be more clearly identified, and effort estimations can be based upon requirements metrics.
  • GP 2.3 Provide Resources: for the same reason, needs for both human and technical resources should be better identified.
  • GP 2.4 Assign Responsibility: overlapping or blurred responsibilities will be reduced if work units match development flows.
  • GP 2.6 Manage Configurations: discrepancies between technical constraints and the schedule of component deliveries can be avoided if work units are defined according to development flows and sequencing constraints.
  • GP 2.8 Monitor and Control the Process: functional metrics and work units set along development flows clearly enhance the traceability and transparency of development progress (see the sketch after this list).
  • GP 2.9 Objectively Evaluate Adherence: with work units set according to architecture-based project patterns, project adherence to generic goals can be directly evaluated and capability assessed accordingly.
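For GP 2.8, monitoring could for instance be expressed as the share of planned function points already delivered, broken down by development flow so that deviations can be localized; the figures below are assumptions.

```python
# Progress monitoring sketch: share of planned function points delivered,
# per development flow. Flow names and figures are illustrative assumptions.
planned   = {"business-logic": 120, "user-interface": 80}
delivered = {"business-logic": 90,  "user-interface": 20}

for flow, plan in planned.items():
    ratio = delivered.get(flow, 0) / plan
    # A flow drifting well below the others would be flagged for corrective
    # action, traceable to specific work units.
    print(f"{flow}: {ratio:.0%} delivered")
```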

At Level 3 the generic goal is to institutionalize a Defined Process. Practices that would directly benefit from the proposed approach include:

  • GP 3.1 Establish a Defined Process: processes should be defined on purpose; in other words, processes are designed to solve development problems. That goal would clearly benefit from development problems being mapped to patterns and processes defined as corresponding development strategies.
  • GP 3.2 Collect Improvement Information: for the same reason improvement information would be significantly more accurate if associated with patterns of problems and solutions.

At Level 4 the generic goal is to institutionalize a Quantitatively Managed Process. Practices that would directly benefit from the proposed approach include:

  • GP 4.1 Establish Quantitative Objectives for the Process: the benefits observed at level 2 for outcomes and tasks can be combined with those identified at level 3 and generalized to patterns of problems and solutions.
  • GP 4.2 Stabilize Sub-process Performance: if tasks are matched to development flows, sub-processes can be more easily set apart and the granularity of assessment refined accordingly (see the sketch after this list).
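For GP 4.2, sub-process stability can be checked against control limits derived from its own history, using the usual statistical process control convention of the mean plus or minus three standard deviations; the sample data below is an assumption.

```python
# Statistical process control sketch for a sub-process (e.g. code review
# effort per KLOC): an observation outside the mean +/- 3 sigma band is
# treated as a stability issue. Sample data is an illustrative assumption.
from statistics import mean, stdev

review_hours_per_kloc = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3, 4.2]

m, s = mean(review_hours_per_kloc), stdev(review_hours_per_kloc)
lower, upper = m - 3 * s, m + 3 * s

new_observation = 5.2
print("stable" if lower <= new_observation <= upper else "out of control")
```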

At Level 5 the generic goal is to institutionalize processes optimization. Practices that would directly benefit from the proposed approach include:

  • GP 5.1 Ensure Continuous Process Improvement: this goal is meaningless without continuous and unbiased references; hence the importance of sound benchmarks based upon development patterns and functional metrics on the one hand, and development strategies and a consistent definition of work units on the other.
  • GP 5.2 Correct Root Causes of Problems: that will be much easier with differentiated traceability between artifacts, work units, and development patterns and strategies.

Those benefits can be detailed with specific practices.

From Engineering to Business Processes

The crumbling of fences between business environments and enterprise systems is progressively merging business and software engineering processes, inducing a shift of the process paradigm:

  • Business Time: familiar periodic frameworks are losing relevancy as success depends more and more on continuous readiness, quicker tempo, and the ability to operate inside adversaries’ time-frames, for defense (force competitors out of advantageous positions) as well as offense (get a competitive edge).
  • Business Flows: mirroring continuous time-scales, ubiquitous unrolling of digital flows has brought about a seamless integration of enterprise systems with their business environments. Competitive edges are not only set in terms of definite aggregates but may also fluctuate with the ability to act on the fly on operational data.

Such a swift and comprehensive upheaval may challenge two of processes’ underlying purposes:

  • Processes are meant to differentiate and coordinate activities; that may turn into a conundrum if time-frames and actions are to be dynamically redefined.
  • Processes are usually understood as blueprints of activities and sequencing rules to be executed in actual circumstances; but charting effective blueprints beforehand will be difficult if flows are undifferentiated and alternative actions defined at run-time.

As epitomized by agile development models, these changes have already been taken into account by software engineering methods. From a broader perspective, assessing such processes would probably be easier with continuous representations, and the bottom-up approach will also provide a smooth learning curve.

A Staged Path to Maturity

With core process areas soundly rooted in clearly defined goals and practices, capabilities can be monitored and progress consolidated into maturity levels:

Initial
  • Starting with ad hoc development activities, the goal is to identify specific development processes.
  • For that purpose input and output artifacts are to be defined uniformly throughout the organization according to stereotyped development flows. On that basis it will be possible to identify basic development patterns. Corresponding work units can then be defined relative to their position along those flows as well as their impact.
Repeatable
  • Starting with ad hoc development processes, the goal is to identify them at organization level and characterize them depending on objectives and constraints.
  • For that purpose development flows are to be normalized.
Defined
  • Starting with shared development processes built from normalized development flows, the goal is to develop metrics for products and projects evaluation.
  • Product metrics can be defined using stereotyped requirements and function points (see the sketch below). Given that tasks are defined relative to development flows, project metrics can be defined as ratios of resources to function points.
Managed
  • Given institutionalized and quantitatively managed processes, the goal is to assess their effectiveness independently of subjective estimations.
  • That may be done by associating processes with development patterns on the one hand, and development strategies on the other.
Optimized
  • Given managed processes with objective metrics on the one hand, and development patterns and strategies on the other, the goal is to identify elements whose change will improve process adequacy and efficiency.
  • Improvements should be managed at different levels of granularity, targeting artifacts, activities, resources, and coordination.
  • They will usually entail trade-offs between reactivity vs stability, responsibility vs planning, skills vs support, and risk vs schedules.
A Smooth Path to Maturity
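As an illustration of the metrics suggested for the Defined level, the sketch below computes an unadjusted function point count from stereotyped requirements, using the standard IFPUG average weights, and derives a project ratio of person-days per function point; the counts and effort figures are assumptions.

```python
# Unadjusted function point count from stereotyped requirements, using the
# standard IFPUG average weights, plus a project ratio of person-days per
# function point. Counts and effort are illustrative assumptions.
AVERAGE_WEIGHT = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

counts = {"EI": 12, "EO": 8, "EQ": 5, "ILF": 6, "EIF": 2}  # from stereotyped requirements

ufp = sum(AVERAGE_WEIGHT[k] * n for k, n in counts.items())
effort_pd = 95  # recorded project effort in person-days (assumed)

print(ufp, round(effort_pd / ufp, 2))  # functional size, then person-days per FP
```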

That approach will (1) deal with the overlapping of business and engineering processes, (2) cover phased as well as agile project management, and (3) support a smooth learning curve.

Further Reading

External Links

6 thoughts on “Caminao & CMMI (V1.3)”

  1. 1. You are quite right that CMMI is a process improvement approach. 2. Better if you also warn that it is only an approach, not a guaranteed way of improving processes. 3. Further, there is no assurance that an improved process results in improved outcomes, 4. particularly regarding the ROI of predicting the maturity that any given project WILL exhibit.

  2. Hey, just wanted to give you a quick heads up. The words in your article seem to be running off the screen in Internet Explorer. I’m not sure if this is a format issue or something to do with web browser compatibility, but I thought I’d post to let you know. The layout looks great though! Hope you get the issue solved soon. Cheers

  3. Remy,
    your article is, from my perspective, your personal interpretation of the CMMI from the viewpoint of Requirements Engineering, no less, no more. You should indicate this somewhere in the introduction of your text.
    CMMI released itself from software a long time ago and changed to system development. Against this background, your article is misleading any person that is not familiar with CMMI.
    Your interpretation of the GPs is your personal opinion, too. For example, GP 2.8 is about day-to-day monitoring and can be resolved by using measurements, but that’s not a must. GP 2.9 is about adherence of processes AND products. Maybe you should take a CMMI Intro class by some well-known CMMI Instructors to learn about the interpretation of GPs, especially.
    And you should update your structure overview: the process area OID vanished with the current version of CMMI some years ago, and OEI and ISM even with the version before…
    BR Joachim

    1. The title leaves no ambiguity about the personal perspective taken by the article. I’m not pretending to be an expert, only to explore from the outside the benefits and caveats of CMMI.
