Going Higher on CMMI High Maturity

It is important that a CMMI high maturity organization keeps working to mature further. Going higher on CMMI high maturity directly impacts the business performance parameters that matter most to any business enterprise - an ever-growing top line along with a healthy bottom line.

Going higher on CMMI high maturity requires leveraging the OPM and OPP process areas. In a certain sense the whole idea underlying this is so simple that it might appear too good to be true. It is, however, exactly that simple. Considering the typical cycle of improvement makes this clear.

The PDCA concept is probably the simplest yet the most powerful concept in the area of process and performance improvement. The OPM and OPP process areas can be easily mapped to PDCA, albeit with a different permutation (DCAP - P moves to the end) and with the cycle extended by two additional activities - 'I' (Improve) and 'DD' (Do Differently). DCAP-IDD can be explained in the following manner:
  • D (Do) - execute business processes in accordance with business objectives and organizational performance goals
  • C (Check) - determine current process performance - use OPP (derive the PPBs and PPMs)
  • A (Act) - compare current performance against target performance and in case of deviations identify appropriate process improvement actions- use OPM
  • P (Plan) - plan process improvement actions - use OPM
  • I (Improve) -  implement process improvement actions to improve processes - use OPM
  • DD (Do Differently) - execute business processes differently, i.e., using the improved processes; the difference from the first Do is that achieving business objectives and organizational performance goals now has a (hopefully) higher likelihood.
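The DCAP-IDD cycle above can be sketched as a simple loop. The numbers, the "defect density" measure and the improvement rule below are purely illustrative assumptions, not from the CMMI model:

```python
import random

TARGET_DEFECT_DENSITY = 0.5  # illustrative organizational goal (defects/KLOC)

def do(process_capability):
    """D / DD: execute the process; outcomes vary around its capability."""
    return random.gauss(process_capability, 0.1)

def check(outcomes):
    """C: characterize current performance (a crude stand-in for PPBs)."""
    return sum(outcomes) / len(outcomes)

def act_plan_improve(capability, observed, target):
    """A + P + I: if performance misses the target, plan and apply an improvement."""
    if observed > target:
        return capability * 0.9  # the improved process lowers defect density
    return capability

capability = 1.0
for cycle in range(5):
    outcomes = [do(capability) for _ in range(30)]
    observed = check(outcomes)
    capability = act_plan_improve(capability, observed, TARGET_DEFECT_DENSITY)
    print(f"cycle {cycle}: mean defect density = {observed:.2f}")
```

Each pass through the loop is one DCAP-IDD turn: the final Do Differently is simply the next iteration's Do, executed with the improved capability.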

Characteristics of Prediction Models

Prediction models are quite useful in many fields like weather forecasting, demand and supply studies, production planning and control, project management, etc.

Here are some of the essential characteristics of a prediction model:
  • Prediction models should be able to provide the value of a parameter before it occurs. Example: the model should be able to tell which team will win the Cricket world cup before the world cup starts, not once the final match reaches a conclusive stage.
  • Prediction models shouldn't just provide the percentage of times the actual value will fall within a certain range (or outside it, as the case may be) but what the range of values in a particular case could eventually be. This is like telling a patient that 20% of patients who get admitted to the hospital don't leave alive; for a particular patient, though, what matters is his or her chance of survival given the health condition at the time of admission. Example: instead of saying that India will win since it has won 2 world cups in the past, the model should say that given India has won its last five ODI series, which is more than any other team, India has an 80% chance of winning the world cup.
  • Prediction models should provide the prediction either in an ordered manner with the %likelihood of occurrence, or as a set of outcomes in a narrow range, or a set of a few values, or at best a single value (in the single-value case the prediction model is no less than God if the prediction comes true in 100% of cases). Example: the model should say that India will win with 80% probability and Australia will win with 40% probability, or that one of India or Australia will win it, or that India will win it.
  • Prediction models should be able to dynamically adjust the prediction in a progressive manner, continuously or at least at multiple interim points, till the actual value occurs. This point hides a weakness of prediction models based on historical data - if something happens that has never happened in the past, the model may fail to adjust the prediction. Example: in case India loses its first 3 matches, the model may change the prediction to Australia or some other team winning.
  • Prediction models should have some predictors that can be controlled or manipulated to get a desired outcome in case the original prediction is not a desired one. If control or manipulation is not possible, the prediction should be available early enough to plan appropriate contingency actions. Example: the model should be able to tell that one would get delayed on the way to the airport if one goes by road, in which case one can go by train instead. Where control or manipulation is not possible, as with a hurricane in an area, the prediction should be available early enough to inform everyone in that area to move elsewhere.
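The characteristics above - an ordered prediction with a likelihood that adjusts progressively as new evidence arrives - can be sketched with a simple Bayesian update. The starting probability and the likelihood ratios below are invented for illustration:

```python
def update_win_probability(prior, likelihood_ratio):
    """Bayesian update: revise the predicted probability as evidence arrives.
    likelihood_ratio = P(evidence | win) / P(evidence | lose)."""
    odds = prior / (1 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Illustrative numbers only: start with an 80% prediction, then adjust
# at interim points as match results come in.
p = 0.80
for result, lr in [("won match", 1.5), ("lost match", 0.4), ("lost match", 0.4)]:
    p = update_win_probability(p, lr)
    print(f"after {result}: win probability = {p:.2f}")
```

After two losses the predicted probability drops below 50%, which is exactly the dynamic adjustment the fourth characteristic demands; a model trained only on historical wins would miss it.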

Quantitative Process Composition and Process Tailoring in CMMI Model

The concept of quantitative process composition (QPC) is integral to the implementation of the QPM/QWM (Quantitative Project/Work Management) process area at level 4 of CMMI. 

For understanding quantitative process composition (QPC) it is important to first understand qualitative process tailoring (QPT) at level 3.

So how is QPC at level 4 different from QPT at level 3 and what exactly is it?
Qualitative Process Tailoring (QPT)

QPT is the definition of project-level processes by selecting from the organization's set of standard processes (OSSP, or the organization's QMS in simple language) based on subjective factors and qualitative considerations.

The following example illustrates the concept underlying QPT.

Consider a project that needs to work on developing a complex application.

The project may identify technical complexity as a risk that can impact the quality of application delivered to the customer.

For overcoming the above risk the project may decide to perform review of the architecture/design in a manner more rigorous than usual.

It is assumed that the customizations done to the process in this case will have a positive impact in achieving the desired outcome.

The above is an example of qualitative process tailoring and might mean one or more of the following in this specific case:
  • The review checklist might be 'tailored up' by injecting it with additional questions related to design aspects like fan-in, fan-out, modularity, coupling, cohesiveness, etc. (tool/template level tailoring)
  • The method selected for review might be perspective-based review by a group of experts rather than desk check by a single peer reviewer (procedure/method level tailoring)
In general, usual approaches to QPT include the above variants along with some more, as listed below:
  • Use of customer or client provided processes and templates over the organization's standard processes and templates
  • Use of organization's standard processes and templates with certain sections turned into "not applicable"
  • Use of the option of 'tailoring out' certain of the organization's standard processes and templates in entirety (case of waiver)
Quantitative Process Composition (QPC)

QPC is the definition of project-level processes by selecting from the organization's set of standard processes based on objective factors and quantitative considerations.

The following example illustrates the concept underlying QPC.

Consider the same example as above of a project that needs to work on developing a complex application.

The project may identify technical complexity as a risk that can impact the quality of application delivered to the customer.

For overcoming the above risk the project may decide to perform review of the architecture/design in a manner more rigorous than usual.

And here comes the twist.

The quality of the application delivered to the customer is first expressed in quantitative terms, such as: the number of post-release defects in six months will not be more than 'n' per SLOC of the delivered application (the Big Y).

It is assumed that the customizations done to the process in this case will not only have a positive impact in achieving the desired outcome but that this impact can also be expressed in quantitative terms (typically mean and SD, assuming a normal distribution).

The above is an example of quantitative process composition and might mean one or more of the following in this specific case:
  • The capability of review process using the normal and 'tailored up' checklists for design review (categorical X variable) is known and its influence on Y is known.
  • Obviously, one would expect that the 'tailored up' checklist leads to a better Y (if not, the assumption that 'tailored up' checklist is better doesn't hold true)
  • The impact on Y of the extent and intensity of X is known.
  • So, by varying the levels of X (similar to the concept of varying the levels of treatment in DOE/ANOVA to determine the optimal combination) Y can be forced to behave at the desired level.
  • Another possibility is to determine some other Xs that influence Y (like number of weighted requirements, modules, code size, etc.).
In general, usual approaches to QPC include the above variants along with some more, as listed below:
  • Defining alternative sub-processes for performing an activity and selecting the set of sub-processes for performing various activities that result in achieving Y at the desired level
  • Determining Xs that influence Y and establishing the relation between Y and Xs (typically expressed in the form of statistical/mathematical model or based on simulation of Y by varying the Xs in the range observed based on historical data).
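A minimal sketch of the composition idea above: each sub-process option for design review (the categorical X) carries an assumed quantitative effect on Y, and the project composes its process by checking which options achieve Y at the desired level. The option names and (mean, SD) values are hypothetical, and the normality assumption the text mentions is used throughout:

```python
from statistics import NormalDist

# Hypothetical review sub-process options with assumed effects on
# post-release defect density Y, expressed as (mean, SD).
review_options = {
    "desk check":               (0.70, 0.15),
    "standard checklist":       (0.55, 0.12),
    "tailored-up checklist":    (0.45, 0.10),
    "perspective-based review": (0.38, 0.08),
}

target_y = 0.6  # desired ceiling on defect density (illustrative)

def p_meets_target(mean, sd, target=target_y):
    """P(Y <= target) under the normality assumption."""
    return NormalDist(mean, sd).cdf(target)

# Compose the process: keep only options giving at least a 90% chance
# of achieving Y at the desired level.
chosen = [name for name, (m, s) in review_options.items()
          if p_meets_target(m, s) >= 0.90]

for name, (m, s) in review_options.items():
    print(f"{name}: P(Y <= {target_y}) = {p_meets_target(m, s):.2%}")
print("composed from:", chosen)
```

Varying the levels of X and recomputing P(Y <= target) is the simulation analogue of varying treatment levels in DOE/ANOVA to find the combination that forces Y to behave at the desired level.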
QPC versus QPT

So the primary difference between QPC and QPT is that in QPC process selection is based on statistically validated hypotheses whereas in QPT process selection is based on experientially assumed behaviors.

In QPC defining project processes is called process composition whereas in QPT defining project processes is called process tailoring.

QPC being a level 4 high maturity practice is expected to make use of heavy duty statistical data analysis techniques like Sampling Distributions, SPC, ANOVA, Statistical Modelling.

QPT, being a level 3 or 'low maturity' practice, is expected to make use of light duty statistical data analysis techniques like Averages, Run Charts, Box Plots, etc.
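The contrast can be shown in a few lines: the level 3 "light duty" view computes an average, while the level 4 "heavy duty" view derives SPC-style 3-sigma limits and flags special causes. The data points are invented, and the limits here are estimated from the sample standard deviation of the stable history rather than the moving-range method a formal XmR chart would use:

```python
from statistics import mean, stdev

# Hypothetical effort-variance data points (%), in time order.
points = [4, -2, 3, 1, -3, 2, 0, -1, 2, 14]   # the last point looks unusual

# Level 3 "light duty" view: just the average.
avg = mean(points)

# Level 4 "heavy duty" view: 3-sigma control limits flag special causes.
baseline = points[:-1]                 # limits derived from the stable history
centre, sd = mean(baseline), stdev(baseline)
ucl, lcl = centre + 3 * sd, centre - 3 * sd
out_of_control = [p for p in points if p > ucl or p < lcl]

print(f"average={avg:.1f}, limits=[{lcl:.1f}, {ucl:.1f}], flagged={out_of_control}")
```

The average alone (about 2%) hides the anomaly; the control limits isolate the 14% point for investigation, which is precisely the extra visibility level 4 techniques buy.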

P.S. Some Interesting Observations
  • It might appear that in the CMMI model process 'composition' is higher up in the pecking order compared to process 'tailoring'. 
  • The term process 'composition' makes one imagine Arts, Music, Mozart, etc. whereas the term process 'tailoring' makes one think of Stitching, Cloth, Scissors, etc. 
  • This is not a sound view of the fundamental differences between these two, because 'tailoring' could also make one imagine Versace, Chanel, etc. 
  • The other point is that use of statistical analysis doesn't imply high maturity. 
  • It might surely mean high visibility and probably high control but that doesn't always translate into high maturity. 
  • Also, if one were to look at industries other than software, the practices which make level 4 and 5 high maturity will probably not qualify as high maturity. 
  • Usage of DOE, ANOVA, SPC, etc. is so commonplace in manufacturing companies that high maturity is business as usual.
  • The use of statistical and/or simulation-based models for quantitative planning and monitoring & control may not be very effective if the behavior of the Big Y or the small y(s) is influenced by factors which are essentially not controllable or controllable in a limited way. 
  • A good example of this is that the maintainability of a product is heavily influenced by the complexity of the business logic, but the complexity factor is controllable in a limited way.
  • Another example is that the product quality is heavily influenced by the deep domain expertise of the personnel working on it which is not controllable in a short span of time (over a long span of time this is definitely controllable but due to personnel churn at times this may not happen).

    How Open Source is Actually Closed?

    Open Source Software (OSS) has become a standard item in the lexicon of folks working in IT companies and departments. The charm of getting source code at no cost seems so appealing that everyone is for OSS.

    Despite no commercial company willingly pushing the case for using OSS, it seems to grow like wild weed across the business world.

    The use of the term "Open" may appear as a deliberate attempt to sell OSS. The term "Open" hints at the following:
    • transparent
    • easy to access, adopt and adapt
    • no discrimination among its users
    It might be interesting to note that the concept of Open Source can be compared to that of the Agile Method. Apart from clever usage of judiciously selected jargon these concepts have evolved as a result of the anti-establishment sentiments shared by some of the founder members.

    For every X that existed, an anti-X or X-complement evolved as a better solution to a known problem. So for Waterfall you have Agile, and for Proprietary Software you have Open Source Software.

    The agility of Agile (just think of the Scrum rituals) and the openness of Open Source Software (just think of the GPL terms and conditions) are, however, completely debatable.

    Of course OSS does offer certain advantages like:
    • increased coding productivity
    • assurance from use of proven code
    • multiple code options for a functionality
    • visibility into code-level implementation allowing complete customization
    However, as is commonly understood, there is no free lunch and every coin has two sides: OSS is not actually that open but rather closed in many ways.

    OSS comes with certain limitations, constraints and risks as listed below:
    • higher exposure to legal violations
    • easy availability of code may choke creativity and inventiveness of programmers
    • lack of timely support or no support available in case of issues with OSS
    • programmers may have to spend more time pondering on legalities rather than technicalities (this may conflict with the psychology of a typical developer)
    • Some OSS licensing may require "opening" up the code written using the OSS (for business organizations selling software, source code is akin to IP and trade secrets, and hence using OSS will make no business sense in certain instances)
    Going by the trends currently visible it seems OSS is well on its way to increased usage. And if that is so, companies must understand both the pros and cons and evaluate where to use OSS and to what extent.

    It is always good to remember that there is no free lunch and also every coin has two sides.

    PPBs and PPMs in OPM Process Area of CMMI v1.3

    OPM or Organizational Performance Management is a new process area that has been added to v1.3 of the CMMI model.

    It is not entirely new since it replaces and extends the OID (Organizational Innovation and Deployment) process area.

    Hence effectively, version 1.3 of CMMI contains the same number of process areas as version 1.2 (2 at level 5 and 22 overall).

    So OPM is OID plus something more.

    One major difference is the expectation of CMMI v1.3 model that the performance/process improvement initiatives and activities in an organization should be explicitly linked to business objectives, must impact the PPBs (Process Performance Baselines) and PPMs (Process Performance Models) and need to be demonstrated statistically, preferably in terms of changes to PPBs and PPMs.

    An example to illustrate the above is described below:
    • Suppose the business objective is to deliver to the customers on time. 
    • A parameter of interest for the business in this case could be OTD (on time delivery). 
    • For this the 'agreed date of delivery' committed to the customer and the 'actual date of delivery' to the customer is captured.
    • OTD can be defined in terms of the difference between 'actual date of delivery' and 'agreed date of delivery' and is computed for further analysis.
    • The process behavior is statistically characterized (as needed by level 4 expectations also). 
    • This means the average (mu) and standard deviation (sigma) of OTD are computed.
    • Effectively speaking this is captured in the form of PPBs and PPMs.
    • Suppose the standard deviation (sigma) value is wide or the average (mu) is not within target.
    • This becomes a case fit for improvement and OPM can kick in.
    • The improvement effected through an OPM project needs to be statistically proven.
    • This means the average (mu) and standard deviation (sigma) values should show change (in the positive direction).
    • From a statistical perspective, the standard deviation (sigma) should reduce and the average (mu) should move closer to the target (higher or lower as the case may be).
    • Effectively speaking this should result in change to the PPBs and PPMs.
    PPBs and PPMs are hence integral to OPM implementation.
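    The OTD example can be sketched end to end: characterize the baseline, apply the improvement, then prove the change statistically. The delivery-delay values are invented for illustration, and Welch's t statistic is used here as one basic check (an actual OPM implementation may use a formal hypothesis test drawn from its PPB data):

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical OTD data: days late (negative = early), before and after
# an OPM improvement project.
before = [5, 3, 8, 6, 4, 7, 9, 2, 6, 5]
after  = [2, 1, 3, 2, 1, 2, 3, 0, 2, 2]

mu_b, sigma_b = mean(before), stdev(before)
mu_a, sigma_a = mean(after), stdev(after)

# Welch's t statistic: is the mean shift statistically real?
t = (mu_b - mu_a) / sqrt(sigma_b**2 / len(before) + sigma_a**2 / len(after))

print(f"before: mu={mu_b:.1f}, sigma={sigma_b:.2f}")
print(f"after:  mu={mu_a:.1f}, sigma={sigma_a:.2f}")
print(f"Welch t = {t:.2f}  (|t| > 2 suggests a genuine improvement)")
```

    Here the mean moves toward the target (on-time, i.e. zero) and the standard deviation shrinks, which is exactly the change to the PPBs that the model expects an OPM project to demonstrate.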

    Since OPM comes after level 4 process areas have been implemented (specifically, OPP - Organizational Process Performance) use of PPBs and PPMs is clearly an expectation from the CMMI model perspective.

    In a way it is also a natural progression for the organization from managing project performance or work performance quantitatively by using PPBs and PPMs in QPM (Quantitative Project Management) or QWM (Quantitative Work Management) respectively to managing the organizational performance quantitatively by using PPBs and PPMs in OPM (Organizational Performance Management).

    Using CMMI and Six Sigma Together

    Both the six sigma methodologies - DMAIC and DMADV/DFSS - can be easily used within a CMMI framework implementation. Organizations that have adopted either of the two, CMMI and six sigma, will find great value in understanding the other and surely realize the "teeth" they can provide to their quality and improvement program.

    CMMI is a generic framework which focuses on the three aspects below, and the integration of six sigma into a CMMI framework can be thought of along these three aspects:

    Engineering

    DFSS fits into engineering process areas of CMMI in a manner which is quite natural. All DFSS tools can be mapped to practices of the various engineering process areas (RD, TS, PI, VER, VAL). An organization can extend its existing engineering process areas implementation toolkit by simply adding the DFSS toolkit to it. The statistical rigor in DFSS can help engineering activities to become much more effective and stronger. In some sense for an organization which is attempting to adopt L4 and L5, using DFSS can be a good way to introduce "high maturity" in its engineering activities.

    Process Improvement

    DMAIC fits into process improvement process areas of CMMI in a manner which is quite natural. All DMAIC tools can be mapped to practices of the various process improvement process areas (OPF, OPM, CAR). An organization can extend its existing process improvement process areas implementation toolkit by simply adding the DMAIC toolkit to it. The statistical rigor in DMAIC can help process improvement activities to become much more effective and stronger. In some sense for an organization which is attempting to adopt L4 and L5, using DMAIC can be a good way to introduce "high maturity" in its process improvement activities. In fact, CMMI high maturity expects explicit use of statistical and quantitative techniques to manage the organization's processes.

    Project Management

    The general approach used by either of the two six sigma methods lays heavy emphasis on a project-based approach. In many companies people are working not just on projects but six sigma projects. This reflects the strong orientation in six sigma towards project management. An organization can extend its existing project management process areas implementation toolkit by simply adding the six sigma planning and management concepts like charter, reviews, etc.

    In conclusion, it can be said that CMMI is a framework whereas six sigma is a method. And like other methods six sigma can be used within a CMMI framework implementation to enhance its rigor and effectiveness.

    CMMI L5 Nemesis - PPBs and PPMs

    PPBs (Process Performance Baselines) and PPMs (Process Performance Models) - have you ever heard these two terms?

    If the above terms PPBs and PPMs sound like Latin and Greek to a person who claims to possess process improvement expertise to a level more than "reasonable", then the person must admit being immature as far as understanding and applying CMMI high maturity is concerned.

    Process Performance Baselines and Models in CMMI - PPBs and PPMs

    It won't be unfair to say that PPBs and PPMs can be considered as the proverbial nemesis for any organization interested in being a CMMI level 5 organization.

    It must be carefully noted that the phrase used in the earlier sentence is "being a CMMI level 5" rather than "getting a CMMI level 5".

    At level 4, the organization is expected to develop PPBs and PPMs and then use them for managing its work.

    And at level 5, the organization is expected to use the PPBs and PPMs for improving its process and ultimately its business performance.

    Though PPBs and PPMs add tremendous value at level 4 itself their real power gets demonstrated at level 5.

    The success of PPBs and PPMs in a true sense depends on two critical factors:
    • granularity and quality of data collected along with the performance of activities (doing an activity and collecting data should be seen as integral tasks)
    • genuine attempt by the organization to use data-based process improvement (process improvement should add value to business by enhancing operational efficiency and hence the organization's competitive positioning in the market)
    PPBs and PPMs are actually simple concepts.

    PPB is essentially the characterization of the performance of a process "as it was in the recent past".

    Whereas, PPM is essentially the characterization of the performance of a process "as it is expected to be in the near future".

    PPBs are based on statistical concepts like mean, standard deviation, statistical process control limits, percentiles, etc. They reflect the historical performance of a process, and may truly do so provided the granularity and quality of the data collected are appropriate.
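    A PPB of this kind can be sketched in a few lines; the productivity figures below (function points per person-month from recently completed projects) are invented for illustration:

```python
from statistics import mean, stdev, quantiles

# Hypothetical historical data: productivity from recent projects.
history = [10.2, 11.5, 9.8, 10.9, 11.1, 10.4, 9.5, 10.8, 11.3, 10.1]

ppb_mean = mean(history)
ppb_sd = stdev(history)
ucl, lcl = ppb_mean + 3 * ppb_sd, ppb_mean - 3 * ppb_sd   # 3-sigma limits
p25, p50, p75 = quantiles(history, n=4)                   # quartiles

print(f"PPB: mean={ppb_mean:.2f}, sd={ppb_sd:.2f}")
print(f"control limits: [{lcl:.2f}, {ucl:.2f}]")
print(f"quartiles: {p25:.2f}, {p50:.2f}, {p75:.2f}")
```

    The baseline is only the characterization "as it was in the recent past"; whether any of these numbers truly describe the process depends on how the underlying data were collected.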

    PPMs are based on statistical concepts like statistical modeling, time series analysis, tests of hypothesis, etc. They reflect the expected future performance of a process, but may not truly do so even if the granularity and quality of the historical data collected are appropriate.

    The above might happen if the process undergoes changes with respect to its inherent behavior or the implementation of the process varies significantly from the typical way it used to be implemented.

    In situations where the process behavior depends on subjective elements the above two occurrences are expected to be the norm rather than exception.

    The most common example is any process which depends on human intellect/creativity (like software development, arts, etc.) or manual operations (like plumbing, painting, etc.).

    For software development, since the intervention and influence of human intellect are so much a part of the whole process, the challenge of making effective use of PPMs is a natural one.

    However, it in no way means that software development is unpredictable.

    Business and sound engineering demand predictability of the outcome when an activity is performed.

    Software development needs to mature into software engineering and CMMI high maturity offers the ways and means to make the transformation of software development from being an art to engineering!

    Progressing from CMMI Maturity Level 3 to Level 5

    Progressing from CMMI maturity level 3 to maturity level 5 requires significant effort on the part of any organization. The journey to attaining maturity level 5 is a challenging one for any level 3 organization. What does it take to move up from level 3 to level 5? Why does it become difficult to implement four more process areas (PAs) when a level 3 organization has already implemented eighteen PAs successfully in its level 3 journey?

    First and foremost, attaining level 5 means the organization will need to implement the four PAs which are called "high maturity" PAs, and rightly so. High maturity requires a paradigm shift in the way an organization conducts its business and impacts how the organization performs engineering, project management and process improvement. It follows that this paradigm shift requires changes in the organization's thought processes, working styles and the conceptual understanding of its employees, and an emphasis on "fact-based" management over "feeling-based" management.

    Since CMMI level 5 signifies an organization has become "high maturity" it is important to understand what is meant by high maturity. A good way to understand high maturity is to assume that it is equivalent to an organization demonstrating the following characteristics:
    • Processes must help achieve business expectations. Business strategy drives business objectives which in turn drive and are closely linked to an interrelated set of "business-critical" processes, process performance measures and objectives.
    • The capability and actual performance of "business-critical" processes is quantitatively known.
    • Day to day work is managed quantitatively using the capability and actual performance data supplemented by usage of quantitative models for predicting and proactively managing process outcomes instead of mere reliance on monitoring and correcting process outcomes.
    • Improvements to processes are identified and carried out in case the actual performance of a "business-critical" process doesn't help achieve business expectations.
    The second aspect that needs to be appreciated is that the high maturity PAs depend upon the level 2 and 3 PAs for their implementation. In a way level 2 and 3 are foundational elements for high maturity. High maturity is difficult to achieve if level 2 and 3 implementation is weak or not effective. It follows from here that refinement of the definition and deployment of level 2 and 3 PAs is a must before embarking on the level 4 and 5 journey.

    The refinements and adjustments to level 2 and 3 PAs will generally require the following changes at a minimum:
    • A good, hard look at the existing metrics and measurement system (MA -> QPM, OPP)
    • Refinement of tailoring guidelines to incorporate tailoring based on quantitative considerations (IPM -> QPM)
    • Use of data, baselines and models for project planning, monitoring and control (PP, PMC -> QPM)
    • Use of data, baselines and models for process improvement (OPF -> OPM)
    • Re-looking into the remaining PAs due to the impact of the above
    Another aspect that needs to be considered is the need to have appropriate tools and templates for data collection, quantitative/statistical analysis, metrics-based reporting, action identification and quantitative assessment of the impact of action closure. The existing tools and templates for capturing data (such as size, effort, schedule, cost, defects, etc.) may need a complete overhaul or significant changes in functionality or usage to enable the implementation of high maturity practices. CMMI model expectations from a high maturity implementation lay heavy emphasis on data quality, which may lead to a complete overhaul of the operational definitions of both derived and base measures.

    Last but not least, level 4 and 5 implementation needs level 2 and 3 to be sustained. This means the organization must ensure that the focus is on all twenty-two PAs. Level 4 and 5 achievement in this way gets built upon the level 2 and 3 foundation. It is a known fact that to rise higher and higher the foundation has to become stronger and stronger. Thus a level 5 journey needs higher concentration and significantly higher effort.

    Six Sigma, Martial Arts and Belts

    The pervasiveness of "belts" in the corporate world is a strange phenomenon. In some organizations the talk of any typical day is around and about yellow belts, green belts (GBs), black belts (BBs) and master black belts (MBBs). There is yet another convoluted layer in these talks - there are those who are DMAIC fans and then there are those who are DFSS fans.

    In many organizations, forums and conferences the six sigma champions (MBBs, BBs, GBs and XBs, where X could be anything) prance around like magicians, as if they hold the silver bullet for any and every conceivable problem faced by the organization. In fact some of them come across as super experts who can propose solutions even when there is no problem to be solved. At times some of them also project themselves as black cat commandos on a mission to save dollars for the organization. Most of the XBs would rattle off a mouthful of terms like hypothesis testing, Minitab, chi-square test, normality testing, etc., etc., etc. to show off how sounding difficult and different can create the right marketing impact. In fact, some of the organizations and obviously their executive leaders seem to be puppets in the hands of the six sigma God and the six sigma champions!

    The concept of belts originated in the field of martial arts. Belts signify the level of proficiency achieved by a student of martial arts. Proficiency in this context is usually measured in terms of the complexity and toughness of movements and postures the student is able to demonstrate. The intent of awarding different belts at different proficiency levels was probably two fold - act as a grading of students and also act as a positive motivation for students to continuously improve upon their proficiency.

    Originators of six sigma were smart enough to figure out that to make this concept popular they had to have something catchy in there, and for six sigma initiatives to stick the benefit to the individual had to be clearly articulated. Belts fit the bill perfectly - they are catchy, and those worshipping the six sigma God are bestowed divine blessings in the form of belts.

    The success of six sigma in terms of the buzz it has managed to create is its failure in some sense. In many organizations the six sigma champion (usually an MBB) will happily announce the increasing number of GBs and BBs in the organization in the monthly or quarterly staff meetings. Some of the projects completed for these certifications may not be really meaningful and value-adding but are done to produce an army of GBs and BBs. The MBBs in most of these organizations believe in producing XBs more than anything else.

    Why the belts of martial arts are probably not a good idea for six sigma programs:

    1. In martial arts everyone is a student, even one having the highest belt. The pecking order promotes humility and healthy respect for the belts others wear. The belt is a means to an end where the journey to the end is actually endless. In six sigma the pecking order promotes a guru or champion culture in an organization where MBBs are the kings, BBs are the generals, GBs are the soldiers and the rest don't matter. The focus is on how to become an XB.

    2. In martial arts the belt is actually worn, both figuratively and literally. This promotes respect for the art and is a powerful expression of earning a belt rather than becoming one. In six sigma the belts are figurative and fuzzy as well. The focus for a GB is to become a BB and then an MBB. Attending a training, passing a test and doing six sigma projects is enough to become an XB, but there may be no focus on earning the belts in a true sense.

    Six sigma is essentially a problem solving methodology based on the PDCA concept combined with statistical tools with the aim of continual process improvement. There's nothing new in it other than the branding and marketing around it. In a way six sigma's popularity can be attributed to the branding and marketing rather than any original thinking.

    What matters in the end is that a method is perceived as working and is hence adopted widely. With six sigma that has probably been the case. PDCA, statistical tools and continual process improvement concepts were always useful, and with six sigma they have penetrated the corporate world in a much more profound way. Six sigma has made this possible and that is its real success - a catchy slogan for good old concepts, packaged nicely for branding and marketing.
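    To make the "statistical tools" part of that claim concrete, here is a minimal Python sketch of the arithmetic behind the six sigma label itself - converting an observed defect rate into a sigma level. The function name and the conventional 1.5-sigma shift (the convention under which "six sigma" corresponds to roughly 3.4 defects per million opportunities) are illustrative assumptions, not anything prescribed by a particular six sigma program.

```python
from statistics import NormalDist  # standard library, Python 3.8+

def sigma_level(defects: int, opportunities: int, shift: float = 1.5) -> float:
    """Convert an observed defect rate to a (shifted) sigma level.

    The conventional 1.5-sigma shift is what makes "six sigma"
    correspond to roughly 3.4 defects per million opportunities (DPMO).
    """
    dpmo = defects / opportunities * 1_000_000
    # z-score of the non-defective fraction, plus the long-term shift
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + shift

# 3.4 defects per million opportunities is the canonical six sigma level
print(round(sigma_level(34, 10_000_000), 2))  # ≈ 6.0
```

    Run the other way, the same formula shows why a "three sigma" process - about 66,807 DPMO under the shifted convention - is considered far from adequate.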

    Is Agile Really Better than Waterfall?

    The simple and short answer to this question is "No".

    There is no single methodology which can claim to be a panacea for the problems faced by software projects.

    Evolution of Agile

    Agile methods like XP, Scrum, FDD, etc. have evolved from the principles enshrined in the Agile Manifesto, which supposedly came into being as a result of frustrations with the Waterfall method.

    Agile has become quite fashionable over the last few years, and as the die-hard Agile fans would love to put it, "Agile takes care of all that is wrong with the monster named Waterfall".

    However, reality is far removed from this statement.

    One quip that often comes up - Agile versus Fr(agile) - does accept the fact that Agile has its own set of challenges and shortcomings, just like Waterfall or any other methodology.

    In fact, in many ways Waterfall is much better than Agile - as some might say, "Waterfall Model, Original and Forever".

    The "Waterfall" Concept

    Waterfall is based on the concept of staggering, as it advocates starting an activity only when its preceding activities have been completed.

    Agile is based on the concept of parallelism, as it advocates performing all activities in parallel without bothering about their precedence relationships.

    The Agile approach is better when the product or technology is new or innovative and the cost of trying out several permutations and combinations is budgeted in.

    However, no organization can provide an infinite budget for any project, and hence projects are forced to do upfront planning to the extent they have visibility.

    On this front Waterfall wins hands down.

    Agile says changes are welcome.

    Waterfall also says changes are welcome but adds that changes don't come free of cost.

    This is a good incentive for the customer and developers to think through and ensure the high-level requirements and overall architecture are laid down before coding and testing begin.

    The Agile Concept

    In the Agile method, estimation, planning, requirements and design documentation, test planning, meetings and status reports are viewed as overheads.

    The focus instead is on delivering working software to the customer day in and day out (or rather sprint in and sprint out).

    It follows from the above that Agile is focused on coding, testing and bug-fixing.

    Code is continuously spewed out by the project, tested against the requirements and changed either due to design and implementation issues or due to changes in requirements.

    Code changes are effected through refactoring, which essentially amounts to re-design and re-coding.

    In this process the overall architecture is not a focus; it evolves along with the requirements.

    This may lead to a weak architecture and impact non-functional requirements like performance, scalability, etc.

    Waterfall versus Agile

    It can be concluded that Agile seems to focus on the short term at the cost of the long term.

    For complex projects done by big, distributed teams, Agile requires much higher rigor and discipline than what is advocated by the Agile purists.

    In such situations Waterfall is a better choice.

    It is also a fact that no one uses pure Waterfall.

    Intermediate deliveries have been a common practice in case of Waterfall projects as well.

    In the end, Agile seems to be more of a marketing ploy than a well-grounded method.

    It attempts to make a virtue out of a lack of discipline by suggesting that trying to get upfront clarity on the requirements and approach - to the extent possible, limited by visibility and constraints - is futile because they will change anyway.

    Thinking hard and having a good plan, while remaining open to changes to the plan, is a better strategy.

    After all, as is commonly accepted, well begun is half done.

    In conclusion, it can also be said that "well planned is half execution done"!

    Role of Sponsors in Improvement Initiatives

    Sponsors play a crucial role in the improvement initiatives of any organization. Sponsors are generally part of the executive management of the organization and charter improvement initiatives based on burning or strategic needs of the business. They not only lay down the vision for improvement initiatives but also allocate the budget and resources. At times, however, sponsors may do more harm than good when it comes to the initiatives being completed successfully in a real sense. This may happen due to the following reasons:

    Setting unrealistic timelines for the improvement initiatives - this arises from the tendency to equate making more resources available with a proportional reduction in the timeline. This is akin to assuming that if 1 person can run a mile in 10 minutes, then 10 persons can together run a mile in 1 minute.

    Focusing on short-term, superficial benefits over long-term, profound benefits - this arises from the desire to show quick results. It is often reflected in statements like "I would like to see us getting certified for a standard or model by such and such a date". The end result may be that the certificate adorns the walls of the organization while the "improvement" part remains missing from the initiative.

    Allocating inadequate resources for the initiatives - this arises because sponsors choose to ignore the effort requirement, viewing it as a cost rather than an investment. This is reflected in creating virtual, part-time teams without budgeting any part of their effort for the improvement initiative. The end result may be virtual improvement, since virtual teams can only achieve virtual improvements. In such a situation the virtual team members have a full-time job for which they are accountable, while holding additional responsibility for the improvement initiative without the authority to invest part of their effort in it in a formal way.
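    The mile-run analogy in the first point above can be sketched as a toy model: the ideal duration shrinks as people are added, but coordination overhead grows with the number of communication pairs, so doubling the team never halves the timeline. All the numbers and the overhead parameter here are hypothetical, chosen purely for illustration.

```python
def duration(total_work_days: float, team_size: int,
             overhead_per_pair: float = 0.002) -> float:
    """Toy model of schedule versus team size.

    Ideal time divides by team size, but coordination cost grows with
    the number of communication pairs, n * (n - 1) / 2.
    """
    pairs = team_size * (team_size - 1) / 2
    coordination = total_work_days * overhead_per_pair * pairs
    return total_work_days / team_size + coordination

# Doubling the team from 5 to 10 does not halve the schedule:
print(duration(100, 5))   # ≈ 22 days, not the ideal 20
print(duration(100, 10))  # ≈ 19 days, nowhere near 11
```

    Past some team size the coordination term dominates and adding people actually lengthens the schedule - which is the sponsor's 10-people-run-a-mile-in-1-minute fallacy in miniature.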

    The role of sponsors should then be defined keeping the above points in mind. This way the improvement initiatives will be more effective. Sponsors should not ignore the long-term, profound benefits and must try to balance them with the short-term, superficial ones. This would ensure the organization gets the certificate as well as the improvement!