A common culprit behind schedule slips is adding up estimates for building subunits without accounting for verification and integration. In software, a piece of code may pass unit testing but fail when integrated with other blocks. The code may work on one operating system, but not on all four that the product claims to support. Similarly, in hardware and larger systems, standalone pieces may not work well when combined. Even blocks that pass their test cases may fail the more complex challenges that a real-life customer devises. It is a huge source of frustration to the program manager and other stakeholders to be told something is done, only to find out that it requires more time and resources to be truly done. “But they said it was done!”
There are a number of ways to approach this very common problem. The easiest is to pad the schedule and assume that integration and validation will somehow fit into the margin. This path is prone to further missed expectations and larger delays. The best is to sit down with the team and spell out Definitions of Doneness. Distinguish whether a piece of work has been implemented, reviewed, and integrated into the top-level project. “Reviewed” might mean a quantitative number of tests passed or a completed peer review. A piece can be implemented but remain under review for a while if getting it right takes a few tries. This first implementation milestone can never be called “done” or earn the coveted green checkmark on a dashboard. I color it a nice royal blue.
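The staged states above can be made explicit. Here is a minimal sketch in Python; the state names are illustrative, and the color mapping follows the convention described (royal blue for a finished first implementation, green only for work verified in context):

```python
from enum import Enum

class Doneness(Enum):
    """Stages a work item passes through; names are illustrative."""
    NOT_STARTED = "not started"
    IMPLEMENTED = "implemented"    # code exists, may pass unit tests
    REVIEWED = "reviewed"          # peer review done or test-pass criteria met
    INTEGRATED = "integrated"      # verified in the context of the whole product

# Dashboard colors: only the final stage earns the green checkmark;
# a completed first implementation shows royal blue, per the convention above.
DASHBOARD_COLOR = {
    Doneness.NOT_STARTED: "gray",
    Doneness.IMPLEMENTED: "royal blue",
    Doneness.REVIEWED: "royal blue",
    Doneness.INTEGRATED: "green",
}

def is_done(state: Doneness) -> bool:
    """'Done' means fully integrated and verified, not merely implemented."""
    return state is Doneness.INTEGRATED
```

The point of encoding the stages is that a dashboard (or a script that builds one) can never accidentally call an implemented-but-unverified item "done."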
Include these steps in the schedule. Defining the stages of doneness makes it much easier to put them into the schedule. If using a Gantt chart, there should be lines for verification, dependent on implementation, and then additional lines for verifying integrated designs. It can only help to remind individual estimators that their pieces may still need work when run in the context of other parts of the final product. A tricky part is scheduling time to fix issues that may or may not be found in verification, especially when the sales team is screaming for the product to ship yesterday. Experience and the right mix of pessimism and optimism will help you produce a realistic schedule the team can count on.
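To make the scheduling point concrete, here is a minimal sketch of a Gantt-style plan in which each verification line depends on its implementation line and integration verification depends on everything before it. The task names and durations are made up for illustration:

```python
# Each task: (duration_in_weeks, list of prerequisite tasks).
# Names and durations are hypothetical; the point is that verification
# and integration appear as scheduled lines, not as hoped-for margin.
tasks = {
    "implement_A":        (3, []),
    "verify_A":           (1, ["implement_A"]),
    "implement_B":        (4, []),
    "verify_B":           (1, ["implement_B"]),
    "integrate":          (1, ["verify_A", "verify_B"]),
    "verify_integration": (2, ["integrate"]),
}

def finish_week(task, cache=None):
    """Earliest finish: a task starts when all its prerequisites finish."""
    cache = {} if cache is None else cache
    if task not in cache:
        duration, prereqs = tasks[task]
        start = max((finish_week(p, cache) for p in prereqs), default=0)
        cache[task] = start + duration
    return cache[task]

# Looking only at the implementation lines suggests the work wraps up
# at week 4; the schedule that includes verification ends at week 8.
print(finish_week("verify_integration"))  # -> 8
```

Summing subunit estimates alone would report the longest implementation finishing at week 4, which is exactly the kind of "done" that later demands four more weeks.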
Deal with scoffers. The combination of impatient stakeholders and busy design teams produces rushed and insufficient definitions of done. “What?!? Three months?? It should only take one month!” is a good opportunity to define “it.” What is included in the one-month estimate? A prototype, or a rigorously tested, releasable product? Does it work in one well-defined environment, or should it work in any combination a customer is likely to try? There is a chance the one-month estimate was asking for something less than the three-month estimate was providing. Defining doneness can shorten a schedule as well as lengthen it if there was a mismatch in expectations.
Done may mean not only that it works, works in context, and works in real-life tests, but also that it is fully documented and incorporated into the training material. Spelling out these requirements helps ensure that not only is your current project completed nicely, but your next project isn’t starved for resources at the start because the team is still working on “extra” stuff from the last one. Training for customers, and for the people who support customers, is often vital to product success – these steps need to be part of the original plan, and part of the definition of done.
Hopefully you will heed my advice to invest time in defining doneness when setting a schedule and a plan to track it. The discussion with each team – different skills may have different definitions of done – is a very worthwhile exercise for developing a solid, predictable plan your team believes in. Following up on doubts and apparent conflicts is a good way to find gaps in expectations or other reasons a schedule may be flawed. I translate my definitions of doneness into colorful dashboards that speed up communication and add a little cheer to schedule reviews.