Wednesday, April 22, 2009

Agile + Reuse = Efficient Projects

A couple of years ago I wrote a paper with some reflections on how the dynamics of the capital markets could help illuminate the dynamics of agile software development projects. Briefly: the Efficient Market Hypothesis holds that an ideal market reflects all information available to investors. One point that is often overlooked is that an efficient market also reflects all implications for the future based upon the information currently available. (In other words, if you think a stock price will rise in two weeks you buy it now, you don’t wait two weeks to do it.)

I then elaborated the idea of an Efficient Project: agile developers try to construct software that way. Just look at the so-called XP Customer Bill of Rights: “You can cancel at any time and be left with a useful working system reflecting investment to date.” The system reflects all the information available to date.

Furthermore, the YAGNI (You Aren’t Going to Need It) principle of XP says that the system should reflect only the information available to date. Don’t implement anything that isn’t implied by the requirements you have now. Don’t try to second-guess the future.

There’s even a parallel in there to the markets: momentum investing is one of the causes of market inefficiency; it causes bubbles, with people investing beyond what the current information (e.g. company revenues) implies. “Momentum implementation” occurs when projects implement features beyond what is called for by the requirements.

This is great advice. It’s a way of keeping systems from adding useless functionality. However, I think the agile community has traditionally tended to take this too far. Probably because the agile movement began with projects whose requirements were truly unstable, a habit has grown up of treating the future as entirely unpredictable.

But the future is in fact rarely entirely unpredictable. The only project with entirely unpredictable requirements would be one in which you sit down in front of a white sheet of paper and say, “I’m going to do … something – anything.”

The fact is that there is a continuum of predictability of the future, of requirements. But the agilists have made it a bit too easy on themselves in that respect, and tend not to look hard enough for those aspects of the system which are predictable.

And that’s where reuse comes in. Reuse is all about being able to predict the future. In some ways it is the mirror image of agile. It says, “I can predict the future in these important ways, and I can implement a system that reflects the implications of this future.” It's the "You Are Going to Need It" of software engineering.

And this isn’t just dreaming. One of the most successful examples of this is product line development. The people at Nokia lay out an entire vision for future software in their phone families, and implement the corresponding system now. The product line developers also study where the future is unpredictable, say in terms of features, and even then try to constrain that unpredictability to squeeze out whatever partial predictability they can. For example, they introduce variation points in feature models that say, “okay, we’re not sure exactly what a system’s features will be, but we can get it down to variations on this theme and implement based on that.”
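
To make the idea of a variation point concrete, here is a minimal sketch in Java (the names are hypothetical; they are not taken from Nokia’s actual feature models). The shape is the point: the reusable core code is written against an interface that marks the variation point, and each product in the family binds the variant it needs.

```java
// A minimal sketch of a variation point, with hypothetical names --
// not Nokia's actual feature model.

// Variation point: every phone in the family sends messages, but we
// cannot predict which protocol a given product will need.
interface MessagingFeature {
    void send(String recipient, String text);
}

// Variants bound by individual products at the variation point.
class SmsMessaging implements MessagingFeature {
    public void send(String recipient, String text) {
        System.out.println("SMS to " + recipient + ": " + text);
    }
}

class MmsMessaging implements MessagingFeature {
    public void send(String recipient, String text) {
        System.out.println("MMS to " + recipient + ": " + text);
    }
}

// Reusable core asset: written once against the variation point, it
// captures the predictable part of the product line's future.
class Phone {
    private final MessagingFeature messaging;

    Phone(MessagingFeature messaging) {
        this.messaging = messaging;
    }

    void sendGreeting(String recipient) {
        messaging.send(recipient, "Hello from the product line");
    }
}

public class ProductLineSketch {
    public static void main(String[] args) {
        // Two products of the same family, differing only at the variation point.
        new Phone(new SmsMessaging()).sendGreeting("+358 40 123 4567");
        new Phone(new MmsMessaging()).sendGreeting("+358 40 123 4567");
    }
}
```

The mechanics are trivial; what matters is where the boundary falls. The Phone class captures what the product line can predict, while everything it cannot predict is pushed behind the MessagingFeature variation point.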

It’s tough analysis to do, and there’s no silver bullet, but it’s rewarding when it works. And agilists tend to exclude this type of analysis due to a mindset that focuses on unpredictability.

This is not to say that agile and reuse are opposed. On the contrary, they are probably the two most important software engineering techniques we have, and can work together to balance the unpredictability and the predictability of systems and their features.

Surprised to hear this? You shouldn’t be: speaking of silver bullets, probably the most important software engineering paper ever written was No Silver Bullet by Fred Brooks (Martin Fowler once told me that no other paper had had so much impact with so few words). Brooks said that the software development problem was essentially intractable, and that there were only a few truly powerful tools to combat it. He singled out two in particular: one was software reuse; the other was the idea of “growing a system” – essentially incremental, agile software development.

Why did he single out reuse and agile? Because he said that essentially the only solution to the software problem was to write less software. Reuse is the way to write less software when the future is predictable. Agile is the way to write less software when the future is not predictable. Leave either one out and you end up writing more software than you should. Use them together and you have an efficient project - a project in which the amount of software written is not too much and not too little.

Reuse and agile – if they were good enough for Brooks, they should be good enough for us.

Tuesday, April 14, 2009

Agile and Reuse

There was an interesting discussion over on the Yahoo XP Discussion List over the last few days on the topic of "reuse across projects." One thing that strikes me is that nearly all of what was said during that discussion has been said many times before. This in itself is not necessarily a problem, but it does leave me with the impression that many remain unaware of the reuse community and, in a kind of ironic twist, "reinvent the wheel" of discussion around reuse.

The agile community has a particularly uncomfortable relationship with reuse. I can testify to this on the basis of discussions reaching all the way to the top -- yes, the top -- of the community, where skepticism was expressed. In the discussions over the past few days, the idea of "emergent reuse" was cited with approval. But what is that, if not the notion that reuse only makes sense after several exemplars have been built? Once again, the wheel of thinking about reuse is reinvented.

Borrowing an anecdote from another time: I once saw Jean Sammet speaking at the History of Programming Languages conference. Defending COBOL from its detractors, she noted that only COBOL had a truly complete facility for I/O. The others punt (and it's true: just look at C and Ada, which farm it out to libraries). She said, "And you know why? Because it's hard, that's why." Simple as that.

I defend the agilists all the time with that anecdote. I tell people that agile may, in its essence, "only" be iterative software development, but that detracts nothing from the fact that they were the ones to finally make iterative software development happen. Why? Because it's hard, and that's why people didn't do it before. It's hard to plan iterations, to time-box them, to re-plan, etc. But the agilists simply rolled up their sleeves and did the hard work of figuring it out and putting it into practice.

The agilists should put this attitude to work and realize that reuse isn't practiced often enough for the simple reason that it's hard. It's hard to distill that perfect interface that makes software easily reusable. It's hard to provide the robustness and elegance needed to make reuse work.

Agilists are always inviting other communities to become familiar with what they're doing before judging them. I think the agilists should become familiar with the reuse community ... even better, participate in it. Come to the Eleventh International Conference on Software Reuse in Washington this September. We can talk about it.

Thursday, April 9, 2009

The Contractual Process

A few days ago during an agile workshop I was explaining the concept of optional scope contracts in agile processes, and, listening to myself talk, the word “waterfall” suddenly came to mind. I had never thought of conventional contracting in those terms before, even though that is clearly what it is.

What I hadn’t been thinking clearly about was the fact that contracting also has a process, which usually tracks or mirrors the software development process, but doesn’t have to. But we don’t see that until we radically change the software process away from the classic waterfall process.

The waterfall process for software development has a number of known problems. The agile process (basically an iterative process) arose in response to the need to get risk and scope under control, allowing the developer to reassess the state of development continuously and intervene to make decisions.

But when agile processes are adopted, it’s actually the exception more than the rule that the contracting process is changed, too. The contracting process stays waterfall (requirements up front, etc.). We end up with a mismatch between the two processes. If people were to think this way, in terms of processes, maybe they would start “getting it” about the possibility of doing contracting in different ways – effectively, an agile contracting process.

Wednesday, April 8, 2009

Reconciling IT Governance and Quality

After the Total Quality Management (TQM) wave that swept over the industry at large during the 1980s, and the success of ISO 9000 for the software industry in the 1990s, the quality imperative has continued its march in the new millennium with a spike in popularity of Six Sigma. Today it is not unusual to see a company’s commercial brochure highlight its special commitment to quality, as a way of identifying itself as a “quality company.”

Is there anything wrong with being a “quality company”?

Six Sigma dates from 1986, but was popularized along with the ISO 9000 movement in the software quality sector in the 1990s and early 2000s, and so it is a bit early to assess the long-term financial performance of companies that have embraced these movements as a governing objective to date. But a useful comparison can be made to the performance of companies that embraced Total Quality Management, a phenomenon that has been with us for some time now.

The figure above shows the financial performance of several TQM companies relative to their peers on the Standard & Poor’s 500 during a full decade in which TQM adoption was at a peak. Among them were several technology leaders, such as Xerox, IBM, and General Motors—all of whom pioneered software systems respected for their high quality (yes, even GM).

Although it would clearly be an exaggeration to say that the financial performance of these companies was disastrous during that period, it is equally clear that the results were not what might have been hoped for, given the undeniable technical superiority of the systems that were produced under their rigorous quality programs. What explanation might exist for the inability of companies that adopted a quality-oriented governing objective to generate financial results as impressive as their technical results?

Some reflection reveals that quality is well-suited as an operational framework, but does not offer an economic framework for strategic decision-making. Some of the most critical decisions a company faces have little to do with its quality program. At the same time that General Motors was embracing TQM, it embarked on a multi-year program of investment in factory automation and robotics, spearheading many software innovations such as the Manufacturing Automation Protocol (MAP). In retrospect, though, this was an ill-advised allocation of precious company resources, which certainly contributed to GM’s under-performance of the market by a full ten percent during that period. It is now generally recognized that IBM, too, while pursuing TQM in those days, was paying a heavy premium for its acquisition of Lotus Development Corporation.

In other words: both GM and IBM had great quality programs, but terrible strategic programs, and the bad strategy won out. Now look at how each of those two companies is faring today: IBM is thriving, while GM fights for its very survival. It certainly isn't their quality programs that are making the difference: it is their competitive strategies - one very successful, the other not.

Another, more subtle problem with a “quality strategy” is related to the very fact that programs like ISO 9000 have become so well-accepted: in many markets (for example, aerospace and defense), quality certification has become mandatory for participation – a “union card” for market entry, thereby levelling the field for all players and reducing dramatically the possibilities for building competitive advantage based on quality. Indeed, in these markets, any benefits from quality tend to accrue to customers.

Yet another problem is the “corporate culture” that sometimes arises around quality. A colleague of mine relates that he first began to suspect problems with TQM as a corporate culture when his company worked with Eastman Kodak and observed their operations. G. Newman has described the problem in the following way: “...the fadmongers [of TQM] have converted a pragmatic, economic issue into an ideological, fanatical crusade. The language is revealing. The terms of quality as an economic issue are analysis, cost, benefit, and tradeoff. The terms of quality as a crusade are total, 100 percent quality, and zero defects; they are the absolutes of zealots. This language may have its place in pep talks ... but once it is taken seriously and literally, we are in trouble.” When quality becomes a company-level obsession, elaborate (and expensive) bureaucratic infrastructures too often arise, with the inevitable adverse financial consequences.

Quality only adds business value if customers are willing to pay more for higher quality. But in some sectors of the software industry, technical innovation is valued over quality per se. In fact, a December 2005 article in the Wall Street Journal noted observations that a quality management process is thought to actually hinder innovation in many cases. “For stuff you’re already good at, you get better and better,” Michael Tushman, a management professor at Harvard Business School was quoted as saying. “But it can actually get in the way of things that are more exploratory.”

Given these considerations, the practical consequences become evident. Quality is not suitable as a top-down governing strategy, to be followed in any and all cases and contexts. It is up to the business strategist to determine—in his own particular context—whether quality should become a competitive weapon in pursuit of business value.