Editor's Note: Complexity and the Open Source Model
Will Delays Be Fatal?
It seems like development on the Linux 2.4 kernel has taken forever. By Open Source standards, it has: whereas we used to see new kernel releases come by at a fast and furious pace, the Linux kernel development team now seems mired in this seemingly impossible task of actually releasing Linux 2.4.
But I'm not one to declare the end of Linux because of a late kernel release, though others have. (Those declarations mostly come from ZDNet, so take them with a grain of salt.) A late kernel release does do one dangerous thing, however: it projects an unprofessional attitude to the rest of the world, reinforcing stereotypes of Linux advocates as wild-eyed, unruly fanatics.
Unfortunately, this is a dangerous trend among high-profile Open-Source projects: they are taking longer to complete than originally estimated. Apache 2.0 has been this close to completion for quite some time now, with no final release date in sight. KDE 2.0 has experienced a very slight delay in release (it's due today, by the way), but to make that release it appears that some features went unimplemented and (judging by the KDE mailing lists) a lot of last-second bug-fixing occurred. Red Hat Linux 7.0 was released--hurriedly, company officers admit--with a very serious compiler problem. And there is some debate over whether KOffice 1.0 is an actual production-quality release or a preview with a 1.0 moniker.
I'm sure there are a few high-profile Open Source projects I'm missing here, but you get the idea.
To those intimately involved with the Open Source process, such delays are par for the course; the mantra seems to be that it's better to do the job right than to meet a seemingly arbitrary deadline. Part of this mantra derives from the fact that many in the Open Source community are safely ensconced in the cathedral and don't need to deal with the realities of the bazaar, where deadlines do matter and sales cycles are set and well-established.
These unplanned and unanticipated delays lead me to pose a single question: has the breadth and complexity of high-profile Linux/Open Source projects outstripped the Open Source software-development process?
This is not an idle question. When the Linux kernel was relatively simple, a small oligarchy of developers working on the side could successfully oversee development. The same went for the original Apache Web server--which had the advantage of building on an existing code base--and the original KDE and GNOME releases. This software fit within the UNIX ethos of creating small, modular components that could be combined into a whole. And thus was born the Open Source method of software development: an oligarchy where a small group of talented developers worked with a larger volunteer development community, using contributed code and intense feedback to create an end product. It's really not a democratic process; think of the Open Source method of software development as a large-scale peer review.
But new releases mean new features, which leads to the end of small-scale tools and the beginning of a new level of complexity. And popularity means that there are more people passing along more and more comments and requests for features. Even if those comments and requests for features are rejected, they still must be considered.
I asked David Faure, one of the lead KDE 2.0 developers (who was kind enough to respond, despite being elbow-deep in work as the KDE 2.0 release loomed), whether the Open Source method of software development is still an effective approach as Open Source projects become more complex:
"Yes, I believe that method is still effective. The increasing complexity is not a problem, as long as the underlying design is good--which is what we've been working on for quite some time, the core of KDE. I don't see KDE "running into delays." The 2.0 release is imminent--as a matter of fact, the development has already moved on to post-2.0, and we're just waiting for the packaging before releasing.
"What is starting to create problems for us is the amount of feedback we get, from an increasing number of users (and the bug report dialog contributed a lot). Now developers have to spend a lot of time answering to bug reports and closing them, when the same time could be spent developing and bugfixing. But this feedback is very important, at the same time."
Now, delays are relative, and Faure is right that the KDE 2.0 delays have been amazingly minor compared to the delays surrounding the Linux 2.4 kernel and Apache 2.0. And there's probably very little that I could teach David about software design and engineering.
But there's something I can teach regarding how Open Source software is viewed in the larger computing community. Most in the Open Source community are amazingly impervious to what outsiders think: an admirable trait during Open Source's formative years when public sentiment was discouraging, but dangerous now that Open Source has been by and large accepted by the computing community. Here are a few guidelines that I'd like to share with Open Source developers on how to be better perceived by the outside world:
- Be realistic in your estimates of completion times. It's better to overestimate the time needed than to look bad by being late.
- Don't treat releases as fluid occurrences. Slip-streaming important features is one reason why IT professionals hate Microsoft: important features would appear without warning in some service-pack release. One of the great strengths of the Open Source world is that dot-releases really matter, and they should continue to matter.
- Don't try to throw everything and the kitchen sink into the release. Not everyone is waiting for every new feature in your new release. There will always be some features that are more appropriate for a 2.1, 2.2, or 2.3 release. This is one of the reasons why KDE 2.0 will be released with a very short delay: instead of cramming everything into 2.0, some requested and planned features were deferred to post-2.0 releases.
- Remember that real people are planning around your release. If you can't meet a deadline, tell them so they can plan accordingly. As Linux occupies a more important place in the enterprise, there are corporations with real money at stake. If Linux and Open Source develop a reputation for being unreliable in their introduction of new releases, enterprises are going to move to a more stable platform.
To answer my own question: I do believe the Open Source method of software development--which, to the outside world, resembles the proverbial one million chimpanzees typing away in search of Shakespeare--is suited for large and complex projects. I do hope that the kernel and Apache developers prove me right: if not, then we're looking at a pretty bleak future for Open Source projects.