Taking Inventory - page 2
So Much for Source Code
I suspect it's still true that the majority of Linux users are those whose distributions were merely a starting point, and that within days (if not hours) of installation they began changing things, upgrading, customizing. It takes time to assemble, document, and package a distribution, and the pace of development is such that new things always appear in the interim. We're in the middle of a new release cycle, and I'm not alone in already having on my machine things newer than those provided by the latest commercial releases.
Users who want to upgrade have a choice of waiting for a binary or compiling the new stuff from source right now. If they choose the former, they face the additional problem of awaiting a binary for their particular distribution. Not too many years ago, an RPM was an RPM. No more. If they choose the latter, bigger yet more subtle problems are in store.
The fact that different distributions require different binaries demonstrates just how fragmented Linux has become. Many of us have proposed erasing this problem by settling on a single Linux Standard Base, and that base ought to be the Debian distribution or something close to it. Its package manager is superior to anybody else's, but that is just one of the reasons this is a good idea. The builders of binary packages would have just one target at which to aim, reducing the time lag between publication of source and distribution of binaries. Users could grab new stuff with reasonable certainty that it would work.
This, of course, is not going to happen. Distributions have too much time, effort, and money invested in taking Linux to strange places, and no matter how compelling the reasons for falling back on a system that makes more sense, it's tremendously unlikely they would ever do it. (In their defense, they would also face deafening howls from an existing installed base unless they came up with a transition system that was utterly painless, which is a pretty tall order.)
Even if it were to happen, it wouldn't solve the problem, because even the best package handler -- Debian's -- isn't smart enough to take everything into account.
Package managing software does more than just put binaries on your hard drive in the places they're supposed to go (or where they're not supposed to go, if you've accidentally gotten a package for the wrong distribution). It also keeps track of what other packages you have aboard and which of them will break if this one is removed. That's the way it's supposed to work, anyway. I suspect there are few Linux users who have never had to resort to the --force and --nodeps options when installing .rpms.
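To make that bookkeeping concrete, here is a minimal sketch in Python of the dependency tracking a package manager performs. The package names and the remove() function are hypothetical, standing in for what rpm or dpkg do internally; the force flag plays the role of --nodeps:

```python
# A minimal sketch of package-manager bookkeeping.
# Package names and versions here are hypothetical.

installed = {
    "libfoo": {"version": "1.0", "depends": []},
    "fooclient": {"version": "2.1", "depends": ["libfoo"]},
}

def reverse_deps(name):
    """Packages that would break if `name` were removed."""
    return [pkg for pkg, info in installed.items() if name in info["depends"]]

def remove(name, force=False):
    """Refuse a removal that breaks dependents, unless forced (cf. --nodeps)."""
    broken = reverse_deps(name)
    if broken and not force:
        raise RuntimeError(f"{name} is required by: {', '.join(broken)}")
    del installed[name]
    return broken  # what you just broke, if you forced it

print(reverse_deps("libfoo"))  # → ['fooclient']
```

Forcing the removal succeeds, but leaves fooclient silently broken -- exactly the state --nodeps can leave a real system in.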
A further and crucial limitation is that package managers are useful only if you employ them for absolutely everything. This means that you do not get a binary tarball and install it by hand. It means that if you build from source, you need to get and build source RPMs, or employ the debian.rules included with most but not all source packages, so as to build an .rpm or .deb which you then install. Otherwise, your new stuff won't get entered into the database the package manager keeps -- if you've upgraded something, the package manager will think you still have the old version; if you've added something, the package manager won't know about it at all. And there's no good manual way of telling the package manager about it. (Michael Hall tells me about the "equivs" package for Debian, with which a user who has installed from source or from a binary tarball can list the names of equivalent packages and then build a .deb whose sole purpose, when installed, is to tell Debian's package manager that those packages are aboard when, strictly speaking, they're not. It resolves dependency issues, but apparently does not prevent accidental downgrades if a Debian package is later built, downloaded, and installed.)
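For the curious, an equivs control file is a short text file along these lines; the package name, version, and filename here are hypothetical, and equivs-control will generate a template for you to edit:

```
# Hypothetical control file, "mytool.ctl", edited from the template
# that "equivs-control mytool.ctl" generates.
Section: misc
Priority: optional
Standards-Version: 3.0.1
Package: mytool
Version: 1.2-local
Description: placeholder for mytool built from source
 Tells the package manager that mytool is aboard, though its
 files actually came from a source build outside the system.
```

Feeding this to equivs-build produces an essentially contentless .deb which, once installed, satisfies dependencies on mytool without installing any of its files.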
The potential for mischief is vast. You decide to auto-update your system only to find that you've actually downgraded things you've installed from binary tarball or built from source outside the package manager. In some cases, things that worked won't work anymore. If you seek to install individual packages, you might encounter insurmountable dependency problems, not because the dependencies actually exist but because the package manager doesn't know that you've already resolved them -- it thinks you still have the old software aboard.
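The downgrade scenario can be sketched in a few lines of Python. The version numbers are hypothetical, and the naive string comparison stands in for a real version-comparison routine; the point is that the updater consults only its own stale database:

```python
# A sketch of how an auto-updater silently downgrades software
# installed outside the package manager. Versions are hypothetical.

db_version   = "1.0"   # what the package database recorded at install time
real_version = "1.4"   # what you actually built from a source tarball
repo_version = "1.2"   # the newest binary package the distributor offers

def should_upgrade(recorded, available):
    """The updater only consults its own database, never the disk."""
    return available > recorded   # naive string compare; fine for this sketch

if should_upgrade(db_version, repo_version):
    # 1.2 replaces the files of your hand-built 1.4: a downgrade
    print(f"upgrading {db_version} -> {repo_version} (clobbering {real_version})")
```

From the database's point of view this is an upgrade from 1.0 to 1.2; from yours, it is a downgrade from 1.4.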
As the situation currently stands, distributors have made no provision at all for anyone who builds things from a source tarball, the time-honored source code distribution medium. Application developers are free to include or omit rules and instructions for building .deb or .rpm packages that can then be installed through the respective package managers -- an extra step in any case, and one that might not solve the problem, since upgrades are sometimes done by brute force.