From the Desktop: M Stands for MLVWM and Memories
December 5, 2000
I have a certain affinity for Apple computers. It was, after all, an Apple IIc that I first cut my computer teeth on, lo, these many moons ago. How many of us future geeks first played with those screaming green monochrome screen machines during our high school and elementary days? Quite a few here in the United States, where Apple got a serious foot in the door of the education market.
And then there was Lisa. She didn't last long, did she? She carried a pretty stiff price tag and a design unlike any other, which sort of scared potential buyers away despite her Motorola 68000 chip with 512 or 1024 KB of main memory, a memory management unit, a bit-mapped display, a detachable keyboard, a mouse, a built-in 400 KB floppy disk drive, and a 5- or 10-megabyte Winchester disk.
My God, we were so young.
I could go on, listing all of the old Mac machines, and then launch into a diatribe on the primitive PCs of the early 1980s. But what would be the point? The amazing thing about hardware is that no matter how slow we think our computers are in hindsight, at the time they came out, they were the fastest machines money could buy, for all of two months. This is my corollary to Moore's Law: the present is always superior to the past.
In hardware terms, we always look down our collective noses at the tools of the not-so-distant past. I do it too. I look at my old IBM PS/2 Model 30 with something akin to loathing when I realize that my current machine has more than 8.5 times as much RAM as that PS/2 had hard drive space.
What is of ever-increasing interest to me is the fact that while hardware manufacturers seem to pride themselves on distancing their wares from the past, software developers have a recurring habit of keeping the past alive.
Developers don't get nostalgic in their code setups, that much is certain. New and old developers alike are always looking at better and faster ways to streamline and improve their code, using the newest technology. The demands of the hardware require this kind of development: improve the code or get left behind.
But when you look at the outside of an application, its interface, that's where you see the developer's nostalgia come shining through like a beacon through the fog.
Today, many applications still cling to the WIMP interface, because it's the best we have to offer (right now) to the general public. But the nuances of how that interface looks and feels are still an area that gets played with quite often.
Having looked through a number of X window managers over the last couple of months, I have seen strong evidence of this trend: a leaning towards the look and feel of the interfaces of the past.
A couple of window managers have borrowed attributes from the old Amiga operating system, while others have gone out of their way to look like the Windows 95 desktop, of all things.
What is this predilection for older interfaces? Is it nostalgia? Perhaps, though some might say it's just a case of programmers reusing what has worked in the past. I tend toward the former notion: that resurrecting a familiar interface is a labor of love for some programmers, a way to hark back to earlier days when they were first introduced to computers, when figuring out how to print a file was their biggest concern.
Programming, it has been said, is a soulless operation, not worthy of even being called craftsmanship, let alone art. At times, there is a ring of truth to this statement, as cookie-cutter code seeps into the world at large. But every once in a while, a little more care, a little more craft is given to the creation of an application, and it becomes clear that the application was made not just for the sake of utility, but for the love of building something unique.