
Lights! Camera! Linux!

Linux at the Movies

November 11, 2000

There is an indescribable sense of surrealism in contemplating the next sentence: they're using Linux to make movies. Surely Linus didn't predict this back in 1991 when he started tinkering with Minix.

And yet, given the increasing use of computers in film production and the growing use of Linux on computers, perhaps this outcome wasn't so far-fetched.

Greetings, Professor Falken?

The computer as a filmmaking tool is a fairly recent development compared to the computer as a film subject. Automated thinking machines were conjectured as far back as the early 1900s, when silent film comedies portrayed the humorous plight of out-of-control "automatons" in such epics as "The Mechanical Statue and the Ingenious Servant" (1907) and "The Rubber Man" (1908). These automatons were nowhere near as frightening as the out-of-control computers that followed, such as the HAL 9000 in "2001: A Space Odyssey" and the W.O.P.R. in "WarGames"; machines gone bad were a common theme in films from the 1960s on. That which was processor-based was morally void, and therefore potentially evil. In other words: proteins good, silicon bad.

It was in one of these machines-gone-bad films that computers made their first significant contribution to the making of feature films. In 1973, nine years before the founding of Silicon Graphics, the first movie to use computer graphics was none other than "Westworld."

In this Michael Crichton movie of useful robots gone berserk (later followed by a movie of useful cloned dinosaurs gone berserk), the androids' point of view was illustrated with a pixelated version of the real world. This effect was designed by John Whitney, Jr., and while it was slow to produce (eight hours of processing for 10 seconds of film), it certainly gave the film industry a new bit of eye candy for the audience.
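For the curious, the effect amounts to quantizing each frame into large blocks of a single averaged color. Here is a minimal sketch of the same idea in Python, assuming the Pillow imaging library; the file names and block size are illustrative, not details from the original production.

    from PIL import Image

    def pixelate(frame_path, block=16):
        # Shrink the frame so each block-square region collapses to one
        # averaged pixel, then blow it back up without smoothing. The
        # blocky result approximates the android's-eye view in "Westworld".
        img = Image.open(frame_path)
        small = img.resize(
            (max(1, img.width // block), max(1, img.height // block)),
            Image.Resampling.BOX,  # average each block's color
        )
        return small.resize(img.size, Image.Resampling.NEAREST)

    # Hypothetical usage: pixelate a single frame of footage.
    pixelate("frame_0001.png", block=16).save("robot_pov_0001.png")

What took Whitney's team eight hours per 10 seconds of film in 1973 runs in a fraction of a second per frame on modern commodity hardware.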

Things moved pretty quickly after that, and by the release of "Star Wars" in 1977, computer effects were much more commonplace. More recently, it was another Star Wars movie, "Episode I: The Phantom Menace," that brought us a walking, talking alien being created entirely by computer. Annoying as he might be, Jar Jar Binks is certainly a part of movie history.

The progress of computer graphics in film has an addictive effect on audience members, to whom the adage "seeing is believing" is worth a lot more than the price of admission. What was the wave of the future in film effects three months ago is practically obsolete today.

To keep up with this ever-increasing demand, visual effects and post-production studios in Hollywood have been buying and upgrading their hardware and software to levels unimagined by the early computer jockeys of the 1970s. This progress did not come cheap, either.

But despite the money the studios were spending on SGI IRIX machines, it apparently was not enough to keep the performance of the machines' MIPS RISC processors growing at a fast enough pace. In late 1999, Intel-compatible chips were running at 750 MHz, compared to the 300 MHz MIPS RISC chips released at the same time. And despite the flattening of their processors' growth curve, SGI machines remained very pricey.

The net result of this combination was that studios began looking around for faster and cheaper platforms on which to do their post-production effects work.
