Saturday, December 20, 2008

Another Linux Advocate Loses Grip On Reality

Even though I'm still a fully functioning Mac Bigot (TM), I've largely gotten out of the habit of arguing my chosen platform's superiority on Internet forums (fora, for those of you who still take Latin classes). Part of the reason is that I've decided the very practice of arguing on the Internet is unhealthy -- it's geek-meth. It gets you completely wired and hyperactive, even to the point of missing sleep and arguing incredibly trivial points (was it the WA-150 or the WA-100 that first allowed you to use a non-powered stylus?), while the addiction slowly drives you away from friends and family and ravages your physical and mental health. In the end, you'll abandon your own sister's wedding for one more chance to defend the Apple Newton on ZDNet.

Which brings me to the point of this essay -- there are some places where I just don't bother to go for tech information, but where I can go to get a chuckle out of the natives. These places feature writers who are, for the most part, true believers, utterly unfettered by the bounds of reality as they defend their own pet opinions and activities as blessed by the Holy Writ of St. Linus. Not to mention that it's healthy to realize that there are people out there with less of a life than I have. Among the places I go solely to chuckle over the natives' misunderstandings of all things Apple*: Gizmodo, Wired, and ZDNet.

* - This is not to say that every writer at Gizmodo or Wired is an anti-Apple jihadist or ignorant Linux-worshipper; sometimes I'm pleasantly surprised at the intelligence of a given writer and end up following his writings even against my normal distaste for his home digs. Sometimes, as in the case of John Siracusa of Ars Technica, a single writer can rehabilitate a site I thought was hopelessly awash in inanity and ignorance.

Today's spit-take-worthy batch of self-delusion dripped from the virtual pen of Jack Wallen of ZDNet, in a blog entry entitled "10 things Linux does better than OS X". Rather than going point-by-point, which would suggest a dangerous relapse into geek-meth huffing, I'll just hit the highlights:

- It may sound strange, seeing as how OS X is based on a Linux variant...

Um...excuse me? The problem with this claim isn't just its sheer ignorance and easy disprovability; it's that this is far from the first time I've heard it, which suggests that some Linux evangelist out there is deliberately spreading misinformation that the supposedly bright Linux community doesn't bother to correct.

Saying that Mac OS X is based on a Linux variant is like saying that the Ford Taurus is based on a Chevy Volt variant -- not only did Linux enter the scene far later than the UNIX flavor at the core of Mac OS X, but the two operating systems don't even share the same core code.

First, a quick history lesson:

While Mac OS X is a relative newcomer to the field of computer operating systems, having first been released by Apple as Mac OS X Server in 1999, its pedigree can be traced back quite a ways. Mac OS X's direct ancestor is an OS known as NeXTSTEP, developed by NeXT, the computer company founded by Apple co-founder and Mac co-creator Steve Jobs after he was removed from the leadership of Apple in the mid-1980s. NeXTSTEP was itself derived from previous 'Unix-like' operating systems, primarily the Berkeley Software Distribution and variants of that 'flavor' of OS.

The Berkeley distribution (also known as BSD) was the first alternate 'flavor', or distribution, of the original UNIX operating system developed by Bell Labs during the 1970s. Though BSD is often referred to as a UNIX OS, and one of the original creators of UNIX has suggested that many OSes derived from UNIX are effectively UNIX systems, the actual designation of an operating system as a UNIX system can, today, come only from the trademark owner, The Open Group. The Open Group requires any OS that seeks to use the UNIX trademark to abide by the Single UNIX Specification, an interoperability standard developed in response to the explosion of commercial UNIX systems after the then-owner of UNIX, AT&T, allowed the OS to be licensed for use by other vendors.

The origins of Linux rest in the foundation of the GNU project in the 1980s by free software advocate Richard Stallman, at least partly in response to the growing commercialization of UNIX itself. GNU is a self-referential acronym standing for 'GNU's Not UNIX'. Though Stallman's goal was to produce a complete UNIX-like operating system based entirely on free and open-source software, the hardest task in the process -- the development of the kernel and device drivers, those parts of the operating system that interact directly with computer hardware -- had not been completed nearly a decade after Stallman started the project. Because of this, Finnish programmer Linus Torvalds wrote his own kernel and released it under the GNU General Public License. This release was not immediately embraced by all open source advocates -- Torvalds famously (in *nix circles, anyway) engaged in energetic debates on Usenet with MINIX creator Andrew S. Tanenbaum for some time after the release of his kernel -- but eventually the Linux kernel became the popular choice and was married to the rest of the GNU OS in a conglomeration that has become known to the world as Linux, but is properly referred to (and even officially referred to by some distribution vendors) as GNU/Linux.

So while Mac OS X is technically younger than Linux, the core pieces of the operating system existed as part of the Berkeley distribution before Richard Stallman even began GNU, to say nothing of Torvalds's own development of the Linux kernel. Interestingly, while neither BSD nor GNU/Linux has been certified against the Single UNIX Specification, Mac OS X 10.5 has, and as such can be referred to as a UNIX operating system -- something Linux cannot technically claim.

Mac OS X is no more a variant of Linux than Linux is a variant of Mac OS X. The two grew from completely different development trees, though the original UNIX was the godfather to both OSes.
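
You can even see the split for yourself. Here's a minimal sketch using Python's standard platform module (my choice of language here; anything that can query the OS would do) -- the version strings in the comments are examples, not guarantees:

    # Illustrative only: ask the OS to identify its own kernel via the
    # standard-library platform module.
    import platform

    # On Mac OS X this prints "Darwin" -- the BSD-derived core -- while
    # on a GNU/Linux box it prints "Linux". Two different family trees.
    print(platform.system())

    # Kernel release string; e.g. a Mac OS X 10.5 machine reports a
    # Darwin 9.x version, while a Linux box reports its kernel version.
    print(platform.release())

Same two-line script, two unrelated kernels answering.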

- Although most OS X users would balk at this (saying they have no use for the command line), most power users know the command line is crucial to serious administrative tasks.

Here lies one of the biggest misconceptions that Linux advocates have about computers.

I will freely admit that, for people whose job is to operate and administer computers, the various flavors of UNIX and *nix are generally more powerful and effective tools for meeting the challenges those jobs pose.
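
To give that concession some teeth, here's the sort of quick, throwaway administrative chore that's natural to script on a *nix box -- a sketch, not a hardened tool, and the log path and line format below are assumptions that vary from system to system. It tallies failed ssh logins by originating host:

    import re
    from collections import defaultdict

    # Assumed location and format; both vary by system.
    LOG_PATH = "/var/log/auth.log"
    FAILED = re.compile(r"Failed password for .* from (\S+)")

    counts = defaultdict(int)
    with open(LOG_PATH) as log:
        for line in log:
            match = FAILED.search(line)
            if match:
                counts[match.group(1)] += 1

    # Worst offenders first.
    for host, n in sorted(counts.items(), key=lambda item: -item[1]):
        print("%5d  %s" % (n, host))

Ten minutes of scripting, and a sysadmin has a report no stock GUI tool provides. That's real power -- for the person whose job it is.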

Here's the problem: most people don't operate computers for a living. Most people use computers in an attempt to perform a job that has little to do with computers themselves. They're salespeople, or administrative workers, or service workers. Even in a company whose reason for existence is to operate or administer computers, there will be numerous people who don't actually have that task in their job description. The receptionist at Oracle headquarters, for example, doesn't need to know squat about how to write a SQL query, and Steve Ballmer's executive assistant at Microsoft doesn't need to know how to troubleshoot ActiveX.

These people don't need the command line and the 'power' it provides. These people need an operating system that simply stays out of the way while they do their actual jobs. In other words, most Mac OS X users really don't have any use for the command line, especially since learning it means learning far more about how *nix computers work than they need to know in order to do those jobs.

Here's an analogy that I like to use to illustrate where I think the computer industry is going:

In the closing years of the 19th century, the automobile was invented. While the earliest automobiles were largely curiosities, some folks found them useful; most people, however, couldn't imagine doing anything with the very early cars that they couldn't do with their already ubiquitous horses. However, as manufacturing processes grew more sophisticated, an explosion of manufacturers and models eventually ushered in the automobile's first great era, the era of the Chalmers, lasting through the end of World War I. War and economic woes helped narrow the range of manufacturers, but the coming of peace and prosperity, and the deliberate development of the US Interstate Highway System, helped regenerate the automobile industry, pushing it into its true Golden Age in the 1950s. In those days, it seemed that every family owned a car, and every man was at least a passable mechanic.

As time passed and car ownership went from a family to a personal experience, though, the needs of car owners slowly began to change. American car companies were slow to pick up on those changes, including increased passenger room on the one hand and increased fuel efficiency and reliability on the other (the latter at least partly triggered by the oil shocks of the 1970s). Though the Chrysler Corporation came within a whisker of bankruptcy and needed federal loan guarantees to survive around 1980, American car companies comfortably went back to what they thought they knew best, designing big, 'manly' cars and other vehicles they thought the public should want, rather than the kinds of cars the public actually needed. Another economic sea change later, and all of the Big Three are lined up at the Capitol Building seeking handouts to stay in operation so that they won't finally fall, victims of their own hubris and presumptions regarding what a car should be.

Though the parallels aren't perfect, the analogy is pretty striking: the earliest days of computing were very much like the earliest days of the car, with computers being considered expensive curiosities except in limited environments. Once proven in those environments, though, advances in manufacturing allowed for the first great explosion of 'personal computers', including the IBM PC, the Radio Shack TRS-80, the Commodore PET and C-64, the Apple II, and even the earliest Macintoshes. Though they served the purpose of educating a generation of computer mechanics, most folks at the time viewed these early machines as little more than expensive toys (save for the few that could be used for serious business tasks), and it took the development of the Information Superhighway to kick-start personal computer adoption in both homes and smaller businesses. Today's computing society is much like the automotive society of the 1960s: those who know the most about computers tend to assume that what they like about computers is not just what's good about them but what is actually meaningful and valuable about them, ignoring a growing segment of users who don't use, or need to use, computers in the same manner they do.

In the 1970s, you could seemingly find a mechanic in every gas station. In the 2000s, few gas stations have garages associated with them, and few drivers see the need to be able to tune an engine or tweak plugs and points themselves.

By the time today's twenty-somethings writing for Gizmodo and ZDNet about the wonders of open source software start considering their own retirement, they'll be wondering how the world of computing left them so far behind, and why nobody cares what flavor of *nix is running on the latest gadget.
