Tuesday, October 06, 2015

Jobs vs. Ritchie

There's a Facebook meme going around right now that compares two highly influential men in the history of computing: Steve Jobs and Dennis Ritchie. The meme, however, really tries to make the case that Ritchie was not only more significant in computing history than Jobs was, but also that Jobs would have been easily replaceable while Ritchie's work was irreplaceable.

I have some problems with this analysis.

Dennis Ritchie

I don't want to give the impression in this comparison that Dennis Ritchie was uninfluential -- he made major contributions to the history of computing, helping to create the C programming language and the UNIX operating system. However, many of the things listed under Ritchie's name as unable to exist without him would likely have existed anyway, for two reasons:

1. Ritchie didn't work alone, and

2. Ritchie's work was largely derivative.

Ritchie began working at Bell Labs in 1967. One of the first projects he contributed to there was a computer operating system called Multics, being developed jointly with General Electric and the Massachusetts Institute of Technology. However, Bell withdrew from the collaboration in 1969; Honeywell, which acquired GE's computer business the following year, completed the project and sold Multics as a commercial product.

After dropping out of the Multics project, Ritchie and one of his co-workers, Ken Thompson, started working on a different project, originally called UNICS, a name that had become UNIX by 1971. Interestingly enough, the two things most computer folk know about UNIX today were not true when it was created -- UNIX was not a portable operating system (until much of it was rewritten from assembly language into C, it ran only on computers produced by Digital Equipment Corporation), nor was it open-source and free to use (a commercial license cost $20,000, and it bought the purchaser access to the UNIX source code, with no assistance in installing or running it).

The story of how UNIX became open-source, and tremendously popular in academic circles, is itself fascinating, but beyond the scope of this essay -- suffice it to say that it had a great deal to do with a consent decree that settled a 1956 anti-trust lawsuit against AT&T, restricting the company to common-carrier communications technology and requiring it to license its other patents, plus Ken Thompson's willingness to send the source code to universities on request. It was the government's break-up of AT&T, freeing the company from that consent decree, that led to Richard Stallman's work on GNU and the founding of the free software movement.

The eventual portability of UNIX to other computer platforms was made possible by another of Ritchie's creations: the C programming language. However, the C programming language was not itself wholly original, being largely a derivative of Ken Thompson's B programming language, itself a simplified version of the BCPL language designed by Martin Richards in the UK. In addition, Ritchie worked with yet another computer engineer, Brian Kernighan, to publish 'The C Programming Language', which became so popular among programmers that the book itself was often simply referred to as 'K&R', and the language it described as 'K&R C'.
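To give a sense of what 'K&R C' actually looked like, here is a minimal sketch (my own illustration, not taken from the book) using the old K&R function-definition style, in which parameter types are declared between the argument list and the function body:

    #include <stdio.h>

    /* Old-style (K&R) definition: the parameter types appear on
       their own lines, between the argument list and the body. */
    int add(a, b)
    int a;
    int b;
    {
        return a + b;
    }

    int main()
    {
        printf("2 + 3 = %d\n", add(2, 3));
        return 0;
    }

Compilers accepted this form for decades for backward compatibility, though the style eventually gave way to the function prototypes standardized by ANSI C in 1989.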

Kernighan and Thompson were not the only talented computer engineers working for Bell Labs in the late 1960s and 1970s, so it's difficult to imagine, despite Ritchie's clear contributions to both C and UNIX, that some other engineer (perhaps Douglas McIlroy or Joe Ossanna, who also worked on Bell's UNIX team) wouldn't have made the same contributions Ritchie did. In fact, Ritchie himself was being either highly humble or highly realistic when he frequently gave credit to others working with him for the advances in computing that were largely credited to him.

Lastly -- and it's a point that many folks who work with software concede only reluctantly, but eventually concede -- the C language, while fine for its time, was riddled with flaws and pitfalls that made it very frustrating to work with for anyone who wasn't an expert at reading the C manual. And though later languages were written to incorporate subsequent advances in computing (such as object-oriented programming), even those languages inherited a lot of C's crappy behavior. In short, had C never become as popular as it did, a better-structured, less flawed language might well have taken its place.
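As one small illustration of the kind of pitfall meant here (my own example, not from the meme): C performs no bounds checking, so the following program compiles cleanly while silently writing past the end of a buffer -- the result is undefined behavior rather than a clear error.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char name[8];
        /* "Dennis R." is 9 characters plus a terminating '\0' --
           10 bytes copied into an 8-byte buffer. strcpy does no
           bounds checking, so the compiler accepts this silently. */
        strcpy(name, "Dennis R.");
        printf("%s\n", name);
        return 0;
    }

Decades of security vulnerabilities trace back to exactly this class of mistake, which later languages either catch at runtime or rule out entirely.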

Steve Jobs

According to the Facebook meme, the only two things we'd be missing if Steve Jobs hadn't existed are 'iDevices' and 'over-expensive laptops'. While I can imagine someone being irritated with the media's lionization of Jobs, especially after his return to Apple in the late 1990s, this summary drastically understates Jobs's influence on the personal computing era.

Most people who aren't educated in the history of computers will probably say that the first personal computer was the IBM PC, introduced by IBM in August of 1981. There's only one problem with this history -- it doesn't explain why, within days of the IBM PC's August 12, 1981 debut, Apple Computer ran an advertisement in the Wall Street Journal that started with the phrase, "Welcome, IBM."

The reason Apple could publish such an ad was that they'd already been successfully selling a personal computer for four years -- the Apple II, first produced in 1977. The Apple II wasn't the only commercial personal computer being sold -- it competed with the Commodore PET and the Tandy TRS-80 -- but the success of these early personal computers (far greater than that of the previous generation of hobbyist machines, such as the Altair 8800) pushed other manufacturers into the market, including Atari and IBM.

This would, oddly, end up helping Jobs and Apple. In 1983, Commodore and Atari engaged in a vicious price war, in which Commodore's C64 computer began directly competing with Atari's home game consoles, which were already under pressure from other console makers. Apple and IBM, positioning their computers less as gaming devices (though games were certainly available) and more as general-purpose computing and business machines, escaped the worst of the crash that followed. Then Jobs's first great innovation hit.

The Apple II was mainly Steve Wozniak's creation -- he'd put together the design of the original Apple I, and the Apple II was largely a refinement of Wozniak's machine. But Jobs had his own idea of what a personal computer should be, starting with the famous '1984' television ad that both introduced the Macintosh personal computer and cemented the Super Bowl as a showcase for big-budget commercials. The Macintosh would go on to create the field of desktop publishing, through the convergence of the Macintosh itself, the Apple LaserWriter desktop printer, the PostScript printing and font technology developed by Adobe Systems, and Aldus's PageMaker program, which tied it all together in a 'what you see is what you get' presentation. Jobs didn't do it himself, and he technically wasn't even first (a programmer named James Davise had produced a layout program for a community newspaper that was briefly sold commercially in 1984), but the combination of these technologies created an entire industry.

And of course, if not for the success of the Apple Macintosh graphical user interface (GUI), Microsoft would not have been 'inspired' to create Microsoft Windows as a replacement for their command-line DOS operating system.

Being a part of the personal computing revolution and helping to invent the desktop publishing industry would be achievements as impressive as Ritchie's in the history of computing. But Jobs took things a step further after being forced out of Apple and then returning: Jobs's re-invented Apple moved from inventing new technologies to making existing technologies usable for the general public, effectively expanding the consumer electronics industry.

Jobs, like Ritchie, didn't achieve his highest successes alone. But many of Jobs's successes don't share the sense of inevitability that surrounds Ritchie's -- Ritchie was one of many talented engineers working at Bell Labs, but Jobs spearheaded the development of the iMac, the iPod, and the iPhone. None of those were the first products in their field; none were pioneers. Every one, though, expanded both the consumer electronics industry and Apple Inc.'s place within it.

Dennis Ritchie's achievements deserve celebration, and should be celebrated, in the history of computing. But in no sane universe are Dennis Ritchie's achievements considered superior to the achievements of Steve Jobs.
