Tuesday, November 25, 2008

Gastric DRM

"When political action gets to be as much fun and games as it got to be for Abbie Hoffman and some of his friends in the late 1960s, a huge bawdy confused drama in which you have a rousing revolutionary-hero role and the whole world is watching, you don't really have to have tangible political victory to feel 'empowered.' You are already getting a kind of gratification that is the envy of millions of others who would love to get themselves so solidly into the act. The drama becomes an end unto itself."
- Walter Truett Anderson, Reality Isn't What It Used To Be

If the Vietnam War provided a generation with the opportunity to, as Anderson would write, cast themselves in the role of revolutionary hero against a corrupt, evil establishment, then it's possible to argue that digital rights management (or DRM) serves much the same purpose for this generation. The leaders of the movement are as well-known as the leaders of the anti-war movement were in the '60s -- it's hard to argue that Cory Doctorow isn't as well known among intellectuals today as Abbie Hoffman was then. And like the '60s, the '00s have spawned a whole host of DRM agitators who cast themselves as revolutionary heroes fighting an evil establishment that wants nothing less than to control how you purchase and view media.

They're all horribly misguided, of course, reducing a complicated issue down to simple catch-phrases and good-guy/bad-guy Manichean thinking. (Heck, even the Wikipedia page on DRM is a giant advertisement for the anti-DRM argument and those who champion it.) Not that you can convince any of them of that. But maybe I can convince you.

First, a quick review of the concept. Digital rights management is a catch-all term that refers to any number of technologies with differing implementations but similar goals -- trying to protect the rights of content creators and distributors to profit from their creative work by making it difficult for people to enjoy protected work without paying for it. Put in those terms, which are generally the terms that DRM apologists use, the concept doesn't sound so bad. After all, the right of creators to profit from their creations is also an American ideal -- so much so that the U.S. Constitution specifically grants Congress the authority to pass laws to protect this right, at least for limited periods of time. (This also, ironically, makes the right of creators to profit from their work one of the few actual rights enumerated in the Constitution rather than in the Bill of Rights.)

DRM opponents, meanwhile, argue that once you purchase a work, you should have the right to enjoy that work in whatever format you choose. Buy a CD with a song, and you should be able to copy that song to your MP3 player to take it 'on the go' with you, and even burn it to a different CD so that you don't have to play your valuable original in your car stereo, where it might be damaged by the sun if it sits out on the passenger seat too long. Now, these arguments aren't in themselves bad arguments, even if they're not entirely 'portable' arguments, either. (Does your purchase of a book entitle you to photocopy every page so that you can leave the actual book in your personal library and consult only the loose-leaf copies when you want to kick back in the hammock on a summer afternoon?) The big problem I have with these arguments is that...well, they're misleading. DRM opponents like to portray the big media companies as hoping to milk consumers out of every last dime by having them purchase and re-purchase the same content they've been buying for the past 5, 10, or 25 years, simply in different formats. When it comes to digital media, they'll even accuse the media companies of forcing consumers to re-purchase content they bought just ten minutes ago.

One counter-example to this argument can be found on the recently released Special Edition DVD of Pixar's WALL-E. When you purchase the Special Edition DVD, you also get a code. Insert the DVD into a computer loaded with iTunes, and the DVD allows you to enter the code and access the iTunes store to download a copy of WALL-E at no extra charge. The copy you download is the same as any other iTunes movie you might purchase -- it can be played on any device you've authorized your iTunes account on, so if you have a desktop computer, a laptop, and an AppleTV unit, all of which are authorized to play iTunes purchases, any or all of them can play WALL-E even if you don't have the DVD handy.

I suspect DRM opponents will at least tentatively applaud this feature, even if they decry the use of iTunes DRM that limits the ability of the downloader to make copies of the movie at will. They might even suggest that other media companies follow Pixar/Disney/Apple's lead. (It's no big surprise, and yet another ironic turn in this tale, that both Apple and Pixar are tied to Steve Jobs -- he runs the former and co-founded the latter -- and Jobs himself is on record as saying he's not a fan of DRM, even as Apple's iTunes store is lambasted as one of the largest purveyors of DRM-protected media in the U.S.)

Here's the irony -- this is the future of DRM. This is what every media company would love to do, if only they could be certain that the extra copies they're giving away are only going to people who've already paid for one.

Allow me to make an analogy, and by doing so, change the subject for a minute.

Prior to 1967, morbidly obese people had few options when it came to losing weight. They could hope to have the willpower required to diet themselves to a healthier weight, or they could lose weight the hard way, through grave injury or disease. The former is rare even today, and the latter is so dangerous that it's no wonder no medical industry sprang up offering to infect the obese with tapeworms. Then, in 1967, the first gastric bypass surgery was performed. (Today, gastric bypass is but one class of what are called bariatric surgeries, but for the purpose of this analogy we're just going to look at the gastric procedure.) The first surgeries were effective, but fairly brutal -- my mother had one such surgery, and though she did lose an astonishing amount of weight, she has also lived ever since with an evil-looking scar running down her midriff from near the base of her sternum to her navel.

The procedure evolved, and today most gastric bypasses are performed laparoscopically -- that is, using a number of smaller incisions to introduce instruments (including cameras so the surgeon can see) instead of one large open incision. My sister had a laparoscopic bypass in the 1990s, and she also lost a significant amount of weight, with far less visible scarring than Mom has. Someday, perhaps, when medical procedures are performed the way Dr. McCoy does them on Star Trek, there may be no need for incisions at all.

The evolution of DRM is much like the evolution of gastric bypass. Recording technologies have always been looked at with skepticism by rights holders and distributors, and some have even taken legal action to prevent the spread of new technologies, but in the pre-DRM days legal action was really the only way for a rights owner or distributor to protect their investment.

Currently, we're in the pre-laparoscopic days of DRM, where the technology simply isn't advanced enough to allow for truly delicate and refined uses. DRM in this era acts like a blunt instrument, a gigantic scar across your digital media -- not because companies want to prevent you from enjoying that media (certainly no more than 1970s surgeons enjoyed giving people massive abdominal scars), but because that's simply the state of the art today.

Pixar/Disney/Apple's DRM on WALL-E shows what a laparoscopic DRM might look like -- with confidence that only paying customers are using the special feature of the DVD, the distributor lets you put that movie on multiple devices perfectly legally. More to the point, you don't have to figure out how to do that -- the content distributor has done it for you, as a way of differentiating its media from other media in the marketplace. People clearly want to enjoy their media in multiple formats -- that's why the DRM opponents' arguments are at least partly valid -- so media companies want to provide that flexibility, just not at the expense of swallowing a digital tapeworm (which is the only current alternative if you decide you don't want the scar of DRM on your media).

Perhaps some day (though hopefully not as long as the 24th century), DRM will be nearly invisible, allowing people who've purchased the right to do so the ability to port media to unlimited different formats and platforms. For today, though, the existing DRM is what we have.

As long as I'm being an apologist for Apple, the RIAA and MPAA, and any other group that wants to restrict 'digital freedoms', let me also say something you might not expect: I think the 'pirates' who specialize in breaking technological encryption and DRM schemes are in fact doing more good than harm. They're not entirely blameless, given that the focus of new DRM systems tends to be more on making them difficult for pirates to defeat than on making them accessible across more media platforms and formats. But still, if the pirates simply gave up tomorrow and surrendered to a specific flavor of DRM, there would be far less motivation for content distributors to invest in new DRM systems. Given that the pirates see themselves as heroes, though (much as Doctorow and the Electronic Frontier Foundation do), I expect I'll tweak a few noses by saying that, rather than heroes, they're really no more than a necessary evil in the larger story of the development of better and better types of DRM. (And if you want to go after me for using the term 'pirate' to describe these technologists, let me remind you that the aforementioned Jobs flew a pirate flag over the building at Apple Computer where the first Macintosh was being developed -- it's not just a pejorative term of art in this discussion.)

Though given Anderson's quote at the top of this essay, DRM opponents and pirates don't have to actually win the fight against DRM to be gratified by their role in the fight. So in effect, we all get what we want out of the process in the end.

Sunday, November 23, 2008

Setting Sun

In the mid-1990s, when the Internet was just beginning to expand beyond the domain of tech-literate nerds, I got into an argument with a friend about the similarities between Apple Computer and Sun Microsystems. His premise was that Apple and Sun were very similar companies, but that Sun had the more viable business plan going forward, and that Apple would be doomed if it didn't adapt its own business plan to be more like Sun's.

Fast-forward to November of 2008: Apple, Inc. (the company dropped the 'Computer' from their name during 2007) continues to see extremely strong sales of its flagship Macintosh computer, while Sun Microsystems seems locked in a death spiral, having lost over 95% of its stock value from its Internet-bubble peak, and being the subject of seemingly weekly speculative articles about which other tech giant will step in and snap up the struggling company (or not).

How did we get here? And why was my friend so wrong in his prediction? To understand the present, it helps to look back at the past -- in this case the distant past of the 1980s.

The Internet as we think of it today didn't exist back then, but computers were still fairly common, if not quite as ubiquitous as today. The two most commonly seen computer lines, both in homes and in schools, were built by two different companies: Commodore Business Machines (whose PET became popular in schools after its introduction in 1977; the Commodore 64, introduced in 1982, was more popular in homes), and Apple Computer (whose Apple II line, also introduced in 1977, seemed equally popular in homes and schools). It wouldn't be much of an exaggeration to suggest that the people who built the Internet largely grew up on Apple and Commodore machines (though other models, such as Radio Shack's TRS-80, had their proponents).

A trio of Stanford University graduate students founded Sun Microsystems in 1982 after one of them, Andy Bechtolsheim, developed the SUN-1 workstation from spare parts as a graduate project (SUN, at least with respect to that computer, stood for Stanford University Network). Apple, under the direction of co-founder Steve Jobs, developed the Macintosh during this period, releasing it for sale in 1984. (In the first of a number of similarities between the two companies, both machines ran on Motorola 68000-series processors.)

By the late 1980s and early 1990s, both companies had found additional innovations that contributed to further success. For Apple, the introduction of Macintosh System 7, which helped the company extend its dominance of the desktop publishing market, combined with the introduction of the first PowerBook in 1991 -- one of the earliest and most popular 'laptop' computers (as opposed to the luggage-sized 'portable' computers of the 1980s) -- allowed Apple to surge into what some observers would call its first 'golden age'.

Sun's innovation, meanwhile, would come not from within but from without -- in 1989, English computer scientist Tim Berners-Lee proposed a system of linked documents over a computer network and even developed a prototype of how that system would function, in effect creating the first web server. (Ironically, Berners-Lee used hardware and software produced by NeXT, the company founded by Steve Jobs after he was ousted from Apple in 1985.) While Sun continued to produce workstation-class machines (the company's stock symbol during this period -- SUNW -- stood for Sun Workstations, after all), it also aggressively pursued a place in the emerging market for server-class machines, and it successfully became one of the best-known server hardware manufacturers of the era.

Sun still employed a number of very smart computer engineers, however, and in 1991 one of them, James Gosling, began a project that would ultimately be released to the public in 1996 as Java, a new programming language intended to be platform-agnostic (the philosophy behind the language, 'write once, run anywhere', was very popular with the nerd crowd). By this time, the Internet was beginning to take the world by storm, and the two companies, superficially very similar**, were moving in very different directions -- Sun was selling servers as fast as it could manufacture them, while Apple was struggling with the baggage of its legacy desktop operating system and trying (and failing) to update it through internal re-invention projects.

** - Both Apple and Sun derived the lion's share of their revenue from hardware sales, but by late 1996 and early 1997, both companies were arguably far better known for their flagship software products -- MacOS and Java, respectively -- than for their hardware. In addition, both companies were seen as hardware specialists -- Apple in desktop publishing and, to a lesser degree, education; Sun in server hardware -- though both also had designs on a broader base of hardware sales. Lastly, both were seen as competitors of Microsoft Corporation, despite the fact that Microsoft was not itself a hardware company -- MacOS was seen as a direct rival to Microsoft's Windows operating system, while Java was seen as a product that could eventually make operating systems like Windows (and MacOS) obsolete.

It was at this time that my friend and I had our argument, and while his side continued to look good for about another five years, in the end it fell apart due to nothing more complicated than an unwillingness to think beyond the current business cycle. In fact, by the end of 1996, the seeds of both companies' 2008 fortunes had already been planted -- it would simply take time for each set of seeds to bear fruit.

Apple's bore fruit first (if you'll pardon the labored analogy). By the end of 1996, Steve Jobs had returned to Apple when then-CEO Gil Amelio decided to purchase NeXT with the intent of using the NeXT operating system as the core of a new Macintosh operating system. By mid-1997, Amelio had been removed as CEO and Jobs was back in charge, and just over a year later Apple released the product that would signal the beginning of its comeback: the iMac. Jobs also cancelled projects that Amelio and former CEO Michael Spindler had initiated to make Apple look more like other PC manufacturers, including the program under which Apple licensed the Macintosh OS to third-party hardware vendors.

Sun's failure at first looked like success, as the continuing Internet boom led to massive sales of Sun's server hardware, driving the company's stock price up to about $250 per share. Seeing the momentum building behind Java, Sun used its profitable server hardware business to bankroll the development and maturation of the language, and though the company chose not to give away ownership of Java, it committed to the concept of 'open source' software (while remaining the language's primary steward and making most of the significant advances and improvements itself).

By 2001, it was clear that both companies' fortunes had reversed. Apple had released MacOS X, the 'modern' operating system the company had been trying to develop for nearly a decade in various projects and flavors (Copland, Rhapsody, etc.). Apple also opened its first retail stores, ending a long-time practice of serving as supplier to small, expensive 'boutique'-style proprietors. (Apple had already opened its online store back in 1997, a year before the iMac was released.) Steve Jobs had buried the hatchet with Microsoft, long seen as an Apple competitor, settling long-standing lawsuits and announcing a partnership in which Microsoft agreed to continue producing Microsoft Office for MacOS and invested $150 million in Apple. Last, but obviously not least, Apple introduced the iPod -- dismissed by longtime technology pundits as 'just another MP3 player', the iPod did what previous models hadn't: it established a default brand and configuration in the minds of non-tech-savvy consumers, turning Apple from a commodity tech hardware company into a true consumer electronics company.

Sun, meanwhile, was just beginning to see hardship. The sudden deflation of the Internet bubble hit Sun's hardware sales hard, as companies that had purchased large 'farms' of brand-new Sun servers declared bankruptcy and sold those same servers for pennies on the dollar to budget-conscious rivals. Unlike Apple, Sun embraced the role of Microsoft antagonist, not just via Java (the source of another large batch of lawsuits brought against Microsoft), but through Sun's acquisition of the StarOffice productivity suite to serve as a direct competitor to Microsoft's other flagship product, Microsoft Office. Lastly, Sun's market position as a server vendor was beginning to be challenged by competitors with much more focused hardware strategies, specifically IBM, Hewlett-Packard, and Dell.

Over the past few years, my friend's argument has been strangely inverted, with Sun taking actions that Apple was taking, or had already taken, in an attempt to remain profitable. Sun settled its long-standing legal disputes with Microsoft, supplemented its proprietary SPARC-based workstations and servers with products built on x86 processors from AMD and Intel, and even began aggressively seeking new markets to enter. Unfortunately, the company has had just a handful of profitable quarters since that time and, as noted above, has been hemorrhaging money and stock value so badly that many think Sun is ripe for a takeover bid. Yet the most significant thing of value left in Sun's portfolio, the Java programming language, was effectively given away by company executives in late 2006, when they chose to license the language and related code under the terms of the GNU General Public License. Though Sun may have intended to prevent Java from being swallowed up and buried in a hostile takeover, the reality is that even Sun, the creator of Java, will now have difficulty turning a profit from it.

The two companies have one final, interesting similarity -- as of mid-2007, they had very similar numbers of employees: Apple employed approximately 32,000 workers, while Sun reported just over 33,000. The similarity ends there, however -- Sun's 33,000 workers produced just $13.8 billion in revenue in 2007, and $408 million in profit over the same period, while Apple, with fewer employees, generated $32.48 billion in revenue and $4.83 billion in profit. In the first quarter of 2008, Sun posted a loss of $1.68 billion, while Apple posted a record $1.58 billion profit over the same period.
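To put those figures in per-employee terms -- a quick back-of-the-envelope calculation using only the numbers quoted above, with the rounding mine -- a few lines of Python tell the story:

# Per-employee revenue and profit, using the figures quoted above.
def per_employee(revenue_billions, profit_billions, employees):
    # Convert billions of dollars into dollars per employee.
    return (revenue_billions * 1e9 / employees,
            profit_billions * 1e9 / employees)

# Sun, 2007: roughly $418,000 in revenue and $12,000 in profit per employee.
print(per_employee(13.8, 0.408, 33000))

# Apple, same figures as above: roughly $1,015,000 in revenue and
# $151,000 in profit per employee.
print(per_employee(32.48, 4.83, 32000))

Call it $420,000 of revenue per Sun employee versus roughly a million dollars per Apple employee -- and the gap in profit per employee is starker still.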

The two companies are once again rapidly moving in opposite directions, and this time, it is Sun that looks like the company on the road to oblivion.

Monday, November 10, 2008

Prejudice -- By the Numbers

So apparently you can now download a video podcast of the entire Rachel Maddow show, complete with no commercial interruptions. That's just five different kinds of awesome in a nice tasty blend, but it's not the subject of this note. It was while watching Maddow's first podcasted show that I stumbled across a statistic that floored me -- exit polls suggesting that approximately 70% of African-American voters in California voted in favor of the anti-gay-marriage Proposition 8.

As anyone who knows me can tell you, once I get ahold of one interesting number, I like to juggle it up with some other interesting numbers, and this turns out to be one really interesting number. Let's start by saying, to keep the examples easier, that nine in ten African-American voters in California voted for Barack Obama for President. (That's probably a bit of an underestimate, as exit polling suggests that about 94% of African-American voters in California voted for Obama, but as you'll see, underestimating this percentage won't change the conclusion.)

Using a very simple statistical rule, if you know the percentage of the time one thing happened, and you know the percentage of the time another thing happened, you can determine the percentage of the time those two things happened together -- assuming the two things are independent -- by multiplying the two percentages. For instance, let's matrix up the Obama vote among California blacks:
   1   2   3   4   5   6   7   8   9   0
1  x   x   x   x   x   x   x   x   x   .
2  x   x   x   x   x   x   x   x   x   .
3  x   x   x   x   x   x   x   x   x   .
4  x   x   x   x   x   x   x   x   x   .
5  x   x   x   x   x   x   x   x   x   .
6  x   x   x   x   x   x   x   x   x   .
7  x   x   x   x   x   x   x   x   x   .
8  #   #   #   #   #   #   #   #   #   +
9  #   #   #   #   #   #   #   #   #   +
0  #   #   #   #   #   #   #   #   #   +
The horizontal axis here represents the percentage of votes for Obama, while the vertical represents the percentage of votes for Prop 8. Each different marker, thus, represents one of the four possible outcomes:

x - voted for Obama and in favor of Prop 8 (63)
# - voted for Obama and against Prop 8 (27)
. - voted for McCain and in favor of Prop 8 (7)
+ - voted for McCain and against Prop 8 (3)

The total adds to 100, as you'd expect. What you wouldn't expect is that a significant majority of African-American voters who voted for Obama also appear to have voted in favor of Prop 8.

Of course, it's not that simple -- I say 'appear to have voted in favor' because it's clearly not the case that these two outcomes are independent, as they would need to be for the numbers to be completely valid. All you have to do is look at the exit poll data for yourself to realize that support for Prop 8 wasn't the same across party lines. But here's the thing that makes the numbers interesting -- let's say we tweak the numbers so that every McCain voter voted in favor of the amendment. That allows us to change those three + votes into . votes, but it also messes up our original assumption, because now more than 70% of African-American voters are supporting Prop 8. To put the accounting right, we have to change three supporting votes for Obama into non-supporting votes, which lets us move three x votes into the # column and leaves us with this:

x - voted for Obama and in favor of Prop 8 (60)
# - voted for Obama and against Prop 8 (30)
. - voted for McCain and in favor of Prop 8 (10)
+ - voted for McCain and against Prop 8 (0)

Now the original assumptions are back in place, but we've still got African-American Obama supporters voting 2-to-1 in favor of the amendment. Unfortunately, there's no more juggling to be done with the numbers -- unless you want to argue that the exit polls are simply wrong, this result is fairly close to the reality of the voting on this issue.

(Two other points. Perhaps you can now see why underestimating African-American support for Obama wouldn't change the outcome: for every voter we've identified as a McCain voter who actually voted for Obama, we can't change how they voted on Prop 8, and we're already assuming *every* McCain voter voted for Prop 8, so reclassifying McCain votes as Obama votes only increases the margin of support for Prop 8 among Obama voters. Also, our assumption that every McCain voter voted in favor of Prop 8 is actually less realistic than our assumption that support was identical across party affiliations -- the same exit polls show that while 64% of Democrats voted against the amendment, only 82% of Republicans said they voted for it. If that breakdown is similar to the breakdown among African-American voters for Obama and McCain, then our quick-and-dirty method is identifying fewer Obama voters who voted for the amendment than actually exist.)

One more chart, for the benefit of those preparing to argue that white voters also voted in favor of Prop 8, which is absolutely true. For this chart, we'll assume that 60% of white voters voted for Obama, and that 60% voted in favor of Prop 8. To keep the comparison even-handed, we'll use the same legend as in the chart above:
   1   2   3   4   5   6   7   8   9   0
1  x   x   x   x   x   x   .   .   .   .
2  x   x   x   x   x   x   .   .   .   .
3  x   x   x   x   x   x   .   .   .   .
4  x   x   x   x   x   x   .   .   .   .
5  x   x   x   x   x   x   .   .   .   .
6  x   x   x   x   x   x   .   .   .   .
7  #   #   #   #   #   #   +   +   +   +
8  #   #   #   #   #   #   +   +   +   +
9  #   #   #   #   #   #   +   +   +   +
0  #   #   #   #   #   #   +   +   +   +
x - voted for Obama and in favor of Prop 8 (36)
# - voted for Obama and against Prop 8 (24)
. - voted for McCain and in favor of Prop 8 (24)
+ - voted for McCain and against Prop 8 (16)

Again, we know that these two positions are not truly independent, and that more than 60% of Republican voters voted in favor of the amendment. If we assume that the percentage of McCain voters in favor of Prop 8 is closer to 80% (as noted above), then we can move half of the 'against' votes into the 'for' category and, to balance the scales, move the same number of Obama voters from 'for' to 'against', leaving us with this:

x - voted for Obama and in favor of Prop 8 (28)
# - voted for Obama and against Prop 8 (32)
. - voted for McCain and in favor of Prop 8 (32)
+ - voted for McCain and against Prop 8 (8)

We now have a slight majority of white Obama voters voting against Prop 8. If you take into account that the actual percentage of white voters supporting Obama in California was closer to 50% than 60% (51%, according to this same exit poll), it becomes clear that white voters seemed more likely to vote against Prop 8 than to vote for Obama, which might even be construed as prejudice in the opposite direction. The difference, however, isn't nearly as dramatic as the one for the African-American voters noted above.

Again, these are really just quick-and-dirty estimates based on one set of exit polls, and they might not reflect the underlying reality; it would certainly be easy enough to dismiss the poll as flawed. The next question, for someone of a scientific mind, is whether there is additional data that can confirm or refute the hypothesis suggested by this one: that California's African-American voters appeared significantly more prejudiced against gays and lesbians than their white counterparts in the 2008 election cycle.
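For anyone who'd rather let a computer do the juggling, here's a minimal sketch of the same arithmetic in Python. The percentages are the same rounded exit-poll assumptions used in the charts above (so this is illustration, not analysis), and the function names are just labels of my own invention.

# Back-of-the-envelope sketch of the number-juggling above.
def joint_counts(obama_pct, prop8_yes_pct, total=100):
    """Split `total` voters into the four Obama/Prop 8 combinations,
    assuming the two votes are statistically independent."""
    obama, yes = obama_pct / 100, prop8_yes_pct / 100
    return {
        "Obama, yes on 8": round(total * obama * yes),
        "Obama, no on 8": round(total * obama * (1 - yes)),
        "McCain, yes on 8": round(total * (1 - obama) * yes),
        "McCain, no on 8": round(total * (1 - obama) * (1 - yes)),
    }

def assume_all_mccain_yes(counts):
    """The adjustment described above: move every McCain 'no' vote to
    'yes', then move the same number of Obama voters from 'yes' to 'no'
    so the overall yes/no totals still match the exit polls."""
    moved = counts["McCain, no on 8"]
    adjusted = dict(counts)
    adjusted["McCain, yes on 8"] += moved
    adjusted["McCain, no on 8"] = 0
    adjusted["Obama, yes on 8"] -= moved
    adjusted["Obama, no on 8"] += moved
    return adjusted

# African-American voters: ~90% Obama, ~70% yes on Prop 8.
print(joint_counts(90, 70))                          # 63 / 27 / 7 / 3, as in the first chart
print(assume_all_mccain_yes(joint_counts(90, 70)))   # 60 / 30 / 10 / 0

# White voters: ~60% Obama, ~60% yes on Prop 8.
print(joint_counts(60, 60))                          # 36 / 24 / 24 / 16, as in the second chart

The white-voter adjustment above moves only half of the McCain 'no' votes (assuming roughly 80% Republican support for Prop 8) rather than all of them, but the mechanics are exactly the same.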

Friday, November 07, 2008

Election Thoughts

So I've been slacking off about posting to the blog, but I wasn't alone -- Websnark's Eric Burns-White hadn't posted much either (as in anything) in three months leading up to the election, so I figured I was in good company.

Then Burns-White posted a retrospective based on his reaction to the historic Obama election, and it again reminded me of how similar we are.

On the morning of November 4, 1992, I walked out of the front door of my apartment into a bright, sunshiny Yuma morning, feeling better than I had in years. After three terms of Republican executive rule, dating back to my earliest political memories, a Democrat was finally going to be occupying the White House again, and I knew, just knew, things were going to work out. That's not to say the Republicans didn't do everything in their power to throw the train off the track, but by the time Bill Clinton's second term was over, the United States had helped stop ethnic cleansing in the former Yugoslavia without suffering a single American combat death, had turned a seemingly intractable budget deficit into a projected budget surplus, and had successfully prosecuted America's enemies both foreign (those who carried out the 1993 attack on the World Trade Center) and domestic (the Oklahoma City bomber and his associate).

I was certain that Al Gore would beat George W. Bush in 2000, and I still believe it was Bush's political connections to the Supreme Court that ultimately won him the election. I was also confident that John Kerry would defeat Bush after a disastrous first term in which Bush basically ran on the platform of 'sure, we screwed up, but if you give us another chance we'll do better.' Well, 'we' didn't do better, as the housing crisis, the credit crisis, the war in Iraq, and the rest all seem to demonstrate.

Now, finally, a Democrat is headed back to the White House, and not just any Democrat but one of historic proportions -- Obama is the first African-American (or 'minority' of any stripe) elected to the presidency in American history. Yet on the morning of November 5, I did not feel the same sense of overwhelming relief and confidence I felt in 1992.

One reason, pretty clearly, is my own age -- sixteen years ago I still had a pretty sizable share of my idealism, and while it's not all gone, experience, particularly when it comes to the political game played at the highest levels in this country, has shown me that idealism doesn't really count for all that much. You can't wish yourself out of a recession or clap your way home from an ill-conceived war.

But there are two other concrete reasons why Obama's election doesn't fill me with the same sense of optimism and expectation that Clinton's did:

1. The state of the country and the world is different and worse as Obama prepares to take office than it was when Clinton took office.

Though Bill Clinton won the 1992 presidential campaign on the strength of his advisor James Carville's now-famous 'It's the economy, stupid', the primary economic indicators weren't nearly as sour then as they are today -- in 1992, the country was widely accepted to be in a recession, but there wasn't any indication that the recession was going to end in disaster. Largely, the recession was seen as the natural aftereffect of Reagan's profligate economic policies of the 1980s coming home to roost.

In 2008, many economic indicators are as bad as they've been since the Great Depression, and though some economists still staunchly insist that the country isn't even in a recession, others wonder if this will prove to be the start of a new Great Depression -- certainly, the billions (and soon to be trillions) of dollars both Democrats and Republicans are agreeing to throw down the maw of the economic crisis suggest that folks in the know are desperate to do something, anything, to at least slow the tide of economic meltdown.

In 1992, the US was part of a UN-sanctioned peacekeeping force that entered Somalia in December (as much as conservatives like to blame Clinton for Somalia, the entire set-up was Bush Sr.'s baby), but no other U.S. troops were in an active combat zone or stationed anywhere they'd be likely to be directly in harm's way. Most of the troops committed as part of the UN mission to Somalia had come home by May of 1993, leaving only a few 'advisors' to assist in the operation.

In 2008, U.S. troops are directly in harm's way on two fronts: Iraq and Afghanistan. Some are reluctant to say we should withdraw from Iraq for fear of leaving it in the same state Reagan left Lebanon after he withdrew U.S. troops following the infamous barracks bombing (though given Iraq's far greater resources, it's unlikely that Iraq's neighbors would simply sit back and watch it descend into 'failed state' status -- they'd rush in to grab the oil for themselves). Others are reluctant to withdraw from Afghanistan since, after all, that's where the masterminds of the 9/11 terror attacks still hide from justice. Both conflicts are expensive in terms of troops and resources, and both are engagements that the U.S. has, as a result of the Bush Doctrine, essentially chosen to prosecute alone -- withdrawing U.S. troops from either front, to any appreciable degree, would be tantamount to abandoning all pretense of interest in prosecuting action on that front.

In 1992, the conservative movement was united behind little more than vague memories of Reagan's 'Morning in America' campaign ads from 1984 and a somewhat outdated jingoism that most moderates rejected. By 1994, conservatism had effectively re-invented itself as the small-government ideology first popularized by William F. Buckley, Jr., and made use of both overtly political (Newt Gingrich) and quasi-political (Rush Limbaugh) figures to organize this renaissance. By the late '90s, the resurgent conservative movement had even made common cause with the evangelical wing of the Republican party, and that alliance allowed for the undoing of the Clinton years.

In 2008, the figures that transformed conservatism are still around, but they're just shadows of their former selves: Gingrich doesn't even hold public office anymore, Limbaugh's show has been in a ratings decline since early 2005, and the once-powerful Fox News Channel has become a collection of whiners and irrelevant talking-heads whose great contribution to political debate is the screaming of Bill O'Reilly. As the philosophical conservatives have declined in influence, the evangelicals have risen, and now Obama faces an opposition that doesn't just believe that his policies are wrong-headed and non-sensical; they actually think Obama's views are anti-religious and blasphemous. Sarah Palin is currently the figurehead for this nihilistic wing of the political right, but just as Gingrich gave way to DeLay, Palin will be replaced by someone at least superficially more competent, and it'll be this person who becomes the greatest enemy liberal democracy has in the 21st century.

I'm only comforted somewhat by a variant of the old saw about women and success -- you've got to be twice as good as a man to get half the credit, but luckily that's not too difficult -- and realize that Obama wouldn't be in this position if he wasn't highly competent. He'll also have smart allies to draw on if he knows where to look -- Minnesota's own Keith Ellison would be an outstanding resource, for one. Still, Obama's road is going to be significantly harder than Clinton's was, so it's harder to get excited about how he'll walk it.

2. For all the talk of how historic Obama's election is, and how it signifies a 'post-racial' America, prejudice is still alive and well, and it could destroy everything progressives are trying to build.

There was a lot of gnashing of teeth over the so-called 'Bradley effect', where voters would tell pollsters they were going to support Obama and then be unable to actually vote for him once they were alone in the voting booth with their prejudices. Though that specific effect didn't seem to materialize, it would be wrong to claim that prejudice thus played no role in the 2008 elections.

For starters, take a gander at the three extremely prejudicial anti-gay propositions that were passed by state voters, including in California, where Obama won by twenty points with 59% of the popular vote. In effect, a sizable number of California voters decided that they could live with a black president, but not with gay men and lesbian women getting married.

The point of racism isn't race, per se, but prejudice -- prejudice based on skin color or national origin. While it certainly seems true that there's less prejudice against blacks today than there was decades ago, that doesn't mean there's less prejudice overall: those who feel prejudice simply seem to have changed targets, is all. (And before we completely write off racial prejudice, let's at least admit that, had Obama run his real campaign the way Chris Rock ran his fictional campaign in the 2003 movie 'Head of State', he absolutely would not have won.)

And of course, the great power of prejudice is that it can drive small numbers of people, or even just one person, to do something that changes the world, for good or ill. Nineteen men crashed four airplanes and nearly destroyed the American way of life in the process, just as an example. And presidents offering as much hope as Obama does have been killed by those whose prejudices wouldn't let them rest. I'm convinced that there will be at least one attempt on Obama's life during his first term as president; I simply pray that any attempts fail, or the assassination of the first African-American president might presage an even more precipitous fall from American ideals than we've seen over the past eight years.

So let me be clear -- I'm not at all disappointed or unhappy that Obama was elected. Heck, I voted for him. But with older eyes, and knowing that the world isn't as ready for Obama, in more ways than one, as it was for Clinton, I can't find myself feeling the same sense of giddy optimism that I did back in 1992.

And maybe that'll be for the best. After all, idealism doesn't pay the bills, and there are a lot of bills for America to pay right now.