Sunday, December 28, 2008

Bookley - Year One

Almost exactly one year ago, I purchased my first-ever laptop computer: a MacBook Pro which I dubbed Bookley. I had been tempted to wait until the new product announcements at MacWorld Expo last year, in order to avoid buying a computer that would immediately be out-of-date, but desire overcame my caution and I ended up sauntering into the Apple Store in Southdale to make my purchase.

As it turned out, waiting wouldn't have made much difference -- an update to the MacBook Pro line was announced later that January, but it brought only a small increase in processor speed and a slightly larger hard drive. The 'Book I'd bought was hardly rendered obsolete by that refresh.

Now, after a year, if I had to describe my MacBook Pro experience in one word, that word would be 'convenient'. Though I could likely survive just fine without Bookley, he makes my life easier in a number of ways:


I work for a company doing application support, and once every three to four weeks have to be the on-call person for emergency weekend or overnight calls. Once I was able to configure Bookley to use the company VPN to allow remote access to my work PC, I no longer had to worry that any call would require me to hop in the car and drive to work just to see what the problem was. In fact, the ability to remote into work was more convenient than just connections from home -- I was also able to connect from activities like the Tuesday night bridge group or the Friday gaming sessions without having to completely remove myself from the event. The money, time, and gas I've saved just in the last year is probably enough to justify Bookley's cost right off the top.

Of course, all good things must end -- our new corporate parent, in accordance with their network policies, is going to shut down remote access for all employees at our site, so the single biggest advantage of having Bookley is about to go out the window, through no fault of mine or my laptop's. C'est la vie.


Bookley enhances my free time in a number of ways:

- My current desktop machine is a Mac mini from a few years back, when the line still ran on the G4 processor. While this is good in a number of ways (it allows me to continue to use the backward-compatible OS 9 layer to run my really old Mac software), the bad part of using the Mini as my main machine is that many new games and other software programs are being released only for Macs with higher-end processors. (Civilization IV was one such title.) Bookley lets me run those programs.

- By setting up a partition on which to install Windows using Boot Camp, I can also participate in games with no Mac client. The main advantage this has been for me is in interacting with Wizards of the Coast's software development arm, who has tied themselves to Windows development as the smallest investment leading to the largest possible payoff. When I boot Bookley into Windows Vista, I can run Magic Online and the beta of the D&D 4.0 Character Creator with no problems. (I've also been playing City of Heroes on the Windows side, but with NCSoft's announcement of an upcoming Mac client, I'll be switching from PC to Mac for that game soon!)

- By using scanning and OCR software, I'm able to carry most of my D&D 3.5 reference books on Bookley's hard drive rather than in an arm-breaking pile of physical paper. This makes it a lot easier to find a relevant spell, rule, or other piece of gaming minutiae on demand in the middle of a fight at our Friday D&D sessions.

Bookley also helps me keep up with this blog, with the additional aid of the MarsEdit blogging package, a drop-dead worthwhile purchase if you plan to do any significant online writing. I use MarsEdit both to post to this blog and to put in my share of work on The AL Central Blog (though we'll see how that blog fares in the off-season).

I was even able to take Bookley on the road and make a few posts from far-away places, like GenCon 2008.

Not everything I've done with Bookley has been filled with win -- I got into the preview for MobileMe and discovered that I didn't use it nearly enough to justify the $100 per year price tag, for instance. On the whole, though, owning Bookley has been a great experience, and well worth the price.

Still, my resolution for 2009 is to do something so significant with Bookley that I'll feel compelled to finally set up some kind of backup regimen for him.

Saturday, December 20, 2008

Another Linux Advocate Loses Grip On Reality

Even though I'm still a fully functioning Mac Bigot (TM), I've largely gotten out of the habit of arguing my chosen platform's superiority on Internet forums (fora, for those of you who still take Latin classes). Part of the reason why is that I've decided the very practice of arguing on the Internet is unhealthy -- it's geek-meth. It gets you completely wired and hyperactive, even to the point of missing sleep and arguing incredibly trivial points (was it the WA-150 or the WA-100 that first allowed you to use a non-powered stylus?), while the addiction slowly drives you away from friends and family and ravages your physical and mental health. In the end, you'll abandon your own sister's wedding for one more chance to defend the Apple Newton again on ZDNet.

Which brings me to the point of this essay -- there are some places where I just don't bother to go for tech information, but where I can go to get a chuckle out of the natives. These places feature writers who are, for the most part, true believers, utterly unfettered by the bounds of reality as they defend their own pet opinions and activities as blessed by the Holy Writ of St. Linus; it's also healthy to realize that there are people out there with less of a life than I have. Among the places I go solely to chuckle over the natives' misunderstandings of all things Apple*: Gizmodo, Wired, and ZDNet.

* - This is not to say that every writer at Gizmodo or Wired is an anti-Apple jihadist or ignorant Linux-worshipper; sometimes I'm pleasantly surprised at the intelligence of a given writer and end up following his writings even against my normal distaste for his home digs. Sometimes, as in the case of John Siracusa of Ars Technica, a single writer can single-handedly rehabilitate a site I thought was hopelessly awash in inanity and ignorance.

Today's spit-take worthy batch of self-delusion dripped from the virtual pen of Jack Wallen of ZDNet, in a blog entry entitled, "10 things Linux does better than OS X". Rather than going point-by-point, which would suggest a dangerous relapse into geek-meth huffing, I'll just hit the highlights:

- It may sound strange, seeing as how OS X is based on a Linux variant...

Um...excuse me? The problem with this claim isn't its sheer ignorance and easy disprovability, but that it's far from the first time I've heard it, which suggests that some Linux evangelist out there is deliberately spreading misinformation that the supposedly bright Linux community doesn't bother trying to correct.

Saying that Mac OS X is based on a Linux variant is like saying that the Ford Taurus is based on a Chevy Volt variant -- not only did Linux enter the scene far later than the UNIX flavor used at the core of Mac OS X, the two operating systems don't even operate on the same set of core code.

First, a quick history lesson:

While Mac OS X is a relative newcomer to the field of computer operating systems, having been first released by Apple as Mac OS X Server in 1999, its pedigree can be traced back quite a ways. Mac OS X's direct ancestor is an OS known as Nextstep, developed by NeXT, a computer company founded by Apple founder and Mac co-creator Steve Jobs after he was removed from the leadership of Apple in the mid-1980s. Nextstep was itself derived from previous 'Unix-like' operating systems, primarily the Berkeley Software Distribution and variants of that 'flavor' of OS. The Berkeley distribution (also known as BSD) was the first alternate 'flavor' or distribution of the original UNIX operating system developed by Bell Labs during the 1970s. Though BSD is often referred to as a UNIX OS, and one of the original creators of UNIX has suggested that many OSes derived from UNIX are effectively UNIX systems, the actual designation of an operating system as a UNIX system can, today, come only from the trademark owner, The Open Group, which requires any OS seeking to use the UNIX trademark to abide by the Single UNIX Specification -- an interoperability standard developed in response to the explosion of commercial UNIX systems after the then-owner of UNIX, AT&T, allowed the OS to be licensed for use by other vendors.

The origins of Linux rest in the foundation of the GNU project in the 1980s by free software advocate Richard Stallman, at least partly in response to the growing commercialization of UNIX itself. GNU is a self-referential acronym standing for 'GNU's Not UNIX'. Though Stallman's goal was to produce a complete UNIX-like operating system based entirely on free and open-source software, the hardest tasks in the process -- the development of device drivers and the kernel, those parts of the operating system that interact directly with computer hardware -- had not been completed nearly a decade after Stallman started the project. Because of this, Finnish programmer Linus Torvalds wrote his own kernel and released it under the GNU General Public License. Though this release was not immediately accepted by all open source advocates -- Torvalds famously (in *nix circles, anyway) engaged in energetic discussions on Usenet with MINIX creator Andrew S. Tanenbaum for some time after the release of his kernel -- eventually the Linux kernel became the popular choice and was married to the rest of the GNU OS in a conglomeration that has become known to the world as Linux, but is properly referred to (and even officially referred to by some distribution vendors) as GNU/Linux.

So while Mac OS X is technically younger than Linux, the core pieces of the operating system existed as part of the Berkeley distribution before Richard Stallman even began GNU, to say nothing of Torvalds's own development of the Linux kernel. Interestingly, while neither BSD nor GNU/Linux has been certified against the Single UNIX Specification, Mac OS X 10.5 has been, and as such can be referred to as a UNIX operating system -- something Linux cannot technically claim.

Mac OS X is no more a variant of Linux than Linux is a variant of Mac OS X. The two grew from completely different development trees, though the original UNIX was the godfather to both OSes.

- Although most OS X users would balk at this (saying they have no use for the command line), most power users know the command line is crucial to serious administrative tasks.

Here lies one of the biggest misconceptions that Linux advocates have about computers.

I will freely admit that, for people whose job is to operate and administer computers, the various flavors of UNIX and *nix are generally more powerful and effective tools in meeting the challenges posed in those jobs.

Here's the problem: most people don't operate computers for a living. Most people use computers in an attempt to perform a job that has little to do with computers themselves. They're salespeople, or administrative workers, or service workers. Even in a company whose reason for existence is to operate or administer computers, there will be numerous people who don't actually have that task in their job description. The receptionist at Oracle headquarters, for example, doesn't need to know squat about how to write a SQL query, and Steve Ballmer's executive assistant at Microsoft doesn't need to know how to troubleshoot ActiveX.

These people don't need the command line and the 'power' it provides. These people need an operating system that simply stays out of the way of them doing their actual jobs. In other words, most Mac OS X users really don't have any use for the command line, especially since learning it means learning far more about how *nix computers work than they need to know in order to do those jobs.

Here's an analogy that I like to use to illustrate where I think the computer industry is going:

In the early part of the 20th century, the automobile was invented. While the earliest automobiles were largely curiosities, some folks found them useful; most people, however, couldn't imagine doing anything with the very early cars that they couldn't do with their already ubiquitous horses. However, as manufacturing processes grew more sophisticated, eventually an explosion of manufacturers and models ushered in the automobile's first great era, the era of the Chalmers, lasting through the end of World War I. War and economic woes helped narrow the range of manufacturers, but the coming of peace and prosperity, and the deliberate development of the US Interstate Highway System, helped regenerate the automobile industry, causing it to enter its true Golden Age in the 1950's. In those days, it seemed that every family owned a car, and every man was at least a passable mechanic. As time passed and car ownership went from a family to a personal experience, though, the needs of car owners slowly began to change. American car companies were slow to pick up on those changes, including increased passenger room on the one hand and increased fuel efficiency and reliability on the other (the latter at least partly triggered by the oil shocks of the 1970s). Though the Chrysler Corporation nearly went bankrupt during the 1980s, American car companies comfortably went back to what they thought they knew best, designing big, 'manly' cars and other vehicles they thought the public should want, rather than the kinds of cars that the public actually needed. Another economic sea change later, and all of the Big Three are lined up at the Capitol Building seeking handouts to stay in operation so that they won't finally fall, victims of their own hubris and presumptions regarding what a car should be.

Though the parallels aren't perfect, the analogy is pretty striking: the earliest days of computing were very much like the earliest days of the car, with computers being considered expensive curiosities except in limited environments. Once proven in those environments, though, advances in manufacturing allowed for the first great explosion of 'personal computers', including the IBM PC, the Radio Shack TRS-80, the Commodore PET and C-64, and the Apple II and even the earliest Macintoshes. Though they served the purpose of educating a generation of computer mechanics, most folks at the time viewed these early machines as little more than expensive toys (save for the few that could be used for serious business tasks), and it took the development of the Information Superhighway to kick-start personal computer adoption both in homes and smaller businesses. Today's computing society is much like the automotive society of the 1960's -- those who know the most about computers tend to assume that what they like about computers is not just what's good about them but what is actually meaningful and valuable about them, ignoring a growing segment of users who don't use or need to use computers in the same manner they do.

In the 1970's you could seemingly find a mechanic in every gas station. In the 2000's few gas stations have garages associated with them, and few drivers see the need to be able to tune an engine or tweak plugs and points themselves.

By the time the twenty-somethings of today posting and writing for Gizmodo and ZDNet about the wonders of open source software start considering their own retirement, they'll be wondering how the world of computing left them so far behind, and why it is that nobody cares what flavor of *nix is being run on the latest gadget.

Wednesday, December 17, 2008

So much for that idea...wait...

I went to the D&D Experience in Washington DC back in 2007, and sat in the audience as Scott Rouse, head marketing maven for the D&D brand, told the assembled crowd at the designers' keynote that Wizards of the Coast was hoping to make the convention the D&D equivalent to the MacWorld Expo conference held annually in San Francisco. It would be a place where gamers could go to celebrate all things D&D, from the role-playing game to the then still-fairly-new tactical miniatures game. My main reason for attending the 2007 D&DXP was to try to win the Limited Championship in that miniatures game; though I didn't succeed, I did end up qualifying for the 2008 Constructed Championships, so the trip was still, from that perspective, a tremendous success.

Not quite two years later, and perhaps the plan needs to be modified, as Apple today announced that 2009 would be Apple's last year exhibiting at MacWorld Expo. Some very smart Apple commenters have noted how this might allow MacWorld to become an even better convention, though the general consensus seems to be that MacWorld is likely doomed without the presence of Apple as a flagship sponsor -- and that's probably not all bad.

(Aside: the Apple-centric blogs, as well as the Apple-centric media (MacWorld magazine, Mac|Life, etc.) are focusing on what this means for MacWorld Expo, but the MSM, so to speak, has decided it's more interested in talking about Steve Jobs' health rumors...again. *sigh*)

So what does that mean for D&DXP? Well, Wizards has already ended their official support for the skirmish game, handing the reins to a volunteer organization called the DDM Guild. So you might say that, in a way, Wizards beat Apple to the punch by a couple of months.

When the D&D Experience was primarily a convention supporting the Role Playing Gamers Association's tabletop roleplaying games, it was called Winter Fantasy. Perhaps that name will come back now, given that RPGA events appear to be all that will remain at D&DXP beyond 2009.

Tuesday, November 25, 2008

Gastric DRM

"When political action gets to be as much fun and games as it got to be for Abbie Hoffman and some of his friends in the late 1960s, a huge bawdy confused drama in which you have a rousing revolutionary-hero role and the whole world is watching, you don't really have to have tangible political victory to feel 'empowered.' You are already getting a kind of gratification that is the envy of millions of others who would love to get themselves so solidly into the act. The drama becomes an end unto itself."
- Walter Truett Anderson, Reality Isn't What It Used To Be

If the Vietnam War provided a generation with the opportunity to, as Anderson would write, cast themselves in the role of revolutionary hero against a corrupt, evil establishment, then it's possible to argue that digital rights management (or DRM) serves much the same purpose in this generation. The leaders of the movement are as well-known as the leaders of the anti-war movement were in the '60's -- it's hard to argue that Cory Doctorow isn't as well known among intellectuals today as Abbie Hoffman was then. And like the '60's, the '00's have spawned a whole host of DRM-agitators who cast themselves as revolutionary heroes fighting an evil establishment who wants nothing less than to control how you purchase and view media.

They're all horribly misguided, of course, reducing a complicated issue down to simple catch-phrases and good-guy/bad-guy Manichean thinking. (Heck, even the Wikipedia page on DRM is a giant advertisement for the anti-DRM argument and those who champion it.) Not that you can convince any of them of that. But maybe I can convince you.

First, a quick review of the concept. Digital rights management is a catch-all term that refers to any number of technologies with differing implementations but similar goals -- trying to protect the rights of content creators and distributors to profit from their creative work by making it difficult for people to enjoy protected work without paying for it. Put in those terms, which are generally the terms that DRM apologists use, the concept doesn't sound so bad. After all, the right of creators to profit from their creations is also an American ideal -- so much so that the U.S. Constitution specifically grants Congress the authority to pass laws to protect this right, at least for limited periods of time. (This also, ironically, makes the right of creators to profit from their work one of the few actual rights enumerated in the Constitution rather than in the Bill of Rights.)

DRM opponents, meanwhile, argue that once you purchase a work, you should have the right to enjoy that work in whatever format you choose. Buy a CD with a song, and you should be able to copy that song to your MP3 player to take it 'on the go' with you, and even burn it to a different CD so that you don't have to play your valuable original in your car stereo, where it might be damaged by the sun if it sits out on the passenger seat too long. Now, these arguments aren't in themselves bad arguments, even if they're not entirely 'portable' arguments, either. (Does your purchase of a book entitle you to make photocopies of every page so that you can leave the actual book in your personal library and only consult the loose-leaf copies when you want to kick back in the hammock on a summer afternoon?) The big problem I have with these arguments is that...well, they're misleading. DRM opponents like to portray the big media companies as hoping to milk consumers for every last dime by making them purchase and re-purchase the same content they've been buying for the past 5, 10, or 25 years, simply in different formats. In some cases, they'll accuse the media companies of forcing consumers to repurchase digital content they bought just ten minutes ago.

One counter-example to this argument can be found on the recently released Special Edition DVD of Pixar's WALL-E. When you purchase the Special Edition DVD, you also get a code. Insert the DVD into a computer loaded with iTunes, and the DVD allows you to enter the code and access the iTunes store to download a copy of WALL-E at no extra charge. The copy you download is the same as any other iTunes movie you might purchase -- it can be played on any device you've authorized your iTunes account on, so if you have a desktop computer, a laptop, and an AppleTV unit, all of which are authorized to play iTunes purchases, any or all of them can play WALL-E even if you don't have the DVD handy.

I suspect DRM opponents will at least tentatively applaud this feature, even if they decry the use of iTunes DRM that limits the ability of the downloader to make copies of the movie at will. They might even suggest that other media companies follow Pixar/Disney/Apple's lead. (It's no big surprise, and yet another ironic turn in this tale, that both Apple and Pixar are run by Steve Jobs, who himself is on record as saying he's not a fan of DRM even as Apple's iTunes store is lambasted as one of the largest purveyors of DRM-protected media in the U.S.)

Here's the irony -- this is the future of DRM. This is what every media company would love to do, if only they could be certain that the extra copies they're giving away are only going to people who've already paid for one.

Allow me to make an analogy, and by doing so, change the subject for a minute.

Prior to 1967, morbidly obese people had few options when it came to losing weight. They could hope to have the willpower required to diet themselves down to a healthier weight, or they could lose the weight by becoming gravely injured or diseased. The former is rare even today, and the latter is so dangerous that it's no wonder no legitimate medical industry ever appeared, appealing to the obese and offering to infect them with tapeworms. Then, in 1967, the first gastric bypass surgery was performed. (Today, gastric bypass is but one class of what are called bariatric surgeries, but for the purpose of this analogy we're just going to look at the gastric procedure.) The first surgeries were effective, but fairly brutal -- my mother had one such surgery, and though she did lose an astonishing amount of weight, she has also lived ever since with an evil-looking scar running down her midriff from near the base of her sternum to her navel.

The procedure evolved, and today most gastric bypasses are performed laparoscopically -- that is, using a number of smaller incisions to introduce instruments (including cameras so the surgeon can see) instead of one large open incision. My sister had a laparoscopic bypass in the 1990s, and also lost significant amounts of weight, and has significantly less visible scarring than Mom does. Someday, perhaps, when medical procedures are performed the way Dr. McCoy does them on Star Trek, there may not be a need for incisions at all.

The evolution of DRM is much like the evolution of gastric bypass. Recording technologies have always been looked at with skepticism by rights distributors, and some have even taken legal action to prevent the spread of new technologies, but there was really no other way for a rights owner or distributor to protect their investment in the pre-DRM days.

Currently, we're in the pre-laparoscopic days of DRM, where the technology simply isn't advanced enough to allow for truly delicate and refined uses. DRM in this era basically acts like a blunt instrument or a gigantic scar across the content of your digital media -- not because companies want to prevent you from enjoying your media (certainly no more than 1970's surgeons enjoyed giving people massive abdominal scars), but because that's just the state of the art of the process today.

Pixar/Disney/Apple's DRM on WALL-E shows what a laparoscopic DRM might look like -- with confidence that only authorized purchasers are using the special feature of the DVD, you can now put that DVD on multiple different devices perfectly legally. More to the point, you don't have to figure out how to do that -- the content distributor has done that for you, as a way of differentiating that media from other media in the marketplace. People clearly want to enjoy their media in multiple formats -- that's why the DRM opponents' arguments are at least partly valid -- so media companies want to provide that flexibility, just not at the expense of swallowing a digital tapeworm (which is the only current alternative if you decide you don't want the scar of DRM on your media).

Perhaps some day (though hopefully not as long as the 24th century), DRM will be nearly invisible, allowing people who've purchased the right to do so the ability to port media to unlimited different formats and platforms. For today, though, the existing DRM is what we have.

And as long as I'm being an apologist for Apple, the RIAA and MPAA, and any other group that wants to restrict 'digital freedoms', let me also say that I think the 'pirates' who specialize in breaking technological encryption and DRM schemes are in fact doing more good than harm. They're not entirely blameless, given that the focus of new DRM systems tends to be more on making the systems difficult for pirates to defeat than on making them accessible across more media platforms and formats. But still, if the pirates simply gave up tomorrow and surrendered to a specific flavor of DRM, there would be far less motivation for existing content distributors to invest in new DRM systems. Given that the pirates see themselves as heroes, though (much as Doctorow and the Electronic Frontier Foundation do), I expect that I'll tweak a few noses by saying that, rather than being heroes, they're really no more than a necessary evil in the larger story of the development of better and better types of DRM. (And if you want to go after me for the use of the term 'pirate' in describing these technologists, let me remind you that the aforementioned Jobs flew a pirate flag over the building at Apple Computer where the first Macintosh was being developed -- it's not just a pejorative term of art in this discussion.)

Though given Anderson's quote at the top of this essay, DRM opponents and pirates don't have to actually win the fight against DRM to be gratified by their role in the fight. So in effect, we all get what we want out of the process in the end.

Sunday, November 23, 2008

Setting Sun

In the mid-1990's, when the Internet was just beginning to expand beyond the domain of tech-literate nerds, I got into an argument with a friend about the similarities between Apple Computer and Sun Microsystems. His premise was that Apple and Sun were very similar companies, but that Sun had the more viable business plan going forward, and that Apple would be doomed if they didn't adapt their own business plan to be more like Sun's.

Fast-forward to November of 2008: Apple, Inc. (the company dropped the 'Computer' from their name during 2007) continues to see extremely strong sales of its flagship Macintosh computer, while Sun Microsystems seems locked in a death spiral, having lost over 95% of its stock value from its Internet-bubble peak, and being the subject of seemingly weekly speculative articles about which other tech giant will step in and snap up the struggling company (or not).

How did we get here? And why was my friend so wrong in his prediction? To understand the present, it helps to look back at the past -- in this case the distant past of the 1980s.

The Internet as we think of it today didn't exist back then, but computers were already fairly common, if not quite as ubiquitous as today. The two most commonly seen computer lines, both in homes and in schools, were built by two different companies: Commodore Business Machines (whose PET became popular in schools after its introduction in 1977; the Commodore 64, introduced in 1982, was more popular in homes), and Apple Computer (whose Apple II line, also introduced in 1977, seemed equally popular in homes and schools). It wouldn't be much of an exaggeration to suggest that the people who built the Internet largely grew up on Apple and Commodore machines (though other models, such as Radio Shack's TRS-80, had their proponents).

A trio of Stanford University computer students founded Sun Microsystems in 1982 after one of them, Andy Bechtolsheim, developed the SUN1 workstation from spare parts as a graduate project (SUN, at least with respect to that computer, stood for Stanford University Network). Apple, under the direction of founder Steve Jobs, developed the Macintosh during this period, releasing it for sale in 1984. (In the first of a number of similarities between the two companies, both companies' computers ran on Motorola 68xxx processors.)

By the late 1980s and early 1990s, both companies found additional innovations that contributed to further success. For Apple, the introduction of Macintosh System 7, which increased the company's dominance of the desktop publishing market, combined with the introduction in 1991 of the first PowerBook -- one of the earliest and most popular 'laptop' computers (as opposed to the luggage-sized 'portable' computers of the 1980s) -- allowed Apple to surge into what some observers would call their first 'golden age'.

Sun's innovation, meanwhile, would not come from within, but from without -- in 1989, English computer scientist Tim Berners-Lee had proposed a system of linked documents over a computer network and even developed a prototype of how that system would function, in effect, developing the first web server. (Ironically, Berners-Lee used hardware and software produced by NeXT, the company founded by Steve Jobs after he was ousted from Apple in 1985.) While Sun continued to produce workstation-class machines (the company's stock symbol during this period -- SUNW -- stood for Sun Workstations, after all), the company also aggressively pursued a place in the emerging market for server-class machines, and successfully became one of the best-known server hardware manufacturers of the time.

Sun still employed a number of very smart computer engineers, however, and in 1991, one engineer began a project which would ultimately be released to the public in 1996 as Java, a new programming language intended to be platform-agnostic (the philosophy behind the language, 'write once, run anywhere', was very popular with the nerd-crowd). By this time, the Internet was beginning to take the world by storm, and the two companies, superficially very similar**, were moving in very different directions -- Sun was selling servers as fast as they could manufacture them, while Apple was struggling with the baggage of their legacy desktop-centric operating system and trying (and failing) to update it with internal re-invention projects.

** - Both Apple and Sun derived the lion's share of their revenue from hardware sales, but by late 1996-early 1997, both companies were arguably far better known for their flagship software products -- MacOS and Java, respectively -- than for their hardware. In addition, both companies were seen as hardware specialists -- Apple in desktop publishing and to a lesser degree in education, Sun in server hardware -- though both also had designs on a broader base of hardware sales. Lastly, both were seen as competitors of Microsoft Corporation, despite the fact that Microsoft was not itself a hardware company -- MacOS was seen as a direct rival to Microsoft's Windows operating system, while Java was seen as a product that could eventually make operating systems like Windows (and MacOS) obsolete.

It was at this time that my friend and I had our argument, and while his side continued to look good for about another five years, in the end it fell apart due to nothing more complicated than an unwillingness to think beyond the current business cycle. In fact, by the end of 1996, the seeds of both companies' 2008 fortunes had already been planted -- it would simply take time for each set of seeds to bear fruit.

Apple's did so first (if you'll pardon the labored analogy). By the end of 1996, Steve Jobs had returned to Apple when then-CEO Gil Amelio decided to purchase NeXT to acquire the NeXT operating system with the intent of using that OS as the core for the new Macintosh operating system. By mid-1997, Amelio had been removed as CEO and Jobs was back, and just over a year later, Apple released the product that would signal the beginning of their comeback: the iMac. Jobs also cancelled projects that Amelio and former CEO Michael Spindler had initiated to make Apple look similar to other PC manufacturers, including a period where Apple agreed to license the Macintosh OS to third-party hardware vendors.

Sun's failure at first looked like success, as the continuing Internet Boom led to massive sales of Sun Microsystems' server hardware, driving Sun's stock price up to about $250 per share. Seeing the momentum building behind the Java programming language, Sun used their profitable server hardware business to bankroll the development and maturation of the language, and though the company chose not to give away ownership of Java, they committed to the concept of 'open source' software (while continuing to serve as the experts, making most of the significant advances and improvements to the language themselves).

By 2001, it was clear that both companies' fortunes had reversed -- Apple had released MacOS X, the 'modern' operating system the company had been trying to develop for nearly a decade in different projects and flavors (Copland, Rhapsody, etc.). Apple also opened their first retail stores, ending a long-time practice of serving as supplier to small, expensive 'boutique'-style proprietors. (Apple had already opened their online store back in 1997.) Steve Jobs buried the hatchet with Microsoft, long seen as an Apple competitor, by settling long-standing lawsuits and announcing a partnership in which Microsoft invested $150 million in Apple and agreed to continue producing Microsoft Office for MacOS. Last, but obviously not least, Apple introduced the iPod -- seen by longtime technology pundits as 'just another MP3 player', the iPod did what previous models didn't: establish a default brand and configuration in the minds of non-tech-savvy consumers, turning Apple from a commodity tech hardware company into a true consumer electronics company.

Sun, meanwhile, was just beginning to see hardship. The sudden deflation of the Internet Bubble hit Sun's hardware sales hard, as companies that had purchased large 'farms' of brand-new Sun servers declared bankruptcy and sold those same servers for pennies on the dollar to budget-conscious rivals. Unlike Apple, Sun embraced the role of Microsoft antagonist, not just via Java (which was the source of another large number of lawsuits brought against Microsoft), but through Sun's acquisition of the StarOffice office productivity suite to serve as a direct competitor to Microsoft's other flagship product, Microsoft Office. Lastly, Sun's market position as a server vendor was beginning to be challenged by competitors with a much more focused hardware strategy, specifically IBM, Hewlett-Packard, and Dell.

Since 2005, my friend's argument has been strangely inverted, with Sun taking steps that Apple was taking or had already taken in an attempt to remain profitable. Sun settled its long-standing legal disputes with Microsoft, supplemented its proprietary SPARC-based workstations and servers with products using Intel Corporation's CPUs, and even began aggressively seeking new markets to enter. Unfortunately, the company has had just a handful of profitable quarters since that time and, as noted above, has been hemorrhaging money and stock value so badly that many think Sun is ripe for a takeover bid. Yet the most significant thing of value left in Sun's portfolio, the Java programming language, was simply given away by company executives in late 2006, when they chose to license the language and related code under the terms of the GNU General Public License. Though Sun may have intended to prevent Java from being swallowed up and buried by a hostile takeover, the reality is that even Sun, the creators of Java, will now have difficulty turning a profit from it.

The two companies have one final, interesting similarity -- as of mid-2007, the two companies had very similar numbers of employees: Apple employed approximately 32,000 workers, while Sun reported having just over 33,000 employees. The similarity ends there, however -- Sun's 33,000 workers produced just $13.8 billion in revenue in 2007, and $408 million in profit over the same period, while Apple, with fewer employees, generated $32.48 billion in revenue and $4.83 billion in profits during the same period. In the first quarter of 2008, Sun posted a loss of $1.68 billion, while Apple posted a record $1.58 billion profit over the same period.
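For the numerically inclined, the per-employee gap is easy to make concrete. This is just a minimal sketch of the arithmetic, using only the 2007 figures quoted above (not independently verified here):

```python
# Revenue and profit per employee, computed from the 2007 figures
# quoted in the text above (not independently verified).
companies = {
    "Sun":   {"employees": 33_000, "revenue": 13.80e9, "profit": 0.408e9},
    "Apple": {"employees": 32_000, "revenue": 32.48e9, "profit": 4.83e9},
}

for name, c in companies.items():
    rev = c["revenue"] / c["employees"]
    prof = c["profit"] / c["employees"]
    print(f"{name}: ${rev:,.0f} revenue and ${prof:,.0f} profit per employee")
```

Run it and the contrast jumps out: Apple was generating roughly two and a half times the revenue, and more than ten times the profit, per employee.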

The two companies are once again rapidly moving in opposite directions, and this time, it is Sun that looks like the company on the road to oblivion.

Monday, November 10, 2008

Prejudice -- By the Numbers

So apparently you can now download a video podcast of the entire Rachel Maddow show, complete with no commercial interruptions. That's just five different kinds of awesome in a nice tasty blend, but that's not the subject of this note. It was while watching Maddow's first podcasted show that I stumbled across a statistic that floored me -- exit polls suggesting that approximately 70% of African-American voters in California voted in favor of the anti-gay-marriage Proposition 8.

As anyone who knows me can tell you, once I get ahold of one interesting number, I like to juggle it up with some other interesting numbers, and as it turns out, this is one really interesting number. Let's start by saying, to keep the examples easier, that nine in ten African-American voters in California voted for Barack Obama for President. (That's probably a bit of an underestimate, as exit polling suggests that about 94% of African-American voters in California voted for Obama, but as you'll see, underestimating this percentage won't change the conclusion.)

Using a very simple statistical formula, if you know the percentage of one thing that happened, and you know the percentage of another thing that happened, you can determine the percentages of those two things happening together, assuming those two things are independent. For instance, let's matrix up the Obama vote among California blacks:
   1   2   3   4   5   6   7   8   9   0
1  x   x   x   x   x   x   x   x   x   .
2  x   x   x   x   x   x   x   x   x   .
3  x   x   x   x   x   x   x   x   x   .
4  x   x   x   x   x   x   x   x   x   .
5  x   x   x   x   x   x   x   x   x   .
6  x   x   x   x   x   x   x   x   x   .
7  x   x   x   x   x   x   x   x   x   .
8  #   #   #   #   #   #   #   #   #   +
9  #   #   #   #   #   #   #   #   #   +
0  #   #   #   #   #   #   #   #   #   +
The horizontal axis here represents the percentage of votes for Obama, while the vertical axis represents the percentage of votes for Prop 8. Each marker thus represents one of the four possible outcomes:

x - voted for Obama and in favor of Prop 8 (63)
# - voted for Obama and against Prop 8 (27)
. - voted for McCain and in favor of Prop 8 (7)
+ - voted for McCain and against Prop 8 (3)

The total adds to 100, as you'd expect. What you wouldn't expect is that a significant majority of African-American voters who voted for Obama also appear to have voted in favor of Prop 8.

Of course, it's not that simple -- I say 'appear to have voted in favor' because it's clearly not the case that these two outcomes are independent, as they would need to be for the numbers to be completely valid. All you have to do is look at the exit poll data for yourself to realize that support for Prop 8 wasn't exactly the same across party lines. But here's the thing that makes the numbers interesting -- let's say we tweak the numbers so that every McCain voter voted in favor of the amendment. That allows us to change those three + votes into . votes, but it also messes up our original assumption, because now more than 70% of African-American voters are supporting Prop 8. To put the accounting right, we have to change three supporting Obama votes into non-supporting votes, which moves three x votes into the # column, and leaves us with this:

x - voted for Obama and in favor of Prop 8 (60)
# - voted for Obama and against Prop 8 (30)
. - voted for McCain and in favor of Prop 8 (10)

Now the original assumptions are back in place, but we've still got African-American Obama supporters voting 2-to-1 in favor of the amendment. Unfortunately, there's no more juggling that can be done with the numbers -- unless you want to argue that the exit polls are simply wrong, this result is fairly close to the reality of the voting on this issue.

Two other points. First, perhaps you can now see why underestimating African-American support for Obama wouldn't change the outcome: for every voter we've identified as a McCain voter who actually voted for Obama, we can't change how they voted on Prop 8, and we're assuming *every* McCain voter voted for Prop 8. Changing McCain votes to Obama votes now only increases the margin of support for Prop 8 among Obama voters. Second, our assumption that every McCain voter voted in favor of Prop 8 is actually less realistic than our assumption that support was identical across party affiliation -- while the same exit polls show that 64% of Democrats voted against the amendment, only 82% of Republicans said they voted for it. If that breakdown is similar to the breakdown among African-American Obama and McCain voters, then again, our quick-and-dirty method is identifying fewer Obama voters who voted for the amendment than actually exist.

One more chart, for the benefit of those preparing to argue that white voters also voted in favor of Prop 8, which is absolutely true. For this chart, we'll assume that 60% of white voters voted for Obama, and that 60% voted in favor of Prop 8. To keep the comparison fair, we'll use the same legend as in the charts above:
   1   2   3   4   5   6   7   8   9   0
1  x   x   x   x   x   x   .   .   .   .
2  x   x   x   x   x   x   .   .   .   .
3  x   x   x   x   x   x   .   .   .   .
4  x   x   x   x   x   x   .   .   .   .
5  x   x   x   x   x   x   .   .   .   .
6  x   x   x   x   x   x   .   .   .   .
7  #   #   #   #   #   #   +   +   +   +
8  #   #   #   #   #   #   +   +   +   +
9  #   #   #   #   #   #   +   +   +   +
0  #   #   #   #   #   #   +   +   +   +
x - voted for Obama and in favor of Prop 8 (36)
# - voted for Obama and against Prop 8 (24)
. - voted for McCain and in favor of Prop 8 (24)
+ - voted for McCain and against Prop 8 (16)

Again, we know that these two positions are not truly independent, and that more than 60% of Republican voters voted in favor of the amendment. If we assume that the percentage of McCain voters in favor of Prop 8 is closer to 80% (as noted above), then we can move half of the 'against' votes into the 'for' category, and to balance the scales, move the same number of Obama voters from 'for' to 'against', leaving us with this:

x - voted for Obama and in favor of Prop 8 (28)
# - voted for Obama and against Prop 8 (32)
. - voted for McCain and in favor of Prop 8 (32)
+ - voted for McCain and against Prop 8 (8)

We now have a slight majority of white Obama voters voting against Prop 8. If you take into account that the actual percentage of white voters supporting Obama in California was closer to 50% than 60% (51%, according to this same exit poll), it becomes clear that white voters seemed more likely to vote against Prop 8 than to vote for Obama, which might be construed as prejudice in the opposite direction. The difference, however, isn't nearly as dramatic as that for the African-American voters noted above.

Again, these are really just quick-and-dirty estimates based on one set of exit polls, and might not actually reflect the underlying reality. It would certainly be easy enough to simply dismiss the poll as flawed. The next question, for someone of a scientific mind, would be to ask whether there is additional data that can either confirm or refute the hypothesis suggested by this data: that California's African-American voters appeared significantly more prejudiced against gays and lesbians than their white voting counterparts in the 2008 election cycle.
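The independence calculation used in both charts above is simple enough to sketch in a few lines of Python. This is just the quick-and-dirty method from the text, not a real statistical model; the input percentages are the ones quoted above, and the function name is mine:

```python
# Joint outcome percentages under an independence assumption:
# P(A and B) = P(A) * P(B), and similarly for the other three cells.

def joint_percentages(pct_a, pct_b):
    """Given P(A) and P(B) as percentages, return the four joint
    outcomes (in percent), assuming A and B are independent."""
    a, b = pct_a / 100, pct_b / 100
    return {
        "both":    round(a * b * 100),        # for A and for B
        "a_only":  round(a * (1 - b) * 100),  # for A, against B
        "b_only":  round((1 - a) * b * 100),  # against A, for B
        "neither": round((1 - a) * (1 - b) * 100),
    }

# African-American voters: ~90% for Obama, ~70% for Prop 8
print(joint_percentages(90, 70))  # 63 / 27 / 7 / 3, as in the first chart

# White voters (per the assumption above): 60% for Obama, 60% for Prop 8
print(joint_percentages(60, 60))  # 36 / 24 / 24 / 16, as in the second chart
```

The subsequent 'tweaking' in the text (forcing all McCain voters into the 'for' column and rebalancing) is exactly the hand adjustment described above; the function only computes the starting point.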

Friday, November 07, 2008

Election Thoughts

So I've been slacking off about posting to the blog, but I wasn't alone -- Websnark's Eric Burns-White hadn't posted much either (as in anything) in three months leading up to the election, so I figured I was in good company.

Then Burns-White posted a retrospective based on his reaction to the historic Obama election, and it again reminded me of how similar we are.

On the morning of November 4, 1992, I walked out of the front door of my apartment and into a bright, sunshiny Yuma morning, feeling better than I had in years. After three terms of Republican executive rule, dating back to my earliest political memories, a Democrat was finally going to be occupying the White House again, and I knew, just knew things were going to work. That's not to say the Republicans didn't do everything in their power to try to throw the train off the track, but by the time Bill Clinton's second term was over, the United States had helped stop ethnic cleansing in the former Yugoslavia without suffering a single American combat death, had turned a seemingly intractable budget deficit into a projected budget surplus, and had successfully prosecuted America's enemies both foreign (those who carried out the 1993 attack on the World Trade Center) and domestic (the Oklahoma City bomber and his associate).

I was certain that Al Gore would beat George W Bush in 2000, and still believe it was Bush's political connections to the Supreme Court that ultimately won him the election. I also was confident that John Kerry would defeat Bush after his disastrous first term, where Bush basically ran on the platform, 'sure, we screwed up, but if you give us another chance we'll do better'. Well, we didn't do better, as the housing crisis, the credit crisis, the War in Iraq, etc., all seem to demonstrate.

Now, finally, a Democrat is back in the White House, and not just any Democrat but one of historic proportions -- Obama is the first African-American (or 'minority' of any stripe) to occupy the White House in American history. Yet on the morning of November 5, I did not feel the same sense of overwhelming relief and confidence I felt in 1992.

One reason, pretty clearly, is my own age -- sixteen years ago I still had a pretty sizable share of my idealism, and while it's not all gone, experience, particularly when it comes to the political game played at the highest levels in this country, has shown me that idealism doesn't really count for all that much. You can't wish yourself out of a recession or clap your way home from an ill-conceived war.

But there are two other concrete reasons why Obama's election doesn't fill me with the same sense of optimism and expectation that Clinton's did:

1. The state of the country and the world is different and worse as Obama prepares to take office than it was when Clinton took office.

Though Bill Clinton won the 1992 presidential campaign on the strength of his advisor James Carville's now-famous 'It's the economy, stupid', the primary economic indicators weren't nearly as sour as they are today -- in 1992, the country was widely accepted to be in a recession, but there wasn't any indication that the recession would end in disaster. Largely, the recession was seen as the natural aftereffect of Reagan's profligate economic policies of the 1980s coming home to roost.

In 2008, many economic indicators are as bad as they've been since the Great Depression, and though some economists still staunchly insist that the country isn't even in a recession, others wonder if this will prove to be the start of a new Great Depression -- certainly, the billions (and soon to be trillions) of dollars both Democrats and Republicans are agreeing to throw down the maw of the economic crisis suggests that folks in-the-know are desperate to do something, anything to at least slow down the tide of economic meltdown.

In 1992, the US was part of a UN-sanctioned peacekeeping force that entered Somalia in December (as much as conservatives like to blame Clinton for Somalia, the entire set-up was Bush Sr's baby), but no other U.S. troops were in an active combat zone or stationed in a location where they'd be likely to be directly in harm's way. Most of the troops committed as part of the UN mission to Somalia had come home by May of 1993, leaving only a few 'advisors' to assist in the operation.

In 2008, U.S. troops are directly in harm's way on two fronts: in Iraq and in Afghanistan. Some are reluctant to say we should withdraw from Iraq for fear of leaving it in the same state that Reagan left Lebanon after his withdrawal of U.S. troops following the infamous barracks bombing (though given Iraq's far greater resources, it's unlikely that Iraq's neighbors would simply sit back and watch it descend into 'failed state' status -- they'd run in to grab the oil for themselves). Others are reluctant to withdraw from Afghanistan since, after all, that's where the masterminds of the 9/11 terror attack still hide from justice. Both conflicts are expensive in terms of troops and resources, and both conflicts are engagements that the U.S. has, as a result of the Bush Doctrine, basically chosen to prosecute alone -- withdrawing U.S. troops from either front, to any appreciable degree, would be tantamount to removing all pretense of interest in prosecuting action on that front.

In 1992, the conservative movement was united behind little more than vague memories of Reagan's 'Morning in America' campaign ads from 1984 and a somewhat outdated jingoism that most moderates rejected. By 1994, conservatism had effectively re-invented itself as the small-government ideology first popularized by William F. Buckley, Jr., and made use of both overtly political (Newt Gingrich) and quasi-political (Rush Limbaugh) figures to organize this renaissance. By the late '90s, the resurgent conservative movement had even made common cause with the evangelical wing of the Republican party, and that alliance allowed for the undoing of the Clinton years.

In 2008, the figures that transformed conservatism are still around, but they're just shadows of their former selves: Gingrich doesn't even hold public office anymore, Limbaugh's show has been in a ratings decline since early 2005, and the once-powerful Fox News Channel has become a collection of whiners and irrelevant talking-heads whose great contribution to political debate is the screaming of Bill O'Reilly. As the philosophical conservatives have declined in influence, the evangelicals have risen, and now Obama faces an opposition that doesn't just believe his policies are wrongheaded and nonsensical; they actually think Obama's views are anti-religious and blasphemous. Sarah Palin is currently the figurehead for this nihilistic wing of the political right, but just as Gingrich gave way to DeLay, Palin will be replaced by someone at least superficially more competent, and it'll be this person who becomes the greatest enemy liberal democracy has in the 21st century.

I'm only comforted somewhat by a variant of the old saw about women and success -- you've got to be twice as good as a man to get half the credit, but luckily that's not too difficult -- and realize that Obama wouldn't be in this position if he wasn't highly competent. He'll also have smart allies to draw on if he knows where to look -- Minnesota's own Keith Ellison would be an outstanding resource, for one. Still, Obama's road is going to be significantly harder than Clinton's was, so it's harder to get excited about how he'll walk it.

2. For all the talk of how historic Obama's election is, and how it signifies a 'post-racial' America, prejudice is still alive and well, and it could destroy everything progressives are trying to build.

There was a lot of gnashing of teeth over the so-called 'Bradley effect', where voters would tell pollsters they were going to support Obama and then be unable to actually vote for him once they were alone in the voting booth with their prejudices. Though that specific effect didn't seem to materialize, it would be wrong to claim that prejudice thus played no role in the 2008 elections.

For starters, take a gander at the three extremely prejudicial anti-gay propositions that were passed by state voters, including in California, where Obama won by twenty points with 59% of the popular vote. In effect, a sizable number of California voters decided that they could live with a black president, but not with gay men and lesbian women married.

The point of racism isn't race, per se, but prejudice -- prejudice based on skin color or national origin. While it certainly seems true that there's less prejudice against blacks today than there was decades ago, that doesn't mean there's less prejudice overall: those who feel prejudice simply seem to have changed the targets of their prejudice is all. (And before we completely write off racial prejudice, let's at least admit that, had Obama run his real campaign the way Chris Rock ran his fictional campaign in the 2003 movie 'Head of State', he absolutely would not have won.)

And of course, the great power of prejudice is that it can drive small numbers, or even just one person, to do something that changes the world, for good or ill. Nineteen men crashed four airplanes and nearly destroyed the American way of life in the process, just as an example. And presidents providing as much hope as Obama have been killed by those whose prejudices wouldn't let them rest. I'm convinced that there will be at least one attempt on Obama's life during his first term as president; I simply pray that the attempts fail, or the first African-American president to be assassinated might presage an even more precipitous fall from American ideals than we've seen over the past eight years.

So let me be clear -- I'm not at all disappointed or unhappy that Obama was elected. Heck, I voted for him. But based on my older eyes and the knowledge that the world isn't as ready for Obama, in more ways than one, as it was for Clinton, I can't find myself feeling the same sense of giddy optimism that I did back in 1992.

And maybe that'll be for the best. After all, idealism doesn't pay the bills, and there are a lot of bills for America to pay right now.

Saturday, September 06, 2008

My D&D Character

Yes, yes, I know -- I've heard the jokes and am fully aware that few things are more boring than hearing someone talk about his D&D character. Nevertheless, I want to tell you about my D&D character, because I think it says something about what many people look for in a role-playing experience.

My character is an elf wizard named Rennal Maiavar. His backstory is somewhat significant -- as an apprentice wizard, he was psychically contacted by an evil race of beings thought imprisoned by previous generations, and though he did not wish to do their bidding, it is also true that he did not possess great reserves of willpower. The evil beings, once in command of young Rennal, chose to use him more for entertainment than for the furtherance of any evil plot, and forced him to kill his father. "Maiavar", the elven word for 'father-killer', was a name forced upon him by the elders of the city, who then exiled him from their presence. Rennal's claims that he was controlled by the evil race were dismissed, as everyone knew the race was locked safely away.

Of course, within a human generation, the evil race had found a way to escape and placed Rennal's homeland under siege -- a siege that was lifted only thanks to the intervention of some of the most powerful heroes in the land. Rennal's warning was either forgotten or ignored.

In D&D, most elves are Chaotic Good -- they seek both good and freedom in near equal measure. Rennal's experiences, however, turned him Chaotic Neutral: hostile to authority, resentful of restriction, and suspicious of the claims of those who assert that they know what 'good' is. He's also unpredictable, a trait that fits with his chosen alignment -- on more than one occasion, he's demonstrated that he's not entirely careful about making certain that all friendly combatants are outside the area of effect of one of his fireball spells, for example.

Rennal is untrusted and seen by some in the party as untrustworthy -- and he's a total blast to play. Not because he disrupts the party -- our group has been playing together for quite a long time and is a lot more tolerant of this kind of character than most newer groups might be -- but for two reasons:

1. He is many things that I would like to be, and imagine that I would be if not for the compromises I have to make to 'reality'. I work in customer service, so I don't always have the ability to tell a difficult customer that they're full of crap, or tell a co-worker that they're far more clueless than they think they are. Rennal can say these things, and mean them, and saying those things allows me to vent in a way that I find very healthy, spiritually.

2. Rennal represents a philosophy -- and given his intelligence, one that's frequently proven right.

A bit more explanation on point #2.

The party was exploring the Fortress of the Yuan-Ti (evil snake-men, for those unfamiliar with the D&D milieu), when the party encountered a bronze dragon who attacked us. Normally this kind of thing would be seen as odd, since bronze dragons are generally good-aligned and thus aren't usually inclined to attack largely good-aligned adventuring parties, but the paladin and the lawful-good monk/fighter in the party wasted no time in heading after the dragon, intent on taking down its hit points as quickly as possible.

Though he had no real reason to do so, Rennal noted that the dragon's behavior seemed odd, and so decided to try something -- specifically, casting dispel magic on the dragon. One of the magical effects Rennal successfully dispelled was the enchantment that allowed the 'bad guys' to control the dragon's actions. The dragon immediately turned on his captors and made what would have been a tough fight into a rout in our party's favor.

After the battle, it was discovered that the dragon, normally Chaotic Good, had been driven to Chaotic Neutral by a life lived as a captive in the fortress, being tortured and tormented by his inhuman captors. At first glance, you might think this would make the dragon (whom the party named 'Bob') and Rennal fast companions, especially considering that it was Rennal who specifically freed the dragon from the compulsion controlling his behavior.

Rennal's response, however, was simply, "That's just the way life is." He developed no special bond with the dragon, nor the dragon with him. Though the 'lawful good' members of the party attempted to bond with the dragon, every attempt they made to 'help' seemed only to push the dragon farther into introspection and distrust -- they tried to heal his wounds (the same wounds they'd inflicted on him when they thought he was out to kill them), to convince him to stay behind and serve as guardian to the eventually cleaned-out fortress (when his only memories of the place were of pain and isolation), and to eventually arrange to meet with others of his own kind (when those others either didn't know or didn't care about his fate).

Rennal also attempted to free a captive yuan-ti in exchange for information, reasoning that, although the enemy of my enemy isn't necessarily my friend, she'll still likely find a way to put a monkey-wrench in our mutual enemy's plans. The party resisted this move, preferring instead to re-equip a captive paladin (whom the party didn't realize had fallen due to her treatment by the yuan-ti), who then used her borrowed equipment to murder all of the nearby villagers that the party thought they'd saved from the yuan-ti menace.

This is not to say that Rennal never makes mistakes -- he showed a fellow wizard the magical creation halls in the fortress that allowed one even without the proper training to be able to craft magical items, who then memorized the location and seemed interested in using that location for his own purposes -- but his errors are errors of personal judgement (as is his character), not mistakes in assuming that the world abides by the rules of his personal moral code.

In short, he's a wonderfully flawed character who nevertheless tends to be much more like the kind of person I'd like to be, and taking an evening every so often to inhabit his skin is among the most therapeutic and entertaining experiences I can think of.

This, I think, is one reason why people play RPGs, or watch movies, or read books. There's an 'escapist' tendency toward these things, sure, but they're not just trying to experience life from the standpoint of idealized versions of themselves -- instead, they're hoping to see the world through the eyes of characters they can view as more interesting than themselves, both in terms of what they can do, as well as in terms of how they go about doing what they do.

Of course, since people find very different things 'interesting', this still leaves a great deal of room for variety in the fantasy/SF/movie/book/etc. realm. I think the urge, though, is much more common than most producers of such material anticipate.

Monday, August 18, 2008

GenCon - End of Day Three

The D&D Minis Community Draft is a much-anticipated event, one where players from all over North America (and sometimes Europe) gather to engage in the hobby they all enjoy. In addition, the tradition of providing one's own prizes for the Community Draft has led to some very cool prizes being offered for those who do well (including an amazing 3-D representation of a popular tournament map in this year's prize pool). Lastly, since it's not a DCI-sanctioned event, it's one of the few minis events that employees of Wizards of the Coast can participate in, and they choose to do so quite frequently -- this year I personally saw both Rob Heinsoo (the lead designer of D&D fourth edition as well as the head muckety-muck of D&D minis design and development) and Peter Lee (the newest member of D&D/DDM R&D) participating, and have seen other WotC employees (and not all design employees, sometimes marketing folks too) participating in past events.

And, for the second Community Draft in a row, I came in dead last. I wasn't quite as disappointed this time as I was at XP, largely because I had more fun -- the guys around me were having a good time, and we were all yukking it up since we weren't playing under any pressure to grab one of the 'elite' prizes. It would have been even better if I hadn't been paired with a local (Matt McMillen) in the final round; not that I regretted playing him, but simply because it's sometimes frustrating to realize you travelled 1200 miles to sit across the table from someone you see at every major local tournament, too.

After the Community Draft, we went back to the hotel and crashed. The next morning, we got up, packed, checked out, and drove home, and by 10pm I was safe in my own apartment again, once again standing on the precipice of real life. Except that now I have a Flash T-shirt I can wear underneath my shirt and tie...

Saturday, August 16, 2008

GenCon - Day Three

It's amazing how time can get away from you around here. Day Two continued with the World of Warcraft miniatures premiere -- one of many being run this weekend. What I was unaware of when I accepted the ticket from my friend was that, in order to get the ticket in the first place and even have a chance to go to a premiere tournament, you had to play in a demo, which I hadn't. Thus, while everybody else got their minis and seemed to have some idea what was going on, I was utterly clueless, and it showed in my play and in my warband selection as I ended up 0-3 before dropping. I ended up giving the minis back to Bill, as he'd provided the entry for the tournament.

Also, one of the problems in letting time get away from you is that sometimes you discover you can't reconstruct what you did. For instance, I know the WoW minis event was over before 6pm, at which time I went to grab dinner with Chip, one of my travelling companions. Later, a bit before midnight, a number of us went to do True Dungeon. (Capsule review: it was still very cool, even if our party this year felt less successful than the party the previous year -- though we might have been in real trouble had any of the monsters managed to hit us, and I mean hit us at all; the highest roll I remember on a monster die was a '5', and the only damage I recall was from a no-roll-required magic missile.) After stumbling back to the hotel by 2:30, I struggled back to consciousness at about 7:30 to fly off to the Day Three events.

First up: 4th edition Dungeons & Dragons. And yes, I've discovered that 4th edition is really Dungeons & Dragons. We're playing the initial adventure in the new Living Forgotten Realms RPGA campaign, and we're entering the underground ruins of Zhentil Keep (the 'new' Zhentil Keep was apparently built over the ruins of the old). We defeat a few shadow-creatures and rummage through a pile of treasure where the party discovers a magical staff of fire with an interesting fire-related power.
Well, it just so happens that my wizard character has taken staff implement mastery and has some fire spells, so I grab the staff, and I then spend the next 30 minutes bouncing in my chair over all the cool things I can do with this staff. That's Dungeons & Dragons, no matter how else you want to slice it. Another trip through the dealer room (only costing me $50 this time, since I skipped the really pricey areas), and then off to dinner. Later tonight is the DDM Community Draft, a traditional event organized by players, for players. More on that later.

Friday, August 15, 2008

GenCon - Day Two

Day Two begins with breakfast, as all good days should. Unfortunately, unless you bring breakfast with you, it's not always easy to find breakfast at a convention. I had been stopping at Crawford's Bakery, just down the block from the Methodist Tower Hotel where we're staying, and picking up doughnuts and juice for breakfast, which makes a nice start. With the early end last night, though, we're trying to get an early start today, so we're off for the convention hall before Crawford's opens. Karma continues to pull in my favor, however, and I stumble across a Starbucks selling ham-egg-and-cheese croissant sandwiches that are surprisingly good for sitting under a heat lamp. I get one of the last three on the table and head back to eat.

Morning is Delve, Delve, Delve -- the D&D Delve is a quick, combat-oriented version of the Dungeons & Dragons role-playing game where you try to complete three encounters in 45 minutes or less. It's not as easy as it sounds, because the new 4th edition rules make each enemy monster tougher to compensate for the fact that player characters are somewhat more resilient and have more options in their attacks than earlier editions of the game. Also, playing the Delve earns you prize tokens that you can then trade in for prizes. The big prize, for fifty tokens, is a T-shirt based on the 'gnome and tiefling' 4th edition preview video, but I'll happily settle for a 15-point RPGA campaign card that gives my character a bonus ability, since I'll be playing in an RPGA event tomorrow morning.

After Delving, I settled back into the minis area and discovered that the woman I'd taught the game to yesterday was back and playing League, so whatever assholery I'd perpetrated on her the day before hadn't taken. She even let me take a picture of her corset! Then I discovered that I had a raffle ticket for the World of Warcraft Miniatures Premiere Sealed event, so I hustled off to a completely different part of the convention hall to register.
The event officially starts in about two minutes, but I've learned from my experience in this neck of the woods that such things never actually launch when they say they will. Nonetheless, I'm going to wrap up this update quickly to get back in the game.

GenCon - Day One (continued)

The evening brought a number of interesting things:

- A couple of friends who came to GenCon were participating in the first of the 'last chance' championship qualifiers for D&D Minis, where the top four participants get invites to play in the championship tournament on Saturday. One finished eighth, the other fifth.

- A friendly, attractive woman responded to my flirtatious comment about her hair color (red, which I LOVE, btw) by sitting down in front of me and asking, 'Can you teach me how to play this game?' Holy crap! What untapped source of karma did I suddenly open up like a fire hose? Unfortunately, though I tried to give her an entertaining intro to DDM, I think I ended up kicking her ass and driving her from the minis hall; I haven't seen her again.

- We ended up leaving the con early, as our long road trip on Wednesday caught up with us. We headed to a nearby sandwich and pizza shop for dinner. The city of Indianapolis has not shown its best side to us on this trip. Granted, part of the reason is that we're staying in one of the 'bad' sections of town, but we haven't yet experienced a 'good' section of town, even the area around the Indianapolis Motor Speedway. When we got out of our cars to go into the restaurant, two very aggressive drunks/panhandlers latched onto us and didn't leave, even after we went into the restaurant, until Vic bought them sandwiches. (The store proprietors ended up giving us a free sandwich, perhaps as a 'thank-you'; I thought that was cool, and the sandwiches were really good.)

Day One ended with my roommate and me camped out in the room watching Olympic women's beach volleyball and drifting off to sleep. Day Two is about to begin, and the end of the day promises True Dungeon adventure, so a wildly gushing post tomorrow morning is already penciled onto the agenda.

Thursday, August 14, 2008

GenCon - Day One

There are two theories when it comes to going to a big gaming convention like GenCon:

1. Go as hard as you can, doing as much as you can, until you drop from sheer exhaustion.

2. Pace yourself, getting in the stuff you want to do, but leaving time for spontaneous discovery and occasional fallow periods during the con.

Theory one tends to be practiced by younger con-goers who have the stamina for such a marathon sprint; it also helps to have a good idea of what you're already planning to do, because you can schedule yourself to within an inch of your life. Theory two is the one I currently subscribe to, not just because of advancing age, but also because this year I didn't have a firm idea of exactly what I wanted to do. I knew I wanted to do different things, and though there were a few events I knew I wanted to repeat from last year (True Dungeon, the D&D Minis Community Draft), for the most part I wanted to do things I hadn't done before.

So far, I've hobnobbed with a bunch of old friends and acquaintances, showed a friendly and attractive woman the basics of D&D Minis, and spent nearly $100 on the show floor buying dice and novelty T-shirts; in other words, I'm doing what almost everybody else is doing, and loving it. We'll see what the evening brings.

Gen Con - Day Zero

I'm at GenCon for the next four days (or what the GenCon marketing folks call 'the best four days in gaming'), and am looking forward to a few different events (that I'll hopefully be able to chronicle here). As it stands, though, the first day of any convention is the travel day, or Day Zero. Each travel day has its own stories and frustrations, and this one was no different. It started well before the actual travel day, when our plans from last year fell apart because Senior decided he couldn't afford to go to GenCon this year. Victor picked up the ball and ran with it, and four of us decided we'd travel cross-country in a rented Lincoln Town Car from Minneapolis to Indianapolis. Then a good friend, who we'll call Bill, asked if he could ride out with us, and we agreed. Victor's wife frequently gets motion sickness on long car trips, so we'd agreed to let her sit in the front seat. Unfortunately, this meant we had three reasonably good-sized men (or more accurately, two good-sized men and one huge one) squeezed into the back seat together. After a few hours, it was clear that we weren't ever going to be comfortable, and Vic's wife, bless her heart, agreed to move into the back seat to help us all out. In order to try to keep her from getting nauseated, we occasionally broke out a deck of 'Would You Rather' cards and asked each other questions. (The game developers will be glad to know that the car was frequently filled with raucous laughter during these sessions.) We arrived in Indy without further incident after about a 12-hour drive, checked into our traditional hotel, and crashed. Today is day one -- more updates as events warrant.

Friday, July 25, 2008

Cue The Maniacal Laughter!

Noted a couple of pieces of news this week that really got my sense of humor going.

First, Microsoft CEO Steve Ballmer announced in an internal memo to MS employees that the company was launching a plan to improve customers' end-to-end experience with MS Windows. Apparently, this plan has two prongs:

1. Communicate with hardware vendors to provide a more consistent hardware basis for Windows installations, in theory reducing the headaches users have experienced when installing or upgrading to Windows Vista on their existing PC hardware.

2. Launch a series of advertisements directly challenging the perception of Vista as kludgy and troublesome.

To which my response is this:


For starters, Ballmer clearly likes his spin; in his memo, he talks about how "we outsell Apple 30-to-1", but that's only true if you consider the world-wide market, and if you assume every machine not sold with MacOS comes with Windows pre-installed. According to IT News, who gets their info from IDC, Apple has a bit over 3% of the global market for personal computers. Of course, Apple has about 8% of the US market and growing according to the same source, so...yeah.

The more significant point, however, is that if Ballmer is hoping to duplicate Apple's tight hardware control -- after all, the same company making MacOS is making the computers that run it -- he's got a long way to go, and a lot of corporate partners to offend. It'll likely be easy for the big-name manufacturers to follow any directives from MS, given that their current offerings seem to work pretty well with Vista as it is. However, the biggest PC makers -- HP, Dell, Acer, Lenovo (maker of the ThinkPad now that IBM has gotten out of the business), and Toshiba -- account for just over half of the world market for PCs and servers. The rest of the market is crawling with bargain-basement manufacturers that throw together whatever hardware they can get their hands on cheaply and throw Windows on top. None of these guys individually is all that large, but if MS's directive to standardize hardware sends even half of them out of business (because they can't make their margins using standardized hardware), that's a potential loss of as much as 20% of the global market. MS is betting that HP, Dell, and the like can fill in to pick up the slack, but my guess is that Apple is just hoping for an opportunity like that to push its own desktops, laptops, and servers into a larger share of the global market.
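For the curious, that "as much as 20%" is simple back-of-envelope math. The numbers below are my own illustrative assumptions (the only source figure is that the top five hold "just over half" of the market), not anything from IDC:

```python
def share_at_risk(top5_share: float, failure_rate: float) -> float:
    """Fraction of the global PC market held by small makers who might fold.

    top5_share   -- combined share of the big-name manufacturers (assumed)
    failure_rate -- fraction of the remaining small makers that can't
                    survive a standardized-hardware mandate (assumed)
    """
    small_makers_share = 1.0 - top5_share
    return small_makers_share * failure_rate

# Assume the top five hold ~55% of the market, and half the rest fail:
print(f"{share_at_risk(0.55, 0.5):.1%} of the global market potentially at risk")
```

Tweak either assumption and the answer moves, but any plausible pair of inputs lands in the same "around a fifth of the market" neighborhood.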

Which, of course, assumes that such a directive is actually enforceable. After all, similar strong-arm tactics performed by MS, including requirements on how to present the OS, what programs could and could not be pre-installed, etc., have been investigated all over the world and found to be anti-competitive. And though MS's legal problems at home have declined somewhat during the Bush years, all indications are that a far less friendly administration may end up taking over the White House starting in 2009, so any renewal of MS's old-fashioned monopolistic practices may do way more harm than good, in both the short and long runs.

Let's also not forget that MS is looking to bring those smaller, cheaper manufacturers into line at precisely the same moment that some PC market observers are predicting that a slowing global economy will make those manufacturers more attractive to consumers, as price becomes a bigger selling point than brand.

On the whole, point one looks to be impossible to achieve, and likely to cause a whole lot of damage in the process of making it work. (Not entirely unlike installing some Windows applications, but I digress...)

Point two, on the other hand, is ludicrous. The best explanation as to why comes from Mac site "", which points out that phase one of the ad campaign is basically complete misdirection. Windows Vista has a poor reputation among PC folks for a number of reasons: installation on legacy hardware is difficult, driver support for non-standard hardware is spotty, memory requirements are obnoxious, security enhancements are annoying and largely ineffective. To combat this 'negative perception' of Vista in the market, MS is presenting a series of ads featuring people who dislike Vista who are shown a new OS by MS called 'Mojave' -- once the observers express their interest in the new OS, they're told, hey, it's just Vista! It's better than you think, isn't it?

Such an ad campaign may make the Windows fanboys happy, but I doubt it'll actually make much of an impact on MS's bottom line. Why? It's the central message. Compare:

- Apple's 'Think Different' ads of the late '90s had a central message of 'we design our computers for the smartest, most creative people in history. Is that you?'

- Apple's 'I'm a Mac/I'm a PC' ads have a central message of 'Macs are fun, hip, and simple to use; PCs are cantankerous and spend a lot of time complaining'


- MS's 'Vista is the new Folgers' ad has a central message of 'Hey! Vista doesn't suck as much as you think it does!'

Granted, Bare Bones Software gets a lot of mileage out of 'it doesn't suck' as a marketing blurb for their flagship product, BBEdit, but they have two advantages over MS:

1. They're marketing to a small portion of users who need a product that's reliable, functional, and stays out of their way, not to a mass-market audience looking for a replacement for Microsoft Word, and

2. their product doesn't actually suck.

So that news was worth a few laughs. Then I see word of a speech given by Mark Shuttleworth, founder of Canonical, the company that distributes Ubuntu Linux, in which Shuttleworth calls for open-source Linux developers to focus on the Linux UI, asking "Can we not only emulate, but blow right by Apple?"


Sorry, lost consciousness for a second there.

The simple answer to Shuttleworth's question is 'No'. The more complete answer requires you to understand two things:

1. Linux developers, for the most part, are not interested in the UI.

This is not to say that Linux developers are all knuckle-dragging atavists who long for the return of the days of Big Iron; rather, it's a recognition that if those developers were interested in doing UI work rather than 'productive' work on things like drivers and applications, they'd already be developing for other platforms. They might not be Mac developers, but the Mac is far from the only platform known for great UI support -- there are still active Amiga developer groups out there, for instance.

The best Linux UIs available are basically cleaner versions of the Windows 2000 user interface, which, as far as I can tell, is about as far as most Linux developers care to take the UI. Guys who are passionate about UI programming? They get hired by Apple. Which leads into the next point:

2. Apple is actually still out there and likely isn't planning to stop running any time soon.

Let me illustrate this point by way of analogy: Say you're planning to run a race against a friend -- a marathon, perhaps. On the day of the race, you decide to walk the first 10 miles at a leisurely pace while your friend runs at a competent marathoner's pace. By the time you finish your 10 miles, your friend is far ahead of you. You're not going to catch him by the end of the race. Even if the race were to be extended forever, you're not going to catch him unless you're actually a faster, more conditioned runner than he is and can maintain a faster pace until you pass him.

Did I mention that the guys who are really passionate about UI programming tend to get hired by Apple?

It's always nice when the tech news provides you with unexpected humor.

Sunday, July 20, 2008

On A Woman's Mind

I know a woman with a plan. While her parents are out of town on a family trip, she's going to invite her boyfriend over, spend a good long time getting comfortable, and finally have sex. I personally don't have a problem with this plan.

First, she's nineteen, which means she's old enough to make her own decision about when and with whom she's going to have sex.

Second, she's smart, which means her plan is going to include protection and making sure that things are only going to go as fast as she wants them to go. She's already taken enough time to be certain that her boyfriend is the guy she thinks he is.

Lastly, when it came time for me to lose my virginity years ago, I ended up doing it much the same way she's planning to. In fact, thinking about the plan has sent me down memory lane for much of the day today.

Begin at the beginning:

Christmas, 1991. A friend from high school (let's call him James) comes back home from southwest Arizona, where he's gone to live with his biological father and finish a college degree. He brings with him a videotape of the school's fall theatrical production, 'The Foreigner'. In high school, James had always worked behind the scenes, and had intended to do so in this show as well, but when the actor playing the lead broke his leg two weeks before opening night, James was pressed into service as the understudy.

The thing that astonished me was how good James was in the role. Granted, the lead in 'The Foreigner', like the lead in Christopher Durang's 'The Actor's Nightmare', can be played by actors who aren't bravura performers or even all that experienced; as long as the actor can use his own bewilderment and confusion for the role, it'll work and work well. Still, James was notably good in the role, and was surrounded by other good performers. Something was happening in that little town in the desert, and I decided I wanted to be a part of it. I asked James if he needed a roommate, wrapped up what few plans I had brewing in Minneapolis, and boarded a Greyhound bus for Yuma.

On February 7, I got off the bus, and met her. Let's call her Stephanie.

James was involved in the rehearsals for the college's spring production, back behind the scenes of Neil Simon's 'Brighton Beach Memoirs'. Because he couldn't leave rehearsal to pick me up, he sent Stephanie to do it, despite her having laryngitis -- her first words to me were a croaked-out 'Are you Dave?' while I was on the phone telling my mother I'd arrived in the desert in safety.

Though I expressly refused to give Stephanie a Valentine's gift a week later (having only known her a week, despite James's urging), we did end up spending a fair amount of time together; not only were we all involved in the college theater program -- Stephanie had taken me straight to the college theater to see James, and I ended up spending the evening 'on book', providing the actors with their cues if they forgot during rehearsal -- but James was dating Stephanie's best friend.

Stephanie was tremendously good to me, and for me. She helped me find a job, working for her father at one of the two Pizza Hut franchise stores he managed in town. She eventually helped me find places to live after I alienated my first Yuma landlady. And, one day while her family was out of town visiting the Grand Canyon (as I remember it), she invited me over, we went swimming, and she eventually led me into her room where we...

Or rather, she did. I was too nervous, even with the extremely obvious set-up, to perform well and I ended up not officially 'consummating' the evening, though I did my best to make her feel good. Apparently it worked, because she made the same plans a second time, this time when another friend and her husband left town to spend a few days with in-laws.

It's kind of odd, looking back and thinking about Stephanie now -- I might have even married her, had it not been for James. At the end of the 1993 school year, James was announcing to anyone who'd listen that he'd been accepted to a workshop for technicians seeking to work in television, and was very excited for the opportunity. Then, near the end of summer, he sent a postcard to his now ex-girlfriend from St. Louis, Missouri, explaining that he hadn't really gotten into a workshop; instead, he'd decided to follow another woman to St. Louis to be with her.

Stephanie, who'd always carried a torch for James, became convinced I'd do the same to her. So, when I returned to Yuma after Christmas break in January of 1994, Stephanie had begun dating another guy; a guy who, in many ways, was just like me. They eventually got married, but I've lost touch with them both and am not sure how they've done over the years -- the last time I saw Stephanie was, ironically, in a theater, when I went to Tucson to audition for the BFA program at the University of Arizona in the spring of 1996.

I'll always have bittersweet memories of her, and I'll never be able to listen to Mannheim Steamroller's 'Fresh Aire IV' without thinking of the day we spent in her friend's apartment, finally consummating our more-than-friendship, as she'd planned.

And every so often, I have a day where I deeply miss being the most important thing on a woman's mind.

Saturday, July 12, 2008

An Apple Observation

I've been an Apple customer, in one way or another, for nearly half my life now; I purchased my first Macintosh in 1989, and most of the computer gear I have in my apartment is Apple-branded. Because of my experiences, I have to say I don't really 'get' Apple-hate; it seems to me that people who dismiss Apple products as overpriced and overhyped and who deride the Apple culture as elitist and marketing-driven are missing the point.

Here's what I mean.

Folks that know me won't be surprised that I jumped into line late yesterday to pick up a new 3G iPhone. While in line, and awaiting my turn with the one-on-one who'd sell me my new phone, I noticed something:

- A cute girl wearing an Apple T-shirt and smiling at you is attractive.

- A cute girl wearing an Apple T-shirt and offering to let you play a game on her new iPhone is irresistible.

That's why Apple kicks ass.

Saturday, July 05, 2008


Saw the Disney/Pixar animated film WALL-E over the weekend and was blown away -- the first twenty minutes (and a good portion of the last twenty) will have you wondering if you should have brought your small children with you, that's how grown-up the movie is about its theme.

It's great, though -- it's already vaulted to the top of the list of my favorite Pixar films (and the short that precedes the movie is already among my favorite Pixar shorts as well -- watch for the Jay Ward credit!)

So go! See it! (I'll certainly see it again, at least once more.)

Monday, June 16, 2008

Well, dang, that explains everything!

The Dating Persona Test says you are...

The Last Man on Earth

Random Brutal Sex Dreamer (RBSD)

FACT: The apocalypse has come. All are dead. You never should've asked her out.

Shit, rejected again. You are The Last Man on Earth.

Sorry, but most women would rather see the human species wither to an end—and therefore deny the most fundamental instinct that living creatures have—than sleep with you.

We've learned the following: you don't think things through. You're haphazard. You're dangerous. You're somewhat inexperienced. It's totally obvious that you're a horny bugger, as well. Everybody knows that and steers clear.

To top things off, when you do find your way into a relationship, you tend to be a dick somewhere down the line and fuck it all up.

There's a small, but negligible, chance we're wrong. In any case, your friends find your shit hilarious. There's nothing cooler than a dude reducing himself to human rubble.

Your exact male opposite:

The Gentleman

Deliberate Gentle Love Master

Always avoid: The Sonnet (DGLD)

Consider: Half-Cocked (RBSD), The Nymph (DBSD)

Link: The Online Dating Persona Test | OkCupid - dating services | Dating
My profile name: Pauper063

City of Heroes and the Cookie Monster Economy

One of the things I've been doing while not posting to this blog is playing City of Heroes, a superhero MMOG. As part of that process, I'll occasionally visit the official message boards hosted by the game company.

While there, one of my guilty pleasures is trolling the Market board, looking over the kinds of posts people put there. The regulars in that forum...well, they seem to get along with one another, but they're just odd folks to me. The biggest reason I find them odd is that they seem to think that the 'economy' in City of Heroes is an analog of a real-world economy, and thus that their understanding of that economy makes them smart, savvy people.

I can't say I agree. To explain why, a bit of a primer on the City of Heroes 'economy'.

City of Heroes models the world of superheroes and superhero comic books. The developers of the game nail a number of the tropes and traditions of the superhero world. One way they did this, when the game was new, was in their model of the game's 'currency'; instead of money, superheroes gained Influence when they successfully defeated bad guys, completed missions, and occasionally did other things in-game. There were other commodities that heroes could acquire by doing missions -- bad guys occasionally 'dropped' Inspirations and Enhancements that could be used to increase a hero's effectiveness, but these drops were random to some degree. (For Inspirations, the drop rate was largely random. For Enhancements, the level of the Enhancement was determined by the level of the defeated opponent, while the type of Enhancement was largely random. There's also an Origin factor that is partially controllable -- each hero has a particular Origin and certain types of opponents dropped Enhancements usable by particular Origins -- more on this later.) The use of Influence in-game was largely to balance out the random chance in the game; if you needed more damage Inspirations than you were getting as random drops, you could visit an in-game trainer and trade Influence for them. Likewise, if you had a Mutant Origin and were running missions against the Circle of Thorns, a cabal of evil wizards who tend to drop Magic Origin Enhancements, you could go to a 'store' and 'sell' the dropped Enhancements for Influence, then use that Influence to 'buy' the Mutant Enhancements you could actually use.

Influence, thus, was an in-game currency; however, the existence of a currency is not sufficient to establish an economy, and nobody I know of really considered early City of Heroes as having an economic system.

This changed with the release of 'Issue 6', where the developers introduced two new features side-by-side:

A crafting system: In addition to dropping Inspirations and Enhancements, defeated foes would now sometimes also (or instead) drop Salvage and Recipes. Salvage is basically the ingredients used in Recipes to create things in-game. The most common thing created in-game is Crafted Enhancements, which aren't restricted by Origin the way 'normal' Enhancements are, though it's also possible to craft costume pieces, temporary powers, and other things.

An auction house/consignment system: Rather than create a series of NPC vendors where players could purchase Salvage and Recipes (as players can still do with 'normal' Enhancements), the developers provided an 'auction house' where players can put the Salvage and Recipes (and even Enhancements and Inspirations) they don't want up for sale, allowing other players to bid on those items.

As Influence was already the in-game currency, Influence became the medium of exchange used to negotiate these consignment bids -- the more popular an item, the more Influence people were willing to bid for it and thus the more 'valuable' that item is.

There's nothing particularly wrong with this system; characters have a limited number of 'slots' in the Consignment House that can be used either to place bids on desired items, post items for sale, or simply act as extra storage for items that won't fit in the character's personal or base inventories. If you want an item, you can see what other winning bids for those items are and you can choose to bid that amount (or larger) if you want the item right away, or you can bid lower if you would rather take the time and try to get the item for a lower Influence cost.

Then the Market People moved in.

The Market People really enjoy playing the market in City of Heroes. They like trying to predict the bounces in bidding in the Consignment House system, buying low and selling high. They like being able to get more Influence out of other players than they otherwise would get by selling their items to in-game vendors. None of this is, in itself, a bad thing.

But then they post about it...

They'll post about the first time they sold an Enhancement for 100 million Influence as if they were posting about defeating one of the game's signature villains, and they'll get congratulatory messages just as if they'd accomplished that far more significant milestone. They'll alternate posts explaining how their market activities don't hurt (or even help) casual players with posts decrying casual players as lazy and stupid for not following the principles of market 'domination'. And they love, absolutely LOVE, presenting themselves as experts in, or at least teachers of, real-world economics and how it applies to City of Heroes.

In short, they're more than a little bit creepy, especially with their insistence that they are smart people who understand 'real economies'. I'd feel a bit less concerned if City of Heroes actually had a real economy; then I could at least allow them their belief and simply say my own is different. The problem is that City of Heroes isn't a real economy -- it's a Cookie Monster Economy.

I have Timothy Burke of Terra Nova to thank for the analogy:

Recently, I happened to catch a segment of Sesame Street that my daughter was watching. In it, Cookie Monster was trying to hire a human assistant to help him sell six cookies. Cookie Monster explained helpfully, “Cookie Monster sell cookies in order to have money to buy cookies”.

Cookie Monster, of course, sells cookies because cookies are the only thing in the world he values. But the City of Heroes Market People sell Enhancements, primarily, in order to have Influence to buy... Enhancements? Yep, that's pretty much it in a nutshell. (The presence of Salvage and Recipes could be seen as making the City of Heroes market a bit more complex, but Salvage and Recipes, while they have a certain value, only have value to the degree they're used to make Enhancements -- so it's as if Cookie Monster, realizing that selling cookies to make money to buy cookies seems a bit silly, decided instead to sell cookies to buy flour, eggs, and sugar to make cookies to sell to make money to buy cookies. It's a wonderfully circular analogy that perfectly matches the actual City of Heroes economy.)

Of course, there is another option: that, instead of using their humongous amounts of Influence to buy pricey Enhancements, the Market People are simply using their Influence totals as a scorecard for how well they're playing their version of the game. Ultimately, that is a valid goal to set for oneself in a game environment. The only problem is: how many people do you know in the real world who measure their value (and thus other people's value) as people by the size of their bank accounts? Do you really enjoy spending time with those people? Or do you find that their strange interpretation of economics, in which everything has a monetary value if one just decides to do the math, makes them... well, creepy?

In one final attempt to understand these Market People, I decided to try to emulate them -- to play the game they seem to be enjoying so much and try to find out why they enjoy it. I created a character, adventured around until I was high enough level to have a reasonable number of slots in the Consignment House (level 7), and then began following their guides to economic mastery.

First, I began by performing what real-world economists and traders would refer to as arbitrage: I bid on and purchased 'stacks' of Salvage (you can buy up to 10 items of a particular type at one time in a single 'stack') from the Consignment House, took them across the street to an NPC who acts as a 'store', and sold the Salvage for Influence. This is a perfect example of arbitrage, since the two 'markets' are entirely disconnected; the 'store' pays a fixed amount for salvage of a particular rarity, while the Consignment House prices are set by sellers when they enter the amount they're willing to accept in payment for the item they're putting up for sale. I very quickly found that I was able to purchase half a dozen different Uncommon Salvage items (including Sapphires, Chaos Theorems, and Psionic Threat Reports) for 10 Influence each; I'd grab a stack of 10, take them over to the 'store', and sell them to the vendor for 1000 Influence each.
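For the curious, the arithmetic behind that arbitrage loop fits in a few lines of Python. The prices are the ones from my runs above; everything else, names included, is just illustration, not anything from the game itself:

```python
# Back-of-the-envelope math for one Salvage 'stack' of arbitrage,
# using the prices described above (illustrative names, not game APIs).

STACK_SIZE = 10     # items per stack on the Consignment House
BUY_PRICE = 10      # Influence paid per Uncommon Salvage item
VENDOR_PRICE = 1000 # Influence the NPC 'store' pays per item

def stack_profit(buy_price, vendor_price, stack_size=STACK_SIZE):
    """Profit from buying one stack and vendoring it across the street."""
    return (vendor_price - buy_price) * stack_size

print(stack_profit(BUY_PRICE, VENDOR_PRICE))  # 9900
```

Nearly 10,000 Influence per stack, for a 100-Influence outlay and a short jog -- which is why the loop pays off so quickly at low levels.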

To borrow Market People Speak for a moment, this works because casual players are either stupid (they don't realize that there are places to sell their salvage where they'd receive far more than 10 Influence for each piece) or lazy (they know of the 'stores', but don't want to take the time to run to the stores when they're already at the Consignment House trying to sell other items), or both.

At times I could sell Salvage this way as quickly as I could run between the Consignment House and the vendor; by the time I returned to 'Wentworth's' (the name of the Consignment House in-game), there'd be another stack or two waiting for me to grab and sell. At other times, I'd have to wait a while, but in those cases I'd log out of that 'toon', play a different character, then log back in after three or four hours and sell the stacks that accumulated while I was gone.

Once I got up to about 250,000 Influence, I decided to try my first 'flip'. 'Flipping', in the game, is the term for buying an item at a low price and then selling it at a higher one. The Market People like to describe this in terms of cyclical pricing and the principle of supply and demand: when there is a lot of supply on the market, sellers price their items lower to make the sale rather than have the item sit unsold in a Wentworth's inventory slot. As those cheaply priced items are bought up and the number of buyers grows, only the higher-priced items are left; when demand exceeds supply, the price goes up.

Having seen the market in action, I can say that there are items in the game that do follow this cyclical pattern. However, for about 90% of the items in the game, this cyclical pricing behavior means absolutely nothing; if you graphed the price of those items over time, you wouldn't get a cyclical, sine-wave plot, but a random scatter-plot. The reason is that, for about 90% of the items in the game (though a significantly smaller percentage of the top-level items), there are significant stretches of time when those items have no buyers and no sellers -- nobody has an item up for bid, and nobody has a bid in waiting to grab an item when it finally appears. In those situations, an item can sell for just about any price.

There is a history of what the last five items of that type sold for, along with the dates of the sales, and sellers do appear to let that history guide them when posting the minimum acceptable bid for their own items on the Consignment House. However, sellers are also required to pay a fee in Influence equal to 5% of the amount they set as their minimum acceptable bid. So if a player gets a 'drop' of an item that normally sells for 5 million Influence, but only has enough Influence to pay the fee on a 300,000 Influence minimum bid, that player will often post the item at the lower minimum and hope for the best -- the buyer can't see the seller's minimum bid, just the history, and it's entirely possible that a buyer will simply bid the history price without checking whether the item might have been posted lower to save on posting fees.
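The fee mechanics are worth spelling out, since they're what drive that 'post low and hope' behavior. A quick sketch, assuming the 5% rule works exactly as I've described it (the function names are mine, not the game's):

```python
# The 5% posting-fee constraint (illustrative, not a game API).

FEE_RATE = 0.05  # fee = 5% of the seller's minimum acceptable bid

def posting_fee(minimum_bid):
    """Influence the seller must pay up front to list at this minimum."""
    return minimum_bid * FEE_RATE

def max_affordable_minimum_bid(influence_on_hand):
    """Highest minimum bid whose posting fee the seller can actually pay."""
    return influence_on_hand / FEE_RATE

# Listing a 5-million-Influence item at its 'real' price costs
# 250,000 Influence up front...
print(posting_fee(5_000_000))              # 250000.0
# ...so a seller holding only 15,000 Influence can't list above 300,000.
print(max_affordable_minimum_bid(15_000))  # 300000.0
```

In other words, a cash-poor seller's listing price is capped at twenty times whatever Influence they happen to have on hand -- which is exactly the gap a patient bargain-hunter exploits.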

Again, if I may borrow Market People Speak, this is because sometimes buyers and sellers are lazy; if you don't have enough Influence to pay the fee for the minimum price you want, you're supposed to hold onto the item and grind out the Influence required to make the fee payment. Alternatively, a Market Person will explain that this is because buyers and sellers are stupid; a competent market player will always keep some amount of 'liquid Influence' on hand to make these kinds of fee payments, and a competent market player will always start bidding incrementally up from an extremely low bid until he reaches the level he's comfortable with, just in case such a bargain is there to be had.

By searching specifically for auctions with no buyers or sellers, I was able to fairly quickly turn my 250K Influence into 3 million Influence by posting a lowball bid, winning that bid from a seller who had posted a lower-than-history minimum, then re-posting the item for slightly more than the highest price in the history. Every point of that Influence was 'earned', if such a word can be used to describe what I was doing, by taking advantage of inefficiencies in the Consignment House market system -- Michael Lewis would have been proud of me.

Once I had my 3 million in 'seed money', the rest was easy. Identify a top-level Enhancement with a cyclical price structure (my first was an Enhancement that moved between 2 million and 8 million in price depending on circumstances). Post a bid for just over 2 million, leaving enough Influence aside to pay the posting fee on my eventual minimum bid. Buy the item. Post it back to the market with a minimum bid of 8 million or just below. Sell the item.

Lather, rinse, repeat.
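Put numbers on one cycle of that flip and the appeal, such as it is, becomes obvious. A sketch, assuming the 5% posting fee is charged on the re-listing's minimum bid (a simplification -- the actual sale can close above the minimum):

```python
# One buy-low/sell-high cycle, net of the listing fee
# (a simplified model, not game code).

FEE_RATE = 0.05  # fee = 5% of the listing's minimum acceptable bid

def flip_profit(buy_price, sell_price):
    """Influence gained from one flip, after paying the fee to re-list
    the item at (approximately) its eventual sale price."""
    return sell_price - buy_price - sell_price * FEE_RATE

# Buying at just over 2 million and re-listing at 8 million:
print(flip_profit(2_000_000, 8_000_000))  # 5600000.0
```

Roughly 5.6 million Influence per cycle -- almost double my entire 'seed money' -- for doing nothing but clicking the same two buttons.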

At this point, I simply lost interest in the entire process. The Market People had led me to believe that this was a challenging part of the game -- one poster described the consignment system as 'market PvP', or the equivalent of player-versus-player combat. But all I was doing was the traditional 'buy low, sell high' that traders have been doing since the dawn of time, and identifying the places to do that in the consignment market wasn't anywhere near as challenging as the Market People had led me to believe. I didn't feel smart or hard-working playing the market; I just felt bored.

What made the situation even worse was that I was buying and selling items I'd likely never use. The most Influence is to be made dealing in high-level items, and to use a high-level Enhancement you have to be a high-level character yourself. But my marketer is level 7, and it takes a long time to go from level 7 to level 50 when you generally only play a few hours on the weekend (and have nearly a dozen different characters to choose from on any given day). I could check my marketing in ten to fifteen minutes after I got home from work during the week; by the time I got to the weekend, I wanted to do something, anything, other than mess around on the market.

Even more ironic is that most of the Influence I've 'earned' would likely be wasted if I did try to play the character in the traditional way. While I could completely trick myself out with level 10 crafted Enhancements -- the lowest-level crafted Enhancements in the game, and the only ones a level 7 character can equip -- they also carry the lowest bonuses among crafted Enhancements (the bonuses increase as the level goes up), and crafted Enhancements, unlike 'dropped' Enhancements, can't be 'combined' to become more effective as the character gains levels. In effect, I'd have to choose at what level to throw out nearly all of my existing Enhancements and replace them with new ones, and while there are ways to do that without completely losing the value in your existing Enhancements, there's no way to do it and preserve all of that value -- meaning some amount of my 'hard-earned' Influence would spiral down the toilet every time I wanted to upgrade.

And sure, I could just get back on the treadmill and grind out a few tens of millions of Influence all over again, but why? I'd already demonstrated that I don't find that fun.

The Market People are welcome to play their game and enjoy it. I just feel the need to respond when they try to tell me how much fun I'm missing by not playing their game -- because I've tried it, and I'm not missing anything.