Sometimes Justice League is really cool. Other times, well...
Like the episode that aired last night, whose plot centered on a group of cloned superheroes that had been engineered by some top-secret agency to be "totally loyal to the government... unlike those loose cannons in the Justice League."
The band of hero-bots decides to take out the Justice League, for God and country. When one of them (Windtalker, or Dreamcatcher, or something, with big sad eyes and a tragic fate) demurs and raises objections, the others blast him aside, sneering, "It's like the man said: you're either with us, or you're against us."
Maybe I'm reading too much into this. .... Naaahh.
Explain to me again how our popular media is indoctrinating our youth with messages of conformity and unquestioning loyalty and the necessity of crushing all dissent.
I hate when that happens: I'm working up a post in my head, and I've got some great illustrative example I want to use in it somewhere, and I flesh out the whole post to support that one example (rather than the other way around)... and then, when it comes time to actually write the thing, I forget to mention the example I had in mind.
See, in this post, I mentioned a phenomenon I called "Cargo Cult software"—and yet I managed to forget to point out one of the most obvious examples of it that I'd seen, of a concept that was designed on one platform for a specific engineered user-experience purpose, and then was adapted for use on a second platform with seemingly the same function but with no clear understanding of the principle behind the feature's existence in the first place.
The example of which I speak is the startup sound. And yes, the original feature I have in mind is the chime on the Mac, the bonggg sound that you hear as soon as you turn on the machine. (Incidentally, as Mac guys know, various generations of Mac models have had different startup chimes over the years, starting with the simple square-wave "beep" of the original 128K Macintosh, proceeding through several variations that added harmony and progressively more aurally interesting chords, including the weird suspended one on the Performa 6100 series and the fanciful echoey one on the Twentieth Anniversary Mac, to the deep full-chested one we've had for several years now—and we're due for an update, don't you think? See Mactracker for this and other information for Apple history obsessives.)
The purpose of the startup chime (SGI had one too, and first) was to tell the user that the power was on and the machine was about to start booting. It says that the initial POST checks have passed and the motherboard is viable. (There's a "death chime" if the POST fails, too—or at least, there was in earlier Macs. So very sad-sounding.) It tells the user something concrete. Now, from a user-engineering perspective, there are two useful places to have a sound play during startup of your computer: a) at power-on, and b) when your user environment is ready to accept input. Apple chose to put their chime at power-on, but they could just as easily have put it at the "system ready" point—it would have been plenty useful to the user. But where they did put it makes sense. It means the basic hardware is working. It means you've begun the boot process.
But then Microsoft added a startup sound in Windows 95. They didn't put it in at power-on time, because they couldn't. But neither did they put it at the next most logical place, the "system ready" point, where the Desktop is done setting up and the cursor can let you drive. No—they put it, for reasons that are fathomable only to those in the inner sanctum of the Redmond war room, at the beginning of the user login process, after OS boot has completed. What does "The Microsoft Sound (by Brian Eno).wav" tell the user? Apparently, that "the system has completed an indeterminate segment of a long boot process, and will be ready for you to use after another indeterminate period." How is this useful? You hear this sound, and instead of it signalling to you that you have either successfully accomplished something or are now able to start doing something, it tells you... precisely nothing. Except that your speakers are on.
This is what I mean when I say "Cargo Cult software". Did Microsoft have any understanding of why Apple put in a startup chime? I don't blame them for being too limited by generic PC hardware to be able to affect anything at the immediate power-on stage, so they can hardly have been expected to put a sound there; but couldn't they have made their ethereal synthesized musical phrase occur at a time that's useful to the user even slightly—namely, say, the "ready to accept input" phase? That would have demonstrated that they understood the purpose of the feature, and weren't just copying a Mac bell and/or whistle for the cynical reason that they had to impress the rubes in Radio Shack into thinking Windows 95 was just like a Mac.
This is where so much of the derision aimed at Microsoft from within the Mac camp tends to come from. Mac-heads generally can see the user-experience rationales for the design decisions made by Apple. It's not that we resent Microsoft's copying them; that's sort of expected, and we can hardly begrudge them doing what any competitor in the free market would do. But when they do it blindly and without apparent thought, or when they copy the feature's basic functionality while changing some trivial element of it in a way that makes their motives look self-conscious and patronizing (like calling it "Recycle Bin" instead of "Trash"), one can't help but curl a lip a bit. It's things like that that make otherwise gracious and tolerant and practicality-minded computer geeks rant and rail idealistically about how much better the computing world could have been oh, if only we didn't so readily accept the mediocre and fail to demand the best, and so on ad nauseam.
Now, don't get me wrong: I think Microsoft's engineers are doing the best they can, down at the individual level; the company is full of geniuses. But they're having to deal with a corporate culture that was ossified back before Windows 95 was even in the planning stages, and now has become impossibly strangulated by its own size and labyrinthine backward-compatibility-constrained processes, such that the company couldn't produce a charismatic public-relations figure like Steve Jobs if it tried—and the only facsimile thereof that they have is bewilderingly embarrassing. With Internet Explorer stagnant, Windows' next major version being repeatedly delayed into the indefinite future, and once-grandiose initiatives fizzling or meeting with intense skepticism, one can't long avoid the impression that Microsoft's upper-level designers and planners are doing little more than waving their arms around and chanting a lot of mystical incantations, in the hopes of hitting on some lucky breakthrough that they can sell to an awestruck public. It also makes one wonder, the longer they keep fruitlessly trying, whether a computing platform based on demonstrable vision evident in every trace and screw is destined never to dominate the market—but only to be a niche player, marginalized by whoever is able to market a passable simulacrum the cheapest.
But that's an emerging market for ya: nobody knows, and we won't until this is all in textbooks read by our great-grandkids in grade school.
Well now, there's something just so spectacularly sad about this. The by-now-famous "Sidetalkin'" site, dedicated to the Nokia N-Gage and its gloriously silly "sideways" method of using it as a cellphone, is now... no more. For the N-Gage QD is now out, and it looks depressingly normal.
But check out this purportedly leaked ad from Nokia.
If it's for real, Nokia wins huge points with me. That is shrewd. I wish more companies would market by playing up their criticisms... because it's so dang entertaining.
Here's a little geekiness for the New Year: an article written a little while ago by Nitrozac and Snaggy, creators of The Joy of Tech, covering all aspects of getting the most out of your iSight.
How many times do you do it a month? Oh I know, I know, it used to be every day; sometimes even two or three times a night! Alas, those passionate feelings have probably faded a little with time, and that feverish desire you once felt has moved into the comfortable blahs, or worse, you barely think of each other with techno-lust anymore.
Admit it. You are completely bored with your iSight. Your impulsive, giddy love affair has all but dried up, now that the reality of video chatting has settled in. To paraphrase the band Talking Heads, you may find yourself in front of a beautiful geekosphere, and you may find yourself on some beautiful bandwidth, but you may ask yourself: where is my useful device? Is this my beautiful iSight? How did I get here? My God! What have I done?
Blast! I know what song they're talking about.
Since the article was written, iChat has acquired an AppleScript interface, making some of the hacks unnecessary; and Tiger brings not only super-slick multi-way video chat, but also has little features like a "Current iTunes Track" setting for your status message. That's the thing about Apple software: it's always being improved. There's almost nothing they let stagnate. About the only perpetually languishing Apple app is AppleWorks, and it looks like they're finally about to do something about it...
The EcoBot II uses human sewage as bait to catch the insects. It then digests the flies, before their exoskeletons are turned into electricity, which enables the robot to function.
Bacteria in the sewage eat the flies' soft tissues, releasing enzymes that break down the hardened shell.
Sugar molecules released from the broken-down shell are then absorbed and used as energy by the bacteria.
"The robot then has the energy to carry out some example tasks which in this case include moving towards light, measuring temperature. It has a temperature sensor. It could be anything, but we have chosen temperature," Melhuish said.
"Then it transmits that temperature information over a radio link to a base station a couple of meters away and it does that all using the energy from insect or plant material."
I must admit I'm looking forward to this: Tim Burton's Willy Wonka remake. The trailer certainly seems to have captured a certain unsettling intrigue about the characters and the setting, which is all to the good, I think.
I don't know whether I'm alone on Earth in this or what, but I was never very keen on the 1971 version. It just seemed so very sad and lifeless. Gene Wilder floated around the stage reciting bored lyrics to lugubrious songs, and nobody exhibited a sense of urgency throughout the entire thing. Now, I remember the book—it was frenetic. Wonka was a hyperactive, bouncing-off-the-walls freak, and nobody could tell whether he was in possession of his marbles or even human; one was constantly wondering what he'd do next, which is precisely the feeling one gets from that unhinged-looking grin that Johnny Depp gives in the trailer. And in the book, I could have sworn that the frickin' Oompa-Loompas didn't sing (though Nathan S. writes to assure me that they did—just not so 70s-ily).
I've always thought Tim Burton was a perfect match for adapting Roald Dahl books. His inherent sense of darkness is precisely what you need in order to tell these kinds of stories properly, not lots of primary colors and Munchkin-land sets. One of my favorite Dahl adaptations, Matilda, isn't done by Burton—but it might as well have been, for all the innovative camera angles, super-busy set pieces, and over-the-top character portrayals that outpace any of the Harry Potter castings. (Actually it's directed by Danny DeVito, and I thought he did a marvelous job.) It always felt like a Burton movie to me; silly as it sounds, I saw it in the theater (on an inspired whim) with a bunch of twentysomething hipsters—and kid's movie though it might seem, we all loved it. Discussing it in the parking lot afterwards, we all agreed that it was dark and distressing, yet thoroughly satisfying in texture and conclusion—exactly the kind of thing Burton had apparently forgotten how to do by then, and something we all hoped to see more of. Or maybe the other guys were just high; I don't know. Either way.
It's interesting to see that Tim Burton and Danny Elfman seemingly parted ways in recent years; but they're back together on CatCF, which does my heart good to see. This is a combination that looks great on paper. But a lot of things do, of course: it remains to be seen how it'll look up on film. I've got high hopes, though.
I wonder if John Dvorak will ever tire of predicting doom for Apple. Now he says that you can feel free to ignore all those iPod sales this season, the Apple Stores sprouting up in malls everywhere, the Apple logo at the end of Return of the King, and every other sign that the company has never been healthier—because he went scraping through some Apache logs at a particular site and found that Mac users represented only 2.7% of the hits, less even than Linux.
Well, shucks. I guess that means I can go hit the W3Schools site a couple dozen times from my Mac and change history, huh?
MacDailyNews has a neat little rebuttal, though, using some of those pesky "fact" things we've heard so much about lately:
Let's take a look at W3Schools. W3Schools is a website designed to teach people how to develop Web sites for Microsoft's Internet Explorer using the Microsoft ASP.NET framework. W3Schools statistics above are extracted from W3Schools' log-files, and also include "monitoring other sources around the Internet" (W3Schools doesn't disclose which sources or their weight in relation to their own logs). Therefore, it should come as no surprise that visitors to the W3Schools' site would be using predominantly Windows and Internet Explorer. On the flip side, if you took a look at MacDailyNews' logs, you'd conclude that Macs have 91.7% market share and Windows has less than 7%.
However, the real stat of importance, the stat Dvorak fails to mention (perhaps because it blows the foundation for his whole theory to pieces), ironically comes directly from W3Schools' windows-slanted logs themselves: Mac market share in March 2003 was pegged at 1.8%. Mac market share in December 2004 was 2.7%.
Oh no! More Mac users than ever before are learning how to program for .NET! What'll we do?
Meanwhile, how's that whole Passport thing workin' out for ya?
Incidentally, I spent about an hour today trying to figure out an alternative to the utterly maddening Windows Media Player, which I'm obligated to use a couple of times a day for various tasks (namely, opening WMV animations created by certain people, scrubbing to a visually auspicious point in the movie, and selecting and saving a still-frame for use as a thumbnail for the movie). Windows Media Player, totally apart from the insane "disappearing mouse-over window-frame" thing and the bizarre bulbous pseudo-transparent shape, drives me bonkers in that dragging the scrub bar control back and forth through the timeline does not actually scrub through the movie—the video desynchronizes and does not update to match where you place the playhead, and even if you try to play it from that point, it might be five or ten seconds before it hits another keyframe and starts showing you video again. So scrubbing to a good point in the movie for a thumbnail is impossible—and to add insult to injury, Windows Media Player has no Copy function. That's right: you can't simply save a still frame. You have to do a screenshot at the Windows level.
"Oh, what I'd give," I said, "for the ability to scrub real-time through the movie and save still frames at will, like, oh, I don't know, in the QuickTime Player." But such was not to be my fortune. Until I asked a friend about it and was directed to this site, where you can download a "Classic Media Player" that looks like WMP before they "prettied it up" in the manner befitting Homer with his "makeup gun" invention. Download the "K-Lite Codec Pack" (I got the "Full" one), and use the media player that comes with it. It's pretty trashy; the menus freeze randomly for several seconds every time I try to open them, and the scrub bar is slow and laggy, so I have to scrub slowly—but at least scrubbing works, meaning that this seemingly basic feature of video playback is quite simply beyond Microsoft's competence. The player is also very disk-intensive, meaning that I have to copy the movie to the Windows machine to open it and play it properly, rather than playing it over the network via SMB; but that's a small price to pay—for glory be, it has a "Save Image" function. As hideous as this piece of software is, it's worlds better than WMP. And I've now shaved about 15 minutes of work off my daily chores.
Whatever deep-rooted psychological torment John Dvorak must suffer from to make him lash out so pointlessly and cluelessly at Apple whenever a column is due, perhaps this tip will dull his pain a little and we can all be happy.
Aziz pointed out this rather remarkable blogger—a female Czech/Iraqi software developer, a Bush supporter, Iraq war supporter, and person with a lot of distinctive perspectives on a lot of issues. This article on software design is a perfect case in point—I wonder how many software people feel pangs of guilt and regret over having been part of the revolution that has plagued the Earth with the fruits of their frontal lobes, causing such mental weariness and consternation to otherwise perfectly intelligent members of society who simply can't intuitively grok the vagaries of Internet acronyms and confusing dialog boxes.
Very interesting writing. I may have to spend some time here.
Regarding the tsunamis and the relief effort I don't have a lot to say. Well, no, that's not true. There's plenty I could say, but I think I'd be happier with myself if I abstained. It'd certainly make for a more pleasant experience looking back on this period from six months in the future.
What I mean is this: there's an awful lot of temptation out there to treat the quake/tsunami disaster as an opportunity to "prove", through innumerable little acts of one-upmanship, that so-and-so is more generous with his largesse than such-and-who else. With the Amazon.com/Red Cross donation box having taken in over $6 million all by itself (and the numbers are still climbing), and with dozens of other charitable organizations collecting money, it's easy to use that to demonstrate how great the charitable spirit of the private American citizen is as opposed to what the governments of other countries—or even our own—have pledged. Similarly, one could point out that Microsoft has nothing on its home page to correspond to what Apple has posted in light of the occasion, for all the good that would do. (UPDATE: Now they do.) Personally? I don't think it accomplishes much. It only serves to turn this whole thing into a less-stingy-than-thou game, which is a really awful thing to be doing right about now.
There are those who will smirk at the fact that the initial US Government donation was about $15 million, followed by that UN official's remark that (depending on whose side of the story you believe) "prompted" us to raise the offer to $35 million, as though the amount we're willing to donate depends only on how much green we have to feed into the little slot machine to get little dings of approval from the nodding heads lined up in the General Assembly. The sainted WaPo has weighed in on the nerve of President Bush to be vacationing at this time of year, instead of personally ripping off his suit coat and Santa hat and wading into the surf to save drowning victims. MoveOn.org has already sent out e-mails to its members urging them to write to Bush to express their outrage at his uncaringness and pigheadedness, helpfully pointing out that we're burning through $35 million a day in Iraq. (Surely we could just, I don't know, put all our soldiers in a big icebox to keep them out of trouble for 24 hours, and give the money to the Red Cross instead? Then, hey, Iraq would have peace for a day!) And it's easy to point out outrages and shameful cover-ups and criminal pettiness on the part of the usual suspects. But you know what? This is like watching the live coverage of the events of September 11th, seeing the headlines crawl across the bottom of the screen, and mumbling to yourself about what you might like to have for lunch.
Is it too much to imagine, for instance, that the US and other countries have contributed increasing amounts of aid because it wasn't so very many hours ago that we thought the tsunamis had killed only ("only") ten thousand people? The understanding of the scale of this event has rolled over us with as much deliberate and smothering weight as the waves themselves did on those beaches—and it's taken a while for it to sink in. It's going to get worse still as "missing" figures continue to be added to the casualties. There is no human shame in becoming slowly, painfully more aware of how big a thing this is and how much we ought to contribute. This isn't about thinking of ourselves as emotional whores whose sympathies can be bought with a sufficiently high body count. This is about the slow realization of something that's going to be affecting all of us, and an opportunity to put aside the temptation to compete and backbite, and see what we can do to fix the problem—or at least mitigate it a bit.
If there's anything I've come to appreciate in recent years, it's that the decisions that are made in this world, all the way on up to the top of our highest institutions, are made by human beings, with the same mental chemistry and human motivations and provincial family concerns and stubbly chins and lack of sleep and worries about the very real future that every one of us has. The people who came up with the $35 million figure are better qualified to account for it and how they arrived at it than any of the pundits who style themselves experts in everything they ever read a headline about, and this is one of those times when the speed of decisions matters. There will be ample time to criticize our respective collective generosities and measure everyone's charity dicks against each other in the future, so let's wait until then. Better yet, let's not do it at all.
UPDATE: Oh, and incidentally, the Evil Corporations have all just joined the party. KCBS just rattled off a list of names including Pfizer, Citigroup, the Bill & Melinda Gates Foundation, Coca-Cola, Johnson & Johnson, Bristol-Myers Squibb, and about ten others, each of whom was donating $3 to $10 million. That just about leaves any government's pledge in the dust, and makes even the Amazon.com thing seem like small change.
For what it's worth.
UPDATE: Here's the list. And Tim Blair has a roundup of contributions and fundraising efforts from all over the world, including the latest news and anecdotal stories of heroism.
Well, doesn't this just make one want to sit up and bark...
With iPod-savvy Windows users clearly in its sights, Apple is expected to announce a bare bones, G4-based iMac without a display at Macworld Expo on January 11 that will retail for $499, highly reliable sources have confirmed to Think Secret.
The new Mac, code-named Q88, will be part of the iMac family and is expected to sport a PowerPC G4 processor at a speed around 1.25GHz. The new Mac is said to be incredibly small and will be housed in a flat enclosure with a height similar to the 1.73 inches of Apple's Xserve. Its size benefits will include the ability to stand the Mac on its side or put it below a display or monitor.
Along with lowering costs by forgoing a display (Apple's entry-level eMac sells for $799 with a built-in 17-inch CRT display), the so-called "headless" iMac will allow Apple's target audience -- Windows users looking for a cheap, second PC -- to keep their current peripherals or decide on their own what to pair with the system, be it a high-priced LCD display or an inexpensive display. Sources expect the device to feature both DVI and VGA connectivity, although whether this will be provided through dual ports or through a single DVI port with a VGA adapter remains to be seen.
Wow. Several commentators (including Aziz Poonawalla, who alerted me to this) have speculated hopefully about this kind of new Mac in the past: a headless, entry-level, laptop-form machine to compete with the loss-leaders from Dell and HP and Gateway (which tend not to come with monitors, though you only find that out in the fine print). I've always taken Apple at their word when they said that they had no interest in that market segment, deeming the sacrifices they'd have to make in hardware packaging and software functionality to be too much of a downside to justify the upside. But as the article says, it seems the iPod really has changed all the rules. Suddenly Apple is a household name again, and in a good way this time; now everybody's looking at them expectantly, saying, "Okay—you've got my attention. What do I do next?"
So they're going ahead with what's meant to be the ultimate Switcher Box, are they? Maybe not—this is just a rumor at this stage. But we'll find out soon enough. Talk about striking while the iron is hot, though. I don't know if there's ever been a riper time.
Jordan Golson has collected some home videos taken by vacationers at spots where the tsunamis hit. They're worth watching, especially the Norwegian one—it's hard to imagine, otherwise, just how slow and menacing this kind of thing is to experience. The water just keeps coming.
Amazon.com is collecting donations, as noted elsewhere (but it can't hurt to spread the word farther).
UPDATE: It's become quite an illustrative phenomenon too. Seriously—I don't wanna hear anyone argue, in the face of this phenomenon, that a country that tries to keep taxes low promotes greed and stinginess among its citizens.
So now that the Washington Post has reported a Bush increase in Pell Grant funding as a decrease, and apparently still thinks the Rather Memos were genuine and the innocent casualties of a bunch of untrained dung-throwing lowlifes...
...Do you suppose they'll report this story as proof of the intolerance and bigotry of America, and fail to mention that it was a fraud?
Hell, what have they got to lose? It's not like people wouldn't believe them.
It's back! Phase II of the infamous "Child's Play" article, where today's kids (with all their modern ideas... and products!) are plunked down in front of beloved classic video games, and their off-the-cuff reactions are recorded for posterity.
Bobby: This is like Pong. Everybody thought it was amazing and good, but now we're just thinking, "Oh, it's only a good loading screen for Test Drive."
Dillon: And to think 20 years from now, people are going to think, "Oh, you're playing [GameCube Zelda game] Wind Waker? That's boring."
EGM: What will you say when your kids say Wind Waker looks boring?
Parker: Get out of my house. You're out of my will.
11:30 - One who breaks a thing to find out what it is has left the path of wisdom
I know I should probably just be leaving well enough alone, but—why do you suppose it is that Glenn says that iTunes is "still not an Amazon.com for online music"?
I'm sure he can't mean the iTMS is missing features that Amazon has. What's lacking, anymore? Gift certificates? Recommendations? Reviews? Account management and history? Linkable pages? From my perspective as a shamelessly biased corporate shill, I think it's all there, or at least everything that's relevant. It even has stuff I'm not aware Amazon has, like allowances. I'm pretty sure he doesn't mean Amazon.com is a better store for online music; they've got that "Music Downloads" section, but there's not much in it but artists who don't mind publishing free MP3s. It's just not the same kind of store at all.
Is it just a matter of selection? It's true that many omissions are curious, though Apple has been diligently adding new music at a very steady clip ever since opening the store; there are some bands that I doubt will ever appear there *cough*Beatles*cough*, but the other holes are being steadily plugged. I remember someone saying upon the iTMS's opening that it would not gain credibility as a retail avenue until it had a million songs in its library. Well, haven't they passed that mark by now? They certainly seem to have achieved some measure of credibility.
I suspect that the interface of iTunes is really key here. I don't know that people would put up with even a totally full-featured and usable music store if the music-browsing interface itself were mediocre. Remember when iTunes for Windows first came out? It was a revelation to many. And deservedly so: iTunes is a stunner of an app, revolutionizing the world of interface design even in the eyes of those who represented the vanguard of the industry's then-best practices. Everyone on the Mac side has been coding to the iTunes standard since it came out—it was like a flash of sudden enlightenment that filled everyone with a renewed desire to create great new software that could benefit from the new unifying simplicity of the iTunes metaphor.
On the surface, the slickness of iTunes is simply its one-window approach, its eschewing of the here-useless hierarchical "tree" view, and the stabbing insight that what we're dealing with here are not MP3 files, but songs. Songs aren't organized by "filename" and "folder", but by album and artist and genre. And that's only the beginning of the list of criteria by which to sort one's music. Other organizers of data have given us huge amounts of meta-data before, including Windows XP; but nothing until iTunes had had the audacity to hide the files/folders metaphor entirely and abstract the entire interface into the best-suited criteria for organizing a particular form of media. The real leap of genius in iTunes is to use an adjunct database that separates the music layer from the files layer; this creates an abstraction that's potentially prone to breakage and strikes purists as inelegant, but it's a leap that made a whole world of new features possible. It took many people a good while to figure out what iTunes was trying to do (particularly those used to Windows and apps like WinAmp): create a music interface, not just a front-end to files and folders, not just a suited-up filesystem interface with extra bells and whistles. Once people understood this, everything fell into place: it was just so obvious.
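That "adjunct database" idea is easy to sketch: the user-facing layer deals only in musical metadata, while the file paths live in a hidden index the interface never exposes. Here's a minimal illustration in Python—the class and field names are my own invention, purely to show the separation of layers, not anything resembling iTunes' actual internals:

```python
# A toy sketch of the layered design: users browse songs by
# metadata (artist, album, title), never by filename or folder.
class MusicLibrary:
    def __init__(self):
        self._index = {}  # hidden layer: song id -> record (including file path)

    def add(self, song_id, path, artist, album, title, genre):
        # The path is recorded here but never surfaced in the interface.
        self._index[song_id] = {"path": path, "artist": artist,
                                "album": album, "title": title,
                                "genre": genre}

    def browse(self, artist=None, genre=None):
        # The user-facing layer: query by musical criteria only.
        return sorted(
            (s["artist"], s["album"], s["title"])
            for s in self._index.values()
            if (artist is None or s["artist"] == artist)
            and (genre is None or s["genre"] == genre)
        )

lib = MusicLibrary()
lib.add(1, "/mess/y/folders/track01.mp3", "Talking Heads",
        "Remain in Light", "Once in a Lifetime", "Rock")
lib.add(2, "/other/dir/song.mp3", "Brian Eno",
        "Another Green World", "St. Elmo's Fire", "Ambient")
print(lib.browse(artist="Brian Eno"))
```

Note where the fragility comes in: if a file moves on disk, the hidden index goes stale—exactly the breakage-prone inelegance the purists complain about, and exactly the price of the abstraction.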
iPhoto attempted to follow suit along the same philosophical lines, organizing photos not by filenames and folders but by film rolls and by their actual visual appearance, using resizable thumbnails to help organize. This effort has been less successful than iTunes, whose media is seemingly perfectly suited for the abstracted media interface metaphor. Photos work a little less well; they're a bit more unwieldy. Movies, when the time comes, will work less well still. But iTunes remains on top of the heap, a gold standard so inspirational that everyone wants to make their software work as well as it does, even if its content doesn't lend itself to iTunes-style organization.
But iTunes wasn't done changing people's outlook on software yet. Soon came new features like "Smart Playlists", which leveraged the queryability of the music database to allow the user to set up saved queries for particular classes of music, on whatever combinations of criteria they liked, which would always remain up-to-date—"80s Music", or "Never Listened", or "Classic Rock", all defined on things like ranges of release dates, the user's "star" rating, the play count, groupings of multiple genres, and so on. It's limited only by the user's imagination, and packaged into a metaphor like "playlists", it turns SQL into an Everyman's tool, making one's music collection sing and dance like collectors of vinyl or CDs could only have envisioned in sci-fi fever dreams. And it's this new functionality—the saved query—that's going to shortly revolutionize everyone's computing all over again. Mac OS X Tiger will include an iTunes-style database (Spotlight) that gives not just your files in the filesystem, but any app that wants it, the same data-organizing functionality that iTunes has. Tiger's Mail has "Smart Mailboxes", whose usefulness is obvious to anyone who's used a Smart Playlist. "Smart Groups" enhance Address Book. And in the Finder, "Smart Folders", stealing thunder from Microsoft's Longhorn (which for a long time has been aiming to do the same thing), free your very files from their folder-based strictures and let you group them according to your query-driven whims. We're already seeing various attempts to leap onto this new bandwagon barreling down on us—Google Desktop seems to want to provide a stop-gap before Longhorn gets here, and Thunderbird has its own Smart Mailboxes-workalike functionality—but Apple will get here with the biggest and earliest comprehensive solution. And it's all thanks to the iTunes revolution.
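The "saved query" idea above is worth making concrete: a Smart Playlist is just a stored predicate that gets re-run against the library, so its contents stay current as play counts and ratings change. A rough Python sketch, with the song fields and playlist names invented for illustration:

```python
# Each "smart playlist" is a stored query, re-evaluated on demand,
# so it stays up to date as the underlying library changes.
library = [
    {"title": "Once in a Lifetime", "year": 1980, "rating": 5, "plays": 42},
    {"title": "St. Elmo's Fire",    "year": 1975, "rating": 4, "plays": 0},
    {"title": "Heroes",             "year": 1977, "rating": 5, "plays": 7},
]

smart_playlists = {
    # "80s Music": a range of release dates
    "80s Music": lambda s: 1980 <= s["year"] <= 1989,
    # "Never Listened": play count of zero
    "Never Listened": lambda s: s["plays"] == 0,
    # criteria can be combined arbitrarily
    "Top Rated 70s": lambda s: s["rating"] == 5 and s["year"] < 1980,
}

def contents(name, lib, playlists):
    # Re-evaluating the saved query is what keeps the playlist current.
    return [s["title"] for s in lib if playlists[name](s)]

print(contents("Never Listened", library, smart_playlists))
```

Play a song (bump its play count) and the "Never Listened" list empties itself on the next evaluation—no manual curation required, which is the whole trick.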
Now, with all this insight exhibited so blindingly by iTunes, you'd think there would be a host of imitators by now. But the imitators that do exist are pale, halfhearted simulacra, desperately seeking some angle Apple has missed, and grievously wounding the simplicity of the concept in the very attempt. You can't improve on iTunes by adding hierarchical playlists, for example—the metaphor doesn't make sense, which is why Apple didn't put them in in the first place. All they do is clutter up the interface and make it less compellingly usable, for negligible benefit. "Skins" or "Faces" don't help either; iTunes' interface was deliberately chosen to be attractive but austere, eschewing the chaotic car-stereo layouts of so many competitors, which end up looking like nothing so much as elaborate spyware. Skins make the user focus on the interface, when the whole purpose of iTunes is to make the interface so natural that it blends into the background and becomes invisible. Very few voices today bemoan iTunes' lack of skinnability.
After a time, one gets the feeling that the competitors are dealing in Cargo Cult Software—they're merely imitating the surface elements that they believe make iTunes successful, but without understanding the philosophy that lies underneath it. Without that understanding, it's yet possible to make a fine, usable iTunes clone; but you're not going to make any leaps of further insight, following from the design ideals that informed the software's original creation, that bring it to the next level or create the Next Big Thing. All the minds put together that created WinPlosion couldn't have come up with Exposé in the first place. That took Apple.
Mac users have grown used to Apple software over many years; and with a number of notable exceptions (Apple's user-experience commitment waxes and wanes in regular cycles, I'm told by those on the inside), the company has demonstrated a continuous thread of philosophy under all the software it creates: a philosophy that software development is not about the software at all. It's about the people who use it, and getting them in direct contact with the media they're working with. The software that best exemplifies this ideal is the software that fades into the woodwork and becomes transparent, letting the user roll up his sleeves and work with the data without ever thinking about the tools he's using. (Having all apps look and work like all other apps is a big part of this.) True, some Apple software loses track of this vision; some of it is annoyingly obtrusive, or inexplicably rebellious in its look and feel. But there's still that common thread running through it all: the continuous inspiration of a company full of people who "get it", whose final product is a small subset of a grander vision, not just a point-for-point Cargo Cult imitation of something they've seen elsewhere. Just as Terry Brooks never seemed able to conceive of an idea he hadn't already read in Tolkien, so Tolkien took a far grander world to his grave than he ever revealed to earthly ears.
That's the kind of thing Apple's got going. Windows users have been given a glimpse of it with iTunes, and a good number of them have absorbed what it means to have an entire platform where everything works that well. There's just a different style behind the creation of Apple software, something ineffable that can't be quantified, and that nobody else has the chops to challenge. It's why I think it would be such a shame for Apple to disappear, unlikely though such an event might seem nowadays (boy, how far we've come in the last three years, eh?). Other companies might take Apple's place as the guardians of their various pieces of the multimedia pie; someone else might outdo iTunes, or someone else might make a better iMovie, or someone else might remake Mac OS X. But nobody can claim such a long-lasting legacy of commitment to a consistent software design philosophy as Apple can; nobody else's name is synonymous with "ease of use" dating back to the 1970s; nobody else fully knows the extent of Apple's vision, or would be willing to sacrifice market share (as Apple has) for the sake of sticking to that vision and its attendant ideals. Another Apple might rise... but it would be a mere shadow of the one we've known, the goose who continues to give us golden eggs as long as we don't demand to know how. For all our artifice we can't construct another once that goose is dead.
We'll give Glenn a couple of weeks. Then he'll be hooked for good. Heh.
Steven Den Beste found this photo-gallery of Apple devoteeism on CNet, culled from the pages of The Cult of Mac by Leander Kahney. Some of the shots are older (such as the multicolored Apple-logo tattoo) and some are speculative (the Globe Mac, the iWalk); but it's all the result of real creative expression, totally voluntary and totally heartfelt. Whatever else the Apple fandom may be, at least it's genuine. That's nothing to sneeze at.
I was just in the Valley Fair mall returning something; and I stopped in at the Apple Store to observe the post-holiday crush, which (the red-shirted employees assured me) was as light as it's been all day. It was a madhouse.
All hands were on deck, handling both new purchases (from gift cards) and, more notably, returns. In the ten-deep line at the five-abreast checkout counter, I saw that most people were carrying an iPod box—in some cases, two. The likeliest scenario, then, is that tons and tons of people got iPods from more than one person. There's no use for two iPods, let alone three.
Here's the interesting thing, though. The iPod is priced just right in order to be a very special gift in a middle-class family. We're talking a $300-600 price point here: the kind of thing you'd give as a graduation present, a wedding gift, or a milestone birthday. Here it is being bought by the truckload for Christmas... often by more than one member of the same family, not knowing of other members' plans. The iPod is just too perfect, too obvious a gift idea. And doesn't it say something about our economy that so many families are not only willing, but able to support buying multiple iPods for the same person for Christmas?
Surely this isn't the best news for Apple; I'm sure they must have looked at their pre-Christmas iPod sales numbers and muttered "This is too good to be true". And so it was. But it's not every day you get a neat statistical outlier like this, a real economic phenomenon, unfolding right in front of us.
I really should know this by now. But seriously: What kind of circumstances are they that can lead a person to look at these lyrics (to John Mellencamp's rewrite of the traditional folk song "To Washington") and have any reaction rise to his lips other than "You moron"?