|Friday, January 28, 2005
11:41 - The Last Good War
In response to this post about the Left's beatific nostalgia for WWII (exemplified by Doonesbury), George H. e-mails:
I'm of the opinion that the modern Left loves to reverently invoke WW2
because it's a safe way to claim they're not complete weaklings when it comes
to national defense and related issues. See, they can say, we were all for...
It also doesn't hurt that the Nazis were/are the perfect foes for the Left,
whose hivemind is notoriously emotion-ruled. Hitler and the gang were
downright demonic - right out of Central Casting - and made to order.
Also, hating the Nazis and celebrating their destruction in the post-1945
period was the safest thing one could do. In the past 200 years, no regime
has ever been so destroyed, scattered, vilified and comprehensively
liquidated as that of the Third Reich. Hell, even the most recent
Euro-conqueror before Hitler - Napoleon - escaped with his life and one of
Nappy's "Feldmarschalls" founded a royal dynasty in Sweden! The
Swastikettes, on the other hand, were hanged, shot, driven to suicide or
chased to the ends of the Earth.
GOOD, says I. But the point is that dancing on the grave of Nazism and
idolizing the war that dug that grave was a pretty risk-free political
endeavor after WW2. Very little chance that Otto Skorzeny was going to show
up at your door with a Luger.
Being comprehensively opposed to tyranny - you know, like a true classical
liberal - was another kettle of fish entirely. As we both know, a regime
just as bad as Hitler's survived the war and its bloodsoaked leaders mostly
died in bed (unless killed by each other or by Stalin, the worst snake in
the nest). Add to this the decades-long, collective blowjob administered by
"progressives" to the Soviet State and the right-thinking Liberal has a real
Solution? More WW2 nostalgia, please. Big Red was an ally, the eee-vil Nazis
were the foe, and everything was ideologically comfortable.
Also note the prevalence of the European theater over the Pacific - even
though we fought the Japanese for longer than the Germans. There's also that
messy Atom Bomb thing. And the far less snappy uniforms of the Japs. :)
Yup, I'd noticed that too. Mentioning the A-bomb inevitably makes people bring up Dresden. And that wouldn't do.
11:30 - Fundamental theorems
Mike Hendrix has posted a deft summary of the position that a lot of us hold, which we all may as well just link back to as the starting point for some new arguments. In the years since 9/11, our discourse has converged on these ideas, and while some might find parts with which to disagree (even in a principled manner, in some cases), there's not much there to argue with. Sometimes getting it all into one big pile helps perspective. I don't think many people take the time to remember how much it felt, on the evening of 9/11, like the world had changed—how we knew, instinctively knew, that the old rules of engagement and tolerance of global threats no longer applied, and that the burden of proof and the demonstration of necessity for preemptive war would not survive in their familiar forms in the new, changed world.
One might still argue that our priorities could have been arranged differently: that having the Europeans on board with the long-term War on Terror (through whatever concessions to them might be necessary) outranked taking out the immediate perceived threat of Iraq, and that Iraq should have waited if it meant sacrificing that global consensus. That's at least a principled position—though not one I agree with, because I don't believe the Europeans would have gotten on board with us beyond Afghanistan under any circumstances, as we can see from their relaxed dealings with Iran and their lack of outrage, or even surprise, at the moral implosion of the UN. I happen to think there really was no other good option than the one we undertook: the War on Terror might have been more easily won with the Europeans on board, but it can never be won without us calling the shots. Nobody else has the resources, or the willpower, or the clarity of purpose. Nobody else could stomach the hard, ugly transformative steps necessary to make Islamic terrorism die—not through short-term symptom-treating palliatives, but through the march of generations. Nobody else seems to share that vision, probably because nobody else shares our experience.
I don't expect agreement from those who don't have the same vision of what the world needs to be. I don't like to have to say they'd just better get used to it, either. But I think we do know what we're doing here. The Europeans shouldn't make the all-too-common mistake of underestimating our understanding of human nature. We have figured a few things out. All on our own.
AFP and the BBC may judge us harshly, but I'm confident that the history books will be more charitable. They'll have to be, because they'll be written in a better world.
|Thursday, January 27, 2005
16:40 - Cleaning Up Dodge
This is probably the most exhilarating thing I've read in months: an extemporaneous dictated essay by one Moses Sand. It's long, and parts of it are glib or quaint—but it's brilliant for all that. The essence of American democracy dictated to a colleague's microphone by a steely-eyed old cowboy squinting out over the badlands, invoking the names of Martin Luther and Gandhi and Mohammed right alongside Wyatt Earp and Johnny Cash.
"But just look at who's saying this about the Arabs. I'm not sure who those guys are but they are either misinformed... ignorant... or liars. What I find curious is that the things those guys are saying about Arabs today is exactly what Jim Crow gringos was saying about black folks fifty years ago in Mississippi. 'Why them nigras don't want to have to make up their own minds. They need organization put into they lives. Three hundred years of being told what to do, why they couldn't even organize a good church supper. They need to be told where to be, what to do and how to do it. And then they are content.'
"I can't tell you how many times I heard that as a child... only it came from the barber shop philosophers... people like Verdell McCutchins, from the hardware store, and not the so-called educated elites of today. Verdell had a thigh the size of a pot belly stove, and would slap it as he crossed his legs, as an exclamation point to some luminous revelation about how the country was going to hell in a hand basket. Everybody knows somebody like this... except maybe Republicans, who don't seem to get out very much. But it's amazing to me how similar Verdell was to what passes as our most educated minds nowadays. Some men actually pay extra to send their kids to special colleges just to learn to be that stupid.
If I told you how much I enjoyed reading this, to paraphrase Zaphod Beeblebrox, I wouldn't have time to read it (again).
13:46 - It's different out here
All eyes are really on Apple now. Here's the NYT with a piece showering praise on the Genius Bars in the Apple Stores—with nary a criticism or a "beleaguered" in sight.
In an age when human help of any kind is hard to come by, the eight or nine Geniuses on duty at any given time here are a welcome anomaly.
In fact, go to any of the 102 Apple-owned retail stores in the world and - if you are willing to wait - you will be treated to what is an increasingly rare service: free face-to-face technical support.
The walk-up assistance has existed since the first Apple Store opened in 2001, in Washington. Over the years, as the concept gained momentum, the bars have become what Ron Johnson, Apple's senior vice president for retailing, calls the soul of the stores.
"It's the part of the store that people connect to emotionally more than any other," Mr. Johnson said.
For the first few years, there was general mayhem around the Genius Bars. Customers would stand four or five deep, broken gadgets in hand, waiting to speak to an expert. Now there is an online system for scheduling free, same-day appointments. And for $100 a year, customers can schedule appointments up to a week in advance with the expert of their choice.
. . .
The stores in general and the Genius Bars in particular have been credited with creating a halo effect for Apple. The iPod owners who own PC's and go to the unrelentingly chic stores for an expert's help are often seduced by Apple's self-conscious hipness.
Paula Mauro, who lives in New York and recently spent several hours at the Genius Bar in the SoHo store, got that message when getting help with iPod-to-PC communication. As she sat at the bar with her 10-year-old son, William, who aspires to Macintosh ownership, it became evident to her that synching an iPod to a Macintosh computer is relatively seamless, while her three-year-old PC posed no end of technical challenges.
"The next computer I buy is going to be a Mac," she said.
Maybe there was once a time when this kind of thing was the norm... and industry trends have simply made such face-to-face interaction unfeasible. In a situation like that, all a company like Apple has to do to differentiate itself is recognize how unique such a concept is these days, and put in whatever investment it takes to cover the cost-benefit deficit the bean-counters would warn against. Four years on, we've got an international phenomenon on our hands.
The Apple Stores may well prove to be the best decision of the New Jobs Era. In these days when the Mac is less unlike a PC than ever before—same applications, same games, same video cards, same hard drive buses, same RAM, same connectors, same optical drives, all of which used to be proprietary—Apple needs a way to deliver a wholly different computing experience. The Genius Bar is, or has become, the answer. It's something only Apple has been able to pull off, too; the other computer makers that have tried boutique stores (Gateway, those Dell kiosks in malls) have failed to generate any cachet. But those big glowing Apple logos that stare you in the face when you round the corner in the mall... well, they're force multipliers for Apple products' already irresistible coolness. The buzz feeds on itself now; the customers line up.
I'd never have believed it if you'd told me this back in 2001.
Registration required, but it's worth it. Via Chris M.
13:26 - Reanimation
This is precisely the kind of thing that Roy Disney left his uncle's company in disgust over. Clearly the Eisnerians show no contrition or desire to change their ways; flailing for ideas for profitable new enterprises, all they can think of doing is harnessing dark arts to create grotesque undead versions of previously successful properties... even if they can't get a single ingredient of the original's secret sauce together.
No one wants to direct "Toy Story 3."
That's the word in Hollywood's animation world, where the third installment of the incredibly successful Pixar series has no director, writer or, possibly, stars.
My sources in the animation biz tell me that Disney, which will make "Toy Story 3" without Pixar, cannot find a director to guide the project.
John Lasseter, who directed the first two movies, will stay with Pixar after he finishes its last Disney-distributed movie, "Cars," set for release in 2006.
It's also undetermined whether stars Tom Hanks and Tim Allen will reprise their roles in the new film. The odds are that Hanks won't, but that Allen — who's made some successful family films at Disney — will.
Pixar just received four Oscar nominations for its current Disney-distributed film, "The Incredibles." "Cars" will mark the last collaboration between the two studios, since Disney's Michael Eisner has essentially told Pixar to take a hike.
Not smart, Mikey. Jobs holds all the cards here. If you can't see that, no wonder your company is sailing down the tubes. A company that creates art needs vision, and you have none.
I can see a new Disney arising ten years or so from now, after the old one has vanished (I don't expect Disney as we know it to see 2010): a small, scrappy studio headed by Roy, starting the way Walt did with high-grade shorts and stuff, defining a new style and a new market all over again. But Eisner's ossified Disney is moribund, and perhaps it needs to die—there's no turning back the clock.
Disney has the right to make sequels to all the Pixar movies it distributed, including "Toy Story," "The Incredibles," "Finding Nemo," etc. But there's a hitch — since Pixar developed all the animation materials to create the movies, it also gets to keep them.
In other words: Disney is now trying to hire another team of animators to recreate Buzz Lightyear, Woody and all the other "Toy Story" characters so that they look the same. It will have to start from scratch to reproduce Pixar's creative work.
The next step, of course, is to find a writer and director for the project. With Lasseter gone, my source says, "Every single animator of note has turned down the director's job. They don't want to cross Pixar. They've become the only deal in town."
One source told me that a possible offer had been floated to an assistant director who worked on Disney's straight-to-video traditional cartoon, "The Lion King 1½."
But even that film was a bastardization, since most of the creative people who worked on the original 1994 "The Lion King" were long gone from Disney.
Both the original "Lion King" director, Roger Allers, and writer, Irene Mecchi, are said to be now working on Pixar projects. Allers' last big project for Disney animation was "Kingdom of the Sun," the movie that became "The Emperor's New Groove" after he was unceremoniously replaced.
The entire debacle was recorded in a wonderful but unreleased documentary called "Sweatbox," made by Trudie Styler while her husband, rock singer Sting, was writing songs for the film which were ultimately cut from the final release.
Meantime, Disney announced last week that the script for "Toy Story 3" would be based on a proposal submitted to them by a young student in their feature animation story development program.
Man, I'd love to see that documentary! There's so much to the story behind The Emperor's New Groove that we've never heard in its entirety. This just makes it all the more tantalizing. (Especially because the final movie was, somehow, brilliant—even after being essentially recreated from scratch when it was already mostly done.)
And what a documentary they could make about Disney's supposed "feature animation story development program", eh? What is that, a cubicle in the sub-basement of the ABC Edifice in Burbank, in the parking lot of the hollow husk of the old, abandoned Feature Animation building?
Feh. You go right ahead and run the greatest animation company ever into the ground, Eisner. What a legacy you'll leave the world.
11:29 - Careful what you wish for
In the course of history, Manichaeism was ruthlessly eradicated as a heretical, ungodly doctrine. When looking at demographic statistics, however, one might think that the populations in developed countries have converted en masse to Manichaeism and decided to become extinct. The birth rate in most western countries has fallen below replacement level.
In the so-called "New Europe", the situation is even gloomier. According to UN projections, Latvia will lose 44 percent of its population by 2050 as a result of demographic trends. In Estonia, the population is expected to shrink by 52 percent, in Bulgaria 36 percent, in Ukraine 35 percent, and in Russia 30 percent. In comparison with these figures, the projected population decline in Italy (22 percent), the Czech Republic (17 percent), Poland (15 percent) or Slovakia (8 percent) looks like a small decrease. France and Germany will lose relatively little population, and the population of the United Kingdom will even see a slight growth -- thanks to immigrants.
. . .
Today, children no longer represent investments; instead, they have become pets - objects of luxury consumption. However, the pet market segment is very competitive. It is characteristic that the birth rate decline in the 1980s, and especially in the 1990s, was accompanied by soaring numbers of dog-owners in cities. While in the past dog-owners were predominantly retirees, today there are many young couples that have consciously decided to have a dog instead of a baby. These are mainly young professionals who have come to a conclusion (whether right or wrong) that they lack either time or money to have a child. Thus, they invest their emotional surpluses into animals.
Still, both day-to-day experience and the media frequently suggest that the quality of life enjoyed in the United States and Europe is under threat by population growth. Sprawling suburban development is making traffic worse, driving taxes up, and reducing opportunities to enjoy nature. Televised images of developing-world famine, war, and environmental degradation prompt some to wonder, "Why do these people have so many kids?" Immigrants and other people's children wind up competing for jobs, access to health care, parking spaces, favorite fishing holes, hiking paths, and spots at the beach. No wonder that, when asked how long it will take for world population to double, nearly half of all Americans say 20 years or less.
Yet a closer look at demographic trends shows that the rate of world population growth has fallen by more than 40 percent since the late 1960s. And forecasts by the UN and other organizations show that, even in the absence of major wars or pandemics, the number of human beings on the planet could well start to decline within the lifetime of today's children. Demographers at the International Institute for Applied Systems Analysis predict that human population will peak (at 9 billion) by 2070 and then start to contract. Long before then, many nations will shrink in absolute size, and the average age of the world's citizens will shoot up dramatically. Moreover, the populations that will age fastest are in the Middle East and other underdeveloped regions. During the remainder of this century, even sub-Saharan Africa will likely grow older than Europe is today.
. . .
Some biologists now speculate that modern humans have created an environment in which the "fittest," or most successful, individuals are those who have few, if any, children. As more and more people find themselves living under urban conditions in which children no longer provide economic benefit to their parents, but rather are costly impediments to material success, people who are well adapted to this new environment will tend not to reproduce themselves. And many others who are not so successful will imitate them.
So where will the children of the future come from? The answer may be from people who are at odds with the modern environment -- either those who don't understand the new rules of the game, which make large families an economic and social liability, or those who, out of religious or chauvinistic conviction, reject the game altogether.
And to think I used to send money to these people.
(Hey, they were at Earth Day at my high school. What's an impressionable do-gooding youth to do?)
|Wednesday, January 26, 2005
19:23 - If you get lost, zoom out to gain perspective
Here's to the crazy ones, eh?
Jef Raskin, the self-styled "Father of the Macintosh" (who was actually forced out of Apple by Jobs early on, before having any significant impact on Mac design), has just secured $2 million in venture capital funding to develop the UI ideas that got him kicked out in the first place: a "Humane Interface" in which everything we hate about modern computers is done away with... and everything we love about them, too.
Raskin's interface (demonstrated in this zoom-centric Flash file) apparently amounts to converting your computer into a giant monospaced word processor. Everything's keyboard commands. He wants people to learn to use specialized "command sets", entered using customized keyboards, instead of the mouse, because—as we all know—modern computer functionality is all about modal manipulation of text, not those silly Jobs-induced hallucinations of "music" and "images". (Raskin doesn't like the iPod, surprise surprise. Too "showbiz".)
In reading the various documents linked from this page, the Slashdot comments (and the reviews of his book at Amazon), all I can think is how many diligent UI designers out there are trying their hardest to come up with ways to make computers easier to use for the blind, illiterate, uneducated, or otherwise limited user. The approaches everyone keeps coming up with involve more use of the GUI, easily understandable icons and pictographs, and so on. Why do you suppose that is? Maybe because a UI that's entirely text-based and command-line driven is utterly unusable to anyone who can't interact efficiently with text. Even someone who's just a lousy typist would be crippled by this interface, which is like taking the excellent LaunchBar idea and extending it until it's the entire computer. What's great as a supporting navigation system in keyboard-literate, alphanumeric societies is not a panacea for all computing tasks. Think how well it would do in, say, China, where most of the population is functionally illiterate and the rest use a typography system fundamentally incompatible with ours and, by extension, with Raskin's.
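For what it's worth, the core of the LaunchBar idea is almost trivially simple, which is exactly why it works so well as a supporting tool. Here's a toy sketch of in-order abbreviation matching (my own made-up rule, not LaunchBar's actual algorithm):

```python
def matches(abbrev, candidate):
    """True if abbrev's characters appear in candidate in order
    (case-insensitive) -- the essence of type-a-few-letters navigation."""
    remaining = iter(candidate.lower())
    # 'ch in remaining' consumes the iterator up through the first match,
    # so each abbreviation character must be found after the previous one.
    return all(ch in remaining for ch in abbrev.lower())

apps = ["Safari", "System Preferences", "Photoshop", "iTunes"]
print([a for a in apps if matches("sp", a)])
# prints ['System Preferences', 'Photoshop']
```

Fine for picking an application out of a list whose names you already know; useless for cropping a photo or scrubbing through a song, which is exactly the part of computing Raskin's all-text vision waves away.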
The man seems to have trouble with basic technical concepts, such as when he explains that "There has never been any technical reason for a computer to take more than a few seconds to begin operation when it is turned on". Editorial reviewers take up the chant, agreeing heartily: "So why then does Windows (or Linux!) take so darn long to start up? The PalmPilot is on instantly, as is your cell phone. But for some reason, we tolerate the computer taking a few eons to start. (And until consumers complain about it, things won't change.)" Yeah, or until computers start having as few physical devices to initialize and load drivers for as the Palm Pilot, or something. Really—geez. Claims like this are embarrassing, or ought to be.
I mean, yes, I applaud the desire to rethink some of our foregone conclusions in computing; it would be nice to see some breaking down of the unnecessary barriers between things like "Applications" and "Documents", as Slashdot commenter TotalWimp says:
I've been asked on several occasions to help people find their missing documents. Naturally I've asked them "where did you last see it?" A surprisingly common answer is, "it's in Word."
I would ask them some more questions and they'd show me "exactly where it is" by clicking Open from the File menu of Word and showing me "where the document should be"... "right there in Word."
Sometimes they'd show me the list of recently opened documents hanging right off the file menu "in Word."
...But this guy looks like just the kind whose vision is sufficiently unmoored from reality to make him dangerous. He can sell an idea, clearly, and he has enough sane concepts in his book to win converts. But I don't think people are thinking the ideas of modern computing through any more than Raskin is; they're following his lead in simply ignoring any aspect of computer usage—like navigating your music, or Photoshopping, or using the Web—that doesn't fit into his 1979 vision.
Raskin is "the man" for UI?
In a word: no.
And, if you don't believe me, check out the Canon Cat. Really.
Post-Mac, and has NOTHING that is on your standard UI list. Big (BIG) flop.
Check out Raskin's ramblings -- boils down to "The UI should be vi; and people will love it". Especially, vi with dedicated function keys.
In a sense, he *is* right. It would be a better UI. But he *is* very wrong; people will *not* love it. So it's a non-starter.
But now he's got $2 million to play with, so maybe we'll see a real prototype soon. Personally, I look forward to it: I hope he brings it fully to fruition. That way, people will see exactly how easy it really is to do all the things we've become accustomed to in his Humane Interface: not very. Until that happens, though, we have Raskin's word to go on, and his salesmanship, and the false sense of sanity into which he lulls the reader with patently solid theories and appeals to common sense. That's where he springs his trap; and without a concrete interactive demo to see what he's really proposing, I don't think anyone really realizes how debilitating it would be.
But we'll see, right?
11:50 - Everything you ever wanted in a Mac—and less
Aziz spotted this development:
Apple has quietly lowered the price of some of its build-to-order components on the Mac mini, as well as offering a faster version of its optical SuperDrive. The Apple Store is now offering the Bluetooth/AirPort Extreme upgrade for $100 ($30 drop), while offering the Bluetooth upgrade separately for $50. Apple is also offering 1GB RAM upgrades for $325 ($150 drop) and a faster 8x SuperDrive (with both DVD+/-RW functionality) for the same $100 upgrade price as its previous 4x SuperDrive: the combination SuperDrive reads DVDs at 8x, writes to DVD-R at 4x, writes DVD+/-R at 4x, writes DVD+RW at 2-4x, reads CDs at 24x, writes to CD-R at 16x, and writes to CD-RW at 8x. A hard drive upgrade to 80GB (from 40GB) is now $50 ($40 drop). The estimated ship time on BTO and standard configuration Mac minis is 3-4 weeks. Update: The BTO SuperDrive combines a 4x/2x/8x DVD+/-RW mechanism with a 16x/8x/24x CD-RW mechanism.
This looks like damage control to me. Suddenly Apple realized they were under intense scrutiny from PC penny-pinchers, and must have decided that the Mac mini BTO upgrade process is no place to try to recoup loss-leader deficits. Apple's RAM costs have always been exorbitant, but they know at least in this case that every crossover buyer will be wanting to max out the RAM—and it's either face a flood of geeks bearing street-price generic 1GB DIMMs for Apple Store Geniuses to install on their valuable time, or risk alienating the very people they're trying to convince that the Mac can be cost-effective after all by insisting they pay through the nose to have it tricked out at the factory. Not a great PR move—certainly no better than suing sites leaking rumors.
At least this way Apple can sell some of its own RAM; at the earlier price, they must have realized that precisely zero tech-savvy buyers would opt for the factory upgrade, and would instead seek out just about any alternative—including the ones involving giving their cash to other RAM vendors and voiding their warranties with a flick of a screwdriver.
I know a lot of people who are probably glad to have waited.
11:30 - These enlightened times
San Francisco is one hell of a beautiful city. I love taking out-of-state visitors there, to walk out on the Golden Gate Bridge, look out across the sparkling bay to the gleaming skyline, to drive up the Marin headlands to where the old WWII anti-naval gun battery was and look down at the grandeur of the landscape and at the neat white grids of the residential parts of the city giving way to the sharp and dramatic spires of the downtown skyscrapers like a seismograph suddenly registering a new temblor.
However, there's a problem: I can't take visitors much closer than that. Oh, sure, we can hit places like North Beach and Fisherman's Wharf and the Castro; but drive down Market Street? Walk up to Union Square? Head down into the Mission District to see the Metreon and Moscone Center? Go over to the Civic Center for a show in the theater district? Unless you can duck from a Starbucks to a Taco Bell to another Starbucks and repeat the process all the way down the long blocks to your destination, you're going to be literally stepping over so many homeless people that you end up hating yourself for the very excess and frivolity of what you're venturing into the area to do. You feel like a plutocrat just for being able to afford a Subway without arguing with the clerk over the posted food-stamp acceptance policy.
I think it's worse in San Francisco than most places, too; one thing I noticed about New York, when I was there in October, was that as raw and crowded and under-construction and hurried as everything was, I didn't see a single panhandler from Times Square to Chinatown. The NYC subway system is missing walls and floors and looks like the skeletal remains of some steam-punk Jules Verne dystopia, but it felt way more wholesome than the space-age BART, somehow.
Chris' Australian family was just here visiting, and while they gushed over how much they loved sightseeing in San Francisco, the first observation they made, with great shock, was how many "beggars" there were. There's no denying it. All we could say in the city's defense was to point out that cities like San Francisco and San Diego are at least warm enough to be homeless in, and that the state's mental illness treatment policy has been such that everyone who has ever had debilitating drug problems, or can only barely fend for himself, ends up on the street. After all, you can't "institutionalize" people anymore.
All of which is by way of preamble to this post by Glenn Reynolds, whose wife made a documentary on this very subject, and who has some things to say—along with posted reader feedback in support—regarding the "de-institutionalization" movement. It in turn springs from this observation by Jeff Jarvis:
And the real issue isn't homelessness. It's insanity. The laws in this country make it impossible to commit and help even the obviously and often the dangerously insane.
I say that One Flew Over the Cuckoo's Nest is as much at fault as any politician, for it made the institution frightening and the people who run it bad guys.
Read Jarvis' whole post; his perspective is from the New York end of things, and he notes some reasons why the atmosphere there is different from here in the City by the Bay.
I'm not sure what political tradition is consistent with wanting to give a fair shake to institutional practices that probably served us better in the past than we like to let on nowadays, even ones that cost state taxpayers large amounts of money. But you know, I don't care. It isn't "compassionate" to allow people to be on the streets out of some perverse knowledge that "at least they're not being locked up", or a twisted and practiced revulsion at the idea of wanting to "clean up" downtown and make it "safe" for our bourgeois pursuits. It isn't "principled" to demand that the homeless make their own way in the world, when the vast majority of them aren't equipped to do so if they wanted to. This is not just a function of the system. It's an outlier to the system, something that the system needs to take explicitly into account when figuring out what is in our interest as a society.
I'd like to be able to walk down Market Street with friends from out of town with the same devil-may-care attitude that lets people crane their necks upward in Times Square, unconcerned with where their feet are going. This isn't because I want the helpless shuffled off to where they're out of sight, out of mind; I know that being tripped over is a lot worse than being the one doing the tripping, and just about anything has to be better. For everyone.
|Monday, January 24, 2005
00:20 - The year was 1984
Ever wanted to see what form the Stevenotes took back before they were Stevenotes? Back when Jobs wore a tuxedo and was clean-shaven and long-haired?
Back when Macintosh was only a word that people had heard in vague trade magazines and incomprehensible Super Bowl commercials?
Back when the term "insanely great" had not yet been coined?
Just watch him try to keep that smirk off his face. And thus was the bar set for the next twenty years...
UPDATE: B. Durbin adds the following fascinating observations:
1. As of yesterday, the Macintosh is old enough to drink.
2. Steve Jobs is adopted. (I found this out yesterday.)
3. Steve Jobs is also the most famous American of Arab
extraction. (Likewise yesterday.)
On that third point, it's so obvious when you look at
his picture twenty years ago, that classic "sheikh"
profile. Apparently, his birth sister is a novelist
named Mona Simpson, and his father an "unknown"...
UPDATE: The original link stopped working, so it's been mirrored and updated.
UPDATE: And since this is essentially a Mac Birthday celebration, Chris M. has this to say:
I do remember the first time I saw the Mac and read the quote "insanely
great"--if I'm not senile, it was in the big cover story in Byte in
1984. That was the story that sold me then and forever on the Mac.
Here's why: as you know, back then computers had two display
modes--Text Mode and Graphics Mode. Text Mode was driven from firmware
(I think) and it had one font. In fact, the word "font" was then
unknown to most computer users. Graphics Mode, if it existed on one's
computer, was of course a function of the special graphics card one
bought at great expense to run games and such like.
In the Byte article one of the programmers (Burrell Smith? Andy
Hertzfeld?) said something that got my attention for the century.
Something to the effect that the Macintosh didn't have separate modes
for these two forms of data, because everything on the display--text,
graphics, what have you--was ultimately a pattern of pixels and
therefore could be represented by a bitmap. Everything. And once they
realized this, the ability to put different fonts on the screen and
WYSIWYG and programs like MacPaint were just different exploitations of
the same core idea. The same core, liberating idea.
I don't know how others reacted to this, but for me it was as if someone
had turned on a 1000 watt lamp. I was awestruck at the simple elegance
of this idea. From then on, no matter what came afterwards, I didn't
care--I would be a Mac loyalist now and forever. That was it, for me.
To this day, whenever you turn on a Windows machine, the boot up screen
is in Text Mode. And then, if the monitor is a CRT, there's often a
loud click. That's the sound of the circuitry switching to Graphics
Mode. In the year 2005.
And that's the difference between insight and imitation.
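The unifying idea Chris describes can be sketched in a few lines of Python. This is a toy framebuffer, not anything resembling the Mac's actual QuickDraw code, and the glyph data is invented for illustration; the point is simply that "text" and "graphics" both reduce to setting bits in one shared bitmap:

```python
# Toy framebuffer: one flat array of pixels, no separate text/graphics modes.
WIDTH, HEIGHT = 16, 8
fb = [0] * (WIDTH * HEIGHT)  # every pixel is just one bit in a single bitmap

def set_pixel(x, y):
    """Turn on one pixel, clipping at the buffer edges."""
    if 0 <= x < WIDTH and 0 <= y < HEIGHT:
        fb[y * WIDTH + x] = 1

# "Graphics": a horizontal line is drawn the same way as anything else.
def draw_hline(x0, x1, y):
    for x in range(x0, x1 + 1):
        set_pixel(x, y)

# "Text": a glyph is itself a little bitmap (a made-up 3x5 letter 'T'),
# blitted pixel by pixel into the very same buffer.
GLYPH_T = ["111",
           "010",
           "010",
           "010",
           "010"]

def draw_glyph(glyph, ox, oy):
    for gy, row in enumerate(glyph):
        for gx, bit in enumerate(row):
            if bit == "1":
                set_pixel(ox + gx, oy + gy)

draw_hline(0, 15, 7)       # graphics...
draw_glyph(GLYPH_T, 2, 1)  # ...and text land in the same pixel array
```

Once everything on screen is just pixels in one buffer, arbitrary fonts, MacPaint, and WYSIWYG all fall out of the same mechanism, exactly as the article said.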
How long before PCs abandon the floppy drive, you think? What's the pool up to by now?
And why is it so easy to imagine the Wintel machines of the year 2018 still booting up in text mode?
Backward-compatibility is the PC's great advantage—and the huge ball chained around its ankle. Whereas Steve Jobs thinks nothing of breaking everyone's third-party software with every major OS release just because the engineers have reworked some key library from scratch, Microsoft cannot countenance disrupting the vast infrastructure of computers all across the world—from PCs to servers to embedded point-of-sale kernels—by changing anything. Any attempt to modernize Windows is ultimately hamstrung and converted into a skin-deep simulacrum of the change they'd originally yearned for. They can't afford to do the kinds of ground-level architectural redesigns that are par for the course in the Mac world.
That's why Apple will never replace Microsoft as the dominant platform in all the humdrum installations of the world... but it's also why there will always be a demand for an Apple to exist.
23:28 - [respek]
As if there yet remained any doubt as to the coolness of Adult Swim, tonight they devoted their bumps to Johnny Carson tributes, such as this one:
Johnny Carson died.
We used to watch him when we were little.
It was hard staying up that late, but when we did, we were happy.
Even though we didn't know why.
And though we didn't know it at the time,
He shaped our lives.
Thank you & Goodnight.
We're sure "The Tonight Show" will air plenty of
Johnny Carson clips this evening.
But "The Tonight Show" hasn't been "The Tonight Show" since
The REAL "Tonight Show" is "The Late Show".
Hosted by a fellow named David Letterman.
The ideas and opinions expressed in this bump completely reflect those
of [adult swim]... and you, the viewers.
No music; just silence. Very classy.
Oh, and later they plugged their Downloads page, full of wallpapers and MP3s of the theme songs from all their shows. Damn, this network rules.
15:46 - Segway II
Dean Kamen's at it again.
What I’m staring at on the lawn is the outcome of this process of innovation: a Segway with four wheels. Seems like a pretty natural product-line extension. Field is the first to acknowledge that the Centaur has a tremendous amount of HT DNA—the entire base, stuffed with tilt sensors and gyroscopes that allow it to balance, is an HT. But, he adds, it was hardly as simple as adding two more wheels. The steering, for example, is part mechanical, part drive-by-wire. Two sensors in the steering column calculate speed (from the throttle) and angle of turn (from the handlebar) and send the data to computers in the base. Integrating that information with readings from the tilt sensors and gyros, control boards adjust the speed of the independent rear wheel motors 100 times a second to keep the machine upright.
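The control loop the article describes — fuse tilt-sensor and gyro readings with the steering inputs, then adjust the independent wheel motors 100 times a second — has a recognizable shape. Here's a hedged sketch of one such iteration; the Centaur's real firmware isn't public, and the gains, interfaces, and proportional-derivative scheme below are all assumptions for illustration:

```python
# Hypothetical gains and update rate -- made up for illustration; the real
# Centaur control boards run a far more sophisticated version of this.
KP, KD = 12.0, 2.5   # proportional and derivative gains (assumed)
DT = 0.01            # 100 updates per second, as the article describes

def balance_step(tilt, tilt_rate, steer_angle, throttle):
    """One iteration of a simplified self-balancing control loop.

    tilt, tilt_rate: pitch angle (radians) and its rate, from tilt sensors/gyros
    steer_angle, throttle: drive-by-wire inputs read off the steering column
    Returns (left_cmd, right_cmd) wheel-speed commands.
    """
    # PD correction: drive the wheels under the lean to cancel it out.
    correction = KP * tilt + KD * tilt_rate
    base_speed = throttle + correction
    # Differential steering: slow one rear wheel, speed up the other.
    turn = steer_angle * 0.5
    return base_speed - turn, base_speed + turn

# Leaning forward (positive tilt) speeds both wheels up to catch the fall.
left, right = balance_step(tilt=0.05, tilt_rate=0.0, steer_angle=0.0, throttle=1.0)
```

Run at 100 Hz, each tiny correction keeps the machine upright; the steering term just biases the two rear motors against each other.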
Who'd buy one, though? Park rangers? We all know the tremendous impact they've had in making and breaking the pic-a-nic basket and blunderbuss industries in the past...
14:58 - Too cool to resist
Wow. Don't miss this graphic and analysis by Paul Nixon of how he sees the iPod and Mac lineups for Apple converging on this month's new product announcements, a nexus of low price and coolness that he believes will lead Apple to breakout numbers.
Now, I tend to think the guy's engaging in a little bit of "Revise hypothesis to fit facts; Backdate revised hypothesis; Publish" wishful thinking here. If there's one thing Mac pundits are known for, it's coming up with tenuous theories that explain any given current Apple product line as being the blindingly ingenious ideal sum total of a long-running corporate strategy—even if six months later a whole new line of products gets released that throws that model into a cocked hat. I think the current lineup is at least to some extent just more of the same: new products dreamed up less than a year ago in response to some boardroom projection describing a potential hole in the market where some dollars are being left on the table. I can't help but think that even though they were announced at the same time, and even though (as I noted) almost all the announcements in the Stevenote were of extremely low-priced products that would be immediately available, the convergence so neatly illustrated by Nixon here is probably more coincidental than not.
But even so, it can't be denied that the Mac mini is something Apple hasn't really done before; it's got to represent at least a moderately ambitious shift in Apple's trajectory and its attitude toward the markets that might interest it. Who knows—maybe this is what the Steve's plan has been all along, even before the G5 existed or the iPod had been conceived or the retail stores had opened their doors or the iMac had achieved its candy colors. (Maybe he meant for the Cube to fail all along. Yeah, that's the ticket.)
Either way, it makes for a great graphic to show anyone who thinks Apple is flailing or has an incoherent story to sell the public. At least that much is as slick as the image.
14:07 - Beware low-flying pigs
Do my eyes deceive me?
...Or has Aaron McGruder actually produced a sane comic strip? And one that's actually quite funny and incisive, at that?
I guess stuff like this is hard for even the Moore types of the world to ignore... or to convince themselves forever that in a battle between American values and anything, America must always be wrong.
Perhaps a sign of good things to come? I know I'm not holding my breath...
11:43 - Scanners make heads explode
It's my belief that the collective intelligence of the flatbed scanner industry has been rapidly and steadily decreasing over the past several years.
This is totally aside from the fact that I had to jump through all kinds of hoops to find both a scanner built in the past decade that supports A3-size/tabloid sheets (11x17") and costs less than $3000, and software that will support it (thank you very much, SilverFast, for deciding in your infinite wisdom that the Microtek ScanMaker 9800XL—the only such large-format scanner in the industry, and the direct product-line descendant of the 6400XL, 8600XL, 9600XL, and 9700XL, all of which you supported with your $50 consumer-level software—is in fact suddenly a "Pro" scanner and thus will only be supported by your $500 "Pro" software). Totally apart from all that.
See, what I don't get is how a company that has been making scanners for many, many years—very good scanners, in fact, well worth the $1000+ price tags—can have its illustrious engineers' brains seep out onto the floor with the simple passage of time, such that the newest scanners they make, while still being of quite high build and function quality, have completely brain-dead design decisions built into them.
Case in point: my Microtek ScanMaker 9800XL. I bought it to replace my old SCSI ScanMaker 6400XL, because my G5 doesn't support the PCI SCSI card that it required (and Mac OS X's SCSI support has been halfhearted and vague anyway). I liked the 6400XL just fine; it was speedy and reliable, and the bed had ruler markings molded into the rails, so I could align a piece of paper or Bristol board quite easily by just butting it up against the raised edges of the scan bed. SCSI though it was, life was good. The new scanner is large-format just like the old one, and it's FireWire-based. And it sold for $1400, same as the old one—and I got it at $1000. What a deal, huh?
Until I opened up the lid, and discovered this staring me in the face:
See that? See that? See it?
The rulers are embedded under the glass. About an inch from the raised edges of the scan bed, on both the left and front edges.
Can someone please explain to me exactly what reason in the name of Hell any scanner manufacturer would have to design a scanner in this way? Because the only function it serves, as far as I can tell, is to prevent me from being able to align a piece of paper correctly. I can butt it up against the raised edges, but then an inch of either the top or the side—or both—will be cut off. Or I can gingerly set it manually next to the rulers, so the whole sheet will be visible to the sensor—but then it won't have anything to align against, and will invariably turn out misaligned and send me into a cycle of five or six Preview passes before I get it right. Which is no fun at all when you've got dozens of pages in a batch job that all have to be scanned exactly the same way, with the same margins and registration.
What? What am I missing? What possible benefit can this arrangement serve? What graphic-design experts were they who prevailed upon Microtek over the years to abandon their age-old practice of allowing you to align your paper against the guide rails, and instead make you have to float the paper out into the middle of this glass sea, squinting at the light gleaming through the crack between the submerged ruler and the edge of the paper, getting it just aligned as right as you possibly can, closing the lid (and having the wind blow the paper off-alignment in the action), previewing the scan, and finding that it's a couple of degrees rotated from what you wanted? Is this an exercise instituted by Microtek to help us keep our eyeballs sharp and our bile ducts circulating in good health? What possible purpose can this serve other than to piss me off?!
It's not as though this is a measure to allow people to scan larger pieces or anything; there is still that raised guide rail, so anything that overlaps over the edges will still be raised as it rests on the rails rather than the glass. And it'll still be blocked out by those stupid rulers. I honestly can't think of a practical reason why this should be.
Oh! Oh! And note that there's a little message in the lower right of the scan bed, a note with an arrow pointing to the rectangular clear area between the front ruler and the front guide rail: KEEP THIS AREA CLEAR. Got that? No sticking a ruler in there to line your paper up against. No fair begging out of the pain we have mandated for you as a foolish buyer of large-format scanners, a product category of which ours is the sole occupant. We are Microtek, and we will find you out!
Someone? Anyone? Can someone shed some light on this? I don't like to believe that an entire company can be guided by the ideal of cruelty to its customers, but right now I don't have many other working theories.
So, anyway: after dealing with this for months now, and emitting only sporadic grumbles, I have finally come to the conclusion that this situation is untenable, and I have taken firm and decisive action to correct the egregious design of this scanner:
Ha! Screw you, Microtek!
And thank you to the good people at TAP Plastics, who cheerfully fabricated this thing for me out of 1/8" acrylic. And now I have a paper guide that fits neatly into the corner, covers up the rulers, lets me align the paper, and won't interfere with whatever mechanism it is that can't bear to have that area in the front of the scan bed made opaque. (I assume it's an optical mechanism. Or maybe it's where the Scanner Gnomes play rugby during the course of a scan, an activity crucial to the color-balance process. I don't know.)
And that's the end of that chapter.