grotto11

Peeve Farm
Breeding peeves for show, not just to keep as pets
Brian Tiemann
Silicon Valley/New York-based purveyor of a confusing mixture of Apple punditry, political bile, and sports car rentals.

btman at grotto11 dot com

Read These Too:

InstaPundit
Steven Den Beste
James Lileks
Little Green Footballs
As the Apple Turns
Entropicana
Cold Fury
Capitalist Lion
Red Letter Day
Eric S. Raymond
Tal G in Jerusalem
Aziz Poonawalla
Corsair the Rational Pirate
.clue
Ravishing Light
Rosenblog
Cartago Delenda Est



Sunday, August 4, 2002
23:51 - My Goldmember Review

I went to see Austin Powers: Goldmember tonight.

And after seeing this movie, I can safely say that I am now willing and prepared to go on a one-man personal worldwide crusade against cellphones. For you see, not one person on the face of the Earth appears to be capable of shutting off his blasted phone on the way in to the theater.

I am going to go to the entrance to each theater room at the multiplex, pat each person down as he or she enters, take any and all cellphones that they might be carrying, and put them all into a large wooden crate. Then I will fill the crate with quick-drying post-hole concrete, take it outside to the parking lot, remove the wooden sides, and begin to slowly and methodically demolish the concrete block with a sledgehammer, singing "Steel Drivin' Man" and various chain-gang pick-swingin' songs like the ones from the beginning of O Brother, Where Art Thou?.

Then I will seek out every person who has ever made a call to someone's cell phone, when the recipient of the call was in the middle of a heartfelt, involved, or otherwise valuable personal conversation with another person in real life, just so that the caller could say, in that hideous whining wheedling voice of piteous sycophancy, "What'cha dooooooooIN'?" And I will take each phone from each such caller, and I will reprogram it so that when he tries to dial any number, it will instead play back a detailed verbal description of the Persian Boat Torture-- the one that involves strapping someone naked and covered with honey onto a boat floating in the middle of a swamp full of hatching mosquitoes and flies, under the blazing sun, so that the person dies under the torment of about fifteen different horrific forms of pain that are otherwise indescribable in any kind of polite company. And just to be extra cruel, I'll put it on a randomizer so that on one in ten calls, instead of the Persian Boat Torture, the caller gets a recording of William Shatner's "Lucy in the Sky with Diamonds" or Leonard Nimoy's "Bilbo Baggins".

And then I will seek out the people behind the cell-phone service commercials-- the Verizon "Can you hear me now?" guy, and Carrot Top, and Mr. T, and Alf, and everybody who has shilled for these bloody long-distance phone ads which can't ever seem to take the hint and get their damn selves off my TV-- and I don't care if it gets me on Seanbaby's shit list to want to do this to Mr. T, but I will hire whatever muscle I'll need to in order to subdue these people, tie them down, force-feed them asparagus, and then wait until they fall asleep and put their hands in pans of warm water so that they pee in their sleep and wake up in a miasma of smell so horrible that they die of embarrassment and revulsion, and nevermore influence anybody who is going to be at a movie that I am seeing, or in a car where I am talking to them, or in line at Taco Bell where I am planning to get food, to spend that entire time with their bloody bleeding bloody blasted billions of blistering blue bloody barnacles on a cell phone ringing at top volume with whatever kick-in-the-head-inducing ring tone they've programmed it with, over and over and over and over and over again. If they can't wait until the movie is over before they have to call their friends to ask "What'cha dooooooIN'?", then they can consider themselves duly warned of my intentions.

Oh-- the movie. It was funny.
Saturday, August 3, 2002
03:58 - Hope for Rationality

One other NPR point-of-interest that I passed like a ship in the night on the way home was the always-interesting This American Life segment, which this time was a fairly in-depth look at life in the West Bank, from both the Israeli and Palestinian perspectives. They got Israeli high-school students dishing about dating; they got Palestinian blue-collar workers grumbling about being cooped up day after day due to the imposed-at-gunpoint curfews; they got a look in at the Knesset, where the right-wing Likud party HQ was full of supporters and life and energy, whereas the pro-peace-process Labor wing was all but abandoned, and the phones silent; they got a Palestinian-American developer working on building a large glass-walled shopping mall in a West Bank town, who spoke with careful nonchalance about how the Palestinian Authority and the big shareholders of the regional phone company had looted the company by splitting off the lucrative cellular division into a new company in which the PA and the bigwigs were the only shareholders-- on the very same day that the PA was pledging to the Europeans that the PA would be divesting itself of its private holdings in a show of good faith towards an honest free market. And then he had to hurry to hail a cab for the reporter and zoom off so he wouldn't get shot when a curfew was suddenly re-imposed.

But most interesting to me was the street-interview segment where they talked about a politically high-profile doctor named Mustafa Barghouti, who has built up quite a reputation for himself as the leader of a volunteer medical task force providing emergency care to people in the West Bank. To a man, every single person interviewed described Barghouti as "a good man-- a good, good guy". Presented as a "third alternative" (next to Arafat and the leader of Hamas), he seemed to be an ideal candidate, at least in our eyes, for filling Arafat's political shoes. He's moderate (he advocates non-violence and opposes suicide bombings on moral grounds as well as tactical ones); he's reasonable (he wants to see a return to 1967 borders, but his is a two-state solution that doesn't eventually end up being a one-state and Israeli-minority solution, the way other "moderates" want it); and he's secular. He's run for office before, and he's only lost because his opponent (his brother Marwan, who is much more extreme in his views, sort of a Malcolm X figure) was said to have fixed the elections-- which he only narrowly won anyway. It would seem that Barghouti would be an ideal horse to back.

Except for what the street interviewees had to say. Though they all loved Barghouti, none of them seemed willing to vote for him if he were to run. When asked who they would vote for if given a choice between Arafat and Barghouti, those interviewed said they'd pick Arafat. When pressed for why, they repeated the same phrase: He is our father; he is our symbol. (This could well be the result of fear of secret-police inquisitions, but considering how much stock these folks seem to put in symbols, I'm not at all sure that these sentiments aren't genuine.) And if given a choice between Barghouti and one of the Hamas mucky-mucks, the interviewees said they'd take the Hamas guy. Why? Because Hamas is religious, and Barghouti is secular. "Everything Hamas does is based on our religion," said one interviewee.

So there we have it. If we can take this as any kind of representative sample, a true democratic vote-- if taken tomorrow-- would probably still turn up Arafat as a winner. Even if people distrust him and find him to be corrupt and ineffectual, he's our father and our symbol. Opinions are opinions. But change? Nooo... we can't have that. I've seen this kind of mentality before. People will complain about a situation that is just bad enough to make them complain but not bad enough to make them want to do something about it. It's the art of keeping people on a knife edge. Microsoft has mastered it, and so has Arafat, apparently.

So on one hand, I'm cheered that people like Barghouti-- with blue jeans and a pin-striped shirt, leading non-violent chanting protests against curfew conditions-- exist in the occupied Palestinian areas. But on the other, I'm discouraged at the thought that the Palestinians are more concerned with preserving symbols than with forging themselves a better life.

But at least those poll numbers keep fluctuating. If a shakeup occurs, at least there's some nonzero chance of a rational, charismatic, secular leader taking the reins.

Not a large one. But larger than it used to be.

01:50 - Oh, do shut up...

The quote that everybody on all the world-conscious news reports kept repeating today came from George Bush, speaking at a Republican Party fundraiser:

We cannot allow the world's worst leaders to develop, and thereby hold hostage freedom-loving countries with the world's worst weapons.

Some MP by the name of Galloway then went on air to publicly liken America's leadership to "a giant with the mind of a child"; and, even more insufferably, he claimed that such a characterization should be obvious to "anybody who has heard President Bush speak just now".

You could cut the arrogant paternalistic superiority with a knife.

The BBC spent the rest of the report covering (and repeating four times) the breaking story that the WTC firefighters on 9/11 were not actually as heroic as they had heretofore been made out to be-- that on that day they were actually hampered by critical communications failures, out-of-date radio systems, supervision by incompetents who hadn't had refresher training in fifteen years, hierarchical chaos, internecine bickering, inter-office opacity, and any number of other accusations that, while nobody even in the FDNY would deny them, were... well, feel free to draw your own conclusions about the tastefulness of the glee with which the BBC World Service returned to the story. (Four times.)

Afterwards, on the domestic news, evidence was uncovered that the firefighters working in the South Tower had managed to penetrate higher into it than previously had been thought. Drawing on evidence from tapes of emergency transmissions made during the rescue operation, it was revealed that at least two firefighters were in fact able to reach the crash site on the 78th floor, prior to the building's collapse.

If we're being told not to attack Iraq by people like Galloway and the BBC World Service, then I for one consider that to be just as valid a reason for us to attack as any of Steven den Beste's best arguments.

If we attack, one of two things will happen: 1) The moment we start bombing, Tel Aviv will disappear under a nuke, or meteorologists in New York or Chicago will have to invent a new icon for "Anthrax clouds" or "Smallpox fronts"; or 2) we will take Baghdad without such a thing happening, but we'll discover that the WMD switches were armed, everything loaded and fully operational and ready to go off. Either way, we'll be proved right, and either way, we'll be taking the risks upon ourselves.

So the nay-sayers can just jolly well butt out.

13:47 - Just to cite an example...
http://www.dockfun.com


As the philosophical battle rages over Watson and Sherlock 3, I have to call up the example of software like DockFun!.

This thing is the coolest piece of OS X-specific shareware I've run across. I haven't tried it out myself, but people who have used it tell me it's indispensable and oh-so-much-fun. Just watch the Flash intro on the website and see what I mean.

But while I was watching it, the only thing that was running through my mind was, "Wouldn't it be cool if Apple were to integrate this kind of functionality right into the system?"

Why would this leap to my mind? Why would I want to see something created by an innovative, fun-loving third-party shareware developer subsumed into the default OS, presumably without any recognition or compensation?

Because it seems like an obvious thing for people to want. And it's squarely in Apple's development path to incorporate this kind of technology.

When I think rationally about it, no, I wouldn't want to see the developer's efforts brought low by a corporate edict rendering them meaningless. But deep down, viscerally, all development of this type for the Mac community seems to me to be a part of a continuum-- whether it's by Apple or by an independent developer, it all seems to be toward the same goal: making the Mac rock.

And when that presents itself as the goal, I just want to see it implemented in the most streamlined and elegant way possible (as an Apple-menu option, for instance, like the Location Manager, which switches your TCP/IP settings from location to location-- rather than as a Dock item taking up room)... and in the way that gets it in front of the most people possible. Everybody should have functionality like this. Just like everybody should have a menu-bar clock, WindowShade, and SOAP-based XML database access in customized client panes.

It'll take time, though. And just as I'm still waiting for Apple to incorporate all of the functionality that the Location Manager had under OS 9 into OS X (adjusting your volume and power-management settings, as well as a whole bunch of other preferences, depending on whether you're at home or in the quiet office, for example), I'll wait however long it takes for them to enhance the Dock in this way. They've already put in the foundations, responding to customer requests-- allowing you to put the Dock on the left or right, or to change the minimize behavior. So it's only a matter of time.

And until then, the DockFun! guy gets the vote of my money.
Friday, August 2, 2002
19:24 - Once again, we're stuck with cheaper instead of better

Kris just noticed an external DVD-RW drive hooked up to a computer in IT as he walked by. Upon closer inspection, he saw that it was connected via USB. USB 2.0, to be exact. And a good thing, too, because if it were USB 1.1, DVD write speed would be so slow as to be unusable.

So USB 2.0 has arrived, and what it means is that FireWire's speed advantage over USB has now been more or less nullified. FireWire's early lead in fields such as DV camcorders and mass storage was valuable, and a lot of those devices have been built and entrenched; but now that USB 2.0 can match FireWire's speed for one-way downloading of data such as digital photos or to-be-burned DVD data, and because USB 2.0 will be the shipping default on all new PCs, FireWire will cease to have a clear advantage; it will be relegated to second-class status and eventually die in obscurity. Like all good things crowded out by something cheaper and uglier, but backed by more companies bent on smashing all competition at all costs.

Those USB cables, as Kris explained, have to have ferrite beads-- those big, heavy, metal chokes-- clamped around each end, if they're going to carry large amounts of data. You know the kind. It's like having a huge bullet strapped to each end of the cable. Why is it there? Because USB only has four wires, and is an unbalanced specification. Your four wires are power, ground, transmit, and receive. The data wires are driven independently of each other, and their states are read relative to the ground wire. This means that you have to run it at a high voltage, like RS-232 (which operates at 12V). It means you've created a radio transmitter. Two unbalanced data signals oscillating relative to ground. Hence the big slugs of metal to try to shield the transmission effects.

Whereas FireWire has six wires-- power, ground, a pair for transmit, and a pair for receive. The two wires in each data pair carry complementary signals: when one wire is positive, the other is negative-- and they average to ground. The difference between them is read as the signal, not their voltage relative to ground as in USB, which means FireWire can operate at a much lower voltage-- somewhere in the neighborhood of 2V. And the two pairs are twisted, as in Ethernet, which cancels out any transmitter effects. So what you've got is a balanced design, one where the signals all average out to nothing. The four-wire i.Link version of FireWire just has the two pairs for send and receive, and no power.
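To make the balanced-pair point concrete, here's a minimal sketch-- my own illustration, not anything from Kris or from either spec, with made-up numbers rather than real USB or FireWire voltages-- of why reading the difference between two complementary wires cancels interference that a single wire read against ground picks up directly:

    # Toy demo: the same interference hits an unbalanced wire and a balanced pair;
    # only the pair shakes it off.
    import random

    bits = [1, -1, 1, 1, -1, -1, 1, -1]          # data to send, as +/-1 "volts"

    single_ended = []                            # one data wire, read against ground
    differential = []                            # complementary pair, read as a difference
    for bit in bits:
        noise = random.uniform(-0.8, 0.8)        # interference coupling onto the cable
        single_ended.append(bit + noise)         # noise lands right on top of the signal
        plus, minus = +bit + noise, -bit + noise # common mode: both wires take the same hit
        differential.append((plus - minus) / 2)  # the difference cancels it exactly

    print("unbalanced:", [round(v, 2) for v in single_ended])
    print("balanced:  ", [round(v, 2) for v in differential])

The balanced column comes back as clean +/-1s no matter what the noise does, which is exactly the property the twisted pairs buy FireWire.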

But six wires means more expensive controllers at the endpoints, so everybody sticks with USB. And that means transmitter noise, which means we have to strap these chunks of metal onto them in order to keep them from scrambling everything FCC-regulated in the house.

FireWire's native speed is 400Mbps, and USB 2.0's is 480Mbps. FireWire's speed can be bumped up by a factor of two, four, eight, and so on-- by using the same tricks Ethernet has been able to get away with, in order to jump from 10Mbps to 100Mbps to 1000Mbps. Signal can be sent on different pairs of wires with more complex components at the ends. But USB can only be sped up by clocking it higher, which makes the noise characteristics leap into the unmanageable. Sooner or later we'll have USB cables that have to be sheathed from end to end in lead, because hey-- we gotta have that speed, but we can't use FireWire. That's not the standard.
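Here's the same scaling argument as arithmetic-- a toy model of my own, following the post's simplification (real FireWire 800 and gigabit Ethernet also lean on smarter encodings, not just these two knobs):

    # Throughput modeled as (signal pairs) x (clock); two different ways to double it.
    def throughput_mbps(pairs, clock_mhz, bits_per_symbol=1):
        return pairs * clock_mhz * bits_per_symbol

    base         = throughput_mbps(pairs=1, clock_mhz=400)  # FireWire-400-ish baseline
    more_pairs   = throughput_mbps(pairs=2, clock_mhz=400)  # add wires: noise per wire unchanged
    higher_clock = throughput_mbps(pairs=1, clock_mhz=800)  # crank the clock: every edge radiates harder

    print(base, more_pairs, higher_clock)                   # 400 800 800

Both routes hit the same number; the difference is that only one of them turns the cable into a louder radio.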

FireWire will get faster, doubling and quadrupling in speed in very short order-- but it won't matter, now that USB 2.0 is at the level that we all oohed and aahed at when FireWire let us put 150 CDs onto an iPod in five minutes.

(And this is aside from other stuff that I haven't gotten into-- like how FireWire is a peer-to-peer/daisy-chainable protocol, meaning you can hook up all kinds of devices end-to-end-- camcorder to DV bridge to hard drive-- without even having a computer in the mix. You can even put your PC into FireWire target disk mode and access its disks from another machine. But USB is host/hub-based, meaning that if you want to hook up more devices than you have ports in your hub-bearing devices (PC, monitor, keyboard), you have to buy a USB hub and use up another A/C adapter slot in your power strip. And it has to go through a computer in order to work. But for most people's purposes, that's close enough to being a good design.)

It pisses me off no end to see an elegant and effective design shouldered aside by the big, dumb, lumbering competitor with fewer features, worse expandability, and significant engineering drawbacks-- just because it has the trump-card of lower price and corporate-backed ubiquity on its side. Power over the market is so much more important than putting the best solution into people's hands, after all.

No, I'm not jaded by this industry yet. But some days...

12:30 - My dear Holmes, stick it in your ear
http://www.macobserver.com/article/2002/07/29.7.shtml

I've been trying to stay out of this argument. The accusation that Apple's new Sherlock 3, as included in Jaguar, is a technological duplicate of Karelia's truly groundbreaking Watson application has very neatly split the Mac community down the middle, and I'm still teetering on the fence.

On one hand you've got the backers of Watson, including its developer, Dan Wood. Their contention is that Apple has pulled a tactic that's Microsoftian in the extreme: they acknowledged the innovative nature of Watson-- a consolidated and extensible framework for SOAP/XML tools that get you movie showtimes and trailers, TV listings, package tracking information, flight times, eBay auction listings, weather, baseball scores, and a host of other functionality, harnessing the design advantages of dedicated client software rather than the inherent clunkiness of the Web to access publicly accessible XML databases-- by awarding Karelia Software, the maker of Watson, the coveted Apple Design Award for "Most Innovative Mac OS X Software". And then, on the very same day that the award was given, Apple announced Sherlock 3. Which is almost exactly the same thing.



The contingent who are indignant over this, and quite understandably so, see this as a betrayal by Apple. Apple could have compensated Dan Wood to some degree, or even just put a credit to Karelia in the About page on Sherlock 3. They liken this action to Microsoft giving Marc Andreessen a coveted design award for Netscape, and then immediately turning around and releasing Internet Explorer.

But then there's the other side of the argument, which is fairly well represented. These people say that Apple was perfectly well within their rights to develop Sherlock, which already had a fairly long history as a web-wide information-gathering tool, into a SOAP-based information browser with customized interfaces for each tool. Design-wise, it's really not that much different from Sherlock 2-- it's just that the information it presents is more useful and better laid out. It leverages the same technological foundations, developed by Apple in Cocoa, that Watson does-- it just takes publicly available XML data, furnished by other companies, and formats them in a nice way. There's a minimal amount of effort involved in putting this stuff together. That's the whole point of Cocoa. It was a no-brainer for Apple to redesign Sherlock to take advantage of this new functionality, and they would have done so even if Watson weren't around. (In fact, if Karelia hadn't done it, this stuff is so easy that somebody would have created something just like Watson.)
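For what it's worth, the plumbing both camps are talking about really is thin. Here's a minimal sketch of the general pattern-- fetch somebody else's publicly available XML and re-present it in a client-side layout. This is my own illustration, not Apple's or Karelia's code, and the URL and element names are hypothetical placeholders, not a real service:

    # Hypothetical example: pull an XML "forecast" feed and reformat it for a client UI.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.com/weather.xml"   # made-up endpoint for illustration

    def fetch_forecast(url):
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        # Assumes each <day> element carries <name>, <high>, and <low> children.
        return [
            {
                "day":  day.findtext("name"),
                "high": day.findtext("high"),
                "low":  day.findtext("low"),
            }
            for day in root.findall("day")
        ]

    for entry in fetch_forecast(FEED_URL):
        print("{day}: {low}-{high}".format(**entry))

The interesting work-- the part Watson actually pioneered-- is the presentation layer wrapped around calls like that, which is exactly why the fight is over credit rather than over any hard-to-replicate technology.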

Or even (say these people) if Watson was inspirational to Apple, Apple has a long history of gradually folding into its OS the little tweaks and advancements made by third-party shareware developers. The menu-bar clock, WindowShade, and the Internet Control Panel were all third-party developments that Apple realized were so useful that they would be remiss if they did not include them in the core operating system. Sometimes they compensated the original developers, but usually they didn't. There was always a little bit of grumbling, but it was quickly forgotten as the new features came to benefit all users and be thought of as an indispensable part of the OS. Apple should be praised for seeing an opportunity for enhancing the user experience in an obvious new direction, rather than vilified for taking advantage of the poor third-party innovators.

These guys are accused by the first group of being Apple shills, of holding Apple to a double standard-- exonerating them of guilt for what they decry Microsoft for doing. And there's something to be said for that.

But there's a question one has to ask oneself. Does Apple have an obligation to sit on its hands and not develop some piece of technology, if there's an existing implementation of it out there that they might be stepping on? Or is third-party development inherently fraught with the danger that at any moment the OS maker might incorporate their functionality into the product (which, as long as it's not patented, is perfectly legal)-- and that it's their obligation to simply keep ahead of the curve?

iPhoto undoubtedly took some sales away from existing photo-manipulation apps and camera managers; iTunes has indisputably hurt the ability of MP3-player authors to sell their products. But in the latter case, Audion is a perfect example of a piece of shareware that keeps ahead of the curve. When a lot of their core functionality was co-opted by iTunes being integrated into the OS, they simply made their own product better. And now, while iTunes provides core music-playing functionality in an outstanding way, Audion is the only game in town if you want stuff like skins, album art, and alternative encoders and codecs. The makers of Audion understand what life is like in the shareware development world. You've got to stay hungry, or else you'll get eaten yourself.

And in any case, Apple's turning Sherlock into a Watson-like application for gathering Web-accessible data, and incorporating file search back into a quick adjunct to the Finder itself-- the way it always used to be, which is why it's called the Finder, for crying out loud-- is an excellent step. One thing that has pissed me off ever since OS 8 is that pressing Command-F to find a file fires up this big clunky application, rather than an instant search box. I think this is a perfectly reasonable direction for the design of the system to go, and the only unfortunate thing is that Watson predates it.

The Watson community, both developers and users, is lively and enthusiastic, and it's full of great minds and innovative thinkers. Naturally they feel slighted that Watson has been shafted by Apple-- or at least, the way they see it. Honestly, I agree to a certain extent. But I don't agree that Apple is under an obligation to acknowledge the influence of third-party developers upon their own development, especially when that development is an "obvious" direction for them to take. How does this differ from Microsoft seizing control of the Web by writing a browser and incorporating it into their OS? Not by much in the technical sense, but by quite a bit in the business sense. Microsoft consciously wanted to kill Netscape, because they saw the commercial potential in owning the Web. But Apple doesn't want to harm Karelia-- they want shareware developers like Karelia to keep innovating with new products like Watson which take full advantage of the opportunities afforded by OS X and Cocoa. But the direction in which they want to take Sherlock is one that they genuinely feel they'd be remiss in ignoring, and while committing to a new version of Sherlock that does what Watson does is sure to be a blow to Karelia, Apple considers that to be a regrettable but necessary sacrifice. There's no malice involved. Apple's actions are about functionality, not power. This is an argument about intent. That's where the difference lies, to me, though it's a nebulous and almost impossible-to-prove distinction.

Dan Wood can be proud that his innovation has influenced the development of Apple's core software. I know that's small comfort to him and to his loyal following, of which I'm a part. But if Watson were to be developed with the same fervor and the same love of creation that drives Audion, then a blow like Sherlock 3 or iTunes will not hurt so much-- it will just make the shareware community all the more innovative, by necessity as much as by desire.
Thursday, August 1, 2002
17:35 - Digital hub? Pshaw!
http://lowendmac.com/maclife/02/0801.html

Some Mac columnists fail to "get it" even worse than the PC-centric tech pundits. Like Jason Walsh of Low End Mac.

The digital hub always struck me as a ropey idea. It's not that I object in principle to people connecting digital cameras and camcorders to their Macs, it's just that I don't want to be forced to sit through the dross that they subsequently create.

The Apple propaganda machine has been going full tilt for the last while, informing us of the wonderful free iApps that come with every Mac. So what? I like to think that for an investment of over €1,000 I'd get something other than an operating system thrown into the box.

When Jobs announced iPhoto, I dutifully went to the Apple site and downloaded it. I looked at it -- and erased it. Yes, it's very nice, but I'm not going to give up Photoshop any time soon.

iMovie? Sorry, my Adobe Premier habit is too ingrained.

Where do I begin? This reads more like a troll than a serious opinion piece. How far removed is this guy from modern computing reality? For him to dismiss the value of the iApps, just because they don't appeal to him and his needs, and particularly on the basis of faulty assumptions about what the iApps are for, is just clueless arrogance.

Who ever said, for instance, that iPhoto was supposed to be an alternative to Photoshop? If that was the assumption under which he downloaded it, he was apparently reading his LCD monitor through polarized glasses at a 90° angle to the polarization of the screen, because one doesn't have to look too far to find clear descriptions of what it is for. iPhoto is supposed to work in tandem with Photoshop. Photoshop is a high-end image manipulation and compositing application. iPhoto is a digital camera manager. The two have very little overlap in their feature sets. iPhoto provides the most rudimentary of editing features, like red-eye removal and cropping and rotating, but for more complex stuff-- well, that's what Photoshop is for. But can Photoshop automatically read all the photos from your digital camera into a named and dated "film roll", and let you browse through them all visually or search by assignable and user-definable parameters? Does Photoshop let you order prints online or design and order a hardbound book of your pictures?

I use Photoshop all the time. I also use iPhoto all the time. I don't have to be some kind of computing genius to realize that they are designed for different purposes. Adobe GoLive isn't intended as a replacement for Microsoft Word, for Frith's sake.

And, okay, it's great that you use Premiere. Fine. I'm glad you do such in-depth work that you require the features it provides for a paltry $600. But, again, Premiere and iMovie are not intended for the same purposes. Premiere is widely used as a professional video-editing tool for creating finished contract work in many high-end studios. (Well, except for most of them, which use either Final Cut Pro or Avid.)

But iMovie isn't for that. It's intended to allow Mr. Husband to make home movies, and iDVD is intended to allow him to send them to Grandma. What do you think all those low-end camcorders are intended for, that have sold so well since the mid-80s? What about all those little film cameras that people used in the 50s and 60s? They're for home users who fancy themselves amateur filmmakers-- people who want to capture their families' memories, to immortalize the moments of their lives.

iMovie does that bloody well. And it's free.

Where exactly is the cognitive dissonance coming from? Apple provides consumer-level applications for doing genuinely useful, in-demand things, for free, on all their machines. And this guy is bitching about it? Look, just because you've apparently never used a digital camera doesn't mean you get to ruin it for the rest of us.

If Apple want to impress me, then they'd better write a HyperCard style iMedia and Homepage style iWeb tout suite.

... Excuse me? I'm sure this comes as just as much of a surprise to Apple as it does to me.

Totally leaving aside counterexamples like Final Cut Pro and all the high-end audio/video companies that Apple has been buying up left and right, and their outright ownership of much of that industry, where does this kind of demand fit into Apple's business plan? He's demanding that Cisco build a tractor, or Microsoft get into the lava-lamp business. (Well, maybe the latter isn't so far-fetched.)

The problem with the iApps is this -- they're not powerful enough. Okay, you say, but they're not aimed at commercial users. This is absolutely correct, and it's also the nub of my argument. I am genuinely concerned that Apple is beginning to neglect its core professional user base in the graphics and media industries. If Adobe ever pulls Photoshop, then the party's over. Macs will be stone dead as far as designers go, and mine will go out the window. Literally.

People talk about the "empowering" potential of the iApps, but having tools available to edit photos and video does not a professional make. The effect is more likely to be similar to that of Microsoft Word and PowerPoint -- where people like me were once paid embarrassingly small amounts of money to produce professional presentations and stationery, offices are now awash with printouts and presentations made by people who think that combining double underlining, bold, and italics is a good thing.

Ahh, here we see the problem. The guy is bitter about creative technology being given into the hands of the plebs. He thinks the functionality in iMovie and iPhoto should be enhanced and brought up to a "professional" product level, and sold at a high price. (Kinda like they're already doing. Except he wants that to be the only sales point.) He once had the kind of expertise that would have earned him a high salary, and now Apple is giving away what used to cost him thousands of dollars and lots of education time-- for free, with every Mac. This makes him fume.

Look, man, I feel for you. I really do. I understand your mindset. I know what it's like to have your bailiwick become democratized. How do you think I felt when WYSIWYG web-page editors obviated the need to know how to write bare HTML? How do you think I felt when AOL users got access to USENET, or when the ISP I worked for had to stop requiring people to have at least six months' experience working with computers before we would allow them to sign up for Internet accounts? How do you think car tweakers from the 60s felt when cars became something as reliable as the phone company, something you didn't have to install a new starter into every three weeks or spend every weekend tinkering with under the hood? Why do you think PC users throughout the world were so disdainful of the Mac when it first came out? Because what was once to them a secret, esoteric art-- "using a computer"-- was now something that was accessible to the common man. That can play hell with a guy's insecurities.

These very same arguments were made back when Apple introduced the first WYSIWYG text editor, MacWrite. Look at all those damn fonts! Look-- you can do underlines, italics, shadows, outlines-- gawd damn! My next English paper's gonna look like a ransom note! And many did.

But jealously guarding a piece of technology from getting into the hands of people who might be able to use it well and tastefully-- especially as the market comes to mature-- is elitist and arrogant in the extreme. It's empire-building. It's backwards-facing banana-republic power-hoarding. And it's exceedingly distasteful.

Apple made a conscious decision when they decided to bring out the iApps and support the digital hub strategy: they wanted to turn the home computer into an extension of the geek toys that every male person in the 18-55 age bracket buys. They're not ignoring the pro market-- far from it; one has to look no further than FCP, DVD Studio Pro, and Cinema Tools to see that (and if Walsh thinks iPhoto is intended as some kind of land-grab from Photoshop, that Adobe might take offense at and leave the Mac platform in a huff, he's simply not done his research). But Apple's core market, the one where they make all their money, and the one where they stand to present an attractive value proposition to potential converts from Windows PCs, is in the home consumer-- the guy with a digital camera, a DV camcorder, and two kids on a tricycle and a dog to wash and a vacation to Disney World coming up in the summer. This is where Apple saw an opportunity to make people happy.

That is what Apple is all about.

And if that happiness comes at the expense of grumblings from a few bitter techno-trolls who see their mystique slipping away into oblivion, then-- frankly-- so much the better.

13:09 - What is wrong with these people?
http://www.theinquirer.net/?article=4745

A particularly uninspired swipe at Apple from Andrew Thomas of The Inquirer:

PORTLY APPLE SUPREMO Steve iJobs today announced a new initiative which he claimed will consign the WinTel PC to history.

"The iDiot™ scheme is brilliant in its conceptual elegance and is available in six exciting colours," said Jobs at iApple iExpo i2002 at iNew iYork.

"Put simply, we've realised that our products are bought by people with money to burn who aren't interested in tawdry things like value, compatibility, software choice or performance. What our users want is to pay over the odds for colourful tat. They'll buy anything as long as it's purple, pink, misty buff or moonlight indigo, provided it costs twice as much as the PC equivalent.

Look, if you're going to make fun of Apple, fine. But can't you do it without looking like a complete barking moron?

Hasn't anybody noticed that it's been like two years since Apple offered any computers in multiple colors? And when they did, those machines were the iMacs and iBooks-- consumer machines priced almost exactly competitively with entry-level PCs of similar capability. And is this guy trying to claim that it was an unsuccessful scheme? Hasn't he noticed that it's only just now that the rest of the PC world has finally stopped thrashing about in their efforts to provide colored faceplates for their monitors and keyboards and speakers, and that multiple candy-colored and translucent products like water coolers and desk fans are still commonly to be found? As a sales gimmick, it was at the time one of the most wildly successful ever-- particularly in the sense that it became de rigueur throughout the entire industry almost overnight. Companies that were about to publicly sneer at Apple for their moronic publicity stunt had to backpedal quickly when it became clear just how badly people wanted their computers to come in colors.

But almost as soon as multicolored iMacs peaked in popularity, Apple was already moving on to new color schemes and beginning to phase out the multicolored thing in favor of frosted white and stainless steel. I always enjoyed pointing out that in the year 2000, the only things in the world that you could not get in iMac colors... were iMacs.

Is it so hard for these people to accept that the reason people are willing to spend more for a Mac is that the Mac offers them a better value or a technological advantage, rather than simply assuming that the only reason anybody would ever buy a Mac is that he has more dollars than sense?

Evidently so. After all, everybody who didn't make the same choice you did must be an idiot. Otherwise it might mean you had made an imperfect choice yourself.

At least this guy has a "Flame Author" link at the bottom of the page. Not that I think it's worth using or anything.


...Oh, and just when exactly in the blistering hell did Steve Jobs become "portly"?
Wednesday, July 31, 2002
18:39 - The winners write the history books, but the pioneers come up with the names

It has been seeming more and more strange to me lately how, within the framework of the technology industry as it has been passed down to us, we have come to use terminology and vocabulary that is completely arbitrary-- having no bearing at all upon what it is intended to represent or describe.

I'm referring to things like file formats. Because of one circumstance and another, and what formats happened to be public-domain and which ones were easy to implement and provided good quality/compression tradeoffs and features, we now find ourselves dealing in a market where we do not talk about "picture files", but instead we refer to "JPEGs" and "GIFs" and "PNGs" and "BMPs". Similarly, the movie files we use are "AVIs" and "WMVs" and "MPEGs" and "RMs".

Just put yourself in the shoes of someone who's new to computers. Does any of this make even a tiny bit of sense? Is someone really expected to learn all these acronyms, and which ones mean "pictures" and which ones mean "movies" and which ones mean "music"?

We've inherited these oddly-named formats because of circumstance. But they weren't the first formats to become widely used; far from it.

As should hardly come as a surprise, Apple was the first company to scout the uncharted fields of image and movie and sound media, and so they were the first to give names to what their customers would be using.

What did they call their picture files? Why, PICTs. As in, PICTures. Not some funky acronym, not something referring to a standards body or a working group or a company that popularized it. Nobody had to think about what it stood for or what kind of file it represented. It was a PICTure. No filename extensions, either-- just an icon that clearly demonstrated that this was a picture. Take a screen shot on a Mac, and you'd get an output file called "Picture 1". There was never any question about what format it was in. It was just a picture.

And PICT wasn't an inflexible format, either. Like TIFF, it could incorporate a variety of encodings and compression algorithms, and you could have PICTs with internal JPEG compression at arbitrary levels, or with color depth from 1-bit to 32-bit, including alpha.

PICT didn't catch on in the world at large, though. I'm not entirely sure why.

At any rate, QuickTime was the first movie-file format to really enable the desktop computer user to do video. Apple referred to the files as "movies", and so the filename extension that the files received (when saved for cross-platform use) was ".mov". As in, movie. (The Type code for the file, incidentally, was MooV.) There wasn't any question what kind of file it was; rather than giving it an implementation-specific extension like ".qt", Apple got to lay claim to the "movie" moniker because, well, they were there first.

Same with sound files. .WAV? .AIFF? No, the native Apple sound file was known as "SND". Sound. (UNIX vendors tried to do the same sort of thing, referring to them as "audio" files, with an extension of ".au".)

Leave it to the company whose computers don't need filename extensions to lay claim to the choicest plain-language extensions, eh?

But that's just it, though. It's a perfect illustration of the philosophical difference between the two schools of thought. Apple wanted to think everything through, to overengineer the user experience so nobody would have to deal with any technical trivia that computers should be able to deal with better than humans can anyway, and automatically. They designed everything so extensions would never interfere with a person's ability to freely name a file, without fear that it would break some mysterious app-binding spell deep in the bowels of the machine. And they wanted to make sure people thought in terms of pictures and movies and sounds, not in terms of JPEG and GIF and MPEG and WAV. Who needs to put up with that kind of useless trivia? Shouldn't computers be doing that sort of thing for us?

I've written before about how today we're entering a new phase of the same philosophy, where Windows XP has espoused a newly-discovered "task-based interface" concept, something that can be pretty well described in the terms I've just covered: thinking in terms of your content, not in terms of file formats or applications. Apple's iApps illustrate the new, modern incarnation of that same philosophy, the one they've been promoting all these years but now applied to modern media: iTunes lets you think in terms of music, in terms of songs and artists and albums, instead of in terms of MP3 files and folders and encoding and filenames.mp3. Likewise, iPhoto lets you think in terms of pictures, browsing them visually as well as by assigned meta-data and descriptions, not by obscure filenames with .JPG extensions. The pictures are grouped into albums and into named and dated "film rolls", not into anonymous folders. This is the task-based interface, as envisioned back in the early 80s with the first Mac OS.

Apple's job, as they see it, is to deliver abilities to their customers, not just features. They refuse to do anything half-assed. If they can't hide the technical trivialities entirely from everyone except those who want to work with them, they don't bother trying-- because they know what they uniquely bring to the table. What they offer the technological community is a philosophy, a way of designing computers so that those who use them can quite literally just sit down and get things done-- the operating system takes care of as many of the trivialities as it possibly can, and leaves to the user only those things that a human is uniquely equipped to do. And that's the actual creating and enjoying.

Apple has had to jettison some of its cherished ideals on some fronts; PICT is now relegated to the ash-heap of history in favor of TIFF for client-side uncompressed image manipulation, and JPEG and PNG and GIF are fully supported for export. QuickTime will play AVI and MPEG files as well as its own MooVies, and it will faithfully tack on filename extensions so Windows users won't choke on video content created on a Mac. And as ungainly as the MP3 name is, Apple has recognized the ubiquity of the format and adopted it in its own unique way: letting iTunes make MP3 files from CDs, filename extensions and everything, but pushing the organizational aspects of dealing with those files upward into iTunes itself, where the interface is all about the contents of the ID3 tags and querying the song database rather than filenames and folders. iTunes keeps the files and folders dutifully organized and named to match the ID3 tags as the user changes them in iTunes itself, and even numbers them according to your chosen name scheme so you can export them efficiently onto MP3 CDs for use with Windows... but during normal everyday use, it's not MP3s-- it's music. It floats through the fabric of the computer like electricity. It makes listening to music a complete no-brainer, without the slightest hint of technical expertise required. All it takes to operate iTunes is the ability to move a mouse and manipulate scrollbars. You don't ever have to have seen a folder before. You never have to think about one.
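That behind-the-scenes housekeeping, reduced to its essence, is nothing more exotic than this-- a toy sketch of my own, where read_tags() is a hypothetical stand-in for a real ID3 parser, and which has nothing to do with how iTunes is actually implemented:

    import shutil
    from pathlib import Path

    def read_tags(mp3_path: Path) -> dict:
        """Hypothetical stand-in for a real ID3 parser; returns
        'artist', 'album', 'track', and 'title' strings."""
        raise NotImplementedError

    def file_away(mp3_path: Path, library_root: Path) -> Path:
        """Move an MP3 into Artist/Album/NN Title.mp3, mirroring the
        tag-driven folder naming described above."""
        tags = read_tags(mp3_path)
        folder = library_root / tags["artist"] / tags["album"]
        folder.mkdir(parents=True, exist_ok=True)
        target = folder / f"{int(tags['track']):02d} {tags['title']}.mp3"
        shutil.move(str(mp3_path), str(target))
        return target

The point isn't the dozen lines of code; it's that the user never has to see them.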

The conquistadors gave the names of their Catholic saints to the towns they founded throughout the Southwest, even though today they're filled with McDonald's and Blockbusters and Wal-Marts. Half the South and half the East Coast bear Indian place-names and tribal monikers, though the people who gave those names are now long gone.

And if the computer industry ever comes to understand how important it is to design software that human beings don't have to think of as software, they will owe that discovery to Apple-- or else if they stumble upon it entirely independently, they will have embarked upon a laudable but ultimately unprofitable journey, one that has been traveled before and found to be no match for the marketability of unusable, if gaudy, fluffware.

14:24 - So now there's a liquor store involved...
http://zdnet.com.com/2100-1103-947358.html

(top)
Well, well. IBM seems to be building a new chip foundry.

According to this article, brought to my attention by J. Karl Armstrong, IBM wants to refocus its business away from its money-losing arms and bolster its chipmaking prospects with a new foundry for forging chips "for other companies". (The money pit they refer to as IBM Microelectronics is, I would guess, the hard-drive manufacturing sector, which I have heard they'll be dumping quite soon-- though not their consistently cutting-edge research into drive mechanisms, which always seems to yield the best new miniaturization and data-density technologies.)

The article doesn't explicitly mention Apple, but I would have to see a fair bit more negative punditry before I dismiss the possibility that the thrust of this venture might be to ramp G4 production and G5 development back up to the pace they need to be at, following a buyback of AltiVec from Motorola by Apple. Or it could just as well be for the possible POWER4 chimaera that we're hearing about in sidelong whispers.

I've also heard whisperings that there is a mysterious new manufacturing facility being built at the Apple campus in Cupertino, and that Motorola employees have been spotted going in and out of it. I don't know if I buy that one as easily, since I walk past most of the Apple campus on a daily basis here at work, and I've seen no such black-shrouded construction. I hope it isn't simply that someone saw the Peppermill having been bulldozed down (good riddance to awful food, I say), and assumed that the new building going up in its place-- with its de Anza frontage and its across-the-parking-lot proximity to One Infinite Loop-- was a mysterious new Apple lab.

Whatever ends up happening in the near future with Apple's chip supplier juggling, I'm less and less convinced that Motorola will remain a player of any note.

Unless a miracle happens.

13:48 - File Sharing-- the way it was meant to be

(top)
This morning, I was copying an image file-- a silly Photoshopped gag image-- from my desktop machine at home (where I had originally saved it off the Web) to my iMac in the office so that I could show it off to my friends there. In the middle of doing so, I realized that there's a whole lot about that process that I take for granted as a Mac user.

Long ago, back around 1985, when "workgroups" were a pie-in-the-sky office networking dream, and long before anybody had ever considered running FTP or Web servers on their desktop computers, Apple was creating something called AppleTalk. This was a zero-configuration, transport-independent, routable, point-to-point networking protocol that could support file servers and printer sharing with the click of a single button. In the age when only the elite of the computer industry had ever programmed TCP/IP settings or assigned an IP address, AppleTalk pointed toward something light-years beyond what IPv4 promised.

Imagine-- you open up your server browser, and every server in your current routing zone automatically shows up, listed by its plain-English name; you can switch to a different zone if you want, or double-click on one of the servers and be prompted for either guest access (which gives you limited access to the resources there) or authenticated user access (giving you free rein over those resources, as configured by the server owner). Then, once you have access, the server's resources appear on your desktop as though they were just more disks, and you can drag files to and from them to your heart's content.

This wasn't just a simple file transfer mechanism either, like FTP or HTTP. This was a complete meta-data-preserving protocol, in which custom icons and resource forks would be dutifully copied, Type and Creator bindings would be maintained, and date stamps would be preserved where it made sense. You weren't simply opening a new file with the same name as some file on a remote system and then dumping data into it; you were making a total duplicate of the resource, with all its attendant details, that you could use on your local system just as though you had copied the file there using a floppy disk. Remote resources became part of the local system while the remote server was mounted; files created by applications that lived on remote servers, when double-clicked, would automatically connect to the server and locate the application so they could open in it. You could even create aliases to remote resources which would automatically dial the modem, connect to a remote network, mount the resource, and open it with your stored privileges.

AppleTalk machines, when they came online, would automatically assign themselves an address from the available space, and through broadcasts would immediately appear in everybody else's browse lists. Stories abounded about how fun the testing of this protocol was; Kris tells of how at one point, several dozen machines in a lab were all artificially preprogrammed with a zeroed-out AppleTalk address, and then brought online at once. Every machine broadcast to see whether its own address was unique-- and got back replies to the effect that no, it certainly was not. Every machine then simultaneously jumped to a random new address and broadcast again. Some inevitably still collided, and had to jump two or three times. But, as the story goes, everything had settled out within about eight seconds; everybody was on a unique address and ready to talk.

All this back in 1985.
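If you want to see why it settles so quickly, here's a toy simulation of the claim-an-address-and-jump-on-collision dance-- my own back-of-the-envelope Python, which ignores timing, wiring, and the real packet formats entirely:

    import random

    NODE_NUMBERS = range(1, 255)   # a toy address pool, not the real AppleTalk space

    def settle(node_count: int, seed: int = 0) -> int:
        """Each node probes a tentative address; if it's already claimed,
        the node jumps to a random new one and tries again. Returns how
        many probe rounds it takes for everyone to hold a unique address."""
        rng = random.Random(seed)
        claimed = {}                                          # address -> node
        tentative = {node: 0 for node in range(node_count)}   # everyone starts zeroed out
        rounds = 0
        while tentative:
            rounds += 1
            for node in list(tentative):
                addr = tentative[node]
                if addr in NODE_NUMBERS and addr not in claimed:
                    claimed[addr] = node                            # probe unanswered: keep it
                    del tentative[node]
                else:
                    tentative[node] = rng.choice(NODE_NUMBERS)      # collision: jump and retry
        return rounds

    print(settle(48), "rounds for four dozen machines to sort themselves out")

Run it and the whole population sorts itself out in a couple of rounds; add real-world broadcast and reply delays and a handful of seconds sounds about right.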

If you had a collection of pictures on one Mac, you effectively had them on all your Macs-- because you could always access them from any of the machines.

Other companies recognized the importance of this kind of connectivity; Microsoft built AppleTalk server support into Windows NT, and AppleTalk routers became entrenched into college campuses the world over. A student could spend many happy hours perusing the various zones throughout the campus and giggling at the clever names other students had assigned to their Macs. (My favorite, at Caltech, was "Bhoutros Bhoutros Duo".)

Eventually, though, two things happened to render AppleTalk irrelevant except among Mac users. The first was the widespread adoption of TCP/IP. And the second was Windows SMB/CIFS file sharing.

TCP/IP became the default Internet protocol because of its easily scalable routability. While AppleTalk was indeed routable, its address space-- designed for browsing rather than for direct connections-- was too cumbersome and imprecise for widespread connectivity to really take hold. (Besides, it only really applied to Macs.) All the major services powering the nascent Internet ran on UNIX servers, which meant TCP/IP was the language of FTP and the Web and e-mail, and soon both Windows and the Mac had full TCP/IP implementations in their operating systems; on the Mac this was in addition to AppleTalk, which remained on board for LAN networking-- nice and convenient, and a lot easier than having to deal with TCP/IP settings, but only useful for talking Mac-to-Mac within an office environment.

And by this time, Windows had incorporated SMB into its networking suite, enabling its users to do just about everything that Mac users could do with AppleTalk. (Almost.) SMB shares in a zone could be browsed; remote apps could be run without being installed locally (well, unless they wanted to muck with the Registry, which they often did); users could set up multiple shares on their machines and define labels for them and for the entire networked computer; shortcuts could point to remote shares and mount them automatically.

But SMB, like AppleTalk, was not easily scalable, and routing between zones was nearly impossible. SMB was designed to be much more flat than AppleTalk ever was, and while hierarchies of domains and workgroups could be set up, routing SMB traffic outside the LAN was (and remains) the subject of night sweats among IT administrators all over. You can't file-share to your Windows machine at home, for instance, from your Windows machine at work.

Apple realized that the world had sidestepped their plans for networking and had gone down a different road. They knew that AppleTalk would not survive as a non-IP protocol, because more and more new routers being implemented didn't support (or the administrators didn't bother to configure) AppleTalk routing. The elegant hierarchical zone approach, with its named lists of servers, wasn't going to work in a world of direct point-to-point client-server connectivity. So, in Mac OS 8, they shifted gears, and out came AppleTalk/IP.

It worked just like AppleTalk, as far as the file-sharing and the authentication parts were concerned. The big difference was that instead of having to browse through a list of machines on the LAN, you could now specify a direct IP address or hostname, and it would connect directly to that machine-- no matter where on the real, live, TCP/IP-based Internet it was. Mount a friend's drive in Rhode Island from your laptop in the San Francisco airport? No problem. Grab those movie files off your home machine's desktop and show them to people at work? Go right ahead. Windows users who wanted to fetch files off their machines remotely had to install FTP servers, wrestle with IIS, and shuffle files into the right place for sharing before ever leaving the machine's presence; Mac users simply connected to the remote machine and mounted the share they needed.

And in Mac OS X, it became even simpler-- setting up shares became nearly irrelevant, as the system's multi-user nature asserted its dominance over what had previously been a confusing mess of assignable per-user privileges. Now, a guest user (if allowed access) could only mount each user's Public folder, inside which was a Drop Box that he could copy files into (but could not open)-- a neat way for users without accounts on the server to send files to the server's owner or to other users without any risk. But an authenticated user could mount not only other users' Public folders, but also his own complete Home folder, or (if he had Administrator privileges) the entire disk. Or other disks in the system. And once that share was mounted on the remote user's desktop, he could copy files to and from it as though it were a local disk.
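In plain Unix terms-- and this is just a sketch of the conventional permission bits that produce that behavior, not a peek at Apple's actual setup code-- the arrangement looks roughly like this:

    import os
    from pathlib import Path

    home = Path.home()
    public = home / "Public"
    dropbox = public / "Drop Box"

    public.mkdir(exist_ok=True)
    dropbox.mkdir(exist_ok=True)

    # Public: owner gets full access; everyone else may list and read (rwxr-xr-x).
    os.chmod(public, 0o755)

    # Drop Box: others may enter it and create files (write + execute), but the
    # missing read bit means they can't list what's already inside (rwx-wx-wx).
    os.chmod(dropbox, 0o733)

Roughly speaking, everything else falls out of those bits plus ordinary account authentication.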

With AppleTalk/IP, Mac users have their long-time capabilities back, scaled to match the modern Internet. If we have files on one Mac, we have those same files on any Mac-- regardless, now, of what transport connects those machines, or even where in the country they might respectively be. I can sit on the couch downstairs with my iBook, browsing the image files from my desktop machine upstairs while connected wirelessly via AirPort. I can grab my in-progress book chapters from the home machine and copy them to my computer at work without having to do anything to the home machine but connect to it. The only configuration involved in AppleTalk is flipping the "on" switch, as always.

And now it's just part of a pantheon. We've also got "on" switches for the Web and FTP servers, for SSH connections, and even for impersonating an SMB server to other Windows machines (when Jaguar arrives).

But it's still not quite good enough for Apple. You see, it's flexible-- but it's not elegant. It's not truly zero-configuration, like AppleTalk was back in its early conception. It still has to deal with DHCP servers, TCP/IP settings, gateways, masquerading, leases-- it's just not as seamless as it could be.

Well, good news: it's all coming full circle, with Rendezvous.

We're on our way back to the original promise of networking that we had in front of us in 1985. When Rendezvous is here, not only will we not have to configure any network settings (all the machines will negotiate working settings among themselves whenever anybody appears on the network), but sharable services will build themselves into browseable lists for everybody on the network to peruse. Within the local zone, iTunes playlists and iChat partners will automatically make themselves available to each other's machines, Mail will notify us when someone who has sent us a message is on the network and available to contact directly, printers will automatically configure themselves without any setup beyond plugging them in-- and that's just the first iteration, before we've discovered what this can really mean for us.
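Rendezvous, under the hood, is multicast DNS plus DNS-based service discovery; as a rough illustration of what "browseable lists of services" means in practice, here's a sketch using the third-party python-zeroconf package as a stand-in (my own toy, not Apple's code), watching for AFP file servers-- the _afpovertcp._tcp service type-- on the local network:

    import time
    from zeroconf import ServiceBrowser, ServiceListener, Zeroconf   # third-party: python-zeroconf

    SERVICE = "_afpovertcp._tcp.local."   # personal file sharing, as advertised over mDNS

    class Listener(ServiceListener):
        def add_service(self, zc, type_, name):
            print("appeared:", name)

        def remove_service(self, zc, type_, name):
            print("vanished:", name)

        def update_service(self, zc, type_, name):
            pass

    zc = Zeroconf()
    browser = ServiceBrowser(zc, SERVICE, Listener())
    try:
        time.sleep(10)   # watch the neighborhood for a few seconds
    finally:
        zc.close()

Swap in a different service type and the same few lines would watch for printers or shared playlists instead.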

Meanwhile, I don't find myself deprived at all of functionality and ease, since I was able to simply hop over to my home machine and grab that picture off the desktop. I'm sure this capability will eventually make it into Windows, and when it does, everybody will hail it as a great advancement that they're not sure how they were ever able to get by without; but I've taken it for granted all these years. It's easy to lose sight of the magic of a piece of technology if you use it every day of your life, enough so that you come to depend on it.

Every time someone tries to send me a file over the fitful and crash-prone ICQ transfer protocol, or has to put a file up on a Web server somewhere, or has to fire up Gnutella or KaZaA to try to get a reliable cross-network point-to-point transfer going, I'm struck by the wistful feeling of something beautiful that we'll never be able to enjoy to its full potential-- and it's made all the more stark every time Marcus tells me "Hey, check your Drop Box-- I put another couple of files in there for you".

It could all be this easy. Somewhere out there, there's an alternate Earth where it is.



UPDATE: As clarified by Kris, the big AppleTalk settling-out test actually occurred in 1989, when the new Ethernet-based implementation of AppleTalk (called EtherTalk) was being developed. Prior to that, AppleTalk was designed to operate over serial cables, and because of the characteristics and limitations of serial topologies, the settling-out behavior wouldn't have been anywhere near as dramatic as it was over Ethernet.


Tuesday, July 30, 2002
19:23 - I actually thought it might end up this way...
http://www.koenighaus.net/indepundit/archives/000789.html

(top)
...For like three days.

On September 13th, 2001, I caught the first plane out of SFO after the airports reopened, so that I could attend my brother's wedding in Atlanta. He and his fiancee had had it planned for the better part of a year beforehand, and yet when everything changed on the Tuesday right before it was all set to happen, they decided that the show must go on.

The time between Tuesday morning and Thursday afternoon-- when I raced from the nearly abandoned San Jose airport, where my flight was supposed to have originated, up to San Francisco, where (with no small amount of surprise at their own system) they were able to get me onto one of the first planes to taxi from the terminal, at a little after 3:00 PM-- was a time when we all spent a whole lot more time inhaling than exhaling. We didn't know what was next. The Golden Gate Bridge? The Space Needle? A cloud of anthrax over Los Angeles? The confident among us said "pshaw" and explained scientifically why such-and-such thing couldn't happen. The jaded gamer d3wds made up tasteless anagrams of "Osama bin Laden" (I still like "damn labia nose"). But most of us simply sat there getting our jobs done and occasionally glancing out the window, nervously reloading cnn.com every few minutes.

As we collected our wits, we started thinking about possible futures. Anything seemed possible. New York flattened by a nuke? Disneyland turned into a garden of corpses? Would the America of 2002 be one where Mr. and Mrs. Thompson from down the street had to pick their way from bombed-out rubble pile to bombed-out rubble pile as they searched for mementos or food, wrapped up in rags to keep out chill arctic blasts brought on by nuclear winter?

Somewhere over Texas, at 30,000 feet, was where I dismissed that vision.

The South was still the South; Atlanta was still full of life and laughter. We saw the Martin Luther King memorial, we fought the crowds at the big semi-outdoor mall known as the Underground, and we toured the Coke Museum (with its freaky international flavors and its bizarrely endearing refusal to mention what the original Coca-Cola mix was used for). The sun was still bright, the cars still honked, we still had to pay full price for admission.

And the wedding still brought families and friends together and let people share memories and hopes and dreams, almost as though nothing had happened. And when their car pulled away from the church, it had PROUD TO BE AMERICAN NEWLYWEDS scrawled across the windows.

By the time I got on the plane on the way home, the scenario the Indepundit lays out was long gone from my mind. It just wasn't going to happen that way. Ludicrous. Not on this planet Earth. Not in this dimension. Not in this country.

As the plane descended into San Jose, I'd had a lot of time alone to think, staring out the window, the Space Ghost and Brak songs on my laptop long ago having exhausted themselves; I'd had a lot of things go through my head. I'd thought of tanks rolling, troops advancing, impassioned international meetings and summits and condemnations and condolences and complaints and support. But the buildings in downtown San Jose, as the plane glided over them on the way to the runway, seemed anything but flimsy. They were there for good. Even the plane I was on didn't feel like it could knock them down.

And as the wheels touched down, the thought that went through my head was, simply, What the hell were they thinking?


13:24 - Arr, ye scurvy gods
http://corsair.blogspot.com/2002_07_01_corsair_archive.html#79595446

(top)
Corsair has some snarky comments about people's religiously-bent reactions to the rescue of the miners in Pennsylvania.

What it might have been was men and women and machines and processes working the way that they were built, trained, and employed. These people did not just show up there at random and begin drilling. They had a plan develped by men and carried out by deceidedly ungodlike individuals.

Why then is God getting all the credit and none of the blame? He did, in fact, move that old mine that was filled with water 300 feet in closer to the new mine so that the miners would break through and almost drown. That was the real miracle! Oh, he didn't do that? That was a faulty map drawn by men many years ago? I see. well, couldn't he have intervened back then when the map was being drawn and fixed so that the nine miners wouldn't become trapped in 2002? A little proactive miracling would seem to work wonders.

It's a question as old as the genre of edgy cynical fiction (Dante up through Vertigo comics): Why is every good thing that happens in the world a "miracle", and every bad thing that happens simply "God's will"? Why does God allow bad stuff to happen in the world at all?

I think what we have in this case is simply a question of vocabulary. "God", as invoked by these people, is a concept-- a shorthand for the good will of humans and for luck. You don't see people holding up signs saying "Thank the rescuers!" or "All hail the rescue equipment designers!" or "We had good fortune!" or anything, but everybody pretty much knows that's taken as read. Thanking "God" is a well-understood and comfortable way to express relief that everything went right.

There could have been some serious injuries down there that claimed lives, for instance. The equipment could have broken down in a freak accident resulting from poor maintenance or clumsy usage or whatever. But it didn't; and the ingenuity of human beings, while fully capable of solving a problem like this in an ideal world (or even in a realistic one), was not hampered by unforeseen and inexorable events. That human genius was left free to work as intended. And that's what has left people relieved.

I think what this shows is that we're a humble species by nature, and we have a pandemic guilty conscience. We're really not all that high-and-mighty in our attitudes when there's a disaster to face. Deep down in our biology, we're conditioned to accept loss, to write off death that results from accidents or disasters over which we have no control. We know to look the other way. We move on. We don't blame God as a societal bloc for things that go wrong, because as humble and fearful as we are conditioned to be, we assume that such a reaction would just be inviting more misfortune. Instead, we react with surprise and joy when everything turns out all right. We have a lot of emotions to express in that kind of circumstance, and the way we're conditioned to do it is to recognize that Fortune was kind-- or in other words, that "it was a miracle".

I really don't mind it when people use this kind of vocabulary to express their feelings of relief-- none of it is ever intended to trivialize the ingenuity or the efforts of the humans involved. I don't think anybody in the town thought the rescue could have happened without those things-- or if they did, they're a statistical outlier, like the people who believe that all modern medicine is quackery and only prayer can cure disease. Most people are rational. Even sports stars who thank God for getting them to the World Series know that what they're really saying is that they're relieved that everything has gone right so far, that despite whatever hardships or injuries or disadvantages were in the way, they got there anyway-- and what they're treating as a miracle is the fact that none of it was bad enough to force them to quit. That's the modern sense of "miracle": that nothing fell out of the sky to screw it all up beyond recovery. Not a trivial concept, that.

But that said, I commiserate with your frustration, me matey. There's so much in this world that would seem so much easier to deal with if we could just call everything by its right name: human kindness, perseverance, ingenuity, quickness, endurance, selflessness, and dumb luck.

But given the choice between having things the way they are now, and having a world in which we treat everything turning out right as the default expectation and throw irrational tantrums, railing against misfortune or evil, when something like 9/11 occurs, I would pick the former.
Monday, July 29, 2002
18:15 - Gee-Ffffffff-f-f-ffizzle
http://macbuyersguide.com/hardware/system/2002_pro_g4.html

(top)
Here's a technical rundown of what's likely to be the case design in the new Pro towers that are supposed to be released later in August.

The meat of the article, though, is the following bit about the CPU:

There will be single and dual processor CPU options, most likely based on the 7470 Power PC chip, as opposed to the legendary G5, which some sources suggest may never be released. Indeed, in an attempt to get to the bottom of the story behind these persistent rumors, Mac Buyer's Guide spoke to Motorola Canada president Frank Maw in on July 25, 2002. The way he explained things, G5 processors aren't even on the company's radar screen any more. In fact, he doesn't even mention the company's desktop processor business in his corporate presentations and told us there is "no timeline" for future non-embedded PowerPC family processor releases. He maintains that he's not aware of any timeline and says it's "not a big focus" for the company at this point. And so it goes.... 

Well, there we have it. Unless there's some hellacious disinformation campaign going on, from a company that really stands to gain nothing from such a tactic, Motorola's pretty much shut the book on the PPC development story. Sure, they'll keep selling what CPUs they have, as long as Apple can make use of them. But this isn't the speech of a company that has world-crushing surprises lying in wait; this isn't what you say when you're biding your time and licking your lips before you whip the cover off something that's supposed to knock the collective socks off everybody in the area. This is the tired, broken leave-me-alone growl of an executive who knows it's the end of the line.

I think we can pretty much rest assured that however much information we have on Motorola's chip-making future, Steve Jobs has a whole lot more. And I think we can take it as read that he has at least one good contingency plan in the wings, and he's only waiting for the right moment to spring it into action, careful not to make of it too big a deal or to time it in a way that would deflate confidence in Apple rather than bolster it. Jobs isn't a stupid man, nor is he unrealistic. His blue-sky envisionings might not owe much to earthbound limitations, but he knows the realities of selling Macs in the market in which he now finds himself.

From the analysis to which the above quote links:

The final clue that Apple is switching its emphasis from PPC hardware is the decidedly software-oriented thrust of the announcements at the keynote speech by Steve Jobs at Macworld New York in July 2002. No speed bumps at all --but the company clearly showed its intent to turn software into a profit center, with the announcement of "no upgrades" pricing of US$129 for a point upgrade to Mac OS X (10.1 to 10.2); a new version of QuickTime costing $29.95 to enable full-screen viewing (the upgrade disables previous $30 "Pro" product keys); it costs $40 more to add MPEG 2 support to OS 9 and OS X. And then there's its US$99 per year ".Mac" subscription service. All of these items suggest that the company's focus is not on the hardware, but on the software. Although Steve Jobs has a reputation as a hardware guy, I think he's taking the company's new "Switch" mantra to heart....

It's not that Apple will soon splinter into hardware and software divisions, or that Apple will cease to make its own boxes and instead market OS X for generic Intel boxes, as an alternative OS like Be or OS/2. I don't think that will happen. Too much of Apple's fundamental strategy is based on the "whole-widget" engineering approach for that to be an option.

But some big change is coming soon. Whether it's Apple buying the PPC from Moto and setting up its own fabs, or switching to Intel or Sparc or POWER4, we're going to see a change, probably within a year. This current story has played out its final act, and there's nothing more to say.

16:58 - Windows Moment of Zen

(top)
"Doctor, look! I... I think it's trying to communicate!"



What does the Windows attempt to say? It interrupts your work to let its pleas be known, and yet it speaks with a muddled tongue, and its message is lost in the cacophony of modern life.

Meditate upon the truth of the message, focus your mind and rid yourself of emotion and desire, and the hidden meaning will become clear:

"Out of Disk Space"

13:02 - Moyger?
http://www.macblog.com/comments.php?id=14_0_1_0_C

(top)
MacBlog has a spit-take on the Apple/StarOffice thing that toys with the idea of a Sun/Apple merger, something that's been bandied about over schnapps many a time-- people always seem to like to imagine who could merge with whom, rather like how people in the early 1990s would fill up the Tolkien newsgroups for months on end with their wish-lists for what actors should be cast for some hypothetical Lord of the Rings movie. Sean Connery as Gandalf! Val Kilmer as Aragorn! Leonardo DiCaprio as Legolas!


Would this be cool? Apple's bouncing baby XServe would be backed up by Sun's massive iron, launching the company (er, Snapple? Is that taken?) into the corporate and education server market with Sun's expertise and Apple's scrappy little OS (or some Solaris/OSX hybrid). Sun adds Java know-how, applications, vertical-market solutions, high-end hardware, swoopy multimedia and business accounts. Apple would add all of its consumer know-how, digital video apps and its ability to market stuff and make it whizzy. Together they might even become unlikely corporate heroes of the open source community.

Meh. I don't know... it seems to me that if Apple and Sun were to merge, it would be out of desperation-- or at least it would be impossible to make it appear that it were anything else. Two beleaguered minority computer makers, both of whom happen to be waving the UNIX banner (always a great unifying cause in the past, hyuck hyuck), realizing that they're powerless to fight both Microsoft and each other for market share. Of course they'd merge. All the UNIX vendors will eventually merge. They'll spiral into each other like a black hole, but they'll be shrinking faster than they can accrete new material, and before you know it they'll be a pinprick in space.

Nah... I don't think it's any likelier than it was for Apple and IBM to merge, back in the Taligent days. Apple wants to be Apple; just look at how fiercely they're guarding their identity just against the NeXTies in their own midst.

Corporate alliances I can see happening. But not mergers.

12:45 - The Open Source Essays
http://www.denbeste.nu/cd_log_entries/2002/07/OpenSourcepart1.shtml

(top)
There's a series of essays over at USS Clueless, well worth reading, on the nature of software development and how open source fits into it, sparked by a one-on-one meeting with Eric S. Raymond. There's good stuff there.

I've been of the mind for a long time that open-source software works best for infrastructural, well-defined, server-side software that is intended to serve a purpose according to a spec (e.g. an RFC). But it doesn't work very well for consumer software that requires delivery deadlines, UI guidelines, corporate alliances, codec licensing, and so on. In other words, open-source works great for servers, but it's miserable for desktops.

(By this token, I've noted how Linux makes a dandy server but a joke of a consumer desktop-- while Windows makes a perfectly fine desktop but a laughable server. The design goals of the two genres of computing are entirely different, and they're respectively best served by different models of software development.)

Then there's the question of software bloat and Moore's Law, and the position that software bloat is more a conscious result of coding for reliability and quick time-to-market than it is a consequence of too many undisciplined kids being hired off the streets.

There are numerous ways in which this tradeoff plays out. For example, most embedded code is written in C, which doesn't hold your hand and pretty much requires you to do everything yourself. More modern languages like C++ and Java offer considerable assets to the programmer in terms of handling automatically things which a C programmer would have to do directly. On the other hand, you have to load a much bigger runtime system (consuming memory) and a lot of the code generated by the language runs less efficiently than would comparable C-code (decreasing execution speed). A competent C++ programmer could probably finish his program sooner and it would usually have far fewer bugs, though, and if the hardware can sustain the lowered efficiency of the code, it's still probably the right thing to do.

His points are good, but I'd feel better expanding on them a little. Yes, it's true that as codebases grow by orders of magnitude, so does the risk of bugs and of delay. But there are ways to mitigate that. Object-oriented programming, for instance, is a design discipline whose entire point is to make large-scale projects behave like small-scale projects by reducing the number of interfaces between modules that the programmer has to deal with manually. But den Beste is right-- OO code is slower than non-OO code, and so software that's written to take advantage of OO design principles might get to market quicker and be more bug-free, but it'll also be slower. As a matter of fact, that's a good deal of what's behind OS X's perceived slowness: Cocoa is a very OO-heavy development platform.

But, again, there are ways to mitigate this. The way the development process works these days in an OO-heavy project is for the first iteration of something to be done in as high-level a coding style as possible, using as many OO elements as possible, and delivering the promised features in a timely manner without too many bugs. This release will be sluggish in execution; the second iteration is where you start optimizing. You take the components that represent the biggest performance hits and you re-code them in C. Then, in the next iteration, you take the C parts that cause the biggest performance hits and re-code them in assembly. The newest and riskiest code remains OO, but as the code matures and the engineering team comes to know it well, it evolves downward to lower-level implementations that are a lot faster-- and a lot less risky to write now than they would have been if written that way at the outset. New OO features will tend to suck away the performance benefit gained by the low-level optimizations, but in a well-managed project the result will be a net gain in speed and in functionality.

The downside of this development process is that it takes a long time and the initial impressions in the public are of something that's very slow. But, once again, sometimes a company's goal is more about feature delivery than about absolute speed; and besides, speed will always increase with time. Whether it's because of better hardware or better-implemented iterations of the software, the apps will get faster as they mature. And in the meantime, we've had those cool features to play with.

It's not a viable development plan for all the classes of software that den Beste mentions, or even for most of them. But for consumer multimedia software, it does a pretty good job, I think.
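For what it's worth, the "find the hot spots before you drop down a level" step at the heart of that cycle looks about the same in any language; here's a throwaway Python illustration of the measure-first habit (my own sketch, nothing to do with Apple's or anybody else's real codebase):

    import cProfile
    import pstats

    def naive_hot_loop(n: int) -> int:
        """Deliberately high-level 'first iteration' code: clear, correct, slow."""
        return sum(i * i for i in range(n))

    def main() -> None:
        for _ in range(50):
            naive_hot_loop(100_000)

    # Measure the whole run, then list the handful of functions that are
    # actually worth re-coding at a lower level in the next iteration.
    profiler = cProfile.Profile()
    profiler.enable()
    main()
    profiler.disable()

    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)

The output is a ranked list of where the time actually goes-- which is the only sane basis for deciding which pieces deserve the C (or assembly) treatment in iteration two.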

09:29 - Yeah, what he said.
http://www.spokesmanreview.com/news-story.asp?date=072802&ID=s1188630&cat=section.bu

(top)
David Saraceno of The Spokesman-Review says pretty much the same thing about Macs that I did last night:

I only use one word processor, and it runs just fine. I don't need another one.

Sub-argument: PCs are faster. They sure are. In fact, I read recently that a Pentium III can outrun a Pentium IV in certain operations. But my word processor runs plenty fast on a Mac.

Ever done video, MP3s, or DVDs on a PC? Have fun. A couple of years ago, an acquaintance of mine bought a $2,700 "preconfigured" Gateway computer to create digital video tapes. Months, hundreds of dollars in long-distance tech calls, and a bunch of gray hair later, he somehow got Gateway to take it all back for a complete refund. Then he bought a cheaper Mac-based system, and never made another support call.

What's more, Mac programs like iMovie, iTunes, iDVD and iPhoto are all FREE on a Macintosh, and work right out of the box. But then again, there are more word processors available on the PC.

He brings up a frequent bone of contention: the smaller pool of software available on the Mac. It's been the subject of plenty of jeering from PC users in the past, both in its own right and when Mac users point out that having less software to choose from can be a benefit. "Ha! Look at the silly Mac users, trying to pretend that having less software is an advantage!"

But really, let's be realistic for a moment. How important is it for there to be fifty photo-organizer apps available rather than five? What's the point of having a dozen word-processing programs when everybody only uses one? Sure, it represents a wider user base for there to be more shareware and commercial apps available for a given task. Sure, there's a whole lot more choice. But in the real world, what's the benefit? If everybody chooses a different application for a particular task, doesn't that just contribute to stagnation and gridlock, as everybody tries in vain to interoperate with each other? And if everybody standardizes on the one clear victor in the field (like WinAmp or Word), then what's the point of having all that choice?

Sure, competition spurs innovation. But that can occur just as easily with three competitors as with thirty, especially if each of those competitors has a bigger piece of its respective pie to worry about. Besides, Apple and the third-party software developers on the Mac, as I've said before, don't appear to have had much difficulty demonstrating innovation above and beyond what the laws of the market would seem to dictate. And when's the last time Word was forced to incorporate new innovations because of pressure from competitors?

Of course, we know what people really mean when they say that the PC has more software: they mean "more games". That's all any of this comes down to. It's all about the gaming. It's all about the different flavors of crack. Never mind how a computing platform might inspire a person to create; all that really matters is how a platform might enable a person to consume.

It's in circumstances like this, though, that I love noting how the people who bitch at the Mac for its lack of games are the same people who scoff at it for being a "toy".
Previous Week...


© Brian Tiemann