Future Shock

Okay, I have this overwhelming urge to write that I must indulge.

In 1998, when I was opening the Borders store in New Orleans, I told my trainees that within five years CDs would become obsolete. They would be replaced by solid-state media. I got it partially right. I did not foresee the iPod, and at the time I was not familiar with MP3s. CDs still exist, but their sales have declined drastically over the last decade or so. The original iPod had a hard drive, and even though there were some flash-based MP3 players on the market, the iPod worked well and stored a lot more music. It would be almost four years before Apple released the Shuffle, followed a few months later by the Nano. The album, as an individual artifact, became less and less relevant as consumers shifted toward a song- and playlist-based way of organizing and listening to their music.

For a few years now I have mulled over the changes in computers. Someone once asked me what would supersede the Pentium processor now that clock speeds had leveled off. “Multiple cores,” was my reply. Now we have dual-core mobile phones running at clock speeds that best my first four computers combined, and quad-core phones are forthcoming (if they have not already hit the market). That I did not predict. Granted, it is all part of a larger push to increase revenue and get consumers to buy a new phone as often as their wallets will allow, but it is still indicative of things to come, and I feel like exercising my prognostication skills.¹

The size and portability of mobile phones lend their design to modularity. I think the mobile phone, or something very much like it (e.g., the iPod touch or Nokia N800), will become the central hub of our computing lives, and I mean that rather literally. With enough processing and graphics power, the mobile phone will take advantage of keyboards, mice/touchpads and external monitors, essentially becoming the home computer. Untether it from these accessories and it reverts to its smartphone persona, still carrying its user’s preferences, data and programs. If it is a work phone, it gets plugged into a peripheral infrastructure² at the office, something like a laptop dock.

I think that for most users, this scenario will be more than adequate for day-to-day use. They will carry their computer around in their pocket, and when they need to type something quickly, or manipulate images on a larger screen, they will plug it into the dock. It could be almost any dock, too, and not just the ones at home or work. Internet cafes and libraries could offer docks for occasional or student use. The full-fledged desktop computer and laptop will still exist for those who really need or want them (me).

And then there is this idea of a “smart monitor” that I have been toying with, and I am virtually certain that someone has already implemented the following scenario. If I went out right now and bought an iPad, a stand, a Mac mini, and a Bluetooth keyboard and mouse, I could take it all home and within a few minutes have what amounts to a full-fledged system. Using remote desktop software like Splashtop, Screens or LogMeIn, my iPad would in fact serve more than adequately as my monitor. Gaming and smooth video playback might not be options, but everything else that a desktop can do will be. SAS? Well, let me just fire up my Windows virtual machine and get to it.

The point of this thought exercise is that in the near future, the latency issues that currently exist in my iPad/Mac mini setup will simply not exist. I will be able to play first-person shooters with my keyboard and mouse, run Windows programs in a virtual machine, then take the iPad downstairs to browse IMDb while watching a movie. When a high-resolution iPad or tablet enters the market (and that is one prediction I am certain of), the need for an external monitor will decrease somewhat. The iPad is already less expensive than many monitors, and for some consumers the price premium for the added portability and functionality will not be enough to stop them from making the leap. This kind of setup will not satisfy everyone, I know that. But for a large and possibly growing number of consumers it will.

¹ No, I cannot predict the future, nor am I foolish enough to believe that I can.

² Peripheral infrastructure? Where did I come up with that?


Signal vs. Noise

Everybody is a know-it-all these days — chicagotribune.com.

I am not interested in the take of @stinky on the Fort Hood shootings or any other current events. I am watching CNN because I expect them to gather the news, not act as a clearinghouse for any bonehead with a computer, a cable modem and a half-baked opinion.

I agree. If I wanted to listen to a bunch of people rant, often illogically, about the latest news stories, I’d go to a bar or a coffee shop. A news organization should filter out the noise, not promote it. Twitter has its place in this world, but this is just a ratings grab.

NSFW: Weezer, plane crashes and everything else that’s worrying about the real-time web

Okay, I wouldn’t call this NSFW, but my standards are not your standards, so there ya go.

In truth the desire is far more cynical: to ensure that the world knows that we were there when something dramatic happened. I was on the scene, I was somewhere you weren’t – and I have the photos and tweets and videos to prove it. Check out my YouTube account; follow me on Twitter. LOOK AT ME, LOOKING AT THIS.

Reminds me of the t-shirts people bought and wore the day after a concert.

And speaking of concerts:

I mean, what were we all doing? Filming and tweeting and checking in rather than just putting our phones away and enjoying the gig. Why does the world need two thousand photos of the same band on the same stage, all taken from a slightly different angle. That kind of 360 degree imagery might have been useful on the day Kennedy was shot – not least because it would have kept Oliver Stone quiet – but for a Weezer gig? And what’s the point of checking in on Foursquare at a ticketed event that no one else can get into. You might as well tweet “I’m a dick” and be done with it.

Okay, so I put the answer before the question, but I wanted to use that nice segue. And let me not forget the irony: this post, as part of a personal experiment, will be tweeted. Because how else can I prove it exists? </sarcasm>