MIT Technology Review
Many years from now, I’ll be hunched over in a creaky old pine rocker on the porch of my retirement home. For hours at a time, I’ll sit staring at the trees, lost in thought. Then a passing car will startle me out of my reverie and suddenly I’ll begin to blurt out words like an old radio whose short-circuited wiring has accidentally righted itself. My utterances might seem incoherent at first, but whoever takes a moment to listen will quickly realize that they’re not incomprehensible, merely ancient: “MacPaint … AppleShare … ImageWriter …” I will tell anyone who will pretend to listen, “I was a Mac person.” Maybe I’ll get really lucky and catch the ear of a young history buff. She will recognize some of my strange utterings from her History of Technology class and understand right away that I come from the dawn of the Age of Personal Computing. With wide eyes and hushed voice, she’ll want to know if I ever saw a Macintosh with my own eyes. I’ll tell her truthfully and in all modesty, “I owned one.” The Mac will presumably be pure history by then.

Every day seems to bring more bad news for Apple and its famously loyal customers: “Apple Loses $708M,” “Apple to Slash Work Force by 30%,” “Gateway 2000 Overtakes Apple in Education Market.” One particularly dark moment came last fall, when Yale University officials declared that after 2000 the university network will not guarantee support for the Mac, until recently the most popular machine on campus. This public abandonment threatens to undercut Apple’s strategy of falling back on a few niche markets, notably education; for longtime Apple users, it is a betrayal tantamount to telling an aging Nobel Prize winner that his services are no longer needed.

These, then, are tough times for any Mac person: to watch the steady demise of the company that invented this “insanely great” machine; to see frightened school principals and college deans abandon this elegant, intuitive platform; to see colleagues, friends, and even family members, good and loyal Mac people, throw in the towel, however valid their reasons: price, software selection, peripheral availability. Wired magazine’s cover story last year on the embattled company featured a collection of former Mac loyalists who have gone over to Windows for one reason or another. It was agonizing to read the list of high-profile defectors.
The question that has been disturbing me recently is: Should I join them?

I realized a few months ago that I needed to buy a new computer. The last machine I bought was a PowerBook 180, purchased in 1993. It has a grayscale monitor, doesn’t run a lot of Internet software, and, after four years of enthusiastic use, shows a fair amount of wear and tear. I try to avoid getting caught up in hardware and software upgrade mania: upgrading just for the thrill of it, or in response to the pervasive cultural anxiety about falling behind. But sometimes there are good reasons to upgrade. Since I now perform a good portion of my research on the Web, it is time to step up to a quicker machine with color and more memory.

I phoned my brother Josh to tell him that maybe the time had come to switch to Windows: “Everyone else seems to be doing it.”

“David,” he gasped, “you’re not serious!” This from someone who has been forced to use Windows at his place of business. Knowing that I have the freedom to stay with Mac, he couldn’t believe I would even consider defecting.

It’s not that I have become dissatisfied with the Macintosh. On the contrary: after 13 years and nearly as many hardware upgrades or outright purchases, I retain my reverence for the machine that helps me think and write my best. The Macintosh was, after all, the first personal computer to capture the popular imagination. Before the Mac, nontechies didn’t have much interest in personal computers for one simple reason: they weren’t personal. They were computers, big ugly calculators that one could wrestle into performing calculations, or type on without having to use White-Out.

The Macintosh changed all that. Its famously intuitive graphical user interface, which put aesthetics on equal footing with function, turned the personal computer into a tool whose power derived not from its calculating capabilities (on that front, the Mac was no powerhouse) but from its ease of use. “The interface makes the teeming, invisible world of zeros and ones sensible to us,” writes Steven Johnson in his terrific new book, Interface Culture: How New Technology Transforms the Way We Create and Communicate. “There are few creative acts in modern life more significant than this one, and few with such broad social consequences.”
