Within a few years of the Apple II’s debut, there followed a whole set of “business-class” microcomputers from other manufacturers. Most of these machines ran a common operating system, called CP/M, which had been developed by Digital Research. CP/M was extraordinarily simple: all it could do was read keystrokes, display characters on the screen, manage files on a floppy disk, load programs into memory, and run them.

Rudimentary though it was, CP/M had enough power to give birth to the microcomputer software industry. My first exposure to a computer was with a Xerox-built CP/M machine that my father bought in 1980. It ran dBase II (a database program) and WordStar. When IBM brought out its PC in 1981, it was a late entrant into the game. The company hired a tiny company called Microsoft to write a clone of CP/M called PC DOS. (Microsoft had actually bought DOS from Seattle Computer Products for $50,000 and sold the program as its own.) Like CP/M, PC DOS could do little other than manage disk files, load programs into memory, and run them.

At the time, Apple was criticized for not building its own CP/M or DOS-based computer. But Apple’s business model, and its corporate structure, were based on using proprietary but innovative software so that it could enjoy significantly higher margins on its hardware than its competitors could ever justify. (How IBM overcame its corporate culture to build a PC without its own proprietary operating system is a story that has been well chronicled by others.) So rather than join the pack, Apple decided to leapfrog. Instead of using Intel’s popular 16-bit processor, Apple opted for Motorola’s new 32-bit 68000. Apple also concentrated on developing a graphical user interface that would make the computer dramatically easier to manage, and thus expand the market to a whole new class of customers who felt put off by the PC’s techie look and feel. After two failed attempts (the $10,000 Lisa and the Edsel-like Apple III), the company finally got it right in 1984 when it introduced the Macintosh.

For this reason, attempts to compare Apple to Microsoft misunderstand what drives the two companies. Microsoft innovates in software. But with the exception of the Macintosh user interface, virtually all of Apple’s innovations have been in hardware. Apple popularized the mouse and 3.5-inch floppy disks. Apple introduced trackballs and then touch pads on laptops, in the process pushing the keyboard to the back of the laptop and creating a wrist rest, which is today standard on almost all portables. Now, Apple is pushing wide-format displays (screens considerably wider than they are tall, more akin to a movie screen than a TV) into the mainstream. Within three years, such displays will probably be standard in the PC world as well.

What’s exciting for me about OS X is that this is the first time in more than a decade that Apple has introduced a significant software innovation. And oh my, is OS X significant! For starters, consider its geeky underpinnings. For more than three years, analysts have been hailing the arrival of a Unix variant called Linux (or GNU/Linux, to give proper credit to its many developers). But although Linux has charmed the code-breathing set, it has made little headway into homes and businesses because it is too hard to use and too unlike Windows and the Mac OS. OS X will change this. Unless an atomic bomb goes off at Apple’s headquarters in Silicon Valley, by this time next year Apple will be the world’s largest supplier of Unix-based operating systems. OS X will prove that it is possible to give Unix a friendly wrapping. The impact will also be felt by Bill Gates’s little enterprise because, for the first time ever, Apple’s operating system will be more stable and faster than Microsoft’s.

OS X also brings with it Cocoa, a new set of tools for writing desktop applications. These tools evolved from NeXTStep, the development framework for the NeXT computer. I wrote a book about NeXTStep back in 1993, so perhaps I’m biased. But practically all the programmers I knew told me they could write applications with NeXTStep five to ten times faster than they could for Windows. If Cocoa is even half as good as NeXTStep (and initial indications are that it is better), we could see an explosion of high-quality applications written by individuals or extremely small companies. This means that OS X has the power to revolutionize the software industry.
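To give a flavor of that concision, the sketch below is a complete, minimal Cocoa application. It is a hypothetical illustration only: it is written in Swift against today’s AppKit (in 2001, Cocoa meant Objective-C), and every name in it, from the window title down, is mine rather than anything shipped with OS X. Still, it shows the shape of the thing: one delegate object, one window, and the framework does the rest.

```swift
// Hypothetical sketch: a complete Cocoa application in one file.
// Assumes modern Swift and AppKit on macOS; compile with `swiftc main.swift`.
import AppKit

final class AppDelegate: NSObject, NSApplicationDelegate {
    private var window: NSWindow?

    func applicationDidFinishLaunching(_ notification: Notification) {
        // Create a plain titled window entirely in code.
        let window = NSWindow(
            contentRect: NSRect(x: 0, y: 0, width: 420, height: 160),
            styleMask: [.titled, .closable, .miniaturizable],
            backing: .buffered,
            defer: false
        )
        window.title = "Hello, Cocoa"

        // One label; the framework handles all drawing and event dispatch.
        let label = NSTextField(labelWithString: "Built with Cocoa.")
        label.frame = NSRect(x: 20, y: 60, width: 380, height: 40)
        window.contentView?.addSubview(label)

        window.center()
        window.makeKeyAndOrderFront(nil)
        self.window = window // keep a strong reference so the window survives
    }
}

// Entry point: create the shared application object and run its event loop.
let app = NSApplication.shared
let delegate = AppDelegate()
app.delegate = delegate
_ = app.setActivationPolicy(.regular) // show in the Dock like a normal app
app.activate(ignoringOtherApps: true)
app.run()
```

Allowing for the hedges, the point stands: the framework supplies the application object, the event loop, and the drawing, and the programmer supplies only what is different about this particular program.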

The initial reception of OS X has been lukewarm at best. Many users seem to think that Apple invested too many resources in “eye candy.” As you move windows around the screen, for example, they stretch and warp as if painted on sheets of rubber. And because OS X is a fundamentally new operating system, it doesn’t yet work with many scanners, digital cameras, and other peripherals (compatibility will come when the necessary drivers are written). But within a year, these minor problems will have been overcome. What remains will be the start of the next big thing in desktop computing.
