In Search of Stupidity

Merrill R. Chapman’s “In Search of Stupidity” was recently recommended on JoS. The book covers the high-tech (mostly software) industry from the CP/M days onward; Chapman himself worked for MicroPro (WordStar) in its heyday.

As the title suggests, Chapman argues that Microsoft’s dominant position today is mostly the result of massive stupidity on the part of its once-formidable competitors, so he tries to isolate their fatal mistakes. I found the book quite enjoyable, if only moderately enlightening; it’s mostly an entertaining trip down memory lane with lots of anecdotes, not a business textbook.

I know there are some other fossils here who have personally used CP/M machines (WordStar 3.0 rocked!) so I thought I’d let you know.

I imagine his theory hinges on the seminal event of Gary Kildall blowing off IBM about porting his CP/M to their first personal computer, thereby opening the door for MS to supply not only BASIC for the PC but the operating system as well.

I guess you could say he killdall his chances for glory. HEE HEE HAW HAW…sorry, always wanted to use that line…

Edited for speeling

As I understand it, IBM backed off from their original plan of building their PC using CP/M on the 6502 because one of their executives took a trip to Japan at just the wrong time and saw that some Japanese manufacturer was planning the same thing. This scared them for some bizarre reason (at the time, the Japanese had no US computer market share). This fateful trip was supposedly the major factor in convincing IBM to distinguish their product from the Japanese by switching to the inferior 8086 and the much-inferior DOS. As it happened, the Japanese didn’t sell PCs in the US until much later.

First of all, the 8086, and even the 8088 (the chip actually used in the PC), for all their flaws were vastly superior to the 6502. I think you’re confusing this CPU with the Motorola 68000 – the 6502 had an extremely primitive instruction set, no option for a math coprocessor, and only 64 KB of total address space! It was later the CPU used by the C64 (which was a cool home computer, but NOT because of its CPU). Now if the Japanese had actually planned to build a PC with a Motorola CPU (was the 68k even out at the time?), that would be quite a story. Do you have any links or references?

Anyway, on the IBM/DRI thing – Chapman doesn’t mention Japanese companies, and he also refrains from telling the old Kildall-in-an-airplane story. What he does say is that IBM was hated by the computer industry back then much like Microsoft is hated today, and DRI just plain didn’t feel like striking a deal with them. MS actually sent IBM to DRI because MS knew it couldn’t make a decent OS itself, but it did want the PC out on time because IBM had already licensed Microsoft’s popular BASIC.

So when DRI kept procrastinating, Microsoft just bought QDOS and patched it up into MS/PC-DOS 1.0, fully expecting that everyone would switch to CP/M-86 once it was out. Except that DRI still didn’t get it – when they finally got their OS done, they tried to sell it for a whopping $240, as opposed to $40 for MS-DOS! Business software did quickly come out in CP/M-86 versions, but DOS was already widespread, so every app also shipped in a DOS version. The predictable net result was the swift death of CP/M-86. Chapman recounts that Kildall kept insisting until the bitter end that “CP/M-86 was priced just right”…

I used WordStar back in the day. They took far too long to put out a Windows version, and that killed it. By the time they got around to doing a crappy Windows version, WordPerfect and Word were already dominant.

The story I heard about DRI and IBM is that IBM called on a weekend and the DRI CEO was out flying his plane. IBM then called Microsoft that same weekend and struck a deal. This version presented the whole IBM-Microsoft deal as being much more the result of chance.

No, I’m not thinking of the 68000; it wasn’t in production yet in 1979 or 1980, or whenever this decision was being made. I remember the professor being excited about getting new 68000s for my EE class in 1982, so they were not a possibility for use in the original IBM PC.

I never used the 8086 chip for anything myself, so I could be wrong, but I thought it only had very limited addressing, too, didn’t it?

Ha! I know! It probably wasn’t the 6502 IBM was originally going to use, it was probably the Z-80 with an S-100 bus!

Anyway, regardless of the 8086/6502/Z-80 issue, IBM’s subsequent failure to switch to the 68000, which was incomparably better than the 8086 and merely vastly better than the 80x86 series, was a very bad decision. Compatibility be damned when you are improving performance by such a huge factor.

My understanding of the Motorola/Intel decision for IBM is that it was simply a matter of availability. Intel promised to make chips available to IBM in whatever quantity they needed. Motorola wouldn’t set up a dedicated fab for the 68000 series just to support IBM. Motorola makes a lot of other kinds of chips, and at the time, they didn’t see the PC market taking off to the extent it ultimately did.

This is all such ancient history I don’t even remember where I heard or read that. Anyone who knows the issue better than me is welcome to tell me how stupid that whole story is…

–milo
http://www.starshatter.com

I believe the story goes that Gates actually gave Kildall and DRI a heads-up that they should be ready for IBM to come knocking at the door about a CP/M port for the PC, and that they pretty much blew Big Blue off.

The really funny part of the story is that Seattle Computer Products actually created QDOS out of frustration with DRI being slow to port a version of CP/M to their line of computers. So with Paul Allen grabbing QDOS and turning it into MS-DOS for the PC, DRI doubly screwed themselves, heh.

Yes, that’s what I remember as well.

I never used the 8086 chip for anything myself, so I could be wrong, but I thought it only had very limited addressing, too, didn’t it?

Emphatically not! Its big huge giant killer feature at the time was the fact that it could address ONE ENTIRE MEGABYTE of RAM! :shock:

Yes, you had to use segment registers, but the 6502 absolutely couldn’t address more than 64 KB. And it couldn’t use 16-bit numbers for arithmetic – the 8088/86 could. I’ve programmed both CPUs in assembly language. The 6502 was really, really crappy. Even among 8-bit CPUs the Z80 was far better, probably even the 8080.
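For anyone who never touched real-mode x86, here’s roughly how the segment trick bought you that megabyte – a little C sketch of the address arithmetic, purely my own illustration (the constants are made up, nothing from the book):

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode 8086/8088 addressing, illustrated: the CPU forms a 20-bit
 * physical address as segment * 16 + offset. That reaches 1 MB in total,
 * even though any single segment still only spans 64 KB. */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Different segment:offset pairs can even alias the same byte. */
    printf("1000:0000 -> %05lX\n", (unsigned long)physical_address(0x1000, 0x0000)); /* 10000 */
    printf("0FFF:0010 -> %05lX\n", (unsigned long)physical_address(0x0FFF, 0x0010)); /* 10000 */
    /* Top of the 1 MB range: */
    printf("FFFF:000F -> %05lX\n", (unsigned long)physical_address(0xFFFF, 0x000F)); /* FFFFF */
    /* The 6502, by contrast, had plain 16-bit addresses: 64 KB, full stop. */
    return 0;
}
```

Clumsy to program against, sure, but the answer to “very limited addressing” is: limited per segment, not in total.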

Ha! I know! It probably wasn’t the 6502 IBM was originally going to use, it was probably the Z-80 with an S-100 bus!

You may be right there. I dimly recall an enhanced Z80 successor called the Z8000 or so, and back in the farthest corner of my memory I catch a glimpse of someone, possibly the Japanese, wanting to build a PC based on it. Not much came of it, though.

Anyway regardless of the 8086/6502/Z-80 issue, IBM’s subsequent failure to switch to the 68000 which was incomparably better than the 8086, and merely vastly better than the 80x86 series, was a very bad decision. Compatibility be damned when you are improving performance by such a huge factor.

Bad decision? From a programmer’s perspective maybe. From a business viewpoint, Intel’s decision to put backwards compatibility above all else was what made them the dominant chip maker. And IBM’s decision to go along with that was what earned them the PC monopoly. You’ll notice that all Motorola-based PCs were left in the dust, and IBM only eventually lost to clones that were powered by the same CPUs (or Cyrix/AMD clones).

I’m not sure about the “huge factor” either – I think the 68k could go to higher clock rates but at that time Intel had the 80286 out which could go just as high. How much faster was the 68k than the 8088, clock for clock? I don’t think the difference was that dramatic.

Besides, Intel & IBM expected to quickly move to protected mode software once the AT took off. But because of the failure of OS/2 and the success of DOS-based Windows 3.x, slow and hard-to-write real-mode software stayed with us much longer, and that created an impression of the Intel chips being much worse than they actually were.

I believe the segmented architecture of the 8086 also facilitated porting existing CP/M apps written for the 8080, because the addressing code didn’t need to be changed – you could take advantage of a bigger memory space just by changing the segment register. At any rate, that was the reasoning given by Intel in their processor manual. So there was an actual technical reason for choosing the 8086 architecture.
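A rough sketch of why that made 8080-to-8086 translation nearly mechanical – my own hypothetical example, paraphrasing that line of reasoning rather than quoting anyone: an 8080 CP/M program only ever handles 16-bit addresses, and on the 8086 those same values simply become offsets, with the loader picking the segment once.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical sketch: translated 8080 code keeps doing 16-bit pointer
 * arithmetic exactly as before; only the (new) segment base, chosen by the
 * loader rather than the program, decides where in the 1 MB space it lives. */

typedef uint16_t offset_t;                 /* all an 8080-style program ever sees */

static uint16_t program_segment = 0x2000;  /* picked by the loader, not the app */

static uint32_t resolve(offset_t off)      /* what the 8086 effectively computes */
{
    return ((uint32_t)program_segment << 4) + off;
}

int main(void)
{
    offset_t tpa    = 0x0100;          /* CP/M convention: programs load at 0100h */
    offset_t record = tpa + 0x0080;    /* plain 16-bit arithmetic, unchanged from the 8080 */

    printf("program at %05lX, next record at %05lX\n",
           (unsigned long)resolve(tpa), (unsigned long)resolve(record));
    return 0;
}
```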