Power. Speed. Capacity. Connectivity.
At the beginning of the 21st century, these are the major areas of focus for most users of computing devices, whether desktop, laptop, or handheld. Does it have the ability to run the applications that I need? Can it run them well? With reliability? With security? With access to my data from multiple locations? And, of course, a robust ability to communicate with other computers via the Internet is essential. Sometimes, even the question, “Does it look cool?” is important.
Most consumer computing today assumes that these requirements are met in one way or another. The value of a particular computing device depends on how well it meets these criteria. The computing “religious wars” of the 1970s, 1980s, and 1990s have for the most part receded, shifting from fervent adherence to a particular piece of hardware to a preference for an operating system (Windows, Linux, MacOS, or others). Furthermore, we are gradually moving to a place where applications and the documents they produce are becoming the focus, rather than the box or the OS that runs on that box. The computer operating system is gradually transforming into a vehicle through which an application or Internet content is presented, much as a local radio or television station broadcasts programs and entertainment. And the future holds the potential for even more exciting advancements.
These innovations did not happen overnight. The transition from the first horseless carriage on a dirt road to the latest fuel-conserving automobile on a modern superhighway required the combined efforts of many talented inventors and engineers, as well as a consensus on standards for how to operate a car and what the rules of the road should be. In the same way, the path to today’s complex computing power was a long and convoluted one, with many side roads and a few dead ends.
The purpose of this writing is not to comment on the present state of computing, nor to speculate on its future. It is, rather, to look back on the past, to examine in detail one of the paths that brought us from simplicity to complexity. Many people and companies over the past thirty years have created or used innovations in technology to move us along those paths. And one company in particular has pioneered the widespread use of many aspects of computer technology that today we all take for granted:
This company, Apple Computer, is a fascinating study in engineering talent, innovation, perseverance, and success in spite of dysfunctional behavior. Apple Computer has the unique distinction of being the only hardware company from the early days of personal computing that is still in business, still successfully selling computers, and still challenging Microsoft’s dominance in operating systems. And within Apple, the Apple II computer in its various incarnations played a large role in the company in ways that often were not recognized at the time.
Consider the following facts:
The success of the Apple II is due entirely to the millions of people who bought it, used it, and developed software and hardware for it, in spite of the mistakes of its parent company.
One of the greatest strengths of the Apple II is the attention paid over the years to ensure backward compatibility as enhancements were introduced. That compatibility meant that a IIGS “power system” manufactured in its latter days, with a full eight megabytes of memory, a hand-held optical scanner, CD-ROM drive, and 150 megabytes of hard disk storage, could still run an Integer BASIC program written in 1977, probably without any modification! This may not seem very surprising now, when compatibility is expected, but in the early days of personal computing it was not always easy or even possible to move documents or programs from an old computer system to a new one. Also, consider the quantum leap in complexity and function between the original 4K Apple II and the last revision of the Apple IIGS; the amount of firmware (built-in programs) in the IIGS was larger than the entire available RAM space in a fully expanded original Apple II!
This strength of the Apple II could also be considered a weakness, because it made it difficult to design improvements that kept up with the advances in computer technology between 1976 and 1993 while still maintaining compatibility with the past. Some other early computer makers found it easy to design improvements that created a better machine, but sometimes they did so at the expense of their existing user base (Commodore comes to mind, with the PET, the VIC-20, the Commodore 64, and lastly the Amiga, all very different from each other, and with limited compatibility with older models). However, this attention to detail is just one of the things that made the Apple II the long-lived computer that it was.
In examining the development of the Apple II, we will take a look at some pre-Apple microcomputer history, the Apple-1 (which laid the foundation for the design of the Apple II), and the formation of Apple Computer, Inc., with some side roads into the ways in which early users overcame the limits of their systems. We will follow through with the development of the Apple IIe, IIc, and IIGS; discuss peripherals, online services, and magazines; and lastly, make some generalized comments on Apple Computer, Inc. and the way it handled the Apple II.
To understand how the Apple II came about, it helps to know the environment that produced it. During the 1960s and early 1970s, a sentiment had been growing amongst those who had been taught to use a computer in college (or possibly high school). This sentiment recognized the usefulness and in some cases just plain fun of running computer programs. In a normal post-schooling work environment, a computer was used for purposes of handling finances, statistics, and possibly engineering problems. These were all important, but boring to the computer enthusiast who wanted to run his or her own programs. Many computer systems were mainframes that worked in an environment that required a user to present a stack of punched cards to the person who actually operated the machine, and who would then deliver back to the user a printout of the results of the program. Sometimes a smaller computer system would allow a user to actually sit at the console and interactively run a program, fix the bugs, and then run it again, bypassing the need for an officially trained computer operator. This improved the immediacy of using the computer, but even these smaller computers (such as the PDP series) were far too large and expensive for anyone to actually have in their homes as a “personal” computer.
Despite these hurdles, there was still a desire for a computer that was small enough and inexpensive enough to actually own. The change that made it possible to create one of these small machines was the development by Intel, beginning in 1969, of the first programmable microprocessor chip, the 4004. This chip could manage only four bits of data at a time, and was originally designed for the Japanese company Busicom to be used in a desktop calculator, the Busicom 141-PF. In an attempt to recover some of the development costs, Intel decided to advertise the chip for general use in the fall of 1971 in Electronic News. Meanwhile, Intel had been developing a different processor for an outside company. Those plans fell through, and in lieu of paying Intel for its work, that company left Intel with the intellectual property. The chip was eventually redesigned, and out of that project came the 8-bit 8008, which was twice as powerful and came much closer to being able to function as an adequate central processing unit (CPU) for a stand-alone computer.
By 1972, the Intel 8008 was available for purchase and experimentation. It was somewhat difficult to work with and could address only 16K of memory, but several small computers were designed around it. The first computer that made use of the 8008 was the Micral, designed by François Gernelle. It was sold by the French company R2E for $1,750, fully assembled, but did not have any impact in the U.S. In 1974, Scelbi Computer Consulting (Scelbi stood for “SCientific, ELectronic, and BIological”) began to sell a microcomputer called the Scelbi 8-H. It was available as a kit for $565, and could also be purchased fully assembled. The first advertisement for the Scelbi 8-H appeared in the March 1974 issue of the amateur radio magazine QST. However, it had limited distribution, did not make a profit for the company, and due to the designer’s health problems it did not go very far.
Another 8008 computer available the same year was the Mark-8. This computer appeared on the cover of the July 1974 issue of Radio-Electronics, and was available as a kit. To build it, a hobbyist sent in $5.00 and received the plans for the kit in the mail. Also available were the printed circuit boards used in assembling the kit, and the 8008 processor itself cost only $120 from Intel by that time. Although it was a difficult kit to complete, about 10,000 manuals were sold and about 2,000 circuit board sets were completed, and newsletters and small user groups appeared around the country.
In addition to the above machines, other enterprising engineers began to look at ways the 8008 could be used. In Seattle, Washington, Paul Allen convinced a hacker friend, Bill Gates, to join him in writing a BASIC interpreter for this new processor to use in a simple computer circuit that he had designed. They formed their first business venture, a company called Traf-O-Data, to process traffic information using this computer. Although not a success, this effort was a foundation for the role they would play in the future development of the home computer.
In 1974 Intel released the 8080, an enhanced 8-bit processor that, at a speed of 2 MHz, was ten times faster than the 8008. The 8080 addressed some of the shortcomings of the 8008, increased the memory it could address to 64K, and was generally more capable of acting as a CPU for a small computer. Intel expected that these integrated circuits might be usable in calculators, or possibly for controlling traffic lights or other automated machinery. As with the 4004 and 8008, Intel did not particularly expect that anyone would actually try to create a computer using these chips. But there was a strong desire for a computer to use at home, free of the restrictions of the punched-card mainframe environment, and talented engineers began to find ways to make use of Intel’s new invention.
During the latter part of 1974, a company called MITS (Micro Instrumentation Telemetry Systems) was in dire straits due to, of all things, one of its most successful products. MITS had been started in 1969 by Ed Roberts, who originally sold control electronics for model rocketry. With the new availability of sophisticated electronics in the early 1970s, MITS grew in size and was doing a very profitable business selling electronic calculators, both as kits and pre-built. However, in 1972 Texas Instruments decided not only to produce the IC chips but also to build and sell the calculators that used those chips, at prices substantially below what it charged commercial customers, like MITS, for the chips alone. Roberts saw his calculator inventory become less and less valuable, until finally he found it difficult to sell the calculators even below the cost of making them. This put his formerly successful business deeply in debt; he desperately needed a new product to help him recover from these losses.
At the same time that MITS was looking for something to revive the company, Popular Electronics magazine was looking for a computer construction article that would exceed the Mark-8 project that had appeared on the July 1974 cover of Radio-Electronics. Roberts began to design a computer that would use Intel’s new microprocessor, and with a special deal he was able to make with Intel, he could supply the computer as a kit, along with the 8080, for only a little more than the cost of the chip by itself. This construction project appeared on the cover of the January 1975 issue of Popular Electronics magazine. Roberts and the editor, Les Solomon, decided to call the computer the “Altair 8800”; it measured 18 inches deep by 17 inches wide by 7 inches high, and its standard memory weighed in at a massive 256 bytes (that’s one fourth of a “K”). Called the “World’s First Minicomputer Kit to Rival Commercial Models,” the Altair 8800 sold for $395 (or $498 fully assembled). MITS hoped to get about four hundred orders for the computer, trickling in over the two months that the two-part article would run. This would supply the money MITS needed to buy the parts to send to people ordering the kits (a common method used in those days for “bootstrapping” a small electronics business). This “trickle” of orders would also give MITS time to establish a proper assembly line for packaging the kits. However, they misjudged the burning desire of Popular Electronics’ readers to build and operate their own computer. MITS received four hundred orders in one afternoon, and in three weeks it had taken in $250,000, soon reversing its large deficit.
The Popular Electronics article was a bit exuberant in the way it described the Altair 8800. It called the machine “a full-blown computer that can hold its own against sophisticated minicomputers now on the market… The Altair 8800 is not a ‘demonstrator’ or souped-up calculator… [it] is a complete system.” The article had an insert that listed some possible applications for the computer, stating, “the Altair 8800 is so powerful, in fact, that many of these applications can be performed simultaneously.” Among the possible uses listed were an automated control for a ham station, a digital clock with time zone conversion, an autopilot for planes and boats, a navigation computer, a brain for a robot, a pattern-recognition device, and a printed matter-to-Braille converter for the blind. Many of these features would begin to be possible by the end of the twentieth century, but in 1975 this was a lot of “hype,” as was the claim that it would be possible to carry out these tasks “simultaneously.” The exaggeration by the authors of the Popular Electronics article can perhaps be excused by their excitement at being able to offer a computer that anyone could own and use. All this was promised from a computer that came “complete” with only 256 bytes of memory (expandable, if you could afford it) and no keyboard, monitor, or storage device.
The success of the Altair overwhelmed MITS in many ways. The hunger for the personal computer made people do crazy things. One person was so anxious to get his Altair working that he couldn’t wait for phone help; he drove out to Albuquerque, New Mexico (the home of MITS) and parked his RV right next to the building, running in for help when he needed it. Another enthusiastic person sent MITS a check for $4000, ordering one of everything in their catalog, only to have much of it refunded with an apologetic note that most of these projects had not even been designed yet!
All of this excitement was somewhat tempered by the fact that when the Altair 8800 was completed, in its base form it could not do much. One could enter a tiny program a single byte at a time, in binary, through the toggle switches on the front panel, and read the output, again in binary, from the lights on the front panel. This novelty wore off quickly. So, to do something useful with the Altair it was essential to add plug-in boards to expand it. MITS could not reliably meet the demand for these boards (their 4K RAM board had problems), and so an entire industry grew up to make hardware (and software) for the Altair. Examples of these additions included a paper-tape reader (useful for loading and saving programs), a Teletype (which worked as a keyboard, printer, and paper-tape read/write unit all in one!), a keyboard, and a video display (often called a “TV Typewriter,” after Don Lancaster’s 1973 Radio-Electronics article of the same name). Eventually it became common to use cassette tape for both loading and saving programs, since it was inexpensive and cassette recorders were easily available.
The difficulty MITS had in supplying the demand for its computers also led to the creation of other similar computers that used the 8080. These usually used Altair’s 100-pin “bus” (a protocol for circuit boards to communicate with the microprocessor) and so were compatible with most of the Altair boards.
The IMSAI 8080 was marketed to businesses and was built as a sturdier alternative to the Altair. Polymorphic Systems sold the Poly-88, also based on the Intel 8080. Processor Technology’s “Sol” came out in 1976 and went one better than many of these by putting the keyboard and video display card in the same case, as well as using the Altair-style S-100 bus for add-on boards. Other computers released in 1975 that enjoyed limited success were some based on the Motorola 6800 processor, such as the Altair 680 (also from MITS), the M6800 (Southwest Technical Products), and the Sphere, which came complete with keyboard and monitor. One company, Microcomputer Associates, sold a kit called the Jupiter II, based on a new processor called the 6502 from MOS Technology.
At this time, in the mid-1970s, the market for microcomputers was populated by hobbyists, people who didn’t mind getting out a soldering iron to wire something together to make the computer work. After the computer was actually operational, it was still necessary either to create a program in assembly language or to enter a program written by someone else. Those fortunate enough to have sufficient memory to run BASIC had a somewhat easier time creating and using programs. It was also very helpful to have some device to simplify input of the program, either a keyboard or a paper-tape reader. It was an era of tools that were good enough for the home experimenter, but not yet really usable by the average person.
I think the PDP-11 had a longer run than the Apple II; Wikipedia says 1970 to 1997. I have pictures of my father and his brother standing next to an early 11, which I was always told were taken during the Apollo 11 moon landing. My first computer, BTW, was a PDP-8, and I first used the Apple II (an early unit with the nonvented case, which I drilled with holes and put a fan on) as a terminal for it, even though the Apple II had more memory than the PDP-8 did, if memory serves.
I always think about what the real strength of Apple computers was, and which key technologies helped the company survive and flourish year after year.
This is a very interesting and very informative article (as is the entire site). Kudos for that. I only have some problems with this sentence (sorry for the nitpicking).
“Meanwhile, some Intel engineers began to work on an 8-bit version of the 4004, the 8008”
I believe (and my beliefs can easily be confirmed by simple web searches) that: first, the 8008 is not an 8-bit version of the 4004, but a very different and totally incompatible beast (although it is in fact quite similar to Intel’s next CPU, the 8080). Actually, it seems that the 8008’s development even started before that of the 4004! Second, the architecture of the 8008 was developed by another company, CTC, rather than Intel. Intel was chosen just as the primary manufacturer of the chip. At one point, however, CTC abandoned the chip and ownership of it passed to Intel, which then continued its development, raising the speed and lowering the cost, and eventually built an expanded version, known as the Intel 8080.
After doing some further research (using sources available to me now that were not available in the early 1990s, when I originally wrote this), I find you are correct: The 8008 did not evolve from the 4004. It’s too late for my book (which is printed), but I can make a modification to the article here to better explain this. Thank you for pointing this out.