Gates was assisted in his work on the puzzle by then Harvard professor Christos Papadimitriou, who taught computer science. Gates considered the math puzzle very similar to the kind of challenges he faced when working on a complicated computer program in which he had to design algorithms to solve a specific problem.
"This was a simple problem that had proved very stubborn," said Papadimitriou. "Bill claimed to have a way of doing it better than anyone else, and I was patient enough to suffer through his long and ingenious explanation." Later, Papadimitriou decided to write up Bill’s solution, and it was published in 1979 in the journal Discrete Mathematics. The breakthrough Gates made on the puzzle has remained on the cutting edge of the field for the past 15 or so years, according to Papadimitriou, who is now at the University of California at San Diego. The professor occasionally gives the puzzle to some of his students and tells them if they solve it, he will quit his job and work for them. "I should have done this with Bill," he said.
Gates may not have been the best math student at Harvard, but he had no peers in computer science. His professors were impressed not only by his smarts but also by his enormous energy. "There’s one in a handful who come through in computer science that you know from the day they show up on the doorstep they will be very, very good," said professor Tom Cheatham, director of the Center for Research in Computing Technology at Harvard. "No doubt, he was going to go places."
Although Gates took several computer science classes from Cheatham, they did not like each other. "Gates had a bad personality and a great intellect," recalled Cheatham. "In a place like Harvard, where there are a lot of bright kids, when you are better than your peers, some tend to be nice and others obnoxious. He was the latter."
When Gates wasn’t playing poker at night, he was usually working in the Aiken Computer Center. That was when the machines were least used. Sometimes, an exhausted Gates would fall asleep on computer work tables instead of returning to his room at Currier House. "There were many mornings when I would find him dead asleep on the tables," recalled Leitner, the graduate math student who was also interested in computers. "I remember thinking he was not going to amount to anything. He seemed like a hacker, a nerd. I knew he was bright, but with those glasses, his dandruff, sleeping on tables, you sort of formed that impression. I obviously didn’t see the future as clearly as he did."
But Paul Allen saw the future. He may have seen it even more clearly than Gates.
On a cold winter day in December 1974, Allen was walking across Harvard Square in Cambridge on his way to visit Gates, when he stopped at a kiosk and spotted the upcoming January issue of Popular Electronics, a magazine he had read regularly since childhood. This issue, however, sent his heart pounding. On the cover was a picture of the Altair 8800, a rectangular metal machine with toggle switches and lights on the front. "World’s First Microcomputer Kit to Rival Commercial Models," screamed the magazine cover headline.
"I bought a copy, read it, and raced back to Bill’s dorm to talk to him," said Allen, who was still working at Honeywell in nearby Boston. "I told Bill, ‘Well here’s our opportunity to do something with BASIC.’ "
He convinced his younger friend to stop playing poker long enough to finally do something with this new technology. Allen, a student of Shakespeare, was reminded of what the Bard himself wrote, in Julius Caesar: "There is a tide in the affairs of men, which, taken at the flood, leads on to fortune. Omitted, all the voyage of their life is bound in shallows and in miseries. On such a full sea are we now afloat, and we must take the current when it serves, or lose our ventures."
Gates knew Allen was right. It was time. The personal computer miracle was going to happen.
It was named after a star and had only enough memory to hold about a paragraph’s worth of information. But the Altair, the people’s entry into the dazzling new world of computers, represented nearly 150 years of technological evolution and thought.
Although what we know as the modern computer had arrived some 30 years before the Altair, in the 1940s during World War II, the concept of such a machine came from the mind of an eccentric 19th century mathematical genius named Charles Babbage, who developed the first reliable life-expectancy tables. In 1834, having already invented the speedometer and the cowcatcher for locomotives, Babbage put all his creative energies into the design of a steam-powered machine he called the "Analytical Engine." Frustrated by inaccuracies he found in the mathematical tables of the day, Babbage wanted to build a machine to solve mathematical equations. On paper, his Analytical Engine consisted of thousands of gears and cogs turned by steam, and a logic center that Babbage called "the mill." His design called for a machine the size of a football field. Such an undertaking also called for huge sums of money, and when the government stopped backing the project, Babbage was helped financially by Augusta Ada, the Countess of Lovelace and daughter of the poet Lord Byron. The beautiful and scientific-minded countess was a fine mathematician herself and is now considered the first computer programmer. The countess planned to use punch cards to instruct the Analytical Engine what to do. She got the idea from the cards used on Jacquard looms to determine the design on cloth. "The Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves," she wrote.
Although Babbage devoted almost 40 years of his life to the project, his Analytical Engine was never completed. The technology just wasn’t there to make it possible.
By the end of the century, however, punch cards were used in a test to help tabulate information from the 1890 Census. The electric tabulating machine used in this experiment was designed by a young engineer named Herman Hollerith. Soon, punch cards were widely used in all kinds of office machines, and Hollerith’s company was absorbed by a New York firm that would later become the biggest name in computers—International Business Machines.
In the 1930s, IBM agreed to finance the development of a large computing machine. It gave Howard Aiken, a Harvard professor for whom the university’s computer center was later named, $500,000 to develop the Mark I. When it was finally completed in 1944, the Mark I could multiply two 23-digit numbers in about five seconds. But it was an electromechanical machine, which meant that thousands of noisy relays served as switching units, opening and closing as the machine performed its dim-witted calculations.
The vacuum tube soon replaced electromechanical relay switches and gave birth to ENIAC, the first electronic digital computer in the United States. It was unveiled in 1946 at the University of Pennsylvania. Built to calculate artillery firing tables for the military, ENIAC (for Electronic Numerical Integrator and Calculator) weighed 30 tons and contained 18,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors. It took up more space than the average two-car garage. The ENIAC cost about a half-million dollars to develop and could handle about 5,000 additions and subtractions per second. Today, any inexpensive home computer can outperform the ENIAC. The machine was not very reliable. Its tubes failed on the average of once every seven minutes. Still, it was used during final stages of completion to do mathematical calculations for the physicists at Los Alamos who were building the first atomic bomb.
The big breakthrough in computing technology came two days before Christmas in 1947, when three scientists working at Bell Labs tested a crystal device known as the transistor, short for "transfer resistance." (They would win the Nobel Prize for the invention.) These tiny crystals, or semiconductors as they became known, acted like switches, controlling the flow of electricity in circuits. Semiconductors replaced the vacuum tube. They were much smaller and more reliable. They didn’t give off as much heat as tubes did, so they could be packaged close together. They had no moving parts, so they were less likely to fail. And perhaps most important of all, semiconductors were cheap to make. The first ones were made out of crystals of germanium. Later, silicon became more popular.
William Shockley, one of the inventors of the transistor, left Bell Labs to return home to Palo Alto in the Santa Clara Valley of California and form his own company in the heart of what would become known as Silicon Valley. Other companies, including Texas Instruments, soon hired away Bell Labs’ star scientists and began turning out semiconductors.
Another technological leap came in the late 1950s, when networks of transistors were etched on a single piece of silicon with thin metallic connectors. These integrated circuits, or chips, became the foundation of all modern electronics.
Computers, meanwhile, got smaller, faster, and more powerful. IBM dominated the playing field in the 1950s. Business writers referred to the other makers of large, mainframe computers as the Seven Dwarfs—RCA, General Electric, Honeywell, Burroughs, NCR, Sperry Univac, and Control Data Corporation. The so-called giant brain computers made by these corporations were big and expensive. They could easily fill several rooms and cost hundreds of thousands of dollars. A priesthood of technicians was needed to watch over them. The machines had to be pampered with air conditioning. Access was usually through intermediaries. Scientists and engineers wanted a computer they could operate themselves, one that would be smaller, cheaper, and easier to maintain. The development of the semiconductor made possible just such a machine—the minicomputer. When IBM decided not to enter this new market, it left a fertile field of opportunity to be plowed by new computer companies such as Digital Equipment Corporation, which quickly became the leader. DEC established the minicomputer market in 1965 when it introduced its PDP-8 (shorthand for Programmed Data Processor). It cost $18,500. The price included a teletype. Digital called its machine a "small computer." The press, looking for a sexier name, tagged it the "minicomputer," after the fashionable miniskirt. "We fought the name for years and finally threw up our hands," recalled one engineer at Digital. The minicomputer was a highly interactive machine. Instead of feeding punch cards into the machine, the user communicated with the computer via keyboard—a novel idea at the time.
When engineers working at a Santa Clara company known as Intel developed the microprocessor in 1971, the next evolutionary step for the incredible shrinking computer was inevitable. The microchip allowed the entire central processing unit of a computer to be encoded onto a silicon chip no larger than a thumbnail. But this next step would not be taken by large corporations like DEC or IBM with money and expertise. Instead, it would be taken by entrepreneurs and hobbyists with vision and dreams... dreams of one day owning their own computer. A personal computer. A pretty radical idea.
One of these hobbyists was a hulking bear of a man by the name of Ed Roberts. He stood about six feet four and weighed nearly 300 pounds. Roberts had enormous energy and an insatiable appetite for food and information. If he became interested in a subject, be it photography or beekeeping, Roberts would read everything he could find in the library on the topic.
Roberts was something of a gadget nut. He loved tinkering with electronic hardware. He had joined the Air Force to learn more about electronics and ended up stationed at Kirtland Field outside Albuquerque. There, he formed a company called Model Instrumentation and Telemetry Systems. (Later, the word "Model" would be changed to "Micro.") At first, Roberts operated MITS out of his garage, selling mail-order model rocket equipment. He also sold radio transmitters for model planes. After Roberts left the service, he started selling electronic equipment. In 1969, he moved MITS out of the garage and into a former Albuquerque restaurant called "The Enchanted Sandwich Shop." Roberts sank all of his company’s capital into the commercial calculator market. MITS was the first company in the United States to build calculator kits. Business was good. MITS quickly expanded to more than 100 employees. Then the bottom fell out. In the early 1970s, Texas Instruments entered the calculator market. Other semiconductor companies did the same. Pricing wars followed. MITS could no longer compete.
By 1974, MITS was more than a quarter of a million dollars in the red. Desperate to save his failing company, Roberts decided to take advantage of the new microprocessors and build computer hobby kits. Roberts knew that Intel’s 8008 chip was too slow. He was banking on the next generation of chip, known as the 8080. It came out in early 1974. The 8080 was an exciting successor. It was much faster and had much more brainpower than the 8008. The new chip could certainly support a small computer. Or so Roberts believed.
He decided he would sell his machine for $397. This was a mind-boggling figure, and Roberts knew it. After all, Intel’s 8080 chip alone was selling for $350. But Roberts had been able to browbeat Intel into selling him the chips in volume, at $75 apiece.
Although the machine had a price, it still lacked a name. David Bunnell, MITS technical writer, suggested the Orwellian-sounding "Little Brother." Roberts didn’t much care for the name. With the name still up in the air, Roberts and his small team of engineers went to work building a prototype machine. He was soon contacted by Les Solomon, technical editor of Popular Electronics. Solomon was looking for a good computer story to put on the cover of his magazine. He knew Roberts and had heard about his plan for a home computer kit. Solomon flew to Albuquerque to talk with Roberts. Could Roberts have the prototype ready by the end of the year? Roberts assured Solomon he could.
After he returned to New York, Solomon scratched his bald head for days trying to come up with a name for the computer. One night, he asked his 12-year-old daughter, who was watching "Star Trek" on television. Why not call it "Altair," she said. That’s where the Starship Enterprise was heading.
Roberts, a sci-fi fan, liked the name, too. Altair was also the name of the planet visited by the spaceship in the classic science fiction movie, Forbidden Planet.
Although Solomon’s daughter came up with the computer’s name, it was Roberts who coined the term "personal computer" as part of an ad campaign for Altair. "I was trying to convey a small machine you could afford to buy that didn’t sound like a toy," he said.
Before Popular Electronics could publish its article on the Altair, Solomon needed to see the prototype to test it and make sure it worked as advertised. Roberts shipped his only working model to New York City by rail. It never arrived. The world’s first home computer—lost in transit! Solomon was in a panic. It was too late to change the planned January 1975 cover. And there was not enough time to build another computer. MITS engineers hurriedly put together a metal shell with the proper, eye-catching switches and lights on the front and shipped the empty machine to New York. And that’s what appeared on the magazine’s cover. The magazine’s nearly half a million hobbyist subscribers never knew—although they would soon learn that things didn’t always work right at MITS.
The article on the Altair explained that the computer had only 256 bytes of memory, although it had 18 slots for additional memory boards that could increase its capacity to about 4K, or 4,096 bytes. There was no screen or keyboard. Since no one had developed a high-level language for the 8080 microchip, the Altair could only be programmed in complex 8080 machine language. This was painstakingly accomplished by flipping the switches on the front panel. One flip of a switch equaled one bit of information. (A series of 8 bits equals a byte, or one character of ordinary language.) The Altair "talked" back by flashing red lights on the front panel.
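The switch-and-light arithmetic described above can be sketched in a few lines of modern code. This is purely illustrative—the Altair’s front panel did this in hardware, and nothing like the following ever ran on it—but it shows how eight switch positions, read most significant bit first, combine into one byte of 8080 machine code:

```python
# Illustrative only: how eight front-panel switch positions (1 = up,
# 0 = down) combine into a single byte of 8080 machine code.
def switches_to_byte(switches):
    """Fold 8 switch settings, most significant bit first, into one byte."""
    value = 0
    for s in switches:
        value = (value << 1) | s  # shift left, then bring in the next bit
    return value

# The pattern 0111 0110 is 0x76, the 8080's HLT (halt) instruction.
print(switches_to_byte([0, 1, 1, 1, 0, 1, 1, 0]))  # → 118, i.e. 0x76
```

Entering even a short program this way meant flipping the switches once per byte and pressing a deposit switch each time—hundreds of flips for a program of any substance.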
There were about a dozen high-level software languages available for mainframe and minicomputers when the Altair came along. These languages were each designed for different kinds of applications. The first widely accepted language was FORTRAN, or formula translation. Developed by an IBM team in 1956, FORTRAN was widely used in scientific circles and involved complex programming. Another language was COBOL, or common business-oriented language. It was mostly used for programming on mainframes. It, too, was difficult to master. But BASIC was easy to learn. It was even taught in some elementary schools. As John Kemeny, one of the two Dartmouth professors who developed BASIC, wrote: "Profiting from years of experience with FORTRAN, we designed a new language that was particularly easy for the layman to learn." BASIC, he went on to explain, "facilitated communication between man and machine."
Roberts had decided in the summer of 1974 that BASIC would be the language for the Altair, the people’s computer. But Intel had never expected its 8080 chip to be used as a microcomputer. Some engineers had told Roberts they didn’t believe it was possible to develop a working BASIC for the chip.
At Harvard, two young men went to work to prove these experts wrong.
Wired from excitement and lack of sleep, Gates and Allen made the long-distance call to MITS in Albuquerque from Gates’ room in Currier House. It had been only a few days since he and Allen had read the Popular Electronics article on the Altair. They had done little but talk since.
A man with a deep, gruff-sounding voice came on the phone at the other end. "Hello, is this Ed Roberts?" Gates asked in his high-pitched, boyish voice. Told it was, Gates proceeded to explain, with youthful bravado, that he and his friend had developed a BASIC that could be adapted for the Altair computer.
In fact, they didn’t have a program at all, and Roberts suspected as much. He had heard such boasts already. "We had at least 50 people approach us saying they had a BASIC," recalled Roberts. "We just told everyone, including those guys, whoever showed up first with a working BASIC had the deal."
Gates and Allen followed up the phone call with a letter to Roberts, reiterating that they did indeed have a BASIC that worked with the 8080 Intel chip. They proposed an arrangement whereby they would license MITS to sell their software with the Altair to hobbyists, and in return they would be paid royalties. They sent the letter on Traf-O-Data letterhead. When Roberts received the letter and called the number on the letterhead, he found he had reached a private school in Seattle. No one at Lakeside knew anything about a BASIC for the Altair. What was an Altair, someone wanted to know. This was curious indeed, Roberts thought. Who are these guys? High school pranksters?
Back at Harvard, Gates and Allen had hunkered down in the Aiken Computer Center. Like a couple of schoolboys caught in a lie, they were furiously trying to cover their tracks. They had told Roberts they had a BASIC, and now they had to produce one—before all those other competitors who undoubtedly were also trying to make good on their exaggerated claims.
For the next eight weeks, the two would work day and night in the computer room, trying to do what some experts at Intel said couldn’t be done—develop a high-level computer language for the 8080 chip. Gates not only stopped going to all his classes, he even gave up his beloved poker games. His poker pals knew something was going on. "As soon as Bill started missing the games, it was obvious he was up to something, but none of us knew what it was," said Greg Nelson, one of the poker regulars.
Gates and Allen didn’t have an Altair, which made their task especially difficult. Roberts had the only up-and-running Altair in existence. And he was in New Mexico. The Popular Electronics article contained the computer’s schematics, which would help. But what they really needed was detailed information about the 8080 chip. So they went to an electronics shop in Cambridge and bought a manual on the 8080 written by Adam Osborne, an Intel engineer whose job was to write technical manuals for the company’s new microcomputer chips. Osborne, a stately, Bangkok-born Englishman, would soon become a very famous player in this new revolution. He would make a fortune publishing the first microcomputer books before building his own version of the Altair.
While Gates concentrated his efforts on writing code for the BASIC, Allen did the more technical work with the PDP-10 in the Aiken Computer Center.
They would have to create their BASIC with some brilliant innovation. Since they didn’t have an Altair, Allen had to make the PDP-10 mimic the 8080 chip. It required all his technical knowledge and skills. But he eagerly accepted this new challenge. All those days in the computer room at Lakeside... those all-nighters at C-Cubed... hacking away on computers at the University of Washington... building the Traf-O-Data machine... learning about the 8008 chip... all his previous experience with computers had prepared Allen for what he and Gates now faced. "We were in the right place at the right time," Allen would say later. "Because of our previous experience, Bill and I had the tools to be able to take advantage of this new situation." Gates faced different challenges than his friend. He had to write slick, tight code and make it fit into the maximum 4K memory of the Altair. It was like trying to squeeze his size 13 feet into size eight shoes. Actually, it was a tighter fit than that. Their BASIC not only had to fit in the limited memory space, but room had to be left over to run programs. What was the use of having a BASIC if there was no memory left in the computer to do anything?
"It wasn’t a question of whether I could write the program," Gates said, "but rather a question of whether I could squeeze it into 4K and make it super fast."
He did. Gates said later that of all the code he ever wrote, he was most proud of the BASIC program developed in those eight weeks at Harvard. "It was the coolest program I ever wrote," Gates said.
No one had ever written a BASIC for a microcomputer. In that sense, Gates and Allen were blazing the trail for future software developers and establishing the industry standard. They would decide, rather than the marketplace, the minimum features their BASIC needed. The two worked at a frantic pace in the computer lab, often for days at a stretch with only an hour or two of sleep. When he was so exhausted he could no longer program, Gates would lie down behind the PDP-10 for short catnaps. Occasionally, he would nod off at the computer keyboard, then wake up with a start and immediately start typing again.
He and Allen took about as much time to eat as they did to sleep. One day, during a quick meal break in the dining hall at Currier House, they were talking about the math package that would be needed as part of the BASIC. This was a subprogram known as "floating point routines," which manipulate the numbers in a computer. The routines implement basic operations like addition, subtraction, multiplication, and division. Both Gates and Allen knew how to write this subprogram, but neither wanted to spend the time doing so.
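The idea behind such routines can be sketched in the abstract: store a number as a mantissa and an exponent, then build each arithmetic operation out of simpler integer steps on those two parts. A toy illustration follows—the real routines were hand-tuned 8080 assembly, and this sketch ignores normalization, rounding, and overflow entirely:

```python
# Toy model of a floating-point value as a (mantissa, exponent) pair,
# representing the number mantissa * 2**exponent.
def fp_mul(a, b):
    """Multiply two pairs: multiply the mantissas, add the exponents."""
    (ma, ea), (mb, eb) = a, b
    return (ma * mb, ea + eb)

def fp_value(x):
    """Convert a (mantissa, exponent) pair back to an ordinary number."""
    m, e = x
    return m * 2 ** e

# 3.0 (1.5 * 2**1) times 10.0 (1.25 * 2**3) gives 30.0 (1.875 * 2**4).
print(fp_value(fp_mul((1.5, 1), (1.25, 3))))  # → 30.0
```

Multiplication and division reduce neatly to integer-style operations this way; addition and subtraction are fiddlier, since the two exponents must first be brought into agreement—one reason writing the package was tedious work nobody wanted.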
Sitting at their table and overhearing the conversation about floating point routines was another student, Monte Davidoff. The talk had caught his ear because he had done this kind of programming before. Davidoff spoke up.
"I know how to do that," he said.
Gates and Allen wanted to know more. After talking with Davidoff for a while, they told him about the Altair project they were working on. Davidoff said he would like to help. Several days later, they gave him the word—he was in. But payment for his work was left up in the air. "We just had an oral agreement," recalled Davidoff, who now works for a computer and electronics firm in Cupertino, California. "They didn’t know if the thing was going to make any money. They felt there was the potential to make some money, and if they did make money, they would pay me.... So I just trusted them and left it loose."