The Secret Life of Vector Generators


VOUT = X1 * Ramp + X2 * (1 - Ramp).

The creation of the Ramp is a major challenge. Assuming Star Wars speeds (160 us across the 1024-pixel screen), it may have to go from 0 to 1 Volt in as little as 156 ns (one pixel) or as long as 160 us (1024 pixels).
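To make the interpolation concrete, here is a minimal numerical sketch (in C) of what the ramp computes. The endpoint voltages and the step count are made up for illustration; the real hardware generates a continuous analog ramp, so there are no discrete steps. Note that with this equation the beam starts at X2 (Ramp = 0) and arrives at X1 (Ramp = 1).

    /* A minimal sketch of the Analog Vector Generator's ramp equation.
       The endpoint voltages and the step count are hypothetical; the
       real hardware ramps continuously. */
    #include <stdio.h>

    int main(void)
    {
        double x1 = 1.0;    /* destination voltage (made up) */
        double x2 = 0.25;   /* starting voltage (made up)    */
        int steps = 8;

        for (int i = 0; i <= steps; i++) {
            double ramp = (double)i / steps;              /* 0.0 .. 1.0 */
            double vout = x1 * ramp + x2 * (1.0 - ramp);  /* the equation above */
            printf("Ramp = %.3f  VOUT = %.3f V\n", ramp, vout);
        }
        return 0;
    }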


There is also the monitor problem. While most of the XY Monitor can be easily duplicated, the Deflection Yoke must be specifically designed for XY. You cannot use one designed for a TV.
Perhaps if we wait a little we will be able to make our own displays from light-emitting polymers (LEPs). See "Plastic display prototype proves inkjet technology," Electronic Products, April 2001, page 23.
XY might not be so dead after all.

A Final Thought

In a Digital Vector Generator, each XY position comes from the output of a counter, so the result is similar to what you would get by using a frame buffer. Since the Digital Vector Generator in Lunar Lander and Asteroids used 10-bit DACs, we have a screen resolution of 1024 x 768. (We actually could use 1024 x 1024, but then the 4:3 aspect ratio of the CRT produces different X and Y scaling values.)


As a result, lines have the same stair-stepping that you get in a frame buffer.
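To see where the steps come from, here is a hedged sketch of the underlying arithmetic; the endpoints are hypothetical, and this is ordinary DDA-style stepping rather than the actual Atari circuit. Rounding each position to the integer pixel grid, just as the 10-bit counters quantized the beam position, is what produces the stair steps.

    /* DDA-style line stepping: positions are quantized to the pixel
       grid, which is why the line stair-steps. Endpoints are made up. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    int main(void)
    {
        int x0 = 0, y0 = 0, x1 = 10, y1 = 4;   /* hypothetical endpoints */
        int dx = abs(x1 - x0), dy = abs(y1 - y0);
        int steps = (dx > dy) ? dx : dy;

        for (int i = 0; i <= steps; i++) {
            double t = (double)i / steps;
            int x = (int)lround(x0 + t * (x1 - x0));   /* snap to pixel grid */
            int y = (int)lround(y0 + t * (y1 - y0));
            printf("(%d, %d)\n", x, y);
        }
        return 0;
    }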

So, considering the effort that was required to develop the technology (the XY Monitor as well as the Vector Generator), the question is: why did we bother?

The answer is that in 1978, when the Digital Vector Generator was developed for Lunar Lander, memory was much too expensive for a frame buffer in a video game. The first game to use a frame buffer was still several years in the future (Missile Command), and even then it was low resolution. (It may have been 512 x 384, but I'm not sure.)

Even in 1980, the latest and greatest DRAM was 16K bits and cost about $4.80. A single 256 x 256 x 4 frame buffer would have required 16 devices at a cost of about $77. Two frame buffers would have required 32 devices costing about $154 just for memory, which was more than the cost allowed to manufacture the entire PC board. Two frame buffers of 512 x 512 x 4 would have required 128 devices costing about $614.
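The arithmetic is easy to check. This little program reproduces the device counts and (rounded) costs above, assuming the $4.80 per-chip price:

    /* Worked version of the 1980 frame buffer memory arithmetic,
       assuming 16K-bit DRAMs at $4.80 each. */
    #include <stdio.h>

    int main(void)
    {
        const long chip_bits = 16L * 1024L;   /* 16K-bit DRAM */
        const double chip_cost = 4.80;        /* 1980 price   */

        long small = 256L * 256L * 4L;        /* one 256 x 256 x 4 buffer */
        long large = 512L * 512L * 4L;        /* one 512 x 512 x 4 buffer */

        printf("256x256x4:     %3ld chips, $%6.2f\n",
               small / chip_bits, (small / chip_bits) * chip_cost);
        printf("2 x 256x256x4: %3ld chips, $%6.2f\n",
               2 * small / chip_bits, (2 * small / chip_bits) * chip_cost);
        printf("2 x 512x512x4: %3ld chips, $%6.2f\n",
               2 * large / chip_bits, (2 * large / chip_bits) * chip_cost);
        return 0;
    }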
Video games of the time used Motion Objects and Playfield ROMs.
Motion Objects was Atari's name for what others called Sprites. Since Atari invented the technology, the companies 'borrowing' it should have been honest enough to call it by its correct name. (A Sprite is just a Motion Object with the serial numbers filed off.)
The way a Motion Object works is that the programmer specifies a stamp (a picture) and a position on the screen. The hardware knows where the beam is (because it generates the sync) and pulls the correct data from the Motion Object ROM at precisely the right time. There is no frame buffer. Later, a line buffer was added to permit multiple objects to use the same Motion Object ROM. During Horizontal Sync, the hardware went through the display list and assembled the appropriate data for the upcoming line. Over the years the hardware evolved to permit more and more objects to share horizontal lines. Even so, there was always a limit on how many Motion Objects (or parts of Motion Objects) could exist on the same horizontal line.
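In software terms, the line-buffer scheme looks something like the following sketch. The structure, the sizes, and the per-line object limit are assumptions for illustration, not the actual Atari hardware, which did all of this with logic during Horizontal Sync.

    /* Sketch of a Motion Object line buffer: during horizontal
       blanking, walk the display list and copy the one row of each
       stamp that intersects the upcoming scan line. All names and
       sizes here are hypothetical. */
    #include <stdint.h>
    #include <string.h>

    #define LINE_WIDTH  320    /* hypothetical screen width       */
    #define STAMP_W     16     /* hypothetical stamp dimensions   */
    #define STAMP_H     16
    #define MAX_OBJECTS 8      /* the per-line limit noted above  */

    struct motion_object {
        int x, y;                           /* screen position         */
        const uint8_t (*stamp)[STAMP_W];    /* picture, as if from ROM */
    };

    void build_line(uint8_t line_buf[LINE_WIDTH],
                    const struct motion_object *list, int count, int scan_y)
    {
        memset(line_buf, 0, LINE_WIDTH);    /* 0 = background/transparent */
        int used = 0;
        for (int i = 0; i < count && used < MAX_OBJECTS; i++) {
            int row = scan_y - list[i].y;
            if (row < 0 || row >= STAMP_H)
                continue;                   /* object not on this line */
            used++;
            for (int col = 0; col < STAMP_W; col++) {
                int x = list[i].x + col;
                uint8_t pix = list[i].stamp[row][col];
                if (x >= 0 && x < LINE_WIDTH && pix != 0)
                    line_buf[x] = pix;
            }
        }
    }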
Playfield ROM was just a picture burned into ROM that was scanned out. (Masked ROM was relatively cheap.) A later refinement was the Scrolling Playfield, which allowed the Playfield to be moved around.
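A Scrolling Playfield amounts to little more than an offset lookup into that ROM. Here is a hedged sketch assuming 8 x 8 tiles; the names and sizes are illustrative, not taken from any particular game.

    /* Sketch of a Scrolling Playfield lookup with 8 x 8 tiles.
       The ROM contents are omitted; a real game would mask-program
       them. Assumes non-negative coordinates and scroll values. */
    #include <stdint.h>

    #define PF_COLS 64    /* hypothetical playfield size, in tiles */
    #define PF_ROWS 64
    #define TILE    8     /* 8 x 8 pixel tiles */

    static const uint8_t playfield_rom[PF_ROWS][PF_COLS];  /* tile indices  */
    static const uint8_t tile_rom[256][TILE][TILE];        /* tile pictures */

    uint8_t playfield_pixel(int x, int y, int scroll_x, int scroll_y)
    {
        int px = (x + scroll_x) % (PF_COLS * TILE);   /* wrap around the map */
        int py = (y + scroll_y) % (PF_ROWS * TILE);
        uint8_t tile = playfield_rom[py / TILE][px / TILE];
        return tile_rom[tile][py % TILE][px % TILE];
    }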

Motion Objects and Playfields are what made the Atari VCS (later renamed the 2600) possible. The Atari 800 (and 400) used Motion Objects and Scrolling Playfields (implemented in custom ICs) as well as a frame buffer.

The hardware to do Motion Objects and Scrolling Playfields was used for quite a long time, even after frame buffers became common.
I expect that Motion Objects and Scrolling Playfields were used until it became impossible to compete with the graphics hardware developed for PCs, which is what is mostly used now in the (dying) coin-op industry.
But in 1978, there was almost no PC industry. For most people of that time, Lunar Lander and Asteroids were the most advanced interactive computer systems they could actually put their hands on.
The Analog Vector Generator got rid of the stair-stepping produced by the Digital Vector Generator. It was first used in BattleZone (1980), and Tempest (1981) added color.
These systems were also more advanced than what was available in home game consoles, not to mention the budding Personal Computer industry.
So, what happened? How did we lose our technological lead?
I think we lost our technological lead because we were so successful, and here's why.
The first 'video game' that I know of was developed in 1946. (U.S. Patent 2,455,992, "Cathode-Ray Tube Amusement Device," Thomas Goldsmith and Estle Mann.) Although it used a sawtooth circuit, it was essentially an XY game.
The first video game of the modern era (Computer Space) was created in 1971 by Nolan Bushnell and Ted Dabney. Since the first microprocessor (Intel's 4004) was still in the process of being born, the game was a completely hardwired machine. Different operations were performed at different times according to the counter used to produce Vertical Sync. The Motion Objects were stored in a diode matrix. The objects were created by stuffing the diodes in the appropriate holes in the PC board.

Computer Space was not very successful. The next game, Pong, was.

Even so, it didn't register on the radar of the semiconductor companies.
Even when the semiconductor companies started making ICs for the Pong-type games for the home market, the semiconductor industry wasn't very interested in graphics.
When MITS developed the first personal computer (the Altair 8800) in 1975, it used a front panel with lights and switches. (There is a good article on the 8800 at www.vintage-computer.com/altair8800.shtml .)
When other companies eventually came out with personal computers, they had character generators, which are like Motion Objects except you can't move them.

When Apple came out with the very successful Apple II, it had a frame buffer (hurrah!) but no hardware assist; the programmer had to laboriously (and slowly) manipulate the bits himself/herself.


When IBM came out with the IBM PC, it, too, had only a character generator. Other companies developed Frame Buffers for it, but again, with no hardware assist.
When Atari came out with its 400/800 computers, they had Motion Objects, Playfield Memory, and a Frame Buffer. It was a very advanced graphics computer (for its day) but Atari had to develop the custom ICs itself, because the semiconductor companies did not recognize the value of graphics.

The reasons why IBM was successful in the personal computer business, but Atari was not, would take a book of its own. Part of it comes down to marketing: Atari's was bad; IBM's was good. After all, "No one ever got fired for buying IBM," even if IBM didn't design the product, or build it, or write the software for it.

However, because the IBM PC had an open architecture, other companies were able to build better and better graphics cards for it. Eventually these companies were successful enough to design their own graphics ICs.

Well into the 1980s, Atari Coin-op was still able to compete with PCs because
1) We were not hobbled by the PC's 80x86 processors. The first microprocessor that Atari used was MOS Technology's 6502, which was more capable than Intel's 8080. When we started using the Motorola 68010, it was more capable than the Intel 80286, and about on par with the 80386, which came later.
2) Atari started designing its own custom and semi-custom ICs for its coin-op games.
Texas Instruments was the only company that made a serious effort to enter the PC graphics business with the TMS34010, which was a nice 32-bit processor with additional instructions for doing graphics.
TI failed in the PC graphics business, but I used the TMS34010 in the Hard Drivin'/Race Drivin' series of games. Even so, the TMS34010 was not fast enough to do enough polygons without a special trick I developed. Perhaps some day I will write about it.
At some point in the early 1990s, the demand for PC graphics became hot. People wanted the games on their PCs to be as good as the ones in the arcades.
The small companies making graphics cards for PCs became larger companies in an increasingly competitive industry. Because of the competition, many (or most) of these companies either failed or went into other areas, and their places were taken by other companies.
The reason Atari started designing its own graphics ICs was that it was the only way to get the ICs we needed. We were a games company, not an IC company.

We couldn't compete with companies whose only business was designing graphics ICs. At some point this included the entire hardware system, especially when you consider the large numbers of PCs made compared to the small number of custom hardware systems used by a shrinking Coin-Op industry.

One of the main things that fuels the demand for PC graphics is 3D games.

Although other companies made 3D games, Atari was the pioneer. For many people, their first experience with an interactive 3D game was BattleZone. A few years later it was Star Wars. Both were XY. Shortly afterwards came I, Robot, the first 3D polygon game. After that came the Hard Drivin'/Race Drivin' series.
Like I said, we were a victim of our own success.
There are worse things that can happen to a person.

Jed Margolin

San Jose, CA

April 22, 2001

Revised: July 21, 2001; July 20, 2003

________________________________________________________________________________________

Copyright 2001 Jed Margolin
