Technologies:

A PC Guide That Won't, Technically, Make Your Head Explode.

by Rob Cheng and Bill Zahren


Graphics Accelerators
(Zero to 16 million in a millisecond)

You’d think that making all this fancy color stuff appear on the monitor would be just as simple as buying the huge monitor of the apocalypse and plugging it into the back of the box.

Sure, that would be the simple way to go, therefore, it has no place in the PC world. Instead, we say, "make it complicated, please." So we’d like you to meet the video card, often called the "graphics accelerator." We gave it a new title instead of a raise.

Technically, the graphics accelerator is a chip that sits on the video card, but we hate to be technical.

PC people tend to call a lot of stuff "cards" since that’s what they look like: big cards with a bunch of electronic deally-bobs all over them. The cards plug into a connector strip on the motherboard, which we’ll get to later. In a blatant fit of male bashing, PCs do not include any fatherboards. Fine.

Dots, Everywhere Dots
The picture on your PC right now is made up of dots, called pixels in the industry since calling them "dots" would be too straightforward. Anyway, your screen contains a ton of pixels. Probably at least 307,200 pixels, if not 786,432, and each of them can display at least 16 colors. Sometimes monitors can make their 786,432 pixels display 16 million colors. We’re not sure 16 million colors actually exist, but we hate to argue with the tech geeks who swear it’s true.
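If you like arithmetic more than hand-waving, here’s where those numbers come from. This is just a quick sketch: the resolutions are the standard 640 x 480 and 1024 x 768 screen modes, and the color counts fall out of how many bits each pixel gets.

```python
# Where the pixel and color counts come from.
# 640 x 480 and 1024 x 768 were the common screen resolutions of the day.
vga_pixels = 640 * 480           # 307,200 pixels
xga_pixels = 1024 * 768          # 786,432 pixels

# Color counts come from bits per pixel: n bits can encode 2**n colors.
colors_4bit = 2 ** 4             # 16 colors
colors_24bit = 2 ** 24           # 16,777,216 -- the famous "16 million colors"

print(vga_pixels, xga_pixels, colors_4bit, colors_24bit)
```

So yes, the tech geeks are right: 16,777,216 colors really do exist, at least mathematically.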

So you’ve got something like a bazillion different combinations of pixel color, size and resolution. Then throw in motion from scrolling down a screen to playing a video clip on your PC.

It’s the graphics card’s job to control every single pixel on the screen all the time, telling each one what color to turn and how long to stay that color. That’s a lot of information to control, so you can see why the video card always wants to nap when the hard drive and sound card want to run out and play. Probably only the Big Unit (CPU) works harder than the graphics card.

Graphics cards contribute only marginally to the quality of the monitor display in the sense that the optic nerve contributes to your ability to see. Without a good card, your kick-butt monitor becomes a giant, expensive waste of glass and stylish plastic case.

Video Cards: The Lean Years
Back in PC pre-history (the early 1980s), people called graphics accelerators plain old "video cards." They had it easy. Video cards just had to display text on some cheesy monochrome monitor: 80 characters per line, 24 lines (a maximum of 1,920 characters). Puh-lease. In those days, the video card always wanted to screw off after work while the poor CPU and memory (what there was of it) were totally spent and just wanted to go home and go to bed.

But then along came Bill (Gates, not Zahren) who did the digital equivalent of grabbing the industry bull by the snout and flipping it onto its back: Windows.

Now the processor and memory chuckled at the smarty video card who used to stay out all night and come to work with only two hours’ sleep. When Windows first hit the PC, video cards said, and we quote, "AHHHHHHHRG!" and exploded into flames. Suddenly, the video card had a billion pixels to control and 16 colors to manage, not to mention opening and closing those crazy windows.

Around 1991 or 1992, everyone began to yell at their video cards for being a little bottleneck inside the computer. The processor blazed along doing Windows stuff, but the video card couldn’t get it onto the screen as quickly as the processor produced it. Nor could it make the most of the application’s brilliant color and quality.

And you know how computer users get when they have to wait even two seconds for something to happen. Surly.

The Festival of Graphics Cash
The resulting era must have been called the Golden Time inside the graphics card industry. Huge pent-up demand for faster, higher-performing video cards built and built until it nearly ruptured out of the great demand containment area. The great economist, Dr. Supply N. Demand, showed up and prices for burly video cards soared. Card makers made serious cash. But even in 1992 and 1993, graphics cards were still kind of puny and lame. They had improved, but Windows and other apps still loped far out in front of what video cards could handle.

The industry now measured graphics cards by the number of pixels they could redraw in a second (megapixels per second; there is no speed limit). Chips on graphics cards follow roughly the same rule as CPUs: their power can double every 18 months at roughly the same cost.
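To see what megapixels per second buys you, divide by the number of pixels on the screen. The throughput figure below is an invented, illustrative number, not the spec of any real card:

```python
# How "megapixels per second" translates into full-screen redraws.
# The 25 million pixels/second figure is an assumed example, not a real spec.
throughput = 25_000_000          # pixels the card can redraw per second (assumed)
screen = 640 * 480               # pixels in one full 640 x 480 screen

redraws_per_second = throughput / screen
print(round(redraws_per_second, 1))   # about 81 full-screen redraws per second
```

Double the chip’s power every 18 months, per the rule above, and those redraws per second double right along with it.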

By 1993 or 1994, all this color stuff really turned people on, and the CD-ROM drive kicked the door open for multimedia. Graphics cards really started groaning when multimedia showed up. Lights, camera, action, speed. During this time, graphics cards mainly lived on antacids, cigarettes, coffee and diet colas, and they sometimes slept at their desks.

Computer users didn’t even care. "More speed, more color, more motion," we all screamed. We get that way a lot.

So graphics cards continued to grow, until they turned into graphics accelerators with 1MB to 4MB of memory right on the card, and chips that really rock compared to those early video cards.

Drivers Aren’t Just for Limos
But as graphics cards got big, the industry discovered that if you take any trip with hardware, you often need a driver. A driver is a bit of software that allows the chips and electronic doo-dads on the card to talk in a civil and understandable tone with other components to achieve a desired result. Think of drivers as the instructions needed for something to work. Without the right driver to let your video card talk to the software and the monitor, everyone just stands around and says, "Huh?"

For example, if you try to change the resolution of your monitor right now, the computer will sometimes ask you for a driver. No driver, no change. Sorry. Thanks for the effort. Come again when you have a driver. Often driver programs can be downloaded from an electronic bulletin board or Web site operated by the graphics card maker or computer maker. Gateway’s BBS and Web site are littered with new drivers all the time.

The standard, low-end video mode today, VGA (Video Graphics Array), doesn’t need a driver. Programmers write everything at minimum to be compatible with VGA. VGA gives you a resolution of 640 x 480 with 16 colors. You get pretty good color and graphics. But Super VGA (at least 800 x 600 in 16 colors) is better, and a lot of people like to go with 800 x 600 with 256 colors or even that hard-to-grasp 16 million colors, often called "true color."

For that, you need a driver. The problem is, drivers are often the last thing perfected in any new piece of hardware, including graphics cards. They unleash amazing power, but often need to be updated a few times before you can feel confident a driver error won’t send you into a swearing fit when you try to use your cool new hardware.

Power Plus Some Fancy Stuff
When graphics card makers finally got their chips together and got out the Brutus cards that could handle all the wild graphics, they said, "Hey, what about making more cash?" and they turned to adding more function to the cards.

Take the "video scaler," for example, which lets you play video on your PC. Remember that the graphics card controls every single pixel on the screen at all times. Now think about running the motion of video across there at about 30 frames per second. Graphics cards start to smolder just thinking about it.

It’s so much work that often graphics cards can’t handle full-screen video. They have to do a little window within the screen because controlling all 300,000 pixels in 16 million colors at 30 frames per second would give them a silicon stroke.
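The silicon stroke is easy to quantify. Multiply the pixels on the screen by the bytes each one needs and the frames per second, and you get the raw pixel data the card has to shovel:

```python
# Why full-screen, full-color video makes a mid-1990s card smolder:
# the bytes of pixel data it must push out every second.
width, height = 640, 480
bytes_per_pixel = 3              # 24-bit "true color" = 3 bytes per pixel
frames_per_second = 30

bytes_per_second = width * height * bytes_per_pixel * frames_per_second
print(bytes_per_second)          # 27,648,000 bytes -- about 26 MB every second
```

That’s why the card retreats to a smaller window: shrink the width and height and the whole bill shrinks with them.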

A video scaler allows the window to be bigger than it would normally be, if not full screen. The scaler helps the graphics card stretch its power by extrapolating and interpolating and a lot of other -ating that sort of projects the video out to a bigger size. Those crazy software and hardware designers, what will they think of next?

How about 3-D, which does not mean three dimensions in the typical sense. Nobody has to sit in front of the computer wearing those half-red, half-green alien glasses to watch the screen. No, the three dimensions in this deal are foreground, middle ground and background.

3-D is primarily for games because it allows much more realistic movement across the screen. Of course, 3-D drives the heck out of the graphics card, which by now has developed one really huge inferiority/persecution complex.

Also inside the modern graphics accelerator lives what’s called a DAC (Digital to Analog Converter). The DAC determines what resolutions and color amounts the card will support. When you boot your computer, the DAC and monitor go through this cute synchronization thing. The DAC says, "Good morning, Mr. Monitor. I support up to 1024 x 768 resolution in 16 million colors." And the monitor says, "Sorry, DAC, I can only go up to 640 x 480 in 256 colors," and so they settle on the highest one they both support, have some coffee, a donut and get to work.
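That coffee-and-donut negotiation boils down to "pick the best mode we both support." Here’s a toy sketch; the mode lists are invented for illustration, not pulled from any real DAC or monitor:

```python
# A toy version of the DAC/monitor handshake: each side lists the modes
# it supports as (width, height, colors), and they settle on the best
# one they share. Both lists below are invented examples.
dac_modes = {(640, 480, 256), (800, 600, 65536), (1024, 768, 16777216)}
monitor_modes = {(640, 480, 256), (800, 600, 65536)}

# Rank the shared modes by resolution first, then by color count.
best = max(dac_modes & monitor_modes, key=lambda m: (m[0] * m[1], m[2]))
print(best)   # (800, 600, 65536)
```

The set intersection is the "modes we both support" part; `max` with that key is the "settle on the highest one" part. Then coffee.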

To help graphics accelerators do all the stuff we users order them to do, card makers have slapped a bunch of RAM (Random Access Memory) on the card. Graphics cards today have at least 1MB of RAM on them. Most have 2MB. With 2MB, the card can do what’s called "interleaving," which means the chip stores info on one bank of memory while info flies out the second bank. That lets graphics chips work as fast as they can and not be slowed up by the amount of data that can flow out.
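Interleaving is just a ping-pong act between the two banks. Here’s a cartoon of the idea (the chunk names are made up; real hardware does this per memory access, not per labeled frame):

```python
# A cartoon of two-bank interleaving: chunks of data alternate between
# banks, so one bank can be filled while the other is being drained.
chunks = ["chunk-0", "chunk-1", "chunk-2", "chunk-3"]
banks = {0: [], 1: []}

for i, chunk in enumerate(chunks):
    banks[i % 2].append(chunk)   # even chunks to bank 0, odd chunks to bank 1

print(banks)   # {0: ['chunk-0', 'chunk-2'], 1: ['chunk-1', 'chunk-3']}
```

With the traffic split this way, neither bank ever has to do a write and a read at the same moment, which is the whole point.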

Cards On The Table
You’ll need at least 2MB of memory on your graphics card to get to the "true color" level. Some cards have 4MB, but only hard-core users really need that much. If you handle huge spreadsheets or do very graphics-intensive stuff like computer assisted design (CAD), then you’ll need 4MB. But for the rest of us earthlings, 2MB smokes.
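The 2MB figure isn’t arbitrary. A card has to hold one full screen’s worth of pixels in its own memory, and true color costs three bytes per pixel. A quick back-of-the-card calculation:

```python
# How much video memory a screen mode needs: width x height x bytes per pixel.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * (bits_per_pixel // 8)

# 24-bit "true color" at the two common resolutions of the day:
print(framebuffer_bytes(640, 480, 24))    # 921,600 bytes -- squeaks into 1MB
print(framebuffer_bytes(800, 600, 24))    # 1,440,000 bytes -- needs the 2MB card
```

Run the same math at 1024 x 768 and you’re past 2MB, which is where those 4MB hard-core-user cards come in.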

You’ll also want to pay attention to what’s called a "bus." That’s basically the path that takes video signals across the card and out to the monitor. The wider the path, the more can go out at once and the higher quality the video. Most graphics cards have a 64-bit wide bus. You need a 24-bit bus to support true color. But beware: if someone says a graphics card is "16 bit," that either means the bus is 16 bits wide (bad) or that it supports 16-bit color (good). Better ask the salesperson to clear that up.

So, put the pedal to the metal on your graphics accelerator and watch the screen images dance and sing. Just remember to yell, "thanks!" into the back of your PC where the monitor cord goes in. Graphics cards appreciate the support.


Copyright © 1996 Gateway 2000 Inc. All rights reserved.