One of the main disagreements in many amateur game projects lies at the heart of screen resolution and color depth. Graphic artists want more pixels and more colors, while programmers want more speed and less work.
As a programmer, you are in charge of all the technical aspects of your game - it is your job to know when to draw the line. Nothing keeps you from having your game run at a resolution of 1024x768x32, apart from the fact that no one will buy your game.
In the great majority of games, the most intensive task performed is pushing pixels around. If you have a screen resolution of 640x480, that makes 307,200 pixels. In 16-bit color depth, each pixel takes 2 bytes, making 614,400 bytes. That means your game has to "push" about 600 kilobytes of data every time it draws the game screen. Obviously, your game screen has to be refreshed several times per second to give a good illusion of motion - 20 frames per second is a good target. So, a 640x480x16 game will have to "push" 12,288,000 bytes of data every second, or about 12 megs/second.
This is quite attainable even on slower computers, but the numbers rise very rapidly as the screen resolution increases. Put the color depth at 24-bit, and the number rises to 18 megs/second, at which point slower computers (P100-P150) will have a hard time keeping up.
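If you want to run this arithmetic for your own target mode, here's a minimal sketch; the resolution, color depth and framerate below are just the example figures from the paragraph above, not requirements:

```cpp
#include <cstdio>

int main()
{
    // Example figures only - substitute your own targets.
    const double width           = 640.0;
    const double height          = 480.0;
    const double bytesPerPixel   = 2.0;    // 16-bit color
    const double framesPerSecond = 20.0;

    double bytesPerFrame  = width * height * bytesPerPixel;   // 614,400 bytes
    double bytesPerSecond = bytesPerFrame * framesPerSecond;  // 12,288,000 bytes

    printf("%.0f bytes per frame, about %.0f megs per second\n",
           bytesPerFrame, bytesPerSecond / 1000000.0);
    return 0;
}
```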
Television runs at 30 frames per second, while movies run at 24. Having numbers in this range is ideal. 15 frames per second is a minimum: there will be noticeable choppiness, and the game will not feel as responsive. 10 frames per second is definitely too slow.
It's interesting to note that sometimes users don't mind if the game is choppy. When Quake II came out, the great majority of people who played it could barely get 15 frames per second, and a good portion only got 10 frames or lower. But Quake is, of course, the exception, not the rule: people won't like your game if it's choppy.
To show you what kind of performance you can expect from your computer, I've built a small screen benchmark application called ScreenBench. ScreenBench blits a screen-wide image in several display modes, using several blitting techniques, and shows the resulting framerates. I encourage you to download and try ScreenBench on your computer - I'd be very interested in hearing what numbers you get on your machine, so I can compile a database of sorts.
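For reference, the heart of a benchmark like this is not complicated. Below is a rough sketch of how you might time a full-screen DirectDraw blit; the surface setup is omitted, and the names (g_pFront, g_pImage, NUM_FRAMES) are mine, not ScreenBench's actual code:

```cpp
// Sketch only: assumes DirectDraw is initialized, the display mode is set,
// and both surfaces below have been created and filled.
#include <windows.h>
#include <ddraw.h>

LPDIRECTDRAWSURFACE g_pFront;   // primary (visible) surface
LPDIRECTDRAWSURFACE g_pImage;   // off-screen surface holding a screen-wide image
const int NUM_FRAMES = 500;     // arbitrary number of frames to time

double TimeFullScreenBlit()
{
    DWORD start = GetTickCount();
    for (int i = 0; i < NUM_FRAMES; i++)
    {
        // Copy the whole image onto the visible screen.
        g_pFront->Blt(NULL, g_pImage, NULL, DDBLT_WAIT, NULL);
    }
    DWORD elapsed = GetTickCount() - start;         // milliseconds
    return NUM_FRAMES * 1000.0 / (double)elapsed;   // frames per second
}
```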
My system is a Cyrix P166+, with 64 megs of RAM and a 2 meg Matrox Millennium I. The monitor's refresh rate is 85 Hz; the bus speed is 33 MHz. The rest of the system specs don't really matter for this test. As you can see, my equipment is somewhat dated; however, a Pentium 166 is the highest minimum spec I've seen for a 2D game yet, and even that is asking a lot. Any game should try to include at least the P166 in its minimum specs, otherwise you will be cutting yourself off from a very important share of the market. We'll get back to this later.
So, let's look at the first results we get with ScreenBench.
Raw DirectDraw Hardware Blitting Performance, without flipping, in frames per second

|              | 640x480 | 800x600 | 1024x768 |
|--------------|---------|---------|----------|
| 8-bit color  | 371.0   | 208.9   | 137.8    |
| 16-bit color | 185.1   | 110.9   | *        |
| 24-bit color | 121.1   | *       | *        |
| 32-bit color | *       | *       | *        |

* There was not enough video memory on a 2 meg video card to do this operation.
Well, these numbers look pretty amazing, to say the least. 371.0 frames per second, for 640x480x8, is more than 10 times what you get on television; it's more than the eye can perceive. The good news is that if your player has a solid video card with plenty of video RAM, you can get away with great performance in any screen resolution. The bad news is that your player may only have 2 or 4 megs of video RAM. In fact, 2 meg video cards are still being produced today. So, on a 2 meg or 4 meg video card, these numbers are just fantasy: most of your game graphics won't fit in video memory, and most of your blits will be software-based, not hardware-based. This is also why ScreenBench couldn't run the test in several of these modes: there was not enough video memory to fit two screenfuls of data on the video card at the same time.
The time will come when all video cards have plenty of video RAM for you to put most of your game's data on. But right now, the trend of video cards with little video memory is still here (after all, the average user has absolutely no use for video memory apart from games).
Note that there hasn't been any significant breakthrough in 2D blitting for a while now (after all, it's just moving data around), so I think almost everyone should get similar numbers for the previous table, regardless of whether they have the latest Voodoo card or the original Matrox Millennium I.
That being said, let's quit Fantasy Land and get some more down-to-earth numbers.
Raw DirectDraw Software Blitting Performance, without flipping, in frames per second

|              | 640x480 | 800x600 | 1024x768 |
|--------------|---------|---------|----------|
| 8-bit color  | 125.7   | 79.5    | 50.2     |
| 16-bit color | 62.6    | 40.4    | 25.2     |
| 24-bit color | 40.5    | 26.5    | *        |
| 32-bit color | 30.1    | 19.7    | *        |

* This video mode is not available on a 2 meg video card.
Ouch. Now that's a reality check. The software blit is almost 3 times slower than our hardware blit!
This table is where you may see the most difference between systems. Again, the video card doesn't matter much here: it's the CPU and, above all, the bus speed that matter, so if you have one of the newer 100 MHz bus systems, your numbers should be much, much better. Whether the card is AGP or PCI shouldn't matter much either, I believe.
But these are the numbers you should expect from a P166, our minimum spec system. Those numbers aren't half bad, after all - 8-bit and 16-bit speeds are excellent, and 24-bit is even attainable, provided your game doesn't have serious performance bugs.
Alas, even these numbers are still not close enough to reality. All games today implement page flipping, which provides a cleaner, more responsive picture. Unfortunately, flipping is a great performance killer. See below...
Raw DirectDraw Software Blitting Performance, with flipping (double buffering), in frames per second

|              | 640x480 | 800x600 | 1024x768 |
|--------------|---------|---------|----------|
| 8-bit color  | 75.0    | 75.0    | 37.5     |
| 16-bit color | 37.5    | 37.5    | 9.1      |
| 24-bit color | 37.5    | 13.4    | *        |
| 32-bit color | 15.7    | 10.0    | *        |

* This mode is not available on a 2 meg video card.
Hmmm. Our numbers went down again, dramatically in some cases. This is because page flipping forces us to wait for the flip to finish before continuing our blits. And since the flip can only occur during the monitor's vertical sync interval, our framerate is slaved to the monitor's refresh rate. This is why so many numbers repeat themselves in this table: 75.0 is the monitor's refresh rate, 37.5 is half of it, 15 is a fifth, and so on; with page flipping, the framerate can never be higher than the monitor's refresh rate. A lot of idle time is spent waiting for the monitor to "catch up".
The solution is to use triple buffering. Triple buffering involves creating a third buffer, a "backup" buffer that we can work with whenever the two other buffers are busy. You can get a detailed discussion of triple buffering here (not available yet).
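For the curious, the difference between double and triple buffering in DirectDraw boils down to how many back buffers you request when creating the flipping chain. Here's a hedged sketch, assuming DirectDraw is already initialized in exclusive full-screen mode and using variable names of my own:

```cpp
// Sketch: create a primary surface with an attached flipping chain.
#include <windows.h>
#include <ddraw.h>

LPDIRECTDRAW        g_pDD;        // assumed already created, exclusive mode set
LPDIRECTDRAWSURFACE g_pPrimary;
LPDIRECTDRAWSURFACE g_pBack;

bool CreateFlippingChain(DWORD backBufferCount)   // 1 = double, 2 = triple buffering
{
    DDSURFACEDESC ddsd;
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize  = sizeof(ddsd);
    ddsd.dwFlags = DDSD_CAPS | DDSD_BACKBUFFERCOUNT;
    ddsd.ddsCaps.dwCaps    = DDSCAPS_PRIMARYSURFACE | DDSCAPS_FLIP | DDSCAPS_COMPLEX;
    ddsd.dwBackBufferCount = backBufferCount;

    // This call fails when the card doesn't have enough video memory for
    // every buffer - exactly the limitation the table below runs into.
    if (g_pDD->CreateSurface(&ddsd, &g_pPrimary, NULL) != DD_OK)
        return false;

    DDSCAPS caps;
    caps.dwCaps = DDSCAPS_BACKBUFFER;
    g_pPrimary->GetAttachedSurface(&caps, &g_pBack);
    return true;
}

// Each frame: draw into g_pBack, then call g_pPrimary->Flip(NULL, DDFLIP_WAIT);
```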
Raw DirectDraw Software Blitting Performance, with flipping (triple buffering), in frames per second

|              | 640x480 | 800x600 | 1024x768 |
|--------------|---------|---------|----------|
| 8-bit color  | 75.0    | 75.0    | *        |
| 16-bit color | 62.1    | *       | *        |
| 24-bit color | *       | *       | *        |
| 32-bit color | *       | *       | *        |

* There is not enough video memory on a 2 meg video card to set up triple buffering in this mode.
Ah, good. Using triple buffering, we were able to regain most of what we lost, and the speeds we get are comparable to the raw software blit speeds in the second table. But again, there is bad news: on a 2 meg video card, we can only set up triple buffering in 3 modes! In fact, the only mode where we get a performance increase is 640x480x16; triple buffering is simply not worth it in the 8-bit color modes, where we were already maxed out.
So what this all means is that the third table, the slowest one, is the closest to reality. I'll put it here again so you can take another look at it:
Raw DirectDraw Software Blitting Performance, with flipping (double buffering), in frames per second

|              | 640x480 | 800x600 | 1024x768 |
|--------------|---------|---------|----------|
| 8-bit color  | 75.0    | 75.0    | 37.5     |
| 16-bit color | 37.5    | 37.5    | 9.1      |
| 24-bit color | 37.5    | 13.4    | *        |
| 32-bit color | 15.7    | 10.0    | *        |

* This mode is not available on a 2 meg video card.
Stay away from 24/32-bit...
24-bit and 32-bit take a lot of CPU power, so they're not ideal choices for a game's color depth. But there's an even better reason not to use 24 and 32-bit: some video cards support 24-bit, others support 32-bit, but few support both modes. And I'm talking mainstream cards here; here are some results from the ScreenBench survey:
Video cards that support both 24-bit and 32-bit
- Matrox Millennium I & II
- Matrox Mystique
- ATI Expert@Play
Video cards that support 24-bit only
- S3 Virge
- Intel740 (EON)*
Video cards that support 32-bit only
- S3 Trio 64V+
- S3 Vision968 64bit DAC
- Diamond Viper 330
- Creative Labs Graphics Blaster Riva TNT
- Unbranded AT25 + Voodoo Rush
* This odd card reports 40 megs of video memory, which is impossible... Intel has been known to play make-believe tricks before; I don't know what this is all about here, though.
So, basically, if you want a color depth higher than 16-bit, you'll pretty much have to implement both 24-bit and 32-bit, otherwise your game will run only on a subset of cards. I believe this is what Baldur's Gate does: the game's graphics are in 16-bit, and special effects can be rendered in 24 or 32-bit, depending on your video card.
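This is not necessarily how Baldur's Gate handles it, but a simple way to cover both camps is to ask for the depth you prefer and fall back to the other one. A sketch, assuming an initialized IDirectDraw object in exclusive mode:

```cpp
// Sketch: try 24-bit first, fall back to 32-bit if the card refuses it.
#include <windows.h>
#include <ddraw.h>

LPDIRECTDRAW g_pDD;       // assumed already created and in exclusive mode
int g_bitsPerPixel = 0;   // remember which depth we ended up with

bool SetHighColorMode(DWORD width, DWORD height)
{
    if (g_pDD->SetDisplayMode(width, height, 24) == DD_OK)
    {
        g_bitsPerPixel = 24;
        return true;
    }
    if (g_pDD->SetDisplayMode(width, height, 32) == DD_OK)
    {
        g_bitsPerPixel = 32;   // same visuals, one extra byte per pixel
        return true;
    }
    return false;   // neither depth available - fall back to 16-bit
}
```

The blitting code then needs both a 3-bytes-per-pixel path and a 4-bytes-per-pixel path, which is the extra work this section is warning about.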
32-bit = lots of wasted bandwidth...
Many people believe that 32-bit has higher visual quality than 24-bit. This is absolutely false: 32-bit has the same visual quality as 24-bit. The only difference between the two is that 32-bit has an extra byte per pixel, which is often used for alphablending, but which you could use for anything you want. In a real-time game, though, there is simply no point in slowing down your whole game just to get an extra byte of information per pixel, unless you have no choice (i.e. you want 24-bit but the video card only supports 32-bit).
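To make that concrete, here's what a single pixel looks like in each mode; the byte order below (blue first) is just the common layout, and can differ from card to card:

```cpp
// 24-bit: 3 bytes per pixel, 8 bits for each of blue, green and red.
struct Pixel24
{
    unsigned char blue;
    unsigned char green;
    unsigned char red;
};   // 3 bytes

// 32-bit: the exact same 8 bits per channel, plus one extra byte.
// That byte can hold alpha for blending, or it is simply wasted padding.
struct Pixel32
{
    unsigned char blue;
    unsigned char green;
    unsigned char red;
    unsigned char alpha;   // or unused filler
};   // 4 bytes
```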
24-bit is only attainable in 640x480, with optimization...
The only mode where 24-bit achieves good performance is 640x480; at 800x600 it's too choppy. I have not seen a 24-bit graphics game yet, for many good reasons: the increase in visual quality from 16-bit to 24-bit is not as noticeable as from 8-bit to 16-bit, and 24-bit graphics take a huge amount of space in memory. The fact that many video cards don't even support it is another big reason why I advise you to stay away from 24-bit.
16-bit is good in 2 modes, with optimization...
640x480x16 is certainly a valid choice, and many commercial games are now coming out with 16-bit graphics: Baldur's Gate, Get Medieval, etc. 800x600x16 is a bit heavier, so you may be a bit short on CPU juice if you implement a CPU-intensive AI or an isometric map. 16-bit graphics require good optimization to be able to handle heavier-duty things like alphablending, shadowing, etc. 16-bit is pretty much "where it's going" - but if your game has other tasks besides graphics that may require a good deal of CPU juice, don't be afraid to fall back on 8-bit graphics.
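As an example of the kind of optimization 16-bit work calls for, here's the classic trick for a 50/50 alphablend of two 16-bit pixels; the mask assumes the 5-6-5 pixel format, so treat it as a sketch and adapt it to whatever format the video card actually reports:

```cpp
// 50/50 blend of two RGB 5-6-5 pixels without unpacking the channels.
// 0xF7DE clears the lowest bit of each color field, so the two halved
// values can be added without one channel overflowing into the next.
inline unsigned short Blend565(unsigned short a, unsigned short b)
{
    return (unsigned short)(((a & 0xF7DE) >> 1) + ((b & 0xF7DE) >> 1));
}
```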
8-bit is safe, in any mode...
8-bit is safe, slim and fast in all screen resolutions. You'll have to put in a little extra effort to handle palettes, but it's well worth it. You'll have plenty of breathing space with 8-bit graphics, and you'll get good performance even if your game doesn't implement all the latest optimization tricks. A good graphic artist can draw 8-bit pictures that will fool you into thinking they are 16-bit or higher; it's all a matter of choosing the right palette colors. I thought Diablo was 16-bit for some time, until I took a screenshot and realized the file had been saved as an 8-bit color bitmap. 8-bit graphics are great for games that do a lot more than just display pretty pictures ;-).
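That "little extra effort" looks roughly like this in DirectDraw; a hedged sketch, assuming an initialized IDirectDraw object, an 8-bit display mode and an existing primary surface, with variable names of my own:

```cpp
// Sketch: build a 256-entry palette and attach it to the primary surface.
#include <windows.h>
#include <ddraw.h>

LPDIRECTDRAW        g_pDD;        // assumed initialized, 8-bit mode already set
LPDIRECTDRAWSURFACE g_pPrimary;   // assumed created
LPDIRECTDRAWPALETTE g_pPalette;

bool SetupPalette(PALETTEENTRY colors[256])
{
    if (g_pDD->CreatePalette(DDPCAPS_8BIT | DDPCAPS_ALLOW256,
                             colors, &g_pPalette, NULL) != DD_OK)
        return false;

    // Every 8-bit value blitted to the screen is now interpreted
    // through this table of 256 colors.
    return g_pPrimary->SetPalette(g_pPalette) == DD_OK;
}
```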
All the numbers discussed here are theoretical; the ScreenBench program doesn't play any sounds, it doesn't do any pathfinding, and it doesn't have an AI. Plus, your game may have more than one screen's worth of data to blit each frame: since some pictures obviously cover each other up, your game will most likely blit something like one screen and a half per frame. So don't be ashamed if your numbers aren't that high - in many cases, those framerates aren't attainable in a game, not without a great deal of optimization. The numbers here are numbers to aim for - if your game's framerate is close to the framerate given by ScreenBench, then congratulations, you can say your game is running at top-notch performance.
That being said, when beginning your game project, you should evaluate whether your game will do any heavy processing besides graphics, and choose your display mode with this in mind. The type of view your game uses also has a great effect on performance.
Unless you're writing a game that is only meant for your own personal enjoyment, you want your game to be played by someone else. This means that someone, somewhere, will have to enjoy your game enough to shell out the thirty to fifty bucks to buy it. But there's a good chance that no one will find your game enjoyable if it runs at one pixel per hour on their machine. This is the bottom line when choosing your screen resolution: you can choose anything you want, but aiming for the lowest common denominator will give you a lot more exposure.