Detonator Battle! 5.32 vs 6.18
Sep 17th, 2000 | By Archive
Date: 09/17/00 – 03:30:43 AM
Category: Drivers
When NVIDIA’s 6.18 Detonator 3 reference drivers were released the other day (along with a simultaneous announcement of the GeForce2 Ultra), I knew I had to have them. I had been hearing great things about the Win9x-only 6.16 and earlier 6.x drivers, so when I found out that the new ones were out for Windows 2000, I jumped at the chance. They were even out for NT 4.0! I installed them and played for a few, not noticing a huge performance jump, but everything was definitely running smoothly.
When I had first installed the card, along with the drivers it shipped with (<5.30), I was getting strange banding lines on the desktop as well as in games, and I couldn't figure it out. I don't remember exactly what version those drivers were, but they only recognized the card as "NVIDIA NV11". As we all know, that was the codename for the GeForce2 MX before it came out. I decided to do a little surfing to see if I could come up with some newer Win2k drivers. Luckily, the 5.32 drivers were right on the NVIDIA website, so I downloaded and installed them. Pretty much all of the problems were fixed, though every now and then I thought I could still catch a glimpse of some banding lines. Whether or not I actually saw them, it was only a couple more days before the 6.18 drivers came out, and then everything was peachy keen.
I'm sure you don't want to read all of this intro, even though it's about wonderful me, so here are the benchmarks. I've bunched them all together into one big picture because it's just easier this way. Easier for me, anyway. Quake3 1.17 was benchmarked with demo001.dm2. The system settings were the same as last time, so I'll copy and paste the table I made last time for use here:
| System Settings | |
|---|---|
| GL Driver | Default |
| GL Extensions | On |
| Resolution | 512×384 to 1280×1024 |
| Color Depth | 16/32-bit |
| Fullscreen | On |
| Lighting | Lightmap |
| Geometric Detail | High |
| Texture Detail | Highest |
| Texture Quality | 32-bit |
| Texture Filter | Trilinear |
| **Graphic Options** | |
| Simple Items | Off |
| Marks on Walls | On |
| Ejecting Brass | On |
| Dynamic Lighting | On |
| Identify Target | On |
Ok, here’s the table. I’ll type out the exact numbers below just for you (1 or 2?) people. By the way, just in case you didn’t read the review I wrote, the default frequencies for this card are 175MHz/166MHz, and the overclocked frequencies are 220MHz/200MHz. If you need specs on my computer, go to the My Computer page.
| Resolution | 5.32 | 5.32 O/C | 6.18 | 6.18 O/C |
|---|---|---|---|---|
| 512x384x16 | 120.5 | 121.0 | 120.4 | 120.5 |
| 512x384x32 | 111.2 | 117.2 | 120.1 | 120.8 |
| 640x480x16 | 112.3 | 119.2 | 118.5 | 120.6 |
| 640x480x32 | 94.1 | 104.0 | 110.9 | 116.9 |
| 800x600x16 | 97.1 | 108.0 | 103.8 | 112.8 |
| 800x600x32 | 64.3 | 78.3 | 82.9 | 98.0 |
| 1024x768x16 | 64.1 | 79.9 | 71.8 | 87.5 |
| 1024x768x32 | 40.7 | 49.7 | 53.2 | 64.8 |
| 1280x1024x16 | 42.1 | 52.2 | 46.0 | 55.8 |
| 1280x1024x32 | 24.0 | 29.2 | 32.1 | 39.8 |
As you can see, the overclocked 6.18 drivers give an astounding performance gain. As with the 5.32 drivers, it’s 32-bit performance that benefits most from the more efficient code. Here’s something I want to try: comparing how much additional performance overclocking buys under each driver set. In other words, what percentage of extra performance do you get from overclocking at, say, 1024x768x32 with each set of drivers? If you don’t get it, here’s the table I’ve made.
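For the curious, those per-driver overclocking percentages fall straight out of the benchmark table above. Here's a minimal sketch of the calculation, using only the fps scores already listed (nothing here is new data):

```python
# Percentage gain from overclocking, per driver set, using the
# Quake3 demo001.dm2 scores (fps) from the benchmark table above.
scores = {
    # resolution: (5.32, 5.32 O/C, 6.18, 6.18 O/C)
    "512x384x16":   (120.5, 121.0, 120.4, 120.5),
    "512x384x32":   (111.2, 117.2, 120.1, 120.8),
    "640x480x16":   (112.3, 119.2, 118.5, 120.6),
    "640x480x32":   ( 94.1, 104.0, 110.9, 116.9),
    "800x600x16":   ( 97.1, 108.0, 103.8, 112.8),
    "800x600x32":   ( 64.3,  78.3,  82.9,  98.0),
    "1024x768x16":  ( 64.1,  79.9,  71.8,  87.5),
    "1024x768x32":  ( 40.7,  49.7,  53.2,  64.8),
    "1280x1024x16": ( 42.1,  52.2,  46.0,  55.8),
    "1280x1024x32": ( 24.0,  29.2,  32.1,  39.8),
}

def oc_gain(stock, oc):
    """Percent improvement from overclocking at a given resolution."""
    return (oc - stock) / stock * 100

for res, (d5, d5oc, d6, d6oc) in scores.items():
    print(f"{res:>12}  5.32: +{oc_gain(d5, d5oc):4.1f}%  6.18: +{oc_gain(d6, d6oc):4.1f}%")
```

At 1024x768x32, for example, this works out to about a 22.1% gain for the 5.32 drivers versus 21.8% for 6.18.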
I’ve been thinking about what this means, and when you look at the numbers, it almost looks like the 6.18 drivers have stepped each higher resolution/bit-depth up to the equivalent of the next lower one. We see that at the low resolutions, the 5.32 drivers really use every bit of overclocking they can. Although the real gain isn’t absolutely huge (more like about 5fps vs. 1fps), it’s still something to think about. Only at the highest resolution and bit-depth do the 6.18 drivers get a better percentage gain. Looking at this graph can easily fool you, though: the percentage gain is mostly larger with the 5.32 drivers, and yet most of the non-overclocked 6.18 scores are still higher than the overclocked 5.32 scores. So take this graph with a little interest, but don’t get caught up in it.
Conclusion
Let’s go through this a little, why don’t we? It’s pretty obvious from here that if you have a GeForce2 MX and Windows 2000, there is no reason on earth other than perverse nostalgia to keep the older Detonator 2 drivers (5.x). There is a very large performance gain using the new Det3 drivers, and the gain grows even larger when overclocking with them. Let’s look for a second: at 800x600x32, the non-overclocked 5.32 drivers get a score of 64.3, while the overclocked 6.18 drivers get 98.0. That right there is 33.7 fps! Crikey, I’m sure most people will be able to sense a little performance boost there. That’s the largest numerical gain; the highest percentage gain actually comes at 1280x1024x32, where 24.0 fps becomes 39.8 fps, about 66% more performance. Let’s just say that I’m pretty happy with this. I don’t really have anything else to say, so I’ll just let you look through the graphs some more.
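For anyone who wants to check the conclusion's arithmetic, the stock-5.32 versus overclocked-6.18 gaps at the 32-bit resolutions work out like this (scores taken straight from the benchmark table above):

```python
# Gain of overclocked 6.18 over stock 5.32 at the 32-bit resolutions,
# using the fps scores from the benchmark table above.
stock_532 = {"800x600x32": 64.3, "1024x768x32": 40.7, "1280x1024x32": 24.0}
oc_618    = {"800x600x32": 98.0, "1024x768x32": 64.8, "1280x1024x32": 39.8}

for res, stock in stock_532.items():
    delta = oc_618[res] - stock          # absolute fps gained
    pct = delta / stock * 100            # relative gain
    print(f"{res}: +{delta:.1f} fps ({pct:.0f}% faster)")
```

This prints +33.7 fps (52%), +24.1 fps (59%), and +15.8 fps (66%) respectively, so the biggest absolute jump is at 800x600x32 while the biggest relative jump is at 1280x1024x32.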
Do you want to see my spread?…sheet that is! Get your mind out of the gutter and click here.