I did a driver comparison using Fraps in CoD Black Ops 2 multiplayer: driver 326.84 vs driver 326.80. I was testing because I couldn't figure out why I seemed to be doing much better in the game while using driver 326.84.
Test setup:
GTX 680 at default clocks, Core i7-4770K (Haswell) CPU at 3.9 GHz, Windows 8.1 Pro Preview 64-bit
In-game AA set to 4x MSAA, resolution set to 1920 x 1080, AO and DoF set to Off, MaxFPS set to Unlimited.
Benchmark was run for 60 seconds in map Hijacked with just me running around. Results:
Frames  MinFPS  AvgFPS   MaxFPS  Driver / Run
21044   278     350.733  439     326.84, 1st run
21454   279     357.567  429     326.84, 2nd run
21170   278     352.833  432     326.80, 1st run
21476   282     357.933  438     326.80, 2nd run
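As a sanity check on the numbers above: over a 60-second run, Frames / 60 should equal AvgFPS, and averaging the frame counts per driver shows how small the gap really is. A minimal Python sketch (run data copied from the table; the variable names are just for illustration):

```python
# Each run: (driver, run number, Frames, MinFPS, AvgFPS, MaxFPS) from Fraps.
runs = [
    ("326.84", 1, 21044, 278, 350.733, 439),
    ("326.84", 2, 21454, 279, 357.567, 429),
    ("326.80", 1, 21170, 278, 352.833, 432),
    ("326.80", 2, 21476, 282, 357.933, 438),
]

DURATION_S = 60  # benchmark length in seconds

# Cross-check: total frames divided by duration should match AvgFPS.
for driver, run, frames, _, avg, _ in runs:
    assert abs(frames / DURATION_S - avg) < 0.01, (driver, run)

# Mean frame count per driver, and the relative difference between them.
frames_84 = sum(r[2] for r in runs if r[0] == "326.84") / 2
frames_80 = sum(r[2] for r in runs if r[0] == "326.80") / 2
delta_pct = 100 * (frames_80 - frames_84) / frames_84
print(f"326.84: {frames_84:.0f} frames, 326.80: {frames_80:.0f} frames, "
      f"delta {delta_pct:+.2f}%")
# → 326.84: 21249 frames, 326.80: 21323 frames, delta +0.35%
```

A gap of roughly a third of a percent in total frames is well inside run-to-run noise for a multiplayer map.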
Analysis of results:
1) The differences in MinFPS, AvgFPS and MaxFPS are not significant; they fall within normal run-to-run variance.
2) The 326.80 driver showed a slight tendency to process more frames during the testing.
3) The 326.80 driver had better image quality (IQ): the game looked noticeably better than it did with the 326.84 driver.
When I got more kills in the game using the 326.84 driver, the enemy players seemed to think I was lagging a bit, judging by their reactions to my movements. I'm wondering whether I was scoring better with the 326.84 driver because fewer frames were being rendered per network packet I sent, or simply because of the worse-looking graphics.