What were the tests that WinG did to evaluate video cards?

Raymond Chen

Georg Rottensteiner was curious about the weird things that WinG performed on installation to evaluate video cards. “What did it actually do, and what for?” I don’t actually know, since I was not involved in the WinG project, but I remember chatting with one of the developers who was working on video card benchmarks. He said that video card benchmarks are really hard to develop, not just because video cards are complicated, but also because video drivers cheat like a Mississippi riverboat card sharp on a boat full of blind tourists.

He discovered all sorts of crazy shenanigans. For example, he found a video driver that compared the string you asked it to display with the text “The quick brown fox jumps over the lazy dog.” If the string matched exactly, the driver returned without drawing anything three quarters of the time. The reason: benchmarks often use that sample string to evaluate text rendering performance. The driver vendors realized that the fastest code is code that doesn’t run, so by ignoring three quarters of the “draw this string” requests, they could improve their text rendering performance numbers fourfold.

That was the only one of the sneaky tricks I remember from that conversation. (I didn’t realize there was going to be a quiz 17 years later, or I’d have taken notes.) Another example of benchmark cheating, this one from a different source, was a driver that checked whether the program name was TUNNEL.EXE and, if so, enabled a collection of benchmark-specific optimizations.
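
The fox-string cheat is easy to picture in code. Here is a minimal sketch, not taken from any real driver: the entry point name DrvTextOut, its signature, and the call-counting scheme are all invented for illustration, since the story doesn’t say how the driver decided which three quarters of the requests to drop.

```c
#include <string.h>

/* The classic benchmark string the driver watched for. */
#define BENCHMARK_STRING "The quick brown fox jumps over the lazy dog."

/* Stand-in for the driver's honest rendering path. */
static int ReallyDrawText(const char *text, int x, int y)
{
    (void)text; (void)x; (void)y;
    /* ... actual rasterization work would go here ... */
    return 1;
}

/* Hypothetical text-output entry point (name and signature invented). */
int DrvTextOut(const char *text, int x, int y)
{
    static unsigned callCount;

    if (strcmp(text, BENCHMARK_STRING) == 0) {
        /* One way to drop three of every four matching requests:
           count calls and honor only every fourth one. The other
           three report success without drawing anything, so the
           benchmark measures a fourfold "speedup". */
        if (++callCount % 4 != 0)
            return 1; /* claim success, draw nothing */
    }

    return ReallyDrawText(text, x, y);
}
```

The trick works precisely because a benchmark only times the calls; it rarely verifies that every string actually landed on the screen.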

Anyway, I suspect that the weird things the WinG installer did were specifically chosen to be things that no video card driver had figured out a way to cheat, at least at the time the tests were written. I wouldn’t be surprised if, fifteen seconds after WinG was released, video driver vendors started studying it to see how they could cheat the WinG benchmark…
