3D Mark 2003 Under Fire
OmnipotentKiwi
2003-03-03, 04:59 PM
Don't know if many have read this yet, but it has been around for a few weeks: 3D Mark 2003 (http://www4.tomshardware.com/column/20030219/index.html)
Basically, NVidia openly attacked 3D Mark 2003, saying it's a poor benchmark because it's designed around "normal" (generic) DX9 code.
They go on to state that games never ship "normal" code; game code is always optimized.
There's a lot of info from both sides, plus commentary from impartial third parties, and ATI supports the new 3D Mark (well duh, they did get high scores).
Basically, what it came down to is this (see the sketch after the list):
Game code: Optimized
DX9 in 3D Mark 2003: "Normal"
ATI Core: "Normal"
NVidia Core: Optimized
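To make that distinction concrete, here's a minimal C++ sketch of the difference (my own illustration, not from the article; the PCI vendor IDs are real, but every function name here is hypothetical):

#include <cstdint>
#include <iostream>

// Real PCI vendor IDs; everything else in this sketch is illustrative.
constexpr uint32_t kVendorNvidia = 0x10DE;
constexpr uint32_t kVendorAti    = 0x1002;

// A synthetic benchmark aiming for vendor neutrality runs one "normal"
// DX9 path on every card -- which is exactly what NVidia objected to.
void renderFrameGeneric() {
    std::cout << "Generic DX9 path: identical code on every card\n";
}

// A shipping game, by contrast, usually detects the adapter and runs a
// path tuned to that vendor's hardware (instruction ordering, precision
// hints, texture stage usage, and so on).
void renderFrameOptimized(uint32_t vendorId) {
    switch (vendorId) {
        case kVendorNvidia: std::cout << "NVidia-tuned path\n"; break;
        case kVendorAti:    std::cout << "ATI-tuned path\n";    break;
        default:            renderFrameGeneric();  // unknown card: fall back
    }
}

int main() {
    renderFrameOptimized(kVendorNvidia);  // what a game would do
    renderFrameGeneric();                 // what 3D Mark 2003 does
}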
After reading the whole thing, I feel that even though ATI comes out on top by far in testing, it really doesn't say much of anything. 3D Mark 2003 does expose the flaw in the NVidia core, but how much impact that will have on actual gaming isn't shown by this test at all.
3D Mark's makers also state that the first DX9 game won't even be out for 6+ months, so this test doesn't reflect much of anything yet, especially not Planetside.
It should also be noted that the next NVidia core (the FX one) is also built for optimized rather than "normal" code, so this test is again fairly biased.
powdahound
2003-03-03, 06:11 PM
It has been known for quite a long time that although ATI cards usually score higher in benchmarks, they don't do as well as nvidia cards in-game. You can't really compare the 9700 to anything that's mainstream from nvidia at the moment, though. :p
Zatrais
2003-03-03, 06:30 PM
Hehe, kinda amusing, isn't it... Of course Nvidia doesn't approve of it, they get owned in it, and ATI loves it 'cause they score high... It also kinda exposes the GF core and how old it is; there hasn't been any new core for Nvidia cards in the GF series... The FX core is new, though, and copes very well with the 2003 tests.
Arshune
2003-03-03, 06:34 PM
Quite frankly, I don't care how well ATI does in performance tests; their track record precedes them (not to mention their cards were responsible for EVERY blue screen I ever saw on my old PC). Strangely, the first nvidia card I ever bought works flawlessly... very unusual... ;)
SpaceDrake
2003-03-03, 10:18 PM
But then, of course, another problem is the fact that THEIR CRAP-ASS CARD ISN'T DESIGNED TO DO ADVANCED SINGLE-TEXTURE FILLING AND FUTUREMARK REFUSED TO BEND OVER BACKWARDS FOR THEM:
http://www.theinquirer.net/?article=7920
Forgive the bold caps. But after this and the crappy job they've done on the 4x.xx series of Dets, I'm quite sick of NVidia. My next purchase will likely be one of the advanced Radeon cards (probably an R350-based one).
Lexington_Steele
2003-03-03, 10:24 PM
However, you can't deny that Nvidia cards do outperform ATI cards when benchmarked in some games. So obviously there are coding scenarios where Nvidia cards are better.
Why would Nvidia give full backing to any benchmarking tool that did not put them on top, when there are benchmarking situations where Nvidia cards outright outperform ATI cards? I see it as simple business sense. If Nvidia came out on top with a particular benchmarking program, you can rest assured that ATI would not fully back that program either.
Derfud
2003-03-03, 11:21 PM
I'm sure Nvidia is just jealous that ATi is actually doing well. Plus they have 0 tech support for their drivers. You can only talk to your card manufacturer, but they don't have driver support :/
OmnipotentKiwi
2003-03-04, 05:07 PM
Originally posted by Derfud
I'm sure Nvidia is just jealous that ATi is actually doing well. Plus they have 0 tech support for their drivers. You can only talk to your card manufacturer, but they don't have driver support :/
I've never had driver issues with NVidia cards personally.
And as has been stated, who gives a damn if the NVidia core can't do Advanced Single-Texture Filling if games NEVER actually use it?
Also, the ATI card runs better at 24-bit, which is where a lot of "normal" DX9 processing takes place, but when was the last time you played a game in 24-bit?
Last I checked it was 32-bit. :p
Zatrais
2003-03-04, 06:11 PM
That's the Z buffer... they prefer a 24-bit Z buffer over a 16-bit one...
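To untangle the numbers a bit: the 32-bit you pick in a game's display options is color depth, and it's really 24 bits of RGB plus 8 bits of alpha/padding; the 24-bit figure usually refers to the depth (Z) buffer precision that rides alongside it (X8R8G8B8 color with a D24S8 depth-stencil buffer, in Direct3D terms). DX9 also happens to define 24-bit floating point as the minimum for full-precision pixel shaders, which is likely what the "normal DX9 processing" remark was getting at. A tiny C++ sketch of the packing (my own illustration, not from any post):

#include <cstdint>
#include <cstdio>

// "32-bit color" is 24 bits of RGB plus 8 bits of alpha/padding, the same
// way a "32-bit" D24S8 buffer is 24 bits of depth plus 8 bits of stencil.
uint32_t packXRGB8888(uint8_t r, uint8_t g, uint8_t b) {
    return (0xFFu << 24) | (uint32_t(r) << 16) | (uint32_t(g) << 8) | uint32_t(b);
}

int main() {
    uint32_t pixel = packXRGB8888(255, 128, 0);  // only 24 bits carry color
    std::printf("pixel = 0x%08X (top byte is alpha/padding)\n", pixel);
}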
Arshune
2003-03-04, 06:17 PM
Let's put it this way...ATI is better if you want to put up with a load of garbage from them and take the time to make sure everything is just right for their card. Nvidia cards just get plugged in and forgotten about, periodically remembered for driver updates. Pick yer poison. :D
Zatrais
2003-03-04, 06:19 PM
If ATI is so horrible now, then why are they stealing so many of Nvidia's partners? Like MSI and Microsoft (you can hate them all you want, but in the business world they're a heavy hitter)...
Hamma
2003-03-04, 08:43 PM
People always root for the underdog.