nVidia, Futuremark Exchanging Blows

It seems that the Futuremark debate and the so-called driver optimization war is still going strong. Following a brief truce, both sides have resumed the campaign, with Futuremark initiating the current bout of aggression with a 3DMark03 patch released on Tuesday. The company was careful not to point any fingers but claimed that the new patch disables all driver optimizations. Finger pointing, however, cannot be avoided when reviewing the way the patch affected 3DMark03 scores. nVidia boards experienced drops of up to 26.4 per cent, while ATI's offerings saw more performance increases than decreases, with the increases peaking at 2.4 per cent. An example of how badly nVidia results were affected is the 5950 Ultra, which scored 5886 3DMarks before the patch and 5081 afterwards, a drop of roughly 13.7 per cent.

So, if we were to accept Futuremark's statements about the patch, nVidia was clearly the company utilising driver optimizations. nVidia, of course, replied with a statement, only for Futuremark to respond with one of its own. The exchange will likely go on for some time, and those without a relevant PhD will not get much out of it. No one can be blamed, however, for taking some perverse enjoyment from the mess.

Here is the full nVidia statement:

With the introduction of the GeForce FX - we built a sophisticated real-time compiler called the Unified Compiler technology. This compiler does real-time optimizations of code in applications to take full advantage of the GeForce FX architecture.

Game developers LOVE this - they work with us to make sure their code is written in a way to fully exploit the compiler.

The end result - a better user experience.

One of the questions we always get is: what does this compiler do? The unified compiler does things like instruction reordering and register allocation. The unified compiler is carefully architected so as to maintain perfect image quality while significantly increasing performance. The unified compiler is a collection of techniques that are not specific to any particular application but expose the full power of GeForce FX. These techniques are applied with a fingerprinting mechanism which evaluates shaders and, in some cases, substitutes hand-tuned shaders, but increasingly generates optimal code in real-time.

Futuremark does not consider their application a "game". They consider it a "synthetic benchmark". The problem is that the primary use of 3DMark03 is as a proxy for game play. A website or magazine will run it as a general predictor of graphics application performance. So it is vital that the benchmark reflect the true relative performance of our GPUs versus competitors.

And, while they admit that our unified compiler is behaving exactly the way it behaves in games and that it produces accurate image quality, they do not endorse the optimizations for synthetic use. Hence, Futuremark released a patch that intentionally handicapped our unified compiler.

So, we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.
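As a rough idea of what such a fingerprinting mechanism might look like, the following is a minimal, purely illustrative Python sketch of a fingerprint-and-substitute scheme. nVidia has not published its implementation; the hash choice, table and function names below are assumptions made for the example only.

```python
# Illustrative sketch only: a driver-style "fingerprint and substitute" scheme
# of the kind described in the statement above. Everything here is invented.
import hashlib

# Table mapping a fingerprint (hash of the incoming shader text) to a
# hand-tuned replacement shader believed to be mathematically equivalent.
HAND_TUNED = {}

def fingerprint(shader_source: str) -> str:
    """Reduce a shader to a short identifier by hashing its exact text."""
    return hashlib.sha1(shader_source.encode()).hexdigest()

def register_hand_tuned(original: str, replacement: str) -> None:
    """Record a hand-tuned substitute for a known shader."""
    HAND_TUNED[fingerprint(original)] = replacement

def generic_optimize(shader_source: str) -> str:
    """Stand-in for generic optimizations (instruction reordering,
    register allocation) that would apply to any shader."""
    return shader_source  # no-op in this sketch

def compile_shader(shader_source: str) -> str:
    """If the shader is recognised, swap in the hand-tuned version;
    otherwise fall back to the generic real-time path."""
    key = fingerprint(shader_source)
    if key in HAND_TUNED:
        return HAND_TUNED[key]           # application-specific fast path
    return generic_optimize(shader_source)
```

The point to notice is that the fast path only fires when the incoming shader text matches a known fingerprint exactly, which is what makes this kind of mechanism sensitive to the change Futuremark describes below.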

And here is the Futuremark reply:

3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable a GPU compiler!

The only change in build 340 is the order of some instructions in the shaders or the registers they use. However, the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

Let's also repeat that 3DMark specific driver optimizations are strictly forbidden in our run rules because they invalidate the performance measurement and the resulting score is not comparable to other hardware.

Thus, the right conclusion is that the new version of 3DMark03 is now very suitable for objective performance measurement between different hardware. Futuremark's Benchmark Development Program is an open program where we also work a lot with drivers and actually help IHVs in getting more performance via generic enhancements that benefit all applications. We would like to invite you to apply to the program.
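Futuremark's argument about reordered but mathematically equivalent shaders can be illustrated with the same sketch. The two pixel-shader fragments below are invented for the example; they compute the same value, but because the text differs, a hash-based fingerprint no longer matches and any substitution keyed to the old shader simply never fires.

```python
import hashlib

def fingerprint(shader_source: str) -> str:
    # Same hash-of-the-exact-text idea as in the sketch above.
    return hashlib.sha1(shader_source.encode()).hexdigest()

# Two made-up, mathematically equivalent shader fragments: the second only
# reorders independent instructions and renames a temporary register.
SHADER_OLD_BUILD = """\
mul r0, v0, c0
add r1, v1, c1
mad r0, r1, c2, r0
"""

SHADER_NEW_BUILD = """\
add r2, v1, c1
mul r0, v0, c0
mad r0, r2, c2, r0
"""

# Same mathematics, different text, so the fingerprints no longer match.
print(fingerprint(SHADER_OLD_BUILD) == fingerprint(SHADER_NEW_BUILD))  # False
```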

Comments

nVidia, Futuremark Exchanging Blows

Who gives a f**k if you got first post or if Nvidia or ATI suck. F**ken geeks, f**ken fanboys, just go for the card that performs best for the games you want to play. And stop jumping to f**ken conclusions before doing a bit of research and finding out why. Arg, some of you guys are the biggest d**ks.

nVidia, Futuremark Exchanging Blows

"oh boy oh boy, i got the first post", f**k you dumbass, nobody f**king cares goddamnitnow as for the topic at hand, futuremark can suck my c**k, nvidias putting out the beefiest shit on the market, and futuremarks trying to f**k them in the ass on it, those c**kbiting f**ktards

nVidia, Futuremark Exchanging Blows

It's lame to attack nVidia drivers; there seems to be nothing wrong with releasing drivers that allow a card to perform at its best. Cheaters, what? It's just a dumb excuse all the ATI fanboys came up with because AT THE TIME ATI's drivers sucked. I really mean it; what does it really matter if drivers have been tweaked for performance? That's like using your low beams at night while driving in total darkness when you have the ability to switch to your brights, but then all the ATI fanboys hop out like "oh no! you're optimizing your light so you can see better! That's cheating!" (ehm... oh no, you're optimizing your driver code so it runs smoother, that's cheating!)

nVidia, Futuremark Exchanging Blows

Futuremark sucks, ATI and nVidia both rule. 3DMark03 is a "synthetic benchmark"; funny thing is, synthetic means not natural, or in other words not made with natural materials and more chemical imbalances. Futuremark, give up, you are only corrupting the graphics world with false information about your program. You suck and that's all there is to it.

nVidia, Futuremark Exchanging Blows

Y'know, it would seem that NVIDIA is putting some really cool features into their architecture. This unified compiler really sounds cool. Also, NVIDIA has the distinction of creating the first "Programmable GPU". Remember the old definitions: a GPU is a chip that takes the geometric calculations, done in most cases by the CPU, that turn 3D objects or environments into 2D coordinates for display, and turns them into graphics; a VPU (as in ATi's chips [and a few other companies' chips at the time, by the way, not just ATi's]) is described as a chip that can do all of the complex geometric calculations that a CPU can do (including, and perhaps most interestingly, shader coordinates) plus integrates a rasterization engine that turns those calcs into graphics. NVIDIA was the first to really change this, with the GeForce 3, which integrated a programmable shader engine (i.e. a GPU could now process shader coordinates like a CPU or VPU). The trouble is, none of these "innovations" seem to help. Perhaps the most troubling bit is the "Glide"-like feel of the CineFX engine (NVIDIA's shader engine), which requires a completely separate line of programming code for full use. This means that beyond what they have already written, game developers must also compile a whole DIFFERENT version again, so that CineFX can use it. This while hardware from the other guy (in this case ATi; in the case of 3Dfx, the proprietors of Glide, it was mostly nVidia) uses the code that is compiled originally for use in the game, in this case (3DMark03) the version using the original and true DirectX 9.0 APIs. Hmmm... The trouble with this is that NVIDIA, who is used to totally ruling the market, much like 3Dfx did way back when, is not 3Dfx. Or, at least, not what 3Dfx was in its heyday. It is not in a situation where its way is the only way. It's not even in a situation where its way is the best way, or even the intended way. While ATi hardware runs the original, unoptimized DirectX 9.0 API nearly flawlessly (despi

nVidia, Futuremark Exchanging Blows

Sorry folks, my post ran a little long. Here is the rest of it, starting, appropriately, where my last one ends: (despite rumor to the contrary which, let's face it, was spread by none other than NVIDIA itself) and with greater performance than NVIDIA's parts dare boast, NVIDIA's parts require additional programming, and in some cases image alteration (which should NEVER be), to even approach the type of performance we see with ATi hardware. So, in this way, we can safely say that "the way it's (truly) meant to be played" is with ATi (which just happens to run the correct [and intended] API). Sorry for any inconvenience or undue confusion this may cause (I know, I know, your poor tortured existence, heh heh.) Enjoy!

nVidia, Futuremark Exchanging Blows

Actually, Staticpurge, what is wrong here is that NVIDIA is using benchmark-specific optimizations. That is, optimizations that improve performance in a benchmark but have no real bearing on actual gameplay. It would be great if all NVIDIA were doing was using drivers to give us all a better experience, but instead NVIDIA, who obviously knows the influence the media has on which cards we buy, has tweaked driver performance for a few benchmarks the media uses commonly. That IS cheating.

nVidia, Futuremark Exchanging Blows

You Nvidia fanboys can suck my balls, ATI is the BEST. Just look at all the problems with Nvidia: Half-Life 2, 3DMark03, and the list goes on. And what has ATI got problems with? Well, not much; sure, they have a few minor problems, but they still kick Nvidia's ass. Nvidia sucks. Also, ATI drivers are fine, they put out new ones every month and they always improve game performance. As for Nvidia, they put out tons of crap drivers, then none at all, and finally come out with some poor piece of crap "Forceware". Get real. ATI forever, Nvidia SUCKS.

nVidia, Futuremark Exchanging Blows

Thanks a bunch, Killboy. Good to know I'm appreciated. Wish you would've included your email though, bro. By the way, I couldn't agree more (about the fanboy thing, anyway; although I too enjoy my posts, it's mostly the posting of same that I get my jollies from, LOL).

nVidia, Futuremark Exchanging Blows

Both cards have their problems... Nvidia with Half-Life 2 blows. ATI with Doom 3 and Splinter Cell BLOWS! It all depends on the game and which card manufacturer the game developers specifically write for, in my opinion... Nvidia have always been reliable with their driver support and games support... Why give them shit for being a slight "under-performer" compared to ATI... It don't make sense...

nVidia, Futuremark Exchanging Blows

Hell, I got a Radeon 9600 256MB card and I haven't seen any heating. My card stays at a nice 34C even OCed up to a 15% OC, my PC runs a nice 31C at a nice CPU temp of 37C at full load, and my CPU is OCed 15% also. I get great FPS in Splinter Cell and the Doom 3 alpha demo. Don't get where you guys are getting your rumors from, ATI haters; never used an ATI card, so don't judge them until you have used them. I can't say shit about nVidia because I never used their cards; all I know is ATI runs all the games I want to play great, and that's all that matters. I don't have to OC to get great FPS, but I do it for bragging rights, and I only have 3 case fans, a power fan and a Volcano 9 with a ducting mod on my CPU. So shut up, please stop the bitching and buy the card you want to buy. I don't give a shit, let other people decide what they want. And yes, I know I didn't put any periods in, and I don't care if I spell shit right.

nVidia, Futuremark Exchanging Blows

I think nVidia got the upper hand here; I completely agree with nVidia on this one. Their drivers are optimized for this kind of shit, why take away the performance? Their drivers are the best in the business!!!! That compiler code helps with its performance; we like to call it DRIVERS!!! Why not take away all the drivers and let everything run on default drivers, SHIT!!! Then we'll see...

nVidia, Futuremark Exchanging Blows

Did you not read the article, Psych?! This patch HARDLY disables drivers. You've gotta be jokin' with that one. All it does is put the shader code in a slightly different order so that nVidia's past driver optimizations, which were tuned for 3DMark03's original shader code order, don't work, thus making it impossible for nVidia's past driver cheats, which were based on an older 3DMark03 version, to work! That's ALL! Moron.

nVidia, Futuremark Exchanging Blows

Quite so -- both cards have enough problems. Just see any recent 3D game's tech-support forums. ATI blows in many cases (Homeworld 2, Halo, Splinter Cell, even Unreal 2, etc., etc. -- the graphics quality on ATI cards in these games isn't as good as on Nvidia's). I've done quite a lot of testing on both ATI and Nvidia cards with recent games, so I really have something to support my ideas. Yes, ATI's buggy in many games (sometimes you don't get the full image quality -- some effects are completely disabled). But even ALL that doesn't tip the scales in Nvidia's favor! Because its fatal wound is its shader performance. So, obviously, they're trying to improve their performance by ANY means necessary (including cheating :)). Futuremark "accuses" Nvidia of cheating, Nvidia denies it -- how do we even know they're both telling the truth? :) At first I used an FX card -- I DIDN'T like its performance in DX9 apps. Then I switched to ATI -- the performance was really nice, but I didn't get "the full picture"! So, my point is: is there a GOOD DX9 card today? :) "The problem is CHOICE."

nVidia, Futuremark Exchanging Blows

Also, many of the textures flicker in and out of existence or are removed altogether. This continues until you restart the level, and then, inevitably, happens again once you get going. Also, and I know EVERYONE has experienced this, well, at least all of the professionals out there who review hardware, whose sites we all may visit and which make mention of this problem: nVidia hardware produces jerky responses, jerks in the display of graphics in almost every game. Even when frame rates are relatively high (compared with other times, which they are not) I am plagued by the occasional image JERK that of course jerks you right out of the gameplay experience as well. Didn't bother to download the new "Forcewares", mostly because I don't care. I'm getting a new ATi card, tomorrow in fact! By the way, most of the games mentioned by sapphiron which display a bad image on ATi hardware were either rumors perpetrated by nVidia themselves (or indirectly through associates of nVidia), such as in the case of Halo (which was proved wrong, check out the report on anandtech.com), or bugs which were recognized by ATi and either fixed with newer drivers or promised to be fixed by future drivers, such as Splinter Cell (as early as Catalyst 3.7) and Homeworld 2 (Catalyst 3.9). So, I submit this not as an argument with you, Sapphiron, but simply to augment what has already been posted. Sorry to go on so long, but it feels better to have that off my chest. By the way, this is one big post, so read it from the first one (had to cut it up to get it to fit); it starts at "I own a GeForce FX..."

nVidia, Futuremark Exchanging Blows

I own a GeForce FX card. I do have to agree with sapphiron, but where we differ is on one key point: Sapphiron, you failed to mention all of the image quality "quirks" that pop up on nVidia hardware. In EVERY game I've played I've noticed one issue or another. In Unreal II: little or no visible shaders on textures, little or no active shadow and lighting effects (e.g. when firing the assault rifle in a darkened corridor, the active light source of the muzzle flash is almost never rendered on the environment or creatures around me, bummer), and unsatisfactory framerate performance (an average of 26.3fps using FRAPS and a low of 3.4fps on the first level, at 1024x768x32bpp, 4xAA/8x aniso). In Jedi Knight: Jedi Academy: multiple shader and texture glitches that make the experience somewhat, well, frooky, and a dismal drop in frames when enabling 32bpp color. Homeworld 2: cannot use FSAA or aniso AT ALL. Turned them on in the drivers, they had no effect. Also, what a surprise, unsatisfactory frame rates, even at low detail. And, best for last, Rainbow Six 3: Raven Shield: frame rates are perfect at relatively low res; however, with the ver.45.23 drivers, when the framerates drop, like when you get shot and the screen displays that cool, post-process blur effect, everything goes to Hell. The vertex shader effects, like particles, lens flare, and light-source glare, are reduced to these giant, non-see-through blocks that are used only to tell the rasterizer where to draw particle effects. This includes smoke from firing your weapons, which means that whenever you fire, your screen is immediately filled with giant, non-see-through blocks that make it (imaginably) impossible to see anything, much less what you're shooting at.

nVidia, Futuremark Exchanging Blows

What would happen if ATI used the same methods as nVidia? Having custom code is like knowing the back end of systems and using that knowledge to control execution. Normally this is not done, for compatibility reasons, since that custom code would only work for the system it's built for. And I'm referring to gaming software, or software in general, when I say system, since software is a system with subsystems that are a group of functions that have inputs and outputs. If the custom software calls a majorly modified version of a function, assuming the custom software plays with the internals or uses internal information of the function in any way, then it will fail, which brings costly maintenance when revisions are done (also bad programming practice). When calling a function, you should only use the output that the function provides, not interact with the internal variables in the function. Functions take the same input and output regardless of implementation, which is why programmers normally only use what the function provides and don't play with the internals. I know some of you don't really care as long as it works and runs faster, but in practice, having custom code is expensive as opposed to general code that works on all systems, which may be a reason for the higher cost of nVidia cards, or not; they might just be greedy. Simply put, without constant maintenance, if it uses custom code for every system, your valuable investment will go down the toilet. Why would developers LOVE to jump through hoops just to get their game working with good performance, when the game runs just as well on a card that needs no custom code? For any of you that are still not convinced that custom code brings an unfair advantage when doing comparisons, I suggest we ALL ask Futuremark to have two charts, one for customized code and another that has none; thus we can compare nVidia with custom code versus ATI with custom code. Of course, I think it would be fun to see t
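The commenter's point about using a function only through its declared inputs and outputs can be made concrete with a small, purely illustrative Python sketch; the class, attribute names and numbers below are invented for the example.

```python
# Illustrative only: names and numbers are made up for the example.

class FrameTimer:
    """Public contract: average_fps() returns the mean frames per second."""

    def __init__(self, frame_times_ms):
        # Internal representation; a later revision might store seconds instead.
        self._frame_times_ms = list(frame_times_ms)

    def average_fps(self) -> float:
        mean_ms = sum(self._frame_times_ms) / len(self._frame_times_ms)
        return 1000.0 / mean_ms


timer = FrameTimer([16.7, 20.0, 33.3])

# Good: relies only on the documented output of the function.
print(round(timer.average_fps(), 1))

# Bad: reaches into the internal field and assumes milliseconds; this silently
# breaks if a revision changes the internal units or the attribute name,
# even though the public contract is unchanged.
mean_ms = sum(timer._frame_times_ms) / len(timer._frame_times_ms)
print(round(1000.0 / mean_ms, 1))
```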

nVidia, Futuremark Exchanging Blows

In order for Nvidia hardware to run DX9 applications they need to run their compiler program, and when they do, it still runs those games or applications slower than ATI's offerings. Nvidia made the mistake of thinking that all developers would jump on the Cg bandwagon the same way 3Dfx pushed Glide. It never worked for 3Dfx and it's not working for Nvidia. Why? Because game developers have to work twice as hard to program for Nvidia cards as for ATI cards. That costs EXTRA money and time. And using application-specific optimizations for a synthetic benchmark is just that: a BENCHMARK. It has no real value for actual gameplay in today's or tomorrow's games. This is an old story. Old in the fact that we know why Nvidia keeps insisting on this type of optimization, which in most cases lowers IQ. (Something they said they would NO LONGER DO. Remember their new AUDIT GUIDELINES? Wink, wink.) The FX hardware has poor PS2.0 and other shader performance issues. This is why they run their compiler program and specifically optimize for benchmarks that OEMs use. However, in REAL DX9 games and applications the real numbers and IQ present themselves. FX hardware is slower than ATI hardware. All you need to know is the results of the 9800XT over the 5950 to see this. Go to any hardware review site and you see this. What FX owners fail to realize is this: if Nvidia does not optimize for that new DX9 game that hits store shelves, OR that given game company does not want to spend the EXTRA time programming a specific path for Nvidia's Cg-coded graphics core, performance for that game will be piss poor on FX hardware, as seen when they first released Halo, TRAOD, HL2. YOU WILL HAVE TO WAIT UNTIL NVIDIA RELEASES THEIR NEXT SET OF DET DRIVERS TO CORRECT THAT ISSUE! To correct it with "application specific optimizations". And you are willing to wait for a driver update to fix it? Because a simple patch won't suffice to code around its lack of shader registers. This is a hardware issue. It'

nVidia, Futuremark Exchanging Blows

I agree with cannon. The FX chipset is just flawed in its DX9.0 abilities. It still has the power and bandwidth, but when FSAA and aniso come in, ATI gains the upper hand! How can this happen??? I used to be a GeForce freak, but once I found out how ATi has potentially (and truthfully) better hardware than the GeForce lines, I decided I wanted a 9700/9800... that was just over half a year ago.

nVidia, Futuremark Exchanging Blows

Nvidia is milking everyone. A lot of people used Nvidia cards before; I personally switched because of all the lies, optimizations, everything. It's like "okay, so they fell, let's wait and see, maybe it's not so bad at all", but no, they fool people; instead of improving the architecture they optimize drivers and then ask cheesy price tags. Well, one thing I can't stand is bullshitting me.

nVidia, Futuremark Exchanging Blows

So, knowing how badly nvidia performs in DirectX 9, and seeing the Half-Life 2 benchmarks comparing the mid-range ATI cards to the high-end nvidia cards, where even the 9600 Pro overtook the 5900 Ultra, does this mean all future DirectX 9 games will suffer the same fate on nvidia cards, or just Half-Life 2?

nVidia, Futuremark Exchanging Blows

Here we go again... same old argument, I see. If the nvidia driver optimizations are like it says in the statement and work with a game to make it run better, then there is no problem with that at all. On the other hand, if they make a benchmark run better yet give no improvement in an actual game, then that would be considered cheating by me. In the end I think a benchmark result doesn't mean much, and comparisons should be made using proper games and measuring their fps. For example, get 2 PCs with a fresh Windows install and an XP3000, and in one stick an nvidia card and in the other an ATI. Run them side by side on a collection of the latest games and measure fps. If ATI win, say, 12 out of 20 games on fps, then ATI should be declared the winner. Simple really.
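The head-to-head tally this commenter proposes is straightforward to express; the following is a purely illustrative Python sketch, with made-up game names and frame rates rather than real measurements.

```python
# Illustrative sketch of the per-game win tally described above.
# The frame-rate numbers are invented placeholders, not measurements.

fps_results = {
    # game: (ati_fps, nvidia_fps)
    "Game 01": (61.2, 58.4),
    "Game 02": (44.9, 47.3),
    "Game 03": (90.1, 88.0),
    # ... one entry per game in the 20-game test suite
}

ati_wins = sum(1 for ati, nv in fps_results.values() if ati > nv)
nvidia_wins = sum(1 for ati, nv in fps_results.values() if nv > ati)

if ati_wins > nvidia_wins:
    print(f"ATI wins {ati_wins} of {len(fps_results)} games")
elif nvidia_wins > ati_wins:
    print(f"nVidia wins {nvidia_wins} of {len(fps_results)} games")
else:
    print("Dead heat")
```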

nVidia, Futuremark Exchanging Blows

Actually, a lot of hardware reviewers are on the same track a lot of you refer to: using the FRAPS tool in actual games (as opposed to synthetic benchmarks) and using AquaMark03, which uses an actual, real-world game engine (that of AquaNox 2). Check out anandtech.com's later reviews of hardware. Also, you will see why you should buy ATi over NVIDIA at this point, as they also have side-by-side comparisons of image quality. ATi mostly rules that event while still performing better. Look it up.
