Dawn ATI Fling Official

Dawn was beautiful, ethereal and possessed a certain graceful quality, possibly due to her uncanny resemblance to T'Pol from TV's Enterprise. Most importantly, Dawn belonged to nVidia, but how can you hope to confine such a unique creature to one family of graphics cards? A group of MIT engineering students decided to offer Dawn her freedom and created an OpenGL wrapper that lets any owner of an ATI DX9 video card run nVidia's amazing Dawn demo.

The real story behind the wrapper's development is rumoured to be the students' disappointment at the performance of nVidia boards while running the demo. Rage3D tested the wrapper on a Radeon 9800 Pro 256MB DDR II and found it to work very well.

nVidia are bound to be extremely disappointed by this development, especially if the claims that the demo actually performs better on ATI cards (see below) persist.

These are some unconfirmed improvements reported when the Dawn demo is run on ATI cards:

- It runs 15 per cent faster on the 9800 Pro than on the NV30, and faster than the NV35 as well
- It produces higher-quality images than the original, because normalization is done in a fragment program (dp3/rsq/mul) rather than via the normalization cubemap that the FX extensions use directly in hardware
- The OpenGL wrapper adds overhead, since it has to intercept calls to nVidia extensions and map them to ATI/ARB extensions, yet the demo still runs faster on the ATI card thanks to its more sophisticated pixel shader engine
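The fragment-program normalization mentioned above is only three instructions: a dot product (dp3), a reciprocal square root (rsq), and a multiply (mul). A minimal sketch of the same arithmetic in Python — this is an illustration of the math, not the demo's actual shader code, and the input vector is made up:

```python
import math

def normalize_dp3_rsq_mul(v):
    """Normalize a 3-vector the way the fragment program does it:
    dp3 (dot product), rsq (reciprocal square root), mul (scale)."""
    d = v[0] * v[0] + v[1] * v[1] + v[2] * v[2]  # dp3: v . v
    inv_len = 1.0 / math.sqrt(d)                 # rsq: 1/sqrt(d)
    return (v[0] * inv_len, v[1] * inv_len, v[2] * inv_len)  # mul

# A normalization cubemap instead stores precomputed unit vectors at a
# fixed texture resolution, so its result is quantized; the arithmetic
# version is exact to floating-point precision, hence the quality claim.
n = normalize_dp3_rsq_mul((3.0, 0.0, 4.0))
print(n)  # a unit-length vector
```

The cubemap is a lookup table and therefore cheap in hardware; the per-fragment arithmetic costs instructions but avoids the table's quantization.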

Requirements

A graphics card that supports GL_ARB_*_program and GL_ATI_vertex_array_object
Nvidia Dawn Demo
512MB System Memory
The patch works with the following chipsets:
R350 based cards
R300 based cards
RV350 based cards
M10 chipset

Installation

Unzip the archive and place opengl32.dll into the bin directory of the installed demo. The default location is C:\Program Files\NVIDIA Corporation\NVIDIA Demos\Dawn\bin
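The install step is just a file copy. A hedged sketch in Python — the `install_wrapper` helper is hypothetical, and the default path is the one named above (yours may differ):

```python
import shutil
from pathlib import Path

def install_wrapper(wrapper_dll, demo_bin):
    """Copy the wrapper opengl32.dll into the demo's bin directory so the
    demo loads it instead of the system OpenGL driver."""
    target = Path(demo_bin) / "opengl32.dll"
    shutil.copy2(wrapper_dll, target)
    return target

# Example with the default install location from the text (adjust to taste):
# install_wrapper("opengl32.dll",
#                 r"C:\Program Files\NVIDIA Corporation\NVIDIA Demos\Dawn\bin")
```

Windows resolves DLLs from the application's own directory before the system directory, which is why dropping the wrapper next to the demo executable is enough.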

How to change resolutions

Open the properties of args.txt or args_ultra.txt in the bin directory and clear the Read-only attribute
Change the values on the lines marked with the comment "// Window width/height" to the desired resolution; the default is 1024x768
Go back to the file's properties and re-select the Read-only attribute
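The three steps above can be scripted. This sketch assumes the args.txt layout puts the width and height on lines carrying the "// Window width/height" comment — inspect your own file before relying on it, and note `set_resolution` is a hypothetical helper, not part of the demo:

```python
import os
import stat
from pathlib import Path

def set_resolution(args_file, width, height):
    """Clear the read-only attribute, rewrite the width/height lines,
    then restore read-only. Assumes two lines each tagged with the
    "// Window width/height" comment (an assumption about the format)."""
    path = Path(args_file)
    os.chmod(path, stat.S_IWRITE | stat.S_IREAD)  # step 1: drop read-only
    vals = iter([width, height])
    out = []
    for line in path.read_text().splitlines():
        if "// Window width/height" in line:
            out.append(f"{next(vals)}  // Window width/height")  # step 2
        else:
            out.append(line)
    path.write_text("\n".join(out) + "\n")
    os.chmod(path, stat.S_IREAD)  # step 3: restore read-only
```

For example, `set_resolution("args.txt", 1280, 960)` would switch the demo from the 1024x768 default to 1280x960.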

Experienced issues

Selecting Wireframe mode or Point mode under Drawstyle results in the demo crashing
The dll does not work with other GeForce FX demos


Comments

Dawn ATI Fling Official

This is the error I get. I have a P4 2.7GHz / 256 RAMBUS / 9700 Pro / Win98SE / DX 9.0a / Catalyst 3.4 drivers / Audigy sc

FAIRY caused an invalid page fault in module CG.DLL at 0197:006e9450.
Registers:
EAX=40c0c0c0 CS=0197 EIP=006e9450 EFLGS=00210246
EBX=00000000 SS=019f ESP=00a5e78c EBP=00000000
ECX=8d3b9000 DS=019f ESI=8d3b7000 FS=1237
EDX=9f1f1f1f ES=019f EDI=8d3b7000 GS=0000
Bytes at CS:EIP:
8b 01 ba ff fe fe 7e 03 d0 83 f0 ff 33 c2 83 c1
Stack dump:
006ec4c8 8d3b7000 00fd2de0 00666944 8d3b7000 00fa68e0 00000000 00664e24 8d3b7000 006653ee 00001804 00fd17a0 013ce3cc 006661ba 00000000 8d3b7000

Dawn ATI Fling Official

Did you put the opengl32.dll file in to run the demo on a Radeon card? Also try running the demo at a lower quality setting (lower your AA and AF). The demo also says it requires 512 megs of RAM... I don't know whether it will run with less.

Dawn ATI Fling Official

That's it, I give up. I got the 9800 Pro today, and love it. No more nVidia, until they get their act together anyway. The only thing is I have to turn AGP 8x off because of my mobo, a P4G8X. But I do get 15274 in 3DMark 2001 SE.

Dawn ATI Fling Official

One thing I find interesting is the fact that ALL DX9 ATI cards ran the Dawn demo perfectly, and faster than the 5900 or the 5800 Ultra (on a 9800 Pro), even through the OpenGL wrapper, which we all know degrades video presentation. So nVidia's PR department, which praised the FX chip and the Cg graphics the FX chips would employ (something they said would give them impressive rendering power over ATI's chips), turns out to have sold a major dud. ATI cards can render the same images and shaders with their hardware and do it faster. And do it through an OpenGL wrapper at that! Good PR campaign at nVidia, though, for fooling everyone that Cg graphics is the way of the future. We all now know that was just more PR crap from nVidia. :-P

Dawn ATI Fling Official

I can't believe that renaming trick. I've seen a patch that does all sorts of renaming files and moving textures, and it was as simple as changing a filename. Nice built-in feature. :) Sheesh, the people that made the "nude patch" wasted a lot of time. Also, to the Anonymous having problems (errors): you need to download the file opengl32.dll and place it in the C:\Program Files\NVIDIA Corporation\NVIDIA Demos\Dawn\bin folder to run this demo on a 9700 Pro. That's the whole point of this article. You need the OpenGL wrapper. You can download it from this link: [url removed]

Dawn ATI Fling Official

Anonymous: The P4G8X should do 8x AGP... that's what the 8X part of the board's name means. It's an ASUS board, right? P4 = Pentium 4 / G = chipset / 8X = 8x AGP. Check your mobo drivers to make sure you've installed them all, and check your BIOS to make sure you have 8x turned on.

Dawn ATI Fling Official

LOL, just read a 13 PAGE thread on the issue. It's over at [url removed] and provides an interesting read, with so many people arguing so many points. The main point being argued is Futuremark's legitimacy as a DX9 benchmark, when it doesn't really show how a DX9 game would behave. Only thing is, you do have to become a member; it's free though.

Dawn ATI Fling Official

BTW, I came across that just looking for info on the Det 44.10 drivers and how good they are, LOL. Anyone seen any info on them? Just that they're ripped from Dell... they may be crap on non-Dell systems, LOL.

Dawn ATI Fling Official

No one will know how DX9 games will run on the new cards. Period. Not until they come out. And guess what? By then the 9800 Pro and the 5900 Ultra will be bargain cards. NO DX9 GAME WILL BE OUT UNTIL 2004 AT THE MINIMUM. That's when you will see the NV40 and the R400. And when 3DMark03 was released, nVidia was still involved and was a current member. One of the big issues being looked at is whether nVidia already knew how the test would perform on the FX line of chips before the final release, and then argued about it afterwards. (It looks that way.)

Dawn ATI Fling Official

This is from Beyond3D...

"Actually, I'm not sure -- was 3DMark03 released after FM knew NVIDIA wanted to pull out of FM's beta program, or did NVIDIA pull out while 3DMark03 was in development and while NVIDIA was a beta member? If NVIDIA was a beta member during 3DMark03's development (which, on an important related note, means that they more or less knew how their NV3x would perform in 3DMark03), at what specific stage did they pull out? Patric... Aki..? Is revealing this sort of info confidential?"
-- Anthony Tan, "Reverend", Beyond3D

"No it is not confidential (the reason for confidentiality is to make sure that we protect any information that is given to us during the Beta Co-operation) and I think it's already been mentioned elsewhere too. Nvidia left the Beta Program 1st of December 2002. 3DMark03 development took 18 months all together and the product was released in Feb 03. Cheers,"
-- Aki Järvilehto, VP, Benchmark Products, Futuremark Corporation

By this admission, nVidia was involved in the beta process up until December 2002. They knew how their card would perform. How could they not? The final product was released just two months later! And then they balked about it and discontinued their membership. Not good PR for nVidia. 3DMark03 might not be a "true" test of games, but the way nVidia has handled this whole mess looks plain bad. They look like whining little babies. They should have at least admitted certain facts and issues. However, they have not as of yet. Nor did they release that "new" driver version to fix that "bug". It all looks bad.

Dawn ATI Fling Official

"As you have probably heard, with the release of Futuremark's 3DMark03 benchmark, relations between its developer and NVIDIA Corporation were badly damaged. NVIDIA said that 3DMark03's game-benchmarks unfairly favoured graphics cards based on ATI RADEON 8xxx and 9xxx-series graphics processors, while solutions powered by GPUs from NVIDIA showed too weak performance in the latest Futuremark benchmark. NVIDIA was so dissatisfied with this that it left Futuremark's beta program in late 2002.

Starting from February this year, some strange things have happened with 3DMark03 and NVIDIA Detonator drivers. There were the famous Detonator 42.68 drivers with improved performance in 3DMark03. There was NVIDIA pressuring some online media in an attempt to avoid benchmarking the GeForce FX 5800 Ultra with 3DMark03 (see our news-story here). And now there are two more episodes involving NVIDIA Detonator drivers and the Futuremark 3DMark03 benchmark.

The French 3Dchips.fr web-site has found that during the 3DMark2001 SE Dragothic scene, with the GeForce FX 5600 Ultra and FSAA and anisotropic filtering turned on, the dragon disappears, leaving the Amazon mysteriously flying without any hard surface under her. You can read about this strange thing in French here (translated version here).

The latter case is quite serious and can even give some ground for a rather unpleasant conclusion about NVIDIA's driver optimisation policy. ExtremeTech observed NVIDIA GeForce FX cards using the developer version of 3DMark03 and the recently released 44.03 Detonator FX drivers. The developer version, which is available only to selected partners of Futuremark, lets you not only watch the benchmark scenes but also pause them and move the camera freely through the whole scene. Using this ability of 3DMark03, ExtremeTech found that if you move the camera anywhere outside the "normal" path, i.e. the path it travels during ordinary benchmarking, the image on the screen gets seriously distorted in some strange

Dawn ATI Fling Official

...... some strange way. ExtremeTech sent a report about this problem to NVIDIA, and the company answered that it has no access to the developer version of 3DMark03, since it is no longer a member of the 3DMark Beta Program. The company's response is a strange one at best, and analysts now think that there are some aggressive performance optimisations in the new drivers, such as custom clip planes or the absence of full Z-buffer clearing. In short, graphics cards based on NVIDIA's GPUs show better results in 3DMark03 now. Obviously, such optimizations only work in 3DMark03, NOT IN REAL GAMES.

Dawn ATI Fling Official

What a can of worms! Since I got my Rad 9700 Pro I ain't looked back. The GF4 Ti4200 I was using ain't bad, but it's just not as good. I don't see how nVidia can have got it all so wrong. The launch of the FX was amazing: 10 million polygon scenes rendering at never-before-seen speeds, 128-bit FP colour, etc. The FX looked great, then the benchmarks started rolling in...

nVidia claim that the unoptimised PS1.1 (3DMark says it's testing PS v2, so what gives?) used in 3DMark2003 is what's crippling the FX, so why does it not cripple the R300? nVidia have got something wrong somewhere; to me it's obviously in the backward compatibility of the FX rendering pipeline.

Also, I don't see that the issue should be all about speed anymore. The R300 and FX are virtually supercomputers (10 years ago these chips would have set you back about a million bucks). 'Cinematic' graphics should be about speed and image quality. As the guys at MIT said, the render quality of Dawn on the FX didn't look as good as it should, so they built a wrapper and tried it on a Radeon, and we all know the results... nVidia, GET URE FOOKIN ACT TOGETHER!

I ran twin Voodoo2's for ages after 3DFX died, and although only 16-bit colour, the image looked 10 times better than 16-bit on a GF2. Looking at some of the reports here, it seems to me that nVidia only managed to get to the top because they've always compromised something in the rendering pipeline to gain extra speed. In RTCW on the GF4 the gold bars are almost invisible. On the Radeon they look like gold bars. This is not down to DX8 drivers and shite either; this is an OpenGL game! nVidia have dropped the ball, and it's going to take something really special from them to get it back.

BTW, there are some DX9'ish games out there, guys - try Splinter Cell for PS1.1 effects!!

Dawn ATI Fling Official

And here's the problem with this and many other issues of late. Many web-sites have alliances with, or fan-boy attitudes towards, major companies, whether it be ATI, nVidia, IBM, or AMD. It is becoming more evident in the extremely BIASED reporting. It's plain brutal. Even when the evidence is plain as day, or a product is not what the company claims it should be, certain sites preach otherwise. Where do you go to find the true skinny? I can only think of a couple now. One right off the top of my head is TechReport.com, and they write a little mention of this also...

Dawn ATI Fling Official

DRDEATH: Thank you for finding a better quote to state what I was trying to say about the camera angles with the developer versions and the "optimizations" not meaning jack squat in real life.

Dawn ATI Fling Official

When all is said and done, the 5900 Ultra is a faster card... but I wouldn't buy one. The company has bad methods of operation and a history of duping its customers. I don't care how "popular" they are. It's bad business.

Dawn ATI Fling Official

DeQuosaek: Maybe it does have bad methods of operation, but the only area affected is this benchmark. It's your opinion, but the fact is that in games it's ahead in most cases, and when games aren't biased (i.e. Doom3) it will be right there with the 9800 Pro, or slightly ahead.

If Futuremark had written 3DMark2003 the way it should have been done (so it uses optimisations for both nVidia and ATI), then it would be a lot more even. Instead, they made the program optimised for ATI's specific path, and that gives nVidia a disadvantage. While cheating shouldn't be resorted to to fix the matter, nVidia decided they needed some way to make the best of this situation. With any luck, they will release a more even 3DMark; one which won't need hacks in drivers, because it adds nVidia's methods of doing things, which do the exact same job, only done so it works with their hardware.

But IMO, I don't think FM will stand the test of time, because it's just testing cards under specific scenarios, which can be cheated, and which aren't the way games are. Doom3 will be the ultimate benchmark at release. It uses both cards' coding, is DX9, and will no doubt pressure both cards, LOL.

BTW, I know this changes nothing for you, LOL; just trying to say that the fact they cheated in this test doesn't take anything away from the card itself. It's still just as powerful as the 9800 Pro, and at points faster/slower. Although I'm pretty sure most people are aware of this... it will be a while before a card takes a clear win like the 9700 Pro did over the GF Ti4x00.

Dawn ATI Fling Official

SSCREW, I agree with you 100% that this cheat does not take away from the amazing NV35 and its rendering capabilities. The card is the fastest kid on the block by small margins. That's not the issue here, however. And the only reason, as many can see, why the NV35 did so much better than the 9800 Pro in DOOM3 was that nVidia focused their current drivers on it. ATI did not, and said so. Expect that to change real soon, as both cards are VERY similar in all other tests but that one. Time will tell when the game is finally released. (Then it won't matter; the NV40 and the R400 will be the cards battling it out by then, lol.)

If you have read the reports about this closely (and I say something about it in the other post), the FX chipset works with 12, 16, 24 and 32-bit floating-point shader ops, and defaults to 12 or 16 depending on the application. The DX9 standard is 24. ATI's R300/R350 core always renders in 24-bit floating point regardless. So what you have is nVidia reducing this to 12 or 16 to run faster in some apps. However, John Carmack himself says this should not affect IQ or functionality at all, which has been raising eyebrows all over the place. It does not add up. Then again, Carmack has always been in nVidia's camp and NEVER mentions the custom clip planes, the absence of full Z-buffer clearing, or the 'other' dubious methods nVidia used during the 3DMark03 test. "Don't piss off your biggest friend", they always say... Hmmmmmm.

It seems the only websites and software experts telling the whole story are the ones that have no affiliation with ATI or nVidia and fear no reprisals. TechReport, for one, has been removed from nVidia's list for their honesty and straightforward facts about this whole issue. And a few others. THEY NO LONGER GET VIDEO CARDS TO REVIEW. lol. And last point: FM is not ATI-biased. If anything, this whole fiasco proves it, as the methods nVidia is using are plain wrong and dubious at best. And it DOES NOT DO THE EXACT SAME JOB BUT FAST

Dawn ATI Fling Official

.....JOB BUT FASTER. If it did, when the camera angles are moved slightly off the rendered path there should NOT be any artifacts at all. But in nVidia's case... there are plenty. That is not called optimization; IQ is lost! Even John Carmack (whom I regard as a questionable spokesperson for this issue, as he has major ties with nVidia) says this. But as I mentioned earlier, HE DOES NOT EVEN BRING UP nVidia's custom clip planes and the absence of full Z-buffer clearing in his statement of support for nVidia on this issue. Makes him look extremely biased. ATI's method produced NO artifacts whatsoever when they implemented their process. Yet ATI is still removing the "cheat" and coming clean on this issue. nVidia is not. That tells consumers and hardware sites plenty.

Dawn ATI Fling Official

SSCREW: This is what I posted in another post about your comment on the cost of being an FM beta member, and nVidia's claim that it costs "hundreds of thousands" of dollars.

"Futuremark Caught NVIDIA On Misleading Statement - 3DMark03 Scandal Continues
by Anton Shilov, 05/27/2003 09:16 AM

You may remember that NVIDIA did not comment on its cheats in drivers that allowed graphics cards based on certain GPUs from NVIDIA to score more in Futuremark's 3DMark03, but accused the Finnish company of developing unfair benchmarks. The company also said that Futuremark asks for hundreds of thousands of US Dollars to participate in its Beta Program. Well, Futuremark denied both allegations on Monday. First of all, a Futuremark representative said that the company utilises the most efficient shaders and rendering patterns. Secondly, Aki Jarvilehto, Vice President of Benchmark Products at Futuremark Corporation, said that the minimum payment to join the Beta Program as a Beta Member is $5000 a year, not hundreds of thousands as NVIDIA Corporation claimed last week."

Dawn ATI Fling Official

This was just released...

"As you know, 3DMark03's overall score is derived from its four game tests. You can see my results for Game 1, Game 2, Game 3, and Game 4. As you can see, ATI's optimizations, which they claim didn't change image output, barely affect the overall score. NVIDIA's, however, make a substantial difference. The most striking difference between builds 320 and 330 is in the Game 4 test, where performance drops dramatically once the cheats and optimizations are disabled. This test, the "Mother Nature" scene, makes the most extensive use of DirectX 9 and pixel shader 2.0.

Those are the numbers. I don't have time here to dig into all of the related issues, but Dave from Beyond3D sent me a note about a couple of things you should check out on his site. First, Dave has captured images from ATI and NVIDIA cards in 3DMark03 to show the image quality differences between builds 320 and 330. The NVIDIA drivers clearly have more impact on image quality. Next, to help you sort out what that fact means, have a look at Unreal guru Tim Sweeney's take on cheating versus optimization. The basic principle he outlines seems like a good guide in this case."

Dawn ATI Fling Official

The weak spot for the FX line of cards has been shader ops: one less vertex shader and a 4x2 pipeline, among other things. And the fact is that the ONLY area where nVidia cheated in 3DMark03 and performance increased was the test that, you guessed it, measured SHADER PERFORMANCE (the Nature test), which employs the most DX9 features and makes extensive use of pixel shader 2.0, which nVidia's cards run slower than the R350 core. It's just too coincidental for a "driver bug" to affect ONLY the test in which nVidia's FX line of chips is at a disadvantage to ATI's chips, while the 5900 wins most other tests. This test was targeted; targeted to improve the performance of the 5800 and 5900 in the DX9 and pixel shader 2.0 tests. However, the process (cheat) produced major artifacts. That is not called "optimization."

Dawn ATI Fling Official

This is how I really feel about NVIDIA:

I don't want you be no slave
I don't want you work all day
I don't want 'cause I'm sad and blue
I just want to make love to you, baby
Love to you, baby
Love to you, baby
Love to you

I don't want you cook my bread
I don't want you make my bed
I don't want your money too
I just want to make love to you, baby
Love to you, baby
Love to you, baby
Love to you

Well I can tell by the way that you twitch and walk
See by the way that you baby talk
Know by the way that you treat your man
I can love you, baby, till the night train

I don't want you wash my clothes
I don't want you leave the home
I don't want 'cause I'm sad and blue
I just want to make love to you, baby
Love to you, baby
Love to you, baby
Love to you

Yeah

I don't want you wash my clothes
I don't want you leave the home
I don't want 'cause I'm sad and blue
I just want to make love to you, baby
Love to you, baby
Love to you, baby
Sweet love to you, baby
Love to you, baby
I just wanna make love to you, baby
Love to you, baby
Love to you, baby
Love to you
I just, I just want to make love to you, baby
I just want to make love

Dawn ATI Fling Official

OK, so I'm not anonymous anymore, but I did put the wrapper in the bin folder, and still to no avail: no Dawn. My bro has XP on his system, which is basically set up like mine, and to no avail... no Dawn. We both have the latest and greatest drivers too. DX9.0a, blah blah blah.

Dawn ATI Fling Official

Next-generation games, from what I hear, will use lots of shader operations and DX9 features. With the 5900 having a slower shader engine than the 9800 Pro, as the bench tests and game tests prove, the 5900 might not be the best bet for future games, then? Those future games will incorporate lots of vertex and pixel shaders, something I have been reading that the 5800 and 5900 run slower.

Dawn ATI Fling Official

Hey, I need you guys' help on a project I'm doing for school. I need these specs on each of these ATI cards: memory bandwidth, fill rate, vertices per second, and maximum memory, for the 9200 Pro, 9500 Pro, 9700 Pro and 9800 Pro.

Dawn ATI Fling Official

SSCREW:
> "If Futuremark had written 3dMark2003 the way it should of been done (so it uses optimisations for both Nvidia and ATI) then it would be alot more even. Instead, they made the program optimised for ATis specific path, and that gives Nvidia a disadvantage. While cheating shouldnt be resorted to, to fix the matter, Nvidia decided they needed some way, to make the best of this situation. With any luck, they will release a mroe even 3dMark."

OK, this is the way it happened. nVidia was part of the beta development of 3DMark2003 through about 13 months of its development. When they realized their cards weren't going to perform the DX9 shader functions very well, they decided to drop out of the beta program. THEY MADE THAT CHOICE. It was late 2002 when they left, and 3DMark 2003 was released in late Feb 2003. Coincidence? No. The GeForce line of cards does not and will not handle newer shaders very well. Period. And they DID release a more even 3DMark2003. It's called patch 330.

Dawn ATI Fling Official

>>> Ok, this is the way it happened. nVidia was a part of the beta development of 3dMark2003 through about 13 months of its development. When they realized their cards weren't going to perform the DX9 shader functions very well, they decided to drop out of the beta program. THEY MADE THAT CHOICE. It was late 2002 when they left and 3dMark 2003 was released in late Feb 2003. Coincidence? No. The GeForce line of cards does not and will not handle newer shaders very well. Period. And they DID release a more even 3dMark2003. It's called patch 330. <<<

How is that a more even benchmark? It is still designed to use the ARB2 path, which ATI primarily use and are faster in. nVidia have the NV30 path, which, when used, can provide the same results even faster. If a game such as Doom3 can provide both functions, shouldn't a graphics card benchmark? I mean, they're designed to show how well a card should run games. If they aren't showing how well the FX 5900 Ultra can really run games, how is that more even? It's just more even in that neither company can cheat; it's still in favor of the Radeon cards. Had Futuremark included this, I GUARANTEE that nVidia wouldn't need any cheats and would be getting scores like this without cheating. Carmack stated that when both cards use the ARB2 path, ATI come out on top. But in Doom3, when the NV30 path is used, it comes out ahead in most cases, though both do have their stronger points in areas.

A benchmark is not more even until it supports ALL graphics cards' ways of doing things. nVidia may be doing it differently to ATI, but it's getting the exact same results done faster. The reason they do it differently is because their hardware is different. I ain't got nothing against ATI; I love them. I love nVidia too. The company I'm against here is Futuremark. They have made an unfair playing field against nVidia. In ANY game that allows the NV30 path, nVidia will do just as good as, or better than, the 9800 Pro, no matter the pixel shaders etc. That's what

Dawn ATI Fling Official

"How is that a more even benchmark? It is still designed to use the ARB2 path, which ATi primarily use, and are faster in."

SSCREW, you just answered your own question. In all paths taken: if nVidia cards take the same path as ATI cards, the ATI cards run faster. Period. A more efficient, more powerful shader engine. This is why nVidia drop down to FP12 or FP16. And if you go to Beyond3D or TechReport, THEY show you screenshots proving that doing so DOES NOT RETAIN the IMAGE QUALITY of running at FP24 or FP32. Considering the minimum standard is FP24, nVidia knew this and dropped it down anyway. YOU CAN'T RUN APPLICATIONS IN ANYTHING LESS THAN FP24 AND CONSIDER IT DX9 COMPLIANT. PERIOD. TO DO SO IS CHEATING, WHETHER IT BE IN GAMES OR BENCH TESTS. IQ is affected by this! This has been proven.

The so-called unfair playing field is ONLY in the SHADER TESTS. Why? BECAUSE THE FX SHADER ENGINE IS SLOWER THAN THE R350/R300 CORE OF CHIPS. PROVEN. Read the article at The Inquirer. You need to read that. And nVidia dropped out of the program when they found out the FX line of cards would run poorly. That is what I call crybabies. And the whole software and hardware community is seeing this now as the REAL truth comes out, not PR crap from nVidia. nVidia need to fess up, move on, and improve the shader engines in their FX line of chips instead of trying to defend their weakness.

"In ANY game that allows the NV30 path, Nvidia will do just as good, or better than the 9800pro. No matter the pixel shaders etc."

Not quite. In ANY game that uses extensive pixel and vertex shaders, the 9800 Pro managed to win the bench tests, even with its slower clock speed: Splinter Cell, Comanche 4, Unreal 2003 in tests that used heavy shaders. That has been proven. And in the test NVIDIA developed, called ShaderMark, ATI cards beat the 5900 also. This all proves one thing: the shader engine in the NV35/NV30 is slower than the one incorporated in the 9700 Pro and 9800 Pro.
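The precision argument above is easy to demonstrate. Python's struct module can round-trip a value through IEEE 754 half precision (FP16, the lowest mode under discussion); note this only illustrates storage precision, not the FX pipeline itself:

```python
import struct

def through_fp16(x):
    """Round-trip a float through 16-bit (half-precision) storage."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

x = 0.2  # a typical colour/normal component
print(x, through_fp16(x))
# FP16 has a 10-bit mantissa (~3 decimal digits of precision), while
# FP32 has 23 bits. Values that survive higher-precision storage come
# back visibly quantized at FP16, which is why dropping shader
# precision below the DX9 minimum of FP24 can change image quality.
```

Running it shows `0.2` coming back as a slightly different number after the FP16 round trip, while a value like `0.5` (exactly representable) survives unchanged.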

Dawn ATI Fling Official

Last point. nVidia wanted the NV35/NV30 cards to look better in that test only because of their weakness in that area. All other bench tests were not affected. It was a targeted cheat. In any future games that use heavy vertex or pixel shaders, the 9800 Pro will run faster. Period. This is why they never wanted this test to be run by vendors on their cards; ANOTHER CONTROVERSY THAT IS TOLD AT BEYOND3D. ShaderMark, and games that use shaders extensively, like pixel shader 2.0, don't fare as well on the FX line. Look at the game tests that use shaders, let alone 2.0; this is the one area the 9800 Pro wins. And considering these features will be incorporated in many future games, that had nVidia worried. [url removed] Read this, SSCREW, and then read the reports at TechReport.com and Beyond3D. nVidia needs to come clean. They did cheat to cover up the card's weakness in shader ops...

Dawn ATI Fling Official

And here's the thing with DOOM3, something we gamers will only find out when it's fully released. Since both cards run on different paths, which path offers better IQ? Because you know that DOOM3 runs heavy pixel and vertex shaders, and in order for the NV30 or NV35 to run it faster than, or as fast as, the 9700 Pro or 9800 Pro, it won't be running at FP32, FP24 or on the ARB2 path. When the final version is released we will see the quality difference. I will put my money on FP32 or FP24 as opposed to FP12 or FP16.

Dawn ATI Fling Official

I am assuming everyone knows that pixel shader 2.0 is a DX9 feature? ;) And at this point the 9800 Pro and the 9700 Pro run pixel and vertex shaders faster than the 5800 and the 5900; a stronger shader engine can do that. Dr.death is correct in saying this worried nVidia. Future games will use these extensively, and by all accounts nVidia's chips run them slower.

Dawn ATI Fling Official

SSCREW:
> "Nvidia may be doing it differently to ATi, but its getting the exact same results done faster. The reason they do it different, is because there hardware is different."

Not the exact same results. And did you say their hardware is "different"? I think the word you're looking for is "inferior". ;)

SSCREW:
> "In ANY game that allows the NV30 path, Nvidia will do just as good, or better than the 9800pro."

No games should allow the NV30 path. It's not true DX9.

Dawn ATI Fling Official

Some of you people are extremely childish and unstable. I know that some of you are adults, and that worries me. Get a life, people. I like computers and Star Trek just like the next geek, but that doesn't mean I don't have a life. Some of you really don't. Seriously, think about your situation, and if you are lucky you will see the light; if not, then carry on thinking of even more stupid comments to write on the next ATI product. If you try hard, you may be able to get your own REAL Dawn. You know: girlfriend, sex, normal life. Until next time, peeps. Think about what I said.

Dawn ATI Fling Official

As long as it's a productive debate, rather than a bunch of idiots screaming and saying "you f**king dumbass mother f**ker. My favorite card rocks and I have no reason to say that other than I'm a blind follower with no real mind of my own and can't speak or write English very well except to say f**k d**k asshole shit faggot pussy", there's nothing wrong with knowing a little something about hardware and software. In fact it may lead to a very good job in the information technology field, leading to big paychecks, which makes it a lot easier to take your girlfriend out and have a fun life. Besides, I gots me a girlfriend. She lives with me. :) Normal life & computer knowledge. It's not as hard as it sounds.

Dawn ATI Fling Official

Hi again. I'm just curious: can anyone with a RADEON 9700 PRO and less than 512MB of RAM run this demo? I get an error. It says: "Dawn has encountered an error and needs to close. Send error report or not." I've got a Hercules RADEON 9700 PRO, a 1.3GHz Athlon and 384MB of SDRAM. It says on the official website that it requires 512MB of RAM. Is this so?

Dawn ATI Fling Official

It's alright, I've sorted it out now. It will work on my PC after all (rather well, actually). I needed to get the Athlon Thunderbird patch for Dawn from the website, so if your Dawn isn't working, try getting the patch and it might work afterwards. It seems to corrupt sometimes at 6xAA at 1024x768 (max everything else), but it still runs smoothly. I can't run AA at all at 1600x1200. It could be the RAM; I only have 384MB, below the required minimum of 512MB, but it runs at 1600x1200 with everything else on max (AS, textures, TruForm, etc.), again smoothly, and I didn't even overclock the 9700 PRO. :-D I wish it had an FPS meter. It does not run at 2048x1536 at all under any settings.

Dawn ATI Fling Official

Dang, Inimbrium, you must have a nice monitor to be able to try 2048x1536. I can't wait for my new one; it does a little under that, but is better than my current one, which only does 1280x1024 @ 60Hz.
