ATI New Board With 32 Pipelines

The latest information to emerge from Canada regarding ATI's monstrous R520 board, also known as Fudo, mentions the magic 'pipeline' word, a key element in last year's battle for graphical supremacy. Many will remember how the number 16 seemed to be the holy grail for both manufacturers, and how stunned we all were that the companies involved could produce such enormous boards, carrying so many transistors, without being responsible for the melting of the polar ice caps.

Well, this time ATI claims that its new range of boards will carry in excess of 300 million transistors, meaning that, at least in theory, they will be able to feature 32 pipelines. It's time for gamer jaws to drop again, as that is twice as many pipelines as their predecessors carried. The current line of thinking at ATI is to build every board with 32 pipelines, restrict the first few versions to 24 functioning ones, and gradually introduce fully enabled 32-pipeline versions when market conditions and competitor products demand them.
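
As a rough sanity check on what doubling the pipeline count could mean, here is a minimal back-of-the-envelope sketch comparing theoretical pixel fill rates (pipelines multiplied by core clock). The R520 clock used below is purely an assumed placeholder, since ATI has announced nothing official; only the X850 XT PE figures reflect a shipping part.

# Back-of-the-envelope fill-rate comparison (pipelines x core clock).
# The R520 clock below is an assumption for illustration only; ATI has not
# announced final clock speeds. The X850 XT PE numbers are shipping specs.

def fill_rate_gpixels(pipelines: int, core_mhz: int) -> float:
    """Theoretical pixel fill rate in gigapixels per second."""
    return pipelines * core_mhz / 1000.0

x850xt_pe = fill_rate_gpixels(16, 540)   # 16 pipes at 540 MHz
r520_est  = fill_rate_gpixels(32, 600)   # assumed: 32 pipes at ~600 MHz

print(f"X850 XT PE:     {x850xt_pe:.1f} Gpixel/s")
print(f"R520 (assumed): {r520_est:.1f} Gpixel/s "
      f"({r520_est / x850xt_pe:.1f}x the X850 XT PE)")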

ATI is rather keen to start making its presence felt in the Top 20 of the 3DMark hall of fame, as its recent failed attempt in Texas clearly demonstrated. The R520 could be the card to bring ATI that success, since it should be capable of performing at least twice as fast as the X850, meaning it should outperform even current SLI setups.

If ATI also manages to introduce its dual-GPU configuration this summer as expected, the resulting setup should run away with the 3DMark crown. I will leave you with the thought of two 32-pixel-pipeline R520s running a game of your choice in tandem; how's that for starters?

NOTE:

The pictured card is the Crucial RADEON X850XT 256MB PCI Express card, not the R520.


Comments

ATI New Board With 32 Pipelines

"This is why i buy ATI chips. Love when they disable those pipelines. Oh the joy. Why? Because of the hard modding and bios tweaks you can do to open those remaining pipelines!!!! Major sweet. Did it to my X800PRO. Had 12 pipes. Modded it to 16. Saved uber cash!"LMAO!!!!!!! Where's the difference to geforce 6800 cards? anyone help me?

ATI New Board With 32 Pipelines

"1. ATI has more market share than Nvidia.2. ATI has the fastest PC Card X850XT PE.3. ATI hardware runs better in DX9 Apps.4. ATI runs better in shader intensive games.5. ATI was first to come out with a 256bit memory bus, 8 pixel pipeline, and DX9 support.6. Nvidia f**ked up on the NV30, and shutdown its production, ATI has never f**ked up a release.7. ATI has beaten Nvidia with the same f**king core for years now based on the R300.8. Nvidia is f**ked for life.9. The R520 will be a behemoth"1. who cares2. tell me a game tht stresses a geforce 6800gt (exception farcry hdr which dowsnt even run on ati)3.nv performance is more than enough4.nv shader performance is great as well5.who cares?6.yeah, but they should have...7.yeah, will it survive the next gen games or will all th stupid little atifans have to buy a 500$+ card to play their games??8.prove ati isnt^^9.source?

ATI New Board With 32 Pipelines

"That is the biggest F*ckin' lie yet. Ive got like 8 computers and each have gone through several cards. All my ati cards have been fine, and all my nv cards have givin me blue screen of death and more of multiple driver versions. Ive only had better stability, quality, and perfectness with ATi"Call it a lie, but it is true for a lot of people. btw ajax do you get paid by ati?

ATI New Board With 32 Pipelines

TO Anonymous at 7:31 6/4/2005
1. You must, since you replied; you must have felt offended.
2. Farcry, Half-Life 2, Doom 3, etc. Just turn up the resolution, AA and AF, dumbass, that will be stress for you.
3. Maybe enough, but not as good as ATI.
4. Not that great, or it would run a lot better in shader intensive games.
5. Again you must, since ATI did it first; you are probably sad Nvidia got beaten to it.
6. Too bad they didn't.
7. Of course, since the 9700 still plays most games.
8. They have a higher stock market share.
9. This thread topic.

ATI New Board With 32 Pipelines

"6. Nvidia f**ked up on the NV30, and shutdown its production, ATI has never f**ked up a release.""6.yeah, but they should have..."That is THE LAMEST comment ever. They should have, that makes an AMAZINGLY powerful comeback there. Writing that line only made u look like a frickin retard to everyone. The point is, they havent, they wont. And if the only comment u have is "They should have", then ur a really poor loser

ATI New Board With 32 Pipelines

Both nvidia and ati make good cards, and both have their pros and cons. Some users have problems with ati cards, others with nvidia cards. Ati has good performance, but nvidia is almost at the same grade (it differs from game to game). So in the end it's a question of how much money you wanna spend, and when i bought my 6800 it was the best bang for the buck; that would now be an x800xl, i suppose. But even if ati pwns nvidia, the end-user would not take advantage of it, because unless nv releases a comparable model, the r520 will be out for years before it can be called an affordably priced card. So i hope we can expect the g70 asap. That is just my opinion as an end-user. Even if i had the money i would not spend $500+ on gfx cards. I think those who do are crazy, but i'm not personally offended just because someone has a better pc than me. btw don't forget that a 32 pipe card is wasted, since even 16 pipe cards are being held back by the cpu/ram/chipset etc.
"6. yeah, but they should have..."
nvidia learned from its mistake, the gf6 series proved that imo.

ATI New Board With 32 Pipelines

"Listen let me conclude this battle.NVIDIA is Better OK just deal with it.END OF THREAD"WOW way to copy what was said beow, if ur gonna say something like that, at least make ur own, something a little more original. By posting that, u killed any sign of that being a good post at all

ATI New Board With 32 Pipelines

"Listen let me conclude this battle. Neither company would be waht it is right now without the other OK just deal with it.END OF THREAD"agreedtrue, they do need the compitition, but Ati is always winning for every card nvidia has, Ati has a successor

ATI New Board With 32 Pipelines

This isn't about who has the bigger balls here. Nvidia and ATI aren't playing for the thrill of victory; it's about money. In my opinion, Nvidia is doing the smarter thing by not releasing incremental refresh cards in terribly low quantities. You can compare the performance of the refresh ATI cards to the price premium you pay and, hopefully, you can see that it is not worth it. Nvidia apparently doesn't feel the need to push out a new card every 3 months because their current products are selling well (not to say that they are more powerful, etc). No, the killing is made in the mainstream market. Also, the killing is made when dealing with system integrators and OEMs, not with customers like us who buy from newegg etc. This is the reason that Intel owns the market for GPUs: integrated chipsets. ATI's r520 will be a monster; it better be when it comes with a $600 price tag. However, you can bet that nvidia will have a response. Like I said, I believe that Nvidia saved a good amount of time and money by not releasing refresh cards. I also believe that ATI is spreading themselves too thin by trying to supply too many variations on their boards. As for my needs, my 6600GT plays every game I have like butter.

ATI New Board With 32 Pipelines

nvidia is number one... ati is better... well, it is all in how you look at things. ati skipped a cycle to become king (in single card use)... what am I talking about??? when the geforce 3 ti 500 came out, ati responded with the 8500. then nvidia came out with the geforce 4 ti series, but all that ati did was put out an 8500 with 128 mb of ram. for one cycle ati had nothing (but was putting resources into the r300 chip). nvidia kept on putting out a new card every cycle, and putting time and money into the fx line. that is why the fx line failed compared to the 9500 and 9800 cards. ati spent more time and energy developing a better card than nvidia did. that one cycle of time helped ati a lot. and because of that cycle jump, ati will always be ahead of nvidia. the x800 was on the market for 6 to 8 months before the 6800 came out... the same will be said for the r520 chipset. it will take nvidia skipping a cycle just to catch up with ati's time frame (both ati and nvidia coming out at the same time). is that good for ati? right now, it is.

ATI New Board With 32 Pipelines

continued... how about sli? sli right now does not factor in, for a few reasons.
a) it only works on 70 games. yes, you can manually try to make a game run in sli, but it could cause your computer to reboot, freeze or other things.
b) the 6600 gt in sli has a problem beating out a 6800 gt or the x800 xl on those 70 registered games (other than in benchmarks, or with no iq settings in games, the 6600 gt in sli just does not have the balls)... and it is more expensive than buying a 6800 gt.
c) this is the main reason I am not buying a 6800 gt... you can not sli two different types of cards or different manufacturers (a 6600 with a 6800, or a 6800 with the next gen nvidia card). you are required to buy two of the same type of card each time. (if I am wrong here please post a web site where a 6600 and a 6800 were run in sli.)
so the best cards for gamers are (cost to power ratio):
1- x800 xl $275-350 USD
2- 6800 gt $360-450 USD
3- 6600 gt $180-250 USD
[url removed]

ATI New Board With 32 Pipelines

now what is the best gaming card??? well, I would be stupid to say that the r520 would not be the best, but let's get into the real world for a bit. most of us gamers are not going to spend that kind of money just to play games (that includes the 6800 in sli or the x850). for a beginner gaming card, a 9800, x700 or 6600 would be great. all of them will play every game to date with good visuals. nvidia's 6600 is the best at this level and most people should look at buying this one first before the others. for the guys and gals who want a lot more bite, the 6800, x800 pro, x800 xl and 6800 gt are what you are looking for. the best money-to-power level used to be the 6800 gt, but now with the x800 xl you are only spending 50 to 100 dollars more than the 6600 gt (yes, the 6600 gt) and getting equal or better power (other than in opengl games) than the 6800 gt. the x800 xl also has the best overclocking I have seen in a long time (with water cooling you can get to x850 xt-pe levels).

ATI New Board With 32 Pipelines

im sry, but the r520 is gonna make sli 6800ultras look like a kiddy ride ;)
that is exactly what the 6800u did to the 9800xt, but then again, the x800 was equal. now the r520 is gonna be released and it will eat the 6800u, but nvidia will produce an equal product. or does anyone have an idea what the secret of nvidia's g70 is? i think it's a new high end chip like the r520, and we are gonna have two equal chips, both probably not available for months, but 6800/x800 prices are gonna drop :-)

ATI New Board With 32 Pipelines

best card u can buy at the moment is a geforce 6600 without gt. it has the same chip on it as the one with gt but is 100 dollars cheaper. the only thing u have to do is overclock the card without gt to the gt clock speed. memory is still slower, but you won't see the difference in games.

ATI New Board With 32 Pipelines

"Like I said, I believe that Nvidia saved a good amount of time and money by not releasing refresh cards."The only reason Nvidia did not release a refresh card for the 6800 line is not because they did not feel to release a new card every 3-6 months as you claim. Trust me, if they could release a card that would beat the XT850 out of the gate, they would. They made lots of refresh cards with the FX line: The 5900...5950...5900XT to name a few. However, this time around because of the transistor count on the 6800 series caused by PS3.0 support, they have no head room to push clock speeds to 500MHZ+. Any speed increase not with a water block cooler would be pointless. (even with water cooling, the 6800Ultra cannot go beyond 450MHZ)If Nvidia could push clock speeds with standard or dustbuster type cooling for the 6800 line to decent levels, THEY WOULD. However, they can't because of die size and transistor count....

ATI New Board With 32 Pipelines

I think not being able to push clock speeds higher is more of a fabrication issue and not really a design issue. Also, overclocking a part beyond specifications is not an indication of future parts, as manufacturing processes can change. E3 should be a good time for Nvidia to speak on the subject of new cards. I still believe that nvidia is doing the right thing by limiting the number of cards it "launches" so that they can focus on something which they feel is more important. ATI has, in their high end, the following: X800, X800pro, X800XL, X800XT, X800XT PE, X850, X850XT, X850XT PE. Nvidia has the following in their high end: 6800, 6800GT, 6800Ultra, 6800Ultra Extreme (done by OEMs, not even by nvidia). Two times the cards for ATI (just in the high end alone), but more is not always better. It could be that nvidia wanted to see how ATI would refresh so that they could accurately gauge how aggressive their refresh parts would have to be, performance-wise. Why guess when you can be certain? Also, about the 6600 being a better buy than the 6600GT: memory clock can and will make the difference between playing games at 1024x768 or 1280x1024; bandwidth is the key word there. Also, the 6600GT is now priced at about $30 US above the regular; get the GT for the better RAM. I am looking forward to getting a new video card next year when, once again, the mainstream parts are beating out the high end of the previous generation. You can bet I won't be spending more than $250 on a video card.
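
The bandwidth point above is easy to put rough numbers on. A minimal sketch, assuming reference memory clocks (the plain 6600's memory speed varied by board vendor, so treat both figures as illustrative assumptions rather than fixed specs):

# Rough peak memory bandwidth comparison behind the 6600 vs 6600GT argument.
# Clocks are assumed reference specs; actual boards varied by vendor.

def bandwidth_gb_s(effective_mem_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective clock x bus width / 8 bits per byte."""
    return effective_mem_mhz * 1e6 * bus_width_bits / 8 / 1e9

gf6600   = bandwidth_gb_s(550, 128)    # assumed ~275 MHz DDR (550 MHz effective)
gf6600gt = bandwidth_gb_s(1000, 128)   # assumed 500 MHz GDDR3 (1000 MHz effective)

print(f"GeForce 6600:    {gf6600:.1f} GB/s")   # ~8.8 GB/s
print(f"GeForce 6600 GT: {gf6600gt:.1f} GB/s") # ~16.0 GB/s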

ATI New Board With 32 Pipelines

If anything, ATI has saved the most money these last couple of years, since all of their releases were based on previous cores, primarily the R300. Now Nvidia created a whole new design, the NV4x series. The implementation of PS 3.0 cost Nvidia 60 million additional transistors on the already strained silicon. ATI waited and did not implement PS 3.0. If you ask me, it is obvious that ATI has saved the most money here. They do release a lot of refreshes, but still based on the same GPU design. Not creating a new design allowed ATI to focus on MVP and the R520. Currently ATI dominates the mobile, high-end and mid-range segments, while Nvidia leads only the low-end. It was announced at the inquirer.net that the X800XL is the new leader, bumping off the 6800GT. The R520 is ATI's first new design since the R300, and if the info is correct it will be 3x more powerful than the X800XT. ATI is just better at creating a GPU. Basically ATI has beaten Nvidia with old technology. Nvidia has nothing new this year; they were hoping SLI 6800 Ultras would hold the line. Expect the G80 in Q2 2006; it will be competing with a PS 4.0 R600 GPU.

ATI New Board With 32 Pipelines

It's a good thing this card is coming out now. We have been working with ATI on getting a home version of the IMAX theaters out to consumers, because we want everyone to see Halloween 9 in all its glory (or gory, haha LOL!) when it comes out on DVD. Michael Myers himself wanted ATI to front this project, well... because he is an ATI fanboy.

ATI New Board With 32 Pipelines

"that is exactly what the 6800u did to the 9800xt"WHAT ARE YOU TALKING ABOUT??? THE X800 HIT THE STORE SHELVES 6 TO 8 MONTHS BEFORE THE 6800. IT WOULD BE LIKE SAYING THE 9800XT BEAT OUT THE GEFORCE 4 TI 4600. THE 9800XT CAME OUT AFTER THE 5900 FX WAS MADE, SO THE 9800XT WAS AGAINST A 5900 FX. THE 6800 WENT UP AGAINST THE X800 NOT THE 9800.NOW YOU COULD OF SAID THE X800, FOR 6 TO 8 MONTHS, WAS GOING AGAINST THE 5950 FX... THE SAME WILL BE SAID ABOUT THE R520 GOING AGAINST THE 6800 ULTRA OC.

ATI New Board With 32 Pipelines

"WHAT ARE YOU TALKING ABOUT??? THE X800 HIT THE STORE SHELVES 6 TO 8 MONTHS BEFORE THE 6800. IT WOULD BE LIKE SAYING THE 9800XT BEAT OUT THE GEFORCE 4 TI 4600. THE 9800XT CAME OUT AFTER THE 5900 FX WAS MADE, SO THE 9800XT WAS AGAINST A 5900 FX. THE 6800 WENT UP AGAINST THE X800 NOT THE 9800.NOW YOU COULD OF SAID THE X800, FOR 6 TO 8 MONTHS, WAS GOING AGAINST THE 5950 FX... THE SAME WILL BE SAID ABOUT THE R520 GOING AGAINST THE 6800 ULTRA OC."where on this planet was the x800 series available before the 6800?? True, both took about 5 months or so, but saying the x800 was available 6 to 8 month before gf6800 is total bullsh*t. why? i got my 6800 in september 2004, 5 months after the official launch. x800 was announced 1 month later than the 6800, so you say the x800 was available 2-4 months BEFORE the launch. the first time ati had enough cards for sale was about october, so the 6800 was available earlier. think again.and, comparing r520 to the 6800u is indeed like comparing the 6800u to the 9800xt..........1st Generation (not the first of all but i gotta start somewhere)nv: gf4ti / ati r85002nd:nv: 5800,5900,5950 / ati: 9700,98003rd:nv: 6800 / ati: x800,x8504th:nv: g70(???) / ati: r520so where is the difference between comparing the 6800 to 9800 and r520 to 6800?

ATI New Board With 32 Pipelines

"If anything ATI has saved the most money these last couple of years. Since all of their releases were based on previous cores primarily R300. Now Nvidia created a whole new design the NV4x series. The implementation of PS 3.0 cost Nvidia 60 million additional transistors to the already strained silicon. ATI waited and did not implements PS 3.0. If you ask me it is obvious that ATI has saved the most money here. However they release a lot of refreshes, but still based on the same GPU design. Not creating a new design allowed ATI to focus on MVP and the R520. Currently ATI dominated the mobile market, high-end, mid-end segments while Nvidia only the low-end. It was announced the X800XL is the new leader bouncing off the 6800gt at the inquirer.net. The R520 is ATI first new design since the R300, and if the info is correct it will be 3x more powerful than the X800XT. ATI is just better at creating a GPU. Basically ATI has beaten Nvidia with old technology. Nvidia has nothing new this year, they were hoping SLI 6800ultras would hold off. Expect the G80 Q2 2006, it will be competing with a PS 4.0 R600 GPU."yeah, but the time will come when games require sm30 and the 6800 will be able to handle that challange (theoretically), but the x800 will be struggling. they might have saved money, but the older "the new" technology is the sooner it will be outdated. so buying a 6800 is wiser than buying an x800

ATI New Board With 32 Pipelines

"yeah, but the time will come when games require sm30 and the 6800 will be able to handle that challange (theoretically), but the x800 will be struggling."This can be said of the same thing with the FX line of cards and it's horrible PS2.0 support. They never followed MS format for shader protocal. Result? Slower shader performance in pixel shader 2.0 support without lowering Floating point precision and major optimizations. The difference here is, ATI followed MS protocol for PS2.0b so there should be little issue. Unless the game uses PS3.0 exclusively as it's shader model or most of it(NOT A PARTIAL OF ps3.0, ps1.4, ps1.1 like FarCry, Battle for Middle Earth)Then the 6800 series will fair better. However, here's the catch. Those very games that use PS3.0 exclusively or mostly(Unreal 3 ie.)will be very demanding and require major horses. Remember to be PS3.0 compliant, the game must render in full color precison. 128bit color or 32bit Floating point! ALL THE TIME! At this point, NO GAME IS PROGRAMED IN THAT COLOUR BIT or floating point. Nvidia's cards from the 5800 to the 6800 have the capability for it, however they run those programs so much slower when in that mode. Look at the 6800ultra running the Unreal 3 demo for a perfect example. Major slow fps. There is not one game that runs in 32 bit floating point on Nvidia's card. 12 or 16. Not 32FP. This is why Nvidia runs at a lower Floating point in all it's games. PS2.0b runs it the same at this time without the performance hit or IQ degradation. Until games becoame fully PS3.0 compliant with no other shader protocals used, you will see very little difference between PS2.0b and PS3.0 in games. IE: FarCry(Which happenens to use the most PS3.0 shaders at this time. And it still runs faster on ATI cards. And the label box on FarCry claims "Way It's Mean't to Be Played....."

ATI New Board With 32 Pipelines

"This can be said of the same thing with the FX line of cards and it's horrible PS2.0 support. They never followed MS format for shader protocal. Result? Slower shader performance in pixel shader 2.0 support without lowering Floating point precision and major optimizations. The difference here is, ATI followed MS protocol for PS2.0b so there should be little issue. Unless the game uses PS3.0 exclusively as it's shader model or most of it(NOT A PARTIAL OF ps3.0, ps1.4, ps1.1 like FarCry, Battle for Middle Earth)Then the 6800 series will fair better. However, here's the catch. Those very games that use PS3.0 exclusively or mostly(Unreal 3 ie.)will be very demanding and require major horses. Remember to be PS3.0 compliant, the game must render in full color precison. 128bit color or 32bit Floating point! ALL THE TIME! At this point, NO GAME IS PROGRAMED IN THAT COLOUR BIT or floating point. Nvidia's cards from the 5800 to the 6800 have the capability for it, however they run those programs so much slower when in that mode. Look at the 6800ultra running the Unreal 3 demo for a perfect example. Major slow fps. There is not one game that runs in 32 bit floating point on Nvidia's card. 12 or 16. Not 32FP. This is why Nvidia runs at a lower Floating point in all it's games. PS2.0b runs it the same at this time without the performance hit or IQ degradation. Until games becoame fully PS3.0 compliant with no other shader protocals used, you will see very little difference between PS2.0b and PS3.0 in games. IE: FarCry(Which happenens to use the most PS3.0 shaders at this time. And it still runs faster on ATI cards. And the label box on FarCry claims "Way It's Mean't to Be Played....."yeah, but if somone wants to buy a graphics card that he can use for lets say 3 years, he should definately get a 6800, as one year ago, he should have gotten the 9800 pro. and concerning unreal 3: the tested version had to run on the first drivers available. drivers were in a rather early st

ATI New Board With 32 Pipelines

"yeah, but if somone wants to buy a graphics card that he can use for lets say 3 years, he should definately get a 6800, as one year ago, he should have gotten the 9800 pro..."Rember the Geforce3 card? That card had no ps1.4 support and the 8500 did. Yet 2 years later the geforce3 card was playing games with ps1.4 effects with no issues or iq problems compared to the 8500. What does this mean. Simply put, ps1.1 and ps1.4 offered little performance differences. This is the same situation with PS2.0b and PS3.0.

ATI New Board With 32 Pipelines

WHATEVER!!! dude... it's simple everyone... if you want quality, GO WITH ATi!!!... But... if you're an idiot who cares bout his p*ssy FPS... go with nvidia... when you have AMD paired with ATi... you get the speed of AMD and the Quality of ATi..... ATi cards look WAY prettier than nVidia cards. If you ever notice... the lines on graphics with nvid cards look REALLY jaggedy... In conclusion, stick with (AMD and) ATi...

ATI New Board With 32 Pipelines

to this post 4:40 8/4/2005
yes, in the us ati had a slower product release date, but in canada, europe, and japan you could get the x800 6 to 8 months before the first 6800 was out. now, true, most stores had their x800 supply sold out very fast, but the 6800 did not hit the stores until 6 to 8 months later, and when the 6800 did hit the stores they also had a problem (even in the u.s.) of being available. i am not saying that ati is better than the 6800, just one cycle ahead.
get your timing right... the 8500 was made to go against the geforce 3 ti 500 (the release dates were the same). the 8500 128 mb came out during the geforce 4 ti series (ati for one cycle did nothing but work on the r300 chip). you might not agree with what i say, but it is the truth.
secondly, do not take what someone was saying out of context. 16:25 7/4/2005: his/her quote had this in the beginning: "that is exactly what the 6800u did to the 9800xt". this was taken from 4:55 7/4/2005. now maybe 16:25 7/4/2005 was an idiot, or he/she could have been sarcastic and was making another comparison based on release dates and not product equivalents.

ATI New Board With 32 Pipelines

Actually my friends, there will be a new version of DirectX for Windows Longhorn; it will now be called WGF 1.0. The NV4x series of cards is not supported. The R520 will have WGF 1.0 support, so basically whoever bought the NV4x series of cards did not get the long term investment they thought. Source: [url removed]

ATI New Board With 32 Pipelines

GOOD POINT! most review sites will only look at the best from each company at the time. we will see reviews of the 6800 ultra, single and sli, going against the r520. is it fair? no! but it will happen. they did it with the 8500 against the g4 ti, the g4 ti against the 9700, and the x800 against the 5950 fx. for those few months before the competitor has their equivalent card out, they get beat up by a newer card. it's life.
