nVidia Introduces Dual GPU Systems

nVidia has decided to give new life to an old Voodoo concept and has announced nVidia SLI, which will allow multiple GeForce 6 Series or Quadro graphics cards to operate in a single PC or workstation for increased graphics horsepower. Appearing later this year in PCI Express-based PCs and workstations from the world's top manufacturers, the new nVidia SLI technology will take full advantage of the additional bandwidth and features of this new high-bandwidth bus architecture.

Interestingly enough, Alienware, which will also be launching its own Video Array dual GPU system (ALX) in Q4 2004, is also listed as a partner. nVidia seems to be creating a product which will allow other system builders to compete with Alienware. This competition, however, will leave Alienware with one advantage over its rivals: its own setup will, most probably, support ATI cards as well. It will be interesting to see how this battle of the ultimate PC will progress, especially since it may well develop into a battle between system builders.

nVidia SLI is a patent-pending hardware and software solution that enables system builders to connect two PCI Express-based GeForce 6 Series or Quadro graphics boards on their PCI Express-compatible motherboards. The technology features an intelligent communication protocol embedded in the GPU and a high-speed digital interface on the graphics board to facilitate data flow. A complete suite of software provides dynamic load balancing, and advanced rendering and compositing to ensure smooth frame rates and outstanding image quality.
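The announcement does not explain how the dynamic load balancing actually works, but the general idea of splitting each frame between two GPUs and shifting the split according to which GPU finished last frame sooner can be sketched roughly as follows. This is a hypothetical illustration, not nVidia's actual implementation; the class and method names are invented.

```python
class SplitFrameBalancer:
    """Hypothetical sketch of dynamic split-frame load balancing.

    Each frame is divided into a top and a bottom slice, one per GPU.
    After every frame, the split line moves a few scanlines away from
    whichever GPU took longer, so both slices tend toward equal cost.
    """

    def __init__(self, frame_height, step=8):
        self.frame_height = frame_height
        self.split = frame_height // 2  # start with an even split
        self.step = step                # scanlines moved per adjustment

    def rebalance(self, top_time_ms, bottom_time_ms):
        # Shrink the slice of whichever GPU finished later.
        if top_time_ms > bottom_time_ms:
            self.split = max(self.step, self.split - self.step)
        elif bottom_time_ms > top_time_ms:
            self.split = min(self.frame_height - self.step,
                             self.split + self.step)
        return self.split
```

For a 1200-line frame, a top GPU that takes 20 ms against the bottom GPU's 10 ms would see its slice shrink from 600 to 592 scanlines on the next frame, and so on until the two render times converge.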

"Unreal Engine 3 has an insatiable appetite for fill rate and polygon throughput, running shader programs of 50-100 instructions per pixel to achieve advanced per-pixel lighting and material shading. nVidia's SLI technology running on PCI Express motherboards provides an incredible performance increase of up to 2x, bringing next-generation gaming up to even higher resolutions and frame rates," stated Tim Sweeney, founder and president, Epic Games.

Systems based on SLI multi-GPU technology are expected to become available in the second half of 2004 from the world's leading PC and workstation manufacturers including:

- Alienware - North America, Europe
- Atelco Computer - Germany
- Boxx Technologies - North America
- Falcon Northwest - North America
- Mouse Computer Japan (MCJ Co., LTD) - Japan
- Network Technical - Sweden
- Paradigit - Holland
- Scan Computers - UK
- TSUKUMO Co., Ltd. - Japan
- UNITCOM INC - Japan
- Velocity Micro - North America
- ThirdWave Corporation - Japan
- VoodooPC - North America

Comments

The dude below me is so right. I looked in the mirror and was fucking dying of laughter. nvidia is trying so hard to even make something good, oops hold on, they've already made many good things, I know I couldn't do better. ATI is just still the best (in my opinion, which doesn't make it right). How come these goddamn Americans can't realize it (not just Americans, I just like blaming them because I'm really a loser). God damnit, if ATI was an American brand, they would commercialize their hardware 1 billion percent (even though I don't know that for a fact, I just want to sound smart). And hold on yet again, ATI rocks (I don't know why I wrote that, guess I'm too stupid to see that my opinion alone doesn't make it right). No further opinions (they don't matter anyway). -beeblebrox

Nice to see that the most inventive legacy of 3DFX lives on. It'll be nice to see which will be faster in a year... high-end GF6 or 2x budget GF6? Think aboot it! And go Alienware! They are managing to do their own version of SLI and they just make computers! That rocks!

OMFG, I'm so tired of all the SHIT nvidia gets, and I'm in a rather violent mood. Listen here, f**kers: in Far Cry, when they installed the patch to allow PS 3.0 functionality (that's right, all the benchies till now have been Far Cry in PS 2.0), the nvidia card PWNs the X800 XT. Want a f**king link? [url removed] Read the whole thing, or you can skip to the benchies. Point being, in all the OLD game engines sure the X800 XT may win, but in newer game engines the 6800U is the one to go with. What does this say? All the fools who got an X800 XT will be playing all of today's games great. Guess what? So can a 6800. But performance with these cards will become an issue in a few years, with uber-leet games which WILL be in PS 3.0, frankly because games are easier to make in 3.0, and all those who knew their tech and are with a 6800 will be wooting. Blarg, hehehe. Also, about the dual NV40s, read the f**king link: [url removed] There is a 1.7 to 2x performance increase. Yes, it has already been tested, f**kers. It works fine. And ATI will follow; they can't not follow, lest they be absolutely pwned by nvidia. The only person here with the right idea is shibby.

Right now there is no telling who has the biggest lead in graphics performance. Each card is better in its own way, which is difficult to test to see which one is faster at its unique purpose. I can't choose which one is better of the two. This race will continue for a long time.

I don't hate Apple above everything else. The only things I hate are spam, pop-ups, spyware, Trojans and viruses. The people that make this kind of shit should be eradicated from the earth. Something that nobody has pointed out is the POWER SUPPLY that must be used with this system. For two cards on the PCIe bus you will need a power supply with 4 CD-ROM connectors just for the 2 cards to be functional. Then come your hard drive and CD-ROM or floppy; what about a 550 watts, maybe 600W? I don't think this is a good idea, especially when you need a backup UPS and also more than one computer in the house with that kind of configuration, for example 2 or 3 brothers that would like to play in a LAN game... is my case... there are 3 PCs here for gaming and 3 more, less powered, not able to play some of the new games out there. Imagine the bill that will come out if the power consumption keeps going up.

May be strange, but NV40s have very good power management when idle. So when you ain't using the comp they don't power down the neighbourhood. Also, the NV40 is perfectly stable with 1 power rail split with a power splitter, so only 2 power rails are needed for 2 NV40s, although nvidia recommends two separate rails each. Another thing: are you going to be owning 2 NV40s? If you have that much money, upgrading your power supply to 550 watts won't be hard. Anyways, you don't need 2 NV40s; any game on the market today is more than happy with 1. They only made the dual ones to put ATi back in its place. *nibbles on a petri dish*

If future games are perfectly playable with just one card, imagine the possibilities with these dual GPU systems and PS 3.0, which is already being used. Gamers are closer to getting into gaming heaven! Of course we'll have to wait for the price to go down, but this is still the early stage of something awesome! nVIDIA is the best!

Buy this, buy that. I'm starting to feel like I'm getting ripped off every time I go to upgrade my PC. It's really starting to piss me off. I own an X800 Pro and I'm going to keep it and be happy with it for a long time. Say what you will, but it works great and hey... that's all I want/need. I used to like nvidia more because I had a GeForce Ti 4200, but when I went to upgrade I got a sucky FX 5700 Ultra that broke like six months after I got it. Now I'm done with crappy products. I've had my X800 for a while now and I have no more worries. Thanks for the GeForce 4200, nvidia, but never again will I buy your silly products. You seriously don't believe ATI won't find their own answer to this, do you? They are resourceful. They will boost their single cards' performance, they will do it better than nvidia, and their products will be cheaper. If anything in the past has taught me anything about these two companies, it's exactly what I just said. You wait and see, but for now, like I said before, I have my sweet X800 to cherish. ~rdog~

Had a Radeon once. With all those driver problems and game compatibility problems it's not worth it. Upgraded to nvidia last February since my mates are using nvidias too. So far it never let me down like the Radeon did. The drivers are solid and it has no problems at all. And I'm upgrading to an NV40 next month! It has lots of better features and I want my card to be able to play PS 3.0 games when the games are out.

There is nothing wrong with the ATi offering; I would be extremely happy with it, having a 9000 atm. It's very close, the war could flip either way. It's just my opinion that the 6800 is better. ATi will probably make dual card systems soon. It's too hard putting two cores on a single chip: too much heat and voltage on a card.

"i want my card to be able to play ps30 games when the games are out." When they finally do come out, not just 1 game like Far Cry, ATI as well as Matrox will support PS 3.0. As for dual chips: way too hot to run and major power consumption, especially if it's 6800 Ultras. Plus, everyone here is talking like this is available right now. Get a grip.

Hahahaha, I'm hella stalking this site. Yes, ATi *will* support PS 3.0, about the time when many PS 3.0 titles are out. If you want to upgrade now, buying a card which already supports 3.0 will be a good idea; it saves you from buying a leet PS 2.0 card and finding you need a 3.0 for some games down the track. Knowing myself, I tend to leave upgrades for about 3 years at a time. Dual chips are fine so long as they are not on the same PCB. 2 6800s, one above the other, doesn't overheat; the mammoth of a heatsink and uber fan make sure of this. They even said on a review site that OCing of the top card was possible, but not as much as the bottom. Any power supply over 500 watts can deal with the load easy. If you can afford 2 of these monsters then upgrading your PSU won't be hard. *nibbles on petri dish*

City, when "many" games support PS 3.0, that 6800 will be too slow to render those games decently. You will not see an abundance of PS 3.0 games until well into 2005. At this time only FarCry supports PS 3.0 (PARTIALLY), and that took the makers of FarCry almost 6 months to do. And the end result was a 10% increase in that game, not even an across-the-board performance increase. It's a start, and I applaud Nvidia for doing it. But it's not exactly going to convert the masses yet. What people seem to forget is that FarCry now runs the same as ATI's flagship chip with the PS 3.0 patch. The patch does nothing for any other game.

"We also wonder if ATI will respond in the consumer space. R300 and R420 have always been designed with scalability in mind, scalable by virtue of a patented method that divides the screen into tiles and allows different chips to render different tiles (thus being inherently load balanced). ATI have presently made no indications they wish to bring this to the consumer space, although Evans & Sutherland and SGI have created high end systems scaling numerous graphics processors in a single system, indicating the solution is fully operational." -Beyond3d.com- If ATI wants to follow Nvidia's footsteps, it can. But there is no money to be made with this. Less than 1% of the market will buy into these dual expensive chipsets. It's great for workstations. If only 1% of gamers buy high-end GPUs that cost $499 and up, what do you think will be the percentage for buyers of dual chips that cost $1000+? ;)

Anonymous: read the post from Hill Below, and then read the article at Beyond3d. "R300 and R420 have always been designed with scalability in mind, scalable by virtue of a patented method that divides the screen into tiles and allows different chips to render different tiles (thus being inherently load balanced)...." The Rage Fury Maxx patent technology is also another option for ATI if they want to go that route. And dual GPU systems are not patented, lol; it's the respective companies' technology that is, a.k.a. SLI and Alienware's setup. And ATI's Rage Fury chipset, which could be used with the 9800XT or R420 series. Do some research before you post.

Two reasons it stopped after Voodoo:
1) Top of the line cards are expensive, let alone buying two of them.
2) Why buy two when one card can run the latest games? Let the best card shine out from the crowd.
Good idea though; maybe if they sold them in two-packs I might buy. ;)

City, ATI already has. They use it for their graphic workstations at SGI. The R300 and R420 are scalable for it: "We also wonder if ATI will respond in the consumer space. R300 and R420 have always been designed with scalability in mind, scalable by virtue of a patented method that divides the screen into tiles and allows different chips to render different tiles (thus being inherently load balanced). ATI have presently made no indications they wish to bring this to the consumer space, although Evans & Sutherland and SGI have created high end systems scaling numerous graphics processors in a single system, indicating the solution is fully operational." -Beyond3d.com- [url removed] "...more of the same with AA and AF enabled. Not much exciting so far, but these benchmarks don't highlight the properties that have enhanced performance under the new rendering path. It is interesting to note that the X800 series of cards appear to be more resilient to turning on AA and AF than the NVIDIA cards even under the new rendering path." [url removed] "The main point that the performance numbers make is not that SM3.0 has a speed advantage over SM2.0 (as even the opposite may be true), but that single pass per-pixel lighting models can significantly reduce the impact of adding an ever increasing number of lights to a scene. It remains to be seen whether or not SM3.0 offers a significant reduction in complexity for developers attempting to implement this advanced functionality in their engines, as that will be where the battle surrounding SM3.0 will be won or lost." [url removed] "In Research, the results are more dramatic. With Shader Model 2.0, the GeForce 6800GT can't keep up with the Radeon X800 Pro when AA and aniso

CONTINUED... "In Research, the results are more dramatic. With Shader Model 2.0, the GeForce 6800GT can't keep up with the Radeon X800 Pro when AA and aniso are at work. Switching to SM3.0 changes the outcome, giving the 6800GT the lead. The Radeon X800 XT PE still leads the entire pack, however." "In this case, Ubisoft worked with NVIDIA to make one of the best games of the past six months run smoothly on the GeForce 6800. That's spectacular, especially because the game still runs very well on Radeon cards." What you will find, City, is that with the PS 3.0 patch the 6800 wins in some tests now, loses in others and ties in some. When AA and FSAA are applied, that's where the X800 XT or the X800 Pro do better in most cases. Not all, but most. What you have to consider is this: Crytek took 6 months of optimizing for Nvidia cards to make FarCry run smoother on Nvidia cards. It did no such optimizations for ATI cards at all!! And they still in most cases run the same or better at higher IQ settings. Oh, and the X800 XT is still a 1-slot card; the 6800 Ultra is not. Also, what has been seen is ATI's desire to optimize the R420 chips for PS 2.0, which unfortunately for Nvidia at this time is the vast bulk of games coming out. There is only 1 game that supports PS 3.0, and that game still runs on par with the X800 XT or X800 Pro even with 6 months of optimizations and PS 3.0. Just the fact that the R420 cards can run it the same without PS 3.0 and the optimizations Crytek gave Nvidia over the past 6 months shows the R420 chips are solid. Just as solid as the NV40......

I think I'll try to press my ideas again when the next benchmarks arrive, which should be from nvidia's dual solutions and ATi's complement (what do you guys think it will be?). Heh, I read an article on how ATi can pull such good numbers with high filtering. It's rather smart: on surfaces which directly face the person playing the game, the edges are naturally smooth. Most GPUs filter the whole scene; ATi's doesn't filter direct objects, which makes no graphic difference, hence the smartness. It's notable that nvidia's GPUs pull ahead at higher resolutions though. They must have some ace up their sleeve there. Nvidia's problem is their core being too chubby, gobbling up heaps of power, requiring an immense heatsink and running at low clocks. ATi's problem is the aged tech; that's the only reason nvidia is still up there. Well, I have been doing a lot of reading. The 6800 U just beats the X800 XT in the next gen games, but when filtering is enabled the X800 XT's average frame rate is way over that of the 6800 U. A plus for the nvidians is that the minimal frame rate was 99% of the time higher than the Radeons', so there were fewer slowdowns in the game. But when the cards rampaged, the Radeons rampaged faster. That's just for the 6800 U. The 6800 GT does some serious damage, beating the X800 Pro in every respect except for a few filtering benchies. And I don't think the 9800 XT to the normal 6800 was a fair comparison, but the 6800 notched ahead. Mm, 6800 GT, hopefully there is some OCing to be done.

"It's noteable that nvidias gpu's pull ahead in higher resolutions though. They must have some ace up their sleave there." That begs to differ. If you put higher resolutions on, Nvidia cards run faster at times; however, as soon as you enable FSAA and AA, ATI cards take the lead, and that's mostly due to their higher clock speeds and temporal ANTI-ALIASING methods. Almost the same story as the 9800 vs. the 5900 series of last year. "Well i have been doing alot of reading. The 6800 U just beats the X800xt in the next gen games, but when filtering is enabled the average frame rate is way over that of the 6800U. A plus to the nvidians is the minimal frame rate was 99% of the time higher than the radeons. So there were less slow downs in the game. But when the cards rampaged the radeons rampaged faster. Thats just for the 6800U." Remember, Dis, the whole reason people spend $499+ for high-end video cards IS TO PLAY their games with FILTERING methods on at high resolutions. What's the point if you don't? Then stick with the 5900 series or 9800 series or less; those cards can do it also. "The 6800GT does some serious dammage beating the x800pro in everything respect, except for a few filtering benchies." That all depends on what game. And remember also that the X800 Pro IS just a 12-pipeline card compared to the GT's 16. However, the X800 Pro can be modded for 16-pipeline goodness and overclocks out of the box to 500MHz, at only $399.99 and as a 1-slot card. They already modded a few and ran tests; the card is on par with the X800 XT in speeds. Now that's a deal and a half. [url removed] Read the last link. It also mentions PS 3.0 and the NV50 and R500 BRIEFLY. Now please tell us some links that show the GT causing "serious" damage to the X800 Pro in "every respect", and not just in one or two games like FarCry or Quake 3 engine titles. Also show us some links with filtering on c

Continued... Also show us some links with filtering on compared. Also, Dis, you know you STILL cannot buy a 6800 GT or 6800 Ultra yet, so you have to take certain websites at their word on reporting, as the cards have not reached retail yet, or it's damn hard to find one. I know a few people who have the X800 Pro.

"Which should be from nvidias dual solutions and ATi's complement (what do you guys think it will be? )" You will have to use Alienware's dual setup for the ATI solution compared to the SLI solution for Nvidia, if Nvidia releases this SLI technology soon. As of right now it's just a paper-launch ad campaign. And the results should be the same as the single chips: if the Nvidia cards are faster in one game than the ATI cards, then doubling the performance for both (if that's possible) will still have Nvidia ahead. And that goes for games that run faster on ATI cards. From all the reviews out, the best Nvidia next gen card to get, if you can find one, is the GT version. It offers lower power draw and a 1-slot design over the $100 more expensive, power hungry, 2-slot 6800 Ultra. The ace in the hole for the X800 Pro is its modding capabilities and great overclocking out of the box on standard cooling. The GT cannot overclock as high due to its transistor count. The GT overall from initial tests beats the X800 Pro by small margins; however, who in their right mind would buy the X800 Pro and not mod it to 16 pipes???? At that point, the card will surpass the GT. 16 pipes at 500MHz and 100MHz memory clock........

"Heh, I read an article on how ATi can pull such good numbers in high filtering. It's rather smart. In surfaces which directly face the person playing the game the edges are naturally smoothe. Most gpu's filter the whole scene, Ati's doesnt filter direct objects. Which makes no graphic difference hence the smartness." That's a version of smart culling, and Nvidia uses it also; they call it UltraShadow or something to that effect. Not rendering scenes or filtering methods that are not necessary. This has been around since the days of the first 3dfx chip.

Woah, OK then, I don't have that much time on my hands. :p I never meant that nvidia cards win when they are at higher resos with filtering. The moment filtering comes into the equation, ATi takes the lead. But without filtering, on a reso increase, nvidia's cards' frame rate drop is lower than ATi's. "Thats a version of smart cullin. And Nvidia uses it also. They call it ultra shadow or something to that effect. Not rendering scenes or filterng methods that are not nessacary. This has been around since the days of the first 3dfx chip." Well, if that's the case, the person who wrote it is a noob. I just read and regurgitate, most of the time. There is no guarantee that a soft mod to enable the disabled pipes in the X800 Pro will work; ATi, I remember, has gone to great lengths to prevent this. Like the 9800 SE: it was rare to find one which could enable all pipes without graphic problems. Also, the amount a card overclocks depends on how much effort you put into it. Personally I never settle for an overclock below 20%; I will probably end up sticking a CPU HSF on it if I got one, knowing myself. :) And you wanted links showing the GT giving the Pro a swift ass kicking: "The faster test-system proved that performance wise things will shift in advantage for the x800 XT series over the Ultra. It's really a tie though, both cards are just so close to each other. But when we look purely at performance rankings in the highest resolution this would be the end-result: in 4th place the x800 Pro, in 3rd place the 6800 GT, in 2nd place the GeForce 6800 Ultra and in first place the x800 XT, although a shared first place would probably be better wording. Remember I'm only talking about 1600x1200x32 here. On a somewhat slower system (I refuse to call a 1 gig, 2.8 GHz system mid-end) the results however are way closer to each other and definitely in favor of the GeForce 6800 series. The x800 XT is hungry for a faster processor, that's for sure." [url removed]

(It's at the last page; use the tabby things to go back.) This one is based on Far Cry: [url removed] Look to each card's minimal fps at the highest reso for each test. That's all I can be bothered finding for now. BTW, you taste like chicken. *what would this world be without randomness* Grr, it didn't all fit.

What you are finding for the first time now is that the CPUs are the bottleneck, not the GPU. This is why you see a difference on higher-end systems compared to lower-end systems with these cards from Nvidia and ATI. GPUs with greater fill rate and bandwidth fare better on faster CPU setups. What I find disturbing about Nvidia's latest driver releases is the increased support for the 6800 series (which you cannot buy at this time) and the diminished support for the FX line of cards. FarCry is one example: Nvidia's past 2 driver sets and patch 1.1 and patch 1.2 actually caused issues with some IQ on the 5950 and 5900 cards. You should see the abortion of IQ problems patch 1.2 caused on ATI cards; Driver Heaven talks of this. Gameplay on the 6800 series was flawless though. Like many have said, and this seems to hold true, Nvidia looks to be distancing itself from the whole FX line the same way they did with the 5800 when they released the 5900 in favor of its new chipsets. I find that disturbing considering not one average Nvidia owner has a 6800 to call his/her own. Here are some reviews of the GT also: "The flyby benchmark scales a little better, but that's bad news for the 6800 GT OC as it gets whooped up on by the X800 Pro at 1280x960 and 1600x1200." "The tables are turned here, as the 6800 GT OC outperforms the X800 Pro by a considerable margin at each resolution. Over 44 FPS at 1600x1200 with 4xAA and 8xAF is rather impressive." (X2: The Threat, Rolling Demo) What you find is that both cards, the X800 Pro and the 6800 GT, trade benchmark blows, just as the X800 XT does with the 6800 Ultra. [url removed] This generation of cards is too close to call a winner. If anything, like I said before, the better cards to get would be the GT and the Pro....

All you have to do regarding modding the X800 Pro is wait for about 2 months or so, and then you will find out which brand and model of the X800 Pro will mod to an X800 XT. I did this with my 9500 Pro: I waited until it was common knowledge which 9500 card and brand could mod to a 9700 Pro, then bought that one. That's the advice I give anyone planning to buy an X800 Pro card. That way you are not screwed later, and you have a $399 card that runs like the $499 16-pipeline XT card.
