# 6990+6970 Tri-Fire vs GTX 580 Tri-SLI Clash



## Jaskanwar Singh (Apr 28, 2011)

Introduction - NVIDIA GeForce 3-Way SLI and Radeon Tri-Fire Review | [H]ard|OCP



> The Bottom Line
> **AMD Radeon Tri-Fire is giving you the same or better performance than GTX 580 3-Way SLI for $500 and 200 watts less.** You get both a money savings and a power savings using Radeon 6990/6970 Tri-Fire instead of GeForce GTX 580 3-Way SLI.
> 
> It just makes no sense to build a GTX 580 3-Way SLI currently. AMD Radeon 6990/6970 Tri-Fire is better in terms of value, efficiency, and gaming performance than GTX 580 3-Way SLI. If you want to utilize that performance, the 2GB of RAM per GPU on the Radeon HD 6970 will allow you to do this and provide a noticeable gameplay experience and visual improvement over GTX 580 3-Way SLI. No other conclusion can be made at this point, AMD Radeon HD 6970 Tri-Fire is a tremendous value compared to GTX 580 3-Way SLI, and Tri-Fire is the better choice for multi-display gaming.
> ...





> In fact, the best case was that GTX 580 3-Way SLI matched Radeon Tri-Fire performance. In none of these games did the average framerate of GTX 580 3-Way SLI exceed that of AMD Radeon HD 6990/6970 CrossFireX. This is bad news for the GTX 580 3-Way SLI folks, considering the Radeon HD 6990 wasn't even running in its OC performance mode. *On top of that, three separate Radeon HD 6970 video cards offering triple-GPU performance will be faster still since those have faster core clock speeds and faster RAM speed than what we tested here.*



The power consumption chart:
www.hardocp.com/images/articles/1303213710rBOPZA7NSR_3_1.gif
Conclusion - NVIDIA GeForce 3-Way SLI and Radeon Tri-Fire Review | [H]ard|OCP

My bad. Mods, please move it to the graphics card section.


----------



## Cilus (Apr 29, 2011)

Very good read, Jas. Thanks for the link. It looks like AMD is constantly improving its drivers, offering better and better multi-GPU scaling, an area that nVidia really dominated a couple of years ago.

Hope they come out with a good in-house physics engine too.


----------



## ico (Apr 29, 2011)

Nice review. In fact, I'm surprised by this myself.

I guess it is time for everyone to accept that Crossfire scaling in HD 6000 series is superior. And higher VRAM is not the only reason if you ask me.


----------



## asingh (Apr 29, 2011)

nVidia really lost it in 2010-11 regarding thermals and power. Now multi-GPU is slipping too.

Jas, really nice article. Short, informative and to the point. Well done.


----------



## Skud (Apr 29, 2011)

My mind goes back to the X800 days, when you needed a special CrossFire Edition card and that ugly external DVI dongle. How I hated ATi back then! The 6000 series changed everything though. One major development is the Catalyst Application Profiles, released as and when required, which take care of most CrossFire issues.

Once again great article by HardOCP.


----------



## vickybat (Apr 29, 2011)

ico said:


> Nice review.  Infact, I'm myself surprised from this.
> 
> I guess it is time for everyone to accept that Crossfire scaling in HD 6000 series is superior. *And higher VRAM is not the only reason* if you ask me.



Here's a quote from hardocp guys:



> For F1 2010 we were able to turn on 4X AA at 5760x1200 to try and push this game. We could not use 8X MSAA as a comparison because the GTX 580 3-Way SLI configuration ran into the VRAM wall at 8X CSAA and 8X MSAA.



For crysis warhead



> With GTX 580 3-Way SLI we were able to play at 5760x1200 with No AA and all Enthusiast settings. We were not able to have 2X AA enabled however, once again the VRAM limitation hit us. However, with AMD Tri-Fire we had no problems enabling 2X AA at these settings, providing a better gameplay experience.



How can you say VRAM isn't the issue? At higher resolutions, with AA and other texture settings, higher VRAM becomes a necessity. In the case of the Tri-Fire config, the VRAM playing field is 4GB (6990), whereas for the tri-SLI, the playing field is just *1.5GB*.

You can't say that SLI scales poorly here. The GPU is clearly held back by the paltry 1.5GB VRAM of the 580 at higher resolutions.

I am sure that a tri-SLI setup of three GTX 580 3GB versions will go past a 6990+6970 or even a 3x 6970 setup.

But the AMD setup consumes a lot less power for the performance and offers good value for a multi-monitor config at a cheaper price.


----------



## asingh (Apr 29, 2011)

^^
Vicky, if the 580 SLI is being held back because of the paltry 1.5GB, it is the GPU's fault, right? Add the economics and thermals of this type of solution, and CrossFireX wins this round.


----------



## vickybat (Apr 29, 2011)

^^ Yup, it is nVidia's fault for giving so little VRAM to such a strong GPU core.

Definitely CrossFireX wins the round, owing to the cheaper setup and 200+ watts lower power consumption. Fermis were always power hungry, right from the 400 series.

Hope nVidia fixes all these issues in its upcoming "Kepler" based GPUs.

PS: For pure performance, tri-SLI *these* (the 3GB versions) and watch it decimate every competitor.


----------



## ithehappy (Apr 29, 2011)

They surely will. But I think this 3-way SLI is absolutely impractical. Wanna compare? Go on with a single pair in SLI. How many general users, or even extreme gamers, use a multi-monitor setup?


----------



## Skud (Apr 29, 2011)

You are right. Very few users have more than a single monitor. But one thing's for sure: they look AWESOME!!! Again, with AMD it's comparatively easier to set up three or more monitors, and even a single card is sufficient if you really want to take the plunge.


----------



## damngoodman999 (Apr 29, 2011)

Check out the resolution they benched: it's a 3-monitor setup at a 5xxx-wide resolution. No way nVidia cards are gonna beat ATI at this resolution. They need to be benched at single-monitor performance too, though I accept ATI gets a thumbs up for Eyefinity.


----------



## vickybat (Apr 29, 2011)

^^ A 3GB GTX 580 tri-SLI can beat a 6990+6970 Tri-Fire @ 5xxx resolutions. But the cost and power will go sky-high, so it's not a practical setup.


----------



## Jaskanwar Singh (Apr 29, 2011)

To those who say AMD is given 4GB of RAM: guys, the 6990 is combined with a 6970, which has 2GB VRAM. So AFAIK 2GB will be used.

And people missed this:


> *We have given every advantage to the GeForce GTX 580 3-Way SLI* configuration in this evaluation, and yet it still can't compete. We are testing with the "slowest" Radeon 6990/6970 Tri-Fire configuration possible in this $1000 price range, and GTX 580 3-Way SLI just can't touch it.



Actually, the truth is they were not even comparing with 2X AA in Crysis.



ithehappy said:


> They surely will. But I think this 3 way SLI is absolutely impractical. Wanna compare? *Go on with a single pair of SLI.* How many of general users or even extreme gamers use Multi Monitor setup?



You need not worry. CrossFireX rules in dual-GPU setups too.



damngoodman999 said:


> Check out the resolution they benched: it's a 3-monitor setup at a 5xxx-wide resolution. No way nVidia cards are gonna beat ATI at this resolution.
> They need to be benched at single-monitor performance too, though I accept ATI gets a thumbs up for Eyefinity.



Actually, you don't expect people to go with triple-GPU setups for just full-HD resolution.


----------



## vickybat (Apr 29, 2011)

^^ Nope, it's 4GB. The primary card's frame buffer is used. Let's say you CrossFire a 6970 2GB with a 6950 1GB; there will be a 2GB frame buffer. At least this is my understanding, but I may be wrong.

They SLIed the 1.5GB versions. They said the VRAM is holding back the GPU, so it has to be the 1.5GB version and not the 3GB one. In fact, the availability of 3GB 580s is scarce.
Knowing this, we can assume the HardOCP guys didn't give every advantage to the 580.

The sentence you quoted means that they allowed the 580 to be tested in tri-SLI mode instead of SLI, to give it a fair chance against the Tri-Fire setup.

It doesn't mean they tested the 3gb version.


----------



## Jaskanwar Singh (Apr 29, 2011)

Batman, the sentence I quoted means that they tested the 580 1.5GB tri-SLI by allowing it to compete comfortably at settings it could easily play.


----------



## Skud (Apr 29, 2011)

Jaskanwar Singh said:


> actually you dont expect people to go with triple setups for just full HD resolution




Even the 25x16 res of 30-inchers would be insufficient to stretch those triple-setup monsters. You need at least two monitors, if not more, to justify a 3-way graphics config.


----------



## damngoodman999 (Apr 29, 2011)

The real fact: AMD has finally got what it really wanted all these years. Yes, their CrossFire profiles are great, I accept.

But

Check this video! YouTube - Quad SLI vs Quad Crossfire Heavyweight Showdown GTX 590 & HD 6990 Linus Tech Tips

Benched @ 2560x1600, single monitor.


----------



## Skud (Apr 29, 2011)

The point is, when you are spending $1000 or more on graphics cards alone, in all practicality you are not going to run them on a $300 monitor setup. AMD rules at multi-monitor setups, like nVidia has the edge in 3D gaming.


----------



## asingh (Apr 29, 2011)

Can nVidia cards run multi-monitor on a single card?


----------



## vickybat (Apr 29, 2011)

^^ No, they can't. The GTX 590 can, but it's a dual-GPU card.


----------



## damngoodman999 (Apr 29, 2011)

3D is a joke. How many people out there are actually going for 3D gaming?



Skud said:


> The point is when you are spending $1000 or more on graphics card alone, in all practicality you are not going to run it on a $300 monitor setup. AMD rules at multi-monitor setup. Like nVIDIA has the edge on 3D gaming.






asingh said:


> Can the nVidia cards run multi monitor on single cards..?



nVidia is only concentrating on releasing GPUs for their brand show. 3D, PhysX and CUDA will not increase frame rates; if anything, they'll reduce them. I find myself searching for PhysX effects (not enjoying PhysX) while playing games.

I really appreciate the multi-monitor support (Eyefinity)! This is what's really needed now; it makes a real impression in gaming.

I am not a fanboy; I only recommend a single GTX 560 Ti because it's really big bang for the buck, that's why! If the HD 6950 2GB were priced the same, I would surely point to the HD 6950.


----------



## vickybat (Apr 29, 2011)

^^ You can call PhysX and CUDA marketing gimmicks, but not 3D. I even consider 3D more immersive than Eyefinity, but a combination of the two works wonders. nVidia calls it *"3D Surround"*, and AMD is also going the 3D way with its *HD3D* technology.

Frame rates are not everything. As long as you are getting playable frame rates, it's never gonna bother you.

Consoles hardly get over 30fps and you get all the great games on them. PC exclusives are extremely scarce nowadays and every game is getting consolised. Consoles boast 3D gaming, and the PC community is following with great guns.


----------



## Skud (Apr 29, 2011)

damngoodman999 said:


> 3D is a joke. How many people out there are actually going for 3D gaming?



I know, all I was saying is that nVIDIA has the edge.




> I am not a fanboy; I only recommend a single GTX 560 Ti because it's really big bang for the buck, that's why! If the HD 6950 2GB were priced the same, I would surely point to the HD 6950.



Even the HD 6950 1GB is bang for the buck and competes successfully with the 560 Ti.


----------



## ashis_lakra (Apr 29, 2011)

It looks like ATI is gaining speed and an advantage over nVidia. Most sweet spots in all price ranges are occupied by ATI.

Love to see that it has got the grunt to challenge nVidia's mighty GPUs too.


----------



## damngoodman999 (Apr 29, 2011)

Skud said:


> Even HD6950 1gb is bang for buck and competes successfully with 560Ti.



Yes, the HD 6950 1GB is also worth considering, but board partners are not concentrating on better coolers or factory-OC versions of the HD 6950 1GB!


----------



## Skud (Apr 29, 2011)

That's because it's too close for comfort to the 2GB version of the card. At all common resolutions there's little difference between the two cards in current games. Releasing an OCed version of the 1GB card would kill the 2GB one, as it would be the better performer of the two while being cheaper. Neither AMD nor its board partners will invite such a situation.


----------



## ico (Apr 29, 2011)

lol, have I said VRAM doesn't matter @ high resolution?

All I have said is, 512MB more VRAM per GPU is not the _only_ reason. 



ithehappy said:


> *Go on with a single pair of SLI.* How many of general users or even extreme gamers use Multi Monitor setup?


They surely did that.

2 * GTX 580 SLI = $1000.
HD 6990 + HD 6970 = $1000 again.

The "trifire" solution was much faster.


----------



## Joker (Apr 29, 2011)

lol... give the HD 6970 and 6990 3GB of RAM per GPU and I'm sure they will win against GTX 580 3GB SLI, due to their superior scaling.



ashis_lakra said:


> It looks like ATI is gaining speed and an advantage over nVidia. Most sweet spots in all price ranges are occupied by ATI.
> 
> Love to see that it has got the grunt to challenge nVidia's mighty GPUs too.


Since the HD 4000 series, ATI has been superior.


----------



## Skud (Apr 29, 2011)

exactly!!!


----------



## ashis_lakra (Apr 29, 2011)

Wish AMD would also come up with a great solution of similar potential to the Core i5 2500K at a much lower price; that would be a sweet deal for gamers.


----------



## ico (Apr 30, 2011)

Please have a look at the thread's title and stick to the topic. Thanks.


----------



## vickybat (Apr 30, 2011)

Joker said:


> lol..give hd 6970 and 6990 3gb RAM per gpu, i'm sure that they will win against gtx 580 3gb sli due to their superior scaling.



Do you even know the term scaling? Do you know that every GPU has a frame buffer saturation point beyond which additional frame buffer goes unused?

You just don't go adding unnecessary VRAM. There was a time when 512MB was more than enough for a GPU core; the 1GB counterparts performed identically.

Can you tell the difference in performance between a 5470 512MB and a 5470 1GB? There is none, because the additional 512MB is not utilised. My 5750 cannot make use of 2GB of VRAM because the core won't be able to utilise it.

The GTX 580 hits the limit due to its 1.5GB VRAM (only at ultra-high resolutions beyond 1080p), and many reviewers have stated this. But the 6970's 2GB is more than enough for its GPU core, and it does not need additional VRAM. The 6990 has 4GB, which is 2GB per GPU; that is very good for it and it doesn't need extra VRAM either.

The bottom line is that GTX 580 3GB tri-SLI is faster than 6990+6970 Tri-Fire or 6970 Tri-Fire, but there are currently no reviews to support this because 3GB 580s are very hard to find and not part of nVidia's reference design.


----------



## asingh (Apr 30, 2011)

For dual-GPU cards or CrossFire/SLI, the doubled VRAM does not make a difference. Each frame's data is written to each card, so effectively the VRAM does not double.

Also remember that when you are spanning multiple monitors and the screen area is that large, AA is hardly needed; it makes little visible difference.


----------



## Jaskanwar Singh (Apr 30, 2011)

^ Thanks for the info, asingh. Also, tell me how much VRAM will be available in a 6990 4GB + 6970 2GB setup?


----------



## Skud (Apr 30, 2011)

It's 2GB. Check this link. It was published in 2008 but holds true even now:

CrossFire X explored - The Tech Report - Page 1



> The options are many. You could harness four GPUs together by using a pair of dual-GPU Radeon HD 3870 X2 cards, or given enough PCIe x16 slots, you could achieve a similar result using four Radeon HD 3850s. Cross-breeding is an option, as well, so a Radeon HD 3870 X2 could pair up with a single Radeon HD 3850 in a three-way config. Kinky.
> 
> The caveat here is that CrossFire X will settle on the lowest core GPU clock, memory clock, and video RAM size to determine the operative clock speeds and effective memory size. As a result, a Radeon HD 3870 X2 paired with a Radeon HD 3850 256MB would perform like a trio of Radeon HD 3850 256MB cards. And, of course, that means the effective memory size for the entire GPU phalanx would effectively be 256MB, not 768MB, because memory isn't shared between GPUs in CrossFire (or in SLI, for that matter).



So here, it is basically 3x 6970 2GB.
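That lowest-common-denominator rule from the quote can be sketched as a quick calculation. This is a rough illustration of the rule as quoted, not AMD's actual driver logic, and the clock/VRAM figures are approximate reference specs:

```python
# Rough sketch of the CrossFire "lowest common denominator" rule quoted
# above: effective per-GPU core clock, memory clock and memory size
# settle on the minimum across all GPUs, and VRAM is NOT summed because
# each GPU keeps its own copy of every frame's data.

def effective_config(cards):
    """cards: list of (core_mhz, mem_mhz, vram_mb), one tuple per GPU."""
    core = min(c[0] for c in cards)
    mem = min(c[1] for c in cards)
    vram = min(c[2] for c in cards)  # usable frame buffer per GPU
    return core, mem, vram

# HD 3870 X2 (two GPUs) + HD 3850 256MB, as in the Tech Report quote:
trio = [(825, 900, 512), (825, 900, 512), (670, 830, 256)]
print(effective_config(trio))  # -> (670, 830, 256)

# 6990 (2GB per GPU) + 6970 2GB behaves like three 2GB GPUs:
trifire = [(830, 1250, 2048), (830, 1250, 2048), (880, 1375, 2048)]
print(effective_config(trifire))  # -> (830, 1250, 2048)
```

Note the clock part of this rule is for the old 3000-series behaviour described in the quote; only the "VRAM is not summed" part is uncontroversial.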


----------



## Jaskanwar Singh (Apr 30, 2011)

^ Thanks for that link, Skud. So 2GB of VRAM is effective in 6990+6970.


----------



## Cilus (Apr 30, 2011)

I don't think that is the case for CrossFireX on current ATI GPUs, starting from the HD 4000 series. Check Guru3D's HD 4850 + HD 4870 CF review: all the cards in CF can have their own separate clock frequency settings. Check Wikipedia for this. Even in my own CrossFireX setup, I can overclock each of the cards separately with different values.


----------



## asingh (May 1, 2011)

^^
What has clock frequency got to do with VRAM allocation? Skud is absolutely correct.


----------



## Joker (May 1, 2011)

I remember vicky and one more guy saying HD 6990 + 6970 CFX is not possible, and even if it were possible, they said GTX 580 1.5GB tri-SLI would be better (ico cleaned up that thread). That didn't happen. Thank god the HardOCP people came up with a review and proved me correct.

Coming to the point: if you go through HardOCP's review, they haven't really cranked up extreme settings with anti-aliasing, where the GTX 580 is handicapped by its 1.5GB VRAM. They only used settings which GTX 580 SLI could handle and then compared it to the alternative Tri-Fire solution. They made it an even contest. If you look at the review, GTX 580 SLI has lost, but hasn't lost by much, because HardOCP kept the contest even.

At the end of the day, what matters is value, which GTX 580 SLI DOES NOT have. CFX wins, plain and simple.

Note: the interesting thing is that the GTX 580 1.5GB easily owns the HD 6970 2GB @ 2560x1600 with 4X anti-aliasing. The scenario changes when you use multi-GPU setups. Reason: CFX just scales better than SLI.


----------



## Cilus (May 1, 2011)

Skud also mentioned that CF will settle down to the lowest GPU clock frequency of the two cards. That is not true; both of them can run at their own separate frequencies.
Skud is right about the VRAM allocation though: each card has its own memory to access.


----------



## ico (May 1, 2011)

vickybat said:


> The GTX 580 hits the limit due to its 1.5GB VRAM (only at ultra-high resolutions beyond 1080p), and many reviewers have stated this.





The review said:

> All tests are done in apples-to-apples configuration at the exact same settings. *Please note that the resolution and AA settings used were not a bottleneck to any setup here.* *We made sure we were not running into VRAM capacity limitations at these settings, so performance is being correctly compared and we are not hitting any VRAM walls at these tested settings.*


read this. ^^


----------



## Skud (May 1, 2011)

Cilus said:


> Skud also mentioned that CF will settle down to the lowest GPU clock frequency of the two cards. That is not true; both of them can run at their own separate frequencies.
> Skud is right about the VRAM allocation though: each card has its own memory to access.




If I remember properly, you can change the clock frequencies separately, most probably since the 4000 series, but that opens the door to more micro-stuttering and ultimately hampers gameplay performance.


----------



## vickybat (May 1, 2011)

Okay, I just want to know how they can say it's not hitting the VRAM wall? AFAIK, not only do AA and other post-processing effects require VRAM, but it also becomes a necessity when you render shader-heavy games at ultra-high resolutions.

The GPU fetches more and more data from a larger VRAM while rendering, reducing stalls. Increasing the resolution directly puts pressure on this, and a VRAM shortage will bottleneck the GPU.
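As a rough back-of-the-envelope check of this argument, here is a simplified model I'm assuming (one colour plus one depth/stencil target at 4 bytes per pixel each, scaled by the MSAA sample count; real games add textures, geometry and more buffers on top):

```python
# Back-of-the-envelope render-target memory at different resolutions.
# Simplified model: one 32-bit colour buffer + one 32-bit depth/stencil
# buffer, each multiplied by the MSAA sample count. Actual VRAM use is
# much higher once textures and geometry are counted.

def render_targets_mb(width, height, msaa=1, bytes_per_px=4):
    pixels = width * height
    colour = pixels * bytes_per_px * msaa   # multisampled colour buffer
    depth = pixels * bytes_per_px * msaa    # multisampled depth/stencil
    return (colour + depth) / (1024 * 1024)

for w, h, aa in [(1920, 1080, 4), (2560, 1600, 4), (5760, 1200, 4)]:
    print(f"{w}x{h} @ {aa}x MSAA: {render_targets_mb(w, h, aa):.0f} MB")
```

At 5760x1200 with 4X MSAA the render targets alone come to roughly 211MB, versus about 63MB at 1920x1080, so the fixed 1.5GB on a GTX 580 has proportionally less headroom left for everything else.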

My point is that three GTX 580 1.5GB cards can beat three 6970s in CFX at 1920x1080 or even 2560x1200.

But why can't they perform as well at 5760x1200? Forget AA, SSAO and the rest; I think VRAM matters here. Those guys could test the 3GB tri-SLI versions and settle whether VRAM matters here. But they won't.

I am not saying the review is unfair, because the 1.5GB version of the 580 is widely available while the 3GB one, not being part of nVidia's reference design, is not.

GPU scaling largely depends on *alternate frame rendering, supertiling (CFX only), split frame rendering and VRAM*. All these factors matter, and VRAM has a large role to play.

Compare a 6950 2GB Tri-Fire with a 6950 1GB Tri-Fire and you will know: the 2GB version will lead significantly @ 5760x1200 despite both sharing the same die. Tom's Hardware mentioned in their review that the 580 was running into VRAM limitations at 2560x1200.

Check the following:

[YOUTUBE]WpC_4JH05Yk[/YOUTUBE]

Two 6990s in CFX vs EVGA 580 3-way SLI @ 2560x1600

[YOUTUBE]yzb_fjjVlYQ[/YOUTUBE]

*^^ That's 4 GPUs vs 3.* If we apply some simple logic here, rather than blindly believing those HardOCP guys, we find that 3 GF110 dies hold up well against 4 Cayman dies at 2560x1600. They will definitely beat a 6990+6970 Tri-Fire at these resolutions. But their performance drops when 3 monitors come into play. So I see a clear VRAM deficit that hampers the scaling of the 580s while rendering shader-heavy games like Metro, Crysis etc.
I am not going to blindly believe those HardOCP people that VRAM isn't a bottleneck just because they say so, nor those silly user comments on that site.


----------



## tkin (May 2, 2011)

VRAM is actually the bottleneck here, no doubt about that. In multi-GPU mode the game does not see the total memory, just the VRAM of the primary card, so VRAM is the only issue here, not scaling.

See the Newegg review counts:
HD6970:
Newegg.com - Computer Hardware, Video Cards & Video Devices, Desktop Graphics / Video Cards, Radeon HD 6000 series, Radeon HD 6970
GTX580:
Newegg.com - Computer Hardware, Video Cards & Video Devices, Desktop Graphics / Video Cards, GeForce GTX 500 series, GeForce GTX 580 (Fermi)

The most-reviewed 580 has more reviews (hence, I think, better sales) than the top two 6970s combined, so despite the price considerations the 580s are outselling the 6970.

Again:
HD6950:
Newegg.com - Computer Hardware, Video Cards & Video Devices, Desktop Graphics / Video Cards, Radeon HD 6000 series, Radeon HD 6950
GTX570:
Newegg.com - Computer Hardware, Video Cards & Video Devices, Desktop Graphics / Video Cards, GeForce GTX 500 series, GeForce GTX 570 (Fermi)

Now, before anyone jumps up and says reviews do not mean sales or vice versa: I've observed before that the best-selling products have more reviews.


----------



## Joker (May 2, 2011)

Many GTX 570 reviews might be about VRM failures, lol.

The reviewer has said that he didn't let a VRAM bottleneck come into play in SLI, so there's no point arguing about this.

All that matters is raw power, which is a combination of everything. nVidia just got thrashed in this review every way you look at it: power consumption, performance, price, value etc.

Also, wait for HD 6990 CFX drivers to mature before posting videos.


----------



## tkin (May 2, 2011)

I don't remember any review that said the 570's VRMs fail; maybe you mean the 590? If you push vcore to abnormal levels, VRMs will fail; that's true for any card. My 5850's VRM failed when I was OCing it.


----------



## damngoodman999 (May 2, 2011)

tkin said:


> I don't remember any review that said 570 vrm failing, maybe you mean 590? If you push vcores to abnormal levels vrms will fail, true for any card, my 5850's vrm failed when I was Oc'ing it.



Yes, the GTX 570's VRM was very poor on the old reference cards at the time of release, failing due to OCing/heat. The new ones that come with aftermarket coolers are fine now!

Source - Dead 570's (Have you killed yours?) - XtremeSystems Forums


----------



## vickybat (May 2, 2011)

Joker said:


> many gtx 570 reviews might be about vrm failures lol.
> 
> reviewer has said that he didnt let vram bottleneck come into place in sli..so no point in arguing abt this.
> 
> ...



Check the following thread:

*Gtx 580 3gb sli VS 6970 2gb trifire*

The guy is a member of HardForum and tested on a 3x 30-inch multi-monitor setup. See it for yourself if you have any doubts about SLI scaling.

Scaling has nothing to do with that HardOCP review; it was clearly a VRAM bottleneck.

A small quote from that guy:



> My largest complaints originally with the 580s was the severe lack of VRAM and the DX9 portrait mode FPS limit bug. Those were total deal killers. I am glad to report that both are solved with these latest cards and drivers! I've included a few Tri-6970 benchmarks that did run but deleted the Quad-6970 line as everything besides Resident Evil 5 crashed (RE5 also allowed the sole Quad-Xfire benchmark). The Palit 3GB 580's where modestly clocked to 840/2050. Afterburner 2.1.0 Beta 7 can adjust voltages on these non-reference cards. All benchmarks are with driver texture quality settings at maximum, 16x AF forced with 2x MSAA.


----------



## tkin (May 2, 2011)

damngoodman999 said:


> Yes, the GTX 570's VRM was very poor on the old reference cards at the time of release, failing due to OCing/heat. The new ones that come with aftermarket coolers are fine now!
> 
> Source - Dead 570's (Have you killed yours?) - XtremeSystems Forums


LOL, the first post says the guy took a soldering iron to his 570 and volt-modded it; maybe that's why it popped? Try volt-modding your 6970 and see how long it survives when you add, say, 0.5V to it. When building GPUs, they don't keep in mind that some people want to run their GPUs straight from the wall on a 110V input.


----------



## damngoodman999 (May 2, 2011)

tkin said:


> LOL, the first post says the guy took a soldering iron to his 570 and volt modded it, maybe that's why it popped? Try volt modding your 6970 and see how long it survives when you add, say 0.5 volt to it. When building gpu they don't keep in mind that some people want to run their gpus from the wall using a 110v input



Not only that, there's a lot more:

[Solved] Fried Palit GTX 570 and poor warranty response - any advice? - Graphics-Cards - Graphic-Displays

GTX 570: VRM phase issue and brands : buildapc

But trust me, these issues are not present on the Asus DirectCU II, Twin Frozr II/III, etc. Those VRMs are really great and run excellently when OCed.


----------



## vickybat (May 2, 2011)

^^ Yes, the 570 has been out for a while now, and a lot of board makers have come out with non-reference versions of the card with good VRM designs.

So any buyer going for a 570 should stick with a non-reference card like the Palit Sonic, MSI Twin Frozr series, Asus DirectCU II, Zotac AMP etc.


----------



## Skud (May 3, 2011)

Some Quad GFX slug-fest @ HardwareHeaven:-

*Zotac NVIDIA GeForce GTX 590 - Quad SLI Review - Introduction* 


And some triple monitor gaming performance @ TechSpot:-

*Triple Monitor Gaming: GeForce GTX 590 vs. Radeon HD 6990 - TechSpot Reviews*


HardOCP has updated the tri-SLI/Tri-Fire results with a Sandy Bridge 4.8GHz system. Check it out:

*NVIDIA 3-Way SLI and AMD Tri-Fire Redux*

And the discussion on the same can be found here (it's already running to 20 pages!!!):

*NVIDIA GeForce 3-Way SLI and Radeon Tri-Fire Review @ [H]*


----------



## ico (May 4, 2011)

So, it seems like VRAM wasn't the factor after all. For some reason GTX 580 SLI gained hugely from the i7-2600K @ 4.8GHz and wins. CrossFire faced a penalty in some games.

Verdict: Don't compare until you have mature drivers out. True for both.


----------



## vickybat (May 4, 2011)

^^ The NF200 chip also mattered somewhat in terms of available bandwidth; the extra PCIe lanes really helped the multi-GPU setups. nVidia's architecture clearly favours Sandy Bridge, and we have seen some whopping performance increments from the 580 tri-SLI setup.

Talking about drivers: the tests used the same drivers as the previous test. They were unchanged.


----------



## mohiuddin (May 4, 2011)

Why so confused?
It's not only VRAM making the 580s weak;
it is a known fact that AMD cards (both single and CFX) take a smaller hit as resolution increases.
In the case of the 6850 1GB vs the GTX 460 1GB, below HD resolution the 460 did well, but at HD or higher resolutions the 6850 does better, and with the 11.4 driver at high AA and resolution it (along with the 6870 and 69xx) gained 8~20% performance in many titles. Cheers, we hope for more in 11.5.


----------



## vickybat (May 4, 2011)

^^ That's not the case with the GTX 580. It scales well, equal to its AMD counterpart.


----------



## damngoodman999 (May 4, 2011)

ico said:


> so, it seems like VRAM wasn't the factor after all. For some reason GTX 580 SLI gained hugely from the i7-2600k @ 4.8ghz and wins. Crossfire faced a penalty in some games.
> 
> Verdict: Don't compare until you have mature drivers out. True for both.



Bingo! If ATI releases the best drivers for their cards, they can hold the crown on the GPU side too!


----------



## mohiuddin (May 8, 2011)

vickybat said:


> ^^ That's not the case with gtx 580. It scales well and equal to its amd counterpart.



Care to explain why? Why would the 580 scale better than the other, structurally similar, Fermi cards?
Come on, face the facts: the 6xxx series just became awesome, defeating SLI at high res. Admit it.


----------



## tkin (May 8, 2011)

mohiuddin said:


> Care to explain why? Why would the 580 scale better than the other, structurally similar, Fermi cards?
> Come on, face the facts: the 6xxx series just became awesome, defeating SLI at high res. Admit it.


Only applicable to the 69xx, due to their 2GB VRAM; otherwise Fermi is faster.


----------



## rchi84 (May 8, 2011)

Anyone who denies that nVidia holds the performance crown is floating in a cloud of denial.

Where AMD is beating nVidia is the value segment. And going by the updated tests, if it costs $400 more to get 30% better frame rates (compared to the 6990+6970 CrossFire), that's a huge difference in terms of budgets.

Man, I wish nVidia allowed vendors to slap 1.5GB of VRAM on a 570. Would love to see a fight between the 6990+6970 combo and a 570 tri-SLI setup.


----------



## mohiuddin (May 8, 2011)

tkin said:


> Only applicable to the 69xx, due to their 2GB VRAM; otherwise Fermi is faster.



Nope. You see, 6850 CFX is a bit faster than GTX 460 SLI, and 6870 CFX is almost catching the 560 Ti; go check Guru3D and AnandTech.
Why blindly support someone? Please check, bro...


----------

