# A New Graphic Card for my Gear



## DARK KNIGHT (Sep 20, 2011)

*A New Graphic Card for my Gear*

Hi guys, I want a new graphic card, but there's a little confusion in my mind: should I wait for the 7 series or go for the current cards? Please advise. My budget for the GPU is 12k. Your suggestions would be very helpful.


----------



## Tech_Wiz (Sep 20, 2011)

If you can wait, that would be good.

But considering you've got an HD monitor and your current card is nowhere near powerful enough to game at that resolution, you can opt for an HD 6870, or push 1.5k more and get an HD 6950.

Waiting for the 7xxx series will get you the same cards at a lower price. But for your monitor I don't feel anything more than a 6950 is really required.


----------



## Nipun (Sep 20, 2011)

If you can wait for the 7000 series, it would be great.

Otherwise you can get the HD6950 1GB @ 13k or the HD6870 1GB @ around 11-12k (not sure about the price).

Which PSU are you using, BTW?

EDIT: Ninja'd


----------



## DARK KNIGHT (Sep 20, 2011)

I mentioned my PSU in my signature; it's a Corsair HX 620 modular.

Should I go for the Sapphire HD6950 1GB or the MSI R6950 2GB Twin Frozr III PE?
Please suggest. Jas, Cilus & dk, I will wait for your suggestions.


----------



## Cilus (Sep 20, 2011)

^^ Then it can easily handle HD 6950 or GTX 560 Ti without any problem. But I suggest you wait until November, until the launch of the HD7000 series.


----------



## DARK KNIGHT (Sep 20, 2011)

Will the 7 series cards give a fresh breath of air to my LED monitor? Can they support its maximum resolution?


----------



## Tech_Wiz (Sep 20, 2011)

The HD 7xxx has been delayed till Jan '12, so that's 4 months away.

My vote is the MSI R6950 2GB Twin Frozr III PE. That cooler is awesome.


----------



## d6bmg (Sep 20, 2011)

Tech_Wiz said:


> The HD 7xxx has been delayed till Jan '12, so that's 4 months away.
> 
> My vote is the MSI R6950 2GB Twin Frozr III PE. That cooler is awesome.



I'll say the same to the OP. Just before he made his post, I read the news about the delayed release of the 7xxx series. So the OP should buy the 6950 2GB edition for the best value for money.


----------



## Tenida (Sep 20, 2011)

@Dark Knight
My suggestions:
*Rs 12k - MSI R6870 Hawk
Rs 14k - MSI N560GTX-Ti Hawk
Rs 16k - MSI R6950 TFIII 2GB PE*
Choose whichever suits your budget.


----------



## DARK KNIGHT (Sep 21, 2011)


Thanks Tenida and guys for your suggestions. I will buy the GPU in mid-October, so keep the good suggestions coming and I'll purchase a good, future-proof GPU. It's shocking news that the 7 series cards are delayed till January; they'll only appear in the market a month or two after that, so it's better to go with the 6 series. Anyway, thanks guys.
Nice avatar and signature, Tenida.


----------



## Tenida (Sep 21, 2011)

^^Thanxxx buddy


----------



## Jaskanwar Singh (Sep 21, 2011)

DARK KNIGHT said:


> I mentioned my PSU in my signature; it's a Corsair HX 620 modular.
> 
> Should I go for the *Sapphire HD6950 1GB or the MSI R6950 2GB Twin Frozr III PE?*
> Please suggest. Jas, Cilus & dk, I will wait for your suggestions.



The PE features a better cooler and more OC potential, so it's the obvious first choice.

If on a budget, then the Sapphire HD6950 2GB @ 14k or the Sapphire HD6950 1GB @ 13k.


----------



## vickybat (Sep 23, 2011)

*@ op*

For 14k, nothing beats the GTX 560 Ti Hawk. It's got a Twin Frozr III cooler.


----------



## mithun_mrg (Sep 23, 2011)

vickybat said:


> *@ op*
> 
> For 14k, nothing beats the GTX 560 Ti Hawk. It's got a Twin Frozr III cooler.



Also, Nvidia means PhysX and 3D as pluses, and fewer driver issues.


----------



## Tenida (Sep 23, 2011)

Yeah, agreed, mithun and vickybat.
Nothing can beat the MSI GTX 560 Ti Hawk @ 14k.


----------



## Jaskanwar Singh (Sep 23, 2011)

Hmm, so there's a new card, namely the 560 Ti Hawk. Please point me to a review of it, all 3 of you above. By its name it sounds like a beast. How does it stand against the 6950?


----------



## Nipun (Sep 23, 2011)

^^ Here's a review: MSI N560GTX-Ti Hawk Review - Overclockers Club


----------



## Jaskanwar Singh (Sep 23, 2011)

Nipun said:


> ^^ Here's a review: MSI N560GTX-Ti Hawk Review - Overclockers Club



Good card, it seems; it gives the 6950 some competition.

Any other reviews?

I found 3 more -
HARDOCP 560TI HAWK REVIEW
HARDOCP 550TI SLI REVIEW
TOMS HARDWARE 32 GRAPHIC CARDS BENCHMARKED

Search Google for these links.


----------



## Hustlerr (Sep 23, 2011)

Here's another one: MSI GeForce GTX 560 Ti HAWK review


----------



## Jaskanwar Singh (Sep 24, 2011)

^thanks.



mithun_mrg said:


> Also, Nvidia means PhysX and 3D as pluses, and fewer driver issues.



My friend, AFAIK AMD also has Eyefinity and HD3D.
And PhysX is a gimmick; except for 2 or 3 games, none uses it properly.

But you know, 3D and Eyefinity etc. don't come cheap. So most people just get a card for normal gaming, if we talk of India.

About drivers: why haven't I faced a single driver issue till now?


----------



## mithun_mrg (Sep 24, 2011)

PhysX is not a gimmick. Try playing Darkest of Days or Mafia II with and without PhysX and you will notice the difference.

ATI 3D is crap outsourced to a 3rd party. BTW, I think Nvidia 3D will become affordable soon.

Recently I have read a lot about driver issues online; you can find some of them on TDF as well:
*www.thinkdigit.com/forum/graphic-cards/146524-amd-driver-issues.html


----------



## vickybat (Sep 24, 2011)

^^ Exactly. PhysX is not a gimmick but the first step towards having physics code processed by a GPU rather than a CPU. The former is much better at it; refer to *this* to know why.


----------



## Jaskanwar Singh (Sep 24, 2011)

Enough joking from me now.



mithun_mrg said:


> PhysX is not a gimmick. Try playing Darkest of Days or Mafia II with and without PhysX and you will notice the difference.
> 
> ATI 3D is crap outsourced to a 3rd party. BTW, I think Nvidia 3D will become affordable soon.
> 
> ...



You play 2 games or 100 others; that's my point. And I did say "except 2-3 games" in my previous post.

And AMD HD3D is gaining wide support:
Supported Hardware
And you say Nvidia 3D will become affordable, without any proof from Nvidia?

That link you mentioned is a case of a faulty card, not a driver issue. Read it.


----------



## vickybat (Sep 24, 2011)

Jaskanwar Singh said:


> Enough joking from me now.



Jas, be serious in your posts.



Jaskanwar Singh said:


> You play 2 games or 100 others; that's my point. And I did say "except 2-3 games" in my previous post.



What if those 2-3 games turn out to be AAA titles? We already have some AAA titles releasing this year: Batman: Arkham City, Metro: Last Light.
These can be played with physics disabled and it won't hamper the playing experience. But Nvidia owners, or even AMD owners with a dedicated Nvidia card to compute physics, will be relishing these games in their full glory and effects.



Jaskanwar Singh said:


> And AMD HD3D is gaining wide support:
> Supported Hardware
> And you say Nvidia 3D will become affordable, without any proof from Nvidia?



HD3D is still an infant next to 3D Vision, but it is surely gaining some footing with the new Samsung and LG 3D monitors. Software support is still relatively low.

About 3D Vision becoming affordable: it already has. Nvidia has introduced wired 3D goggles which are much cheaper than their wireless counterparts.

Check *this*.

Check some *games* here supporting an amazing 3D (3D Vision) experience.

In contrast, here are the HD3D *games* supporting 3D. Now tell me which has more support, and where you would invest your money if you want 3D?


----------



## Jaskanwar Singh (Sep 24, 2011)

vickybat said:


> Jas, be serious in your posts.
> 
> What if those 2-3 games turn out to be AAA titles?
> 
> ...



OK.

There are lots of other good titles to play.

I know 3D Vision is more mature; I'm just pointing out that AMD HD3D is growing.

And we need a good monitor too, along with the glasses; they don't come cheap. What is the price of those glasses in India?

And mostly we need to concentrate on the Indian market, in which, as per the forum, most people don't have the budget for 3D.

And we are yet to see how PhysX performs in Metro Last Night.


----------



## vickybat (Sep 24, 2011)

^^ Couldn't find any Indian prices for those wired glasses yet, but I will let you know once I do. India is slowly adopting 3D; it will be mainstream in a couple of years.

About Last Light (not Night): it's an upcoming title and will most probably have very good effects. Explosions will carry a good deal of physics effects in the game, and I'm sure it will be much better than Metro 2033. It's kind of obvious that it will be better than its predecessor.


----------



## Jaskanwar Singh (Sep 24, 2011)

Well yes


----------



## DARK KNIGHT (Sep 24, 2011)

Is PhysX really a matter of concern? If yes, then which would be the better GPU in my case? Will PhysX matter in future games too? Please give me good suggestions, because it's a matter of hard-earned money.
I am very confused after the spat between vickybat & Jaskanwar.


----------



## kapilove77 (Sep 24, 2011)

LOL @ hard-earned money. Nevertheless, PhysX is just effects, and it's available on Nvidia GPUs only. Maybe more games will use it in the future.


----------



## Jaskanwar Singh (Sep 25, 2011)

DARK KNIGHT said:


> Is PhysX really a matter of concern? If yes, then which would be the better GPU in my case? Will PhysX matter in future games too? Please give me good suggestions, because it's a matter of hard-earned money.
> I am very confused after the spat between vickybat & Jaskanwar.



No man, PhysX is not the criterion for choosing. Only 1-2 games use it properly, and there are lots of other good titles to play.

Get the MSI R6950 TFIII PE/OC with your eyes closed.


----------



## Tenida (Sep 25, 2011)

^^ Don't push everyone to buy the HD6950 just because you have bought one. BTW, do you know the OP's budget? How can you recommend something without knowing the actual budget?
BTW, in my previous post I said the MSI GTX 560 Ti Hawk @ 14k is a killer card because it has the TFIII cooler. But why are you asking me to give a review? Who are you to talk to me like that? It's solely my opinion. Why do you pit everything against the HD6950 every time? Because you have one, so everyone has to buy it? Stop giving suggestions like a fanboy.


----------



## Jaskanwar Singh (Sep 25, 2011)

Tenida said:


> ^^ Don't push everyone to buy the HD6950 just because you have bought one. BTW, do you know the OP's budget? How can you recommend something without knowing the actual budget?
> BTW, in my previous post I said the MSI GTX 560 Ti Hawk @ 14k is a killer card because it has the TFIII cooler. But why are you asking me to give a review? Who are you to talk to me like that? It's solely my opinion. Why do you pit everything against the HD6950 every time? Because you have one, so everyone has to buy it? Stop giving suggestions like a fanboy.




Don't you understand a joke either?

And I am not compelling anyone, just giving my suggestion. The OP mentioned he can get a 6950 PE, so I said get it. How does that disturb you? Because you are a fanboy?

And a TFIII is not everything; the whole world doesn't buy MSI. Other cards are also good, and run at fine temps.
But I never said low temps aren't welcome.

Don't just come and start bashing others.
And I can say the same to you: don't suggest the 560 Ti just because you bought it. Happy now?

We have to suggest based on performance, not just the cooler every time.
What would you suggest out of these: the MSI 550 Ti Cyclone II or the Sapphire 6790?
Of course the 6790, because it performs better, although the Cyclone 550 Ti will run cooler.
(Don't take it as 6950 vs 560 Ti.)


----------



## vickybat (Sep 25, 2011)

*@ op*

For 14k, opt for the MSI GTX 560 Ti Hawk.

For 16k, go for the MSI 6950 TF-III/PE.


----------



## DARK KNIGHT (Sep 25, 2011)

Will my PSU be sufficient for both the MSI GTX 560 Ti Hawk and the MSI 6950 TF-III/PE?
And please guys, don't fight; we are all Digitians.


----------



## Jaskanwar Singh (Sep 25, 2011)

DARK KNIGHT said:


> Will my PSU be sufficient for both the MSI GTX 560 Ti Hawk and the MSI 6950 TF-III/PE?
> And please guys, don't fight; we are all Digitians.



Yes.

And my suggestions:
14k - Sapphire HD6950 2GB
16k - MSI R6950 Twin Frozr III PE/OC


----------



## DARK KNIGHT (Sep 27, 2011)

Have the prices of these above-mentioned cards gone up?
14k - Sapphire HD6950 2GB
16k - MSI R6950 Twin Frozr III PE/OC


----------



## mithun_mrg (Sep 28, 2011)

^^ Well, I think a little bit, due to the rupee-dollar depreciation or something like that. You can wait a bit; I think the prices will surely drop again, and during Diwali you may get a sweet deal.

So you finally chose ATI?


----------



## DARK KNIGHT (Sep 29, 2011)

Yes mithun, I have decided to go with the ATI 6950. Now tell me guys, which version is best?


----------



## mithun_mrg (Sep 29, 2011)

^^ ASUS DirectCU II, MSI Hawk/Twin Frozr, Sapphire Vapor-X


----------



## Skud (Sep 29, 2011)

DARK KNIGHT said:


> Yes mithun, I have decided to go with the ATI 6950. Now tell me guys, which version is best?




MSI TFIII PE/OC 2GB. Or, if you can actually find it, the Sapphire Toxic 2GB.

@mithun: there's no Vapor-X version of the 6950 2GB AFAIK.


----------



## rchi84 (Sep 29, 2011)

@Dark Knight: Dude, I will give you the honest answer. Today, AMD and Nvidia are well matched in most price segments, so choosing any card from either of them will not hurt you at all. It really comes down to which company you prefer.

In the lower mid-range, you have the Radeon 6870 and GeForce 560. In the mid-range, you have cards like the 560 Ti, the 6950 and their OCed counterparts. Then in the upper mid-range, you have the GeForce 570 and Radeon 6970. Beyond that, there is only one true high-end card, the GeForce 580, which is way beyond budget and practicality. Then we get into the crazy range with the 6990 and 590s.

As for the other issues mentioned, 3D is a gimmick which gives you a headache after 30 minutes (sometimes less) and also requires an expensive monitor/Tv. Unless they find a way to make it more comfortable, it is useless to make a 3D gaming rig for a serious gamer. Who wants to spend 70-80K on a gaming setup, and only play for 1 hour at a time?

As for PhysX, there are a few AAA titles like Mirror's Edge, Batman: AA, Mafia II, Metro 2033 and Alice: Madness Returns which use the effects well. The rest of the titles, IMHO, don't make effective use of PhysX, and it doesn't add much to the experience.

But, developers know Physx usage is not widespread, so the Physx effects are not included as main gameplay elements. It's more like salad dressing, and not the vegetables.

There are people who like Physx and those who don't like it. For me, if Battlefield Bad Company 2 and BF3 can have destructible environments on an awesome scale, all through CPU physics, i don't see the point of GPU physx, beyond some debris, dust and cloth effects.

Driver issues are nonsense for 99% of users who don't face any troubles on either side. If you are not into crossfiring, SLI, multimonitors then both companies drivers are rock solid.

AMD drivers used to be below average in Linux in the past, but I can tell you from experience, that both Nvidia and AMD make great drivers for Linux now. Of course, there's no point in using linux for gaming, which makes driver comparisons pointless on linux at least.

So choose any card that fits your budget and rest assured that you will have a decent gaming experience for a long while. There are people here gaming comfortably at medium-high details at 720p with a 4-year-old system.

Looking forward to the pics in the new purchase thread 

flamers, fire away


----------



## varunb (Sep 29, 2011)


Duhhh... for those who still don't know: not having an Nvidia card doesn't mean you won't see the PhysX effects (and you won't regret it even if you don't install the PhysX software). The lack of an Nvidia card just means the game will force the CPU to do the PhysX processing. So people can chill and buy AMD cards in peace.

As for the OP: you can go and buy any 6950 2GB card that is easily available near your area, or one you won't have trouble ordering online. It's not as if you will get sky-high performance out of one particular brand's 6950; it all depends on the user's config. So MSI Twin Frozr III, Sapphire, etc., any will do fine.

If you still want to reassure yourself, there are various benchmarks/reviews scattered throughout the internet which can easily be found.


----------



## mithun_mrg (Sep 29, 2011)




varunb said:


> Duhhh... for those who still don't know: not having an Nvidia card doesn't mean you won't see the PhysX effects (and you won't regret it even if you don't install the PhysX software). The lack of an Nvidia card just means the game will force the CPU to do the PhysX processing. So people can chill and buy AMD cards in peace.




Read this article:
NVIDIA Sheds Light On Lack Of PhysX CPU Optimizations - HotHardware


----------



## varunb (Sep 29, 2011)

I am fully aware of that article, but what exactly are you trying to suggest to the OP with it? That article, in short, clearly indicated to me that PhysX is just a gimmick Nvidia is using to draw customers into buying their products. Nothing else. Also, with the exception of Mafia II, every game featuring PhysX can be played smoothly without an Nvidia card.

Anyway guys, let's not carry this topic off elsewhere. rchi84 has pretty much summed up the main points. The OP asked about the price and the card he can get for that price.


----------



## mithun_mrg (Sep 29, 2011)

varunb said:


> I am fully aware of that article, but what exactly are you trying to suggest to the OP with it? That article, in short, clearly indicated to me that PhysX is just a gimmick Nvidia is using to draw customers into buying their products. Nothing else. Also, with the exception of Mafia II, every game featuring PhysX can be played smoothly without an Nvidia card.
> 
> Anyway guys, let's not carry this topic off elsewhere. rchi84 has pretty much summed up the main points. The OP asked about the price and the card he can get for that price.



Again, I am saying PhysX is not a gimmick, it's a technology, and you read the article wrong: it meant that the PhysX code is optimised to run smoothly on Nvidia GPUs only.

Graphics is all about eye candy, and PhysX just adds to that without any extra cost. Try playing Metro 2033, Dark Void, Cryostasis, Darkest of Days or Mafia II with PhysX on and then off; you will notice the difference. And customers nowadays are not fools who buy a GPU just by taking PhysX into consideration.

Also, the OP has already made his choice; there is no point suggesting anything to him now, so this thread can cool off.


----------



## varunb (Sep 29, 2011)

mithun_mrg said:


> Again, I am saying PhysX is not a gimmick, it's a technology, and you read the article wrong: it meant that the PhysX code is optimised to run smoothly on Nvidia GPUs only.
> 
> Graphics is all about eye candy, and PhysX just adds to that without any extra cost. Try playing Metro 2033, Dark Void, Cryostasis, Darkest of Days or Mafia II with PhysX on and then off; you will notice the difference. And customers nowadays are not fools who buy a GPU just by taking PhysX into consideration.
> 
> Also, the OP has already made his choice; there is no point suggesting anything to him now, so this thread can cool off.



It's a gimmick alright. It appears you misunderstood me, and you are not looking beyond the PhysX effects. Sure, the games look good with PhysX on; I am not denying it. By gimmick, I meant the trick of attracting customers by saying "Oh, look at the graphics, particle effects, etc. we have got to offer... blah blah blah". I suggest you research why many of the devs have not yet adopted it, and why Nvidia never meant the code to run efficiently on the CPU, as many websites have stated.

Nvidia is competing with DX11 DirectCompute and OpenCL by releasing CUDA and PhysX, drawing customers by saying "only we can give you hardware PhysX, but you've got to own our card". If there is no Nvidia card installed, then no hardware PhysX for the gamer. Nvidia is being an ass about this by not sharing PhysX or making it open source. It is always about the competition, buddy, never about the graphics. Do not forget that. If it's not a gimmick, or should I say competition, then why are they not removing the GPU check for the PPU?

The games you mentioned can be played on an ATI card with PhysX on; the only exception, as I mentioned earlier, is Mafia II. If the OP wants to get an Nvidia card for PhysX, I won't discourage him at all. Anyway, the OP has already decided, so I'm tapping out now.


----------



## Skud (Sep 29, 2011)

PhysX is a gimmick because you cannot simply use it in each and every game, unlike MLAA/FXAA/Eyefinity etc., even on an nVIDIA card. You need to code specifically for PhysX, and only a handful of developers actually take the pain to implement it, because at the end of the day that means catering to only a handful of customers.

That's not to say PhysX is bad, far from it, but nVIDIA's monopoly has almost killed it.


----------



## mithun_mrg (Sep 30, 2011)

Why blame Nvidia? Don't you think that even if ATI had acquired Ageia, they would have done the same thing? Moreover, thank Nvidia for providing it for free and not charging extra for hardware or software. Lastly, I also feel that if they want PhysX to succeed as a technology, they will have to make it platform-independent.

Look at the current situation: the price-performance difference between the two manufacturers is almost minimal, but those value-added features count for the end users.


----------



## Skud (Sep 30, 2011)

I would have thanked nVIDIA if PhysX worked when I paired an nVIDIA card with an AMD (main) card without any hacks. That PhysX doesn't work in this kind of setup is reason enough to blame nVIDIA.


----------



## Cilus (Sep 30, 2011)

AMD always emphasizes technologies based on open architectures, not proprietary designs which only support some specific hardware. That's why their APP (Advanced Parallel Processing) technology, the competitor to Nvidia's CUDA, is based on the *DirectCompute architecture*, which can be used with any video card with DirectCompute support, irrespective of the card manufacturer. I have plenty of experience with it.

Consider the example of the *CoreAVC Video Decoder*, the fastest H.264/AVC decoder available. *AMD has provided support for DXVA GPU acceleration in it, and it can be used with Nvidia cards too.* On the other hand, enabling CUDA in CoreAVC requires an Nvidia card.
The approach Nvidia is taking basically slows down the development of advanced software because of its proprietary nature. Any open software design grows far faster, thanks to the number of developers who can work on it, the easily available and highly customizable software libraries, and the large amount of supported hardware.

Yes, I cannot deny that games with a highly optimized PhysX design look better, but it does very little to improve the gameplay experience. It cannot be considered a deciding factor while purchasing a card; one should look at the FPS counter, AF and AA performance, etc. while making a graphics card decision. I have a dedicated PhysX card (enabled through the PhysX mod to work with AMD cards) and believe me, only a few games look better with PhysX enabled.
Homefront, Bulletstorm, Batman: Arkham Asylum and Metro 2033 show very little effect with PhysX switched on. The only game which looks significantly better is Mafia II. *Another thing: PhysX can run on the CPU very smoothly if the code is optimized to use the SSE (1, 2, 3 or 4) instruction sets, but Nvidia has used unoptimized x87 code for CPU PhysX execution, forcefully reducing CPU PhysX performance to glorify their cards.* It is not only my word; all the major review sites like Guru3D, Tom's Hardware and AnandTech have said the same over and over.

AMD is currently helping the advancement of the Havok and Bullet physics engines for GPU acceleration. Again, they are focusing on open architectures so that those engines can be accelerated by any GFX card.


----------



## vickybat (Oct 2, 2011)

Skud said:


> I would have thanked nVIDIA if PhysX worked when I paired a nVIDIA card with a AMD (main) card without any hack etc. That PhysX doesn't work in this kind of setup is good enough reason to blame nVIDIA.



There's no one to blame. Why on earth would Nvidia support AMD? It's their proprietary code and should work on their cards only. It's a very common business strategy.

About PhysX, I would say it's a first step towards running physics code on a GPU.
GPU physics should not be CPU-optimized, because for that the floating-point math has to be converted to fixed-point logic, since the CPU is more comfortable with the latter.

*For example, 1.68 in floating point would be represented as 1680 in fixed point (with a scaling factor of 1000).*

Fixed-point math operations take away a level of precision when applying physics logic to code. AMD will follow Nvidia's path, but like *Cilus* said, it will cater to open standards rather than something proprietary, which is good IMO.

Future games will see the rise of GPU physics, and it will bring some incredible effects to the table.


----------



## Skud (Oct 2, 2011)

vickybat said:


> There's no one to blame. Why on earth would Nvidia support AMD? It's their proprietary code and should work on their cards only. It's a very common business strategy.
> 
> About PhysX, I would say it's a first step towards running physics code on a GPU.
> GPU physics should not be CPU-optimized, because for that the floating-point math has to be converted to fixed-point logic, since the CPU is more comfortable with the latter.
> ...




It's not about supporting AMD, it's about supporting games. Why doesn't each and every game have support for PhysX effects, even with nVIDIA cards? Not even all "the way it's meant to be played" games come with PhysX support. And regarding future games, even Ageia predicted the same, and we are yet to see that future. And by the time that future comes, will these cards be able to handle those games, let alone with full PhysX effects?

At this point, just like 3D, PhysX is a moot point (read: gimmick). It cannot be used as a reference or a deal breaker/maker unless someone specifically asks for it, as there are very few games that actually take advantage of it. As *rchi84* rightly said, it's more like the salad dressing than the actual vegetables.


----------



## vickybat (Oct 2, 2011)

Skud said:


> It's not about supporting AMD, it's about supporting games. Why doesn't each and every game have support for PhysX effects, even with nVIDIA cards? Not even all "the way it's meant to be played" games come with PhysX support. And regarding future games, even Ageia predicted the same, and we are yet to see that future. And by the time that future comes, will these cards be able to handle those games, let alone with full PhysX effects?
> 
> *At this point, just like 3D*, PhysX is a moot point (read: gimmick). It cannot be used as a reference or a deal breaker/maker unless someone specifically asks for it, as there are very few games that actually take advantage of it. As *rchi84* rightly said, it's more like the salad dressing than the actual vegetables.



Who in the blue hell told you that *3D* is a gimmick? It's one of the most talked-about features and the next step towards virtual reality. AMD is desperately trying to catch up with Nvidia in 3D.

Read *this*.

And talking about PhysX: not all games support it because of a lack of GPU horsepower. In shader-heavy *"Nvidia titles"* like *Crysis 2* or even the upcoming *Battlefield 3*, the games are so demanding that implementing PhysX would take a toll on overall performance, so CPU physics is used as compensation. *Unreal Engine 3* is currently the engine best suited to GPU physics because it is light rather than resource-heavy, so the GPU can compute physics code while rendering simultaneously. You might want to mention Metro, as it's a shader-heavy title as well, but enabling PhysX takes a toll on performance there, and besides, it's not used extensively, since the game has few open environments to show debris scattering. *Metro: Last Light* will also incorporate PhysX. Check *this* & *this*.

As GPUs become more powerful, we will see more and more games supporting GPU physics, and hopefully AMD will join the bandwagon too.

Contradicting rchi84's statement (*he never saw the big picture*): even CPU physics is a salad dressing. After all, garnishing and dressing enhance the look of any food and also give it higher nutritional value. Physics is the same, whether CPU or GPU physics.

So before calling anything a gimmick, think twice.


----------



## ico (Oct 2, 2011)

There we go again: pseudo-gamedevs who don't work in the field (nor intend to) commenting on whether PhysX is a gimmick or not.

The excessive-tessellation myth in Crysis 2 was busted not long ago, when it turned out Crytek had tessellated a plain concrete slab unnecessarily for no change in visual quality. A mesh of tessellated water was also flowing beneath the ground. A waste of GPU horsepower for no benefit, shall I say? I hate to say it, but AMD was right that "excessive" tessellation is not required.

The fact that you take a decent hit when you turn on PhysX tells me there are better ways to implement those flying-paper and glass-break effects. After all, they are mere flying papers and glass breaks. How many games using it are worth playing? I can name only 4. Metro 2033 doesn't qualify as worth playing, as I have quite high standards.

Do have a look at The Witcher 2: a game without any gimmicks from either side. It runs on DirectX 9 and is the best-looking game till now. That's what games should be like, without any proprietary gimmick like PhysX. It also busts the "DirectX 11 is OMFG EPIC" myth. At the end of the day, it all depends on the gamedev and how he chooses to develop his games using the various APIs.

In the end, I am willing to spend Rs. 500 to play 100 games faster rather than spend Rs. 500 only to play 4 games (irrespective of the card manufacturer) with added effects which devs could have implemented in other ways. PhysX is a gimmick.

I'd rather take John Carmack's take on PhysX than some random poster's.

Coming back to the topic: spend more and get an HD 6950 2GB, reference or factory-OCed. That extra 1 GB of VRAM is going to do you more favours in the long run than the gimmicks of either GPU camp.


----------



## AcceleratorX (Oct 2, 2011)

IMO, NVIDIA's antics of late are due to desperation, i.e. heavy competition from AMD and their exit from the chipset market which is costing them revenue.

That being said, these days both NVIDIA and AMD GPUs are pretty competitive in price as well as performance. A lot could be said about NVIDIA, but the latest PhysX versions *do* have optimized multicore PhysX (including SSE2 if I remember correctly) and efforts to get PhysX working on Radeon cards were thwarted by AMD itself (by not offering any help whatsoever) and not NVIDIA who remained apathetic (read: Neutral) to the whole situation. Also note that NVIDIA never sued the group trying to make it work.

After that issue was forgotten, guess what? AMD announces partnership to get Havok and Bullet working through DirectCompute. Results are preliminary at best, almost two years later.

And what about the good? The real truth is that both companies have done good things as well. NVIDIA, having helped many small developers test and debug their games with their specialized TWIMTBP labs in Moscow testing for all kinds of bugs, graphics related or otherwise (even if they did add vendor specific crap). And AMD, which has added features that work on all GPUs including those of the competition.

There are issues with hardware, such as the angle-dependent Anisotropic filtering of NVIDIA or the less-than-optimal quality driver tweaking and possible AF bugs in AMD hardware.

Regardless of all this, the fact is that both companies have advanced semiconductor technology, enabled new features, and allowed PC gaming to advance and possibly survive. This is why both companies are important, historically or otherwise.

Also, the fact is that NVIDIA was the first to push for GPU computing, and still excels at it; it is something AMD has traditionally followed NVIDIA on rather than led (I'm talking feature-wise, reaction time, etc.).

So, given this basic rundown of the two companies, I really feel that at any price point, one GPU is as good as the other and there should be no inherent bias against AMD or NVIDIA - just get the product that has the best price to performance ratio! 

As for the topic, I do feel the Radeon HD 6950 is slightly better than the GTX 560 Ti, but you should have a good look at the price difference between them. If prices are close, go for the 6950; else go for the 560 Ti, since the performance difference is not very significant.


----------



## Skud (Oct 2, 2011)

vickybat said:


> Who in the blue hell told you that *3d* is a gimmick? It's one of the most talked-about features and a next step towards virtual reality. AMD is desperately trying to catch up with NVIDIA in 3D.
> 
> Read *this*.
> 
> ...




I have read that Tom's article twice already, but what exactly do you want to convey? That 3D is commonplace, or every gamer's next target? Or something else? And regarding the lack of PhysX: if it is a lack of GPU horsepower, then why recommend someone a card only because it supports PhysX?


----------



## vickybat (Oct 2, 2011)

Skud said:


> I have read that Tom's article twice already, but what exactly do you want to convey? That 3D is commonplace, or every gamer's next target? Or something else? And regarding the lack of PhysX: if it is a lack of GPU horsepower, then why recommend someone a card only because it supports PhysX?



If it were commonplace or every gamer's next target, you would have agreed that it isn't a gimmick, right? Just because it's expensive at the moment, or "not everybody has a 3D display", does not make 3D a gimmick.

Like I said, think twice.

Can you please quote who recommended a card here just because it supports PhysX? Do you even differentiate between PhysX and physics?


----------



## Joker (Oct 2, 2011)

I'm with Skud, ico, Cilus and AcceleratorX on this one.

Now, a GPU is fast at floating-point calcs... you don't have to use the PhysX API to offload them. There are other ways too, and that DOES NOT mean running the physics calcs on the CPU. Once you sit down with a game development stack like DirectX 11 or OpenGL + SDL, you find plenty of ways.
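That point can be sketched in a few lines; this is a toy, assumed example (plain Python, semi-implicit Euler, gravity only), not how any shipping physics engine is structured:

```python
# Toy sketch of vendor-neutral CPU physics: semi-implicit Euler integration
# for a few particles falling under gravity (y-axis only, for brevity).
# Assumed example -- it shows basic motion needs no proprietary API, nothing more.

def step(positions, velocities, dt=0.016, gravity=-9.81):
    """Advance every particle by one fixed timestep."""
    velocities = [v + gravity * dt for v in velocities]             # velocity first
    positions = [p + v * dt for p, v in zip(positions, velocities)]
    return positions, velocities

pos, vel = [10.0, 20.0, 30.0], [0.0, 0.0, 0.0]   # heights (m), starting at rest
for _ in range(60):                               # ~one second at 60 fps
    pos, vel = step(pos, vel)

print(all(p < p0 for p, p0 in zip(pos, [10.0, 20.0, 30.0])))  # True: everything fell
```

The same loop vectorises trivially, which is exactly why physics can live on either the CPU or the GPU regardless of API.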

PhysX is a gimmick. Educated people know this. Those who drink the NVIDIA Kool-Aid don't.



			
				AcceleratorX said:
			
		

> but the latest PhysX versions *do* have optimized multicore PhysX (including SSE2, if I remember correctly)


I think NVIDIA is still using x87 for PhysX. That's another reason why it runs slow on modern CPUs; SIMD instruction sets like SSE2, 3 and 4 are much faster on CPUs.


----------



## vickybat (Oct 2, 2011)

ico said:


> There we again see pseudogamedevs who don't work in the field/nor intend to commenting on whether PhysX is a gimmick or not.
> 
> Excessive tessellation myth in Crysis 2 was busted not quite long ago when Crytek tessellated a plain concrete slab unnecessarily for actually no change in visual quality. Also a mesh of tessellated water was flowing beneath the ground. Waste of GPU horsepower for no benefit, shall I say? Hate to say it, but AMD was right that "excessive" tessellation is not required.
> 
> ...



Hey welcome back.

I couldn't get your comments in bold. Can you please shed some light?




ico said:


> Do have a look at The Witcher 2. A game without any gimmicks from either side. Runs on DirectX 9. Best looking game till now. That's what games should be like without using any proprietary gimmick like PhysX. It also busts the DirectX 11 is OMFG EPIC myth. At the end of the day, it all depends on the gamedev, how he chooses to develop his games using various APIs.



Well, there's a game called Battlefield 3, in case you are not aware. I wonder how The Witcher 2 stands next to it in the visual department. Sorry to say, but it's high in tessellation; only time will tell if it's done right or not. The tessellation code in Crysis 2 was not part of the original development but was patched in later. That's why there was unnecessary overuse of it.

Btw, The Witcher 2 uses Havok physics (CPU), which is not a proprietary physics engine, and AMD is busy implementing it on their GPUs. Yes, you heard right: Havok physics code will be processed by a GPU.

So the point was not PhysX but physics being implemented on a GPU rather than a CPU, and we had a debate on this before.



Joker said:


> I'm with Skud, ico, Cilus and AcceleratorX on this one.
> 
> Now, a GPU is fast at floating-point calcs... you don't have to use the PhysX API to offload them. There are other ways too, and that DOES NOT mean running the physics calcs on the CPU. Once you sit down with a game development stack like DirectX 11 or OpenGL + SDL, you find plenty of ways.



Oh really? Can you tell me what this is, mate?

*
AMD demonstrates Havok with GPU acceleration*

What say now? OpenGL APIs are not meant for GPU physics; it's OpenCL rather, and AMD is pushing it hard.
Good that it's not proprietary like PhysX.



Joker said:


> PhysX is a gimmick. Educated people know this. Those who drink the NVIDIA Kool-Aid don't.



I don't think PhysX is a gimmick, because I don't see it the way you see it. I say it's a first step towards implementing physics on a GPU rather than a CPU, and since it's proprietary, they chose not to optimize it for other platforms, including the CPU. If you read the above link, you will see that NVIDIA's APEX toolset is similar to Havok. So both are trying to achieve the same thing while taking different routes.

I don't think I'm uneducated, and neither do I drink NVIDIA Kool-Aid (seriously, I don't know what that means). Adding instruction-set support doesn't make code optimized; the actual execution units matter. Instruction sets are a medium.


----------



## Jaskanwar Singh (Oct 2, 2011)

vickybat said:


> I don't think I'm uneducated, and neither do I drink NVIDIA Kool-Aid (*seriously, I don't know what that means)*.



Drinking the Kool-Aid - Wikipedia, the free encyclopedia


----------



## vickybat (Oct 2, 2011)

^^ Hey *jas*, thanks a lot, man. Now I'm starting to think I'm not as educated as I thought I was a little while ago.

*@ everyone*

Guys, take a chill pill and let's stop it right here. The OP will only be confused further. I guess the OP has decided on a 6950, and that's a good decision IMO.


----------



## Joker (Oct 2, 2011)

LOL @ "AMD demonstrates Havok running on GPU". A 2.5-year-old vapourware link. I've got nothing to make of it, since every game dev implements physics in some form or the other, whether it runs off the CPU or the GPU.

PhysX is a marketing gimmick... GPU physics is not. 

LOL, btw... don't blow the trumpet on BF3; hype can make you taste sour grapes, like it has done with Crysis 2. A much improved The Witcher 2 version 2.0 launched yesterday.


----------



## Jaskanwar Singh (Oct 2, 2011)

vickybat said:


> ^^ Hey *jas*, thanks a lot, man. Now I'm starting to think I'm not as educated as I thought I was a little while ago.
> 
> *@ everyone*
> 
> Guys, take a chill pill and let's stop it right here. The OP will only be confused further. I guess the OP has decided on a 6950, and that's a good decision IMO.



You are welcome.


----------



## vickybat (Oct 2, 2011)

*@ joker*

Yeah, I had much higher expectations of Crysis 2, but it wasn't like its predecessor. Visually, though, it was stunning, to be honest.

I guess you were following the Crysis 2 thread seriously?

But PhysX is GPU physics, right? The only difference is that it's proprietary.
I guess that doesn't make it a gimmick. There will be future development on it, and more and more games are adding support.


----------



## AcceleratorX (Oct 2, 2011)

I think NVIDIA did try hard for PhysX to become a universal API for GPU physics, but neither Intel nor AMD supported it. It's not heavily documented, but some hints can be found in press releases:

Nvidia Helps Porting PhysX on Radeon - Softpedia



			
				Softpedia said:
			
		

> Yet, AMD was rumored to *try developing its own PhysX* a few weeks ago.





			
				Softpedia said:
			
		

> There's no doubt that Nvidia is *more than delighted to see its API working on cards made by its strongest competitor*, not to mention the threat it may represent to Havok.





			
				Softpedia said:
			
		

> AMD still refuses to provide access to any HD 4800 hardware, but the support from other people allowed them to almost complete the CUDA Radeon library. All that is left to do, besides the huge amount of work the porting requires, is to convince AMD to aid the project, since its approval is mandatory at "developer and PR level".



Guess who didn't cooperate?

I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either 

As for PhysX vs. no PhysX, I do believe PhysX will not completely die, since it is by far the cheapest physics engine to integrate into any game (given its feature level compared to open-source ones).


----------



## ico (Oct 2, 2011)

AcceleratorX said:


> I think NVIDIA did try hard for PhysX to become a universal API for GPU physics, but neither Intel nor AMD supported it. It's not heavily documented, but some hints can be found in press releases:
> 
> Guess who didn't cooperate?
> 
> I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either


There is a reason why AMD didn't accept. It would have hurt their sweet-spot strategy. When AMD designs a GPU... they target a particular 'sweet spot' die size and design the GPU to fit it. They only add features the engineering team thinks are worth adding. Intel has a similar funda for CPUs after the Netburst fiasco: only circuitry which yields a > 2% performance increase for every 1% power-consumption increase gets added.

Ever since the successful RV570/80 and the R600 fiasco, AMD's strategy has been to design chips around maximum performance per mm^2 and maximum performance per watt, to maximize yield, and to have designs which can easily be scaled from low-end to high-end. They don't care about having huge 500 mm^2 monolithic dies with everything in them, like nVidia does.

When they went from RV770 to RV870, they even removed features like Sideport, which were eating up space, just to meet the desired die size.

Why doesn't nVidia release the PhysX SDK and documentation under the GPL or a BSD license? Only that would make PhysX open. All other licenses are proprietary or commercial.


----------



## Liverpool_fan (Oct 2, 2011)

AcceleratorX said:


> Guess who didn't cooperate?
> 
> I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either
> 
> As for PhysX vs. no PhysX, I do believe PhysX will not completely die since it is by far the cheapest physics engine to integrate into any game (given it's feature level compared to open source ones).


Why should AMD accept PhysX, which is owned and controlled by NVIDIA?
Why should they trust NVIDIA, who have a rather fishy record?
Why should they hunt for "hints" with incomplete documentation and no guarantee of first-class support?


----------



## vickybat (Oct 3, 2011)

Ok guys, now check what PhysX 3.0 brings to the table. Finally PhysX 3.0 supports multicore CPUs with the latest SSE instruction sets to take care of floating-point math operations. They claim this move will reach a broader user spectrum.

Check *this* & *this*

*Nvidia Releases PhysX 3.0 *

*guru3d*

It doesn't use the older x87 instruction set.



> *Arguably more noteworthy is a new Task Manager and managed thread pool, which "allows games to take advantage of multi-core processors on all platforms." You might recall that, last year, we discovered that certain games completely fail to implement PhysX in a way that takes advantage of multiple CPU cores—or even modern instruction sets like SSE. PhysX 3.0, it seems, is tackling that issue.*







ico said:


> *Why doesn't nVidia release the PhysX SDK and documentation under the GPL or a BSD license? Only that would make PhysX open. All other licenses are proprietary or commercial.*



This will never happen.


----------



## Liverpool_fan (Oct 3, 2011)

^ I'll suggest a new thread for that. NVIDIA releasing PhysX 3.0 doesn't affect the OP's purchase, does it?



mithun_mrg said:


> also Nvidia means + PhysX+3D - driver issues



I see where this all started.


----------



## vickybat (Oct 3, 2011)

^^ Yeah, you're right, mate. I'll PM a super moderator, and hopefully he'll move the PhysX posts to a new thread.


----------



## ico (Oct 3, 2011)

x87 for their GPU. So that they can say "FPU calculations on GPUs are so much faster than on CPUs" when modern CPU code clearly shouldn't be using x87 any more.

Now, when it comes to supporting the CPU, they use SSE because they _had_ to. I wonder what problem there was with them using SSE in the first place? Perhaps no benefit over the CPU? But then they wouldn't have been able to say that GPUs are better than CPUs for physics calculations. A self-created myth and cult, plus clever marketing.

Again, this tells me we are arguing over a non-issue and PhysX is nothing more than a gimmick. nVidia fanboys don't seem to believe this. Gamers thinking they are gamedevs is the last thing that should happen to this world.


----------



## Joker (Oct 3, 2011)

PhysX now natively supporting multicore CPUs?? Acceptance of defeat there. Doesn't it defeat the purpose NVIDIA GPU PhysX was born for?

edit: +1 to what ICO said.


----------



## sygeek (Oct 3, 2011)

I don't know much about this, but ico's posts do make more sense.


----------



## vickybat (Oct 3, 2011)

*@ joker*

No, it's far from that. You still need an NVIDIA GPU to make it work, and this time both CPU and GPU will work together.

You know, it's all about diversification. The PhysX SDK now supports wider hardware, i.e. from PCs to game consoles like the PS3 and Xbox 360.
Look at the big picture before posting. NVIDIA is not targeting AMD or any other physics competitor but aiming for a wider audience.

Great move, btw.

*physxinfo.com/wiki/PhysX_SDK_3.x



sygeek said:


> I don't know much about this, but ico's posts do make more sense.



If you don't know much, then how can you say his posts make more sense?
Stop trolling, please.


----------



## Joker (Oct 3, 2011)

vickybat said:


> *@ joker*
> 
> No, it's far from that. You still need an NVIDIA GPU to make it work, and this time both CPU and GPU will work together.



why not use SSE on the gpu then? why x87?


----------



## sygeek (Oct 3, 2011)

vickybat said:


> If you don't know much, then how can you say his posts make more sense?
> Stop trolling, please.


Because it makes sense, something which your posts lack significantly.


----------



## Liverpool_fan (Oct 3, 2011)

vickybat said:


> *@ joker*
> 
> No, it's far from that. You still need an NVIDIA GPU to make it work, and this time both CPU and GPU will work together.
> 
> ...


How the hell is NVIDIA targeting a wider audience if you need both CPU and GPU to work together? This makes absolutely no sense.
I don't see the real point of NVIDIA PhysX on portable devices either, to be honest, if that's the wider audience they are targeting.


----------



## ico (Oct 3, 2011)

Joker said:


> why not use SSE on the gpu then? why x87?


here is the reason:


Cilus said:


> *Other thing is PhysX can run on a CPU very smoothly if the code has been optimized to use the SSE (1, 2, 3 or 4) instruction sets of the CPU, but Nvidia has used unoptimized x87 code for CPU PhysX execution to forcefully reduce CPU PhysX performance and glorify their cards.* It is not only my word; all the major review sites like Guru3D, Tom's Hardware and Anandtech have said the same over and over.


Good post by Cilus here; I had somehow missed it. This is exactly what I meant in my previous post (#71).

x87 is antique: it became outdated on CPUs 12 years ago, when SSE succeeded it. nVidia has kept using x87 only to falsely glorify themselves. Using SSE would have meant no marketing aura and no pseudo-benefits.

This is a terrific read if you guys haven't read it: *Real World Technologies - PhysX87: Software Deficiency*


----------



## Skud (Oct 3, 2011)

LOL. This thread is on page 3 and half of the posts are about PhysX, which the OP never asked about. I guess by this time even the OP has completely forgotten why he created this thread in the first place. Chill, guys.


----------



## mithun_mrg (Oct 3, 2011)

I will post a small comment here: if PhysX were a gimmick, one of the most knowledgeable members of TDF, i.e. Cilus, wouldn't have used a dedicated card.
The problem is that to take full advantage of PhysX you need the power of an 8800/9800-class card.
It generally enhances the look & feel and, most importantly, the gameplay. *The impact PhysX has is much better than Havok or Frostbite*; can you deny this?
Game physics is as important as graphics. Remember the first time you played Max Payne 2?


----------



## vickybat (Oct 3, 2011)

Joker said:


> why not use SSE on the gpu then? why x87?



Actually buddy, it's using SSE-based instruction sets and not x87 this time. I couldn't find detailed info, just some bits and pieces here and there.

See this forum:

PhysX 3.0 - NVIDIA Forums

The guy says it's not using x87 any more. Maybe the GPU too is using SSE. The article ico posted is old, and there's no point in following it now.
Maybe NVIDIA made this move so that the latest CPUs can also handle its physics engine efficiently, but I also read that it needs an NVIDIA GPU to work. Until a detailed article is found, everything is a bit vague right now.

Found a small read:



> *1) Reads like an advert for NV graphics cards.
> 
> 2)
> Similar to the first set of test results, Radeon video cards suffer poor frame rate speeds when PhysX is enabled. Our Intel Core i7-920 quad-core CPU just doesn't compare to the hundreds of cores available in a graphics processor. Both AMD and NVIDIA products suffer heavily reduced performance when APEX PhysX is processed by the computer's CPU, although there appears to be an unexpected trend: the most powerful GPUs offer the inverse in CPU-processed PhysX performance.
> ...






Liverpool_fan said:


> How the hell is NVIDIA targeting a wider audience if you need both CPU and GPU to work together? This makes absolutely no sense.
> I don't see the real point of NVIDIA PhysX on portable devices either, to be honest, if that's the wider audience they are targeting.



Well friend, you got me wrong. By a wider audience I meant that they are porting the PhysX middleware to game consoles as well. You see, previously games, e.g. Uncharted 2, used Havok physics.
Now NVIDIA is targeting them. So upcoming games might use PhysX 3.0 and the APEX toolset, which is actually similar to Havok. That's a wider audience, not only PC gamers.


----------



## ico (Oct 3, 2011)

vickybat said:


> See this forum:
> 
> PhysX 3.0 - NVIDIA Forums
> 
> *The guy says it's not using x87 any more. Maybe the GPU too is using SSE.* The article ico posted is old, and there's no point in following it now.





			
				your link said:
			
		

> The PhysX SDK was compiled with SSE since the 2.8.4 release (August 16, 2010); GPUs would never benefit from this as they don't support SSE...


Time to read your own links properly and understand them. nVidia can't support SSE on GPUs just like that, without the necessary circuitry.

My article might be a year old... but whatever it said about nVidia PhysX on the GPU still holds true. They intentionally used x87 to falsely claim a performance benefit over CPUs.

Game, set and match.


----------



## Joker (Oct 3, 2011)

ownage.

to the OP..get hd 6950 2gb from msi or sapphire.


----------



## vickybat (Oct 3, 2011)

Well *ico*, I say it's time to cool off and put the discussion in a sane manner. Fighting won't produce anything useful.

Let's stay calm and put up some valid points instead of flaming NVIDIA or PhysX. 

Ok, now we both know that SSE, which is an extension of x86, was first adopted by Intel. SSE and all its extensions, i.e. SSE2, 3 & 4, were optimized for floating-point operations.

Yesterday I was checking a link where a guy posted C++ matrix-multiplication code involving floating-point data.

*He compiled the code for x87, SSE and SSE2. The run times for x87 and SSE were 2373 ms and 2368 ms respectively, which are almost equal. But when he used SSE2, things were significantly faster and the result was 1112 ms.*

Check the *source*, please.
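As a quick sanity check on those figures (assuming they really are run times of double-precision code): SSE2 packs two doubles per 128-bit register, so the ideal speedup over scalar code is about 2x, and the quoted numbers land close to that.

```python
# Quick arithmetic check on the figures quoted above (assumed to be run times
# of double-precision code): SSE2 processes two doubles per 128-bit register,
# so the ideal speedup over scalar x87 code is about 2x.
x87_ms, sse_ms, sse2_ms = 2373, 2368, 1112   # numbers from the linked post
speedup = x87_ms / sse2_ms
print(round(speedup, 2))   # 2.13 -- close to the 2x SIMD width of SSE2
```

The near-identical x87 and plain-SSE times also fit: the first SSE only added packed *single*-precision math, so double-precision code gains nothing until SSE2.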

Now, the guy says it's possible for PhysX to use SSE2, but he hasn't mentioned whether on the CPU or the GPU.

But remember that NVIDIA has its proprietary API, i.e. CUDA. Through CUDA, NVIDIA GPUs have access to the PTX instruction set, and this instruction set does not contain the SIMD instructions we call SSE (Streaming SIMD Extensions).

Now we all know that NVIDIA's architecture utilizes TLP on a grand scale. CUDA basically does SIMD, but in a different manner. Here, threads are divided into groups known as warps. Within a warp, the same sequence of instructions is executed, and when threads diverge at a branch, the instructions on the path a thread did not take are suppressed for that thread. This gives the illusion of different execution sequences. NVIDIA calls this *SIMT*, or single instruction multiple threads, instead of multiple data.
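The warp/masking idea can be mimicked in plain Python. This is only a toy model of SIMT divergence (the function and two-pass masking scheme here are illustrative, not real CUDA semantics):

```python
# Toy model of SIMT execution (illustrative only, not real CUDA semantics):
# every "thread" in a warp steps through the same instruction stream; at a
# branch the warp executes BOTH paths, and a per-thread mask suppresses the
# path each thread did not take.

def warp_execute(values):
    """Per thread: x * 2 if x is even, else x + 1 -- computed in lockstep."""
    mask = [x % 2 == 0 for x in values]                       # branch predicate per thread
    out = [x * 2 if m else x for x, m in zip(values, mask)]   # pass 1: 'even' path
    out = [x if m else x + 1 for x, m in zip(out, mask)]      # pass 2: 'odd' path
    return out

print(warp_execute([1, 2, 3, 4]))   # [2, 4, 4, 8]
```

Note that a divergent warp pays for both sides of the branch, which is why branch-heavy code maps poorly onto this model.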

So I think they have done something at the software level to harness SSE instructions.

Now let's wait for a more concrete preview of PhysX 3.0 to get the facts in more detail. Besides, I found the above information on *Stack Overflow*.


----------



## ico (Oct 3, 2011)

At the end of the day, PhysX is a gimmick. 

They intentionally didn't use SSE on the GPU.  *Edit: Note..SSE means the whole family. Not only the first iteration.*


vickybat said:


> So I think they have done something at the software level to harness SSE instructions.


If that is so...it is not worth it.  There will be a huge performance penalty as their GPU supports only X87.

Shouldn't be quoting PMs.


			
				ico said:
			
		

> Most likely it is a wrapper or, in simple words, a translator. To support SSE natively on the GPU, they would need to add suitable circuitry, which they haven't done.


----------



## Joker (Oct 3, 2011)

LOL... vickybat, now you are saying what we were saying.


----------



## vickybat (Oct 3, 2011)

But guys, I don't understand one thing. Let's say NVIDIA realized its folly in not supporting SSE, and now they claim to do so in PhysX 3.0.

Or maybe they are eyeing their next-gen GPUs, i.e. *Kepler*, to support SSE by adding some physical circuitry (as ico said).

Considering the SIMT model, can we assume that GPUs will finally make use of SSE and its extensions? I've got a feeling they will. Can you people please shed some light?


----------



## ico (Oct 3, 2011)

vickybat said:


> But guys, I don't understand one thing. Let's say NVIDIA realized its folly in not supporting SSE, and now they claim to do so in PhysX 3.0.
> 
> Or maybe they are eyeing their next-gen GPUs, i.e. *Kepler*, to support SSE by adding some physical circuitry (as ico said).
> 
> Considering the SIMT model, can we assume that GPUs will finally make use of SSE and its extensions? I've got a feeling they will. Can you people please shed some light?


SSE for GPUs? The future is fusion.


----------



## vickybat (Oct 3, 2011)

^^haha ok mate.


----------



## Liverpool_fan (Oct 3, 2011)

mithun_mrg said:


> I will post a small comment here: if PhysX were a gimmick, one of the most knowledgeable members of TDF, i.e. Cilus, wouldn't have used a dedicated card.
> The problem is that to take full advantage of PhysX you need the power of an 8800/9800-class card.
> It generally enhances the look & feel and, most importantly, the gameplay. *The impact PhysX has is much better than Havok or Frostbite*; can you deny this?
> Game physics is as important as graphics. Remember the first time you played Max Payne 2?



PhysX is a gimmick and plays absolutely no role in deciding the purchase of a consumer graphics card. Please don't drag this up again. Thank you very much.


----------



## AcceleratorX (Oct 3, 2011)

Liverpool_fan said:


> Why should AMD accept PhysX, which is owned and controlled by NVIDIA?
> Why should they trust NVIDIA, who have a rather fishy record?
> Why should they hunt for "hints" with incomplete documentation and no guarantee of first-class support?



I can answer your questions with more questions.

Why should AMD cause inconvenience to developers by not supporting an API which is clearly available to them, even if it runs faster on the competition's products? Why should they *force slower software-based physics processing* when a faster method is clearly available?

Now, I am well aware of the DirectCompute/Havok argument, but here's something to think about: small developers have limited budgets. PhysX is feature-packed, cheap and easy to use compared to something like Bullet. One can have a decent game even with software PhysX.

Can this developer afford Havok? Maybe, maybe not. Meanwhile, this developer is hampered by the PhysX product not being accelerated at all on AMD hardware. Why? Why force someone to spend more on a supposedly better solution just so that AMD cards can have performance parity?

A similar situation exists with CPUs: hey, I want to use SSSE3 and SSE4. But wait! AMD's SSE4 is not Intel's SSE4! AMD doesn't support SSSE3! Now I have to code fallback paths so it will work on these CPUs, or be content using SSE2 and plain SSE3 instead.

What's the consequence? For bigger developers, not much: they have the people, resources and money to spare so that everything works great. Smaller developers, who want to push a product out as fast as possible so they can post revenue to pay their employees, face the brunt of the problem. The disadvantages: longer development time (coding fallback paths), more testing required (since no support means more bugs), less performance on AMD hardware possibly leading to lower sales, etc.
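The fallback-path burden described above boils down to runtime dispatch. A toy sketch (feature names, the capability set and both functions are made up for illustration; real code would query CPUID):

```python
# Toy illustration of the fallback-path burden described above: pick the best
# implementation the host CPU supports at run time. Feature names and the
# capability set are made up; real code would query CPUID.

def sum_squares_sse4(xs):      # stand-in for a hand-tuned SSE4 path
    return sum(x * x for x in xs)

def sum_squares_generic(xs):   # extra path the developer must write and test
    total = 0
    for x in xs:
        total += x * x
    return total

def dispatch(cpu_features):
    """Return the fastest implementation available on this (fake) CPU."""
    if "sse4" in cpu_features:
        return sum_squares_sse4
    return sum_squares_generic

impl = dispatch({"sse2", "sse3"})   # e.g. a CPU without SSSE3/SSE4
print(impl([1, 2, 3]))              # 14 either way; only the speed would differ
```

Every extra branch in `dispatch` is another path to write, test and debug, which is exactly the cost being described.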

Who suffers? The developer, of course. Now do you understand why so many games from smaller developers carry TWIMTBP and Intel tags? 

I'm not bashing AMD here, since I have used AMD for a very long time (fully AMD CPU setups since 2003, AMD graphics since 2007), but the fact is that AMD is simply not as open to developers as it should be. It needs to step up its game real fast.

But for us as end users, all of the above pretty much amounts to nothing, since we just care about the graphics performance in our games given the price we pay for the product. 

So if AMD wins there, so be it. 

As for PhysX as a gimmick: it probably is a good marketing tool for NVIDIA, but it's also responsible for giving a good amount of quality to a number of independent and small-budget games (example: see the game "Trine").


----------



## DARK KNIGHT (Oct 5, 2011)

Guys, enough discussion about PhysX. I understand the difference now, and I have finally decided to go with ATI. Please start a new thread on whether PhysX is a gimmick or not, so everybody will know about the topic and can choose the card they like. I have decided to go with the HD 6950. Thanks, everybody, for your suggestions and for guiding me on the topic of PhysX.


----------



## DARK KNIGHT (Oct 7, 2011)

Guys, can you tell me the best online site where I can buy these cards?
1. MSI R6950 Twin Frozr III PE
2. Sapphire HD 6950 2GB
Online, because somebody told me they are not available in Nehru Place.
Please guide me.


----------



## ico (Oct 7, 2011)

DARK KNIGHT said:


> Guys, can you tell me the best online site where I can buy these cards?
> 1. MSI R6950 Twin Frozr III PE
> 2. Sapphire HD 6950 2GB
> Online, because somebody told me they are not available in Nehru Place.
> Please guide me.


TheITWares - MSI R6950 Twin Frozr III PE/OC 2GB

SMCInternational.in doesn't have it.


----------



## mithun_mrg (Oct 7, 2011)

DARK KNIGHT said:


> Guys, can you tell me the best online site where I can buy these cards?
> 1. MSI R6950 Twin Frozr III PE
> 2. Sapphire HD 6950 2GB
> Online, because somebody told me they are not available in Nehru Place.
> Please guide me.



u can try here also
Buy Sapphire | Sapphire HD6950 2GB DDR5 PCI Express card | Buy PCI Express card | Buy Graphic card


----------



## DARK KNIGHT (Oct 7, 2011)

*Are these sites trustworthy, and what is the procedure for buying a product on them?
Does it require a credit card, or does it need online banking?*


----------



## mithun_mrg (Oct 8, 2011)

These are all trusted sites. The mode of payment depends on you: you can deposit cash in their account after confirmation of the order, or pay through NEFT using online banking.


----------



## DARK KNIGHT (Oct 8, 2011)

*Thanks again mithun_mrg for above info.*


----------



## Jaskanwar Singh (Oct 8, 2011)

^for sapphire this is the one to get -
TheITWares - One Stop for all Gizmos!SAPPHIRE 100312-3SR Radeon HD 6950 Dirt3 Edition 2GB 256-bit GDDR5 PCI Express 2.1 x16 HDCP Ready CrossFireX Support Video Card with Eyefinity


----------



## DARK KNIGHT (Oct 11, 2011)

*Thanks for the suggestion jassy *


----------



## DARK KNIGHT (Oct 20, 2011)

Who deleted the last two posts in my thread? I didn't like that.


----------



## DARK KNIGHT (Nov 16, 2011)

Guys, guys, guys, just bought the beast today: the MSI R6950 Twin Frozr III PE. It cost me 15,800 from SMC.


----------



## vickybat (Nov 16, 2011)

^^ Congrats mate. Post pics in the latest purchase section.


----------



## DARK KNIGHT (Nov 17, 2011)

Yeah, sure, I will do it soon.


----------



## mithun_mrg (Nov 17, 2011)

DARK KNIGHT said:


> Guys, guys, guys, just bought the beast today: the MSI R6950 Twin Frozr III PE. It cost me 15,800 from SMC.



Congrats, mate. Waiting for the pics.


----------



## ico (Nov 17, 2011)

DARK KNIGHT said:


> Guys, guys, guys, just bought the beast today: the MSI R6950 Twin Frozr III PE. It cost me 15,800 from SMC.


welcome to the club.


----------



## DARK KNIGHT (Nov 17, 2011)

Thanks, mates, but I have one difficulty: it does not give the full resolution over *HDMI*. Please help me, guys.


----------



## ico (Nov 17, 2011)

Which resolution? Full HD, i.e. 1080p?

I'm getting 1080p off my card via HDMI. Have you installed Catalyst 11.10?


----------



## DARK KNIGHT (Nov 17, 2011)

Yes, ico, at 1080p. And no, I installed Catalyst 11.11 for Win 7 64-bit.
Please help me, guys.


----------



## ico (Nov 17, 2011)

Maybe the problem you are facing is that the image is not scaling fully from left to right and top to bottom. This can be fixed via the Catalyst Control Center.

Just follow this guide: Benq V2210 Led Monitor. Using HDMI - FixYa

Do this:

*i.imgur.com/0k5ut.png


----------



## DARK KNIGHT (Nov 17, 2011)

That option is not available in my Catalyst driver. What do I do now? I have version 11.11.


----------



## DARK KNIGHT (Nov 17, 2011)

Thanks for your support, ico. I did this, and now I have the full resolution with my *HDMI* cable. Thanks for the help.


----------



## ico (Nov 17, 2011)

DARK KNIGHT said:


> Thanks for your support, ico. I did this, and now I have the full resolution with my *HDMI* cable. Thanks for the help.



No problem. 

Glad you figured it out. I was just uploading a screenshot. 



Spoiler



*i.imgur.com/QLSyU.jpg


----------



## Jaskanwar Singh (Nov 17, 2011)

What about starting a Radeon HD 6950 Owners Club?


----------



## DARK KNIGHT (Nov 17, 2011)

Yeah, sure, why not. With pleasure.


----------



## kapilove77 (Nov 17, 2011)

Is that card a better performer than the Zotac 560 Ti AMP Edition?


----------



## DARK KNIGHT (Nov 17, 2011)

Just downloaded the movie Rango (2011) - IMDb. It looks so awesome. The video quality over *HDMI* is much better than *DVI-SUB*. Thanks for the help, ico.


----------



## ico (Nov 17, 2011)

Jaskanwar Singh said:


> What about starting a Radeon HD 6950 Owners Club?


start it.  plenty of 6950 users now.



kapilove77 said:


> is that card better performer than zotac 560ti amp edition?


IMO, for Battlefield 3 it is. Overall, it has slightly more raw power.



DARK KNIGHT said:


> Just downloaded the movie Rango (2011) - IMDb. It looks so awesome. The video quality over *HDMI* is much better than *DVI-SUB*. Thanks for the help, ico.


Hmm, HDMI and DVI have the same video quality: same electrical signal. Or do you mean D-SUB, aka VGA? Compared to that, HDMI/DVI is better.


----------



## kapilove77 (Nov 17, 2011)

LOL, he doesn't know what he's talking about; he's so excited. And thanks for clarifying, ico.


----------



## DARK KNIGHT (Nov 17, 2011)

Yeah, I am excited, and yes, I am talking about *VGA*, sorry. But *HDMI* gives better picture quality than *VGA*;
that is my opinion. Correct me if I am wrong: I am a noob at these things.
What do you think, kapilove & ico?


----------



## ico (Nov 18, 2011)

Yup, you are completely right. HDMI/DVI is clearly better than VGA. VGA is analog: quality decreases in proportion to cable length. HDMI/DVI are digital: zeroes and ones remain zeroes and ones no matter how long the cable might be.
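The analog-vs-digital point can be sketched with a toy signal model (the loss rate and threshold here are invented for illustration, not real cable physics):

```python
# Toy signal model of the analog-vs-digital point (loss rate and threshold
# are invented, not real cable physics): an analog level degrades with every
# metre, while a digital receiver snaps the level back to a clean 0 or 1
# until the loss finally overwhelms the decision threshold.

def attenuate(level, length_m, loss_per_m=0.02):
    """Signal level left after a cable of the given length (linear loss)."""
    return max(0.0, level - loss_per_m * length_m)

def analog_receive(level):
    return level                            # you see whatever arrives

def digital_receive(level, threshold=0.5):
    return 1 if level >= threshold else 0   # re-quantised to a clean bit

for length in (1, 5, 10):
    got = attenuate(1.0, length)
    print(length, round(analog_receive(got), 2), digital_receive(got))
# The analog value keeps dropping; the digital bit stays exactly 1.
```

The caveat is the cliff: once the level falls below the threshold, a digital link fails outright rather than degrading gracefully.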


----------



## DARK KNIGHT (Nov 18, 2011)

ico said:


> Yup, you are completely right. HDMI/DVI is clearly better than VGA. VGA is analog: quality decreases in proportion to cable length. HDMI/DVI are digital: zeroes and ones remain zeroes and ones no matter how long the cable might be.



Thanks For Your Compliment and Explanation ICO.


----------

