# ATI GPU + Nvidia PhysX



## mohityadavx (Feb 27, 2011)

Hi!

I have heard that an Nvidia PhysX card gets disabled when used with a non-Nvidia GPU (or an ATI GPU). Is it true? If yes, do you think it is legitimate? Would you still buy an Nvidia card after hearing this?


----------



## ico (Feb 27, 2011)

PhysX = a gimmick for me. You can only name 4-5 games which employ it, and I don't even know which games will be employing it this year.

I'll have my own reason for buying an nVidia card, i.e. better Linux driver support.

As far as your question regarding PhysX is concerned, my answer is NO. So, I have voted No.


----------



## Faun (Feb 27, 2011)

I don't care for PhysX. But I like nVidia for their Linux driver support, as pointed out by ico.

Provide better support for Linux and I will hop over to AMD.


----------



## damngoodman999 (Feb 28, 2011)

ico pointed out Linux driver support; I say they should make better, bug-free drivers first!


----------



## ico (Feb 28, 2011)

mohityadavx said:


> I have heard that an Nvidia PhysX card gets disabled when used with a non-Nvidia GPU (or an ATI GPU). Is it true? If yes, do you think it is legitimate? Would you still buy an Nvidia card after hearing this?


Poll said:


> Would you still buy an Nvidia card if an ATI card of the same genre is available?



Do you want a general consensus about which card people will prefer, or only about PhysX?


----------



## mohityadavx (Feb 28, 2011)

ico said:


> You want a general consensus about which card will people prefer or only about PhysX?



No, the point is only about PhysX. It's more about being fair. Suppose I bought an Nvidia PhysX card and Nvidia tells me that PhysX is being blocked because I am using a non-Nvidia GPU. Don't you think that's like stealing my money? Like selling a counterfeit product. Suppose you buy an iPhone and use a third-party screen cover, and after some time there is a problem with the screen, and Apple says that because you were not using an Apple screen guard they won't repair it.


No personal hatred towards Nvidia, but this policy really sucks big time.


----------



## vickybat (Feb 28, 2011)

^^ That's actually fair imo. You are not supposed to use an Nvidia card for PhysX with a non-Nvidia card as the primary GPU for rendering. If AMD had been the one promoting PhysX, it would likewise not support non-AMD cards as the primary GPU. So it's a fair marketing strategy.

Though you can use an Nvidia card as a PhysX card alongside an AMD primary GPU via hacked drivers.


----------



## mohityadavx (Feb 28, 2011)

^^ Suppose you didn't know the above fact and you bought an Nvidia PhysX card, and now when you use it, it won't work. Wouldn't you feel cheated? Nowhere on the box is it written that it only works with an Nvidia GPU.

As for what you said about AMD: AMD is developing a free and open physics engine.


----------



## ico (Feb 28, 2011)

mohityadavx said:


> No, the point is only about PhysX. It's more about being fair. Suppose I bought an Nvidia PhysX card and Nvidia tells me that PhysX is being blocked because I am using a non-Nvidia GPU. Don't you think that's like stealing my money? Like selling a counterfeit product. Suppose you buy an iPhone and use a third-party screen cover, and after some time there is a problem with the screen, and Apple says that because you were not using an Apple screen guard they won't repair it.


If your thread is only about PhysX, then phrase your poll question that way.





vickybat said:


> *So its a fair marketing strategy.*


No it isn't. It isn't in favour of games running the same across all hardware platforms.

I gave the example of DirectX/3D vs OpenGL earlier. Microsoft kept on pushing Direct3D till it became _de facto_. Result = developers only using Direct3D and developing games only for Windows.


----------



## vickybat (Feb 28, 2011)

ico said:


> No it isn't. *It isn't in favour of games running the same across all hardware platforms*.



Yes it is. It doesn't have to favour all games across all platforms. Not everything has to be open source. It's proprietary code and there's nothing wrong with that.



ico said:


> I gave the example of DirectX/3D vs OpenGL earlier. Microsoft kept on pushing Direct3D till it became _de facto_. Result = developers only using Direct3D and developing games only for Windows.



I know about DirectX and OpenGL. Microsoft pushed DirectX and put a lot of money behind it. That's pure market strategy, and again, there is nothing wrong with it. I see Nvidia in the same way here.

Though OpenGL is getting support too and has great potential. The upcoming Rage is a testament to that.


----------



## ico (Feb 28, 2011)

vickybat said:


> Yes it is. It doesn't have to favour all games across all platforms. Everything doesn't have to be *open source.* Its proprietary code and there's nothing wrong with that.


Flawed argument.

I am talking about running the same on all *hardware platforms.* Read properly again.



vickybat said:


> I know about DirectX and OpenGL. Microsoft pushed DirectX and put a lot of money behind it. That's pure market strategy, and again, there is nothing wrong with it. I see Nvidia in the same way here.
> 
> Though OpenGL is getting support too and has great potential. The upcoming Rage is a testament to that.


Nothing wrong?

Ever heard about AntiTrust lawsuits?

For example, Intel offering rebates to OEMs. Result: a crappy chip like the Pentium 4 outselling the vastly superior Athlon XP/64.

If you go and offer money to a developer, he will accept it.

PhysX was an underhanded strategy by nVidia to sell their cards when they were being smashed all over by the HD 3870, HD 4890 and HD 5870.

I'm not against PhysX, but against the fact that people use it as a decision-making factor, which it is NOT.


----------



## Joker (Feb 28, 2011)

physx = gimmick. not a deal maker or breaker.


----------



## vickybat (Feb 28, 2011)

ico said:


> Flawed argument.
> 
> I am talking about running the same on all *hardware platforms.* Read properly again.



I have read it properly. Yes, it's not running across all hardware platforms (read: GPUs, right?). And it doesn't have to.

So there's no flaw here.




ico said:


> Nothing wrong?
> 
> Ever heard about AntiTrust lawsuits?
> 
> ...



Yes, I've heard of those lawsuits as well. I guess that's how the world goes; you cannot change it. Intel and Microsoft have money and can do anything with it. If AMD had that kind of money, it would have done the same.

Offering money to a developer to support your products is done by most manufacturers and companies. It's nothing new, and neither is it shocking.

AMD's 3 series never smacked Nvidia; Nvidia's 8 and 9 series cards sold very well. But AMD's 4 and 5 series really did smack Nvidia.

I agree that PhysX is not a deal maker or breaker, but Nvidia's strategy is not wrong. It's quite common.


----------



## ico (Feb 28, 2011)

vickybat said:


> I have read it properly. Yes its not running across all hardware platforms(read gpu's right?). And it doesn't have to.
> 
> So there's no flaw here.


It doesn't have to. I very well know that.

The flaw is in the mentality of people who think it is something important and bring it up while arguing. (Yup, you, if we go by your posts.)

Traditionally, physics processing was done on the CPU. Was there any need to shift from it when it was already the same for everyone? It wouldn't matter whether you had an nVidia card or an AMD card, an Intel CPU or an AMD CPU. Everything runs normally and runs fine.



vickybat said:


> If amd would have had that kind of money, it would have done the same.


They had money earlier and still never did that. 



vickybat said:


> Offering money to a developer to support products is done by most manufacturers. Its nothing new and neither is shocking.


Never called it shocking.



vickybat said:


> I agree that *physx is not a deal maker or breaker* but nvidia's strategy is not wrong. Its quite common.


Then why do YOU make it sound like one?


----------



## vickybat (Feb 28, 2011)

ico said:


> It doesn't have to. I very well know that.
> 
> The flaw is in the mentality of people who think it is something important and they bring it in while arguing. (yup, you if we go by your posts)



Well, I am a PhysX supporter, but I don't expect everybody to do the same or follow the same line. So maybe for me it's a deal breaker, along with performance, but not for everybody.



ico said:


> Traditionally, physics processing was done on the CPU. Was there any need to shift from it when it was already the same for everyone? It wouldn't matter whether you had an nVidia card or an AMD card, an Intel CPU or an AMD CPU. Everything runs normally and runs fine.



Yes, there was a need: to offload physics processing from the CPU. Ageia first developed the PPU (physics processing unit) as a standalone card, which took physics code off the CPU and thereby increased overall throughput. After Nvidia bought Ageia, they made PhysX proprietary code to be processed by their own cards. Now we are seeing a few more games employ it, and the success or failure of those games will decide which way the coin falls.

So the point was to offload the CPU. AMD should do the same; it doesn't matter whether it chooses the proprietary or the open route.

The switch to the GPU happened because of the GPU's larger number of floating-point units, which makes it more capable of handling physics code than a CPU. That's where it all started.
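For what it's worth, the offloading argument is easy to see in miniature: a physics step is mostly one small calculation repeated independently for every object, which is exactly the data-parallel shape a GPU's many floating-point units are built for. A toy sketch (plain Python, illustrative names; a PPU or GPU would run the per-particle body for all particles at once instead of looping):

```python
def step_particles(positions, velocities, dt=0.016, gravity=-9.8):
    """One naive Euler integration step for 2D particles.

    Each particle's update is independent of every other particle,
    so the loop body can be farmed out to thousands of GPU execution
    units instead of running serially on the CPU.
    """
    out = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt                       # apply gravity to velocity
        out.append(((x + vx * dt, y + vy * dt),  # advance position
                    (vx, vy)))
    return out
```

Nothing in the loop body reads another particle's state, so the same work divides cleanly across however many execution units the hardware has.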




ico said:


> They had money earlier and still never did that.
> 
> 
> Never called it shocking.



They never had money anywhere close to Intel's. You will get a rough idea if you check their revenues and turnover on their Wikipedia pages.




ico said:


> then why YOU make it sound like one?



Like I said, I support PhysX but don't expect everybody to do the same.


----------



## ico (Feb 28, 2011)

vickybat said:


> Yes, there was a need: to offload physics processing from the CPU. Ageia first developed the PPU (physics processing unit) as a standalone card, which took physics code off the CPU and thereby increased overall throughput. After Nvidia bought Ageia, they made PhysX proprietary code to be processed by their own cards. Now we are seeing a few more games employ it, and the success or failure of those games will decide which way the coin falls.
> 
> *The switch to the GPU happened because of the GPU's larger number of floating-point units, which makes it more capable of handling physics code than a CPU. *That's where it all started.


At the end of the day, you are taking a massive 60-70% performance hit.

Having played Batman: AA with PhysX, I know very well that those effects could easily have been implemented in traditional ways, without the 60-70% performance hit on nVidia cards. It is all marketing propaganda to woo people who believe in such things.

Rendering plus PhysX processing is simply too much for any nVidia graphics card.



vickybat said:


> So the point was to offload the cpu. Amd should also do the same. Doesn't matter if it chooses to go the proprietary or open source route.


You mean AMD should go out and come up with a gimmick of their own which makes games unplayable on nVidia cards? If that is the case, then you are an idiot. Sorry to say this.

Also, stop bringing in "open source" when I'm not talking about it. I'm in favour of open things, neutral to everyone. That doesn't have to mean open source.


----------



## mohityadavx (Feb 28, 2011)

vickybat said:


> Well i am a physx supporter. But i don't expect everybody to do the same or follow in the same lines. So maybe for me its a deal breaker along with performance but not for everybody.
> 
> 
> 
> ...



Fair deal, eh?
You must have heard about Nvidia's integrated GPU-plus-processor. What if tomorrow Nvidia starts making motherboards too?

Think of this scenario: you buy a brand new 20k Gigabyte motherboard, you insert your Nvidia integrated GPU-cum-processor, you press the power button, and you get the message:
*"Booting Error. Non-Nvidia mobo found."*

Off topic:
my friend is selling his Nvidia 8800 GTX card for 6k. Is it a good deal?


----------



## ico (Feb 28, 2011)

mohityadavx said:


> Off topic:
> my friend is selling his Nvidia 8800 GTX card for 6k. Is it a good deal?


Hmm, I don't think so.

For 8k you get an HD 5770, which is much faster.

8800 GTX = HD 4770 level.


----------



## vickybat (Feb 28, 2011)

ico said:


> At the end of the day, you are taking a massive 60-70% performance hit.
> 
> Having played Batman: AA with PhysX, I know very well that those effects could easily have been implemented in traditional ways, without the 60-70% performance hit on nVidia cards. It is all marketing propaganda to woo people who believe in such things.
> 
> Rendering plus PhysX processing is simply too much for any nVidia graphics card.



That's how things will go in the future. *Expect GPUs to handle physics computations.* I never saw an unplayable performance hit in Batman: AA on a GTX 460; you get good, playable framerates. Better optimisation will reduce the performance hit over time; there is plenty of headroom here.
Physics processing units can be employed at the hardware level as separate execution units which won't get in the way of rendering, just like the separate ALUs and CUs in a traditional CPU.




ico said:


> You mean AMD should go out, come up with a gimmick of their own which makes games unplayable on nVidia cards?   If that is the case, then you are an idiot. Sorry to say this.
> 
> Also, stop bringing in "open source" when I'm not talking about it. I'm in favour of open things - neutral to everyone. It doesn't have to be open source.



Well, not exactly, but an open physics code path handled by both AMD and Nvidia GPUs is the way to go imo. That would be absolutely neutral, since both GPUs would handle the physics computations and free the CPU for other useful work; I mentioned before why. Whatever it is, it should not be processed by the CPU.

Everybody favours neutral things, but that is too much of an ideal scenario and is never going to happen. I too favour open things (not proprietary), but the world isn't open, and competition will always be a big thorn in the bush.

So am I still an idiot?



mohityadavx said:


> Fair deal, eh?
> You must have heard about Nvidia's integrated GPU-plus-processor. What if tomorrow Nvidia starts making motherboards too?
> 
> Think of this scenario: you buy a brand new 20k Gigabyte motherboard, you insert your Nvidia integrated GPU-cum-processor, you press the power button, and you get the message:
> ...



An offtopic post.


----------



## mohityadavx (Feb 28, 2011)

vickybat said:


> An offtopic post.



No, it's not off topic. This is the same thing Nvidia is doing with PhysX; it's just a more sceptical view of what could happen if Nvidia is encouraged.


----------



## ico (Feb 28, 2011)

vickybat said:


> I never saw an unplayable performance hit in Batman: AA on a GTX 460; you get good, playable framerates.


It is not about being playable. You are taking a good 60-70% hit at the moment, which could be greatly minimized by implementing the effects the traditional way.



vickybat said:


> Physics processing units in the hardware level *can be employed* as separate execution units. These won't come in the way of rendering. Just like you have separate alu's and cu's in a traditional cpu.


Currently they aren't separate.



vickybat said:


> Well not exactly but an open physics code handled by *both amd and nvidia gpu's* are the way to go imo. Now this would be absolutely neutral since both gpu's will handle the physics computations and free the cpu for other useful computations and i mentioned before why.


would be good.


vickybat said:


> Whatever it is, should not be processed by cpu.


any reason why?


----------



## vickybat (Feb 28, 2011)

*@mohityadavx*

Well, if Nvidia enters the x86-64 market, then of course it will design its own chipsets and motherboards. Have you ever seen an AMD processor fitting an Intel motherboard, or vice versa?

Think properly before posting.


----------



## mohityadavx (Feb 28, 2011)

ico said:


> hmm, I don't think so.
> 
> For 8k you get HD 5770 which is much faster.
> 
> 8800 GTX = HD 4770 level.



Are you sure? It's a GTX-series card; it cost him 40k at the time.

What would be an apt price?

Actually, I am not buying the card for myself but for a cousin. My own PC won't even support the card because of its generic PSU. I don't want my cousin to say later that I tricked him just to help my friend dump his card.


----------



## vickybat (Feb 28, 2011)

ico said:


> It is not about being playable. You are getting a good 60-70% hit at the moment which can be greatly minimized if you implement the effects the traditional way.



Like I said, it shouldn't always have to be the traditional way. More optimisation will lead to smaller performance hits.



ico said:


> Currently they aren't separate.



I know that.



ico said:


> any reason why?



A GPU's ability to process many more floating-point operations per second, along with integer data types, a unified shader architecture and a geometry shader stage, allows a broader range of algorithms to be implemented. That makes it more capable than a CPU at handling physics computations. I read this in an article in Digit (way back).

But separate physics units in a GPU [something like dedicated PPUs, or the SPEs in *Cell* (Sony, IBM, Toshiba)] would work wonders.


----------



## ico (Feb 28, 2011)

vickybat said:


> Like I said, it shouldn't always have to be the traditional way. More optimisation will lead to smaller performance hits.


Would take 5 years.



vickybat said:


> A GPU's ability to process many more floating-point operations per second, along with integer data types, a unified shader architecture and a geometry shader stage, allows a broader range of algorithms to be implemented. That makes it more capable than a CPU at handling physics computations. I read this in an article in Digit (way back).


Honestly, I'm someone who doesn't give a damn about these terms.

GPUs currently are not capable of doing these things on their own; PhysX is a good example. They still can't handle rendering and physics processing together without a massive performance hit. They might be able to in 5 years, but that is another matter.

In fact, what you have said is the very idea behind Fusion by AMD.


Anyway, a quick Google search for "PhysX gimmick" will get you tons of threads and reasons.


----------



## Joker (Feb 28, 2011)

OK, I played Batman: Arkham Asylum with PhysX on. It only had paper flying around and some bullet effects, all of which have been used in games for a long time.

Same case in Mafia II. Now your computer can't handle these because you don't have an Nvidia card? Total gimmick. If it weren't, it would be used in every game, which it isn't. As of now, it is a total gimmick. End of.

If it is implemented properly in the future, then it won't be. But today it is, and it is not a deciding factor.


----------



## vickybat (Feb 28, 2011)

ico said:


> Would take 5 years.



No, not that long. You will see several improvements in Kepler, which launches later this year. In 5 years' time it will only become much more mature. AMD's Southern Islands may also have some tricks up its sleeve.

You sound more like a pessimist.




ico said:


> Honestly, I'm someone who doesn't give a damn about these terms.
> 
> GPUs currently are not capable of doing these things on their own; PhysX is a good example. They still can't handle rendering and physics processing together without a massive performance hit. They might be able to in 5 years, but that is another matter.
> 
> In fact, what you have said is the very idea behind Fusion by AMD.




Well, that's your problem. It doesn't matter whether you give a damn about those terms or not. They are going to happen, and CPU physics will no longer be the de facto standard it is in the current scenario.

What I said has nothing to do with AMD Fusion whatsoever. But maybe we really will see an *APU* working on physics computations instead of the CPU cores. They might well have a dedicated physics unit in the future.

Go through the Cell Broadband Engine architecture properly and you will see what I am talking about. Its SPEs are much more capable of handling physics computations than traditional CPU cores; heck, they can even render geometric shapes and figures, which is the job of vertex shaders. They closely resemble a GPU architecturally.

In a nutshell: CPU physics is not the future.



ico said:


> Anyway, a quick Google search for "PhysX gimmick" will get you tons of threads and reasons.



Well, it might sound like a gimmick now, but it is the right first step towards GPU physics. When both companies support it (something similar, standard for both), that will put the final nail in the coffin (read: CPU physics).


----------



## ico (Feb 28, 2011)

What is Crysis 2 using? Battlefield 3?

It's nothing to do with pessimism. As of now, all current implementations are gimmicky, an opinion shared by most.


----------



## vickybat (Feb 28, 2011)

Joker said:


> OK, I played Batman: Arkham Asylum with PhysX on. It only had paper flying around and some bullet effects, all of which have been used in games for a long time.
> 
> Same case in Mafia II. Now your computer can't handle these because you don't have an Nvidia card? Total gimmick. If it weren't, it would be used in every game, which it isn't. As of now, it is a total gimmick. End of.
> 
> If it is implemented properly in the future, then it won't be. But today it is, and it is not a deciding factor.



Actually, physics does these things, in case you don't know. You won't get jaw-dropping vistas or scenic atmosphere from in-game physics; what you get is objects behaving as they would in the real world.

I am not talking about AMD vs Nvidia here; read my previous posts. I'm talking about implementing physics on the *GPU* rather than the *CPU*. *I am favouring both AMD and Nvidia here.* Nvidia has started, and AMD will follow suit. Soon we might see physics handled by both AMD and Nvidia GPUs.



ico said:


> What is Crysis 2 using? Battlefield 3?
> 
> Nothing about pessimism.



I told you already, this is just the present scenario. And as you said, there is no de facto GPU-physics solution supported by both camps yet. That day isn't far off, though.

We might, or let's say will, see much better implementations of in-game physics than the current CryEngine and Frostbite engines, and that isn't far off either. Since those are vendor-neutral games, they use CPU physics because they don't have much choice.

I am quite sure there will be a physics engine in the very near future that is handled by both AMD and Nvidia, because physics algorithms favour the GPU, which is architecturally very different from a CPU. This isn't a gimmick by any means, as many tech experts have argued.



ico said:


> As of now, all current implementations are gimmicky - an opinion shared by most.



Yes, this part is somewhat true. The current implementations may be gimmicky, but they show us what's in store once GPUs start handling physics properly.


----------



## Liverpool_fan (Feb 28, 2011)

vickybat said:


> Yes it is. It doesn't have to favour all games across all platforms. Everything doesn't have to be *open source*. Its* proprietary* code and there's nothing wrong with that.


Clearly you are confused between "Open Source" and "Open Standards". Come back when you get your terms right.





vickybat said:


> ^^ That's actually fair imo. You are not supposed to use an Nvidia card for PhysX with a non-Nvidia card as the primary GPU for rendering. If AMD had been the one promoting PhysX, it would likewise not support non-AMD cards as the primary GPU. So it's a fair marketing strategy.


So nVidia should dictate how I use the products I bought with my own money?
How about Intel locking out nVidia so that you can't use an nVidia GPU with an Intel processor? Oh wait, in that case you simply are "not supposed to" use the nVidia graphics card, and it will be a "fair marketing policy".


----------



## ithehappy (Feb 28, 2011)

Hmm, I am reading a lot about marketing strategy and blah blah, but I highly doubt how many of those talking about marketing strategy have even a minimal idea of it. Marketing strategy is a PRECISE thing; if everyone understood it, we would have 10,000 or more brands like Nvidia or AMD. So I guess it's better to comment on performance and such things rather than on business here.
BTW, I do have some idea about marketing strategy.


----------



## mohityadavx (Feb 28, 2011)

@vickybat



> @mohityadavx
> 
> Well, if Nvidia enters the x86-64 market, then of course it will design its own chipsets and motherboards. Have you ever seen an AMD processor fitting an Intel motherboard, or vice versa?
> 
> Think properly before posting.



Actually, I got somewhat confused there, but this is what I wanted to say:



Liverpool_fan said:


> So nVidia should dictate how I use the products I bought with my own money?
> How about *Intel locking out nVidia so that you can't use an nVidia GPU with an Intel processor*? Oh wait, in that case you simply are "not supposed to" use the nVidia graphics card, and it will be a "fair marketing policy".



mind is a strange thing.....


----------



## Liverpool_fan (Feb 28, 2011)

Clearly nVidia should give a refund unless the product box explicitly states that it works only with nVidia cards.


----------



## mohityadavx (Feb 28, 2011)

^^ That's not the case, buddy. My friend had to sell his PhysX card dirt cheap because Nvidia won't accept this (it is also the reason this thread exists). Nvidia actually used to support setups with AMD cards, but later closed this off with a driver update, just like Sony did with the PS3's firmware.


----------



## vickybat (Feb 28, 2011)

Liverpool_fan said:


> Clearly you are confused between "Open Source" and "Open Standards". Come back when you get your terms right.



I am back. You are right, I mixed up the terms; I meant open standards. Open source is where the developer provides the source code along with the app he develops. An open standard, on the other hand, is something that is universally accepted and not proprietary. Is that right?




Liverpool_fan said:


> So nVidia should dictate how I use the products I bought with my own money?
> How about Intel locking out nVidia so that you can't use an nVidia GPU with an Intel processor? Oh wait, in that case you simply are "not supposed to" use the nVidia graphics card, and it will be a "fair marketing policy".



No, friend, you did not get my point. Intel is not Nvidia's competitor in the GPU market, but AMD is. If Nvidia develops some *proprietary standard*, you don't expect the competitor to be able to use it.

At the hardware level, proprietary means patents. There are a number of patents held between Intel and AMD which the other cannot utilize in its chips. Many things are weighed up when filing a patent or developing something proprietary; it's not done randomly.

In this case, the PhysX code is proprietary and AMD GPUs cannot process it, which is absolutely fair and is a marketing strategy. *Can AMD or Nvidia GPUs utilize Quick Sync?* No, because it's proprietary. The same can be said of CUDA and Stream. Manufacturers do it for the sake of competition.

Now, what we want is an open-standard (universal) code path which can be utilised by both GPUs. I was telling *ico* the same thing, and he kind of agreed. There should be an open physics engine that can be handled by both GPUs, and the future is leading us there.

PhysX is the first step towards a *GPU handling physics*. Look at it this way: a GPU is better than a CPU at handling complex physics algorithms. So we can expect more GPU-based physics engines, and it will be sweet if they follow an open standard, which in my view is inevitable.


----------



## ico (Feb 28, 2011)

vickybat said:


> At the hardware level, proprietary means patents. There are a number of patents held between Intel and AMD which the other cannot utilize in its chips. Many things are weighed up when filing a patent or developing something proprietary; it's not done randomly.
> 
> In this case, the PhysX code is proprietary and AMD GPUs cannot process it, which is absolutely fair and is a marketing strategy. *Can AMD or Nvidia GPUs utilize Quick Sync?* No, because it's proprietary. The same can be said of CUDA and Stream. Manufacturers do it for the sake of competition.


Improper analogy with Quick Sync. Quick Sync in reality is nothing more than an on-chip H.264 hardware encoder. Many devices use hardware encoders; you just have to pay a royalty to MPEG LA for that. It is still neutral: nVidia and AMD can employ hardware H.264 encoders if they want, just like Intel.

Lastly, Stream is nothing more than OpenCL with AMD's API on top. OpenCL is an _open_ standard.
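To illustrate the "open standard" point: OpenCL defines a C-like kernel language that any vendor's device can compile, so the very same physics-style kernel could run on an AMD or an nVidia GPU. A rough sketch (the kernel string is standard OpenCL C; the Python function below is just a hypothetical host-side stand-in that computes the same thing, since no GPU runtime is assumed here):

```python
# Standard OpenCL C: one "work item" runs this body per index i,
# in parallel, on whichever vendor's device compiled it.
SAXPY_KERNEL = """
__kernel void saxpy(float a,
                    __global const float *x,
                    __global float *y) {
    int i = get_global_id(0);   /* this work-item's index */
    y[i] = a * x[i] + y[i];     /* same code on AMD or Nvidia */
}
"""

def run_kernel_on_host(a, x, y):
    """Host-side emulation of the kernel above: the GPU would execute
    every iteration of this loop as an independent parallel work item."""
    for i in range(len(x)):
        y[i] = a * x[i] + y[i]
    return y
```

The point is vendor neutrality: the kernel source targets the standard, not a particular company's silicon, which is exactly what PhysX's proprietary code path does not do.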


----------



## vickybat (Feb 28, 2011)

^^ Not that improper, I guess. Of course Quick Sync is an on-chip H.264 encoder, but it is patented not functionally but architecturally. It's a fixed-function piece of silicon whose sole job is encoding.

Even if AMD and Nvidia develop their own dedicated encoders, they cannot violate the architectural patents Intel holds. The same thing happens in software: different code paths achieving the same functionality. Game engines differ in a similar way, and Nvidia and AMD differ in a similar way.

*At least this is my understanding.* Correct me if I am wrong.

i53.tinypic.com/nnochy.jpg

Now, architecturally, anything employed by Nvidia or AMD will be different from the above, but it will have the same functionality, i.e. a dedicated block that can encode and decode.

What you said about Stream is very true, and I already knew it. Nvidia also supports OpenCL.


----------



## asingh (Feb 28, 2011)

What is highly irritating and naive about nVidia is that:

1. They literally pay game developers to write code so that certain effects only render on their hardware. Monopoly.
2. For the game to launch, their middleware (almost driver-level) software has to be installed as a prerequisite, and it is not generic. Bloatware.
3. If nVidia-specific hardware is not detected, the extra physics gets dumped (partially) onto the CPU. ForceWare (literally their driver's name).
4. If paired with another company's hardware, the game refuses to run because of driver lock-in mechanisms. Bad consumer experience.

It is what they once did with their nForce boards to get SLI. Now they have woken up and license it openly, post-X58. I hate a company with such tactics even if they make excellent hardware. They are just greedy and want the whole pie. Being expansionist is fine (which company is not?), but not through unlawful practices. They will get caught and sued, similar to what happens to Intel every couple of years.

PhysX in games is nVidia's forced slipstream.


----------



## vickybat (Feb 28, 2011)

*@ asingh*

I couldn't understand the 4th point. Can you please elaborate a bit?

About your first point: paying developers to write code for your hardware is simply market strategy. Even console manufacturers like Sony, Nintendo and Microsoft pay developers to write code for their hardware (*read: exclusives*), even though there are more multiplatform titles. Exclusives are what set them apart.

Monopoly will exist wherever there is competition. AMD does the same in some titles too, I guess.


----------



## asingh (Feb 28, 2011)

vickybat said:


> *@ asingh*
> 
> I couldn't understand the 4th point. Can you please elaborate a bit?
> 
> ...



nVidia does not allow a consumer to use a GeForce accelerator as a PPU alongside a non-nVidia GPU.

Please, console exclusivity for a title is different from a title only being able to run on a certain type of hardware. It is like creating/selling/marketing a cola drink that only people who are 5'5" tall can digest.

Can you show me a game title where AMD/ATI have done something similar, where the game refuses to run unless AMD/ATI software is installed or its hardware is mandated? Yes, monopoly will exist, but it should be disclosed.


----------



## vickybat (Feb 28, 2011)

asingh said:


> nVidia does not allow a consumer to use a GeForce accelerator as a PPU alongside a non-nVidia GPU.



This is obvious imo. They don't want their competitor's cards to be compatible.
But I guess it is possible with hacked drivers.



asingh said:


> Please, console exclusivity for a title is different from a title only being able to run on a certain type of hardware. It is like creating/selling/marketing a cola drink that only people who are 5'5" tall can digest.



Heh, this is really funny. You certainly have a good sense of humour.



asingh said:


> Can you show me a game title where AMD/ATI have done something similar, where the game refuses to run unless AMD/ATI software is installed or its hardware is mandated? Yes, monopoly will exist, but it should be disclosed.



Well, AMD-backed titles like Medal of Honor and Battlefield: Bad Company 2 run fine on Nvidia cards. I think the reverse is also true, if you take PhysX out of the equation.


----------



## asingh (Mar 1, 2011)

^^
Guess you justified it.


----------



## mohiuddin (Mar 8, 2011)

The poll results show what PhysX really amounts to.
As for how inefficient it is: in UT3 or even Mafia, PhysX keeps calculating even after all the particles have stopped moving, pretending there is still some probability of a collision or interaction, wasting GPU cycles.
Hopefully AMD is backing something more practical, open (OpenCL-based), efficient and easy for devs to adopt: Bullet physics (already used in GTA IV and the movies 2012 and Hancock). Hope for the best.
APEX can't even simulate cloth physics in Mafia II on the GPU; it runs on the CPU even if you have an Nvidia GPU. So at the High preset (cloth physics on) you get a big FPS hit unless you disable cloth physics with some tweaks.
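For context, the waste described above is what physics engines normally avoid with "sleeping": a body that stays near rest for enough consecutive frames is dropped from the simulation until something disturbs it. A rough sketch of the heuristic (illustrative names and thresholds, not PhysX's or Bullet's actual API):

```python
SLEEP_SPEED = 0.01    # below this speed a body counts as "at rest"
SLEEP_FRAMES = 30     # frames at rest before the solver stops updating it

class Body:
    """Toy rigid body tracking only what the sleep heuristic needs."""
    def __init__(self, velocity=0.0):
        self.velocity = velocity
        self.still_frames = 0
        self.asleep = False

    def maybe_sleep(self):
        """Call once per frame: sleep after SLEEP_FRAMES motionless frames."""
        if abs(self.velocity) < SLEEP_SPEED:
            self.still_frames += 1
            if self.still_frames >= SLEEP_FRAMES:
                self.asleep = True    # the solver now skips this body
        else:
            self.still_frames = 0     # any real motion resets the counter
            self.asleep = False

def active_bodies(bodies):
    """The per-frame work the engine actually pays for."""
    return [b for b in bodies if not b.asleep]
```

With a check like this, a pile of settled particles costs nothing per frame instead of burning GPU cycles on collisions that can no longer happen.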


----------



## ico (Mar 17, 2011)

I was thinking about this. With Fusion, what if AMD offloads the processor's floating-point calculations to the on-chip GPU itself? Would there be any need for a PhysX alternative then? I think they're going to do this in the coming years.


----------

