# First DX 12 benchmark shows AMD beating the hell out of Nvidia!



## bikramjitkar (Aug 21, 2015)

Ashes of the Singularity shows the R9 290X to be as fast as, or faster than, the 980 Ti in DirectX 12

DirectX 12 tested: An early win for AMD and disappointment for Nvidia | Ars Technica



> *An AMD coup*
> 
> To say these benchmark results are unexpected would be an understatement. While it's true that AMD has been banging the DX12 drum for a while, its performance in Ashes is astonishing. AMD's cheaper, older, and less efficient GPU is able to almost match and at one point beat Nvidia's top-of-the-line graphics card. AMD performance boosts reach almost 70 percent under DX12. On the flip side, Nvidia's performance is distinctly odd, with its GPU dropping in performance under DX12 even when more CPU cores are thrown at it. The question is why?
> 
> ...


----------



## Desmond (Aug 21, 2015)

David has defeated Goliath at last. Glad to see AMD getting traction against nVidia.

However, this is just one benchmark. I am sure nVidia will step up its game once DirectX 12 becomes more mainstream.


----------



## chimera201 (Aug 21, 2015)

Nvidia doesn't care about games that no one is going to play


----------



## Desmond (Aug 21, 2015)

chimera201 said:


> Nvidia doesn't care about games that no one is going to play



It's not a matter of caring. Nvidia's drivers are probably just not well optimized yet. They might release new drivers that work better.


----------



## chimera201 (Aug 21, 2015)

DeSmOnD dAvId said:


> It's not a matter of caring. Nvidia's drivers are probably just not well optimized yet. They might release new drivers that work better.



Not really. No one cares about the Ashes game, and no one cared about the Star Swarm game that Mantle was benchmarked on either. Nvidia picks its titles for optimization wisely.


----------



## bikramjitkar (Aug 21, 2015)

chimera201 said:


> Nvidia doesn't care about games that no one is going to play



If they didn't care, they wouldn't have immediately sent out a press release and a new driver to improve performance.  
As the article says, Maxwell is optimized for serial execution, which works for DX10 and DX11, while DX12 favours parallel execution, at which GCN is a beast.


----------



## chimera201 (Aug 21, 2015)

Of course they would care if it gets media attention. But honestly, were you waiting for Ashes of the Singularity? Had you even heard about it before?


----------



## bikramjitkar (Aug 21, 2015)

Yes, we all saw how all their caring turned out for Arkham Knight.


----------



## Desmond (Aug 21, 2015)

chimera201 said:


> Of course they would care if it gets media attention. But honestly, were you waiting for Ashes of the Singularity? Had you even heard about it before?



Whether we heard about it before is irrelevant. AAA games are not always a benchmark of quality.


----------



## chimera201 (Aug 21, 2015)

You don't seem to get the point. Nvidia cares about games that sell more copies, like Witcher 3, MGSV, Project Cars, etc. More copies sold directly translates to more PCs with Nvidia GPUs, which directly translates to profit for Nvidia. It works in Nvidia's interest to optimize for those kinds of titles. Optimizing requires resources, and Nvidia will choose the titles that give it a profit.

- - - Updated - - -



bikramjitkar said:


> Yes, we all saw how all their caring turned out for Arkham Knight.


Arkham Knight was WB's fault, not Nvidia's. They even admitted that they knew the PC port was a disaster before release: they had allocated all their resources to polishing the console version because the consoles were having problems.


----------



## $hadow (Aug 21, 2015)

Wait a while; Nvidia is not going to sit there quietly. Another comparison is incoming soon.


----------



## Desmond (Aug 21, 2015)

chimera201 said:


> You don't seem to get the point. Nvidia cares about games that sell more copies, like Witcher 3, MGSV, Project Cars, etc. More copies sold directly translates to more PCs with Nvidia GPUs, which directly translates to profit for Nvidia. It works in Nvidia's interest to optimize for those kinds of titles. Optimizing requires resources, and Nvidia will choose the titles that give it a profit.
> 
> - - - Updated - - -
> 
> ...



You are not getting my point either. Neither of them had optimized their drivers for anything yet. It was just circumstance that DX12 works better with GCN.


----------



## chimera201 (Aug 21, 2015)

What I am trying to say is: wait for a proper benchmark of a DX12 game that is GOTY-like and actually released. Don't jump to conclusions from a benchmark of a game that is still in some alpha stage and of a genre few people are interested in. It will probably take some years for a proper DX12 game to be out.


----------



## Desmond (Aug 31, 2015)

Well, Nvidia done f****d up.

www.reddit.com/r/pcgaming/comments/3j1916/get_your_popcorn_ready_nv_gpus_do_not_support/


----------



## warfreak (Sep 1, 2015)

This is a never-ending battle. The only thing I care about is power consumption, and in that regard nVidia still trumps AMD. It won't matter if either brand pushes 5% more FPS than the other, but if a card can perform at 5% less energy and/or 5% less cost, then that makes a world of difference.

It's all about the Watts:FPS:$ ratio. Whoever maintains this balance wins.
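The trade-off above can be put into a toy calculation (Python; the card numbers below are hypothetical, made up purely to illustrate the ratio idea, not real specs):

```python
# Toy illustration of the Watts:FPS:$ trade-off.
# All card figures below are hypothetical, not real specs.

def fps_per_watt(fps, watts):
    """Frames per second delivered per watt of board power."""
    return fps / watts

def fps_per_dollar(fps, price):
    """Frames per second delivered per dollar of purchase price."""
    return fps / price

# Card A: 5% more FPS; Card B: 5% lower power draw and 5% cheaper.
card_a = {"fps": 63.0, "watts": 250.0, "price": 650.0}
card_b = {"fps": 60.0, "watts": 237.5, "price": 617.5}

for name, c in (("A", card_a), ("B", card_b)):
    print(name,
          round(fps_per_watt(c["fps"], c["watts"]), 3),
          round(fps_per_dollar(c["fps"], c["price"]), 3))
```

With these made-up numbers, the slightly slower but leaner card B actually comes out ahead on both FPS-per-watt and FPS-per-dollar, which is the balance being argued for.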


----------



## bikramjitkar (Sep 1, 2015)

DeSmOnD dAvId said:


> Well, Nvidia done f****d up.
> 
> www.reddit.com/r/pcgaming/comments/3j1916/get_your_popcorn_ready_nv_gpus_do_not_support/



So basically, anyone in the market for a new GPU that they don't wanna upgrade for the next 2-3 years should only be considering AMD. If only AMD's PR and marketing team were half as good as Nvidia's...


----------



## Desmond (Sep 1, 2015)

warfreak said:


> This is a never-ending battle. The only thing I care about is power consumption, and in that regard nVidia still trumps AMD. It won't matter if either brand pushes 5% more FPS than the other, but if a card can perform at 5% less energy and/or 5% less cost, then that makes a world of difference.
> 
> It's all about the Watts:FPS:$ ratio. Whoever maintains this balance wins.



Indeed. We need to wait and watch how AMD builds on this. I don't think Nvidia can do much about it other than shipping a software implementation of async compute in one of their driver updates.

I think the next Nvidia architecture will remedy this.


----------



## sam_738844 (Sep 8, 2015)

Read this: Exclusive: The Nvidia and AMD DirectX 12 Editorial - Complete DX12 Graphic Card List with Specifications, Asynchronous Shaders and Hardware Features Explained


----------



## gagan_kumar (Sep 8, 2015)

well, now I am feeling good that I picked an AMD card........ I was starting to wonder why AMD is so focused on the GCN arch even though the performance per watt.....

- - - Updated - - -



warfreak said:


> This is a never-ending battle. The only thing I care about is power consumption, and in that regard nVidia still trumps AMD. It won't matter if either brand pushes 5% more FPS than the other, but if a card can perform at 5% less energy and/or 5% less cost, then that makes a world of difference.
> 
> It's all about the Watts:FPS:$ ratio. Whoever maintains this balance wins.


be happy with your extra 500 bucks saved per year....


----------



## ico (Sep 10, 2015)

so the rumours that DirectX 12 is sort of a rebranded Mantle are turning out to be true?

Good. The first thing I'll buy when I start earning is an R9 Nano.


----------



## Nerevarine (Sep 10, 2015)

^look who's back


----------



## anirbandd (Sep 10, 2015)

Nerevarine said:


> ^look who's back


 [MENTION=26711]ico[/MENTION] 

on topic: im not an nvidia or amd fanboy, so i do not have any preference as of now. what i do have a preference for is per-watt efficiency, and nvidia wins that for now, which leads me to the GTX 960. 

now, dx12 is only just picking up pace, so it will be a year or so before fully fledged dx12 games come out. hence i will stick to nvidia for now. 

however, nvidia WILL inevitably build up something for the dx12 architecture. in case it does not, then back to the red club it is for me.


----------



## ico (Sep 10, 2015)

yea, performance per watt efficiency is good. 

But from a price perspective, it only saves me 3 Masala Dosas a month when I game for 10 hours each day.
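The dosa maths above works out roughly like this (Python; the wattage gap, electricity tariff, and dosa price are all hypothetical assumptions, not figures from the thread):

```python
# Back-of-envelope running-cost saving from a lower-power card.
# The wattage gap, hours, tariff, and dosa price are hypothetical assumptions.

def monthly_saving(watt_gap, hours_per_day, rupees_per_kwh, days=30):
    """Rupees saved per month by drawing `watt_gap` fewer watts while gaming."""
    kwh_saved = watt_gap * hours_per_day * days / 1000.0
    return kwh_saved * rupees_per_kwh

# Say the efficient card draws 100 W less, gaming 10 hours a day at Rs 6/kWh:
saving = monthly_saving(watt_gap=100, hours_per_day=10, rupees_per_kwh=6)
print(saving)  # 30 kWh saved -> 180.0 rupees a month, i.e. ~3 dosas at Rs 60 each
```

Even under fairly heavy daily gaming, the efficiency gap translates to only a modest monthly amount, which is the point being made.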


----------



## seamon (Sep 10, 2015)

Nvidia fanboy here and banned by ^this dude for saying "AMD sucks".

Me interviewing Nvidia representative who works in the Architecture department:

1) Are you working on Pascal now?
A) Pascal is done, we're working on Volta now.
2) AMD beat Nvidia in DX12 benchmarks, your comments?
A) DX12 is still early for benchmarking, but there was a recent benchmark contest where AMD went from 20 FPS (DX11) to 41 FPS (DX12) and Nvidia went from 45 FPS (DX11) to 43 FPS (DX12). So, we're still technically faster. Driver updates will increase performance for Nvidia cards.
3) So, you guys will still beat AMD?
A) Yes, I have to be confident in that.


----------



## gagan_kumar (Sep 10, 2015)

seamon said:


> Nvidia fanboy here and banned by ^this dude for saying "AMD sucks".
> 
> Me interviewing Nvidia representative who works in the Architecture department:
> 
> ...


try joining NVidia after you graduate.... give us some exclusive discount coupons....


----------



## sam_738844 (Sep 10, 2015)

seamon said:


> Nvidia fanboy here and banned by ^this dude for saying "AMD sucks".
> 
> Me interviewing Nvidia representative who works in the Architecture department:
> 
> ...



Seconded, and I will quote "Denial" from Guru3D:



> Just going to increase latency like the AMD one, perhaps even worse since it's partially software based. I doubt this will yield any real benefits. Then again it's not like it matters. AoS is essentially the same thing as the 3D Mark draw call test, the Fury X and the 980Ti tie in performance. Why people care about this stuff so much is beyond me. The console guys see 30% increases because those systems are already CPU starved. Go look at low processor benchmarks of AoS with fast GPUs, pcper has a good example, the worse the processor the more the difference that dx12 makes.
> 
> But in the mean time you have people posting stupid bull****, including technical review sites, like the ars technia article that compares the 290x to the 980ti and circle jerks over the fact that a $300 card performs the same as a $650 one in AoS benchmark. What they fail to mention is that it also performs the same as the Fury X. But that doesn't fit the current narrative so they don't mention it.
> 
> Similarly Nvidia is also ****ing stupid for not just getting an engineer to explain it at all. Have Tom Peterson sit down and just have some slides so people can understand what goes on with this stuff. AMD should also probably put a leash on some of their employees. The technical advertising guy that made all those posts on reddit is looking pretty stupid right now.


----------



## chimera201 (Sep 10, 2015)

> _The technical advertising guy that made all those posts on reddit is looking pretty stupid right now._



lol yes. Why do people believe that AMD won't play tricks? After all, the majority thinks that the FX 8350 is an 8-core processor.


----------



## gagan_kumar (Sep 10, 2015)

chimera201 said:


> lol yes. Why do people believe that AMD won't play tricks? After all, the majority thinks that the FX 8350 is an 8-core processor.


they are not?? I thought they only shared resources....


----------



## chimera201 (Sep 10, 2015)

gagan_kumar said:


> they are not?? I thought they only shared resources....


No, it's Clustered Multithreading, not anywhere near double the cores. That is why Intel's i5 beats the FX in almost every game. Read this:
AMD Bulldozer/Piledriver Modules and Hyper-Threading « Blog


----------



## ico (Sep 11, 2015)

seamon said:


> Nvidia *fanboy* here and banned by ^this dude for saying "AMD sucks".


actually, you got banned for being an idiot. All fanboys are.

AMD is going bankrupt in 2 years anyway; maybe then you can have your little laughs.


----------



## Nerevarine (Sep 11, 2015)

> AMD is going bankrupt in 2 years anyway; maybe then you can have your little laughs.




Any specific reason, apart from their low market share?


----------



## seamon (Sep 11, 2015)

gagan_kumar said:


> try joining NVidia after you graduate.... give us some exclusive discount coupons....



That's the plan.


----------



## sam_738844 (Sep 11, 2015)

Aquanox Dev:

> We aim to develop a game that is enjoyable to everyone who wishes to join the world of Aqua. Implementing and/or focusing on technologies that would limit certain people from accessing the game is entirely against our philosophy of being a community focused developer. If at any point, there will be an implementation possible that will not limit NVIDIA card users, then we will certainly explore this option as well.



Read more: Aquanox Dev: We'd Do Async Compute Only With An Implementation That Doesn't Limit NVIDIA Users


----------



## chimera201 (Sep 11, 2015)

Nerevarine said:


> any specific reason ? apart from their low market share



You never read the financial reports?


----------

