# X1800XT vs. 7800 GTX



## venkat1605 (Oct 21, 2005)

Hey Icecoolz, I am really sorry for what happened between us. I personally wanted to ask you a question. My brother is coming back from Canada and I have asked him to get a GFX card. He personally recommended an X1800XT (512 MB), which is to be released shortly, but I also have the 7800 GTX in mind. Which one should I go for? If it's the 7800 GTX I would use it in SLI, and if it's an X1800XT I would use it in CrossFire. Hope you reply soon. This is open to others also.


----------



## sahil_blues (Oct 21, 2005)

read this....it might clear out something!!


----------



## venkat1605 (Oct 21, 2005)

Hey Sahil, thanks a lot for that.


----------



## enoonmai (Oct 22, 2005)

The X1800XT is better than the 7800 GTX in CrossFire mode or even in single-card mode, but things aren't really crystal clear. While the ATI cards are better in D3D games, the Nvidias are better in OpenGL games, although the performance difference between them otherwise isn't all that great. Of course, the X1800's 512-bit internal memory architecture is vastly superior. You've got to get yourself a CrossFire system, and a superior top-of-the-line CPU, like an FX-57 or a dual-core 4400+, to fully push these babies to their max.

Just wondering, casually, of course: is it possible to use two dual 7800 GTs in SLI mode, kinda like dual SLI? A bit too lazy to Google.


----------



## sahil_blues (Oct 22, 2005)

@venkat you could probably remove the "Attn Icecoolz" part and let it be just "X1800XT vs. 7800 GTX"....i think it'll help you get a lot more replies....i hope you don't mind....it is only a suggestion....

@enoonmai did u check out my link??....


----------



## enoonmai (Oct 22, 2005)

@Sahil: Yes, I am well aware of the benchmarks that thrash the 7800 GTXs. However, it's undeniable that Nvidia has an edge over ATI in OpenGL games, and ATI has an edge over Nvidia in D3D games. And most new game developers and publishers are "aligning" with one of these companies and optimizing their code to run better on specific cards. For example, no matter what you try, you cannot run Doom 3 as perfectly on an X800 as you can on a 6800, and vice versa for HL2. There will always be a subtle, but definitely noticeable, difference. In the end, benchmarks are nothing but "show off" numbers. True, they are indicative of performance to a degree, but a higher score does not necessarily translate to better performance, especially when it comes to "hardware optimized" games like D3, HL2 etc. that align themselves with a particular vendor.

In the end, it really doesn't matter how much the X1800 scores over the 7800s, because Nvidia has already won this round. Apart from a magnificent paper and actual launch of the card, the 7800 series cards are very widely available everywhere, unlike the phantom-series ATI cards, which are available only in limited quantities. A customer might just be tempted to go in for a 7800 GTX SLI setup rather than wait for an X1800 XT CrossFire setup and sit and watch while his friends play at glorious resolutions.


----------



## gxsaurav (Oct 22, 2005)

Haven't you read all the benchmarks posted all over the internet?

The 7800GTX has won the round already. It's available today, at a price far lower than even an X1800XL. The X1600XT is launched to beat the 6600GT, but it costs as much as a 6800nU ($250), which beats it by a fair margin.

F.E.A.R. is out now, which is DirectX 9.0c based, but since there are no X1800XTs to be found, people have no choice but to buy a 7800GTX to play it at max settings.

Quake 4 is out now, which is OpenGL based, and plays best on the 7800GTX.

One thing to note: none of the benchmarks are completely fair. They are benchmarking a 512 MB X1800XT against a 256 MB 7800GTX, because of which the X1800XT wins by about 10 frames at most. At the most common playing settings of 1024x768 with AA & aniso, the 7800GTX beats the X1800XT easily, despite having less RAM.

enoonmai

The Matrox Parhelia already had a 512-bit internal interface a long time back, which wasn't used properly at that time, so it came out as a failure. I guess this might happen with ATI too.


----------



## Nemesis (Oct 22, 2005)

> From these two synthetic and four "real world" game engines you can see that ATI has taken the 16 pixel shader and 8 vertex shader pipelines from the X850 generation and massively overhauled them into a highly efficient system. In many of the tests at 10x7 and 16x12 resolutions, the 24 and eight pipelines of the NVIDIA GeForce 7800 GTX could not keep up with the ATI X1800XT.
> 
> One of the most drastic leaps was in 3DMark 2005 where the 16x12 4xAA 8xAF score lead the NVIDIA 7800 GTX by over 1,100 marks.



Source: THG

While the ATi card has more memory, you seem to conveniently ignore the fact that it uses only 16 pipelines as compared with 24 for the 7800GTX. Besides, when you can afford such high-end cards, a price difference of $50-100 makes no difference if you can get better performance.


----------



## blade_runner (Oct 22, 2005)

Between the x1800xt and 7800gtx I would suggest going for the x1800xt series. It's clearly faster, with amazing IQ, angle-independent AF, HDR+AA and Avivo. Plus remember the fact that it beats the 7800gtx series without proper drivers yet. Yes, the drivers haven't been optimised as yet since it is a fairly new architecture. Once the drivers mature you should see a significant increase in performance. I'll ask the same question once Cat 6 is out, and we'll see who's faster then.

Nemesis pointed out this thread to me and I couldn't resist.

EDIT: Also last week a small registry tweak gave a 30% increase to x1800 cards in doom 3 with aa at higher resolutions. This is just the tip of the iceberg here.


----------



## deathvirus_me (Oct 22, 2005)

7800GTX any day... much better architecture. Moreover, the extra pipes surely help when you turn on AA/AF.

Also, having 512 MB RAM over a 256-bit memory interface doesn't give a big improvement. If it did, the 512 MB 6800U would have been better than the 7800GT.


----------



## AlphaOmega (Oct 23, 2005)

Who told you guys that the X1800 'beats' the 7800? Everywhere I found the comparison, people are saying that the new card from ATi is a disappointment (compared to the 7800). While it finally brings ATi up to par with nVIDIA in terms of feature set, the performance does not make it a clear-cut winner.
Below are some numbers from AnandTech:
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9341.png
Here the 7800 (both) is a clear winner, though Doom3/OpenGL has always been kinder on nVIDIA.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9342.png
This one is interesting. The 7800 actually takes a lead in DoD (albeit an insignificant one). But methinks that an nVIDIA card one-upping an ATi one in a game based on Source is still impressive. A driver update might change that.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9343.png
Far Cry is still one tough engine, though, as far as I know, there aren't any games coming out based on it. ATi wins here.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9344.png
SC: CT is the newest game, and features support for every new feature there is. ATi fairly whoops nVIDIA's @ss. Though the situation changes when you push higher, as AnandTech did later. Read on...

*All tests below are to check the future validity of these cards. The kid gloves are off:*
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9345.png
nVIDIA still the king here. No surprise here.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9346.png
This I found intriguing. nVIDIA actually pulls away when the settings go through the stratosphere! In Source! Did Valve have a falling out with ATi?

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9347.png
ATi again wins, but by a minuscule difference of .2 FPS. With the increase in resolution nVIDIA catches up. The difference is so insignificant that it can't really be called a difference. And the thing is, since ATi had a sizable lead at lower resolutions, losing ground at higher resolutions is a major drawback.


*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9348.png
The same as with Far Cry, ATi lets nVIDIA catch up at this resolution. Not good for ATi.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9349.png
nVIDIA wins here, but it probably could have gone either way.

So, going by AnandTech, the X1800 is not quite the 7800 killer ATi has made it out to be. Especially losing ground at higher settings is a big blow to ATi, since it raises questions of future proofing.

Please see: the X1800 wins at 3DMark05, with a lead of 700-1000 marks approx. However, going by synthetic benchmarks is very risky. My rig (though it barely qualifies as a 'rig'), P4 2.4c (with HT), i865GBF, Kingston 2x256MB PC-3200 Dual Channel, XFX nVIDIA 6600GT 128MB DDR3, Seagate 80GB 7200, gets around 3400-3450 in 3DMark05. This is really strange, since a system with an AMD 64 3400+, Corsair 1GB RAM, XFX 6600GT 128MB DDR3 gets approx. 3171 marks. So does that make my system better? Of course not, as in real-world tests that system hangs mine out to dry.

*enoonmai*: I don't recall reading anywhere that the X1800 is a 512*bit* card, but if you say so... While a higher-bit memory interface is never a bad thing, it does not guarantee a better card. Case in point: I was reading a comparison between the 6600GT 128MB (with a 128-bit interface) and the X800 GTO 256MB (with a 256-bit interface) (some vendors simply call it the X800*GT*). The 6600GT beat the card in most tests and came out the winner, with the exception of 3DMark05, where it lagged behind a bit. The core should have enough throughput to actually match the bandwidth provided by a 512-bit interface. And the X1800 features 'only' 16 pixel pipes, compared to the 24 and 20 the 7800GTX and 7800GT have, respectively. The higher the number of pixel pipes (among other things), the higher the memory bandwidth requirements.
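A quick back-of-the-envelope way to see why bus width alone doesn't decide a winner: peak memory bandwidth is just bus width times effective data rate, so the core has to be fast enough to actually use it. A rough sketch (the clock figures are commonly quoted reference specs, assumed here, not taken from this thread):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Clock figures below are assumed reference specs, not from this thread.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes here)."""
    return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

# 6600GT: 128-bit bus, 1000 MHz effective GDDR3
print(peak_bandwidth_gb_s(128, 1000))  # 16.0
# A 256-bit card at the same data rate has double the paper bandwidth...
print(peak_bandwidth_gb_s(256, 1000))  # 32.0
# ...yet the 6600GT still won most of those tests: the core, not the bus,
# was the limiting factor.
```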

ATi's CrossFire technology is nowhere near SLI. You can't just pick up two ATi cards and put them together to get a CrossFire system, like you can with nVIDIA's SLI. You need a special CrossFire Edition master card that acts as the compositing engine and output device for the system. This card will not only cost more, but you will have a hard time finding it. Moreover, with CrossFire in place you get limited to 1600x1200 resolution at 60Hz. If you are putting two extremely expensive cards together, you wouldn't want to be limited in any way, especially to a tear-my-eyes-out 60Hz refresh.
The initial versions of nVIDIA's SLI supported either a single slot/card running at 16X PCIe or a two-card SLI setup with both running at 8X PCIe. This is what CrossFire offers even now, when nVIDIA has already updated SLI so that both PCI-e slots run at a full 16X. This is a drawback for CrossFire, since there is no way an 8X PCIe slot can meet the complete bandwidth requirements of a 256-bit 7800, let alone a 512-bit card. Also, as far as I know, CrossFire-enabled mobos are quite hard to find, but I could be wrong.
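For the lane-count argument, the raw numbers are easy to sketch out: PCIe 1.x moves 250 MB/s per lane per direction, so halving the lanes halves the slot bandwidth on paper (a rough sketch; whether games actually saturate even an 8X link is another question entirely):

```python
# PCIe 1.x: 250 MB/s per lane, per direction.
PCIE1_MB_PER_LANE = 250

def slot_bandwidth_gb_s(lanes: int) -> float:
    """One-direction slot bandwidth in GB/s for a PCIe 1.x link."""
    return lanes * PCIE1_MB_PER_LANE / 1000

print(slot_bandwidth_gb_s(8))   # 2.0  (8X slot)
print(slot_bandwidth_gb_s(16))  # 4.0  (full 16X slot)
# Either way, both are far below the tens of GB/s a high-end card gets from
# its own local memory, which is why textures live on the card, not the bus.
```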

So, in my opinion the 7800GTX (single or SLI) is the better choice for today and tomorrow, despite the X1800 being a very impressive card. The future might be favourable to the 7800 cards, with their higher pixel pipe count.

I wish ATi could have given us the same card that they developed for the XBOX 360, with the interchangeable on demand Pixel and Vertex pipelines. Now THAT would have given nVIDIA nightmares.


----------



## goobimama (Oct 23, 2005)

The thing about the 7800 is that it's available. Nvidia has made sure there is proper distribution of its new card. The X1800, God alone knows when India will be able to get it...


----------



## AlphaOmega (Oct 23, 2005)

goobimama said:
			
		

> The thing about the 7800 is that its available. Nvidia has made sure there is a proper distribution of its new card. The 1800, God alone knows when India will be able to use it...



The availability of the 7800 is a plus point for nVIDIA. But if Venkat chooses the X1800 over the 7800 and his brother can get it for him, then availability becomes a moot point, at least for this discussion.


----------



## Major-Minor (Oct 23, 2005)

AlphaOmega said:
			
		

> *enoonmai*: I don't recall reading anywhere that the X1800 is a 512*bit* card, but if you say so...




The X1800 has 512bit *internal* memory ring bus, externally it still has a 256bit memory interface.

Oh, and if anyone is interested, I just found out yesterday that BigByte has stopped stocking XFX cards; they will now be selling only their own brand of cards, the BIG brand, if you were wondering.
I also got the rates for both the brands (inclusive of taxes) - 
XFX 7800GTX - 31k
XFX 7800GT - 26k 
(Rates from Rashi Peripherals)

BIG 7800GTX - 29k 
BIG 7800GT - 24k
(Rates from BigByte)

I was also told by Mr. Vikas at BigByte that the BIG 7600 should probably be available in Dec.


----------



## funkymonkey (Oct 23, 2005)

First let nVIDIA announce the 7600 and then we will see.
About the 7800 series:
It has its own set of issues. Both cards are more than powerful enough to run your games at max settings. I own a 7800GT, and the same alpha-texture problem that existed in the GF6 series is there with the GF7 series.
What does it mean?
Well, where a game uses alpha textures to render shadows, there will be huge problems with the 7800GTX or GT.
The shadows don't get rendered correctly and appear as a corrupted, blocky texture.
The same things are rendered beautifully on any ATI card from the 9700 to the X1800 series.
That's disappointing to see. IQ-wise ATI has the upper hand at this moment. It's your choice what to pick.
Given the choice to pick either the 7800GTX or the X1800XT, I would pick the X1800XT any given day.


----------



## gxsaurav (Oct 23, 2005)

even I am waiting for the 7600 series, since it supports OpenGL 2.0 which I need; at last, a viable upgrade from my 5900XT


----------



## AlphaOmega (Oct 23, 2005)

gxsaurav said:
			
		

> even I m waiting for 7600 series, since it supports OpenGL 2.0 which i need, at last a viable upgrade from my 5900XT




Doesn't the GF 6x00 support OpenGL 2.0? I think it does, as my 6600GT box says so, and so do the XFX, eVGA and BioStar sites.
*www.biostar.com.tw/products/vga/GeForce_6600_Series/index.php3
*www.xpcgear.com/evga6600tx.html
*www.xfxforce.com/web/product/listConfigurationDetails.jspa?productConfigurationId=1084




			
				funkymonkey said:
			
		

> Well where game uses alpha estures to render shadows there will be huge problems


Well, the 7800 is the first, and currently only, card to support AA in alpha textures. Traditionally, cards can only remove the jaggies from the edges of textures, but the new alpha AA can remove them from _inside_ textures, where they are transparent.
The blocky shadow problem was also affecting ATi cards, at least in UT 2004, as far as I know. That problem was removed from BF2 by using ForceWare 77 or higher. ATi ruled the roost, in IQ, during the time of the Radeon 9x00 series. Even the lowly 9200 was visibly better than comparable nVIDIA cards (in Doom 3, which I have seen on a 9200 and a 5200/5600). But now, with the 6 and 7 series, ATi and nVIDIA are more or less on par. Also, pushing quality settings, AA and AF on the 7800 will be easier, due to its 8 extra pixel pipes.


----------



## asdf1223 (Oct 23, 2005)

Comparing AnandTech's and Tom's benchmarks, the 7800gtx is the fastest (1600x1200, no AA), but it seems the x1800xt takes the lead with AA/AF all the way up. But ATi's problems are the availability of X1800 XTs/master cards, power consumption, and the fact that nVidia can beat them any day with an Ultra version.


----------



## AlphaOmega (Oct 23, 2005)

asdf1223 said:
			
		

> comparing anandtech and toms benchmark the 7800gtx is the fastest(1600x1200 no aa) but it seems x1800xt takes the lead with aa/af all the way up. but atis problems are availablity of x1800 xts/master cards,power consumption and the fact that nvidia can beat them any day with an ultra version.



ATi has really created a super-efficient engine, something nVIDIA should try to emulate. But I wonder how far an engine that is merely efficient can go against an engine that is much more powerful, with 50% more pixel pipes? Especially when games are getting more and more shader intensive, like the upcoming Unreal Engine 3.
This can be seen when the resolution is really increased, which is even more stressful than enabling AA: when AnandTech sets it to 2048x1536, the 7800 either increases its lead or catches up to the X1800.

Maybe, in the following generation, ATi will marry its efficiency to a powerful engine and then we will have another case like AMD and Intel, where AMD pulled ahead with sheer efficiency during the Athlon XP era, but could not keep up with the amped-up Northwood later. Then AMD not only made an efficient core, but one that was also super powerful, the Athlon 64. Intel is still reeling in the aftermath...
IMO, ATi already has such a core, the XBOX 360 chip. I am betting that the only thing holding ATi back from releasing it now is some kind of agreement with Microsoft. The next-gen ATi card for PCs will most likely carry that architecture, but that is still 6 months away.


----------



## Nemesis (Oct 23, 2005)

Correct me if I'm wrong, but aren't ATi already working on the R580? If that's the case, then we can expect to see a terrific card from ATi that builds up on the R520. As far as I know, the 360 is just using a modified R520 - even Nintendo will be using a custom R520 chip. I doubt if ATi will be allowed to release these custom chips for the PC market.


----------



## AlphaOmega (Oct 23, 2005)

Nemesis said:
			
		

> Correct me if I'm wrong, but aren't ATi already working on the R580? If that's the case, then we can expect to see a terrific card from ATi that builds up on the R520. As far as I know, the 360 is just using a modified R520 - even Nintendo will be using a custom R520 chip. I doubt if ATi will be allowed to release these custom chips for the PC market.



I don't think that ATi's XBOX 360 chip is a modified R520, at least from what I know about its architecture. It is codenamed Xenos, and has a unified shader architecture, meaning that the pixel and vertex shaders are not discrete hardware units. The shaders present can act as _either_ pixel or vertex shaders. Depending on the demands of the game, any number of shaders can be set to process either. If a scene is render-heavy, more shaders can be allocated for pixel processing, or if a scene requires more geometry, then more shaders can be set for vertex functionality. Talk about efficiency!
This architecture is so radically different from anything we have seen that I don't think that Xenos is merely an updated version of R520, and there are plenty of other differences.
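To make the unified-shader idea concrete, here is a toy sketch (purely illustrative, not ATI's actual scheduler; the 48-unit pool matches what has been reported for Xenos) of splitting one pool of shader units between pixel and vertex work in proportion to demand:

```python
# Toy model of a unified shader pool: units are not fixed as "pixel" or
# "vertex" hardware, but assigned dynamically based on the current workload.

def allocate_shaders(total_units, pixel_load, vertex_load):
    """Split total_units proportionally between pixel and vertex work."""
    total_load = pixel_load + vertex_load
    if total_load == 0:
        return total_units, 0
    pixel_units = round(total_units * pixel_load / total_load)
    return pixel_units, total_units - pixel_units

# A render-heavy scene pulls most of the 48 units toward pixel work...
print(allocate_shaders(48, pixel_load=90, vertex_load=10))  # (43, 5)
# ...while a geometry-heavy scene shifts them back toward vertex work.
print(allocate_shaders(48, pixel_load=30, vertex_load=70))  # (14, 34)
```

A fixed-function design, by contrast, would leave vertex units idle in the first scene and pixel units idle in the second.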

ATi will not be allowed to release the exact same chip as the 360 to the PC. But no one can stop it from using an architecture that it has developed. Kinda like the XBOX (1) chip, which was better than the GF3, but nVIDIA released an even better one as the GF4Ti. And, if I am not wrong, XBOX GPU's shader programmability was announced before GF3.

Of course ATi is already at work on its next chip. Since graphics hardware has a product cycle of approx. 6 months, ATi must have been working on the next chip for over a year. The work probably started as soon as the R520 left the chip design labs, as no one can possibly churn out a new chip from scratch in just 6 months! I would like to know if this new chip has the shader flexibility, cause vertex units have always gotten the short end till now.


----------



## vmp_vivek (Oct 24, 2005)

Maybe this will help: *hardware.gamespot.com/Story-ST-23719-2610-9-9-x


----------



## AlphaOmega (Oct 24, 2005)

BTW, does anyone know that the next version of 3DMark is in the works? The guys at FutureMark have a policy to update the benchmark once the magical 10K mark is broken.

The new version will definitely add support for the AGEIA PhysX PPU (oh God, not another component required for gaming), and will most likely support multi-threading.

3DMark05 is no longer a valid benchmark, as both nVIDIA and ATi bring unique performance enhancements to the table. And there have been plenty of developments on other components, like multi-core CPUs.

Even then, regarding GameSpot's review, at 1024x768 X1800XT *512MB* gets 9240 against 7800GTX *256MB*'s 7749. The difference being 1491 Marks. Upping the resolution to 1600x1200, X1800XT's score is 6549, while the 7800GTX gets 5812. The difference has more than halved, to 737.
I think that at low resolutions, like 1024, the X1800XT scores by having much higher frequencies, but increasing the resolution lets the 7800GTX flex its fillrate muscle.
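The arithmetic above is easy to double-check by re-running the quoted GameSpot numbers:

```python
# 3DMark05 scores quoted above (GameSpot review).
scores = {
    "1024x768":  {"X1800XT 512MB": 9240, "7800GTX 256MB": 7749},
    "1600x1200": {"X1800XT 512MB": 6549, "7800GTX 256MB": 5812},
}

for res, s in scores.items():
    gap = s["X1800XT 512MB"] - s["7800GTX 256MB"]
    print(f"{res}: X1800XT leads by {gap} marks")
# 1024x768: X1800XT leads by 1491 marks
# 1600x1200: X1800XT leads by 737 marks -> less than half the low-res gap
```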

Guess we will really have to wait for the new 3DMark (an event I find really depressing), cause 05 seems to have hit its peak.

Don't go by 3DMark scores only, I have usually found them to be better at judging non-GPU components, like CPU, RAM etc. to find which component is holding back the system. Just my opinion.


----------



## funkymonkey (Oct 24, 2005)

> Well, the 7800 is the first, and currently only, card to support AA in alpha textures. Traditionally, card can only remove the jaggies from the edges of textures, but the new AlphaAA can remove them from _inside_ textures, where they are transparent.
> The blocky shadow problem was also effecting ATi cards, at least in UT 2004, as far as I know. That problem was removed from BF2 by using ForceWare 77 or higher. ATi ruled the roost, in IQ, during the time of the Radeon 9x00 series. Even the lowly 9200 was visibly better than comparable nVIDIA cards (in Doom3, which I have seen on 9200 and 5200/5600). But now, with the 6 and 7 series, ATi and nVIDIA are more or less on par. Also, pushing quality settings, AA and AF on the 7800 will be easier, due to its 8 extra pixel pipes.



Dude, what you are talking about is gamma-correct AA and transparency AA.
I am not talking about that. Have you seen the screenies of Far Cry with the GF6 and GF7 series? The shadows cast by trees are all messed up.
The problem with BF2 was a simple driver bug and it's solved. That has nothing to do with what I am talking about here.
The whole GF6 and GF7 series has problems where objects casting shadows use alpha textures to do so. This is clearly visible on the entire GF6 and GF7 range. It has nothing to do with transparency AA or gamma-correct AA; it happens irrespective of whether those are enabled or not.
And no, none of the ATI products have these rendering errors.
Nvidia is still using angle-dependent AF, which ATI has ditched with the X1800 series, which is good. That will give it an extra edge in IQ.


----------



## AlphaOmega (Oct 25, 2005)

funkymonkey said:
			
		

> > Well, the 7800 is the first, and currently only, card to support AA in alpha textures. Traditionally, card can only remove the jaggies from the edges of textures, but the new AlphaAA can remove them from _inside_ textures, where they are transparent.
> > The blocky shadow problem was also effecting ATi cards, at least in UT 2004, as far as I know. That problem was removed from BF2 by using ForceWare 77 or higher. ATi ruled the roost, in IQ, during the time of the Radeon 9x00 series. Even the lowly 9200 was visibly better than comparable nVIDIA cards (in Doom3, which I have seen on 9200 and 5200/5600). But now, with the 6 and 7 series, ATi and nVIDIA are more or less on par. Also, pushing quality settings, AA and AF on the 7800 will be easier, due to its 8 extra pixel pipes.
> 
> 
> ...



I know _that_! It's just that your talk of alpha textures reminded me of the 7800's ability to apply AA to alpha. What I have written before and after are two completely different things. Actually, there is a problem with ATi cards when creating shadows in UT 2004, but it seems to be an isolated problem, which does not appear in other games. Here is a link to a screenshot of the weird shadows on an ATI X700 Pro in DM-1on1-Desolation:
*i19.photobucket.com/albums/b182/forumposter32/utshadows.jpg

The blocky shadow problems with nVIDIA cards will most likely be solved in a driver release. Also, in Far Cry you can practically clear it up with this command: "e_activeshadowmapsreceiving 1" (default is 2) (though I haven't tried it and don't think it is a perfect solution; will have to test it to find out). Is the problem visible in any other games except Far Cry? I have heard of it being visible a couple of times in 3DMark05.

And this is not the only problem with cards/drivers. There was also the texture shimmering problem with the 7800, which ATi found and reported. nVIDIA resolved it in the 78 drivers. But then it was found that ATi had it too! I don't know if it has been resolved yet, but ATi had not done so till Catalyst 5.8.
*www.hardwareanalysis.com/content/article/1812/

Another thing. I noticed something strange when running Far Cry: The Project (the tech demo from CryTek) on my XFX 6600GT 128MB DDR3 AGP n/OC. During the entire demo, there was a weird kind of 'shimmer' (not to be confused with 'the shimmering problem') or blurriness visible on different parts of the screen, like an out-of-focus camera. First I thought that it was a problem with my monitor; it was not. It was like there was a patch of blurriness on a random part of the screen. I don't know what is causing it. I ran it on ForceWare 78.01. Haven't checked it on 81.85 yet.


----------



## blade_runner (Oct 25, 2005)

AlphaOmega said:
			
		

> funkymonkey said:
> 
> 
> 
> ...



The console command e_activeshadowmapsreceiving 1 hasn't solved the blocky texture bug yet, mate. The issue is still very much there.
Picture Link
Also, if it was a driver issue like you pointed out: Far Cry has been out for over a year now and this issue has been noticed with Nv's two generations of cards, mainly series 6 and 7. Let's consider that this is a driver issue; in that case it's been over a year, so why is it not fixed yet?
This is the bug on a 7 series card, specifically a 7800gt:
*img468.imageshack.us/my.php?image=farcry0005large0cz.jpg

As for shimmering: at the default quality settings for AF the games still shimmer, and only hi-quality AF fixes it, albeit with a performance hit. So that makes one wonder whether Nv is really "over-optimising" again.

For those who dunno, the OpenGL crown has been taken away from Nvidia.
Check out the following:

*www.techenclave.com/forums/quake-4-high-end-graphics-shootout-36983.html

*www.techenclave.com/forums/ati-produce-tool-increase-doom3-scores-29508.html

posted the linkies i had off-hand. 

EDIT: a new fix that increases FPS without AA as well
*www.guru3d.com/newsitem.php?id=3208 

And for memory controller optimisations this is just the beginning; we should see similar gains in D3D games as well soon.


----------



## blade_runner (Oct 25, 2005)

AlphaOmega said:
			
		

> Nemesis said:
> 
> 
> 
> ...



Hehe, the r580 has already taped out and is ready to go! The r580 team was working separately while another team was working on the r520. Besides, the r580 never faced the issues that the r520 faced. But don't expect to see the chip in the market soon. We might see the GPU socket with the r580.


----------



## AlphaOmega (Oct 25, 2005)

blade_runner said:
			
		

> The console command e_activeshadowmapsreceiving 1 hasn't solved the blocky texture bug yet mate. The issue is still very much there.
> link
> Also If its was a driver issue like you pointed out then farcry has been out for over an year now and this issue has been noticed with Nv's 2 generations of cards mainly series 6 and 7. Let's consider that this is a driver issue, in that case its over an year, so why is it not fixed yet ?
> This is the bug on 7 series card specifically a 7800gt
> ...



I haven't tried out the Far Cry command, as I don't have enough space on my HDD, due to DataOne ahem, "downloads" , but many people report that it almost (not totally) clears up the problem. Again, I haven't checked it out, so I personally can't say.

As for the shimmering problem, I got my 6600GT when the 78 ForceWare was already out, so I have only tested it on ForceWare 78 and 81. There was absolutely no texture shimmering while playing these games, with or without AA/AF (I normally test each game with and without these settings):
Doom 3: RoE - 78
GTA: San Andreas - 78
UT 2004 - 81
BF2 - 78
Halo - 81
NFS U2 - 78
Psi Ops - 78
Boiling Point - 81
Pariah - 78
Area 51 - 81

About the problem from my previous post, the Far Cry: The Project one. Has anyone else had the same problem?

nVIDIA's next offering (G80?) will probably be in the market by early 2006, and will one-up the X1800. Since the X1800 has been released quite recently, I doubt that the R580's release will coincide with nVIDIA's. I really preferred it when both companies had near-simultaneous releases, cause now I don't know if nVIDIA, or ATi, is one step ahead or one step behind.

The X1800 competes with the 7800, and will end up competing with the 8x00 until the R580 comes out, so which one is ahead? One vendor will have to mess with their release dates to even this out.

BTW, what will nVIDIA's next-to-next card be called? The 9800?


----------



## blade_runner (Oct 25, 2005)

AlphaOmega said:
			
		

> I haven't tried out the Far Cry command, as I don't have enough space on my HDD, due to DataOne ahem, "downloads" , but many people report that it almost (not totally) clears up the problem. Again, I haven't checked it out, so I personally can't say.
> 
> As for the shimmering problem, I got my 6600GT when the 78 ForceWare was already out, so I have only tested it on ForceWare 78 and 81. There was absolutely no texture shimmerning while playing these games, with or without AA/AF (I normally test each game with and without these settings):
> Doom 3: RoE - 78
> ...


Well, the shimmering problem is related to anisotropic filtering, so you notice it only with AF on at default settings. Although the prob has been claimed to be solved, shimmering still exists. And anyway, angle-independent AF is better than Nv's angle-dependent AF. I've read horror stories of the Nv AF shimmering in WoW and other games. There was a video as well, though I don't have a link right now.



> About the problem from my previous post, the Far Cry: The Project one. Has anyone else had the same problem?
> 
> nVIDIAâ€™s next offering (G80?) will probably in the market by early 2006, and will one-up X1800. Since the X1800 has been released quite recently, I doubt that the R580â€™s release will coincide with nVIDIA. I really preferred it when both companies had near simultaneous releases, cause now I donâ€™t know if nVIDIA, or ATi, is one step ahead or one step behind.
> 
> ...


About the g80: well, I don't think it's going to be launched early 2006, cause Nv still has to release the mid-budget (g72) and low-end (g7?) parts for the g70 generation. Right now Nv is one step ahead, but that's only for like 2 months or so since the 7800 series released. But then again, ATi will have all the parts in the market before Nv, i.e. the whole high-end to low-end range. So it's more or less nullified.

Haha, good joke about the 9800! I can imagine a GeForce 9800GTX.


----------



## funkymonkey (Oct 25, 2005)

well the shimmering and improper filtering is not the driver bug. Its hardware. Thats how GF7 and GF6 will perform anisotropic filtering. With no AF you wont notice any shimmering bcoz its visible only when AF is turned on. It will be always there less or more depending on the games and quality mode you use.
Nvidia still uses Angle dependent AF. And till it keeps using it, perfect AF wont be achieved.
And the shadow problem exsists in many games. Not only farcry but in many games.

About The Project tech demo: the demo is designed specifically for ATi hardware, and there are out-of-focus bands on nVIDIA hardware. The only way to reduce them a bit is to use a tool called 3DAnalyze and fake the X800XT device and vendor IDs. That will render the water correctly, but the focus problem still remains.


----------



## icecoolz (Oct 25, 2005)

I'm sure at the end of all this Venkat is all at sea, and did not get an answer to his questions  

Guys (funkey, Alpha, blade, nem and the rest), just tell the guy in one line what he should get and let him be the judge.


----------



## AlphaOmega (Oct 25, 2005)

funkymonkey said:
			
		

> Well, the shimmering and improper filtering is not a driver bug; it's hardware. That's how the GF7 and GF6 perform anisotropic filtering. With no AF you won't notice any shimmering, because it's visible only when AF is turned on. It will always be there to some degree, depending on the game and the quality mode you use.
> nVIDIA still uses angle-dependent AF, and as long as it keeps using it, perfect AF won't be achieved.



I haven't checked with older drivers, but I have read that 78 and higher 'fix' the shimmer problem, which is also present in ATi cards, at least up to the X800. I'm not sure, but I remember that the shimmer was caused by an aggressive negative LOD bias, so it should have been fixable with a driver update. And as per reports and my own experience, it has been fixed with ForceWare 78. UT 2004 (BF 1942 too) was one of the games with the biggest shimmer problem, and I didn't see it, despite trying different quality settings.
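The negative-LOD-bias explanation can be made concrete with a toy mip-selection function. This is a deliberate simplification for illustration, not either vendor's actual hardware logic; `mip_level` and its parameters are hypothetical:

```python
import math

def mip_level(texel_footprint, lod_bias=0.0, max_level=10):
    """Pick a mipmap level from the screen-space texel footprint.

    A footprint of 1.0 means one texel per pixel (level 0); each
    doubling of the footprint steps down one level of detail.
    The bias shifts the choice before clamping to the valid range.
    """
    level = math.log2(max(texel_footprint, 1.0)) + lod_bias
    return min(max(level, 0.0), float(max_level))

# A surface receding into the distance: the footprint grows.
for footprint in (1.0, 4.0, 16.0):
    print(footprint, mip_level(footprint), mip_level(footprint, lod_bias=-1.5))
```

With a negative bias the sampler picks a more detailed mip than the pixel footprint warrants, which undersamples the texture and shows up as shimmer in motion; clamping the bias toward zero is exactly the kind of fix a driver update like ForceWare 78 could plausibly apply.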



			
				icecoolz said:
			
		

> Guys (funkey, Alpha, blade, nem and the rest), just tell the guy in one line what he should get and let him be the judge.



That's the problem. The choice is not clear: neither product is overwhelmingly superior to the other. The 7800 seems better when you ramp up the resolution, but ATi is better at AA; ATi's 4x AA, for example, is nearly as good as nVIDIA's 6x, leading to better IQ at comparable settings.
So, finally, it is up to Venkat to choose. Both cards are more than capable of handling games for a minimum of 2-3 years, all things considered.

Choosing between bleeding-edge technologies is always a gamble, but if I had to choose I would go with the 7800GTX, simply because its 8 extra pipelines should give it an edge in upcoming shader-intensive games.
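The pipeline argument is worth quantifying. Using the commonly cited specs, 24 pipelines at a 430 MHz core for the 7800 GTX versus 16 at 625 MHz for the X1800XT (figures assumed from published reviews, not from this thread), raw pixel fillrate is close to a wash, so the extra pipes mainly pay off in per-pixel shader work:

```python
def fillrate_gpix(pipelines, core_mhz):
    """Theoretical pixel fillrate in Gpixels/s: pipelines x core clock."""
    return pipelines * core_mhz / 1000

# Commonly cited specs; treat the exact figures as approximate.
print(fillrate_gpix(24, 430))  # 7800 GTX: 10.32
print(fillrate_gpix(16, 625))  # X1800XT: 10.0
```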


----------



## blade_runner (Oct 25, 2005)

icecoolz said:
			
		

> I'm sure at the end of all this Venkat is all at sea, and did not get an answer to his questions
> 
> Guys, funkey, Alpha, blade, nem and the rest. Just in a line tell them guy what he should get and let him be the judge.



Hahah ice  this always happens when recommending products. Like alphaomega observed that he would suggest venkat to go for a 7800gtx. Like wise i would suggest going for the x1800xt since thats the absolute fastest in opengl and d3d games right now and add to that avivo, regular WHQL drivers, programmable AA and MC so you can expect regular optimisations and benefits for your favourite games, the angle independent AF and the ability to do HDR + AA which is a hardware shortcoming in the 7800 series. Add to that the very fact that the r520 has some extra silicon left to accelerate physics, AI and other computational tasks. ATI already demoed the power of the r520 in accelerating physics. All this makes the x1800xt a very very desirable package.  Even the adaptive/transparency AA is now possible on ATI's old generation including the r300 series


----------



## icecoolz (Oct 26, 2005)

Gee, that's one big line, Blade  Anyway, I know of the R520's capabilities, though not in as much detail as you do (why am I not surprised ). But the R520 does seem like the better deal.


----------



## raj14 (Oct 26, 2005)

Although the nVIDIA 7800GTX is destroyed by the X1800XT, nVIDIA has still won this round: ATI's R520 was delayed too much to make an impact on the mainstream market, while the nVIDIA 7800GTX was globally launched the very same day it was unveiled, without much of a hyped press conference or anything similar. That said, both the 7800GTX and X1800XT offer smashing performance. nVIDIA has always had an edge over ATI in OpenGL, so games like Doom 3, The Chronicles of Riddick: Escape from Butcher Bay and Splinter Cell: Chaos Theory, among others that use an OpenGL engine, offer better performance; the X1800XT, on the other hand, nails games like Far Cry, Serious Sam II and F.E.A.R., which rely heavily on a speedy core and a mammoth bus to boot. The 512-bit bus the X1800XT offers is the main reason for it being a 7800GTX killer: a 512-bit bus coupled with 512MB of 625MHz ultra-fast memory gives incredible performance. In any case, buying either of them means you'd be able to play games till 2007 with decent enough eye candy, but neither card is future-proof; put simply, as soon as DirectX 10 is introduced, these cards won't carry any more weight than a 9800 Pro and 5950 Ultra do today. For the time being, and seeing the availability issues, any enthusiast gamer in India should pick a Gainward CoolFX 7800GTX, which comes factory overclocked at 503/1.3GHz, or the PV-T70F-UND7 7800GTX from XFX, which is also factory overclocked to 490/1.3GHz.
@MAJOR-MINOR: The 7800GTX price you have mentioned is wrong; the PV-T70F-UNF7 7800GTX model sells for Rs.42,500, not Rs.31,000.


----------



## Major-Minor (Oct 26, 2005)

@Raj - Where did you get that rate of 42k? From the website? I actually called rptech and got the rates; I didn't pull the rates out of a hat. In any case, I would never rely on the Indian distributor websites for rates: according to the same rptech.com website, the XFX 6600GT 256MB version is available for Rs. 17.5k and the 128MB version for Rs. 15.2k, and we all know the 6600GT sells for nowhere near as much.

 XFX Cards at Rashi Peripherals

In any case this is what they have to say about the rates they have put up on the website  - 





> ( * : The Prices Mentioned are the Maximum Retail Price (Inclusive of all Taxes), kindly contact the nearest Rashi Branch Or Our Channel partners for more information on pricing/availability and delivery of the products.)


----------



## raj14 (Oct 26, 2005)

I didn't pull the rates out of a hat either, nor did I take them from their website; their officials told me that when I asked for a quote. In my city, of course.


----------



## blade_runner (Oct 26, 2005)

raj14 said:
			
		

> Although the nVIDIA 7800GTX is destroyed by the X1800XT, nVIDIA has still won this round: ATI's R520 was delayed too much to make an impact on the mainstream market, while the nVIDIA 7800GTX was globally launched the very same day it was unveiled, without much of a hyped press conference or anything similar. That said, both the 7800GTX and X1800XT offer smashing performance. nVIDIA has always had an edge over ATI in OpenGL, so games like Doom 3, The Chronicles of Riddick: Escape from Butcher Bay and Splinter Cell: Chaos Theory, among others that use an OpenGL engine, offer better performance; the X1800XT, on the other hand, nails games like Far Cry, Serious Sam II and F.E.A.R., which rely heavily on a speedy core and a mammoth bus to boot. The 512-bit bus the X1800XT offers is the main reason for it being a 7800GTX killer: a 512-bit bus coupled with 512MB of 625MHz ultra-fast memory gives incredible performance. In any case, buying either of them means you'd be able to play games till 2007 with decent enough eye candy, but neither card is future-proof; put simply, as soon as DirectX 10 is introduced, these cards won't carry any more weight than a 9800 Pro and 5950 Ultra do today. For the time being, and seeing the availability issues, any enthusiast gamer in India should pick a Gainward CoolFX 7800GTX, which comes factory overclocked at 503/1.3GHz, or the PV-T70F-UND7 7800GTX from XFX, which is also factory overclocked to 490/1.3GHz.
> @MAJOR-MINOR: The 7800GTX price you have mentioned is wrong; the PV-T70F-UNF7 7800GTX model sells for Rs.42,500, not Rs.31,000.



Heh, won this round?!! Just releasing high-end cards, which account for less than 1% of total card sales, is not considered winning the round! omg!
Btw, most OEM deals will be in ATI's hands, since they will come out with their mainstream cards before nVIDIA, so the ball is in ATi's court right now.
Plus, the memory ring bus is 512-bit, which should help it in bandwidth-heavy situations, but the memory interface itself is still 256-bit.
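To put those widths in perspective: peak bandwidth to the memory chips is set by the external 256-bit interface and the memory clock, not by the internal 512-bit ring bus. A back-of-the-envelope sketch (clock figures are the commonly cited ones for these cards, assumed rather than taken from this thread):

```python
def peak_bandwidth_gbps(bus_width_bits, memory_clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = memory_clock_mhz * 1e6 * transfers_per_clock  # GDDR3 is double data rate
    return bytes_per_transfer * transfers_per_sec / 1e9

# Commonly cited: X1800XT 750 MHz GDDR3, 7800 GTX 600 MHz GDDR3,
# both on a 256-bit external interface.
print(peak_bandwidth_gbps(256, 750))  # X1800XT: 48.0 GB/s
print(peak_bandwidth_gbps(256, 600))  # 7800 GTX: 38.4 GB/s
```

The ring bus helps move data around inside the chip, but the number above is the ceiling either card can pull from its memory.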

Also, Major-Minor is quite right about the 7800GTX price: a person I know got his 7800GTX for 31k + taxes, unlike the 42k you quoted. The Asus 7800GTX TOP is around 40k, but not worth the price imho.


----------



## raj14 (Oct 26, 2005)

From where exactly? I am getting mine for Rs.85,000.


----------



## Major-Minor (Oct 26, 2005)

@raj - 85k!!! Man, someone is seriously ripping you off. Which brand, btw? XFX, I presume.


----------



## raj14 (Oct 26, 2005)

I meant 85k for both; I am getting two 7800GTXs to run them in SLI.


----------



## blade_runner (Oct 26, 2005)

Raj, mate, 42k for a single 7800GTX is way too much! You are getting ripped off. Rashi is offering the XFX 7800GTX at 31k + taxes, while the BIG 7800GTX is 29k + taxes. Please cancel the order if possible, since you are paying 10k extra!!

Just in case you don't believe me, chk that out  
*www.techenclave.com/forums/7800gt-looks-very-good-vfm-10076.html


----------



## venkat1605 (Oct 28, 2005)

Hey guys, thanks for your great suggestions; I am enthralled by your response. This forum is the best, with great guys like all of you. My bro is an ATI employee and there is no way he would get me an nVIDIA GPU, so I am going to settle for the X1800XT (512 MB). I would be using it in Crossfire, so it would be 512 x 2 = 1024 MB. I have a Crossfire-enabled mobo. One more question: will it be okay to have 1 GB of GPU memory, or will it result in system instabilities? Anyway, as soon as the R580-based GPU is released I would jump to it. Thank you, all of you guys.


----------



## raj14 (Oct 28, 2005)

Certainly not, you won't have any sort of system instabilities; you would, however, notice INCREDIBLE performance, which I think isn't a problem  
Blade_runner: lemme check with Rashi, mate.


----------



## ammusk (Oct 28, 2005)

ding dong *forums.guru3d.com/showthread.php?s=&threadid=157257


----------



## raj14 (Oct 28, 2005)

Tada! I think he has already made his choice.......


----------



## blade_runner (Oct 28, 2005)

venkat1605 said:
			
		

> Hey guys, thanks for your great suggestions; I am enthralled by your response. This forum is the best, with great guys like all of you. My bro is an ATI employee and there is no way he would get me an nVIDIA GPU, so I am going to settle for the X1800XT (512 MB). I would be using it in Crossfire, so it would be 512 x 2 = 1024 MB. I have a Crossfire-enabled mobo. One more question: will it be okay to have 1 GB of GPU memory, or will it result in system instabilities? Anyway, as soon as the R580-based GPU is released I would jump to it. Thank you, all of you guys.



Don't forget a hefty PSU like the Antec TP 2.0 550W. And make sure the cabinet is top-notch in cooling and ventilation.
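One note on the memory arithmetic in Venkat's post: in Crossfire's alternate-frame-rendering mode each card renders complete frames from its own copy of the textures and geometry, so two 512 MB cards behave like a single 512 MB pool, not 1 GB. A rough sketch of that accounting (the function and numbers are illustrative, not a real API):

```python
def effective_vram_mb(per_card_mb, num_cards, mirrored=True):
    """VRAM usable by one frame's working set.

    With alternate-frame rendering, every card mirrors the full set of
    textures and geometry, so the pool does not grow with card count.
    """
    return per_card_mb if mirrored else per_card_mb * num_cards

print(effective_vram_mb(512, 2))  # 512, not 1024
```

There's no instability from pairing the cards; you simply don't get the doubled pool the naive 512 x 2 sum suggests.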


----------



## raj14 (Oct 29, 2005)

An Antec TruePower 550W EPS12V will do fine justice. As for the case, go with an Asus Vento 3600 or an Antec Performance Plus 1080AMG   I am getting the Vento; it looks super cool, especially the red one, but I'm getting the black version.


----------

