Play ON!

Updated on 01-Jan-2005
Another year is upon us; the time to upgrade your computer is at hand. We could think of nothing better to start the upgrade cycle than a new video card. This cycle will probably be the last time you need an AGP card. For one, you will definitely upgrade to a DirectX 9-based card, which will be good for at least two more years, and by then, PCI Express (PCIe) cards will be common and ready to buy!

Of course, PCI Express has already made an appearance in our market, but the technology is still finding its feet. It will take PCIe at least two more years to go mainstream from the niche it currently occupies. Another point to note is that games scheduled for release in the near future will not be able to take advantage of a PCIe solution; they will perform just as they would on an AGP card, which makes an investment in AGP now reasonably future-proof.

If you are a gamer who demands the latest and the greatest right now, you can opt for a high-end AGP card, which will serve you well and ensure that you do not need to change platforms for a while yet. However, if PCI Express is what you are looking for, you will need to upgrade your motherboard and CPU as well!

Another telling factor is that the upcoming Microsoft OS, code-named Windows Longhorn, will need a DirectX 9-compliant video card if you wish to appreciate its interface in all its glory. It has lots of eye candy that can only be experienced through a DX9 card.

Budget Video Cards
The budget segment consisted of eight cards, all based on the same chipset: nVidia's FX 5200. This category was created specifically for users who use their computers for general purposes, with a little gaming thrown in. Here is what these cards had to offer.

Their Features
Half the cards we received were of the XFX brand, and they covered a versatile range, from the absolute minimum in video memory to the maximum available in this category. Only two of the eight cards featured dual-monitor connectors. On the bright side, all the cards featured video-out, which can be used to connect a TV. And why would you connect a TV to your computer? Try watching a DivX movie on a TV, and you probably wouldn't ask that again.

The XFX Personal Cinema was the only card in this category to offer a TV Tuner chip on-board. This definitely gave it a plus, since you need not install a separate TV Tuner card for watching TV on your computer. However, that does not explain the Rs 11,000 price tag, since you can easily get a 9600XT at that price. That was about all that the cards in this category had to offer. The rest of the cards were all just vanilla, and expectedly so, since they are not priced high.

Their Performance
The performance of the cards across the board ranged from mediocre to excellent, given the price these cards retail at. The MSI and the XFX reigned in this category. Surprisingly, the XFX GeForce FX 5200 64 MB posted excellent results compared to most 128 MB cards, and in fact, beat both the models from XFX itself. The MSI TD-128 FX 5200 was another good performing card in this category.

The Gigabyte did relatively well, but not well enough to get ahead of the above three cards. The 5200s are crippled by design, going by their number of pixel pipelines and their core and memory speeds.

Most users will have to make do playing at a resolution of 800 x 600 using these cards.

We ran Half-Life 2 and Doom 3 as Tier 1 tests (see the box “How We Tested and Decided the Award Winners” on page 62 for details on Tier 1 and Tier 2 tests) on these cards, because these are the latest games available, and if the cards cannot play them even at low resolutions, they may not be worth investing in at all. Doom 3 is a real resource hog: all the cards posted unplayable frame-rates at 800 x 600, and only managed reasonable numbers at 640 x 480 (although 21 fps is still very jerky for a game). With Half-Life 2, the cards were more than happy and posted good frame-rates, with the XFX 64 MB giving 35 fps at 1,024 x 768. (These frame-rates were recorded after the DX9 fix was applied for these cards in the game. See the box “OEM Deals and How They Affect Your Gaming Experience” on page 78.)

A similar scenario was observed with Far Cry, with the in-game settings at Medium. The frame-rates at 800 x 600 were enough for any user to play the game smoothly. The XFX FX 5200 128 MB (non-TV Tuner) card was disappointing, posting some of the lowest frame-rates of the whole lot. In our performance tests, the best performers were the MSI and the XFX, with the Gigabyte following closely.

First-time entrant PNY gave good frame-rates, but could not hold its own against the above-mentioned cards.

Overall, in the budget category, the cards gave us the expected results, and we filtered out the best amongst them to arrive at the winners.

The Winners Are…
From the results of the performance tests, it was clear that the winners would come from the XFX stables, given the frame-rates they posted in each game. Agreed, the frame-rates in Doom 3 were nothing to write home about, but that does prove a point: if you want to play games like Doom 3, an FX 5200 is not the card for it, and you will definitely need to spend some more in the graphics department. On the other hand, with games such as Far Cry and HL2, the 5200 proves to be more than enough for a few hours of weekend gaming.

XFX GeForce FX 5200 (256 MB)

The XFX GeForce FX 5200 256 MB and the XFX GeForce FX 5200 64 MB were the overall winners in this category not only because of their price but also because they performed well. These two get the Digit Best Buy Gold Award and the Best Buy Silver Award respectively.


Gaming Category

Low End Video Cards
This category was created keeping in mind the falling prices of last year's mid-range cards. An ATi 9600XT can now be purchased for Rs 10,000 or less, depending on the manufacturer. With the arrival of more high-end cards, these cards have been pushed down into the low-end performance category, but are still not cheap enough to fit into the budget category. There were 15 cards in this category, and the fight was a tough one. Here is how they fared.

Their Features
Almost all the cards had similar features, with some offering dual DVI connectors. One thing that should be understood is that when you are paying more than Rs 10,000 for a video card, the criterion should always be performance rather than the software bundled along. Dual-monitor outputs were present on all these cards, as was a video-out port.

The games bundled with the cards are limited, and normally consist of demos of various games along with a few older titles.


Half-Life 2 on the XFX GeForce FX 5700 Ultra shows off extremely good image quality

In our comparison, both the Gigabyte cards included the same software bundle (a game and PowerDVD 5), yet one card was priced higher than the other.

This underlines our assertion that the bundled software is secondary to the performance of the card: in your buying decision, it should always be performance that counts, not the software or other extras the manufacturer throws in.

Their Performance
In our Tier 1 tests, the cards based on the ATi Radeon 9550 chipset suffered badly in Doom 3. Although they did post playable frame-rates at lower resolutions, the price they retail at does not justify the performance. In fact, Doom 3 was the overall stress test for all the cards we tested; it brought almost all the cards in the low-end category to their knees, including the 9600s and the 5700s. The 5600 chip from last year was missing this time, replaced by a new and improved 5700: the clock speeds have been cranked up, as have the memory speeds, with the Ultras showing the most improvement. Similarly, the 9600XT is the improved version of the 9600 PRO, with more memory and increased core and memory speeds.

We ran all the cards through the Tier 1 tests, and then filtered out four from each category to run the Tier 2 tests on. The criterion we applied was performance: the better the frame-rates, the higher a card's chances of going into the next tier. In our Tier 1 tests, the Gigabyte duo and the PowerColor and XFX cards posted some of the best results.

XFX GeForce FX 5700 Ultra (128 MB)

In the tests, we kept the in-game settings at Medium across all resolutions. At this setting, the PowerColor 9600XT was on fire, giving excellent frame-rates in all the games and beating the 5700 Ultra, which has a higher memory speed and a faster memory type. The cards that made it to the next level were the PowerColor 9600XT, the XFX 5700 Ultra, and the Gigabyte 5700 and 5700 LE.

Is PCI Express x16 Important?
PCI Express is the new technology that will invade consumer desktops starting this year.
Jumping from AGP 4X to AGP 8X did not provide any major performance leap in terms of frame-rates in games. For the current crop of games played on desktops worldwide, AGP 4X provides more than enough bandwidth; the limiting factor is how the system manages and moves data in memory, and the extra bandwidth offered by AGP 8X does little to improve this. Card manufacturers have improved the clock and memory speeds of the accelerator, but overall system performance is not in their hands. Therefore, if you pair a slow CPU with a fast card, games will still lag.
With PCIe x16, the available bandwidth has doubled over AGP 8X; however, the core issue remains. Right now, any game that runs comfortably on an AGP card will do the same on an x16 card. That said, the PCI Express solution offers better data transfer rates for other components on the motherboard as well, and the main reason companies are adopting PCI Express is that there is a large performance gain to be tapped in data transfer from one component to another.
As always, moving from AGP to PCIe will be more a transition brought about by market conditions than an option exercised by the end user. As soon as current motherboards start disappearing from the market and PCIe becomes prevalent, you will have to go for a PCIe solution. If you have an AMD Athlon 64 or a 3.2 GHz-plus Intel system with a good 4X or 8X AGP card, our suggestion would be to sit tight and not upgrade, since PCIe will definitely not make Doom 3 run at 200 fps at 1,600 x 1,200!
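As a rough back-of-the-envelope check of those bandwidth figures (using the standard specification numbers rather than anything we measured): AGP transfers about 266 MB/s per 'X', so AGP 8X works out to roughly 2.1 GB/s, shared between both directions. A PCIe x16 slot carries 16 lanes at about 250 MB/s each, or roughly 4 GB/s upstream and another 4 GB/s downstream at the same time, which is something AGP simply cannot do.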


In Tier 2, we cranked up the resolutions and the in-game settings. All settings were set to ‘High’, and the minimum resolution we started from was 1,024 x 768. This is where the XFX GeForce FX 5700 Ultra shone bright: with its higher memory speed and optimised pixel pipelines that handle two textures per pixel, it posted the best frame-rates in Tier 2.

The PowerColor Radeon 9600 just went limp here; in fact, it crashed twice, with visible artefacts at the higher resolutions, and posted a measly frame-rate of approximately 5 fps at 1,280 x 1,024 in Far Cry. On the other hand, the card did well with 4X AA on at 1,024 x 768 in both Half-Life 2 and Far Cry. Doom 3, however, was simply not its strength.

The XFX GeForce FX 5700 Ultra was just waiting to be let loose at the higher settings. Its ceiling in Doom 3 was similar to the others': with 4X AA at 1,024 x 768, cranking up the resolution any higher spelt ‘doom’ for the card, and the game became simply unplayable. However, the 5700 Ultra provided very good frame-rates in all the other games we tested, and can be considered for future games.

The Gigabyte 5700 was next in line to lay claim to the throne in this category, and posted good frame-rates. With the competition heating up, it was difficult for us to pick the runner-up to the prized Digit Best Buy award.

The Winners Are…
One thing you need to keep in mind is that the cards we tested in this category are only good enough to play games at resolutions up to 1,024 x 768. You may get playable frame-rates at higher resolutions with high in-game settings in some games, but if you don't, don't be surprised. You can rest assured, however, that these cards can handle all games at higher resolutions if the in-game eye candy is kept low.


Gigabyte GeForce FX 5700 (128 MB)

The Tier 2 test results immediately pointed to the first winner, the XFX 5700 Ultra 256 MB: on the strength of the frame-rates it posted, it was awarded the Digit Best Buy Gold Award.

The low performance of the PowerColor 9600XT at 1,280 x 1,024 was definitely puzzling, and we ran the tests not once but four times to double-check our results. The results remained the same. Nevertheless, the value for money that the PowerColor Radeon 9600XT offers is unmatched by any other card in this category.

If you are low on money but still want a decent card, this card will not disappoint. Next in line was the Gigabyte GeForce FX 5700 128 MB, which finished second and was awarded the Digit Best Buy Silver Award.

Gaming Category
Mid-Range Video Cards
The ‘Mid’ category of cards was the smallest in terms of numbers: just three. The mid-level cards this year consisted of the high-end cards from last year. There were two ATi cards, both 9800 PROs, from Club3D and PowerColor respectively, and one Gainward GeForce FX 5900 XT. With little to separate the ATis on features, it finally came down to performance.

Their Features
On the features front, the cards were plain vanilla, with not much to talk about. The regular bundle of old games and driver software, along with cables and a DVI adapter, formed the package for all three cards.

Their Performance
Clock speed is one thing that needs to be talked about here. Both 9800s were clocked at the same default 380 MHz; the 5900XT, though, was clocked higher in both core and memory speed. The pixel pipelines, however, are where the tables turned.

The 5900XT is more in the league of the 9600XT than the 9800 PRO. It has four pixel pipelines as standard, but can do pseudo eight-pipeline operations. In non-technical language, this means it renders four conventional (colour plus Z) pixels per clock, but can perform eight operations per clock for Z-only, stencil, texture and shader work, which leaves it partially crippled. It is also a toned-down version of the 5900 Ultra in terms of clock speed, and is placed just above the 5700 Ultra in terms of price, with a difference of around Rs 250. This card is also sold as the 5900 SE in some cases.
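
A quick worked example makes the difference clear (taking the quoted 380 MHz as a ballpark core clock for this class of card): a true eight-pipeline part such as the 9800 PRO can lay down about 8 x 380 MHz, or roughly 3,040 million colour pixels per second, while a four-pipeline 5900XT at a broadly similar clock manages only around half that. Its eight-operations-per-clock trick helps in Z- and stencil-heavy passes, not in ordinary colour rendering.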

In our Tier 1 tests, the Gainward 5900 XT blazed through the games. It gave more than acceptable frame-rates at high resolutions across all categories of games. The Club3D 9800 PRO 128 MB made it through to the Tier 2 tests along with the Gainward GeForce FX 5900XT.

Gainward GeForce 5900 XT

In the Tier 2 tests, where the in-game settings were turned up to High, the Gainward reigned supreme. It could not, however, come anywhere close to the 9800 in the Call of Duty tests, which left it biting the dust there. In all the other games, though, the ATi 9800 was second-best. Once again, we encountered the same problem in Far Cry at 1,280 x 1,024, where the card posted a measly frame-rate of 5 fps with visible artefacts. This did the card in all the more, and the 5900XT came through to win this contest.

The Winner Is…
The 5900XT is a strange species. With its pseudo eight-pixel pipeline processing, it could have fit into the ‘Low’ category, and would have beaten the daylights out of all the other cards including the 5700 Ultra.

On the other hand, the pseudo processing feature also made it a candidate for the mid-level category and that’s where it went in the end. But the performance that it offered was more than satisfying and it unquestionably gives you more bang for your buck. The Gainward 5900XT 128 MB is undoubtedly the winner of the Digit Best Buy Gold Award in this category.

Gaming Category
High-End Video Cards
In the comparison we carried out last year, the 9800 PRO and the 5900 Ultra were the two top-of-the-line cards from ATi and nVidia respectively. Both cards had an 8-pixel-pipeline architecture and carried 256 MB of DDR RAM.

The 5900XT gives you the most bang for your buck


Cut to the present, and we have the X800XT PE and the 6800 Ultra from the same manufacturers. Core clock speeds have gone higher, along with an increase in memory speeds over the previous generation and a change in the type of memory used. The biggest addition, though, has been the doubling of the pixel pipelines: from 8, the count has gone up to 16.

This means that with each clock cycle, the GPU will be able to handle twice the number of pixels and render them on-screen. Given the increased clock and memory frequency of the GPU, a card can chomp through a mind-boggling number of pixels.

The current generation of ATi cards gives an increase of more than 50 per cent over the previous generation in terms of memory bandwidth, while the fill rate (Mtexels) has actually doubled.
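
To put rough numbers to that fill-rate claim (using the commonly quoted reference clocks, which vary slightly from board to board): last year's 8-pipeline 9800 PRO at about 380 MHz delivers roughly 8 x 380, or around 3,040 Mtexels/s, whereas the 16-pipeline X800 XT PE at around 520 MHz works out to about 16 x 520, or some 8,320 Mtexels/s, comfortably more than double.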

SLI has also reappeared on the horizon, and some privileged users actually have a system running SLI. Going by the power that the GPU is being blessed with, it will not come as a big surprise to one day see a video card being compared with the PC processor!

This particular category featured the cream of the current crop: from the X800 to the 6800GT, the top-of-the-line, most wanted desktop graphics cards on the planet! It's a pity the 6800 Ultra and the Ultra Extreme could not make it to the comparison in time, but the 6800GT and the X800XT PE crowned the high-end category of the comparison.

Their Features
The privilege that high-end cards enjoy is money: for a user who wants to invest in such cards, money is never a concern, and performance is what counts. From the bundles of software that come with these cards, it is more than evident that the manufacturers could not agree more.

The games and software bundled with each card are worth at least Rs 1,500, not to mention the extra cables and connectors.

OEM Deals And How They Affect Your Gaming Experience: Doom 3 & Half-Life 2
Two of the most awaited games this year were Doom 3 and Half-Life 2, which we used for our benchmarks. Most gamers around the planet swear by the lighting effects used in Doom 3 and the water detail in HL2. However, are you sure that the card you have will give you the best visual experience when you play these games?
Doom 3 is by far one of the most taxing games when played at the High settings. Both nVidia and ATi cards give comfortable frame-rates at lower resolutions, but as you crank up the resolution, the nVidia cards zoom past the top-end ATis, giving a performance advantage of more than 30 per cent. So does that mean ATi cards cannot play Doom 3 well? There is some truth to this. ATi's drivers have a comparatively weak OpenGL implementation vis-à-vis nVidia's. Another reason is that nVidia's 6-series architecture is optimised precisely for games such as Doom 3, since nVidia worked closely with the id team during the development of the game. It is clear from this that any game based on the Doom 3 engine will most probably favour nVidia cards in the future.
On the other hand, there is Half-Life 2. Like most games, when you start it, it auto-detects the video card and picks settings that give you the best visual quality for the card you have. Fair enough, but if you check the Advanced settings in the Video menu, you may be in for a surprise. With nVidia's 5-series video cards, the DirectX level is set to 8.1 by default in the game, but with the ATi 9-series, the default DirectX level chosen is 9. Surprised? We were too. After some research we got the real deal. Valve designed the game to use 24-bit floating-point instructions throughout. ATi's hardware has 16-bit and 24-bit floating-point registers, while the nVidia 5-series has 16- and 32-bit registers, but no 24-bit ones. Since 16-bit is partial precision, it can be used in both DX9 and DX 8.1 implementations. As the 24-bit registers are absent on the nVidia card, the game checks for them, doesn't find them, and automatically falls back to DirectX 8.1 instead of DirectX 9! However, there is a way to run the 5-series cards in DirectX 9 in HL2: open the console in HL2 and type in “mat_dxlevel 90” without the quotes. This will make the card run the game in DirectX 9 at all times.
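For those who want to try this, here is a minimal walk-through (assuming a retail copy of HL2; exact menu paths and console variable names can shift between Source engine updates): enable the developer console under Options > Keyboard > Advanced, or add -console to the game's launch options, bring up the console with the tilde (~) key, and type mat_dxlevel 90. The renderer restarts in DirectX 9 mode, and typing mat_dxlevel on its own reports the level currently in use.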
Oh, and before we forget, ATi has an OEM deal with Valve for Half-Life 2!


XFX bundled three games with their 6800 GT, while Asus, as usual, bundled three games, a complete CD of game demos, DVD software, drivers, and their proprietary Media Show software. PowerColor bundled the complete version of Hitman: Contracts with their X800 cards, along with a multi-utility bag that is a dream for any serious computer user, geek and gamer alike. Gainward was disappointing: there wasn't much to their package, a poor show in the features department.

Need for Speed Underground on the Gainward GeForce 6800 GT shows off excellent image quality

The rest of the cards also came with commendable packages, but none could match the Asus. Most of the cards had dual DVI connectors on the back panel, for which requisite cables were provided in the package.

The Gainward card looks humongous, with a dual-fan cooling design, and covers up the adjacent PCI slot. This card sucks in air from inside the cabinet and circulates it to efficiently cool the card. But for that, you need a good case with more than enough cooling.

Add the RAID setup that many gamers would have, and the heat generated inside the system would probably be akin to that of a little furnace.

On the other hand, the XFX 6800 GT is a work of art. Single slot, no fuss, noiseless and sleek-looking. The top of the card is covered by a sleek metal plate, which, if you look from above, partitions the motherboard into two perfect halves!

Their Performance
The performance of the high-end cards was the highlight of the comparison. The Tier 1 tests were child’s play for these cards. In the Far Cry test, the XFX 6800GT trounced the other cards with sheer authority.

It posted a record 138.57 fps at 1,024 x 768 with the game settings set to Medium. Doom 3 and Half-Life 2 fared similarly on this card, and we wondered if the manufacturer had hidden a steroid shot somewhere in the card’s heat sink! The X800 PRO is relatively underpowered because of its 12-pixel pipeline architecture compared to the 16-pixel pipeline architecture of the 6800GT.

However, that does not mean that the card in any way lacks spunk. In 3DMark, the card posted the highest scores amongst all the other cards, even ahead of the X800XT PE.

Moving on from the Tier 1 tests, we had four distinct winners. These were the Gainward 6800 GT 256 MB, the Gainward 6800 256 MB, the XFX 6800GT 256 MB and the PowerColor X800 PRO.

SLI Technology: What Is It?
SLI is an acronym for Scalable Link Interface; the idea harks back to 3dfx, whose Voodoo cards used a similar scheme (there, SLI stood for Scan-Line Interleave). Now that the company and its cards are history (nVidia bought what was left of the firm), the technology has been improved upon, and a spanking new version is out from nVidia.
In the current scenario, an SLI configuration needs two identical NV45-based PCI Express cards running on a motherboard with two x16 slots. The two cards are linked using a small bridge-type PCB that connects them from above. The card connected to the monitor becomes the master, and the other card becomes the slave.
The technology aims to give the user an increase in graphics performance by using both GPUs to render each frame at the same time. The workload is automatically load-balanced by the driver, dynamically for every frame, and the load on each GPU is varied accordingly. In reviews by Web sites that had the privilege of laying their hands on an SLI system, the increase is around 30 to 60 per cent at 1,600 x 1,200 with 4X AA and 8X AF in Far Cry, depending on the scene being rendered. That is a difference of more than 20 frames in any given level!
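As a quick illustration of what that means in practice (the baseline here is an assumed round number, not a measured result): a single card averaging 40 fps at 1,600 x 1,200 with 4X AA would, with a 50 per cent gain from SLI, end up at around 60 fps, which is exactly the sort of 20-frame jump being reported.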
Alienware, a PC major in the US, has developed a technology called the Alienware X2 Video Array. The difference here is that instead of just nVidia video cards, the Alienware array will let you connect any two identical PCI Express video cards (ATi X850s, for example). Alienware uses its own proprietary software and a proprietary merger hub that connects both the video cards; this hub synchronises the signals so that one scene is collectively rendered by the two GPUs. Let's take a look at the cost-to-performance ratio of an SLI rig in the present scenario. The performance on offer is about 1 to 1.5 times what a single card of the same type delivers. However, the cost increases sharply, since you need two top-of-the-line 6800GTs or Ultras to implement this solution, and no, you cannot mix and match! Both cards must be identical in all respects, including memory, and manufactured by the same company.
You will also need an efficient cooling system to dissipate the heat such a setup generates inside the cabinet, not to mention an SMPS of more than 500 Watts to feed both cards. SLI implementations haven't taken off in a big way because of various issues, including buggy drivers. Nevertheless, once these issues are ironed out, SLI may turn out to be the wave of the future!


In our Tier 2 tests, the in-game settings were turned up to High as usual. Before we get into the results, let's talk a little about the textures in Doom 3. Any gamer who has played Doom 3 will agree that it has fantastic lighting and excellent textures. At Low quality, everything is toned down, lighting and textures alike: all the maps are compressed, textures are fixed at 512 x 512, and specular maps at 64 x 64. Anyone with a decent 64 MB video card can play in the Low quality mode with relatively good frame-rates.

But as we move to the High settings, the specular and diffuse maps are compressed using DXTC 1, 3 and 5, while the normal maps are left uncompressed. In addition, the texture sizes are no longer fixed.

This puts a major load on the GPU, and the faster GDDR3 RAM on the high-end cards helps here, since it lets the card load textures more quickly.
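
Some simple arithmetic shows why compression matters so much: a single 1,024 x 1,024 texture stored at 32 bits per pixel occupies 4 MB uncompressed, but only about 0.5 MB under DXT1 or 1 MB under DXT5 (roughly 8:1 and 4:1 compression respectively). Multiply that across the hundreds of diffuse, specular and normal maps in a level, and it is easy to see how an uncompressed texture set can overwhelm even a 256 MB card.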

In the Ultra quality mode, everything is uncompressed, and here the texture data can go above 500 MB, making it suitable for the video cards of the future that will feature 512 MB of video RAM. We did try the Ultra quality mode on the XFX 6800GT and the Gainward 6800GT.

What we saw was utterly unbelievable! No game has ever looked so good. Everything seemed to jump out of the screen, and given the sound effects, it was truly a visual and aural treat to experience.

Half-Life 2 is relatively less hungry for resources, so even when you move up to High quality, the frame-rates dip, but not massively. The game is more about the physics and the engine than about heavy textures. One place the game stands out is the water: it is amazing, and this is where the card's pixel shaders have to work. The shader-intensive water was quite a match for even the best video card we had.

In Far Cry, though, effects abound; you name it, and the game has it, all in the cause of looking good. In the Research timedemo, you can examine the detail of the textures (again uncompressed) with utmost ease, along with the light that reflects off them. When the action heats up, you can feel the game lag with all the settings set to Very High.

Difference Between Shader Model 2.0 And 3.0
The new nVidia 6-series supports Shader Model 3.0. What is Shader Model 3.0? It is an improvement over the older Shader Model 2.0 available in the last generation of cards, the 5-series from nVidia and the 9500 series and above from ATi. In fact, ATi still hasn't moved to Shader Model 3.0 in its current crop of cards, the X800 series.
Technically, Shader Model 3.0 offers more in every department than Shader Model 2.0.
Given below is a brief table of the improvements of SM3.0 over SM2.0.
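Broadly, going by the baseline DirectX 9 shader model requirements rather than any particular card's extensions, the headline differences are:
Pixel shader instruction slots: 96 in SM 2.0, 512 or more in SM 3.0
Dynamic branching and looping in shaders: not available in SM 2.0, supported in SM 3.0
Vertex texture fetch: not available in SM 2.0, required in SM 3.0
Minimum full precision in the pixel shader: FP24 in SM 2.0, FP32 in SM 3.0
Geometry instancing: not part of SM 2.0, supported with SM 3.0 hardware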
We can see that there is hardly any image-quality improvement from SM 2.0 to SM 3.0. For now, it seems that SM 3.0 will provide more of a performance gain in games than a gain in actual image quality. The difference can only be measured properly once new games emerge that implement Shader Model 3.0 instructions from scratch.


With this said, let us see how our cards fared. The PowerColor X800 PRO stole the show in 3DMark05, but those tests were run using Pixel and Vertex Shader 2.0; the Gainward and the XFX 6800 GTs used Shader Model 3.0, as that was the default selection in the games. Thus, for future games that utilise PS and VS 3.0, the X800 will not be as suitable a proposition. That showing, however, was not enough to get past the Gainward. In our Doom 3 tests, the X800 PRO suffered, but regained its composure in the other tests. In Far Cry and Half-Life 2, the Gainward reigned supreme. We also tested Need For Speed Underground 2, to take image-quality shots, along with Doom 3 and Half-Life 2. The Gainward GeForce 6800 GT kicked butt, posted extremely good figures across all the tests, and was the ultimate winner in our high-end category.

The Winners Are…
To say the competition was tough would be an understatement; it was neck and neck all the way. The XFX 6800GT performed commendably, but it could never quite match the Gainward in terms of either price or performance, and bowed out with grace.

Gainward GeForce 6800GT

The Gainward 6800GT 256 MB video card is one card that anybody would want to possess. It not only gave terrific frame-rates, but also hit the sweet spot with its price. It is unquestionably the winner of our Performance High Category and gets the Digit Best Buy Gold Award.

PowerColor Radeon X800 PRO

The X800 PRO gave good results, though they were just not enough for the first spot. Nevertheless, it did make it to second spot, and the PowerColor Radeon X800 PRO gets the Digit Best Buy Silver Award for the high-end AGP category.


Team Digit

Team Digit is made up of some of the most experienced and geekiest technology editors in India!
