
By Nick Stam, posted on Nov 19, 2010 at 12:00 PM in Gaming

Testing NVIDIA vs. AMD Image Quality

PC gaming enthusiasts understand that image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality is important, and if it were not, we’d all be playing at 10×7 with no AA!

Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings uncovered recently by top technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in AMD’s Catalyst 10.10 default driver settings caused an increase in performance and a decrease in image quality. These changes in AMD’s default settings do not permit a fair apples-to-apples comparison to NVIDIA default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.

What Editors Discovered
Getting directly to the point, the major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default “Quality” setting, in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. According to ComputerBase, AMD obtains up to a 10% performance advantage by lowering its default texture filtering quality.

AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High” rather than the default “Quality” mode, and would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).

Filter Tester Observations
Readers can observe AMD GPU texture shimmering very visibly in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its “ground2” texture (located in the Program Files/3DCenter Filter Tester/Textures directory), and texture movement parameters were set to -0.7 in both X and Y directions with 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application, where the GPU under test is on the left side, and the “perfect” software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended). NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, thus the “texture movement” settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver also has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.
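
For readers who want to quantify what the videos show rather than eyeball it, the sketch below illustrates one simple way to measure temporal aliasing: capture a series of frames from each half of the Filter Tester split screen and compare the average frame-to-frame pixel change. This is purely an illustrative Python sketch, not a feature of Filter Tester or of either vendor's driver; the frame file names are hypothetical, and a higher delta on the GPU side relative to the ALU reference is only a rough proxy for visible shimmer.

# Illustrative sketch (not part of Filter Tester): quantify texture "shimmer"
# by measuring frame-to-frame pixel change in a capture of the moving-texture
# test. Stable, well-filtered output changes smoothly; under-filtered output
# flickers, which shows up as a larger mean frame delta.
# Assumes frames were captured to files like frame_000.png, frame_001.png, ...
# (hypothetical names).

import glob
import numpy as np
from PIL import Image

def mean_frame_delta(pattern="frame_*.png"):
    frames = [np.asarray(Image.open(p).convert("L"), dtype=np.float32)
              for p in sorted(glob.glob(pattern))]
    if len(frames) < 2:
        raise ValueError("need at least two captured frames")
    # Average absolute per-pixel change between consecutive frames.
    deltas = [np.abs(b - a).mean() for a, b in zip(frames, frames[1:])]
    return float(np.mean(deltas))

# Compare captures of the GPU half and the reference ALU half of the split screen:
# a noticeably higher delta on the GPU side suggests temporal aliasing (shimmer).
print("GPU side delta:", mean_frame_delta("gpu_frame_*.png"))
print("ALU reference delta:", mean_frame_delta("alu_frame_*.png"))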

AF Tester Observations
ComputerBase also says that AMD drivers appear to treat games differently than the popular “AF Tester” (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.

NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, such as those used by the AF Tester tool, and enable the optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example, with DX10 and 68xx boards, they appear to disable optimizations for window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than on RV870, with optimizations performed across a larger range of LODs for the RV840/940.
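
To make the reported heuristic concrete, here is a rough paraphrase of that behavior in Python. It is not AMD driver code, just the decision logic as described above, using the approximate per-API thresholds quoted in the text; the function name and structure are ours.

# Rough paraphrase of the reported behavior -- NOT actual driver code.
# Filtering optimizations are said to be enabled for "large" windows (real
# games) and disabled for "small" windows (test apps), with the threshold
# depending on the API. Thresholds are the approximate values quoted above
# for 68xx boards.

def optimizations_enabled(api: str, window_width: int, window_height: int) -> bool:
    # Approximate per-API thresholds, in pixels per side.
    threshold = {"DX10": 500, "DX9": 1000}.get(api, 500)
    # Small windows (e.g. the AF Tester) fall below the threshold, so the
    # driver reportedly falls back to full-quality filtering there.
    return window_width >= threshold and window_height >= threshold

print(optimizations_enabled("DX9", 512, 512))     # False -> test tool sees full quality
print(optimizations_enabled("DX10", 1920, 1080))  # True  -> game sees optimized filtering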

FP16 Render Observations
In addition to the recent findings above, for months AMD had been performing a background optimization for certain DX9 applications in which FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When this was recently exposed publicly, AMD finally provided a user-visible control panel setting to enable or disable the behavior, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed: Shift or Dawn of War 2 should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
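
Some back-of-the-envelope arithmetic shows why the demotion is attractive from a performance standpoint: an FP16 (R16G16B16A16_FLOAT) render target costs 8 bytes per pixel, while R11G11B10_FLOAT costs 4, halving the memory footprint and bandwidth of the target while also cutting mantissa precision. The short Python sketch below simply works through those numbers for a hypothetical 1920×1200 render target; it illustrates the size difference only, not any particular game or driver.

# Back-of-the-envelope sketch of why FP16 demotion helps performance: an
# RGBA16F target stores 4 x 16-bit floats per pixel, while R11G11B10F packs
# three reduced-precision floats into 32 bits (and drops alpha), halving the
# memory and bandwidth cost of the render target.

def target_size_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

w, h = 1920, 1200
fp16_rgba = target_size_bytes(w, h, 64)  # R16G16B16A16_FLOAT
r11g11b10 = target_size_bytes(w, h, 32)  # R11G11B10_FLOAT

print(f"FP16 RGBA target: {fp16_rgba / 2**20:.1f} MiB")
print(f"R11G11B10 target: {r11g11b10 / 2**20:.1f} MiB")
# The smaller format also has fewer mantissa bits (6/6/5 vs 10 for FP16),
# which is where the accuracy loss mentioned above comes from.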

A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons when certain GeForce FX and 3DMark03 optimizations went bad, and vowed never again to perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance but did not alter IQ was a valid “optimization”, while any optimization that improved performance but lowered IQ without letting the user know was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.

Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences — NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.

We are glad that multiple top tech sites have published their comparative IQ findings. If we published such information on our own, without third-party validation, much of the review and technical community might simply ignore it. A key goal of this blog post is not to point out cheats or “false optimizations” in our competitor’s drivers. Rather, it is to get everyone to take a closer look at AMD’s image quality in games and to test our products fairly against AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as you will not get image quality results that correspond with game behavior.

AMD promotes “no compromise” enthusiast graphics, but it seems multiple reviewers beg to differ.

We have had internal discussions about whether we should abandon our policy of never reducing image quality behind your back, as AMD is doing. We believe our customers would rather we focus our resources on maximizing performance and providing an awesome, immersive gaming experience without compromising image quality than engage in a race to the IQ gutter with AMD.

We’re interested to know what you think here in the comments or on the NVIDIA forums.

Comments (147)

Diceman on November 23, 2010 at 9:58 pm

Not only are these high-profile sites, these were also some of the places getting Fermi information well before other sites.

Diceman on November 23, 2010 at 10:00 pm

NVIDIA still holds the highest % of dedicated graphics cards, professional and gamer combined.
There’s a reason professionals avoid FireGL…

Ivica on November 23, 2010 at 11:47 pm

For me, nVidia is like a German car, and AMD like a Japanese one. Sometimes Japanese cars look nice and run faster, but in the end, class, performance and stability are not in Toyota but in BMW :)

Spede on November 24, 2010 at 12:26 am

This is the only thing you Nvidia PR suckers can do. I bet you’ve never done any “optimizations” yourselves :rollseyes:. At least AMD is keeping their reputation respectable, unlike you with your Batman AA lock, Assassin’s Creed DX10.1 support removal and numerous other things you’ve done. Intel is like a model citizen compared to you.

Spede on November 24, 2010 at 12:42 am

What is really funny is that you guys complain about AMD’s AF quality when they have angle-independent filtering and you don’t. Yours looks like a stop sign.

Diceman on November 24, 2010 at 3:13 am

of course you cannot disable what doesn’t exist.

Diceman on November 24, 2010 at 3:21 am

because angle independent AF really helped the 5870 not have crappy AF…. http://img402.imageshack.us/img402/4872/d3daftester13a201002051r.png oh, i guess it didn’t help at all.
i really don’t like brilinear.

nikola on November 24, 2010 at 12:08 pm

nvidia > amd

Trajan Long on November 24, 2010 at 3:42 pm

ATI and their idiot fanbois whine like scalded monkeys! Super!

Wayland on November 24, 2010 at 6:32 pm

Nvidia NEEDS to FIX the OpenGL problems with GTX480 cards! They are slower than the older cards, and I purchased the high-end GTX480 because it was supposed to be faster… it’s NOT. Many people are finding the same issues… PLEASE GUYS FIX THIS PROBLEM! I am trying to use this card for DVE use with ivsEdits. Thanks!

Diceman on November 24, 2010 at 9:01 pm

“Many people are finding the same issues..”
Depends what you use the card for; it’s no secret that if you are using a GeForce card for work that should be done on a Quadro, things are slower…

anonymouse on November 25, 2010 at 2:32 am

When you guys play games this low, you must be scared of something.
Watch out, the 6970 is coming soon.
And then the 6990 will own you early next year!

havok/ opencl/ 78°Celcius/ XFX on November 25, 2010 at 3:24 am

Hey NVIDIA, you are going to suffer until the DirectX 13 launch (November 2014).

Mar on November 25, 2010 at 6:19 am

Well, I’ve got a similar experience with Nvidia cards. I got an 8600M GT in my laptop and it’s hot enough to make toddy by placing a small glass of rum right next to the fan exhaust. Fortunately the piece is still running, but for how long?
I used a 6800 Ultra-card in my previous desktop build and after half a year it started showing graphical artifacts when running Need 4 Speed. Mind you the case was well ventilated.
A year ago I built a new computer using a Radeon 5850 and it’s never ever given me any problems of this kind. This is my own scenario and probably differs a lot from other users’ experience. What it boils down to is this: as long as I’ve been using Nvidia-cards, I’ve had issues with overheating and graphical artifacts. When I switched to ATI/AMD those issues went away.
To top it off, you guys (Nvidia) made fools of yourselves with the whole Fermi launch, and now you’re trying to get a cheap shot at the competition by posting slander? I understand competition is fierce, but what about making “honesty” and “commitment” part of your repertoire instead of simply flexing your muscles and going “RAARGH, BIGGEST GPU WINS!”…

Shibu on November 25, 2010 at 12:35 pm

I was wondering why you guys were late to find this out. I have felt this difference from the time the 5000 series launched. Even on review sites (Guru3D, AnandTech, etc.), when they post screenshots of games, there is a glaring difference in image quality. Whatever it may be, NVIDIA’s graphics cards had the better image quality overall. And as Joe Eklund said, NVIDIAs drivers are the best and no nonsense.

rudy on November 25, 2010 at 9:14 pm

nvidia is very pathetic….

TwilightVampire on November 25, 2010 at 10:15 pm

Remember during the GeForce FX days, you guys were guilty of this too. In my opinion, nVidia has no room to talk on this subject.
Most PC gaming enthusiasts already re-adjust IQ settings to their liking anyway. Defaults mean little to nothing. And official benchmarking is done with the lowest IQ settings for every FPS or 3DMark point possible. Just like it was a moot point in 2003 when nVidia was guilty of this, it’s still a moot point today when AMD is guilty of it.

Play3r on November 26, 2010 at 12:36 am

Has this been tested with the older 5000 or even 4000 series, or only with the 6000 series? Testing on two very recently released graphics cards could just mean they are ironing out the bugs in the drivers. It happens for both sides every time they release something new; there are always driver issues. And do the test with the 10.11 and see.

connta on November 26, 2010 at 5:27 am

Great, I have slightly worse image quality (which I can correct in driver settings, and guess what, Quality and High Quality make a 2-5 fps difference and no real IQ difference as far as my eyes are capable of seeing)…
On the other hand, I do not have a power-guzzling card and heat nightmare; my southbridge used to overheat from nvidia cards… yeah, the SB overheats and my comp crashes. Can you set that to “cooler” in driver settings (since I can set high quality) without making my comp sound like a jet? No sir. That is why you fail; a complete, “rounded” product is important. Very important.
And all these sites are German. How come no one else found and published this? I don’t see anything on so many other respectable sites… for one, guru3d.com said that there is no real difference in IQ, and you know what, moving from a GTX 260 to a 6870 I didn’t see one either. A bit fishy, my kind sir…

Diceman on November 26, 2010 at 7:41 pm

Guru3D is well known for sugar-coating reviews because Hilb3ert likes to keep receiving review samples.
Hilbert’s review only took still shots, and the problem is not visible in stills.

Diceman on November 26, 2010 at 7:42 pm

don’t post if you didn’t read.

Diceman on November 26, 2010 at 7:43 pm

i’ll stop you right there.
the nvidia cg compiler did use a lower precision in 3dmark, but there was NEVER a glaringly obvious decrease in quality.

Diceman on November 26, 2010 at 7:48 pm

I’ve tested from Release 257 to CUDA Development 263.06, all of these set the supersampling – alpha test setting that does not benefit from a higher FSMSAA mode.
I was contemplating moving to a GTX 500 series card eventually but i didn’t want part of the reason to be getting TrMSAA back without needing 3rd party tools xD

Diceman on November 27, 2010 at 7:09 am

It would be libel, not slander. And it’s not libel anyway, because it’s verifiably correct.

Diceman on November 27, 2010 at 7:11 am

Also, gj with your 5800 series, with its grey screens and abhorrent driver resets due to improper per-vendor implementation.
I can really see AMD holds the reins over non-reference designs.

boo@boo.net on November 27, 2010 at 12:16 pm

Pot calling kettle black eh Nvidia? You have done the same thing yourselves so many times that it is not even funny. Less QQ please and more focus on making good cards.

neliz on November 28, 2010 at 12:03 am

NVIDIA caught cheating in HAWX 2 benchmarks:
http://www.kitguru.net/components/graphic-cards/jules/will-gtx570-be-caught-in-new-app-detection-cheat/
AA IQ is lowered to raise the performance. Dear NVIDIA, maybe you should focus on making better products instead of trying to paint the competition black?

Jon on November 28, 2010 at 5:52 am

Please stop posting, you make yourself look like a tool.
“In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.”

Malte on November 28, 2010 at 6:26 am

Unbelievable.
I’m happy that I bought a GTX460 from Gigabyte instead of an HD6850 or HD6870 …
Regards from Germany

Psycoz on December 1, 2010 at 11:33 pm

ahahahaaa!!!
http://www.chiphell.com/forum-viewthread-tid-143712-from-portal.html
C.O.

Psycoz on December 2, 2010 at 8:55 am

Guru3D strikes back!
http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/

:)

Nick Stam @ NVIDIA on December 2, 2010 at 11:54 am

Hi Psycoz.
Please see this story where I explain the HawX issue:
http://www.kitguru.net/components/graphic-cards/carl/nvidias-nick-stam-addresses-hawx-cheating-allegations

Also, Hilbert at Guru3D showed the perf gains that we’re concerned about from AMD’s stepping down default IQ. Diceman is right that it’s most obvious in motion, not still shots. Even though Hilbert doesn’t think the actual default IQ degradation is a real big deal IQ-wise (we disagree, given all the people who used to argue that we needed to improve our default IQ and not have any texture shimmer, etc., and now all of a sudden it’s acceptable to the community for AMD default settings to have texture shimmer?), he does say this in conclusion:

“We urge and recommend AMD/ATI to disable the optimization at default in future driver releases and deal with the performance loss, as in the end everything is about objectivity and when you loose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm then good. The drop in 3-4 FPS on average is much more acceptable then getting a reputation of being a company that compromises on image quality. And sure it raises other questions, does ATI compromise on other things as well ? See, the cost already outweigh the benefits. So the moral right thing to do for AMD/ATI is to make the High Quality setting the standard default. But again here we have to acknowledge that it remains a hard to recognize and detect series of optimizations, but it is there and it can be detected.”

Monkey Boy on December 3, 2010 at 12:03 am

What? You guys were caught deliberately cheating in the past. It was blatantly obvious! Whereas in AMD’s case, not so much. In fact, in the linked Guru3d article, the writer even states that they are incapable of discerning a real difference between the two IQ levels.

Monkey Boy on December 3, 2010 at 12:07 am

Also, why are some people saying they have to change the IQ for Nvidia cards with the recent driver releases? Here, I’ll provide a quote and exact link.

“I just sold my 5870 couple weeks ago for a 580 GTX.

Yes its true, I had to set my options from Quality to High Quality in the ATI drivers….this is something I have know since the 1900xt days….

But since I just installed my 580 GTX. I ALSO HAD TO SET IT FROM QUALITY TO HIGH QUALITY IN THE NVIDIA DRIVERS.

Why isnt this the article as well?

Seemed Biased to me.

And this is coming from someone who just had an AMD product to Nvidia… ”

http://forums.guru3d.com/showthread.php?t=333646&page=5

bore_u2_death on December 3, 2010 at 2:49 am

Directly quoted from the article published on http://www.guru3d.com comparing AMD and nVIDIA image quality.

“The optimization however allows ATI to gain 6% up to 10% performance at very little image quality cost. And I choose the words ‘very little’ here very wisely. The bigger issue and topic at hand here is no-matter what, image quality should be 100% similar in-between the graphics card vendors for objective reasons.

So what does that optimization look like in real world gaming. Well it’s really hard to find actually.

We seriously had a hard time finding an application where the optimizations show well. So mind you, the above example is catered to show the image quality anomaly.

This is Mass effect, 16xAF and Trinlinear filtering enabled in the configuration file. We are in the spaceship and have positioned ourselves with a surface area where the optimization should show really well. Yet we have a more complex scene with nice textures and lots of colors, much less bland then the simple example shown previously. This time just a Radeon HD 6850 with the optimization on and off and a GeForce GTX 580.

We have a hard time spotting differences as much as you do, and while making the screenshots we increased gamma settings to 50% and applied a resolution of 2560×1600 to try it look more visible.

Do you spot the difference ? Probably not, that is the rule we life by here at Guru3D, if you can not see it without blowing up the image or altering gamma settings and what not, it’s not a cheat. And sure, we know .. this game title is not a perfect example, it however is a good real world example.”

The ultimate indicator of image quality is that experienced by the end user while using an AMD or nVIDIA graphics card. If the image quality difference is not discernible during normal use, then Catalyst 10.11 is not harmful to image quality.

Just like the quality settings in the nVIDIA Control Panel, the Radeon HD 5000/6000 series GPUs have their Texture Filtering setting at the Quality setting by default. The end user can adjust this to his or her liking at any time. AMD has optimized performance for its most current GPUs at no expense to image quality, and has made its products more competitive at no loss.

Most importantly, if you have to deliberately expose a slight difference in IQ by using Mass Effect, a game from over three years ago, increasing the resolution to 1600p, artificially boosting the brightness beyond what a normal player would use, and then magnifying the pixels by zooming in Photoshop, then the image quality loss experienced in that scenario is INVALID.

The claims nVIDIA makes in this article are groundless and invalid in some cases.

psycho_biscuit on December 3, 2010 at 2:11 pm

Groundless? I don’t think I’d call them that. It may not be much of a difference, but it is still a difference. When I’m playing a game and see something like that happening, I’m immediately drawn out of the immersion. The fact they purposefully do this is kind of insulting.

You may need to set NVIDIA’s card from quality to high quality as well; but does their ‘quality’ setting skimp out on anything? I like the idea of having a default setting that is not the highest a card can perform, if you don’t want it to be running as hard. Of course most people will turn it up if they’re needing more fps.

However, if ATI’s cards have less quality at the default setting, what does that say about their ‘high quality’ setting? Do they try and skimp out on IQ there as well, to say their high quality setting runs faster? Losing trust is extremely detrimental in the eyes of the consumer. This, along with all the crashing of games I have with my ATI card currently because of known issues with certain game engines (even with correct drivers) mean that I won’t be buying from them anymore.

I’ve never had a problem with an NVIDIA card. The only ATI card I’ve ever owned has frustrated me to no end. If they’re skimping out on quality, that’s the nail in the coffin. Give me better, stable quality over a few fps any day.

Monkey Boy on December 3, 2010 at 2:33 pm

This is my first ATI card in 10 years, and I’ve yet to experience a single problem with it. No one is saying AMD is “skimping” out on IQ, since no one can definitively prove it. As the Guru3D article states, it’s damn near impossible. Anyone who says they can clearly discern a difference is a blatant liar. The question is, what’s the difference between Nvidia’s quality setting and their high-quality setting? That’s what needs to be asked as well. I want someone to compare and contrast the two quality settings from both companies.

bore_u2_death on December 3, 2010 at 3:09 pm

I applaud AMD for having increased the performance provided by its graphic cards with more effective optimizations.

I challenge nVIDIA to demonstrate a valid scenario where AMD’s newest Cat 10.11 driver compromises on image quality. A valid scenario is only one that can be found during regular gameplay, and not one where old games, high resolutions, certain angles, artificially high brightness, and zoom can combine to allow people to SPLIT HAIRS.

nVIDIA’s claims have NOT been substantiated in any way with real world gameplay. They are GROUNDLESS until proven otherwise.

bore_u2_death on December 3, 2010 at 4:13 pm

And just as a reminder, if you are so uncomfortable with a microscopic loss in image quality that has (as of yet) not been proven to be detrimental to the end user that you would rather forfeit a 6-10% increase in performance, you don’t have to stand for it.

Open Catalyst Control Center, and set the Texture Filtering option to High Quality, which just like nVIDIA, will disable all performance optimizations.

Finished.

Skreea Muhammad on December 3, 2010 at 6:29 pm

Thanks to all the shenanigans NV PR pulls off, this owner of a large LAN game center (we have over 200 gaming machines, with a mix of NV and ATI cards, and spend over $20,000 yearly on GPUs alone) thinks it’s finally time we rid our systems of Nvidia products and go full head on with AMD. At the very least we can retire a few air conditioners and save a ton of electricity in the process. We cannot support a corporation that acts like children.

In our 7 years of operation we have mostly used NV products thanks to their superior build quality and frequent driver updates. However, the past few years had NV staggering… they’ve dropped the ball on both build quality and driver updates. That bumpgate fiasco? Yeah, that hurt. With the number of machines we have, any issue is catastrophic. As our systems are leased, any issue causes downtime and loss of profits while we wait for replacements.

When many of our cards started artifacting, some blowing capacitors, among other failures that take a machine down, we are left with lost profits while the system is down. Out of the hundreds of AMD cards we’ve leased, only a handful have failed. The same cannot be said for Nvidia: most of our G80/G92 cards have failed (70%+) with the symptoms listed above. Note that I did not mention Fermi, and I will not need to, as that chip’s issues are highly documented on most credible hardware sites.

We have stuck with Nvidia through the good and the bad. However, it is time to say goodbye. NV isn’t interested in fixing any of its own problems, and instead resorting to rallying up unknown, non-English sites with dirty fingers pointing at AMD is silly.

Farewell and Goodbye.

Sharky on December 5, 2010 at 12:15 pm

I read all the comments and it’s really scary how ignorant people can be. The AMD AF optimization is not something you can barely see. It’s a very noticeable IQ degradation. You barely see it in screenshots; for this reason the Guru3D article is ridiculous. But you can see it during gameplay.

Watch these videos:

http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

bore_u2_death on December 6, 2010 at 1:31 am

The longstanding rule has been that there should be no IQ difference between the Quality and High Quality AF settings, and both vendors agreed that their Quality settings would have no optimizations that degrade IQ. So if A = B and B = C, then A = C.

To find out what all this IQ “degradation” means to the end user, I conducted a survey with the HardOCP community, asking Radeon HD 5000 and 6000 owners for their opinion. Read on and find out their subjective opinions. As you will see, it doesn’t amount to much.

http://hardforum.com/showthread.php?t=1566492

Sharky on December 6, 2010 at 4:24 am

The videos in the link I’ve posted above show that the Radeons have the shimmering problems even in HQ (which NVidia cards don’t). So Q and HQ comparison does not really matter. Even at best HQ settings the AMD cards have vast IQ issues.

Unco on December 6, 2010 at 3:30 pm

“And as Joe Eklund said, NVIDIAs drivers are the best and no nonsense.”

It was only a few months ago that NVIDIA drivers were killing GPUs due to a dodgy fan controller. People have such a short memory these days.

Unco on December 6, 2010 at 4:17 pm

“I’ve never had a problem with an NVIDIA card. The only ATI card I’ve ever owned has frustrated me to no end. If they’re skimping out on quality, that’s the nail in the coffin. Give me better, stable quality over a few fps any day.”

Better stable quality? Did you not see the GTX 480 reviews on launch? Give me a break!

Sharky on December 6, 2010 at 6:29 pm

“Did you not see the GTX 480 reviews on launch? ”

Please give a link to a GTX 480 review where they say a word about instability or image quality problems.

You have none? Case closed.

Psycoz on December 14, 2010 at 5:18 am

AMD Catalyst 10.12
Changes:

High Quality AF Mode is the new default setting for the 6xxx series
New Catalyst Control Center with Explorer-style menu and shortcuts
AMD Branded
Support for 6900 Series