Guild Wars Forums - GW Guru
 
 

Old Jul 27, 2005, 12:47 AM // 00:47   #21
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

I can see Crossfire producing good image quality, but I'm not yet sold on its concept. The problem with daisy-chaining two video cards is that one card always has to wait for output from the other before it can render a video frame. Compared to nVidia SLI's parallel set-up, where the video load is evenly distributed across both GPUs, ATI's Crossfire is essentially serial or sequential video processing, which doesn't sound as efficient speed-wise since one card is usually waiting for the other.
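
To picture what I mean (a rough conceptual sketch in Python with made-up per-frame numbers -- this is not how either driver is actually implemented): if the two cards have to run one after the other, their times add up, while a parallel split is only limited by the slower half.

Code:
# Conceptual sketch only -- not how either driver actually works.
FRAME_WORK_MS = 16.0  # made-up amount of rendering work for one frame

def serial_chain(frame_work_ms):
    # "Serial" picture: the second card waits for the first card's output.
    gpu1 = frame_work_ms / 2
    gpu2 = frame_work_ms / 2
    return gpu1 + gpu2          # the times add up; no overlap

def parallel_split(frame_work_ms):
    # "Parallel" picture: each card renders half the frame at the same time.
    gpu1 = frame_work_ms / 2
    gpu2 = frame_work_ms / 2
    return max(gpu1, gpu2)      # limited by the slower half, not the sum

print("serial:   %.1f ms/frame" % serial_chain(FRAME_WORK_MS))    # 16.0
print("parallel: %.1f ms/frame" % parallel_split(FRAME_WORK_MS))  # 8.0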

I will reserve judgement until the finished product is released, just to give ATI a fair chance.
lord_shar is offline  
Old Jul 27, 2005, 10:18 AM // 10:18   #22
Jungle Guide
 
Zakarr's Avatar
 
Join Date: May 2005
Location: Finland
Default

Quote:
Originally Posted by lord_shar
I have a 6800gt and 6800Ultra, both from BFG. If given a choice between the X800XL vs. the 6800GT, I'd probably go with the nVidia for several reasons:

1) the 6800gt supports full DX9.0c, while the ATI X800 only goes up to DX9.0b. This doesn't matter with today's games, but DX9.0c might become a requirement for some future titles.

2) ATI's drivers tend to perform dynamic texture quality adjustments during high stress video movement, lowering image quality in order to maintain fps. While this is an intelligent way of managing hardware resources, it doesn't allow accurate measurement of the hardware's performance until the feature is turned off. When it is, more than a few titles bench lower on ATI boards vs. nVidia.

However, the nVidia 6800 series, especially the Ultra, are huge power hogs. If you don't have a sufficiently beefy power supply, you may encounter some system stability issues if you go with one of these power-hungry video cards.
1) DirectX 9.0b and 9.0c are mainly bug-fix releases and don't define any new 3D hardware feature standards, so both ATI and NVIDIA support DirectX 9.0. The X800 series supports Pixel Shader 2.0b and Vertex Shader 2.0, while the NVIDIA GF 6800 series supports Pixel and Vertex Shader 3.0.

Higher shader versions are useless if the game doesn't support them. You won't automatically gain any image quality from newer shaders.

2) You can turn it off if you want.

Benchmarks are not the whole truth. Results don't represent all the games in the world. Some games work better with ATI while other games work better with NVIDIA. One might want his/her favorite games to work the best way rather than blindly looking at benchmark results from a few games.

ATI cards consume less power, which means less heat and less stress on the power supply. ATI cards usually have quieter stock coolers, at least from certain brands, and you can usually reduce the fan speed to cut cooler noise much further than on a NVIDIA GF 6800 without overheating.

The ATI X800 has better-quality anti-aliasing than the NVIDIA GF 6800, and better anti-aliasing performance at high resolutions like 1600x1200, or maybe even 1280x960 or 1280x1024.

NVIDIA usually has better OpenGL performance, so Linux users and OpenGL-based games like Call of Duty, Doom 3, the Quake series, Linux 3D games and anything built on a Quake or Doom 3 engine run faster and/or with fewer bugs.

NVIDIA often overclocks better, so an overclocker might want to choose it instead of ATI. The 6800 series is probably the more future-proof card because of its more advanced shaders, but no one knows how soon and how widely those shaders will be used in upcoming games, or how much gamers will benefit from them. The latest Splinter Cell and Far Cry support them, but in Far Cry the gap between the X800 and the GF 6800 is pretty minor. The latest Splinter Cell does show a noticeable image quality difference on the GF 6800, but that's because for some reason the developers didn't add support for ATI's 2.0 shaders, so ATI has to fall back to the old 1.1 shaders. Maybe it was done on purpose by offering bribes. Who knows.

Last edited by Zakarr; Jul 27, 2005 at 10:29 AM // 10:29..
Zakarr is offline  
Old Jul 27, 2005, 03:44 PM // 15:44   #23
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by Zakarr
1) DirectX 9.0b and 9.0c are mainly bug-fix releases and don't define any new 3D hardware feature standards, so both ATI and NVIDIA support DirectX 9.0. The X800 series supports Pixel Shader 2.0b and Vertex Shader 2.0, while the NVIDIA GF 6800 series supports Pixel and Vertex Shader 3.0.

Higher shader versions are useless if the game doesn't support them. You won't automatically gain any image quality from newer shaders.
Agreed, but when a game title supports both shader models, 3.0 usually outperforms 2.0 by a good margin (e.g., Far Cry). Furthermore, support for the newer, faster Shader Model 3.0 will only get better, not worse.

Quote:
Originally Posted by Zakarr
2) You can turn it off if you want.
True, but with the bri-linear filtering disabled, the X800 falls behind in performance and benchmarks when full trilinear/anisotropic filtering is active. Software drivers can always be patched, but buyers are stuck with the hardware.

Quote:
Originally Posted by Zakarr
Benchmarks are not the whole truth. Results don't represent all the games in the world. Some games work better with ATI while other games work better with NVIDIA. One might want his/her favorite games to work the best way rather than blindly looking at benchmark results from a few games.
True, but benchmarks are a reasonable metric for comparing the relative performance of two competing products.

Quote:
Originally Posted by Zakarr
ATI cards consume less power, which means less heat and less stress on the power supply. ATI cards usually have quieter stock coolers, at least from certain brands, and you can usually reduce the fan speed to cut cooler noise much further than on a NVIDIA GF 6800 without overheating.

The ATI X800 has better-quality anti-aliasing than the NVIDIA GF 6800, and better anti-aliasing performance at high resolutions like 1600x1200, or maybe even 1280x960 or 1280x1024.

NVIDIA usually has better OpenGL performance, so Linux users and OpenGL-based games like Call of Duty, Doom 3, the Quake series, Linux 3D games and anything built on a Quake or Doom 3 engine run faster and/or with fewer bugs.

NVIDIA often overclocks better, so an overclocker might want to choose it instead of ATI. The 6800 series is probably the more future-proof card because of its more advanced shaders, but no one knows how soon and how widely those shaders will be used in upcoming games, or how much gamers will benefit from them. The latest Splinter Cell and Far Cry support them, but in Far Cry the gap between the X800 and the GF 6800 is pretty minor. The latest Splinter Cell does show a noticeable image quality difference on the GF 6800, but that's because for some reason the developers didn't add support for ATI's 2.0 shaders, so ATI has to fall back to the old 1.1 shaders. Maybe it was done on purpose by offering bribes. Who knows.
Agreed on these points.

Last edited by lord_shar; Jul 28, 2005 at 12:19 AM // 00:19..
lord_shar is offline  
Old Jul 29, 2005, 02:07 AM // 02:07   #24
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by MasterQu
Couldn't catch me using ATI. Nvidia sold me when I had to call them for driver support. As I was talking to them, they rebuilt the driver I was using on an (I think) 5700 card that was having issues with UT2004, and emailed it to me before I hung up the phone. That's service, and it bought my loyalty. Never mind the fact that every time ATI comes out with something, Nvidia reciprocates and pwns ATI's a$$.
Yeah, just like Nvidia's FX series got owned by ATI. I never subscribe to blind allegiance.
MaglorD is offline  
Old Jul 29, 2005, 02:28 AM // 02:28   #25
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
Yep, here's the link from Tom's Hardware Guide:

http://graphics.tomshardware.com/gra...603/index.html

The article from June 2004 found that the X800's Catalyst drivers dropped the texture filtering mode from trilinear to bilinear during high-load situations, then ramped back up to trilinear once GPU load went down. Although ATI insists that the X800 is still performing trilinear filtering, it's actually "bri-linear" since it's mixing the two filter modes. This filtering could not be disabled until several driver revisions later. As far as I know, it is still present in the current ATI Catalyst driver set.

Although the technique does have its merits, it is completely software-driven and doesn't paint an accurate portrait of the hardware's normal performance when benched against a "non-driver-optimized" video card.
From what I read in the article, both manufacturers heavily optimise their drivers to get the most out of the hardware. This is to be expected, since delivering true filtering slows down the hardware considerably. The trick is not to sacrifice too much image quality while employing the optimisations, and this ATI appears to have accomplished. Don't forget how horrible Nvidia's FX optimisations were.

Most review sites do not disable Nvidia's optimisations while leaving ATI's optimisations intact, so I don't see how the comparisons are unfair.

It is generally accepted that the X800XL is comparable to the 6800GT in a fair comparison, without disabling optimisations on either card. So pick whichever is cheaper.
MaglorD is offline  
Old Jul 29, 2005, 02:57 AM // 02:57   #26
Ascalonian Squire
 
Join Date: Jul 2005
Guild: sotu
Profession: E/R
Default

Quote:
Originally Posted by Dirkiess
Algren, I read something just recently regarding the Crossfire series. I'll have to see if I can find the article, but it was related to the same thing, but it also mentioned that the drivers were only in Beta and not the final release.

Whether this was true or not, I don't know, but it's either plausible or a possible side step for ATI until they can get the drivers up to speed.

To be honest though, I think there may well be pros and cons for both ATI and NVidia.

These could all be related to the systems that people have. A prime example of this was a news article I read today that says NVidia's own nForce 3 motherboards cause problems with their own 6800 series of cards. Now how is that for compatibility?

I'll see if I can find either of the links to the above articles. I've read a lot of articles today.
I use an nForce 3 board/XFX 6800GT combo and don't get any problems. I think most of the people who are having trouble are the ones who bought BFG's overclocked cards; also, eVGA are to be avoided from what I've heard.
emirate xaaron is offline  
Old Jul 30, 2005, 09:01 PM // 21:01   #27
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
From what I read in the article, both manufacturers heavily optimise their drivers to get the most out of the hardware. This is to be expected, since delivering true filtering slows down the hardware considerably. The trick is not to sacrifice too much image quality while employing the optimisations, and this ATI appears to have accomplished. Don't forget how horrible Nvidia's FX optimisations were.

Most review sites do not disable Nvidia's optimisations while leaving ATI's optimisations intact, so I don't see how the comparisons are unfair.

It is generally accepted that the X800XL is comparable to the 6800GT in a fair comparison, without disabling optimisations on either card. So pick whichever is cheaper.
The reason nVidia's 6800 series didn't get slammed was because they maintained full anisotropic filtering and consistent image quality throughout the tests. ATI's X800s, on the other hand, dropped image quality for the sake of frame rates without informing their user base until complaints came in. I'm just glad ATI responded positively to the criticism by adding an off-switch to disable bri-linear filtering. Unfortunately, this also revealed ATI's X800 hardware design starting to show its age vs. the 6800 series. I just hope they do better against the 7800GTX with their next solution sans "cheating."
lord_shar is offline  
Old Nov 09, 2005, 01:54 AM // 01:54   #28
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
The reason nVidia's 6800 series didn't get slammed was because they maintained full anisotropic filtering and consistent image quality throughout the tests. ATI's X800s, on the other hand, dropped image quality for the sake of frame rates without informing their user base until complaints came in. I'm just glad ATI responded positively to the criticism by adding an off-switch to disable bri-linear filtering. Unfortunately, this also revealed ATI's X800 hardware design starting to show its age vs. the 6800 series. I just hope they do better against the 7800GTX with their next solution sans "cheating."
I truly doubt any card performs trilinear filtering even with bri-linear turned off. So it's pointless to compare without optimisations. The important thing is to ensure optimisations do not degrade quality compared to the competition.

The reason Nvidia didn't get slammed is because ATI went and told everyone they were performing true trilinear filtering when they actually optimised by using brilinear. Even so, the image quality with ATI's brilinear filtering is superior to Nvidia's brilinear filtering and does approach true trilinear filtering.
MaglorD is offline  
Old Nov 09, 2005, 03:37 AM // 03:37   #29
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

To be honest, it's any man's game. ATI and Nvidia have their ups and downs. Once Crossfire is used to its full potential, I bet it will blow SLI away. ATI is known for being late to release cards, but they never disappoint.
Techie is offline  
Old Nov 09, 2005, 04:35 AM // 04:35   #30
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
I truly doubt any card performs trilinear filtering even with bri-linear turned off. So it's pointless to compare without optimisations. The important thing is to ensure optimisations do not degrade quality compared to the competition.
Speculation is fine, but please provide some links to back up your statements and reasoning.

Quote:
Originally Posted by MaglorD
The reason Nvidia didn't get slammed is because ATI went and told everyone they were performing true trilinear filtering when they actually optimised by using brilinear. Even so, the image quality with ATI's brilinear filtering is superior to Nvidia's brilinear filtering and does approach true trilinear filtering.
I have yet to read any articles mentioning any bri-linear filtering being done by NVidia. Can you please link the URL here if you have it?

Yes, I'm sure Nvidia does have some degree of video driver optimization. However, we users cannot detect these optimizations since they don't leave any visible image degradation, while their ATI counterparts do.

Last edited by lord_shar; Nov 09, 2005 at 04:42 AM // 04:42..
lord_shar is offline  
Old Nov 09, 2005, 04:38 AM // 04:38   #31
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by Techie
To be honest, it's any man's game. ATI and Nvidia have their ups and downs. Once Crossfire is used to its full potential, I bet it will blow SLI away. ATI is known for being late to release cards, but they never disappoint.
Based on what I've read so far, both SLI and Crossfire will have their own quirks. What makes Crossfire more attractive cost-wise is that the cards involved don't have to be matched twins. SLI, on the other hand, costs more since both video cards must be identical. SLI also performs dynamic load distribution between both GPUs. Either way, I can't wait for the benchmarks on both systems side by side.

EDIT: As of the WHQL-80 driver release, SLI no longer requires exact card brand matches (e.g., you can now SLI-link a BFG with a Leadtek 7800GTX). GPU core models and hardware specs must still match, but otherwise, it's now more flexible.

Last edited by lord_shar; Nov 09, 2005 at 11:26 AM // 11:26..
lord_shar is offline  
Old Nov 09, 2005, 06:38 AM // 06:38   #32
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
Speculation is fine, but please provide some links to back up your statements and reasoning.

I have yet to read any articles mentioning any bri-linear filtering being done by NVidia. Can you please link the URL here if you have it?

Yes, I'm sure Nvidia does have some degree of video driver optimization. However, we users cannot detect these optimizations since they don't leave any visible image degradation, while their ATI counterparts do.
Quoted from the Tom's Hardware article.

This is indeed a "cheat" that both major vendors now do. Instead of always sampling the two adjacent mip map levels and doing a full blend between them, they have plateaus where only a single mip level is sampled, reducing the average samples from 8 to about 6. It is actually a pretty sensible performance enhancement, with minimal visual issues.
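
To make that "plateau" idea concrete, here is a rough sketch (illustration only, in Python, with a made-up plateau width -- the real drivers don't expose this): full trilinear always blends the two adjacent mip levels, while a "brilinear" scheme snaps to a single mip level near whole-number LODs and only blends in the middle band.

Code:
import math

def trilinear_weight(lod):
    # Full trilinear: always blend the two adjacent mip levels.
    lower = int(math.floor(lod))
    frac = lod - lower                      # blend factor between the two levels
    return lower, lower + 1, frac

def brilinear_weight(lod, plateau=0.25):    # plateau width is a made-up value
    # "Brilinear": near a whole-number LOD, sample only one mip level
    # (blend factor clamped to 0 or 1); blend only in the middle band.
    lower = int(math.floor(lod))
    frac = lod - lower
    if frac < plateau:
        frac = 0.0                          # lower mip only -> fewer texture samples
    elif frac > 1.0 - plateau:
        frac = 1.0                          # upper mip only
    else:
        frac = (frac - plateau) / (1.0 - 2.0 * plateau)  # keep the transition smooth
    return lower, lower + 1, frac

for lod in (2.1, 2.5, 2.9):
    print(lod, trilinear_weight(lod), brilinear_weight(lod))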

I'm not sure what visible image degradation you are referring to.

Here is what is quoted in the Tom's Hardware article:

The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no "right" or "wrong" way to implement the filtering.

We have added intelligence to our filtering algorithm to increase performance without affecting image quality. As some people have discovered, it is possible to show differences between our filtering implementations for the RADEON 9800XT and RADEON X800. However, these differences can only be seen by subtracting before and after screenshots and amplifying the result. No-one has claimed that the differences make one implementation "better" than another.
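
For anyone who wants to try the "subtract and amplify" comparison ATI describes, it is easy enough to script. A rough sketch, assuming Python with the NumPy and Pillow libraries and two placeholder screenshot files (shot_a.png / shot_b.png -- substitute your own):

Code:
import numpy as np
from PIL import Image

# Placeholder file names -- use your own before/after screenshots.
a = np.asarray(Image.open("shot_a.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("shot_b.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                                     # per-pixel difference
amplified = np.clip(diff * 8, 0, 255).astype(np.uint8)   # amplify so subtle differences become visible

Image.fromarray(amplified).save("diff_x8.png")
print("max difference:", diff.max(), "mean difference:", diff.mean())
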
MaglorD is offline  
Old Nov 09, 2005, 11:16 AM // 11:16   #33
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
Quoted from the Tom's Hardware article.

This is indeed a "cheat" that both major vendors now do. Instead of always sampling the two adjacent mip map levels and doing a full blend between them, they have plateaus where only a single mip level is sampled, reducing the average samples from 8 to about 6. It is actually a pretty sensible performance enhancement, with minimal visual issues.

I'm not sure what visible image degradation you are referring to.

Here is what is quoted in the Tom's Hardware article:

The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no "right" or "wrong" way to implement the filtering.

We have added intelligence to our filtering algorithm to increase performance without affecting image quality. As some people have discovered, it is possible to show differences between our filtering implementations for the RADEON 9800XT and RADEON X800. However, these differences can only be seen by subtracting before and after screenshots and amplifying the result. No-one has claimed that the differences make one implementation "better" than another.
That sounds better... thanks for the clarification.
lord_shar is offline  
Old Nov 09, 2005, 12:09 PM // 12:09   #34
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

But has anyone actually determined what the difference is? Or has a program been created yet that shows this?
Techie is offline  
Old Nov 10, 2005, 12:26 PM // 12:26   #35
Middle-Age-Man
 
Old Dood's Avatar
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo
Default

I like my X800XL over the 6800GT only because it was $100.00 cheaper when I bought it. The 6800GT has Pixel Shader 3.0... that is what makes it better in that respect. As for which one is faster? They are too evenly matched in speed.

How I look at it overall is that whenever I upgrade to a better video card I am always happy. It is way better than what I had before. My 9800 Pro is still a solid card in my "second" system. I was amazed at how much better my newer X800XL was over that.

Last edited by Old Dood; Nov 10, 2005 at 12:28 PM // 12:28..
Old Dood is offline  