Guild Wars Forums - GW Guru
 
 

Old Nov 08, 2005, 03:33 PM // 15:33   #41
Permanently Unbanned
 
Mr D J's Avatar
 
Join Date: Jun 2005
Default

Actually, I don't think video RAM matters a lot. I've got an XFX GeForce 6600 GT and it only has 128 MB of RAM, but I can run any game at max resolution (if my monitor supports it), although my Athlon 64 processor might give it a boost too.
Mr D J is offline   Reply With Quote
Old Nov 08, 2005, 04:17 PM // 16:17   #42
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

Why not try running Doom 3 at 1600x1200 and let me know. At that resolution with medium-high settings you need a 512 MB card to keep texture rendering at full speed.

Like I said, the higher the quality settings and the higher the resolution, the more GPU RAM you need.
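To put a rough number on just the framebuffer cost (a back-of-the-envelope sketch of my own, not figures from any review; textures come on top of this and usually dominate):

[code]
# Rough framebuffer memory estimate: a back buffer (color + depth) at N AA
# samples per pixel, plus a front buffer.  Textures are extra.
def framebuffer_mb(width, height, aa_samples=1, bytes_color=4, bytes_depth=4):
    back = width * height * aa_samples * (bytes_color + bytes_depth)
    front = width * height * bytes_color
    return (back + front) / (1024.0 * 1024.0)

print(framebuffer_mb(1600, 1200))     # ~22 MB with no AA
print(framebuffer_mb(1600, 1200, 4))  # ~66 MB with 4x AA, before a single texture
[/code]

Drivers keep extra copies on top of that (triple buffering and so on), so the real number is higher, but it shows why 1600x1200 with AA eats into a 128 MB card fast.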
Techie is offline   Reply With Quote
Old Nov 08, 2005, 08:14 PM // 20:14   #43
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by Joker The Owner
Actually, I don't think video RAM matters a lot. I've got an XFX GeForce 6600 GT and it only has 128 MB of RAM, but I can run any game at max resolution (if my monitor supports it), although my Athlon 64 processor might give it a boost too.
Both NVidia and ATI cards use z-buffering (ATI markets its set of z-buffer optimizations as Hyper-Z). The z-buffer uses extra RAM to keep track of which polygons are hidden behind other objects, which lets the GPU intelligently skip rendering those obstructed surfaces. However, this technique requires double the usual amount of RAM if I remember correctly.

So yes, more video ram never hurts...
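For anyone curious how that works, the idea is just a per-pixel depth comparison. A minimal sketch in Python (illustrative only, function names are my own, nothing like real driver or silicon code):

[code]
# Minimal sketch of a z-buffer: keep the nearest depth seen per pixel and
# only "shade" a fragment if it is closer than what is already stored.
def render(fragments, width, height, far=1.0):
    depth = [[far] * width for _ in range(height)]   # extra RAM: one depth per pixel
    color = [[None] * width for _ in range(height)]
    for x, y, z, col in fragments:                   # fragments arrive in any order
        if z < depth[y][x]:                          # closer than what's there?
            depth[y][x] = z
            color[y][x] = col                        # obstructed fragments are skipped
    return color

# Two overlapping fragments on the same pixel: the nearer one (z=0.2) wins.
print(render([(0, 0, 0.5, "red"), (0, 0, 0.2, "blue")], 2, 2)[0][0])  # blue
[/code]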
lord_shar is offline   Reply With Quote
Old Nov 08, 2005, 08:54 PM // 20:54   #44
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

I've recently gotten myself a free GeForce FX 5600 256MB and immediately threw it into my 2.5 GHz Athlon XP box (zoom).

I'd place the GeForce FX 5600 (overclocked to 350/250 (500 effective)) just around a Radeon 8500 (280/300 (600 effective)) in speed in this game. The 8500 actually may be faster, I don't remember exactly. Running around Lion's Arch at 1680x1050 shows about 25 fps max. Not a fast experience by any means.

By the way, the GeForce FX fully supports DX9 in every way. It actually has a feature set superior to DX9 Shader Model 2. The problem is that the architecture is terrible at actually running those effects. ATI's midrange cards that were available at the launch of the FX 5200/5600, the Radeon 9500 PRO/9600, were significantly better cards.

Stay away from the GeForce FX unless you can get an FX 5900 or better for really cheap.
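To put those clock figures in context: the number in parentheses is the effective DDR rate, and memory bandwidth is what usually limits these cards. A rough calculation, assuming both boards use a 128-bit memory bus (which I believe is the case for the FX 5600 and the 8500, but check your card's specs):

[code]
# Memory bandwidth from effective memory clock and bus width.
# Assumes a 128-bit bus for both cards -- an assumption, not a spec sheet.
def bandwidth_gbs(effective_mhz, bus_bits=128):
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gbs(500))  # FX 5600 overclocked: ~8.0 GB/s
print(bandwidth_gbs(600))  # Radeon 8500:         ~9.6 GB/s
[/code]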

Last edited by swaaye; Nov 08, 2005 at 08:59 PM // 20:59..
swaaye is offline   Reply With Quote
Old Nov 08, 2005, 08:56 PM // 20:56   #45
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

Yes but what settings are you running it at? Fastest or most visual?
Techie is offline   Reply With Quote
Old Nov 08, 2005, 08:57 PM // 20:57   #46
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

Quote:
Originally Posted by swaaye
I've recently gotten myself a free GeForce FX 5600 256MB and immediately threw it into my 2.5 GHz Athlon XP box (zoom).

I'd place the GeForce FX 5600 (overclocked to 350/250 (500 effective)) just around a Radeon 8500 (280/300 (600 effective)) in speed in this game. Running around Lion's Arch at 1680x1050 shows about 25 fps max. Not a fast experience by any means.

Stay away from the GeForce FX unless you can get an FX 5900 or better for really cheap.
I would go with a cheap Nvidia 6xxx over the FX series. The 6xxx line comes in both AGP and PCI Express versions if you're on a budget.

I'm getting 40-50 FPS overall in GW with my FX 5700 Ultra at 1024x768, 75 Hz refresh, 4x AA (I run with the benchmark stats on; I forgot to take the switch off my shortcut). I also have an AMD Athlon 64 3400+ (the older version), 1 GB of memory, and an older 120 GB ATA100 drive (not SATA).

The game runs pretty smooth overall, with a minimum of only around 30 fps.

Last edited by EternalTempest; Nov 08, 2005 at 09:01 PM // 21:01..
EternalTempest is offline   Reply With Quote
Old Nov 08, 2005, 08:59 PM // 20:59   #47
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

Highest Quality. I ran through the whole spectrum and only Lowest Quality sped it up significantly, to around 35 fps max or so. 4x AA brought it down to about 15-19 fps.

Techie's guide at post #1 is pretty spot-on about what you should buy today. I just thought it would be fun to see how an FX 5600 actually runs it. I wouldn't trade in my 4-year-old Radeon 9700 for this FX 5600, that's for certain.

BTW EternalTempest, I think the FX 5700 uses a more powerful chip than the 5600. NV refreshed their NV3x line almost immediately after the launch of the 5800. You know how the companies refresh every 6 months? Well, NV30 was shelved for NV35 in about 3 months.

The 5900 at least has a lot more shader power than the 5800 because they literally added more math units to the core. It still wasn't enough to touch a 9700 or 9800, though. So your performance should definitely be a little better than a 5600's.

Last edited by swaaye; Nov 08, 2005 at 09:11 PM // 21:11..
swaaye is offline   Reply With Quote
Old Nov 08, 2005, 10:36 PM // 22:36   #48
Furnace Stoker
 
EternalTempest's Avatar
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/
Default

You jogged my memory. When I did my research on my video card, the 5700 was based on a different GPU, and that's why I bought it.

The 5700 Ultra is based on the NV36 chip, which is newer but slower than what the 5900 series used, while the 5800 was based on the older chip, and that was the reason I didn't go with it. That's if I remember correctly; I may be slightly off.

Found this info:
NV30 = GeForce FX 5800
NV35 = GeForce FX 5900
NV38 = GeForce FX 5950

I'd almost bet 1 plat the 5700 Ultra came at the tail end of the FX 5xxx line as a revision / gap-filler card with a much better chip.

Last edited by EternalTempest; Nov 08, 2005 at 10:49 PM // 22:49..
EternalTempest is offline   Reply With Quote
Old Nov 09, 2005, 01:45 AM // 01:45   #49
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
ATI's benchmarks with the X800s and X850s are a bit skewed due to the Catalyst driver's dynamic texture filtering, also referred to as "bri-linear" filtering: the X800 series dynamically shifts between bilinear and trilinear filtering modes to achieve the best benchmark numbers. However, the X800s/850s could not perform true trilinear filtering unless you turned off ATI's filtering optimizations, and once you disabled that feature, they fell behind NVidia's 6800s. ATI caught a lot of flak for this and finally conceded by adding an "off" switch for its optimized texture filtering.

Why does this matter? Simple: ATI was compromising video quality for the sake of benchmarks. NVidia did this in the past as well with the FX 5000 series, so they're not squeaky-clean either. However, neither company should be resorting to such driver tweaks given the speed of their current card lines.

Drivers can always be updated, but you're stuck with the video card until you toss it, so you might as well get the best hardware possible until the next best thing comes out.
Actually, the 6800 Ultra still loses to the X800 XTPE when both cards are compared on even footing.

No card does pure trilinear filtering by default; it's simply too resource-intensive. See:

http://graphics.tomshardware.com/gra...i-x800-12.html

The benchmarks here are based on both ATI and Nvidia using their own optimisations.
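For anyone wondering what "brilinear" actually means: trilinear filtering blends between two mipmap levels for every sample, while the optimised mode only does the full blend in a narrow band around the mip transition and falls back to bilinear (a single mip level) elsewhere. A rough sketch of the idea in Python (each mip level collapsed to a single value to keep it short; not anyone's actual driver code):

[code]
# Trilinear blends two mip levels for every sample; "brilinear" only blends
# inside a narrow band around the mip transition and falls back to a single
# level (i.e. bilinear) everywhere else, saving texture bandwidth.
def trilinear(mips, lod):
    lo = int(lod)
    frac = lod - lo
    return mips[lo] * (1 - frac) + mips[lo + 1] * frac   # always two mip reads

def brilinear(mips, lod, band=0.125):
    lo = int(lod)
    frac = lod - lo
    if frac < band:                      # far below the transition: one mip read
        return mips[lo]
    if frac > 1 - band:                  # far above the transition: one mip read
        return mips[lo + 1]
    t = (frac - band) / (1 - 2 * band)   # compress the blend into the band
    return mips[lo] * (1 - t) + mips[lo + 1] * t

mips = [1.0, 0.5, 0.25]                  # hypothetical mip chain
print(trilinear(mips, 0.5), brilinear(mips, 0.5))   # identical mid-transition
print(trilinear(mips, 0.1), brilinear(mips, 0.1))   # differ away from it
[/code]

The wider that band is, the closer it gets to full trilinear and the less performance it saves, which is exactly the knob both vendors have been tuning.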
MaglorD is offline   Reply With Quote
Old Nov 09, 2005, 03:19 AM // 03:19   #50
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

Both NV and ATI do exactly the same things. NV was FAR worse about it back in the FX days, but they are on about equal footing with ATI right now, perhaps slightly behind. I have problems with texture shimmering on the floor in KOTOR with my 6800 (fixed by jacking the performance slider up to HQ).

Actually, ATI has been ahead of NV on image quality for a long time. ATI's antialiasing is far superior, especially compared to the FX cards, which cannot do gamma-corrected AA (I'm not sure the 6x00 can do it either). And ATI can do 6x MSAA whereas NV can only do 4x MSAA. NV does have a questionable hybrid 8x AA mode, which is 4x MSAA + 2x SSAA and is horribly slower than the 4x MSAA mode. The xS modes are junk because even though they perform well, they look awful.

Bottom line in IQ is:
ATI >>>> NV FX
ATI >> NV 6x
ATI > NV 7x (still no 6x MSAA)

The bottom line also is that neither of them does trilinear or anisotropic filtering the textbook way. I have no problem with tweaks until I can see them. Actually, the GF4 does anisotropic very well, but it's also very slow at it for that reason.

NV has no ground to stand on with regard to image quality. They were caught red-handed hacking 3DMark03 back in the day to make the FX look better than the crap it is/was. They rewrote shaders in games to make them run faster on the FX, shaders of lower quality than the games' originals. And their aggressive anisotropic and trilinear optimizations create bad shimmering in many games, and those settings are the defaults you see benchmarked everywhere. In essence, NV cards are slower than they seem if you want to fix the shimmering.

I like my 6800 a lot, but my 9700 looks better most of the time. Obviously it's slower, though, lol. ATI also doesn't release drivers aimed at the SLI/dual-core niche market that break support for big games like Guild Wars, among others.
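On the gamma-corrected AA point: the difference is only where the averaging happens. A gamma-correct resolve converts the sRGB samples to linear light, averages them, and converts back, which keeps antialiased edges from looking too dark. A tiny sketch of just the resolve step (made-up numbers and my own function names, not any vendor's hardware path):

[code]
# Resolving the AA samples for one pixel: naive average in sRGB space vs.
# averaging in linear light (gamma-correct).  Samples are 0..1 sRGB values.
def resolve_naive(samples):
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples, gamma=2.2):
    linear = [s ** gamma for s in samples]               # sRGB -> linear (approx.)
    return (sum(linear) / len(linear)) ** (1.0 / gamma)  # average, then back to sRGB

edge = [1.0, 1.0, 0.0, 0.0]   # a 4x MSAA pixel half covered by a white edge
print(resolve_naive(edge))          # 0.5   -> reads too dark on a gamma-2.2 display
print(resolve_gamma_correct(edge))  # ~0.73 -> perceptually closer to half coverage
[/code]

That darkening is what makes thin geometry like fences and wires look worse without gamma-correct AA.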

Last edited by swaaye; Nov 09, 2005 at 03:24 AM // 03:24..
swaaye is offline   Reply With Quote
Old Nov 09, 2005, 03:33 AM // 03:33   #51
Frost Gate Guardian
 
Techie's Avatar
 
Join Date: Nov 2005
Location: Fairfield, Ohio
Profession: Mo/W
Default

The 9700 Pro is an AMAZING GPU. It was the most future-proof card of its time, and it's still pulling off frame rates that some cards released after it cannot.
Techie is offline   Reply With Quote
Old Nov 09, 2005, 04:22 AM // 04:22   #52
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
Actually, the 6800 Ultra still loses to the X800 XTPE when both cards are compared on even footing.

No card uses pure tri-linear optimisation, it's simply too resource-intensive to implement. See:

http://graphics.tomshardware.com/gra...i-x800-12.html

The benchmarks here are based on both ATI and Nvidia using their own optimisations.
Please check the date on the article you linked, then check the date on this one disclosing ATI's questionable optimizations:
http://graphics.tomshardware.com/gra...603/index.html

As you can tell, the optimizations were discovered after the X800's initial card review, once people started complaining about the X800's image quality dropping in high-load situations.

And yes, NVidia's 6800 series does true trilinear filtering without any discernible image quality loss. Optimizations are fine so long as users can't see any compromise in image quality, which is where ATI fell short with the X800/850s. At the performance levels today's card lines are achieving, neither NVidia nor ATI is doing us video enthusiasts any favors by tossing image quality out the window for a few extra frames over its rival.

Last edited by lord_shar; Nov 09, 2005 at 04:29 AM // 04:29..
lord_shar is offline   Reply With Quote
Old Nov 09, 2005, 04:43 AM // 04:43   #53
Frost Gate Guardian
 
swaaye's Avatar
 
Join Date: May 2005
Default

No, the 6800 does not do trilinear correctly by default. I have one, and I can load up several games right now that show how they tweaked things and what it did. You must use HQ mode to get full trilinear, or at least better trilinear; maybe it's still not perfect.
swaaye is offline   Reply With Quote
Old Nov 09, 2005, 04:47 AM // 04:47   #54
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by swaaye
No the 6800 does not do trilinear correctly by default. I have one and I can load up several games right now which show you how they tweaked things and what it did. You must use HQ mode to get full trilinear. At least better trilinear, maybe still not perfect.
So long as the option is there and properly labeled, I can live with it.
lord_shar is offline   Reply With Quote
Old Nov 09, 2005, 06:12 AM // 06:12   #55
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
Please check the date on the article you linked, then check the date on this one disclosing ATI's questionable optimizations:
http://graphics.tomshardware.com/gra...603/index.html

As you can tell, the optimizations were discovered after the X800's initial card review, once people started complaining about the X800's image quality dropping in high-load situations.

And yes, NVidia's 6800 series does true trilinear filtering without any discernible image quality loss. Optimizations are fine so long as users can't see any compromise in image quality, which is where ATI fell short with the X800/850s. At the performance levels today's card lines are achieving, neither NVidia nor ATI is doing us video enthusiasts any favors by tossing image quality out the window for a few extra frames over its rival.
Lord Shar, Tom's Hardware states in the article about ATI's disclosures they did NOT disable Nvidia's optimisations when testing the X800 against Nvidia, yet the conclusion of the article rates the X800 XTPE favourably against the competition.

Here is what they said:

"In our X800 test NVIDIA's trilinear optimization was not disabled, so the comparable values continue to be valid and comparable"

And no, ATI's optimisations result in very good texture quality. Even the reviewers at Tom's Hardware thought so.

Last edited by MaglorD; Nov 09, 2005 at 06:15 AM // 06:15..
MaglorD is offline   Reply With Quote
Old Nov 09, 2005, 07:27 AM // 07:27   #56
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
Lord Shar, Tom's Hardware states in the article about ATI's disclosures they did NOT disable Nvidia's optimisations when testing the X800 against Nvidia, yet the conclusion of the article rates the X800 XTPE favourably against the competition.

Here is what they said:

"In our X800 test NVIDIA's trilinear optimization was not disabled, so the comparable values continue to be valid and comparable"

And no, ATI's optimisations result in very good texture quality. Even the reviewers at Tom's Hardware thought so.
Which article are you referring to, the initial X800 review or the post-mortem report?

I agree that the optimizations ATI is performing are still acceptable, but they should not be detectable by the naked eye. That is why the second report about questionable optimizations surfaced. I'm just glad ATI finally yielded to public criticism and added an optimization off-switch.

Either way, the 7800 GTX is the current king of the hill... let's hope the X1800 can dethrone it or come close, so that we'll see another appreciable price drop in the high-end video card arena.
lord_shar is offline   Reply With Quote
Old Nov 09, 2005, 08:34 AM // 08:34   #57
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
Which article are you referring to, the initial X800 review or the post-mortem report?
This was quoted from the second, more recent article.

Quote:
I agree that the optimizations ATI is performing are still acceptable, but they should not be detectable by the naked eye. That is why the second report about questionable optimizations surfaced. I'm just glad ATI finally yielded to public criticism and added an optimization off-switch.
lol, but it's not detectable unless you do a lot of graphic manipulation. The same cannot be said of Nvidia's FX optimisations.

As for 7800, it will soon be dethroned :P

Last edited by MaglorD; Nov 09, 2005 at 08:39 AM // 08:39..
MaglorD is offline   Reply With Quote
Old Nov 09, 2005, 09:38 AM // 09:38   #58
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
...<SNIP>...

lol, but it's not detectable unless you do a lot of graphic manipulation. The same cannot be said of Nvidia's FX optimisations.
Someone caught it with the naked eye... otherwise no one would be talking about it. And yes, the FX5000 series was a bad one.

Quote:
Originally Posted by MaglorD
As for 7800, it will soon be dethroned :P
I'm not yet sure about this... did you see the specs on the new 512MB 7800GTX? They really cranked it up in response to the X1800, and the X1800 hasn't even seen retail daylight yet. However, I'll stick to the facts and keep clear of any brand loyalty. I just want the fastest and best, without skewed numbers either way.

Last edited by lord_shar; Nov 09, 2005 at 09:54 AM // 09:54..
lord_shar is offline   Reply With Quote
Old Nov 09, 2005, 09:56 AM // 09:56   #59
Jungle Guide
 
Join Date: May 2005
Default

Quote:
Originally Posted by lord_shar
Someone caught it with the naked eye... otherwise no one would be talking about it.
This is what John Carmack says:

It is actually a pretty sensible performance enhancement, with minimal visual issues. However, having drivers analyze mip map uploads to hide the cheat is an unfortunate consequence.

So it was only spotted because of a driver issue, and only because the X800 had a lower-than-expected frame rate, which can be fixed; it wasn't that there was any image degradation. I doubt very much that the lower-than-expected frame rate was spotted with the "naked eye"; more likely it was spotted with FRAPS.
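For context on the "analyze mip map uploads" part: the usual way reviewers expose filtering shortcuts is to upload a texture whose mip levels are tinted different solid colours, so any shortcut in the blend between levels shows up as hard bands on screen; a driver that watches for that kind of upload can quietly switch back to full trilinear just for the test. A rough sketch of building such a test texture (illustrative only, my own names):

[code]
# Build a mip chain where every level is a different solid colour.  Rendered
# with proper trilinear filtering the colours blend smoothly with distance;
# with a "brilinear" shortcut you see sharp bands at the level transitions.
TINTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0), (255, 0, 255)]

def coloured_mip_chain(base_size=256):
    mips, size, level = [], base_size, 0
    while size >= 1:
        colour = TINTS[level % len(TINTS)]
        mips.append([[colour] * size for _ in range(size)])  # one solid-colour level
        size //= 2
        level += 1
    return mips

chain = coloured_mip_chain(16)
print(len(chain), [len(m) for m in chain])   # 5 levels: 16, 8, 4, 2, 1
[/code]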

Last edited by MaglorD; Nov 09, 2005 at 09:59 AM // 09:59..
MaglorD is offline   Reply With Quote
Old Nov 09, 2005, 11:14 AM // 11:14   #60
Furnace Stoker
 
lord_shar's Avatar
 
Join Date: Jul 2005
Location: near SF, CA
Default

Quote:
Originally Posted by MaglorD
This is what John Carmack says:

It is actually a pretty sensible performance enhancement, with minimal visual issues. However, having drivers analyze mip map uploads to hide the cheat is an unfortunate consequence.

So it was only spotted because of a driver issue, and only because the X800 had a lower-than-expected frame rate, which can be fixed; it wasn't that there was any image degradation. I doubt very much that the lower-than-expected frame rate was spotted with the "naked eye"; more likely it was spotted with FRAPS.
ATI took a bit of negative publicity because their cards were incapable of true trilinear filtering due to optimizations that could not be disabled at the time. This has since been corrected. NVidia did the same with their past FX 5000 series and received similar notice. Yes, the 6800s also perform brilinear filtering, but true trilinear filtering was available out of the box.

Either way, both the 6800s and the X800s/850s have been relegated to mid-grade video card status now that the 7800s and X1800s are out or soon to be released. Something tells me both sides will employ further brilinear filtering techniques to minimize GPU load and squeeze out every possible frame.
lord_shar is offline   Reply With Quote