Guild Wars Forums - GW Guru
 
 

Go Back   Guild Wars Forums - GW Guru > Forest of True Sight > Technician's Corner > Computer Buying & Building

Old Jul 02, 2008, 11:15 PM // 23:15   #21
Ascalonian Squire
 
wolf trader's Avatar
 
Join Date: Jun 2008
Guild: Luna

Okay, first off: if you are using the Celeron that came with it, UPGRADE! My suggestion is to get a Core 2 E7xxx or E8xxx CPU. That Celeron will keep any video card you put in from performing at more than about 50% of its potential; I know from experience.

As for a video card, ATI HD3k or HD4k series. There are still a lot of bugs in the HD4k series, as once again the manufacturers did a snoozer on their BIOS and the fans don't throttle properly. But hey, can't say we didn't see that coming... the same thing happened with the HD3k's.

The integrated chipset you have isn't that bad, though, and should hold you over till you can mod up. But seriously, a 1.6GHz Celeron just can't cut it.
wolf trader is offline   Reply With Quote
Old Jul 03, 2008, 02:26 AM // 02:26   #22
Jungle Guide
 
Lurid's Avatar
 
Join Date: Mar 2006
Profession: Mo/

You've tried it at what resolution? At any reasonable resolution the card won't bottleneck. Real-world performance doesn't suffer nearly as badly as synthetic benchmarks tend to suggest.
Lurid is offline   Reply With Quote
Old Jul 03, 2008, 02:31 AM // 02:31   #23
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

it's a safe bet that any modern CPU (including the newer celerons) won't be a bottleneck in GW, except at really high resolutions.

however, for newer games the bottleneck is noticeable and quite severe. the faster the graphics card, the more pronounced the bottleneck becomes.
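a toy frame-time model makes that point (the millisecond costs here are made up for illustration, not measurements from any real game or card):

```python
# toy frame-time model: each frame costs some CPU time and some GPU
# time, and the slower of the two stages sets the frame rate.

def fps(cpu_ms, gpu_ms):
    """frames per second when the CPU and GPU stages overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0                           # a slow CPU: 20 ms per frame
for gpu_ms in (40.0, 20.0, 10.0, 5.0):  # progressively faster cards
    # fps stops improving once the CPU becomes the slower stage
    print(f"gpu {gpu_ms:4.1f} ms -> {fps(cpu_ms, gpu_ms):.0f} fps")
```

with the CPU stuck at 20 ms per frame, upgrading the card from a 10 ms one to a 5 ms one changes nothing: both print 50 fps. the faster the card, the more of its potential the slow CPU wastes.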

btw, what kind of bugs are you talking about with the HD4800 series? mine runs just fine. as for the fanspeed thing, i merely tweaked a few settings to make the fan run at 40% (instead of 5%). temperatures dropped by 20°C.
moriz is offline   Reply With Quote
Old Jul 03, 2008, 04:29 AM // 04:29   #24
Jungle Guide
 
Lurid's Avatar
 
Join Date: Mar 2006
Profession: Mo/

The lower the resolution** CPU limitations only show themselves at lower resolutions, where the CPU actually is the limiting factor. At higher resolutions the GPU gets to stretch its arms and take on a larger share of the work.
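A rough sketch of that idea, with made-up constants rather than benchmark data: per-frame GPU cost grows with pixel count while per-frame CPU cost stays flat, so which stage limits you flips as resolution rises.

```python
# Rough resolution model (illustrative constants, not benchmarks):
# GPU cost per frame scales with pixels drawn; CPU cost does not.

CPU_MS = 10.0            # hypothetical per-frame CPU work, resolution-independent
GPU_MS_PER_MPIXEL = 8.0  # hypothetical GPU cost per million pixels drawn

def bound_by(width, height):
    """Which stage limits the frame rate at this resolution."""
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    return "CPU-bound" if CPU_MS >= gpu_ms else "GPU-bound"

for w, h in ((1024, 768), (1600, 1200), (1920, 1200)):
    print(f"{w}x{h}: {bound_by(w, h)}")
```

With these numbers, 1024x768 comes out CPU-bound while 1600x1200 and 1920x1200 come out GPU-bound, which is the crossover being described.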
Lurid is offline   Reply With Quote
Old Jul 03, 2008, 02:00 PM // 14:00   #25
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

at what resolution the bottleneck occurs depends on the game. for instance, tom's hardware recently tested graphics cards across a few generations (geforce 6 to 9) with a variety of CPUs. in half-life 2 ep2, the CPU bottleneck was apparent all the way up to 1920x1200, while in games like COD4 the bottleneck eventually disappears at high resolutions.

http://www.tomshardware.com/reviews/...de,1928-9.html

either way, pairing a really fast graphics card with a slow CPU will give you poor results. i think we can all agree on that.
moriz is offline   Reply With Quote
Old Jul 03, 2008, 09:59 PM // 21:59   #26
Jungle Guide
 
Lurid's Avatar
 
Join Date: Mar 2006
Profession: Mo/

I don't see how that proves anything. He's using high-end CPUs with GPUs ranging from lower end to higher end. We're not talking about whether GPU power matters in games; we all know it does. The point under discussion was that with a lower-end CPU, the bottleneck is highly exaggerated once you hit higher resolutions.

Could it limit things somewhat? Yes, it's very possible. Are you actually going to notice a huge difference? Not likely. I'm not arguing for the sake of argument; I agree that when building a new system, buying outdated hardware and pairing it with newer parts is counterproductive. The fact of the matter is, these bottlenecking situations have been vastly overstated by synthetics that weigh the performance of the CPU and the GPU together, whereas most (not all) games will not benefit from a faster CPU at any resolution that isn't ridiculously low.
Lurid is offline   Reply With Quote
Old Jul 03, 2008, 10:21 PM // 22:21   #27
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

he's also testing an E2150(?) CPU. if you look carefully, that one can dramatically cut the performance of high-end graphics cards in certain games (HL2 comes to mind).
moriz is offline   Reply With Quote
Old Jul 03, 2008, 11:04 PM // 23:04   #28
Jungle Guide
 
Lurid's Avatar
 
Join Date: Mar 2006
Profession: Mo/

I'll be honest, reevaluating those graphs has made me wonder whether I'm correct or not. I've sat here thinking for a few minutes, and I think I've come up with something that fits the data and makes sense. It seems to me that the key to understanding why there are such large gaps in the data is the cache differences between those CPUs more than anything else.

Cache size has a larger impact on games than CPU speed, or so that graph would indicate. Therefore the graph doesn't show that a "faster" CPU is required; it shows that cache plays a larger role than clock speed. The only way to test for certain whether clock speed matters is to do this:

Test resolutions: low to high
Test image quality: low to high

You'd have to use the same CPU, GPU, monitor, etc. The idea is that you underclock the CPU and see if there is a noticeable difference in FPS. My guess is that the CPU will limit the GPU more at lower resolutions and texture settings, and less at higher ones. As it stands, the Tom's diagram isn't conclusive.
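For what it's worth, the proposed experiment can be laid out as an explicit test matrix. Every value below is a placeholder, not a recommended setting; the point is the shape of the sweep, with real FPS measurements recorded per cell:

```python
# The proposed experiment as a test matrix: same CPU/GPU/monitor,
# sweeping resolution, quality, and CPU clock (stock vs underclocked).
from itertools import product

resolutions = ["1024x768", "1280x1024", "1920x1200"]
qualities = ["low", "medium", "high"]
cpu_clocks = ["stock", "underclocked"]  # same CPU, two clock speeds

matrix = list(product(resolutions, qualities, cpu_clocks))
print(len(matrix), "benchmark runs")  # 3 * 3 * 2 = 18
for res, qual, clock in matrix:
    print(f"run: {res}, {qual} quality, CPU {clock}; record avg FPS")
```

If FPS drops with the underclocked CPU mainly at the low-resolution cells and converges at the high-resolution ones, that would support the claim that clock speed stops mattering once the GPU is the limit.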

In a way you are correct, as more expensive CPUs generally have more cache. But saying that clock speed affects the game more than cache size or resolution... debatable at best, I think.
Lurid is offline   Reply With Quote
Old Jul 03, 2008, 11:19 PM // 23:19   #29
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

further down the article, the author concluded that it's GHz that matters most.

he overclocked the E2100 CPU to 3GHz, and it suddenly began to post numbers similar to the more expensive core 2s.

cache does matter, since the overclocked chip is still slower clock-for-clock than the core 2s. but it seems pure speed is what's most important here, and i guess that makes sense: the new graphics cards churn through a lot of data, and if the CPU can't supply it fast enough, their performance is greatly hindered.
moriz is offline   Reply With Quote
Powered by: vBulletin
Copyright ©2000 - 2016, Jelsoft Enterprises Ltd.