Guild Wars Forums - GW Guru
 
 

Old May 24, 2006, 08:54 PM // 20:54   #21
Furnace Stoker
 
lord_shar
 
Join Date: Jul 2005
Location: near SF, CA

Quote:
Originally Posted by swaaye
You will see zero benefit from a dual core CPU in almost all games right now. In fact, the only game I know of that uses my second core somewhat is Oblivion. There are some patches for some FPS games, like the awful Quake 4, but lol those don't mean much to me. I've read it's extremely difficult to make games take good advantage of SMP so while surely there will be future games that are capable of it, they won't be night and day better. And, in all games that support SMP right now, the GPU is what determines your speed far above anything the CPU can do.

A higher clocked single core CPU will be faster, and probably cheaper lol.
Not completely true. XP takes advantage of dual cores, so even if a game is only single-threaded, essential background processes still get distributed across both cores to maximize performance. I'm running a dual-core laptop now, and when I check CPU utilization with only GW active, I usually see both cores working at ~55% load. If McAfee unexpectedly kicks in during a high-load game scene, my system keeps right on going with no visible performance hit. I can open 10+ IE browser windows, Windows Explorer windows, and other apps at the same time with no hit to GW, while the same activity brings my single-core 2.5GHz P4 to a screeching halt. In today's multitasking OS environment, there's really no reason to go single-core anymore.
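If you want to eyeball the per-core load yourself rather than squinting at Task Manager, a quick script along these lines prints it out. This is only a sketch, and it assumes the third-party psutil package (nothing anyone in this thread mentioned), but the idea is the same as watching both core graphs:

[code]
# Minimal sketch: print per-core CPU load, same idea as watching both
# core graphs in Task Manager.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

# Sample every logical core over a one-second window.
per_core = psutil.cpu_percent(interval=1, percpu=True)

for core, load in enumerate(per_core):
    print(f"Core {core}: {load:.0f}% load")
[/code]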

The Pentium D is only $130... how can you not be tempted by this 4.1GHz bargain?

Last edited by lord_shar; May 24, 2006 at 09:02 PM // 21:02..
lord_shar is offline   Reply With Quote
Old May 24, 2006, 11:09 PM // 23:09   #22
Forge Runner
 
lightblade
 
Join Date: May 2005
Guild: The Etereal Guard
Profession: Me/Mo

The FX series is just too damn expensive, and doesn't offer much improvement!

$700+ for just 512KB of L2 cache... way not worth it.
lightblade is offline   Reply With Quote
Old May 24, 2006, 11:23 PM // 23:23   #23
Jungle Guide
 
Lurid
 
Join Date: Mar 2006
Profession: Mo/

Current FX-series CPUs have a minimum of 1MB of L2 cache per core. That's not the major feature, though; people buy them for the high stock clocks, the name, and the unlocked multiplier, which is only useful if you're overclocking and/or are too lazy to overclock the usual way.
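For what it's worth, the arithmetic behind why the unlocked multiplier matters is simple: core clock is just the reference clock times the multiplier, so an unlocked chip can be pushed without dragging memory and the HT link along. A rough sketch with made-up numbers (illustrative only, not the specs of any particular FX chip):

[code]
# Rough arithmetic behind multiplier vs. reference-clock overclocking.
# All numbers are illustrative, not the specs of any particular chip.

def core_clock(ref_mhz, multiplier):
    # Effective core clock = reference (HTT/FSB) clock x multiplier.
    return ref_mhz * multiplier

print(core_clock(200, 14))   # stock: 200 x 14 = 2800 MHz
print(core_clock(200, 15))   # unlocked multiplier: 3000 MHz, memory/HT left at stock
print(core_clock(214, 14))   # locked multiplier: raise the reference clock instead
                             # (~3000 MHz), which also pushes memory and the HT link
[/code]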
Lurid is offline   Reply With Quote
Old May 25, 2006, 01:07 AM // 01:07   #24
Academy Page
 
Join Date: Jul 2005
Guild: Leviathan's Wake
Profession: W/Me

Quote:
Originally Posted by Lurid
^-- Your Opteron section is a little off. You fail to take into consideration the s939 Opterons, which not only overclock higher but also use less voltage at stock and have more L2 cache than most of the A64s you would consider buying. IMO there are only two dual-cores worth mentioning, the AMD Opteron 165 and the AMD X2 3800+. If possible, get the 165.
Well, again, it's still around $330. In this case I would still opt for the San Diego 4000+ based purely on gaming performance; my post was in reference to how these stack up in gaming benchmarks. The extra cache isn't going to matter much in games. When you start using the multitasking capabilities of the X2 / high-end Opteron, then it gets interesting.
Sax Dakota is offline   Reply With Quote
Old May 25, 2006, 04:27 AM // 04:27   #25
Desert Nomad
 
Seef II
 
Join Date: Nov 2005
Location: US
Profession: R/Mo

Just to clear up some misinformation:
Quote:
Xeon - This guy is optimized for the 64bit world. It doesn't run as efficient when placed in a 32bit world imho. But if you are running a server and or ultra high end at home WORKSTATION ( 3ds Max, heavy rendering etc.) where you would need the extra memory go this route.

Opteron stuff
Opterons are server-grade Athlon 64s. Exact same silicon; the Opterons are just tested more rigorously.
Xeons are server-grade Pentium 4s (Prescott). Again, it's the same silicon, just validated through a different process. There are some differences, like early Prescotts (5x0, 5x5 series) not having EM64T while the early Nocona Xeons did, or a 2 GHz dual-core AMD64 with 1MB x 2 of L2 (the 165, I believe? I'm more of an Intel guy), but the chips are otherwise identical. A P4 551 at G1 stepping will perform the same as a Xeon 3.4 at G1 stepping, 64-bit or 32-bit. Within a family, of course, 64-bit vs. 32-bit performance will be relatively the same.
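If you want to see what a given chip actually reports, on a Linux box you can just read /proc/cpuinfo: the "lm" (long mode) flag is the 64-bit support bit, and the model/stepping fields tell you which revision you have. A rough sketch, assuming Linux and the standard /proc/cpuinfo field names (Windows users would reach for CPU-Z instead):

[code]
# Sketch: read /proc/cpuinfo (Linux only) and report 64-bit support and stepping.
# The "lm" (long mode) flag indicates EM64T/AMD64 capability.

info = {}
with open("/proc/cpuinfo") as f:
    for line in f:
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        info.setdefault(key.strip(), value.strip())  # keep the first core's values

flags = info.get("flags", "").split()
print("model name :", info.get("model name"))
print("stepping   :", info.get("stepping"))
print("64-bit (lm):", "yes" if "lm" in flags else "no")
[/code]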

Quote:
The FX series is just too damn expensive, and doesn't offer much improvement!
$700+ for just 512KB of L2 cache... way not worth it.
All FX processors have had 1MB of L2, even the old Socket 940 FX-51 and FX-53.
Seef II is offline   Reply With Quote