Guild Wars Forums - GW Guru
 
 

Forest of True Sight > Technician's Corner

Closed Thread
Old Apr 06, 2005, 05:42 AM // 05:42   #1
Ascalonian Squire
 
Didymus C. Corax
 
Join Date: Mar 2005
AMD 64 socket 939 or not?

Since I've read a few tech threads and appreciated the quality of responses from this fine community *pats everyone on back*... I thought I'd take a jab at it with my situation...

I've got about $500 burning a hole in my pocket... My current system looks something like this:

AMD 3200+ Socket A
MSI KT6 Delta Mobo
2x Seagate Barracuda 80 GB HDs in RAID 0
512 MB RAM (generic)
MSI nVidia FX 5600 video adapter (gimme no crap about the card... it works great, no problems: UT2004 on full with no issues, and Doom 3 and Half-Life 2 at near full except 4x AA (2x is fine))
WinXP Home OS

My current inclination is to get a new mobo, vid card and CPU...I can afford:

AMD 64bit 3000+ CPU socket 939 $169.99
DFI LANPARTY UT nF4 Ultra-D S939 PCIE $169.99
(note the potential for an SLI mod on this board...hmmmm)
nVidia 128MB eVGA 6600GT PCI-Express $179.99

(costs are about average for the area) With tax and all I'm pushing $600, but my wife is understanding about these things (sorta)...

Before I jump into this... I'm curious about the experience of other folks whose opinions I trust (this means you, GW community). I know GW plays sweet on my machine as it is, but the desire to improve on it is nearly overwhelming... Hi, I'm a tech-junkie, and I need your help.

Is it worth it to make this particular upgrade (think beyond GW if you must)?
Should I just add a new stick of RAM and call it good?
Will this stuff be outdated next week?
Am I throwing my money away? (not that I care)
Will I get any real performance advantages?

I don't personally know any 64 users yet, so I'm nervous... hold my hand and tell me what you think... be brutally honest. I might be the first kid on my block with a new toy.

Ready? Set... DISCUSS!!! *starts stopwatch, slurps Jell-O through a straw, sics basset hound on the neighbor's cat... again*
Didymus C. Corax is offline  
Old Apr 06, 2005, 06:25 AM // 06:25   #2
Desert Nomad
 
 
Join Date: Mar 2005
Location: Seattle, Washington
Profession: R/E
Default

That new setup will be good, and you'll have the option to add more RAM and a second video card later.

I would get it; GW would look good.
Lews is offline  
Old Apr 06, 2005, 07:07 AM // 07:07   #3
Ascalonian Squire
 
Khelnozz
 
Join Date: Mar 2005
Guild: The Fireblades
Profession: N/W
Default

Good choice.

That is close to what I am running:
AMD 64 3000+
MSI Neo4 Platinum nForce4
MSI TD-6600 GT
1 GB Corsair RAM, 2.5-3-3-8
I am very pleased with my system; Guild Wars looks just gorgeous, and every bell and whistle is maxed with very little frame drop.

Just be sure that you get the Winchester revision of the AMD 64. They fixed some overheating issues that the Newcastle had. The Winchester is the one on the 0.09-micron process; the Newcastle is on the 0.13-micron process. That system also has tons of upgrade potential for later on, when other things come down in price. (drools thinking about a 4000+ or FX-55)

Anyway, my opinion is that you can't go wrong with an upgrade that sets you up for a few years of small incremental upgrades as you have the money to do so.
Khelnozz is offline  
Old Apr 06, 2005, 04:10 PM // 16:10   #4
Underworld Spelunker
 
Join Date: Feb 2005
Default

If you have the money to blow, go for it.

Just be sure to get the retail CPU rather than OEM, because of the difference in warranty.
Loviatar is offline  
Old Apr 06, 2005, 11:10 PM // 23:10   #5
Frost Gate Guardian
 
SSE4
 
Join Date: Apr 2005
Default

Quote:
Originally Posted by Didymus C. Corax
AMD 64bit 3000+ CPU socket 939 $169.99
DFI LANPARTY UT nF4 Ultra-D S939 PCIE $169.99
(note the potential for an SLI mod on this board...hmmmm)
nVidia 128MB eVGA 6600GT PCI-Express $179.99

Before I jump in to this...I'm curious about the experience of other folks whose opinions I trust (this means you GW community). I know GW plays sweet on my machine as it is, but the desire to improve on it is nearly overwhelming...Hi, I'm a tech-junkie, and I need your help.

Is it worth it to make this particular upgrade (think beyond GW if you must)?
Should I just add a new stick of RAM and call it good?
Will this stuff be outdated next week?
Am I throwing my money away? (not that I care)
Will I get any real performance advantages?

Ready? Set...DISCUSS!!! *starts stop watch, slurps jell-o through a straw, sics basset hound on the neighbors cat...again*
What do I think about the upgrade? To be honest, depending on its model, the XP is likely to outperform the 1.8GHz AMD64 in gaming and most other uses. I would say you could overclock with the DFI, but the low-end models are difficult to overclock because their multipliers are low and locked.

That CPU is the low end of the AMD64 market, and with the introduction of the Venice cores (which is slated to happen very soon) the current chips should come down in price, because the Newcastle isn't going to outperform Venice in any way. Chances are that is the core you will get, unless it's a Winchester. I don't believe an upgrade to a 3000+ Socket 939 AMD64 brings any real performance boost, though I haven't done the upgrade myself; even though the AMD64 is more efficient, it's also a 400MHz step down, and the 3200+ is still a good processor for gaming. However, the video card may cancel out any performance you'd give up by going to the AMD64.

If that video card is a decent amount better than your current one, then you are likely to see a fair performance boost in gameplay and playable settings, although this is all just speculation. The upgrade itself should indeed give you higher playable settings, but I'd be cautious about predicting exactly how big the boost will be, except from the video card.
SSE4 is offline  
Old Apr 06, 2005, 11:19 PM // 23:19   #6
Underworld Spelunker
 
Join Date: Feb 2005
Default

I notice you are keeping that 512 MB of RAM.

For now I would try getting a gig of name-brand (Corsair, Micron, Kingston, etc.) value RAM in a matched-pair package.

The jump in RAM might make you happy enough to wait and see if something better and cheaper comes along.

Then you can swap your better RAM into that.

EDIT

I don't think you will notice much improvement except from the video card.

EXAMPLE (MINE):

Lousy 2100+ XP with a 5600 Ultra, which is much better than a 5200:

3DMark05 score of 407.

Switched to a 6600 GT (Leadtek, with included OC utility and 3-year warranty):

3DMark05 score of 2746.

Runs at max quality with 4x AA and no noticeable slowdown.

Last edited by Loviatar; Apr 06, 2005 at 11:25 PM // 23:25..
Loviatar is offline  
Old Apr 07, 2005, 02:43 AM // 02:43   #7
Ascalonian Squire
 
Feradin Trueshot
 
Join Date: Apr 2005
Location: Chicago
Default

Wow Didymus C. Corax, we are in the same boat. I am currently running:

2.6 northwood
nVidia FX 5200
1 gig of ram

As soon as Venice comes out for AMD, I am going to get:

3000+ Venice AMD64
Asus A8N-E mobo
BFG 6600gt
Seagate Barracuda 80gig
another 512 MB stick (1 GB total) of Corsair XMS 2-3-3-6 (going to retire a stick of value RAM).

So I have decided it is worth it to spend the money on the upgrade. Mine is going to cost about $600 as well. As you can see, though, I am running an old, not-so-great AGP 5200, so for me this is going to be a huge improvement since I'm moving to PCI-E, which makes it well worth the money. If you are satisfied with your graphics card, though, you may just want to buy some new RAM; that might be a cheap fix. I noticed a big improvement when I went from 512 MB to 1 GB.

-Hope that helps
Feradin Trueshot is offline  
Old Apr 07, 2005, 06:08 PM // 18:08   #8
Banned
 
Join Date: Mar 2005
Location: Kansas
Default

Personally, I'd save it until dual-core comes down in price. Then you will have something to really upgrade for. Plus, there are really no programs that use the 64 bits yet (until Longhorn, or unless you're using certain flavors of Linux), so you won't see any performance boost from the processor. If you cut the processor, then you can cut the mobo. Then spend your money on your video card and another stick of RAM, and bide your time. Patience is key to getting great deals on computer equipment.

Lansing Kai Don

P.S. Someone correct me if I'm wrong, but are there any programs that use the 64 bits? I am still under the impression that 32-bit mode is the norm. A sign extension is performed to make values 64 bits wide, but that will not improve performance.
Lansing Kai Don is offline  
Old Apr 07, 2005, 06:21 PM // 18:21   #9
Lion's Arch Merchant
 
Darkmane
 
Join Date: Feb 2005
Default

Quote:
Originally Posted by Lansing Kai Don
Personally, I'd save it till dual core comes down in price. Then you will have something to really upgrade for. Plus, there are really no programs that utilize the 64-bits (till Longhorn or if your using certain flavors of Linux). So you won't see any performance boosts due to the processor. If you cut the processor, then you can cut the mobo. Then spend your money on your video card, and another stick of RAM. Then bide your time. Patience is key with getting great deals on computer equipment.

Lansing Kai Don

P.S. Someone correct me if I'm wrong, but are there any programs that utilize the 64-bits? I am still under the impression, that 32-bit mode is the norm. A sign extension is performed to make it 64-bits but that will not improve performance.
Here I am again! I just won't go away!!

I second your opinion to just wait on a new mobo and chip. I would definitely spend a bit on some good memory, at least 1 GB if you can; and if you want, grab a new video card for your future motherboard and chip.

Lansing, without a 64-bit OS, 64-bit applications won't run. So unless you are using a 64-bit OS (and most people are not), you are not running any 64-bit applications. It's backwards compatible (32-bit apps will run on a 64-bit OS), but I am pretty sure not the other way around.

GuildWars~
Forget what you thought you knew about online gaming.
Darkmane is offline  
Old Apr 07, 2005, 06:44 PM // 18:44   #10
Banned
 
Join Date: Mar 2005
Location: Kansas
Default

I believe you're right, Darkmane, but I wasn't "positive". I use a 64-bit Linux (but not on a 64-bit processor, lol). Anyway, no, it shouldn't be possible for a program to run as 64-bit under a 32-bit OS. Communication between a program and the processor must first go through the OS (via system calls), and I think if you tried, either the OS would prevent it, or the CPU would send a trap to the OS and you would have a wonderful blue screen to look at (it should go away with a restart; it would be in the form IRQL_NOT_LESS_OR_EQUAL). I wasn't absolutely positive and didn't want to give someone false information. Maybe Windoze hands control of the OS to the program? I didn't think so, but you never know.

Lansing Kai Don
Lansing Kai Don is offline  
Old Apr 07, 2005, 06:53 PM // 18:53   #11
Munchking
 
Ellestar
 
Join Date: Mar 2005
Location: Russian Federation, Moscow
Guild: Ladder to Hell (ATM playing with Rus Corp)
Default

Get an additional 512 MB of RAM (it's a good idea to have 1 GB) and a GeForce 6800GT (the FX series is awful in newer games with shaders); don't waste money on a CPU and motherboard. You'll spend less money and get a more powerful computer for games.
You could also spend some money on a good mouse (I recommend the MX510), a good mousepad for !optical! mice (good = $20+ price), and a sound card with 5.1 or better; it will really help in the first-person shooters you mentioned. Actually, those help more than a few extra frames per second, so they're well worth the price.

My 3DMark05 score is 3338. You'll get ~20% more than that with only a video card upgrade (to a GeForce 6800GT) and newer drivers. So that's a ~4000 3DMark05 score instead of ~400.
__________________
Knowledge is Power!
Russian Guild Wars fansite staff http://www.guildwars.ru/

Last edited by Ellestar; Apr 07, 2005 at 08:21 PM // 20:21..
Ellestar is offline  
Old Apr 07, 2005, 07:01 PM // 19:01   #12
Lion's Arch Merchant
 
Darkmane
 
Join Date: Feb 2005
Default

Quote:
Originally Posted by Lansing Kai Don
I believe your right Darkmane, but I wasn't "positive". I use a 64-bit Linux (but not a 64-bit processor lol). Anyways, no it shouldn't be possible for a program to run under a 32-bit OS as 64-bit. The only communication between a program and the processor must first go throught the OS (called system calls). And, I think, if you tried (either the OS will prevent it), or the CPU would send a trap to the OS and you would be having wonderful blue screen to look at (should go away with a restart... it would be in the form IRQL_LESS_EQUAL). I wasn't absolutely positive and didn't want to give false information to someone. Maybe Windoze gives control of the OS to the program? I didn't think so, but you never know?

Lansing Kai Don
Yes, I am pretty sure I'm right, but I posted in one of my other techy forums just to be sure. I don't mind posting what I think is right, and then posting that I was horribly wrong later if I am, LOL.

It might be doable with a software-level interpreter, and some programs may ship both 64- and 32-bit code. But if you're using a 64-to-32-bit interpreter, I'd say you're defeating the purpose of 64-bit programming to begin with: speed. I have seen rendering programs compared side by side: a 64-bit program running on a 64-bit OS, a 32-bit program running on a 64-bit OS, and that same 32-bit program running on a 32-bit OS. Speed-wise, 64-bit will be GREAT for those doing a lot of graphics work and rendering. I am not familiar with any game that takes advantage of the 64-bit 'pipe', so to speak, but I bet it's got to widen the eyes of some developers.
Darkmane is offline  
Old Apr 07, 2005, 08:24 PM // 20:24   #13
Banned
 
Join Date: Mar 2005
Location: Kansas
Default

It shouldn't be possible in the first place (theoretically speaking), with a sign extension from 32 to 64 bits being the only means (i.e. 'h8FFFFFFF turns into 'hFFFFFFFF8FFFFFFF, and 'h7FFFFFFF turns into 'h000000007FFFFFFF). Since the OS can only 'handle' 32-bit programs, the CPU (more precisely the ALU) places a sign extension on the data. The only thing I could think of is if the OS literally hands the system resources over to a program (this would cause disaster... talk about security problems), but on some homemade OSes that's exactly what they do... so I didn't want to rule out someone hacking Windows (or Windows even supporting this ability on an "it isn't our fault" basis)... that's why I say someone correct me if I'm wrong (because I don't know everything... in fact I don't KNOW anything, I assume I know).


Lansing Kai Don
Lansing Kai Don is offline  
Old Apr 08, 2005, 02:27 AM // 02:27   #14
Frost Gate Guardian
 
SSE4
 
Join Date: Apr 2005
Default

Code for games and other applications must be extended in order to take advantage of 64-bit processors; a 32-bit processor does not support 64-bit code at all. A 64-bit processor has legacy mode (for 32-bit applications on a 32-bit OS), compatibility mode (for 32-bit applications on a 64-bit OS), and of course pure 64-bit mode for a 64-bit OS. The main reason to upgrade to an AMD64 is not that it is 64-bit, but that it is more efficient than an Athlon XP (though less efficient than a Pentium M): it does more IPC, and the high floating-point throughput (thanks to all the instruction sets the processor supports) makes it very good at in-game calculations.
SSE4 is offline  
Old Apr 08, 2005, 02:41 AM // 02:41   #15
Banned
 
Join Date: Mar 2005
Location: Kansas
Default

Quote:
Originally Posted by SSE4
Code for games and other applications must be extended in order to compensate for 64 bit programs/processors. Therefore a 32 bit processor does not support 64 bit. A 64 bit processor has Legacy mode (For 32 bit applications in a 32 bit OS), Compatability mode (For 32 bit applications in a 64 bit OS) and of course pure 64 bit for a 64 bit OS. The main reason to upgrade to an AMD64 is not because it is 64 bit, but because it is more efficient than an AMD XP (But less efficient than a Pentium M). Therefore it does more IPC and is very good for gaming, especially due to the high number of floating point calculations (Due to all the instruction sets the processor has) it makes it really good at in-game calculations.

Well, good to know I was right again. Where does your source come from? Efficient? Unless they revamped the instruction set, I seriously doubt it; I'm looking at amd.com and not seeing it. Can you send me a link? Oh, and the Pentium M is not any more efficient than the Pentium 4 series (except heat- and power-wise); I have that from someone working at Intel, and I'll look for the source. They're just the cream-of-the-crop processors from the line. I also found on AMD's website that there are no emulation modes for their processors.

Lansing Kai Don
Lansing Kai Don is offline  
Old Apr 08, 2005, 02:47 AM // 02:47   #16
Frost Gate Guardian
 
SSE4
 
Join Date: Apr 2005
Default

Quote:
Originally Posted by Lansing Kai Don
Well, good to know I was right again, where does your source come from? Efficient? Unless they revamped the instruction set, I seriously doubt it. I'm looking at amd.com and not seeing it, can you send me a link? Oh, and Pentium-M is not anymore efficient than the Pentium series (except heat and power wise) I have that from someone working at Intel, I'll look for the source, their just the cream of the crop processors from the line. I found on AMD's website that there is no emulation modes for their processors.

Lansing Kai Don
The Pentium M performs more IPC, and therefore does more work per cycle, than an AMD64 processor; this shows up in almost any benchmark. By "efficient" for the AMD64 I also mean the amount of work the processor does per cycle, i.e. IPC. Because it does more work per cycle than an Intel Pentium 4, it performs better on a clock-for-clock basis, similar to the Pentium M. Also, the Pentium M has an extremely "efficient" L2 cache: the architecture is very good at working with its large 2MB L2, so unlike the Pentium 4 6xx processors it can find what it is looking for a lot faster, with less latency. There is no comparison between a Pentium M and a Pentium 4; the M is simply better.

My source is a lot of reading and personal research on both processor companies and how their processors work: endless inspection of benchmarks and raw information on the two. I don't rely on any one benchmark, because that would be a partially biased picture.

The AMD64 also has more instruction sets than the Intel, which means it does more floating-point calculations. It has MMX and MMX+ (non-floating-point), then 3DNow!, 3DNow!+, SSE, and SSE2; the Venice cores also add SSE3. Compare an Intel Pentium 4 Prescott, which has MMX, SSE, SSE2, and SSE3 (Northwood cores do not have SSE3).
SSE4 is offline  
Old Apr 08, 2005, 03:05 AM // 03:05   #17
Banned
 
Join Date: Mar 2005
Location: Kansas
Default

That still does me no good unless you get it from a reputable source. Also, the L2 cache in any processor SHOULD be operating at the clock speed of the processor (meaning little to no latency); that should be an easy one if you've researched processors. I don't know the IPC for those processors; I haven't looked it up, but I will now. It would be better if you gave me your sources instead of just saying you did the research. Did you just say there is no difference between the Pentium M and the Pentium 4, yet the Pentium M is more efficient? Confused? Please clear this up... never mind, I reread the post. I will have to get that source; give me a few.

Lansing Kai Don

P.S. More instruction sets does not equal more floating-point calculations. "More floating-point instruction sets to perform more accurate floating-point calculations" is a better statement. I could add a lot of useless instruction sets to my processor and claim the equivalent. Maybe you remember the RISC vs. CISC wars? Do you know which one came out on top then? Now?

Last edited by Lansing Kai Don; Apr 08, 2005 at 03:09 AM // 03:09..
Lansing Kai Don is offline  
Old Apr 08, 2005, 03:10 AM // 03:10   #18
Frost Gate Guardian
 
SSE4
 
Join Date: Apr 2005
Default

Quote:
Originally Posted by Lansing Kai Don
Still does me no good, unless you get it from a reputable source. Also the L2 Cache in any processor SHOULD be operating at the clock cycle of the processor (means little-no latency). That should be an easy one if you researched processors. I don't know the IPC for the processor's I haven't looked it up, I will now, but it would be better if you gave me your sources instead of just saying you did it.

Lansing Kai Don
You can easily measure the number of clock cycles it takes a processor to fetch from its cache using benchmarking programs, and instruction timings can be measured the same way. I would suggest looking up some benchmark information; it will show you that by increasing the cache, you increase the amount of "cache" memory the processor must search, which naturally creates latency. It's not a good idea to assume that the processor searches the cache without latency; that's like assuming RAM doesn't have latency, which is false. Some modules have a CAS latency of 3, others 2.5, and some of the faster DDR400 modules (or slower, of course) have 2. I don't remember the sites. The best way for you to find out is to go looking for the information yourself. I wasn't spoon-fed, and I won't give you special treatment because you don't believe me.

On a side note, a Pentium 4 takes longer to search through its cache than a Pentium M, because the Pentium M is more efficient. Sure, the Pentium 4 is fast and the cache is localized, but there are still many factors to take into consideration.

Last edited by SSE4; Apr 08, 2005 at 03:13 AM // 03:13..
SSE4 is offline  
Old Apr 08, 2005, 03:19 AM // 03:19   #19
Academy Page
 
Join Date: Feb 2005
Default

If I may, I too am upgrading my PC very soon and had AMD64 in mind. After reading this thread, I started questioning myself...

Can someone please explain the differences between Winchester and the upcoming Venice? How long until it comes out, and how expensive are they expected to be? (twice as much as the Winchester, etc.)

BTW, noob at hardware here, please be nice when talking about tech stuff. A little explaining would be appreciated. And sorry for the quick takeover on the subject.
IdNotFound is offline  
Old Apr 08, 2005, 03:24 AM // 03:24   #20
Frost Gate Guardian
 
SSE4
 
Join Date: Apr 2005
Default

Quote:
Originally Posted by IdNotFound
If I may, I too am upgrading my PC very soon and had AMD64 in mind. After reading this thread, I started questioning myself...

Can someone please explain the differences between Winchester and the upcoming Venice? How long until it comes out and how expensive are they expected to be? (twice as much the Winchester, etc.)

BTW, noob at hardware here, please be nice when talking about tech stuff. A little explaining would be appreciated. And sorry for the quick takeover on the subject.
The Winchester is better than the Newcastle, but said to be a lot worse than Venice. It's built on the 90nm process, which means it uses less energy and runs slightly cooler, though it's debatable whether you'd notice any performance difference beyond those two things. Nevertheless, it's a more viable option than any Newcastle core, because the Winchester is better. To be honest, I don't know how the Venice cores will be priced, so if someone else knows I'd be delighted to hear it. I do know that the Venice cores will add SSE3 instructions, and it also appears they will run a fair bit cooler than current models. They have also been reported to overclock more easily than all previous AMD64 models, which could mean that if you buy a cheaper model you may be able to overclock it an extra 100-500MHz (depending on how far you push it) and save money. My own feeling (so no scientific or economic document backs this) is that the Winchester, but especially the Newcastle, will likely drop in price with the introduction of the Venice cores. They won't be as good, but if they drop in price you can't really argue with the value.

It's safe to assume that the Venice cores will cost a fair bit more than current AMD64 models, but I honestly don't know.
SSE4 is offline  
Powered by: vBulletin
Copyright ©2000 - 2016, Jelsoft Enterprises Ltd.