Guild Wars Forums - GW Guru
 
 

Jan 13, 2006, 11:18 AM   #1
Frost Gate Guardian
 
Join Date: Jul 2005
Location: Hawaii
Guild: FPS
Profession: Mo/Me
Quad GPUs?

Old news already....

Dell XPS600 Renegade
--Quad nVidia SLI: two dual-GPU 1GB nVidia GeForce 7800 GTX cards
--Intel Pentium Extreme Edition 955 dual-core, overclocked to 4.26GHz
--2GB Dual channel DDR2 SDRAM at 667 MHz
--2x 10,000 RPM 150GB Raptor drives
--850 Watt Power supply

Supposed to be available this quarter.

Very interesting stuff. Should destroy anything from the boutique builders if Dell really ships it.
easyg is offline  
Jan 13, 2006, 12:30 PM   #2
Jungle Guide
 
M1h4iL
 
Join Date: Apr 2005
Location: Perth, Australia

Interesting to see Dell ship something THAT good; it must cost an arm and a leg :/
M1h4iL is offline  
Jan 13, 2006, 01:50 PM   #3
Lion's Arch Merchant
 
Narada
 
Join Date: Sep 2005
Location: United States
Guild: Clan Foxrunner
Profession: R/P

Quote:
Originally Posted by M1h4iL
Interesting to see Dell ship something THAT good; it must cost an arm and a leg :/
I bet. That thing's beautiful.
Narada is offline  
Jan 13, 2006, 02:25 PM   #4
Desert Nomad
 
LiQuId StEeL
 
Join Date: Jul 2005
Location: /u/liquidsteel30
Guild: Ego Trip From Rank [ZERO]
Profession: W/Mo

Too bad it's an Intel.

I also heard whispers of a $7,000-8,000 price point :O
LiQuId StEeL is offline  
Jan 13, 2006, 02:42 PM   #5
Desert Nomad
 
Xenrath
 
Join Date: Oct 2005
Profession: W/Me

My "old" 3Ghz, 1Gb, X800 rig can still run the latest games at top resolutions and full detail without any noticeable slowdown, and probably costs 1/10th this beast lol I suppose if you've money to burn...
Xenrath is offline  
Jan 13, 2006, 02:46 PM   #6
Lion's Arch Merchant
 
temp
 
Join Date: Nov 2005
Location: my bedroom
Guild: Band Of Death UK

argh dell
argh intel

can't wait for the public release of the cards... sweet
temp is offline  
Jan 13, 2006, 02:55 PM   #7
Middle-Age-Man
 
Old Dood
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo

Dell has their good points. They will sell computers every now and then at a fantastic price; you just have to be quick to catch those "deals". However, they have always been kind of weak on video cards. Sometimes it is better to buy a retail card than to get one from Dell. I ended up doing that on my last two Dells. The Dimension 5100 I ordered from them only offered the OEM ATI X600 as the "best" card, with no fan on it either... I gave it away and bought a retail ATI X800XL. So I am curious to see what kind of cards they would use in these systems. Will they be OEM? Probably...
Old Dood is offline  
Jan 13, 2006, 05:11 PM   #8
Desert Nomad
 
LiQuId StEeL
 
Join Date: Jul 2005
Location: /u/liquidsteel30
Guild: Ego Trip From Rank [ZERO]
Profession: W/Mo

http://www.nvidia.com/object/IO_28569.html

http://www.dell.com/html/us/products/ces/index.htm

Two dual-7800GTX cards running in SLI x4. ASUS already had cards like these out a few months ago, but they were a limited run of 2,000 that only went to the media for review. I'll see if I can find the review...

EDIT! http://www.tomshardware.com/2005/12/...uad_gpu_setup/
LiQuId StEeL is offline  
Jan 13, 2006, 05:19 PM   #9
Furnace Stoker
 
EternalTempest
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/

Impressive technology, and ATI will most likely be doing it too (unlike with SLI, where they bet it would "not take off," turned out to be wrong, and had to play catch-up, giving nVidia about a one-year development head start).

I still don't think drivers are up to par to unlock anywhere near the performance these could offer at the moment.
EternalTempest is offline  
Jan 13, 2006, 06:35 PM   #10
Middle-Age-Man
 
Old Dood
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo

LiQuId StEeL

That is cool. I am watching the Dell presentation right now. Wow... makes my little X800XL look like a 32MB TNT card. hehehe
Old Dood is offline  
Jan 13, 2006, 08:16 PM   #11
Dex
Wilds Pathfinder
 
 
Join Date: Dec 2005
Location: Chicago, IL
Guild: Black Belt Jones
Profession: R/Me

Ok, but what are you going to run on it? Current applications barely utilize dual-GPU SLi, and the ones that do start showing signs of being limited in other areas. Seems like a waste of money to me, unless bragging rights are worth that much to you.

I used to drool over specs, but now I have to ask myself, "what is this actually going to do for me?" Quad-GPU SLi is not going to enhance your gaming experience in any way at the moment. By the time your quad 7800GTX system is actually useful for something there will be newer GPUs that can get the same fps at the same detail levels with 1 or 2 GPUs.

Why waste your money?
Dex is offline  
Jan 13, 2006, 09:44 PM   #12
Middle-Age-Man
 
Old Dood
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo

In time they will be very useful. Hell... I would not pay for it now. I am a "Couple Techs Behind Dood...". This is why my X800XL PCIe is just fine for me now. It is way better than anything I previously owned and it didn't kill me price-wise.

Dex... did you watch any of that keynote speech by Michael Dell? That new system with the new 30" monitor would be nifty to have... if you have a trust fund. Personally I like seeing what is new and improved... that tells me what we will have as standard equipment in 5 years.

EDIT: I just priced out an XPS system for the hell of it, and the lowest-end processor is the highest-end one that I could get on my Dimension 5100: a 650-series 3.4GHz HT. That made me laugh. I am quite happy with my processor even if it is the lowest-end one for the XPS.

Last edited by Old Dood; Jan 13, 2006 at 09:46 PM.
Old Dood is offline  
Jan 13, 2006, 10:25 PM   #13
Dex
Wilds Pathfinder
 
 
Join Date: Dec 2005
Location: Chicago, IL
Guild: Black Belt Jones
Profession: R/Me

Quote:
Originally Posted by Old Dood
In time they will be very useful. Hell... I would not pay for it now. I am a "Couple Techs Behind Dood...". This is why my X800XL PCIe is just fine for me now. It is way better than anything I previously owned and it didn't kill me price-wise.

Dex... did you watch any of that keynote speech by Michael Dell? That new system with the new 30" monitor would be nifty to have... if you have a trust fund. Personally I like seeing what is new and improved... that tells me what we will have as standard equipment in 5 years.

EDIT: I just priced out an XPS system for the hell of it, and the lowest-end processor is the highest-end one that I could get on my Dimension 5100: a 650-series 3.4GHz HT. That made me laugh. I am quite happy with my processor even if it is the lowest-end one for the XPS.
I think we're saying basically the same thing here, Dood. All I'm saying is that buying a setup like this now, IMHO, would be an enormous waste of money. I'm sure that in the future there will be software that can use that amount of power, and systems that won't bottleneck in other places and render the intense GPU power useless. What I'm saying is that by the time your quad-GPU setup is useful, there will be hardware available that gives the same performance in a single or dual-GPU configuration and will VERY likely be much cheaper. Why not wait for better, cheaper hardware that is more in sync with current software, instead of blowing an insane amount of money on hardware that you won't utilize anytime soon?

I have a reason for taking this stance beyond just my predictions (which I still stand by). For those of you who don't know, SLI is not a new technology; it's been around since the late '90s. The first company to use SLI video for consumer-level entertainment was a now-defunct company called 3dfx. They had a 3D accelerator card (back then many were add-on cards, not the primary video device) called the Voodoo2 that could be used in a dual-card configuration, and it was called SLI then, too (3dfx's SLI stood for Scan-Line Interleave; nVidia's modern version is Scalable Link Interface). nVidia came along and slowly (or relatively quickly, depending on how you look at it) overtook 3dfx, and I believe eventually acquired them. The point is, SLI was eventually dropped altogether. Why? For the reason I stated above: by the time the software caught up with what SLI had to offer, there was newer, cheaper hardware that could perform on the same level.

So that begs the question: why is the software always behind the hardware? It's pretty simple. The bleeding-edge hardware is only purchased by the enthusiast market niche, which is rather small. Now, granted, it has become MUCH larger than it was in the days of 3dfx, but in the grand scheme it's still pretty minuscule. Software companies can't afford to make their games' requirements so high that only the enthusiast market can run them; they wouldn't sell enough copies to make it worth the high cost of development. It's also very difficult to make a game that runs well on a wide range of hardware whilst taking advantage of what the cutting- or bleeding-edge hardware has to offer. I don't see this changing anytime soon. It's already more expensive to maintain a reasonably powerful gaming rig than the average computer owner is willing to shoulder. The PC gaming world loses more software companies every year to game consoles. The consoles are much cheaper, and since everyone has the same hardware, they have fewer compatibility problems and requirements issues. Consoles also have a longer prospective lifespan than a gaming PC (assuming you don't upgrade it) for the same reason.

The more we expect out of our gaming software, the more we bump the non-enthusiast community out of the PC gaming scene. The more mainstream users we bump out, the less profitable it becomes for companies to make games for PCs. Do you see my point? Software will never catch up to these obscene hardware configurations until you can get the same amount of power from a cheaper setup. The market just won't bear it.
Dex is offline  
Jan 13, 2006, 10:43 PM   #14
Middle-Age-Man
 
Old Dood
 
Join Date: May 2005
Location: Lansing, Mi
Profession: W/Mo

Dex... I have a Voodoo2 12MB card still sitting in an old box collecting dust. How I had that system set up: one Voodoo2 and one 8MB ATI AiW card. When I bought the Voodoo2 (for $150.00) I was going to add another one later on... never did. By having the ATI AiW card I could choose which card I wanted to use for whatever program I was running at the time. I mainly used the AiW card because it was cool to have my cable TV hooked up to it and watch TV while I waited forever for my dial-up to download anything. This was around 1996, and the computer was a K6 166MHz.

Last edited by Old Dood; Jan 14, 2006 at 12:29 AM.
Old Dood is offline  
Jan 13, 2006, 10:58 PM   #15
Desert Nomad
 
LiQuId StEeL
 
Join Date: Jul 2005
Location: /u/liquidsteel30
Guild: Ego Trip From Rank [ZERO]
Profession: W/Mo

Final word on quad: unnecessary for ANY game unless you're playing at max AA and AF at resolutions higher than 1600x1200, and even then only in a select few games. Supposedly quad GPUs would make even AMD's FX-60 (the new $1,400 processor) a severe bottleneck. Four GPUs will never be necessary, ever.
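
To see why the CPU becomes the wall, try some illustrative numbers (made up for the example, not benchmarks, and assuming a crude model where GPU work splits perfectly across cards and overlaps with CPU work): say a frame costs 10 ms of CPU time and 20 ms of GPU time. One GPU: the frame takes max(10, 20) = 20 ms, about 50 fps. Two GPUs: max(10, 10) = 10 ms, about 100 fps. Four GPUs: the GPU side drops to 5 ms, but the frame still takes 10 ms, because frames can't be drawn faster than the CPU can prepare them. The third and fourth GPU buy you nothing until the CPU gets faster.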
LiQuId StEeL is offline  
Jan 14, 2006, 12:01 AM   #16
Dex
Wilds Pathfinder
 
 
Join Date: Dec 2005
Location: Chicago, IL
Guild: Black Belt Jones
Profession: R/Me

One last thing from me on this subject.

There's an obvious parallel processing trend in the hardware world right now. It seems like all of the major CPU and GPU producers are moving toward multi-core and multi-processor configurations. I find this interesting because companies like Intel and Microsoft have been unenthusiastic about SMP (symmetric multi-processing) in the home and business desktop environment until very recently.

Why the sudden change of heart, guys?

Could it be that companies like Intel, AMD, and nVidia are anticipating a limitation in their manufacturing process? Are they finally ready to shift some of the burden of Moore's Law over to software programmers?

We've come very close to seeing this before. At some point silicon is just not a viable base material, and as we try to cram more and more components into a silicon integrated circuit, the pathways need to get smaller and smaller. A few years ago the industry thought it was reaching the point where the aluminum pathways in an IC couldn't reliably carry a signal anymore. Finally, some engineers at IBM found a way to make IC pathways out of copper, which hadn't been possible before because the copper would leach into the silicon and ruin the wafer. (Strictly speaking, the copper-interconnect process and SOI, silicon-on-insulator, are two separate IBM advances; both show up in many modern CPUs.) Copper helped make narrower manufacturing processes, down to 90 nanometers, workable, because it can reliably carry a signal on a narrower path.

This was a near miss for the semiconductor industry, because once the silicon IC is no longer viable they will need an alternative design, which is nowhere near any kind of mature state. I'm beginning to wonder if that "wake-up call" is what started them planning their venture into the world of parallel processing at the consumer level. Are they anticipating an inability to keep increasing core complexity and clock speeds within the current chip-design paradigm? Is this going to be their temporary solution to the problem: make all software multi-threaded and keep adding more cores? Is this their attempt to keep Moore's Law, and therefore the future of their profit margins, from stalling sometime soon? Is this getting waaaay too far off the topic of Guild Wars?

Yes, I think so. I'll shut up now.
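
Edit: okay, one more thing after all, to make the "shift the burden to software programmers" point concrete. Below is a minimal sketch of what going multi-threaded actually asks of application code -- illustrative C++ against a standard threads library; the task, names, and numbers are made up for the example, not taken from any real game engine.

Code:
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum one slice of the data; each worker thread gets its own slice.
long long sum_range(const std::vector<int>& data, std::size_t lo, std::size_t hi) {
    return std::accumulate(data.begin() + lo, data.begin() + hi, 0LL);
}

int main() {
    std::vector<int> data(1000000, 1);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<long long> partial(cores, 0);
    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / cores;

    // The programmer, not the chip, has to split the work...
    for (unsigned i = 0; i < cores; ++i) {
        std::size_t lo = i * chunk;
        std::size_t hi = (i + 1 == cores) ? data.size() : lo + chunk;
        workers.emplace_back([&data, &partial, i, lo, hi] {
            partial[i] = sum_range(data, lo, hi);
        });
    }
    // ...and coordinate it afterwards; extra cores are not free speed.
    for (std::thread& t : workers) t.join();

    std::cout << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";
}

The single-threaded version of that is three lines. All those extra cores only pay off if every program gets restructured along these lines, which is exactly the burden shift I mean.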

Last edited by Dex; Jan 14, 2006 at 10:31 AM.
Dex is offline  
Jan 14, 2006, 09:31 AM   #17
Desert Nomad
 
Join Date: Jan 2006
Location: Moon
Profession: Mo/

Could I keep my house warm in the winter just by turning that computer on?

My single GF6800GT runs very hot as it is; now imagine the somewhat hotter GF7800, and FOUR of them. Do they sell 1 MW power supplies for that thing yet? Or do I have to get a diesel generator to provide the extra electricity?

Seriously though: why bother developing new hardware anymore? At least it looks like they aren't. "Let's just cram a dozen of our previous product onto one PCB and be happy." Dual-core CPUs are all good, no problem there. But a quad video card setup with 4 GPU chips?

nVidia and ATI should have a chat with AMD's designers; maybe they could fit their SLI junk into one core. I like playing new games, but there's no way I'm putting two video cards in. That's just so... '90s. Voodoo2, anyone? And now that we have 2 GPUs on one card, it makes me wonder if nVidia let the 3dfx engineers out of their closets and this is what they came up with.

Dex pretty much said it all, and yes, nVidia bought 3dfx. Hence my bet that nVidia let the engineers they got from 3dfx see the sunlight and develop a new card. Too bad they were stuck in Voodoo5 5500 mode. I actually had a Voodoo5 in my old comp ^^ Actually, isn't there a Voodoo5 6000 with 4 GPUs on one PCB? Google time! Kaching!

Last edited by Kaguya; Jan 14, 2006 at 09:42 AM.
Kaguya is offline  
Jan 14, 2006, 10:40 AM   #18
Dex
Wilds Pathfinder
 
 
Join Date: Dec 2005
Location: Chicago, IL
Guild: Black Belt Jones
Profession: R/Me

Quote:
Originally Posted by Old Dood
Dex... I have a Voodoo2 12MB card still sitting in an old box collecting dust. How I had that system set up: one Voodoo2 and one 8MB ATI AiW card. When I bought the Voodoo2 (for $150.00) I was going to add another one later on... never did. By having the ATI AiW card I could choose which card I wanted to use for whatever program I was running at the time. I mainly used the AiW card because it was cool to have my cable TV hooked up to it and watch TV while I waited forever for my dial-up to download anything. This was around 1996, and the computer was a K6 166MHz.
Heh. I had a K6 166 too. I also had two K6-2 233s.

In my closet right now I have my old gaming rig from my senior year of college. It's an old Celeron 300A (Mendocino, baby) overclocked to 450 (for you youngsters: you probably wouldn't have ever heard of overclocking had it not been for those Celerons -- they made it 'cool' to overclock!), a Matrox Millennium II, and 2 x Voodoo2s in SLI!!! It's still assembled and would probably boot into Windows 95 if I turned it on! I later upgraded to an nVidia Riva TNT, but moved that to another rig and put my Voodoo2s back in that one.

Ah, the good ol' days of deathmatching my roommates in Doom 2 and Heretic...

I wonder what would happen if I tried to run GW on my Voodoo2s. Would they blow up? Would it rip a hole in space/time? Maybe I'll try it.

Nah.


Last edited by Dex; Jan 14, 2006 at 10:49 AM.
Dex is offline  
Jan 14, 2006, 02:24 PM   #19
Furnace Stoker
 
EternalTempest
 
Join Date: Jun 2005
Location: United States
Guild: Dark Side Ofthe Moon [DSM]
Profession: E/

AMD K6-III, Aureal SQ2500 (dead 3D audio tech that WAS better than Creative's EAX), ATI Rage (non-Pro), and two Voodoo2s in SLI. Unreal Tournament kicked butt on my old machine, and Diablo II with Glide was not bad either.

Experience with Voodoo2s in a modern system: sometimes the Voodoo cards would take control of rendering a game's 3D (ignoring the main card), causing some interesting things to happen (you want which game to run at which DirectX level?). That, and you have to find homemade altered drivers to get 2k/XP support.

I think there was a project (that had some success) that emulated 3dfx's Glide for use with modern cards, but I'm not sure how stable it was.
(edit - http://www.glidos.net/)

Dex, that is some great insight into something I never thought about, but I think you hit the nail on the head about why there is a shift in tech.

(edit) The Voodoo5 6000 had prototype boards, and some were sent to review sites, but 3dfx could not get enough funding (they were on the verge of broke) to bring the card to market, and then went bankrupt (nVidia picked up its R&D staff and tech). It required an external power supply in addition to pulling power off the AGP slot, and the reviews said it was a very good card (at the time).

Last edited by EternalTempest; Jan 14, 2006 at 02:30 PM.
EternalTempest is offline  
Jan 14, 2006, 04:12 PM   #20
Desert Nomad
 
Alias_X
 
Join Date: Apr 2005

I agree that we will probably not need quad GPUs for a long time to come. More sounds better, so people might think it is better, but think of it this way:
Let's say my computer has 4GB of RAM and I am running Windows XP, which (let's just say this; I know it probably isn't true) uses a max of 256MB of RAM. No matter whether I have 1GB, 2GB, or 4GB, the computer will still only be using 256MB of it; the extra capacity isn't being used.
Alias_X is offline  