Guild Wars Forums - GW Guru
 
 

Old Oct 01, 2009, 06:05 PM // 18:05   #41
Furnace Stoker
 
Elder III's Avatar
 
Join Date: Jan 2007
Location: Ohio
Guild: I Will Never Join Your Guild (NTY)
Profession: R/

^^^looks like the Tesla Computing GPU..... or is it a prototype of the GT300?
Elder III is offline   Reply With Quote
Old Oct 01, 2009, 06:46 PM // 18:46   #42
Desert Nomad
 
Burst Cancel's Avatar
 
Join Date: Dec 2006
Location: Domain of Broken Game Mechanics

Projections on availability and performance are worthless for computer hardware. Until the retail parts are in our machines undergoing real-world performance testing, I'm really not interested in anything manufacturers have to say about their products. I've had more than enough of bullshit paper launches, underperforming products, and shockingly arrogant pricing - oh, and unwarranted forum hype.

And while video card upgrades are nice and all, they're only relevant if we actually have quality games to play. Until developers decide to get off their collective asses and make something worth playing instead of more terrible casual-gamer shovelware, me-too MMOs, and Generic FPS v10.4, it's going to be hard justifying more video card investment. After all, it doesn't take cutting-edge hardware to play Starcraft.
Burst Cancel is offline   Reply With Quote
Old Oct 01, 2009, 07:01 PM // 19:01   #43
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

Quote:
Originally Posted by Rahja the Thief View Post

And what is this DX11 doesn't matter business? GT300 and all mainstream plans for the chip are DX11 capable, and you will find that prices will be far better than the GT200 launch.
http://www.tomshardware.com/news/Nvi...-ATI,8687.html

not that i fault nvidia for making this kind of statement. after all, they don't have a retail product using DX11.

i also agree with Burst Cancel: as i've said in a previous post, GPU speed has already made a core i7 965EE the CPU bottleneck at 2560x1600. if nvidia's cards dropped right now, we'd see the exact same thing. unless you have old hardware and are making a full system upgrade right now, these cards are not worthwhile to upgrade to.
moriz is offline   Reply With Quote
Old Oct 01, 2009, 07:22 PM // 19:22   #44
Wilds Pathfinder
 
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E

hello GT300
mind sending me one?
it's small compared to the HD5870 and HD5870*2 leaked pictures.
riktw is offline   Reply With Quote
Old Oct 01, 2009, 08:32 PM // 20:32   #45
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

Quote:
Originally Posted by Elder III View Post
^^^looks like the Tesla Computing GPU..... or is it a prototype of the GT300?
GT300 is a Tesla cGPU solution for general compute. G300 is the GPU version, but they are very similar.

The version pictured above is the Tesla unit, but suffice it to say, the final retail rendering card will be extremely similar in dimensions and cooling solution (that doesn't mean all manufacturers will use the same model, though).


What is important to understand about GT300 is that it isn't just another GPU. nVidia's primary focus isn't 3D gaming anymore, since the PC gaming market has declined and is trending downward. GT300 addresses many issues with GT200 in terms of general compute, and is a definitive move towards a multi-purpose card that not only serves as a very powerful GPU, but as a powerful CPU-like solution. OpenCL and Windows 7 DirectX Compute will be the single biggest boon that GT300 brings to the table.
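
For anyone who hasn't touched GPGPU: here is roughly what "general compute on a GPU" looks like in CUDA today. This is a minimal, illustrative sketch (a trivial vector add); nothing in it is GT300-specific or under NDA, and it runs on current hardware.

Code:
// Minimal CUDA vector add -- the "hello world" of general-purpose GPU compute.
// Illustrative sketch only; nothing here is GT300-specific.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;                               // device buffers
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);   // launch ~1M threads
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost); // waits for the kernel
    printf("hc[0] = %f\n", hc[0]);                     // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}

OpenCL and DirectCompute expose this same model in vendor-neutral form, which is exactly why they matter here.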

GT300 is meant to be a one-time purchase for Windows 7, at least until CPU technology advances enough to surpass nVidia cGPU compute (which I don't see happening until AMD Bulldozer).

The sheer power of Fermi, coupled with the amazing advancement of hardware-driven code manipulation and caching, is going to really show the muscle of the large-die design.

Here is what you can expect from GT300 vs GT200 (I can't give you hard numbers, but I can reiterate what you will find in the full white paper and press releases):

  • 4x the Anti-Aliasing and Anisotropic Filtering performance at resolutions of 1920x1200 or higher.
  • 10x faster CUDA parallel processing, with active switching into a pure C++ environment.
  • Low power consumption (225 W or less, depending on model).
  • Full DX11 and Tessellation support for Windows 7.
  • Fully parallel CPU/GPU transfers, for 5-6x performance gains (see the streams sketch below).
  • Improved efficiency (140%+) thanks to the MIMD structure and OP lineup (double-precision FP has been improved more than 4x, and is 2.5x faster than even the HD5870 at its peak).
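
On the parallel CPU/GPU transfer bullet: in CUDA terms, that means overlapping PCIe copies with kernel execution using streams. A rough sketch of the idiom as it exists today (not GT300 code - the kernel and chunk count are invented, and overlap requires pinned host memory plus hardware with concurrent copy/execute support):

Code:
// Hedged sketch of copy/compute overlap with CUDA streams -- the idiom behind
// "fully parallel CPU/GPU transfers". The kernel and sizes are invented.
#include <cuda_runtime.h>

__global__ void scale(float* d, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) d[i] *= 2.0f;
}

int main() {
    const int chunks = 4, n = 1 << 20;
    float* h; cudaMallocHost(&h, chunks * n * sizeof(float)); // pinned host buffer
    float* d; cudaMalloc(&d, chunks * n * sizeof(float));
    cudaStream_t s[chunks];
    for (int c = 0; c < chunks; ++c) cudaStreamCreate(&s[c]);

    for (int c = 0; c < chunks; ++c) {
        float *hp = h + c * n, *dp = d + c * n;
        // Each stream's copy-in, kernel, and copy-out can overlap with other
        // streams' work, hiding PCIe latency behind compute.
        cudaMemcpyAsync(dp, hp, n * sizeof(float), cudaMemcpyHostToDevice, s[c]);
        scale<<<(n + 255) / 256, 256, 0, s[c]>>>(dp, n);
        cudaMemcpyAsync(hp, dp, n * sizeof(float), cudaMemcpyDeviceToHost, s[c]);
    }
    cudaDeviceSynchronize();  // wait for all streams to drain

    for (int c = 0; c < chunks; ++c) cudaStreamDestroy(s[c]);
    cudaFreeHost(h); cudaFree(d);
    return 0;
}

The win with this pattern is that transfer time hides behind compute instead of serializing with it.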

Bear in mind, though, that any press release material you will find is for GT300 (not G300, our purely graphics-oriented solution lineup).

I'm happy to answer any questions as they pertain to the press release material you can find online.

I've had 5 sections of my NDA lifted, but only pertaining to direct questions, not unlimited disclosure. Make questions specific, and I'll do my best.
Lord Sojar is offline   Reply With Quote
Old Oct 01, 2009, 10:46 PM // 22:46   #46
Desert Nomad
 
Burst Cancel's Avatar
 
Join Date: Dec 2006
Location: Domain of Broken Game Mechanics

HPC isn't really relevant to home users either. nV has been pushing CUDA and GPGPU for a while, but Joe Sixpack still doesn't give a shit.

Honestly, about the only real benefit that mainstream consumers have gotten from recent advances in computing hardware is lower power consumption (and related noise/heat). In terms of actual raw computing strength, we've already passed the point of "good enough" for the vast majority of non-business users.
Burst Cancel is offline   Reply With Quote
Old Oct 01, 2009, 11:49 PM // 23:49   #47
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

Quote:
Originally Posted by Burst Cancel View Post
HPC isn't really relevant to home users either. nV has been pushing CUDA and GPGPU for a while, but Joe Sixpack still doesn't give a shit.

Honestly, about the only real benefit that mainstream consumers have gotten from recent advances in computing hardware is lower power consumption (and related noise/heat). In terms of actual raw computing strength, we've already passed the point of "good enough" for the vast majority of non-business users.

CUDA will come into its own with Nexus. GT300 has full native, hardware-based support for C++.
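
To make the C++ point concrete: think device-side virtual functions and new/delete inside a kernel - things earlier parts couldn't do. A hedged sketch of what that looks like, assuming a Fermi-class (compute capability 2.x or newer) part and toolchain; the Shape/Circle classes are invented for illustration:

Code:
// Sketch of device-side C++ of the kind Fermi-class hardware enables:
// virtual dispatch and new/delete inside a kernel (compute capability 2.x+).
// The Shape/Circle classes are invented for illustration.
#include <cstdio>
#include <cuda_runtime.h>

struct Shape {
    __device__ virtual float area() const = 0;
    __device__ virtual ~Shape() {}
};

struct Circle : Shape {
    float r;
    __device__ explicit Circle(float radius) : r(radius) {}
    __device__ float area() const { return 3.14159f * r * r; }
};

__global__ void demo(float* out) {
    Shape* s = new Circle(2.0f);  // allocation on the device heap
    *out = s->area();             // virtual call resolved on the GPU
    delete s;
}

int main() {
    float *d, h;
    cudaMalloc(&d, sizeof(float));
    demo<<<1, 1>>>(d);
    cudaMemcpy(&h, d, sizeof(float), cudaMemcpyDeviceToHost);
    printf("area = %f\n", h);     // ~12.566
    cudaFree(d);
    return 0;
}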
Lord Sojar is offline   Reply With Quote
Old Oct 02, 2009, 01:17 AM // 01:17   #48
Desert Nomad
 
Join Date: Apr 2007

Quote:
Originally Posted by Rahja the Thief View Post
GT300 is meant to be a one-time purchase for Windows 7, at least until CPU technology advances enough to surpass nVidia cGPU compute (which I don't see happening until AMD Bulldozer).
Sounds expensive...
Improvavel is offline   Reply With Quote
Old Oct 02, 2009, 01:29 AM // 01:29   #49
Furnace Stoker
 
Elder III's Avatar
 
Join Date: Jan 2007
Location: Ohio
Guild: I Will Never Join Your Guild (NTY)
Profession: R/

It sounds like NVIDIA will be the way to go for workstations and research behemoth computers - for the home gamer, though, I would say it will be a matter of whichever manufacturer gives the most bang per buck... which is the way it should be - competition galore = better deals for the consumer, and since I build computers on the side, the more $$$ I can save for Joe Sixpack, the more I can potentially earn myself. So the more competition the better.

*Rahja - do you feel that the G300 (graphics-oriented card) will compete with or surpass the HD 5870 in terms of gaming performance at the current price level? If that's too close to any NDA terms, I won't be offended by a non-answer either.
Elder III is offline   Reply With Quote
Old Oct 02, 2009, 12:18 PM // 12:18   #50
Wilds Pathfinder
 
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E

rahja, are these real pictures of the GT300?
http://www.hardware.info/nl-NL/news/...rt_op_de_foto/
it's a dutch site, don't try to understand what they say

they look like someone grabbed a saw and shortened the card.
if those are fake pictures, can you post good ones? i would like to know how long the card will be.
i can't imagine it's longer than my HD4870*2, but well

anyways
keep the shiny chrome

Last edited by riktw; Oct 02, 2009 at 12:21 PM // 12:21..
riktw is offline   Reply With Quote
Old Oct 02, 2009, 12:24 PM // 12:24   #51
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

hmm, 6+8 pin PCI-E connectors... this does not bode well for power consumption.

EDIT: according to semiaccurate.com, that card is a fake.
http://www.semiaccurate.com/2009/10/...mi-boards-gtc/
keep in mind that this IS semiaccurate, so take what charlie said with a huge grain of salt. nevertheless, his take on it is quite compelling.

Last edited by moriz; Oct 02, 2009 at 01:07 PM // 13:07..
moriz is offline   Reply With Quote
Old Oct 02, 2009, 02:48 PM // 14:48   #52
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

Quote:
Originally Posted by moriz View Post
hmm, 6+8 pin PCI-E connectors... this does not bode well for power consumption.

EDIT: according to semiaccurate.com, that card is a fake.
http://www.semiaccurate.com/2009/10/...mi-boards-gtc/
keep in mind that this IS semiaccurate, so take what charlie said with a huge grain of salt. nevertheless, his take on it is quite compelling.

He's a nutcase... no, really. The card is a Tesla unit, made for rack implementation, not for standard installation as a desktop GPU. The final GPU (G300) unit will look slightly different from its GT300 brother.

As for the 6+8 pin connectors, the explanation is simple. When you plan a GPU's power delivery, you never aim close to peak power consumption; rather, you plan for headroom above it. GT300 has a ~225 W TDP, but the connectors allow for a 300 W TDP. It really is quite power efficient, and has many hardware-based power saving features. Load balancing has been drastically improved as well, so if the card is pushed to full load for long periods of time (which is highly unlikely given the sheer computing power), the total power consumption won't be off the charts.
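
To put numbers on the connector math (these are the standard PCI-SIG limits, nothing nVidia-specific): the x16 slot supplies up to 75 W, a 6-pin plug adds 75 W, and an 8-pin plug adds 150 W, so 6+8 buys a 300 W ceiling against a ~225 W TDP.

Code:
// Worked power-budget arithmetic. Slot/connector limits are the standard
// PCI-SIG figures; the 225 W TDP is the number claimed above.
#include <cstdio>

int main() {
    const int slot_w      = 75;   // PCIe x16 slot
    const int six_pin_w   = 75;   // 6-pin auxiliary connector
    const int eight_pin_w = 150;  // 8-pin auxiliary connector
    const int ceiling_w   = slot_w + six_pin_w + eight_pin_w;  // 300 W
    const int tdp_w       = 225;  // claimed GT300 TDP
    printf("ceiling %d W, TDP %d W, headroom %d W\n",
           ceiling_w, tdp_w, ceiling_w - tdp_w);  // 300, 225, 75
    return 0;
}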

As for the validity of those Dutch photos, it's hard to tell. The card layout and design for the Tesla unit is exactly like that, yes, but whether those are real cards is, to be frank, unknown. I'll put it this way: if those are fake, they look almost exactly like the release Tesla unit (not the GPU unit, G300).


Does that ^ help?

Quote:
Originally Posted by Elder III
*Rahja - do you feel that the G300 (graphics-oriented card) will compete with or surpass the HD 5870 in terms of gaming performance at the current price level? If that's too close to any NDA terms, I won't be offended by a non-answer either.
No, I'm happy to comment. The G300 GPU will surpass the HD5870 in terms of gaming performance based on ATi's pricing model at this time. That doesn't mean ATi won't change their pricing model once their lineup is fully released, or once nVidia (getting out of the habit of saying "we" is tough...) releases its G300 GPU variant lineup. G300 is cheaper to produce than GT200 was by a significant margin, and as a result, you can expect a large chunk of those savings to be passed on to you, the consumer. GT200's release pricing was also horribly wrong, so take the past with a grain of salt.
Lord Sojar is offline   Reply With Quote
Old Oct 02, 2009, 03:09 PM // 15:09   #53
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

um... why does the card have... wood screws on the heat exhaust? i thought the heat shroud should be hollow on that end... that kinda implies the entire shroud is made out of a single block of plastic.

no rahja, this is almost definitely a fake.

EDIT: in case you still don't trust charlie, here's techpowerup's take:
http://www.techpowerup.com/105052/NV..._Unveiled.html

kinda goes against what you said about "many functioning samples" available, doesn't it? if this were the case, why would nvidia show a dummy board? and a badly made dummy to boot.

Last edited by moriz; Oct 02, 2009 at 03:14 PM // 15:14..
moriz is offline   Reply With Quote
Old Oct 02, 2009, 03:38 PM // 15:38   #54
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

Quote:
Originally Posted by moriz View Post
um... why does the card have... wood screws on the heat exhaust? i thought the heat shroud should be hollow on that end... that kinda implies the entire shroud is made out of a single block of plastic.

no rahja, this is almost definitely a fake.

EDIT: in case you still don't trust charlie, here's techpowerup's take:
http://www.techpowerup.com/105052/NV..._Unveiled.html

kinda goes against what you said about "many functioning samples" available, doesn't it? if this were the case, why would nvidia show a dummy board? and a badly made dummy to boot.

Sigh... the board he held up isn't a "fake". Because showcased samples are most likely destroyed by EMI, ESD, and so on, they are rarely working boards, and are at times altered for press events to prevent damage. GPU boards are not meant to be manhandled by hundreds of people and expected to survive. They are made to be installed in a stationary case (carefully, I might add), and to stay there until they are removed to be cleaned or die. The demo board isn't a fully functional board, but it is an accurate mockup of the release boards. Taking first-series production boards away from the debugging and testing units just so you could have 100% real "eye candy" would be foolish. The demo shown was running on working GT300 silicon, and this entire "it's fake" business is getting rather silly.

We are still several months away from a retail launch, and people are complaining that we aren't passing them out as party favors or in nicely organized celebrity gift bags. Sorry, this isn't the Emmys... get over it. In reality, GT300 is alive and well, not fake, and 100% working. nVidia doesn't have many of the final release boards floating around (I certainly won't be getting one) at this point in time, since they have only received about 3 bins' worth. Suffice it to say, just stop. It's real, the performance is real, and trying to debunk reality is rather dumb.

I get it, you like ATi, but enough already. This nVidia bashing is absolutely out of hand. ATi has the upper hand with a launch months ahead of ours. It's fine and dandy that they got out ahead of us, and they will reap early adopter benefits this time around. However, that doesn't mean GT300 isn't going to outperform ATi, or that it isn't going to really shake up the concept of what a GPU can do.

If I could choose a card for Windows 7, I'd choose GT300 over RV870 any day of the week, and that is said without bias. More PC-centric features are found on GT300, and Eyefinity, while very cool, isn't practical. So, that's the bottom line. Which one will make your PC run faster, cooler, and more stable??? GT300, hands down.
Lord Sojar is offline   Reply With Quote
Old Oct 02, 2009, 03:53 PM // 15:53   #55
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

i am not bashing nvidia, and i don't particularly like one brand over the other. i'll always put my money towards whatever happens to be the best buy at my time of purchase.

i am not doubting that the GT300 exists. i also have no doubt that it will be better than the RV870. i was just saying, the card put on display is most definitely not a working sample, or even a disabled sample. it's a mockup, and a poorly done one at that. i'd have no problem with it if it had actually been presented as a mockup. unfortunately, it wasn't. your CEO clearly held that thing up and said: "this puppy here is Fermi." no, it's not.

i have great respect for nvidia and what it's trying to accomplish... but pulling this kind of stunt is just stupid. i'd rather they show nothing at all.
moriz is offline   Reply With Quote
Old Oct 02, 2009, 04:02 PM // 16:02   #56
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

The cooling system and PCB specs are accurate, and there was a sealed GPU inside of that sample. It's a bit beyond a mockup, but not a working card.

Bear in mind, this is months prior to launch, so this isn't abnormal for any company.
Lord Sojar is offline   Reply With Quote
Old Oct 02, 2009, 05:01 PM // 17:01   #57
Hell's Protector
 
Quaker's Avatar
 
Join Date: Aug 2005
Location: Canada
Guild: Brothers Disgruntled

My question is - does nVidia (or ATI) seriously think that GPGPU functions are going to be the "driving force" in graphics card purchases? Can they be that dumb? Do they really think that Joe Average, who doesn't use anywhere near the full potential of his computer now, is going to care whether his GPU can do anything else?
Will someone who actually has a use for GPGPU apps buy a GPU to do it, or simply buy an additional PC? Will the few people who would actually use/want/need GPGPU functions be a significant enough user base to be a "driving force"? I seriously doubt it.

Last edited by Quaker; Oct 02, 2009 at 05:04 PM // 17:04..
Quaker is offline   Reply With Quote
Old Oct 02, 2009, 05:25 PM // 17:25   #58
Furnace Stoker
 
Elder III's Avatar
 
Join Date: Jan 2007
Location: Ohio
Guild: I Will Never Join Your Guild (NTY)
Profession: R/

Quote:
Originally Posted by Quaker View Post
My question is - does nVidia (or ATI) seriously think that GPGPU functions are going to be the "driving force" in graphics card purchases? Can they be that dumb? Do they really think that Joe Average, who doesn't use anywhere near the full potential of his computer now, is going to care whether his GPU can do anything else?
Will someone who actually has a use for GPGPU apps buy a GPU to do it, or simply buy an additional PC? Will the few people who would actually use/want/need GPGPU functions be a significant enough user base to be a "driving force"? I seriously doubt it.

My thoughts are running parallel to Quaker's... I have recently been reading up on the Tesla cGPU (GT300, right?) and if the next video card out from NVIDIA (G300, if I have my cookies straight) has a fraction of that computing power (not 3D graphics power), then it will be amazing. CUDA and ATI's (sorta) equivalent are both awesome and truly amazing when I think of what could hypothetically be accomplished with these new(ish) technologies. ***However, John & Jane Smith buy new video cards to play video games and sometimes to work with 3D design programs. Other than the workstation user, scientist, and researcher (and they are a small % of the market, no doubt)... who will use these wonderful features, or take them into consideration when buying?
Elder III is offline   Reply With Quote
Old Oct 02, 2009, 05:58 PM // 17:58   #59
Desert Nomad
 
Join Date: Apr 2007

Quote:
Originally Posted by Elder III View Post
My thoughts are running parallel to Quaker's... I have recently been reading up on the Tesla cGPU (GT300, right?) and if the next video card out from NVIDIA (G300, if I have my cookies straight) has a fraction of that computing power (not 3D graphics power), then it will be amazing. CUDA and ATI's (sorta) equivalent are both awesome and truly amazing when I think of what could hypothetically be accomplished with these new(ish) technologies. ***However, John & Jane Smith buy new video cards to play video games and sometimes to work with 3D design programs. Other than the workstation user, scientist, and researcher (and they are a small % of the market, no doubt)... who will use these wonderful features, or take them into consideration when buying?
They don't have a CPU.

What do you think will happen to nVidia when it is time for Intel and AMD (if AMD survives - I hope so; monopolies suck money out of my wallet - see, for example, GT200 pricing back when nVidia thought ATI was lame) to integrate the CPU and GPU on the same die?

nVidia needs to come out with something to survive in the long term.

On the other hand, I won't be paying for extra GPGPU features I won't utilize. The company that gives me more GAME performance within my budget, and is cheaper, will get my money.
Improvavel is offline   Reply With Quote
Old Oct 02, 2009, 06:24 PM // 18:24   #60
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/

improvavel is correct. nvidia is pushing GPGPU because they don't have a CPU. not that they don't have the expertise to make one, though; it's because they don't have an x86 license. there was a push some time ago by nvidia to buy VIA, except VIA's x86 license is non-transferable, so nvidia wouldn't be able to use it to make a CPU.

as for myself, i'm going to skip this generation entirely. my HD4890 is more than powerful enough for whatever i throw at it. my next upgrade will be a platform one, since my G33+E7200 platform is getting seriously long in the tooth. i'm also vaguely suspecting that it's beginning to bottleneck my graphics card's performance. not having DX11 doesn't detract from the windows 7 experience at all for me, since i've been using it for months now and haven't missed DX11 one bit.

my next graphics card upgrade will likely fall around july of next year. here's hoping both AMD and NVIDIA can put out some compelling products around that time.
moriz is offline   Reply With Quote