Guild Wars Forums - GW Guru
 
 

Poll: Will you purchase Windows 7?
Old May 30, 2009, 02:41 AM // 02:41   #41
über těk-nĭsh'ən
 
moriz's Avatar
 
Join Date: Jan 2006
Location: Canada
Profession: R/
Default

for your needs, they are more or less the same. i personally cannot imagine any application that absolutely requires being maximized across two monitors. for a typical linux programmer, extend desktop is equivalent to span.

since extend desktop is very similar to span (i'd say it's better, tbh), i very much doubt that people will skip the upgrade for this reason.
moriz is offline   Reply With Quote
Old May 30, 2009, 03:24 AM // 03:24   #42
Core Guru
 
Brett Kuntz's Avatar
 
Join Date: Feb 2005
Default

Mmm, nVidia is gonna lose the GT300 vs 5800 matchup again. You can't sell a 512-bit chip and make money or compete against ATI's 256-bit chip. 512-bit chips are the size of pancakes, generate insane levels of heat, and the failure rate during fabrication will be astronomical. But since chips take so long to create, nVidia didn't actually know any of these things when they set out to design the next-gen chip. ATI's chip will once again be tiny and cheap to produce, something that will allow them to lay claim to the entire low- and mid-range markets. It doesn't matter what nVidia comes up with in terms of 512-bit chips, since you can only ever fit one 512-bit chip on a PCB, and it's been rumoured for a while that ATI has come up with a way to fit four 5800s onto a single PCB with shared memory. I think for the next 3 years we're going to see nVidia continue to get dominated in the GPU market by ATI.

And it's also noteworthy that ATI has supported DX11 features (notably tessellation) since its 3800 line and the GPU in the Xbox 360.

ATI's way ahead of the curve right now.
Brett Kuntz is offline   Reply With Quote
Old Jun 02, 2009, 11:57 PM // 23:57   #43
Desert Nomad
 
Zomgyogi's Avatar
 
Join Date: Apr 2007
Location: In a park
Default

Can you say, or do you know, anything about the power consumption of the cards?
Zomgyogi is offline   Reply With Quote
Old Jun 05, 2009, 08:47 PM // 20:47   #44
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by kunt0r View Post
Mmm, nVidia is gonna lose the GT300 vs 5800 matchup again. You can't sell a 512-bit chip and make money or compete against ATI's 256-bit chip. 512-bit chips are the size of pancakes, generate insane levels of heat, and the failure rate during fabrication will be astronomical. But since chips take so long to create, nVidia didn't actually know any of these things when they set out to design the next-gen chip. ATI's chip will once again be tiny and cheap to produce, something that will allow them to lay claim to the entire low- and mid-range markets. It doesn't matter what nVidia comes up with in terms of 512-bit chips, since you can only ever fit one 512-bit chip on a PCB, and it's been rumoured for a while that ATI has come up with a way to fit four 5800s onto a single PCB with shared memory. I think for the next 3 years we're going to see nVidia continue to get dominated in the GPU market by ATI.

And it's also noteworthy that ATI has supported DX11 features (notably tessellation) since its 3800 line and the GPU in the Xbox 360.

ATI's way ahead of the curve right now.

I feel compelled to respond to this misinformation:

The bit length of the memory bus has no real bearing on the total chip area. That is determined by the logic: primarily the core transistor count, the cache systems and hierarchy, and the total number of logic units per cell.

With 40nm technology in use on both GT300 and ATI's RV870, you can expect similar results. We do plan to pack a bit more cGPU logic into our core, so the die size may in fact be larger, but ATI's die size is up this time too (due to them doubling the number of shaders yet again). I think you will be surprised by the die size and power consumption of GT300 (I cannot comment on exact numbers, but you can expect GTX275-level results in the power consumption categories).
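
To put the fab change in perspective, here is a rough back-of-the-envelope scaling sketch in Python. The transistor counts and die areas in it are purely illustrative assumptions, not actual GT200/GT300/RV870 figures; only the scaling rule of thumb is the point.

[code]
# Rule of thumb: transistor density scales roughly with the inverse square of
# the feature size, so a 55nm -> 40nm shrink gives about (55/40)^2 ~= 1.9x the
# transistors in the same area. All figures below are illustrative assumptions.

old_node_nm = 55.0
new_node_nm = 40.0
density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Ideal density gain, 55nm -> 40nm: {density_gain:.2f}x")

# Hypothetical chip: 1.4B transistors on 470 mm^2 at 55nm, grown to 3.0B
# transistors of logic at 40nm (assumed numbers, for illustration only).
old_transistors, old_area_mm2 = 1.4e9, 470.0
new_transistors = 3.0e9
new_area_mm2 = old_area_mm2 * (new_transistors / old_transistors) / density_gain
print(f"~{new_transistors / 1e9:.1f}B transistors at 40nm -> roughly "
      f"{new_area_mm2:.0f} mm^2 (vs {old_area_mm2:.0f} mm^2 at 55nm)")
[/code]

In other words, under those assumed numbers, roughly doubling the logic does not double the die if the process shrinks at the same time.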

As for dominance in the market, it really depends on what segment you are looking at. We lead in the bleeding-edge market, the high-end (enthusiast) market, and the low-end market. ATi is currently doing better in the mid-high and midrange segments. We lead in mobile hands down, and will continue to do so; Tegra and Ion will be major players this year in that regard.

Sorry about no Tech Insight this last week or this week. Been super busy with work and life!

I will have another new edition on June 13th, 2009 for you guys and gals. Until then, have a good week etc.
__________________
Lord Sojar is offline   Reply With Quote
Old Jun 06, 2009, 10:43 AM // 10:43   #45
Furnace Stoker
 
Elder III's Avatar
 
Join Date: Jan 2007
Location: Ohio
Guild: I Will Never Join Your Guild (NTY)
Profession: R/
Default

I'd agree that NVIDIA leads the high-end market (although I'm not sure what the difference is between bleeding edge and high end?) and the low end too. However, I would say that ATI/AMD is cleaning up on most of what's in between, which I would speculate is the largest part of the desktop market by a fair bit. Of course, the $$$ brought in by the top-flight GPUs most likely evens it out in the end, but if the costs of producing those top-flight chips are higher...

In any case, I wish both companies the best, because when competition is high, the consumer benefits, both from lower prices and from constantly improving technology. Keep it coming. XD XD XD
Elder III is offline   Reply With Quote
Old Jun 06, 2009, 05:15 PM // 17:15   #46
rattus rattus
 
Snograt's Avatar
 
Join Date: Jan 2006
Location: London, UK GMT±0 ±1hr DST
Guild: [GURU]GW [wiki]GW2
Profession: R/
Default

Quote:
Originally Posted by Elder III View Post
(although I'm not sure what the difference is between bleeding edge and high end?)
High end = high prices.
Bleeding edge = bleeding expensive.
__________________
Si non confectus, non reficiat
Snograt is offline   Reply With Quote
Old Jun 09, 2009, 03:22 AM // 03:22   #47
Core Guru
 
Brett Kuntz's Avatar
 
Join Date: Feb 2005
Default

Quote:
Originally Posted by Rahja the Thief View Post
I feel compelled to respond to this misinformation:

The bit length of the memory bus has no real bearing on the total chip area.
You are incorrect. The bus width (not length) directly affects the size of a chip. Each bus lane needs one physical pin-out on the chip: a 256-bit chip can be small because it only needs 256 pin-outs, whereas a 512-bit chip must be pancake-sized to fit all 512 pin-outs. There is a LOT of documentation written by industry professionals on why nVidia lost this round, and it all points to its giant chips, which are difficult to manufacture, have high error rates, and generate more heat.

Creating a 448-bit or 512-bit graphics chip is not something you can do and still make money, which is why nVidia is losing and will continue to lose. nVidia has even said themselves that they learned a lot of hard lessons from creating their 448/512-bit chips and that their in-development chips (the ones coming out in ~3 years) will be designed around 256-bit buses.
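
The pin-out argument can be sketched with a toy "pad-limited" model: if the memory interface signals have to sit on the die perimeter at some fixed pitch, the minimum die edge grows linearly with bus width and the minimum area grows with its square. The pitch and per-bit overhead numbers below are illustrative assumptions, not real process figures.

[code]
# Toy pad-limited model: memory interface pads sit on the die perimeter at a
# fixed effective pitch, so minimum edge ~ pad count and minimum area ~ pad
# count squared. Pitch and per-bit overhead are assumptions for illustration.

def min_pad_limited_die(bus_width_bits, signals_per_bit=1.5, pitch_mm=0.06):
    """Smallest square die whose perimeter can hold the memory interface pads.
    signals_per_bit folds in assumed strobe/address/power overhead."""
    total_pads = bus_width_bits * signals_per_bit
    edge_mm = total_pads * pitch_mm / 4.0   # pads spread over four edges
    return edge_mm, edge_mm ** 2

for bits in (256, 448, 512):
    edge, area = min_pad_limited_die(bits)
    print(f"{bits}-bit bus: minimum edge ~{edge:.1f} mm, "
          f"pad-limited floor ~{area:.0f} mm^2")
[/code]

Real dies sit well above these floors for other reasons, but the quadratic growth of the floor is why very wide buses push designers toward physically larger chips.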

Die size comparison (GT200 vs RV770):
nVidia: 576 mm²
ATI: 256 mm²



And I'll also point out that all of nVidia's 448-bit X2 GPUs are two PCBs glued together and have to be dual-slot cards, whereas an ATI X2 is a single PCB and can be a single-slot card with an aftermarket cooler.

Quote:
As noted in SimHQ’s HD 4850/4870 performance previews, the entire Radeon HD 4800 series is based on AMD’s RV770 chip. This ASIC is relatively small in die size (256mm^2) in contrast with current products from NVIDIA, giving AMD more economic leeway and flexibility when designing a high-end board like the new X2 compared to a part like the GTX 280. Why? Defects in manufacturing rise exponentially with die size increases, and a defective chip is obviously a waste of money for any company. And two RV770 GPUs are still smaller in size than a single GT200, so the current cost of producing an X2-style board is less for AMD than a single GTX is for NVIDIA when considering the costliest component on a graphics board: the GPU. Furthermore, a single RV770 has shown itself to be rather competitive to the performance levels of the new GTXs, so placing two such GPUs on a single graphics board effectively doubles the potential performance (twice the stream processors, twice the bandwidth, twice the texture and ROP units, etc.).
http://www.simhq.com/_technology2/technology_130a.html
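
The quoted point about defects can be made concrete with the classic Poisson yield model plus the standard dies-per-wafer approximation. The defect density below is an illustrative assumption, not a TSMC number; the die areas are the ones cited above.

[code]
import math

# Poisson yield model: yield ~ exp(-D0 * A), with D0 = defect density (per mm^2)
# and A = die area. Dies-per-wafer uses the usual approximation that corrects
# for edge loss. D0 is an assumed, illustrative value.

D0 = 0.002          # assumed defects per mm^2
WAFER_D = 300.0     # 300 mm wafer

def dies_per_wafer(area_mm2, wafer_d=WAFER_D):
    return int(math.pi * (wafer_d / 2) ** 2 / area_mm2
               - math.pi * wafer_d / math.sqrt(2 * area_mm2))

for name, area in [("GT200-class die", 576), ("RV770-class die", 256)]:
    y = math.exp(-D0 * area)
    gross = dies_per_wafer(area)
    print(f"{name}: {area} mm^2, yield ~{y:.0%}, "
          f"~{gross} gross dies/wafer, ~{gross * y:.0f} good dies/wafer")
[/code]

Under those assumed numbers the smaller die gives roughly four to five times as many good chips per wafer, which is the economic point the SimHQ piece is making.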

Last edited by Brett Kuntz; Jun 09, 2009 at 03:29 AM // 03:29..
Brett Kuntz is offline   Reply With Quote
Old Jun 09, 2009, 07:35 AM // 07:35   #48
Ascalonian Squire
 
Rasco's Avatar
 
Join Date: May 2009
Profession: Mo/
Default

Thanks for the information on the GT300. Looks like what I'll replace my aging 8800 GTS with. Any word on probable release dates? Thanks.

My issue with ATI is the lack of driver support. My laptop still has that famous black loading-screen glitch. That's why I remain loyal to Nvidia; their driver support blows ATI's out of the water.

I would buy a GT300 well over an RV870. It's not all about core performance.

That's just the fanboy in me speaking.
Rasco is offline   Reply With Quote
Old Jun 09, 2009, 07:34 PM // 19:34   #49
Core Guru
 
Brett Kuntz's Avatar
 
Join Date: Feb 2005
Default

Quote:
Originally Posted by Rasco View Post
Thanks for the information on the GT300. Looks like what I'll replace my aging 8800 GTS with. Any word on probable release dates? Thanks.

My issue with ATI is the lack of driver support. My laptop still has that famous black loading-screen glitch. That's why I remain loyal to Nvidia; their driver support blows ATI's out of the water.

I would buy a GT300 well over an RV870. It's not all about core performance.

That's just the fanboy in me speaking.
It's funny, because for every comment like this I could find one saying the same thing about nVidia's drivers. If someone has a driver issue, it's more than likely their fault and not the company's. I've never had a driver issue with either company, and I can't imagine ever having one. ATI's drivers have been top notch since I owned my 9600 XT. I like nVidia's drivers too; ATI's and nVidia's drivers are almost identical in layout and features. I haven't used SLI before, but ATI's CFX (CrossFireX) installation was easy as pie. After plugging a 3850 into my mobo to go along with my 3870 OC, I didn't even need to reinstall ATI's drivers; it automatically detected the second card and began working after I checked the "Use CFX" checkbox. Easy as pie!
Brett Kuntz is offline   Reply With Quote
Old Jun 10, 2009, 07:43 AM // 07:43   #50
Ascalonian Squire
 
Rasco's Avatar
 
Join Date: May 2009
Profession: Mo/
Default

Quote:
Originally Posted by Kuntz View Post
It's funny, because for every comment like this I could find one saying the same thing about nVidia's drivers. If someone has a driver issue, it's more than likely their fault and not the company's. I've never had a driver issue with either company, and I can't imagine ever having one. ATI's drivers have been top notch since I owned my 9600 XT. I like nVidia's drivers too; ATI's and nVidia's drivers are almost identical in layout and features. I haven't used SLI before, but ATI's CFX (CrossFireX) installation was easy as pie. After plugging a 3850 into my mobo to go along with my 3870 OC, I didn't even need to reinstall ATI's drivers; it automatically detected the second card and began working after I checked the "Use CFX" checkbox. Easy as pie!
And your loading screens don't show a black flicker when the game is maximized? I have multiple friends with ATI cards and this issue. I've tried Catalyst 8.8 through 9.4 [currently on 9.4].

I'd be interested to know if there are any fixes for this. I know it doesn't affect gameplay, but I'm picky.
Rasco is offline   Reply With Quote
Old Jun 10, 2009, 03:12 PM // 15:12   #51
Furnace Stoker
 
Elder III's Avatar
 
Join Date: Jan 2007
Location: Ohio
Guild: I Will Never Join Your Guild (NTY)
Profession: R/
Default

Quote:
Originally Posted by Rasco View Post
And your loading screens don't show a black flicker when the game is maximized? I have multiple friends with ATI cards and this issue. I've tried Catalyst 8.8 through 9.4 [currently on 9.4].

I'd be interested to know if there are any fixes for this. I know it doesn't affect gameplay, but I'm picky.

I have an HD4850 in my gaming rig and it doesn't do that when starting games (at least not the ones I have tried); it does do it when XP starts up, for half a second while the profile I have set up in CCC kicks in, but it did that with onboard graphics too.
Elder III is offline   Reply With Quote
Old Jun 10, 2009, 10:02 PM // 22:02   #52
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by Kuntz View Post
You are incorrect. The bus width (not length) directly affects the size of a chip. Each bus lane needs one physical pin-out on the chip: a 256-bit chip can be small because it only needs 256 pin-outs, whereas a 512-bit chip must be pancake-sized to fit all 512 pin-outs. There is a LOT of documentation written by industry professionals on why nVidia lost this round, and it all points to its giant chips, which are difficult to manufacture, have high error rates, and generate more heat.

Creating a 448-bit or 512-bit graphics chip is not something you can do and still make money, which is why nVidia is losing and will continue to lose. nVidia has even said themselves that they learned a lot of hard lessons from creating their 448/512-bit chips and that their in-development chips (the ones coming out in ~3 years) will be designed around 256-bit buses.

Die size comparison (GT200 vs RV770):
nVidia: 576 mm²
ATI: 256 mm²



And I'll also point out that all of nVidia's 448-bit X2 GPUs are two PCBs glued together and have to be dual-slot cards, whereas an ATI X2 is a single PCB and can be a single-slot card with an aftermarket cooler.



http://www.simhq.com/_technology2/technology_130a.html
In this case, length actually, since the bus layout on our chips isn't done in parallel... but ok, a seemingly irrelevant conversation.

You do realize, of course, that you are citing total platter sizes, right? Our chips are larger, yes, but GT300 is actually smaller than GT200. Memory bus bit length is irrelevant here when comparing size, because we have consolidated core logic into GT300 that RV870 doesn't have. In this case, you would be comparing a cGPU to a GPU and expecting the cGPU to be physically smaller than the GPU when it has twice the number of transistors. That is absolutely silly.

Whoever these "professionals" are who state that GT300 will fail against RV870 because of size... well... they aren't very professional. You never, ever weigh one processor against another based purely on platter size or chip dimensions. Our architecture is vastly different from ATi's, even more so with GT300. We use branching logic, ATi uses cluster logic. That alone makes their chips smaller.

The 512-bit memory bus is absolutely needed when using high-clocked GDDR5. Do you expect memory bus technology to stay the same forever due to size constraints? You also neglect the fact that we are now using a 40nm fab as opposed to a 55nm fab, allowing many more transistors in the same space, with all the added benefits of a smaller process thrown in.
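
For context on why bus width gets pushed up at all, peak memory bandwidth is simply bus width times effective data rate. The memory clocks below are illustrative assumptions, not announced GT300 or RV870 specs.

[code]
# Peak theoretical bandwidth = (bus width in bytes) * (effective data rate).
# GDDR5 moves data at 4x the memory clock. Clock values are assumptions for
# illustration, not real GT300/RV870 specifications.

def peak_bandwidth_gb_s(bus_width_bits, memory_clock_mhz, transfers_per_clock=4):
    effective_rate = memory_clock_mhz * 1e6 * transfers_per_clock  # transfers/s
    return (bus_width_bits / 8) * effective_rate / 1e9             # GB/s

for bus_bits in (256, 384, 512):
    bw = peak_bandwidth_gb_s(bus_bits, memory_clock_mhz=1000)
    print(f"{bus_bits}-bit bus @ 1000 MHz GDDR5: ~{bw:.0f} GB/s peak")
[/code]

Whether you hit a bandwidth target by widening the bus or by raising the memory clock is exactly the trade-off being argued over here.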

It is naive to believe that a chip with 3.5x more logic than the competitor's is going to fall flat on its face just because it is larger. Our yields on A1 have been fair, despite TSMC's obvious issues. The same applies to our brothers at AMD and their experience with TSMC's 40nm fab.

I can tell you that this isn't GT200... we won't be losing money, and we will be able to price GT300 at points that will severely cut into ATi's recent high margins. GT200 was simply the stepping stone to GT300. You will see soon enough what I mean.

Oh, and ATI's platter and die size have gone up significantly. Don't be fooled: RV870 isn't as small as you think, not with that many shader processors... Don't forget to look up your "sources" for that too (although from what I read, they are fools anyway...). Your source doesn't even have an nVidia card to test, meaning they are not on the certified testers list, in all likelihood due to misinformation or slander/libel in a past review.

Try a more reliable source for your reviews on cards, and try talking to an expert on core logic and process design. I would be one, but since you don't believe me, try seeking out another. I admire your effort in pointing out size issues with GT200, but again... GT300 isn't GT200, so all evidence you have is irrelevant.
__________________
Lord Sojar is offline   Reply With Quote

