Guild Wars Forums - GW Guru
 
 

Forest of True Sight > Technician's Corner
Old Jul 29, 2009, 10:32 PM // 22:32   #61
Wilds Pathfinder
 
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E

Quote:
Originally Posted by Kuntz View Post
http://www.anandtech.com/showdoc.aspx?i=3573
-180mm^2 die size (4870 was 260mm^2, nVidia's was the size of the Titanic)
Funny one, but kinda true.

As I don't work at NVIDIA or ATI, I can't say anything about the cards themselves. But since I work at a computer shop, I can say this: we've sold more ATI cards than NVIDIA cards over the last 5 months, and that's not counting the cards inside prebuilt computers, only the cards we sell separately. 3 months ago we stopped putting 8400 and 9400 cards in cheap computers and started using ATI cards instead.

NVIDIA drivers = 100 MB or so
ATI CCC = 40 MB
Just the ATI drivers = 20 MB
Just the NVIDIA drivers = O_O not possible.

For the rest: for about the same price we have a card with the same or better performance and less time spent downloading drivers. Complaints so far: none. And the big business is in the smaller cards; the number of people buying a big 400 dollar/euro card is not that large. If Dell or the like starts selling your low-end card, that's a nice profit.

I can't wait to see the HD5870, GT300 and Larrabee. It will be fun to see who wins, and I hope it's the best one. Gogogo 3dfx (wait, NVIDIA bought them, huh?). (I found my old 3dfx card today; it still works and is still good for Quake 3 with that blazing fast Pentium 2.)

And Rahja, maybe I should PM this, but what kind of college education have you done? Working at ATI or NVIDIA looks like a great job, and I already do way too much with electronics. I guess it's not an easy degree, but I can still try.

Last edited by riktw; Jul 29, 2009 at 10:37 PM // 22:37..
riktw is offline   Reply With Quote
Old Jul 30, 2009, 01:47 AM // 01:47   #62
Core Guru
 
Join Date: Feb 2005

Quote:
Originally Posted by moriz View Post
Ah, I guess my source was really off then. However, are we sure that picture is of the RV870 and not a more mainstream chip (RV840)?

I'm quite familiar with AMD's "small chip" strategy. However, using a dual-GPU solution for the very high end can have its drawbacks. The HD4870X2 seems to be completely at the mercy of drivers; if a game doesn't have a CrossFire profile, it won't gain the performance benefit.

Of course, I still remember the HD4850 debut and its +105% CrossFire scaling in COD4. Yikes.
It's unknown what chip that was for, but it is assumed it was the "performance entry" chip/card at that show.

When two chips are placed inside the same MCM, it won't count as Crossfire X, so you won't need "driver support" for those cards. They will behave as one chip on a 512-bit bus. But ATI also said, in another article I can't find for the life of me, that performance cards will come with 2 MCMs (4 chips total) in Crossfire X, and that those can combine with another card (8 chips total on 2 PCBs).

Although SLI/CFX is at the mercy of driver support, support typically comes out 2-3 months after the retail release of a game, depending on the game's popularity. There are also ways to force support for a game, such as renaming your game's executable to "crysis.exe".

To date, the only major driver issues I've heard of were for the 4850 X2, since the card came out 5 months before ATI planned driver support for it, haha.
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 01:51 AM // 01:51   #63
über těk-nĭsh'ən
 
Join Date: Jan 2006
Location: Canada
Profession: R/

No, I'm sure the HD4870X2 is simply treated as two HD4870s. This is why you have to enable CrossFire to gain the additional performance; otherwise, it will function as a single HD4870.
moriz is offline   Reply With Quote
Old Jul 30, 2009, 01:53 AM // 01:53   #64
Core Guru
 
Join Date: Feb 2005

Quote:
Originally Posted by moriz View Post
No, I'm sure the HD4870X2 is simply treated as two HD4870s. This is why you have to enable CrossFire to gain the additional performance; otherwise, it will function as a single HD4870.
I don't think anyone has said otherwise???
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 01:56 AM // 01:56   #65
über těk-nĭsh'ən
 
Join Date: Jan 2006
Location: Canada
Profession: R/

Quote:
Originally Posted by Kuntz View Post

When two chips are placed inside the same MCM, it won't count as Crossfire X, so you won't need "driver support" for those cards. They will behave as one chip on a 512-bit bus.
Well, you did.
moriz is offline   Reply With Quote
Old Jul 30, 2009, 02:08 AM // 02:08   #66
Core Guru
 
Join Date: Feb 2005

The X2 cards are PCBs with 2 separate chips, so yes, for those you need to enable CFX. Two chips in an MCM is more like Intel's quad-cores, where they took two dual-core dies and glued them together in one package. ATI hasn't released any MCM cards yet, since that's something new they are doing with the RV870s.

And new info from ATI's CTO just now:

http://www.fudzilla.com/content/view/14847/1/
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 02:16 AM // 02:16   #67
über těk-nĭsh'ən
 
Join Date: Jan 2006
Location: Canada
Profession: R/

Ah, my bad.

Although I'd question whether Fudzilla is really trustworthy, what that article says makes sense. If AMD actually makes good on that September release date, the yields had better be good.
moriz is offline   Reply With Quote
Old Jul 30, 2009, 02:51 AM // 02:51   #68
Core Guru
 
Join Date: Feb 2005

Well, in that particular article they did speak with the CTO and that other dude, so what it says has to be true. When Fudzilla posts a rumor, which they do often, they clearly mark it as such.

I'd think the RV870 would be done around mid-August, since they still need to ship it out to manufacturers to put onto boards, do the final debugging/testing, and all that junk. Maybe we'll start seeing ESes (engineering samples) towards the end of August, and some early benches.
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 03:54 AM // 03:54   #69
Core Guru
 
Join Date: Feb 2005

Oooo, looks like someone already has an ES (engineering sample):

http://www.chiphell.com/2009/0728/89.html

Two 6-pin connectors, what a beast.
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 04:04 AM // 04:04   #70
über těk-nĭsh'ən
 
Join Date: Jan 2006
Location: Canada
Profession: R/

It's longer than an HD2900XT? Ouch.
moriz is offline   Reply With Quote
Old Jul 30, 2009, 04:07 AM // 04:07   #71
Core Guru
 
Join Date: Feb 2005

What I don't get is how two 6-pins = one 8-pin...

And a 6-pin can easily handle 24 amps (288 watts at 12 V), yet the stupid PCI-E spec limits it to just 75 watts...

And the 8-pin adds no extra voltage lines (just grounds), yet the rating magically grows to 150 watts...

lol

Basically, a single 6-pin connector could power all the world's GPUs, including the X2s.
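
Quick back-of-the-envelope of those numbers in Python (the 8 A per-contact rating is an assumption, typical of Mini-Fit Jr. style pins; the spec limits are the official PCI-E figures):

[code]
# What the +12 V pins of a PCI-E connector could physically carry
# versus what the spec allows them to deliver. The 8 A per-contact
# rating is an assumption; the spec limits are the official figures.

RAIL_V = 12.0        # volts on each +12 V line
AMPS_PER_PIN = 8.0   # assumed rating per +12 V contact

def capacity_watts(pins_12v):
    """Raw electrical capacity of the connector's +12 V pins."""
    return pins_12v * AMPS_PER_PIN * RAIL_V

# Both connectors carry three +12 V lines; the 8-pin just adds grounds.
for name, spec_w in (("6-pin", 75), ("8-pin", 150)):
    print(f"{name}: ~{capacity_watts(3):.0f} W possible, {spec_w} W per spec")

# Output:
# 6-pin: ~288 W possible, 75 W per spec
# 8-pin: ~288 W possible, 150 W per spec
[/code]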
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 04:12 AM // 04:12   #72
Core Guru
 
Join Date: Feb 2005

And apparently the GT300s are in their first swing:

http://www.xtremesystems.org/forums/...d.php?t=230847
Brett Kuntz is offline   Reply With Quote
Old Jul 30, 2009, 05:00 AM // 05:00   #73
The Fallen One
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me

Quote:
Originally Posted by riktw View Post
And Rahja, maybe I should PM this, but what kind of college education have you done? Working at ATI or NVIDIA looks like a great job, and I already do way too much with electronics. I guess it's not an easy degree, but I can still try.
I did my undergraduate degree in physics, and my graduate studies (doctorate) in electrical physics, with a specialization in particle and nano-electrical physics.

Quote:
Originally Posted by Kuntz View Post
What I don't get is how two 6-pins = one 8-pin...

And a 6-pin can easily handle 24 amps (288 watts at 12 V), yet the stupid PCI-E spec limits it to just 75 watts...

And the 8-pin adds no extra voltage lines (just grounds), yet the rating magically grows to 150 watts...

lol

Basically, a single 6-pin connector could power all the world's GPUs, including the X2s.
All wattage calculations are done before e- (electron) losses and before any inversion takes place. "At the wire" vs. "at the source" is pretty different. You also have to take into account fluctuations and the cooling systems on the cards. You don't go with the bare minimum; you go with a surplus to make up for resistance and inversion losses.
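
A minimal sketch of that headroom argument, with purely illustrative numbers (the 85% conversion efficiency and 20% surplus are assumptions, not spec figures):

[code]
# Illustrative headroom math: power the connectors must supply so the
# GPU still gets its wattage after conversion losses, plus a surplus.
# The efficiency and margin values are made-up example numbers.

gpu_draw = 150.0          # watts the GPU needs under load
vrm_efficiency = 0.85     # assumed DC-DC conversion efficiency
margin = 0.20             # assumed surplus for spikes and aging

at_connector = gpu_draw / vrm_efficiency     # ~176 W "at the wire"
provisioned = at_connector * (1 + margin)    # ~212 W budgeted

print(f"at the connector: {at_connector:.0f} W")
print(f"with {margin:.0%} surplus: {provisioned:.0f} W")
[/code]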
Lord Sojar is offline   Reply With Quote
Old Jul 30, 2009, 03:20 PM // 15:20   #74
Wilds Pathfinder
 
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E

Most power supplies have multiple 12 V lines (rails), so they make VGA cards with two 6-pin connectors. An 8-pin connector is just a 6-pin with 2 extra ground lines (3 yellow, 5 black); I don't know why they rated that at 150 watts.

It's nice if you want to know how much power your VGA card needs: no connectors = max 75 watts (from the slot), 1 x 6-pin = max 150 watts, 2 x 6-pin = max 225 watts, 1 x 8-pin plus 1 x 6-pin = max 300 watts, and 2 x 8-pin = max 375 watts.

I don't know if there are cards with two 8-pin connectors; maybe that weird ASUS card with two GTX285s on one board. They called it a GTX295, but normally that's two GTX275s on one card. Ah well, both are overkill for GW.
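
That table is just slot + connectors; a tiny Python sketch of the same budget (the function name is mine; the wattages are the spec figures above):

[code]
# PCI-E board power budget: 75 W from the slot itself, plus 75 W per
# 6-pin and 150 W per 8-pin connector.

SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def max_board_power(six_pins=0, eight_pins=0):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

assert max_board_power() == 75                         # slot only
assert max_board_power(six_pins=1) == 150              # 1 x 6-pin
assert max_board_power(six_pins=2) == 225              # 2 x 6-pin
assert max_board_power(six_pins=1, eight_pins=1) == 300
assert max_board_power(eight_pins=2) == 375
[/code]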
riktw is offline   Reply With Quote