Jul 29, 2009, 10:32 PM // 22:32
|
#61
|
Wilds Pathfinder
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E
|
Quote:
Originally Posted by Kuntz
|
Funny one, but kinda true.
Since I don't work at Nvidia or ATI, I can't say anything about the cards themselves, but since I work at a computer shop I can say this:
We've sold more ATI cards than Nvidia cards over the last 5 months, and that's not counting the cards in pre-built computers, only the cards we sell separately.
3 months ago we stopped putting 8400 and 9400 cards in cheap computers and switched to ATI cards instead.
Nvidia drivers = 100 MB or so
ATI CCC = 40 MB
Just the ATI drivers = 20 MB
Just the Nvidia drivers = O_O not possible.
For the rest, for about the same price we have a card with the same or better performance and less time spent downloading drivers.
Complaints so far: none.
And the big business is in the smaller cards; the number of people buying a big 400 dollar/euro card isn't that large.
If Dell or someone like that starts selling your low-end card, that's a nice profit.
I can't wait to see the HD5870, GT300 and Larrabee; it will be fun to see who wins, and I hope it's the best one. Gogogo 3dfx (wait, Nvidia bought them, huh?)
(Found my old 3dfx card today, still works and is still good for Quake 3 with that blazing fast Pentium 2.)
And Rahja, maybe I should PM this, but what kind of college did you do? Working at ATI or Nvidia looks like a great job, as I already do way too much with electronics.
I guess it's not an easy degree, but I can still try.
Last edited by riktw; Jul 29, 2009 at 10:37 PM // 22:37..
|
|
|
Jul 30, 2009, 01:47 AM // 01:47
|
#62
|
Core Guru
|
Quote:
Originally Posted by moriz
ah, i guess my source was really off then. however, are we sure that picture is of the RV870, and not a more mainstream chip (RV840)?
i'm quite familiar with AMD's "small chip" strategy. however, using a dual GPU solution for the very high end can have its drawbacks. the HD4870x2 seems to be completely at the mercy of drivers; if a game doesn't have a crossfire profile, it won't gain the performance benefit.
of course, i still remember the HD4850 debut, and its +105% crossfire scaling in COD4. yikes.
|
It's unknown what chip that was for, but it is assumed it was the "performance entry" chip/card at that show.
When two chips are placed inside the same MCM, it won't count as Crossfire X, so you won't need "driver support" for those cards. They will behave as one chip on a 512-bit bus. But ATI also said, in another article that I can't find for the life of me, that performance cards will come with 2 MCMs (4 chips total) in Crossfire X, which can combine with another card (8 chips total on 2 PCBs).
Although SLI/CFX is at the mercy of driver support, that support typically comes out 2-3 months after the retail release of a game, depending on the game's popularity. There are also ways to force support for a game, such as renaming your game's executable to "crysis.exe".
To date, the only major driver issues I've heard of were for the 4850 X2, since the card came out 5 months before ATI planned driver support for it haha.
|
|
|
Jul 30, 2009, 01:51 AM // 01:51
|
#63
|
über těk-nĭsh'ən
Join Date: Jan 2006
Location: Canada
Profession: R/
|
no, i'm sure the HD4870x2 is simply treated as two HD4870. this is why you have to enable crossfire to gain the additional performance. otherwise, it will function as a single HD4870.
|
|
|
Jul 30, 2009, 01:53 AM // 01:53
|
#64
|
Core Guru
|
Quote:
Originally Posted by moriz
no, i'm sure the HD4870x2 is simply treated as two HD4870. this is why you have to enable crossfire to gain the additional performance. otherwise, it will function as a single HD4870.
|
I don't think anyone has said otherwise???
|
|
|
Jul 30, 2009, 01:56 AM // 01:56
|
#65
|
über těk-nĭsh'ən
Join Date: Jan 2006
Location: Canada
Profession: R/
|
Quote:
Originally Posted by Kuntz
When two chips are placed inside the same MCM, it wont count as Crossfire X, so you wont need "driver support" for those cards. They will behave as one chip on a 512-bit bus.
|
well, you did.
|
|
|
Jul 30, 2009, 02:16 AM // 02:16
|
#67
|
über těk-nĭsh'ən
Join Date: Jan 2006
Location: Canada
Profession: R/
|
ah, my bad.
although i'd question whether fudzilla is really trustworthy or not, what that article said makes sense. if AMD actually makes good on that September release date, the yields better be good.
|
|
|
Jul 30, 2009, 04:04 AM // 04:04
|
#70
|
über těk-nĭsh'ən
Join Date: Jan 2006
Location: Canada
Profession: R/
|
it's longer than an HD2900XT? ouch.
|
|
|
Jul 30, 2009, 05:00 AM // 05:00
|
#73
|
The Fallen One
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
|
Quote:
Originally Posted by riktw
and rahja, maybe i should PM this, but what kind of collage have you done, working at ati or nvidia looks like a great job as i do way to much with electronics already.
and i guess its not an easy collage but i can still try
|
I did my undergraduate degree in physics, and my graduate studies (doctorate) in electrical physics with a specialization in particle and nano-electrical physics.
Quote:
Originally Posted by Kuntz
What I don't get is that 2 x 6-pin = 1 x 8-pin...
A 6-pin can easily handle 24 amps (288 watts), yet the stupid PCI-E spec limits it to just 75 watts...
And an 8-pin adds no extra voltage lines, yet the rating magically grows to 150 watts...
lol
Basically, a 6-pin connector could power all the world's GPUs, including the X2s.
|
All wattage calculations are done before electrical losses and before any conversion takes place. Power at the wire vs. at the source is quite different. You also have to take into account fluctuations and the cooling systems on the cards. You don't go with the bare minimum; you go with a surplus to make up for resistance and conversion losses.
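To put rough numbers on that argument, here is a minimal sketch comparing the raw capacity the poster claims for a 6-pin connector with the much lower PCI-E spec budget. The 24 A figure is taken from the quote above; the 30% headroom factor is purely an assumed example value to illustrate the "go with a surplus" point, not an official figure.

[code]
# Sketch: raw copper capacity of a 6-pin PCIe connector vs. the spec budget,
# with an illustrative margin for losses and fluctuation (assumed value).

RAIL_VOLTAGE = 12.0          # volts on the +12 V rail
SIX_PIN_CURRENT = 24.0       # amps the post claims the connector wiring can carry
SIX_PIN_SPEC_WATTS = 75.0    # PCI-E spec power budget for a 6-pin connector
ASSUMED_HEADROOM = 0.30      # hypothetical 30% surplus for resistance/conversion losses

raw_capacity = RAIL_VOLTAGE * SIX_PIN_CURRENT            # 288 W of raw capacity
usable_after_headroom = raw_capacity * (1 - ASSUMED_HEADROOM)

print(f"raw capacity:        {raw_capacity:.0f} W")      # 288 W
print(f"after 30% headroom:  {usable_after_headroom:.0f} W")
print(f"PCI-E spec budget:   {SIX_PIN_SPEC_WATTS:.0f} W")
[/code]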
|
|
|
Jul 30, 2009, 03:20 PM // 15:20
|
#74
|
Wilds Pathfinder
Join Date: Jul 2008
Location: netherlands
Profession: Mo/E
|
Most power supplies have multiple 12V rails, so they make VGA cards with 2 x 6-pin connectors.
An 8-pin connector is a 6-pin with 2 extra ground lines (3 yellow, 5 black); I don't know why they rated that at 150 watts.
It's handy if you want to know how much power your VGA card needs: no connectors = max 75 watts, 1 x 6-pin = max 150 watts, 2 x 6-pin = max 225 watts, 1 x 8-pin and 1 x 6-pin = 300 watts, and 2 x 8-pin = 375 watts (see the sketch below).
I don't know if there are cards with 2 x 8-pin, maybe that weird Asus card with two GTX 285s on one board.
They called it a GTX 295, but normally that's two GTX 275s on one card.
Ah well, both are overkill for GW.
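A quick sketch of the connector math listed above: the slot supplies up to 75 W, each 6-pin connector adds up to 75 W, and each 8-pin adds up to 150 W. The function name and structure are made up for illustration.

[code]
# Maximum spec power for a card, given its external power connectors.
SLOT_WATTS = 75       # delivered through the PCIe slot itself
SIX_PIN_WATTS = 75    # per 6-pin connector
EIGHT_PIN_WATTS = 150 # per 8-pin connector

def max_board_power(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Return the maximum spec power (watts) for the given connector setup."""
    return SLOT_WATTS + six_pins * SIX_PIN_WATTS + eight_pins * EIGHT_PIN_WATTS

# Reproduces the figures in the post:
print(max_board_power())                          # no connectors   -> 75 W
print(max_board_power(six_pins=1))                # 1 x 6-pin       -> 150 W
print(max_board_power(six_pins=2))                # 2 x 6-pin       -> 225 W
print(max_board_power(six_pins=1, eight_pins=1))  # 6-pin + 8-pin   -> 300 W
print(max_board_power(eight_pins=2))              # 2 x 8-pin       -> 375 W
[/code]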
|
|
|