Guild Wars Forums - GW Guru
 
 

Old Jan 05, 2010, 02:58 AM // 02:58   #1
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Rahja's FREEEEEEEE!!!!

At midnight (EST, GMT-5), I will fill in the blanks.... MUHAHAHAA.

GF100 outperforms ATi's 5870 by 46% on average
GF100 outperforms ATi's 5970 by 8% on average

The GF100 gets 148 fps in DiRT2
The GF100 gets 73 fps in Crysis2
The GF100 gets 82 fps in AvP3

*GTX misnomers removed due to Business NDA*

GF100's maximum load temperature is 55°C.

The release week of GF100 is Mar 02nd

Blackberry ordered about a million Tegra2 units for their 2011 smartphone.
Apple ordered a few million Tegra2 units for the 2011 iPhone.
Nintendo ordered several million Tegra2 units for their next gen handheld (the DS2!)

*Removed: Under business NDA*

That's all for now, kiddies! See if you can guess in the meantime; each - represents a letter or number.

Extra spoilers!
  • GF100 and GF104 will feature a new 32x Anti Aliasing mode for enthusiast setups.
  • GF100 and GF104 can do 100% hardware based decoding for 1080p BluRay and H264 playback.
  • GF100 and GF104 feature full SLi capability and a new scaling method for rendering!
  • GF100 and GF104 will provide full on-chip native C++ operation for Windows and Linux environments. This will be further augmented with CUDA and OpenCL (see the sketch just below this list).
  • GF104 will feature new technology designed for UHD OLED monitors!
  • GF100 promises to deliver at least 40% more performance than the GTX295 for less money. GF104 promises double that.
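
For anyone wondering what the on-chip C++ plus CUDA bullet looks like in practice, here is a minimal sketch. It is purely illustrative: the kernel name, sizes, and values are made up for the example, and it targets the plain CUDA toolkit rather than anything GF100-specific.

Code:
// Minimal sketch: an ordinary templated C++ kernel compiled for the GPU with CUDA.
// All names, sizes, and values here are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

// C++ templates compile straight to device code.
template <typename T>
__global__ void saxpy(int n, T a, const T* x, T* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device buffers
    float *dx = nullptr, *dy = nullptr;
    cudaMalloc((void**)&dx, bytes);
    cudaMalloc((void**)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // y = 3x + y over one million elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);   // expect 5.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}

Build with something like nvcc saxpy.cu -o saxpy; the point is just that ordinary C++, templates included, compiles straight to device code, with CUDA (or OpenCL) handling the launch plumbing.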
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 02:59 AM // 02:59   #2
End
Forge Runner
 
End's Avatar
 
Join Date: Jan 2008
Location: Rubbing Potassium on water fountains.
Guild: LF guild that teaches MTSC (did it long ago before gw2 came out and I quit...but I barely remember)
Profession: N/A
Default

YEY

12 chars of yeyness



On a side note... what are you going to do now without a job? Dedicate yourself wholly to the GWGuru Technician's Corner?

Last edited by End; Jan 05, 2010 at 03:22 AM // 03:22..
End is offline   Reply With Quote
Old Jan 05, 2010, 03:19 AM // 03:19   #3
Silence and Motion
 
Ariena Najea's Avatar
 
Join Date: Jul 2006
Location: Buffalo NY
Guild: New Horizon [NH]
Default

It's over 9000!

Also, congrats sort of!
__________________
Currently active in GW1 as of February 2015!
Ariena Najea is offline   Reply With Quote
Old Jan 05, 2010, 03:28 AM // 03:28   #4
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by End View Post
YEY

12 chars of yeyness



On a side note... what are you going to do now without a job? Dedicate yourself wholly to the GWGuru Technician's Corner?
Go back to school? DUHH
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 03:53 AM // 03:53   #5
Furnace Stoker
 
MisterB's Avatar
 
Join Date: Oct 2005
Location: Planet Earth, Sol system, Milky Way galaxy
Guild: [ban]
Profession: W/
Default

Quote:
Originally Posted by Rahja the Thief View Post
At midnight (EST, GMT-5), I will fill in the blanks.... MUHAHAHAA.

Nvidia's cookies outperform ATi's brownies by 24.89% on average
HAL9000 outperforms ATi's 5870 by 9001% on average

*Removed reference to removed content*


Rahja's maximum load temperature is 40 °C.

The release week of the apocalypse is in 2012, the week of December 22nd. But it's all over on the 21st.

Cyberdyne Systems ordered about a million T-102 units for their 2011 smartphone.
Microsoft ordered a few million Playstation 2 units for the 2011 XBoxRetro.
The US Army ordered several million Stinger 2 units for their next gen handheld.

Phillip Morris USA is planning to move their primary focus to the juvenile market and the infant & toddler market.

That's all for now, kiddies! See if you can guess in the meantime; each - represents a letter or number.
Mad libs are fun. I bet all my "guesses" are wrong.

Last edited by MisterB; Jan 05, 2010 at 05:16 AM // 05:16.. Reason: changes are in bold
MisterB is offline   Reply With Quote
Old Jan 05, 2010, 03:55 AM // 03:55   #6
End
Forge Runner
 
End's Avatar
 
Join Date: Jan 2008
Location: Rubbing Potassium on water fountains.
Guild: LF guild that teaches MTSC (did it long ago before gw2 came out and I quit...but I barely remember)
Profession: N/A
Default

Quote:
Originally Posted by MisterB View Post
Mad libs are fun. I bet all my "guesses" are wrong.
Quote:
The release week of Guild Wars 2's release
Fixed the release date

Side note again: over 150 people viewing a Technician's Corner thread O.o...
You'd think this was Riverside...

Last edited by End; Jan 05, 2010 at 04:21 AM // 04:21..
End is offline   Reply With Quote
Old Jan 05, 2010, 05:32 AM // 05:32   #7
Frost Gate Guardian
 
Celestial Crown's Avatar
 
Join Date: Oct 2006
Default

What are the yields at now, 5%?

I like how you provide zero hardware specs for the system used in those benches. Hell, you don't even list the resolution, and you're using games that aren't even out yet.

Fermi is going to flop. AMD currently holds every single market segment and Nvidia will only take away the high end at a ridiculous price. AMD will just release a higher end single GPU (5890 or something) and still contend in the enthusiast market.

"The architecture is broken, badly designed, and badly thought out. Nvidia does not understand the basics of modern semiconductor design, and is architecting its chips based on ego, not science. The era of massive GPUs is long over, but Nvidia (mis-)management doesn't seem to want to move their egos out of the way and do the right thing. Now the company is left with a flagship part it can't make."

http://semiaccurate.com/2009/12/21/n...-fermi-448sps/
Celestial Crown is offline   Reply With Quote
Old Jan 05, 2010, 05:44 AM // 05:44   #8
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by Celestial Crown View Post
What are the yields at now, 5%?

I like how you provide zero hardware specs for the system used in those benches. Hell, you don't even list the resolution, and you're using games that aren't even out yet.

Fermi is going to flop. AMD currently holds every single market segment and Nvidia will only take away the high end at a ridiculous price. AMD will just release a higher end single GPU (5890 or something) and still contend in the enthusiast market.

"The architecture is broken, badly designed, and badly thought out. Nvidia does not understand the basics of modern semiconductor design, and is architecting its chips based on ego, not science. The era of massive GPUs is long over, but Nvidia (mis-)management doesn't seem to want to move their egos out of the way and do the right thing. Now the company is left with a flagship part it can't make."

http://semiaccurate.com/2009/12/21/n...-fermi-448sps/
Charlie is a fool. He has a journalism degree.

Games that aren't available to the public but use DX11... HMMM, why would I use DX11 games? That's a tough one....isn't it? Any other gems of wisdom you would like to spew? I revealed this info because I am no longer employed by nVidia and several of the technical NDAs expired at midnight, seeing as how Jan 4th was my last day. The business NDAs still stand.

AvP3 and Crysis2 can be taken with a grain of salt, since those games aren't final and the drivers aren't finalized yet. DiRT2 may improve or decline slightly, depending on the final drivers.

Those tests were run using a Core i7 920, 6GB of DDR3-1333, and an Intel 64GB SSD, paired with a single GF100 card. The tests were run at 1920x1200 with 4x SSAA and 16x AF. Happy? If you want more details, wait for a benchmarking site to run benchmarks like everyone else. I am spoiling you as it is. Don't bite the hand that feeds.
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 05:51 AM // 05:51   #9
Frost Gate Guardian
 
Celestial Crown's Avatar
 
Join Date: Oct 2006
Default

Quote:
Originally Posted by Rahja the Thief View Post
Charlie is a fool. He has a journalism degree.
Games that aren't available to the public but use DX11... HMMM, why would I use DX11 games? That's a tough one....isn't it? Any other gems of wisdom you would like to spew? I revealed this info because I am no longer employed by nVidia and several of the technical NDAs expired at midnight, seeing as how Jan 4th was my last day. The business NDAs still stand.

AvP3 and Crysis2 can be taken with a grain of salt, since those games aren't final and the drivers aren't finalized yet. DiRT2 may improve or decline slightly, depending on the final drivers.

Those tests were run using a Core i7 920, 6GB of DDR3-1333, and an Intel 64GB SSD, paired with a single GF100 card. The tests were run at 1920x1200 with 4x SSAA and 16x AF. Happy?

No.

The number of SPs was cut. Nvidia is a horrible company; just look at what they did with Batman: Arkham Asylum. They locked the AA code to be vendor-specific, so it disables when an ATI card is detected. This was proven by changing the VendorID to Nvidia on an ATI card, and then, magically, AA worked in the demo. Also, disabling PhysX whenever a non-Nvidia card is the primary GPU was a bad business move; now people are just bypassing it with modified drivers. I could go on and on about why Nvidia has a bad business model, but surely you know from working there. Oh, and I almost forgot the fake Fermi that Mr. Jen-Hsun Huang displayed, which Nvidia claimed was real and later had to retract. Oh, guess what else... REBRANDS. Nvidia rebrands everything (8800 = 9800 = GTS 250), and the GTX 3xx cards or whatever are just rebrands of the GTX 2xx mobile cards. Epic failure on Nvidia's part.

You should also mention the ambient temperature at which you measured those 55°C load temps. Are you running it outside in some remote location with an ambient of -20°C? There's no way a card with 225W of power draw will run at 55°C under load, judging by how the stock cooler looked.
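
As a quick sanity check on that complaint: take the 225W power-draw figure above and assume an ordinary ~25°C room ambient (an assumption, not a measurement), and the claimed 55°C load temperature implies an effective cooler thermal resistance of roughly (55 - 25) / 225 ≈ 0.13 °C/W. The little sketch below just spells out that arithmetic; change the assumed ambient or wattage and the figure moves accordingly, which is exactly what is being argued about.

Code:
// Back-of-envelope check on the 55 C load-temp claim.
// Assumptions (not measurements): 225 W board power, 25 C room ambient.
#include <cstdio>

int main() {
    const double power_watts  = 225.0;  // power draw cited in the post above
    const double t_load_c     = 55.0;   // claimed maximum load temperature
    const double t_ambient_c  = 25.0;   // assumed room ambient -- the disputed variable

    // Effective cooler thermal resistance: theta = (T_load - T_ambient) / P
    const double theta_c_per_w = (t_load_c - t_ambient_c) / power_watts;
    printf("implied thermal resistance: %.3f C/W\n", theta_c_per_w);  // ~0.133 C/W
    return 0;
}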
Celestial Crown is offline   Reply With Quote
Old Jan 05, 2010, 06:24 AM // 06:24   #10
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by Celestial Crown View Post
No.

The number of SPs was cut. Nvidia is a horrible company; just look at what they did with Batman: Arkham Asylum. They locked the AA code to be vendor-specific, so it disables when an ATI card is detected. This was proven by changing the VendorID to Nvidia on an ATI card, and then, magically, AA worked in the demo. Also, disabling PhysX whenever a non-Nvidia card is the primary GPU was a bad business move; now people are just bypassing it with modified drivers. I could go on and on about why Nvidia has a bad business model, but surely you know from working there. Oh, and I almost forgot the fake Fermi that Mr. Jen-Hsun Huang displayed, which Nvidia claimed was real and later had to retract. Oh, guess what else... REBRANDS. Nvidia rebrands everything (8800 = 9800 = GTS 250), and the GTX 3xx cards or whatever are just rebrands of the GTX 2xx mobile cards. Epic failure on Nvidia's part.

You should also mention the ambient temperature at which you measured those 55°C load temps. Are you running it outside in some remote location with an ambient of -20°C? There's no way a card with 225W of power draw will run at 55°C under load, judging by how the stock cooler looked.
I apologize your school system didn't teach you to read.... such a pity.
Tesla (GT300) is very different from GF100. Do not mistake this again, or so help me Jesus I will intellectually murder you. Trust me, I am good at it.

Now shoo, pest... you clearly are Charlie's protégé. The man wouldn't be so annoying if he didn't have so much E-Rage towards a company. He fails to realize that without nVidia, ATi would charge insane amounts of money for their GPUs, and couldn't care less about what he thought. ATi is in it for the money, just as every other successful company is.

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 06:41 AM // 06:41   #11
Frost Gate Guardian
 
Celestial Crown's Avatar
 
Join Date: Oct 2006
Default

Quote:
Originally Posted by Rahja the Thief View Post
I apologize your school system didn't teach you to read.... such a pity.
Tesla (GT300) is very different from GF100. Do not mistake this again, or so help me Jesus I will intellectually murder you. Trust me, I am good at it.

Now shoo, pest... you clearly are Charlie's protégé. The man wouldn't be so annoying if he didn't have so much E-Rage towards a company. He fails to realize that without nVidia, ATi would charge insane amounts of money for their GPUs, and couldn't care less about what he thought. ATi is in it for the money, just as every other successful company is.

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
I assumed Fermi was the backbone for the Tesla and GeForce cards. Either way, I'm right about everything and you have done nothing to refute my statements.


Quote:
<Rahja_the_Thief> GF100 is single, correct
<Rahja_the_Thief> less than or equal to 300w
<Rahja_the_Thief> that hasn't been totally finalized
<Rahja_the_Thief> since A3 silicon just got back
<Rahja_the_Thief> so I am not totally sure

<Rahja_the_Thief> based on the spin, I'd take a stab at 250w
<Rahja_the_Thief> but give or take 50w
<Rahja_the_Thief> Sorry I can't be more specific, but I haven't received new data on this since November
<Rahja_the_Thief> They put you into a blackout period
You said this, taken from the IRC chat. Even at ~250W TDP, there's no way there are going to be 55°C load temps, unless your testing conditions were like I previously stated.

And trust me, I'm not a fanboy. I couldn't care less about AMD/ATI; I've owned both Intel and AMD processors, and both Nvidia and ATI GPUs.

Last edited by Celestial Crown; Jan 05, 2010 at 06:43 AM // 06:43..
Celestial Crown is offline   Reply With Quote
Old Jan 05, 2010, 07:22 AM // 07:22   #12
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by Celestial Crown View Post
I assumed Fermi was the backbone for the Tesla and GeForce cards. Either way, I'm right about everything and you have done nothing to refute my statements.




You said this, taken from the IRC chat. Even at ~250W TDP, there's no way there are going to be 55°C load temps, unless your testing conditions were like I previously stated.

And trust me, I'm not a fanboy. I couldn't care less about AMD/ATI; I've owned both Intel and AMD processors, and both Nvidia and ATI GPUs.

You are wrong. How many ways can I say this? Wrong? Incorrect? Mistaken? Dumb? Idiot? Invalid data detected? Negative on that, Ghostrider? I mean... really!

The Fermi you were shown was Tesla. GF100 is different on many levels, which you will soon understand (provided you take the time to read through a site other than that abomination Charlie runs).

And yes, there are ways of getting 55°C load temps at those TDPs. You just aren't thinking outside the box. Also, in-house testing is biased; expect the temps in most people's systems to be higher, I wouldn't refute that. I just posted the data I have. Take it how you will. (Note: I just realized that said GF104, not GF100; I have changed it, as that was a typo.)
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 07:46 AM // 07:46   #13
Forge Runner
 
Join Date: Jan 2007
Default

I've heard nVidia sucked, I've heard ATi sucked... I've heard they both sucked. No one can ever settle it. And whenever I bring up something like this anywhere, the respective enthusiast will try to explain to you why his brand is better than the other, and so on. Btw, I've even heard ATi outperforms nVidia just like this card supposedly outperforms the ATi card. Lost cause IMO. Framerates over 60 aren't even noticeable anyway.
Bob Slydell is offline   Reply With Quote
Old Jan 05, 2010, 08:00 AM // 08:00   #14
Desert Nomad
 
Join Date: Sep 2007
Profession: N/
Default

Quote:
Originally Posted by Rahja the Thief View Post

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
Biggest piece of irony on this forum? I think so.
jiggles is offline   Reply With Quote
Old Jan 05, 2010, 08:13 AM // 08:13   #15
Krytan Explorer
 
Join Date: Jan 2007
Default

Quote:
Originally Posted by Chrisworld View Post
I've heard nVidia sucked, I've heard ATi sucked... I've heard they both sucked. No one can ever settle it. And whenever I bring up something like this anywhere, the respective enthusiast will try to explain to you why his brand is better than the other, and so on. Btw, I've even heard ATi outperforms nVidia just like this card supposedly outperforms the ATi card. Lost cause IMO. Framerates over 60 aren't even noticeable anyway.
Ditto. I think the general consensus is that nVidia is powerful but weak, and ATi is weak but powerful.
I guess we'll find out.
jackinthe is offline   Reply With Quote
Old Jan 05, 2010, 08:18 AM // 08:18   #16
Jungle Guide
 
KZaske's Avatar
 
Join Date: Jun 2006
Location: Boise Idaho
Guild: Druids Of Old (DOO)
Profession: R/Mo
Default

Sounds like Nvidia is about to open a can of whoop-a$$.

Last edited by KZaske; Jan 05, 2010 at 08:19 AM // 08:19.. Reason: Fingers in the wrong place; bad keyboard, bad.
KZaske is offline   Reply With Quote
Old Jan 05, 2010, 08:38 AM // 08:38   #17
The Fallen One
 
Lord Sojar's Avatar
 
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
Default

Quote:
Originally Posted by jiggles View Post
Biggest piece of irony on this forum? I think so.

Clearly ironic, seeing as how I own an ATi HD5870.... CLEARLY.
__________________
Lord Sojar is offline   Reply With Quote
Old Jan 05, 2010, 08:40 AM // 08:40   #18
Forge Runner
 
Join Date: Jan 2007
Default

Quote:
Originally Posted by Rahja the Thief View Post
Clearly ironic, seeing as how I own an ATi HD5870.... CLEARLY.
Oh yeah? What's with the nVidia sig then?
Bob Slydell is offline   Reply With Quote
Old Jan 05, 2010, 08:42 AM // 08:42   #19
The Greatest
 
Arkantos's Avatar
 
Join Date: Feb 2006
Profession: W/
Default

Quote:
Originally Posted by Chrisworld View Post
Oh yeah? What's with the nVidia sig then?
He worked for nVidia, so I suppose he's trying to promote/support it, even though he has a competitor's card.
Arkantos is offline   Reply With Quote
Old Jan 05, 2010, 08:46 AM // 08:46   #20
Forge Runner
 
Join Date: Jan 2007
Default

Quote:
Originally Posted by Arkantos View Post
He worked for nVidia, so I suppose he's trying to promote/support it, even though he has a competitor's card.
My apologies. Advertising. Never would have thought of it.
Bob Slydell is offline   Reply With Quote