May 22, 2008, 09:20 PM // 21:20
|
#21
|
rattus rattus
Join Date: Jan 2006
Location: London, UK GMT±0 ±1hr DST
Guild: [GURU]GW [wiki]GW2
Profession: R/
|
Abadeus, the vast majority of the downsides you posted are due to equipment/software companies not making drivers available - it's not Vista's fault (the companies had PLENTY of time to come up with the goods).
I'm happily running GW at 150fps - so what if XP can run it at 200; I'm not going to see a difference.
__________________
Si non confectus, non reficiat
|
|
|
May 22, 2008, 09:40 PM // 21:40
|
#22
|
Site Contributor
Join Date: Apr 2006
Location: Usa
Guild: TKC
Profession: N/
|
Quote:
Originally Posted by Abedeus
|
This is ALMOST true. Support for Vista Service Pack 0 ends in 2012; we are currently on Vista SP1, and XP will reach end of life around the release of Windows 7.
Your link = fail.
That article is based on info from 2006. News flash: that's two years old...
Vista is the first stage of the Windows Longhorn project, and Windows 7 is the completed Longhorn project.
KTHXBYE
Last edited by zamial; May 22, 2008 at 09:42 PM // 21:42..
|
|
|
May 22, 2008, 09:41 PM // 21:41
|
#23
|
Desert Nomad
Join Date: Nov 2005
Location: United Kingdom
Profession: Me/
|
Quote:
Originally Posted by Brianna
Err, suppose it detects what version of DirectX so you don't have to do that? Who knows, but it's already been proven now (Thanks zamial) that GW2 will have DX10, so I don't see what the problem is.
And, assuming A-Net knows what they are doing, the community will not have to piss around with switches, and such. Definitely speculation on how that will work, but like I said I'm sure they know what they are doing.
|
You specifically stated running a switch. That was all I disagreed with, nothing else.
|
|
|
May 22, 2008, 11:25 PM // 23:25
|
#24
|
Jungle Guide
Join Date: Mar 2007
Guild: Mature Gaming Association
Profession: Me/E
|
Quote:
Originally Posted by Abedeus
Vista:
- Slow
- High requirements, offers nothing worth it
- The main advantage, the Aero interface, was already in use on other PCs and can be had in XP
- Doesn't work with older machines; processors and graphics that are too old
- A lot of existing machines don't work because there are no good drivers
- DOS programs and classic games work worse than on XP
- The longer it runs, the slower it gets
- Some XP games don't work either...
- Good programs don't have licenses under Vista.
And that's it. Oh, and Vista's support ends in 2012, XP's in 2014. And Windows 7 comes out not-so-long-from-now. Vista is like a beta version of W7...
Everything and more here: http://en.wikipedia.org/wiki/Criticism_of_Windows_Vista
|
That's a horrible Wiki page. It's out of date (badly), therefore it's useless since we're talking about Vista in today's incarnation.
Most notably, it doesn't consider Vista SP1.
Just look at the awful references in that article that try to show that Vista is slower than XP. They're flawed in ways like 1) being possibly pre-SP1, and 2) comparing systems with only 1 GB of RAM.
Windows XP had major issues at release and everyone was asking why they should switch to it from Win 2000. Then XP matured, just as Vista is already doing.
And if one more person says "OMG you can't find Vista drivers for things so it suxx", I'm going to laugh... Really, who is using fossilized hardware from the last millennium? And what hardware is it?
Buy a decently modern computer, IMO. Or put Linux on your old machine and freeze time!
|
|
|
May 22, 2008, 11:36 PM // 23:36
|
#25
|
Site Contributor
Join Date: Apr 2006
Location: Usa
Guild: TKC
Profession: N/
|
Quote:
Originally Posted by cebalrai
That's a horrible Wiki page. It's out of date (badly), therefore it's useless since we're talking about Vista in today's incarnation.
Most notably, it doesn't consider Vista SP1.
Just look at the awful references in that article that try to cite that Vista is slower than XP. They're flawed in ways like 1) possibly pre-SP1, and 2) they compare using systems with 1 GB of RAM.
Windows XP had major issues at release and everyone was asking why they should switch to it from Win 2000. Then XP matured, just as Vista is already doing.
And if one more person says "OMG you can't find Vista drivers for things so it suxx", I'm going to laugh... Really, who is using fossilized hardware from the last millennium? And what hardware is it?
Buy a decently modern computer IMO. Or put Linux on your old machine and freeze time!
|
QFT! Or, even better, turn it into a Myst box like the rest of us (not the game)!
I actually read that article, and even dated as it is, it goes on to say that almost all of the "problems" were fixed or false.
Last edited by zamial; May 22, 2008 at 11:38 PM // 23:38..
|
|
|
May 23, 2008, 07:19 AM // 07:19
|
#26
|
Jungle Guide
Join Date: Jun 2006
Location: Boise Idaho
Guild: Druids Of Old (DOO)
Profession: R/Mo
|
Not really wanting to bash Vista, all I can say is that the DRM built into that OS sucks. You cannot even time-shift some TV shows unless the broadcast company chooses to let you time-shift that show.
As for the real topic: DX10 support in GW2. I got the idea from reading Gaile's posts that it was an option: if you are using Vista, you will have the option of using DX10; if you are using Win XP, you will be running DX9 or not playing. I do not remember Gaile ever specifying which version of DX9, but the most common video cards at the time she made the comment were DX9c. Given developments since Gaile made her comments, and the improvements offered by DX10.1, I would guess that the game will scale up to that level. Most likely an auto-switch detects the highest level your video subsystem can use and loads that feature set.
Just my guess.
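That auto-switch guess boils down to a simple fallback search, which can be sketched in a few lines of Python. This is a toy illustration only, not ArenaNet's or DirectX's actual code; the feature-level names and the `supported` set passed in are hypothetical stand-ins:

```python
# Toy sketch of an auto-switch that picks the highest renderer the
# video subsystem supports. Levels are ordered most to least capable;
# the first match wins, so older cards fall back automatically.

FEATURE_LEVELS = ["dx10.1", "dx10", "dx9c", "dx9"]

def pick_renderer(supported):
    """Return the highest feature level present in `supported`."""
    for level in FEATURE_LEVELS:
        if level in supported:
            return level
    raise RuntimeError("no supported renderer found")

# A DX9c-era card from Gaile's time would be detected like so:
chosen = pick_renderer({"dx9", "dx9c"})  # -> "dx9c"
```

A Vista box reporting `{"dx10.1", "dx10", "dx9c", "dx9"}` would get `dx10.1` from the same call, which is the whole point of detecting rather than hard-coding a switch.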
|
|
|
May 23, 2008, 10:51 AM // 10:51
|
#27
|
Desert Nomad
Join Date: Nov 2005
Location: United Kingdom
Profession: Me/
|
I have to say I found Vista quite decent myself. The only issues I had were:
It tried to install drivers for various pieces of hardware twice. Not a big problem, but annoying.
Most importantly, though, there was a serious problem, caused by a variety of things, that made some people lose the ability to use USB storage devices. I got hit by that and went through three solutions found by other people before dropping back down to XP.
I run a variety of USB storage devices, so my system was crippled without them.
Try searching around the web and you will find a variety of issues in Vista that can cause this to happen. It may have been fixed in SP1, but I'll just stick with XP till I do another large upgrade.
I suggest people who buy a machine with Vista pre-installed give it a go, though, or at the very least dual-boot XP.
|
|
|
May 23, 2008, 07:00 PM // 19:00
|
#28
|
Ascalonian Squire
Join Date: Aug 2007
Guild: Darklight Brotherhood
Profession: W/
|
Vista 64-bit?
Will GW run on a 64-bit Vista setup?
Ty in advance
|
|
|
May 23, 2008, 07:43 PM // 19:43
|
#29
|
rattus rattus
Join Date: Jan 2006
Location: London, UK GMT±0 ±1hr DST
Guild: [GURU]GW [wiki]GW2
Profession: R/
|
Ooh, I just replied to a thread asking the very same question...
YES IT DOES!
Very happily, in fact
__________________
Si non confectus, non reficiat
|
|
|
May 23, 2008, 08:47 PM // 20:47
|
#30
|
Ascalonian Squire
Join Date: Aug 2007
Guild: Darklight Brotherhood
Profession: W/
|
Ty ty, appreciate the help... and can't wait for that speed lol
|
|
|
May 23, 2008, 09:23 PM // 21:23
|
#31
|
The Fallen One
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
|
Vista SP1 is fine and dandy. It is only slow if your PC is slow. Vista may be a bit bloated, but M$FT is improving the bloat; SP1 did a lot for that issue.
I am going to be upgrading this PC I am on right now to Vista after June 20th. Ooops, I said June 20th. *cough*
Just kidding guys! But here is a little known fact...
We (nVidia) will be releasing our newest graphics solution. I present to you... the Geforce GTX 280 and Geforce GTX 260.
These new cards will feature the PhysX stream processor built directly into the PCB. In addition, the GeForce GTX 280 will feature 240 unified stream processors, while its younger brother, the GeForce GTX 260, will feature 192.
Finally, the GeForce GTX 280 will feature a 512-bit memory interface with 1 GB of GDDR3 onboard, and the GeForce GTX 260 will feature a 448-bit memory interface with 896 MB of GDDR3 onboard.
Excited? You should be! I was on the team that developed the fabs for the 280, so I am pretty proud of my new baby. Get ready for some amazing power regulation and heat reduction kids!
|
|
|
May 23, 2008, 10:08 PM // 22:08
|
#32
|
rattus rattus
Join Date: Jan 2006
Location: London, UK GMT±0 ±1hr DST
Guild: [GURU]GW [wiki]GW2
Profession: R/
|
Sell me PhysX...
If you can without breaking that NDA thang
I read nVidia were going to be incorporating it, but with the almost-zero take up of the PhysX boards, I wonder why.
__________________
Si non confectus, non reficiat
|
|
|
May 23, 2008, 10:43 PM // 22:43
|
#33
|
The Fallen One
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
|
Quote:
Originally Posted by Snograt
Sell me PhysX...
If you can without breaking that NDA thang
I read nVidia were going to be incorporating it, but with the almost-zero take up of the PhysX boards, I wonder why.
|
Sales and marketing isn't my thing, lol.
But the CUDA-enabled GPUs we make can run the PhysX code perfectly, because GPUs excel at massive number calculation. CUDA essentially lets us use the leftover processing power (the parts of the GPU not being used for rendering) to perform very fast, accurate math, thus creating 3D physics for PhysX-enabled games.
CUDA in and of itself is a self-contained C programming environment, used primarily for massive number crunching. GPUs excel at number crunching (just like the Cell processor in the PS3), and they run at very high speeds, with hundreds of stream processors. CUDA programming allows us to use a GPU more like a CPU without affecting overall CPU or GPU performance. PhysX calculations would take a massive chunk of the CPU's time, but because GeForce cards are never fully utilized, we can tap the extra "wiggle room" and have it calculate physics code rather than just sitting there doing nothing.
If you are savvy about technology, I would advise you to read up on CUDA a bit more.
http://www.nvidia.com/object/cuda_home.html
The website explains a lot, and you can see what we and our partners use CUDA for. It is fairly interesting. Remember, my focus is fab production methods (hardware), not software design. So I know some basic CUDA functions and can write code with them, but I am nowhere near as skilled as our software engineers (those guys are geniuses).
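The data-parallel style described above can be mimicked in plain Python (a toy illustration only, not real CUDA or PhysX code): a GPU kernel is a tiny function run once per element, and because each particle's update is independent of the others, hundreds of stream processors can each take one particle at the same time. Here a loop stands in for the thread grid:

```python
# Plain-Python mimic of a data-parallel physics "kernel". On a GPU,
# each loop iteration below would be one thread; the iterations are
# independent, which is exactly why this kind of math maps so well
# onto hundreds of stream processors.

def integrate_kernel(pos, vel, dt, gravity=-9.8):
    """Advance every particle one timestep (semi-implicit Euler)."""
    out_pos, out_vel = [], []
    for p, v in zip(pos, vel):      # on a GPU: one thread per particle
        v = v + gravity * dt        # per-element math, no shared state
        p = p + v * dt
        out_pos.append(p)
        out_vel.append(v)
    return out_pos, out_vel

# Two falling particles, one 0.1 s step:
heights, speeds = integrate_kernel([10.0, 20.0], [0.0, 0.0], dt=0.1)
```

In real CUDA the loop disappears: the same body runs as a kernel over a grid of threads, each indexed by its thread ID, which is the "wiggle room" offload being described.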
|
|
|
May 23, 2008, 10:59 PM // 22:59
|
#34
|
Insane & Inhumane
|
Quote:
Originally Posted by Rahja the Thief
Sales and marketing isn't my thing, lol.
But the CUDA-enabled GPUs we make can run the PhysX code perfectly, because GPUs excel at massive number calculation. CUDA essentially lets us use the leftover processing power (the parts of the GPU not being used for rendering) to perform very fast, accurate math, thus creating 3D physics for PhysX-enabled games.
CUDA in and of itself is a self-contained C programming environment, used primarily for massive number crunching. GPUs excel at number crunching (just like the Cell processor in the PS3), and they run at very high speeds, with hundreds of stream processors. CUDA programming allows us to use a GPU more like a CPU without affecting overall CPU or GPU performance. PhysX calculations would take a massive chunk of the CPU's time, but because GeForce cards are never fully utilized, we can tap the extra "wiggle room" and have it calculate physics code rather than just sitting there doing nothing.
If you are savvy about technology, I would advise you to read up on CUDA a bit more.
http://www.nvidia.com/object/cuda_home.html
The website explains a lot, and you can see what we and our partners use CUDA for. It is fairly interesting. Remember, my focus is the fab production methods (hardware), not software design. So, I know how to use some basic CUDA functions and can write code with it, but I am nowhere near as skilled as our software engineers are (those guys are geniuses)
|
Looks like I'm building a new computer soon then..
|
|
|
May 23, 2008, 11:19 PM // 23:19
|
#35
|
The Fallen One
Join Date: Dec 2005
Location: Oblivion
Guild: Irrelevant
Profession: Mo/Me
|
Quote:
Originally Posted by Brianna
Looks like I'm building a new computer soon then..
|
Indeed, but from what I hear, AMD might be releasing some really crazy cards as well, so keep your eyes on their side of the court too. Despite my position at nVidia, that doesn't mean I don't advocate other products as a personal opinion. From what I hear, the new AMD cards use GDDR5 memory... they might be just what AMD needs to stay in the game at this point: highly competitive graphics cards with a low price tag. We shall see in early to mid June!
|
|
|
May 26, 2008, 12:20 AM // 00:20
|
#36
|
Site Contributor
Join Date: Apr 2006
Location: Usa
Guild: TKC
Profession: N/
|
AMD, did I hear AMD? I'll bring the steaks and mallows if we are going to use those chips! At least those will cook 'em.
|
|
|
May 26, 2008, 12:28 AM // 00:28
|
#37
|
Exclusive Reclusive
Join Date: May 2005
Location: Tuscaloosa, AL
Guild: Seraph's Pinion (wing)
Profession: R/Me
|
Yeah, AMD has a 280-killer in the works. That GDDR5 is something to consider.
If you work there, can ya get a bro some sponsorship for his project? I swear it's worth your time...
|
|
|
May 29, 2008, 01:18 AM // 01:18
|
#38
|
Academy Page
Join Date: May 2005
Location: Home
Profession: W/A
|
For those who run budget computers: they are not meant for graphics-intensive gaming.
For those who run high-end computers: try Vista 64-bit instead.
|
|
|
May 29, 2008, 03:17 AM // 03:17
|
#39
|
Lion's Arch Merchant
Join Date: Feb 2008
Guild: AFO
Profession: E/
|
Say hello to Vista 64-bit.
I happily run three OSes on this PC. WinXP is for games like Freelancer and one other oldie, Quake 3 (sadly, neither works flawlessly in multiplayer under Vista), for one coding tool that currently refuses to boot in Vista (the Vista version is a work in progress ^^), and for two music apps that don't work in Vista at all. There are newer versions out, but I don't have the money for the upgrades at the moment, and I tried the demo versions and don't like the new versions either => another reason I keep XP.
Second OS: Ubuntu 64-bit. Say hello to the penguin. All the coding except one compile is done under here (both my coding for Project Crosus (a huge mod manager/mod downloader tool) and for Sirius Reborn (a mod for Freelancer)).
Third OS: Vista 64-bit, for newer games and other stuff ^^
And while there are things I don't like in all three OSes I use, each one at least makes up for the others, so I'm happy.
Oh, and Rahja, when is nVidia releasing better drivers for Linux?
And I can't wait to see those new cards either.
|
|
|
May 29, 2008, 07:34 AM // 07:34
|
#40
|
Lion's Arch Merchant
Join Date: Dec 2006
Location: Australia
Profession: Mo/
|
Quote:
Originally Posted by Rahja the Thief
I am going to be upgrading this PC I am on right now to Vista after June 20th. Ooops, I said June 20th. *cough*
Just kidding guys! But here is a little known fact...
We (nVidia) will be releasing our newest graphics solution. I present to you... the Geforce GTX 280 and Geforce GTX 260.
These new cards will feature the PhysX stream processor built directly into the PCB. In addition, the GeForce GTX 280 will feature 240 unified stream processors, while its younger brother, the GeForce GTX 260, will feature 192.
Finally, the GeForce GTX 280 will feature a 512-bit memory interface with 1 GB of GDDR3 onboard, and the GeForce GTX 260 will feature a 448-bit memory interface with 896 MB of GDDR3 onboard.
Excited? You should be! I was on the team that developed the fabs for the 280, so I am pretty proud of my new baby. Get ready for some amazing power regulation and heat reduction kids!
|
Wow, you work for nVidia! I have always been an nVidia owner (GeForce 3, 5, 7950GX2, 8800 Ultra), and it sounds like it's time to sell my Ultra and tell my friends the release date is the 20th of June, lol.
Unfortunately, those leaked pictures of the card don't say much about its performance, so can you leak any 3DMark06 scores or Crysis benchmarks please?
(http://www.fudzilla.com/index.php?op...73&Ite mid=34)
Oh, and by the way, it's not quite a "little-known fact": tech sites have been going on about it for months, especially in recent weeks.
Last edited by Evil Genius; May 29, 2008 at 07:42 AM // 07:42..
|
|
|