Nov 11, 2006, 12:31 PM // 12:31
#21
Tech Monkeh Mod
Join Date: May 2005
Location: Good Old North East of England
Profession: Mo/Me
As I have said numerous times, comparing memory and core clock speeds is pointless. What matters in the case of graphics chips/cards is the number of pixel pipelines and vertex shader units, whether your memory is DDR, DDR2, GDDR3, or the newest GDDR4, and the width of the memory bus, i.e. 32-bit, 64-bit, 128-bit, or 256-bit; here, as always, wider is better.
Sadly, as has been said, the X1300 is a low-end card and will not give the performance you deem acceptable. The 9200, although older technology, is actually a better card.
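To put rough numbers on the memory side of that: theoretical bandwidth is bus width times effective clock. A minimal sketch in Python, assuming the X1300's listed 533 MHz is its effective (DDR) rate, and, since the 9200 spec list posted later in the thread omits the memory clock, assuming a typical ~400 MHz effective rate for it:

```python
# Rough theoretical memory bandwidth: (bus width in bytes) x effective
# (double-data-rate) memory clock. Real-world throughput is lower.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# X1300 PCI: 128-bit GDDR2 at 533 MHz (assumed to be the effective rate).
print(f"X1300: {bandwidth_gb_s(128, 533):.1f} GB/s")  # ~8.5 GB/s

# Radeon 9200: 128-bit DDR; the listing omits the clock, so ~400 MHz
# effective is an assumption based on typical 9200 boards.
print(f"9200:  {bandwidth_gb_s(128, 400):.1f} GB/s")  # ~6.4 GB/s
```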
Nov 11, 2006, 08:29 PM // 20:29
#22
Ascalonian Squire
Join Date: Jul 2006
Guild: Team HAXX
Profession: Rt/Mo
Then why can the X1300 run EVERY other game EXCEPT Guild Wars better than the 9200?
Since the 9200 is supposedly the better card?
Please make sense.
And if clock and core speeds are pointless, why are they exactly what you change when you overclock?
Nov 12, 2006, 12:12 AM // 00:12
#23
Tech Monkeh Mod
Join Date: May 2005
Location: Good Old North East of England
Profession: Mo/Me
Quote:
Originally Posted by Trizkit
Then why can the X1300 run EVERY other game EXCEPT Guild Wars better than the 9200?
Since the 9200 is supposedly the better card?
Please make sense.
And if clock and core speeds are pointless, why are they exactly what you change when you overclock?
I do make sense; please read again what I wrote regarding the performance of graphics cards. I said comparing clock speeds is irrelevant and pointless; the points I made about the cards, i.e. pixel pipelines etc., are the important factors in performance.
Now, before you make a fool of yourself, please read up on it. As for overclocking, clock speed only comes into consideration on the card you're overclocking.
Please do not make this into an argument over something so trivial; after all, I am right.
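To illustrate with made-up numbers: a card's pixel fill rate is roughly pipelines times core clock, so comparing raw clocks across different cards says little, while overclocking only scales your own card's figures by the percentage you add. A minimal sketch; both cards here are hypothetical:

```python
# Pixel fill rate in megapixels/s = pixel pipelines x core clock (MHz).
def fill_rate_mpix(pipelines, core_mhz):
    return pipelines * core_mhz

# Hypothetical cards: the lower-clocked card wins on pipeline count.
print(fill_rate_mpix(4, 250))  # 1000 Mpix/s
print(fill_rate_mpix(2, 400))  #  800 Mpix/s, despite the higher clock

# Overclocking the same card only buys the percentage you add (10% here).
print(fill_rate_mpix(4, 275))  # 1100 Mpix/s
```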
Nov 12, 2006, 01:17 AM // 01:17
#24
Ascalonian Squire
Join Date: Jul 2006
Guild: Team HAXX
Profession: Rt/Mo
Quote:
Originally Posted by cannonfodder
I do make sense; please read again what I wrote regarding the performance of graphics cards. I said comparing clock speeds is irrelevant and pointless; the points I made about the cards, i.e. pixel pipelines etc., are the important factors in performance.
Now, before you make a fool of yourself, please read up on it. As for overclocking, clock speed only comes into consideration on the card you're overclocking.
Please do not make this into an argument over something so trivial; after all, I am right.
(From Newegg; same cards I have.)
X1300 PCI specs.
Model
Brand VisionTek
Model VTKX1300256PCI
Interface
Interface PCI
Chipset
Chipset Manufacturer ATI
GPU Radeon X1300
Core clock 450MHz
Pixel Pipelines 4
Memory
Memory Clock 533MHz
Memory Size 256MB
Memory Interface 128-bit
Memory Type GDDR2
3D API
DirectX DirectX 9
OpenGL OpenGL 2.0
Ports
D-SUB 1
DVI 1
TV-Out HDTV/S-Video/Composite Out
VIVO No
General
Vista Ready Yes
Dual-Link DVI Supported Yes
Tuner None
RAMDAC 400 MHz
Max Resolution 2560x1600
Cooler With Fan
Operating Systems Supported Windows XP
Windows XP Media Center Edition
Windows 2000
Windows Vista
My 9200 Specs.
Brand ATI
Model 100-436009
Interface
Interface PCI
Chipset
Chipset Manufacturer ATI
GPU Radeon 9200
Pixel Pipelines 4
Memory
Memory Size 128MB
Memory Interface 128-bit
Memory Type DDR
3D API
DirectX DirectX 8
OpenGL OpenGL 1.3
Ports
D-SUB 1
TV-Out S-Video/Composite Out
VIVO No
General
Vista Ready No
RAMDAC 400 MHz
Max Resolution 2048x1536
Cooler Fanless
Operating Systems Supported Windows 98/ME/2000/XP
How can you say the 9200 is a better GPU than the X1300?
Give me links proving this. I'm not arguing with you, or trying to make myself or YOU look like an ass, so quit trying to do the same to me. I said please; it isn't like I cursed at you or anything.
Besides, explain why the 9200 can't get over 20 FPS in Source on high graphics while the X1300 gets 40-60. I know the 9200 can't run some games at all, because of certain pixel shader limitations I believe (I'm not 100% sure), but it also just doesn't run as fast as the X1300. It only has a heatsink too, though that shouldn't affect its performance.
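From what I've read, the shader gap is real: the 9200 family is a DirectX 8.1 part topping out at Pixel Shader 1.4, while the X1300 is a DirectX 9 part with Shader Model 3.0 support. A quick sketch of why that matters regardless of clock speed; the can_run_ps helper is purely illustrative:

```python
# Maximum pixel shader version per GPU family (published specs).
MAX_PS = {"Radeon 9200": 1.4, "Radeon X1300": 3.0}

# Illustrative helper: a game needing PS 2.0 effects cannot enable them
# on a PS 1.4 card at any clock speed.
def can_run_ps(card, required_version):
    return MAX_PS[card] >= required_version

print(can_run_ps("Radeon 9200", 2.0))   # False
print(can_run_ps("Radeon X1300", 2.0))  # True
```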
I have to overclock the 9200 just to get it to run GW at a steady 15 FPS or so, though I know it's not smart to overclock PCI cards, especially ones cooled by only a passive heatsink.
The point is, the X1300 stands a lot higher than the 9200. I could see them being comparable if mine were a 9200 PRO, but it's just a 9200SE; it doesn't have a better GPU than the X1300.
If you want to give me links and prove me wrong, more power to you. I'm not trying to flame you, so don't take it that way.
I'm just looking for support as to why this card can't run GW but can run any other game. VisionTek support (the makers of my X1300 card) says it's the game, because the card can run any other game, including games with far higher requirements than GW.
And besides, even if the 9200 were better than the X1300 (which, GPU-wise, it's not), it still has enough power to run GW at playable framerates, not 5-10 FPS. I've run GW on integrated graphics with that kind of FPS before.
If you would, just lock this thread; I'm obviously not going to get anywhere with it. I figured someone in the GW Guru universe might have an X1300 PCI card too, but I guess I was wrong. The thread has gone off topic and become pointless.
I'm just going to go buy a GeForce 6200.
Last edited by Trizkit; Nov 12, 2006 at 06:35 AM // 06:35..
Nov 12, 2006, 06:29 AM // 06:29
#25
Lion's Arch Merchant
Join Date: Dec 2005
Profession: Mo/
Last edited by dronex; Nov 12, 2006 at 06:32 AM // 06:32..
Nov 12, 2006, 06:32 AM // 06:32
#26
Ascalonian Squire
Join Date: Jul 2006
Guild: Team HAXX
Profession: Rt/Mo
Looks to me like the PCI X1300 still comes out on top, stats-wise, and that 9200 is an AGP 8x card, not a PCI 9200.
Nov 12, 2006, 07:27 AM // 07:27
#27
Lion's Arch Merchant
Join Date: Dec 2005
Profession: Mo/
Quote:
Originally Posted by Trizkit
Looks to me the PCI X1300 comes out on top, stats wise still- and that 9200 is an AGPx8 - not a PCI 9200.
Yeah, they don't have PCI cards on that site; that's why I chose AGP. But still, the X1300 doesn't have the power to run GW in DX9.
And come on, PCI? It's really time for an upgrade, don't you think?
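For the record, rough peak bandwidth figures from the bus specs themselves (and plain PCI is shared with every other PCI device in the machine):

```python
# Peak theoretical bus bandwidth in MB/s (bus width in bytes x clock).
pci    = 32 / 8 * 33.3   # plain PCI: 32-bit at 33 MHz, ~133 MB/s, shared
agp_1x = 32 / 8 * 66.6   # AGP 1x baseline: 32-bit at 66 MHz, ~266 MB/s
agp_8x = agp_1x * 8      # AGP 8x: ~2133 MB/s

print(f"PCI:    {pci:.0f} MB/s")
print(f"AGP 8x: {agp_8x:.0f} MB/s")  # roughly 16x the plain PCI figure
```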
Last edited by dronex; Nov 12, 2006 at 07:30 AM // 07:30..