XFX 8600GT comparison - please help!
I'm planning a new rig for when I receive my tax refund and I'm at a toss-up between two GeForce 8600GT cards, both manufactured by XFX. I know my fellow lads and lassies at tbcs have the knowledge to help me make a decision, so I need your input! :D
card A - XFX PVT84JUSD4 GeForce 8600GT link
Core clock: 680MHz
Memory: 256MB 1600MHz GDDR3
Price: $109.99 (-$30 MIR)
card B - XFX PVT84JYAJG GeForce 8600GT link
Core clock: 540MHz
Memory: 512MB 1180MHz GDDR2
Price: $99.99 (-$10 MIR)
My guess is that the extra 256MB of memory on card B is negligible when its memory runs more than 400MHz slower than card A's. On top of that, the GPU core is clocked 140MHz higher on card A than on card B. However, as much as I'd like to say that my "tech level" is "high" like most Newegg reviewers seem to, I honestly lack the experience to tell these cards apart in real-life use. What do you all think?
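For what it's worth, here's a quick back-of-the-envelope bandwidth check. It assumes both cards use the 8600GT's standard 128-bit memory bus and that the advertised MHz figures are effective (post-DDR) data rates, which is how Newegg usually lists them:

```python
# Rough peak memory bandwidth estimate for the two cards.
# Assumes the 8600GT's standard 128-bit memory bus and that the
# listed MHz numbers are effective (post-DDR) data rates.

BUS_WIDTH_BITS = 128  # 8600GT reference spec

def bandwidth_gb_s(effective_mhz):
    """Peak memory bandwidth in GB/s for a 128-bit bus."""
    return effective_mhz * 1e6 * (BUS_WIDTH_BITS // 8) / 1e9

card_a = bandwidth_gb_s(1600)  # card A: 256MB GDDR3 @ 1600MHz
card_b = bandwidth_gb_s(1180)  # card B: 512MB GDDR2 @ 1180MHz

print(f"Card A: {card_a:.1f} GB/s")  # Card A: 25.6 GB/s
print(f"Card B: {card_b:.1f} GB/s")  # Card B: 18.9 GB/s
```

By that math card A has roughly a third more raw memory bandwidth, which tends to matter more in games than the extra 256MB of capacity.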
-Zac
P.S. This scenario reminds me of a Ti 4200 64MB vs. FX 5200 128MB debate I ran into a few years ago, where the amount of memory was irrelevant because the Ti easily blew the FX card out of the water.
Re: XFX 8600GT comparison - please help!
I'd go with card A. The amount of memory only really comes into play at the resolution you run your games at... and from what I know, 256MB is fine up to a little above 1280x1024...
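To put a rough number on that: the frame buffers themselves barely dent 256MB. A quick sketch, assuming 32-bit color with a front buffer, a back buffer, and a 32-bit depth buffer (and ignoring textures, which are what actually eat VRAM):

```python
# VRAM used by just the bare frame buffers at a given resolution.
# Assumes 32-bit color and three buffers (front, back, depth);
# textures and geometry are not counted here.

def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Frame buffer footprint in MiB."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(framebuffer_mib(1280, 1024))  # 15.0 MiB
print(framebuffer_mib(1600, 1200))  # ~22 MiB
```

So the resolution itself only accounts for a few MB; the real pressure on a 256MB card comes from the bigger textures games load at higher detail settings.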
Re: XFX 8600GT comparison - please help!
That, and card A has GDDR3. I say card A.
Re: XFX 8600GT comparison - please help!
Card A all the way. Those are nice cards, btw.
Re: XFX 8600GT comparison - please help!
I've always been told that, for the most part, 256MB is all the memory you will need, with the exception of these new monster cards, of course.
So if that's true, then card A is definitely the better deal. Even if it's not true, the faster 256MB evens out with the slower 512MB, doesn't it? Maybe? So if that's the case, then all you need to look at is the core clock, which favors card A. That also explains why it's more expensive.
Also, since it's not a battle between ATi and nVidia, you don't have to worry about company support and drivers; nVidia has that down pretty well.
Looks like card A has the better features as well: a beefier heatsink, and dual DVI instead of a single DVI plus D-sub. Although you don't get that S-Video cable... darn. lol
It also says it's HDCP/HDMI ready, whereas the other one doesn't, so at least you know this one is guaranteed not to give you any trouble there. Although I don't know what kind of trouble you could get in with the other card if any at all.
Card a is the better choice.
Re: XFX 8600GT comparison - please help!
Quote:
Originally Posted by
Zephik
Although I don't know what kind of trouble you could get in with the other card if any at all.
Are you talking about the port or an actual cable?
Re: XFX 8600GT comparison - please help!
Quote:
Originally Posted by
Spawn-Inc
Are you talking about the port or an actual cable?
I don't know what I was talking about. lol Neither, I guess.
What makes a card "HDMI/HDCP Ready"? :think:
I guess I was thinking drivers or something like that. So maybe if you didn't update the drivers on the non-ready card, you'd get poorer quality or something. But I don't know if that's true or not, so yeah. lol Why do I feel really confused all of a sudden? ><
Or maybe it's that if you use HDMI/HDCP, the card gets too hot? So the larger heatsink on card A would make it "ready," since it could battle the heat more efficiently. But, like I said, I really have no idea.
Re: XFX 8600GT comparison - please help!
Not sure. I know some of the ATi cards come with an adapter that goes from DVI to HDMI; maybe you could take one of those and put it on the nVidia card?
Re: XFX 8600GT comparison - please help!
Quote:
Originally Posted by
Spawn-Inc
Not sure. I know some of the ATi cards come with an adapter that goes from DVI to HDMI; maybe you could take one of those and put it on the nVidia card?
Good question. I'd imagine it would work, but I wonder if it would ACTUALLY work. What I mean is, I'm sure it would display something, most likely proper video just as if you used a regular DVI or D-sub connection, but would it actually give you true HDMI quality?
:think: