
Thread: XFX 8600GT comparison - please help!

  1. #1
    Overclocked a.Bird's Avatar
    Join Date
    Sep 2006
    Location
    Ithaca, NY
    Posts
    343

    Default XFX 8600GT comparison - please help!

    I'm planning a new rig for when I receive my tax return, and I'm at a toss-up between two GeForce 8600GT cards, both manufactured by XFX. I know my fellow lads and lassies at tbcs have the knowledge to help me make a decision, so I need your input!


    card A - XFX PVT84JUSD4 GeForce 8600GT link
    Core clock: 680MHz
    Memory: 256MB 1600MHz GDDR3
    Price: $109.99 (-$30 MIR)

    card B - XFX PVT84JYAJG GeForce 8600GT link
    Core clock: 540MHz
    Memory: 512MB 1180MHz GDDR2
    Price: $99.99 (-$10 MIR)


    My guess is that the extra 256MB of memory on card B is negligible if it runs more than 400MHz slower than card A. On top of that, the GPU is clocked 140MHz higher on card A than on card B. However, as much as I'd like to say that my "tech level" is "high" like most Newegg reviewers seem to, I honestly lack the experience to tell these cards apart in real-world use. What do you all think?

    -Zac

    P.S. This scenario reminds me of a ti4200 64MB vs. fx5200 128MB debate I encountered a few years ago, where the amount of memory was irrelevant as the titanium easily blew the fx card out of the water.
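
    The memory clocks above can be turned into a rough peak-bandwidth comparison. This is just a sketch, assuming both cards use the 8600GT reference design's 128-bit memory bus (worth confirming on each card's spec page):

```python
# Rough peak memory-bandwidth comparison for the two cards.
# Assumes both use the 8600GT reference design's 128-bit memory bus;
# the listed clocks are the effective (data-rate) memory clocks.
BUS_WIDTH_BITS = 128

def bandwidth_gb_s(effective_clock_mhz, bus_width_bits=BUS_WIDTH_BITS):
    """Peak bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

card_a = bandwidth_gb_s(1600)  # 256MB GDDR3 @ 1600MHz effective
card_b = bandwidth_gb_s(1180)  # 512MB GDDR2 @ 1180MHz effective

print(f"Card A: {card_a:.1f} GB/s")  # 25.6 GB/s
print(f"Card B: {card_b:.1f} GB/s")  # 18.9 GB/s
```

    So on the same bus width, card A moves roughly a third more data per second, which supports the hunch that its smaller but faster memory wins.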

  2. #2
    read my comic already! (sig) xRyokenx's Avatar
    Join Date
    Dec 2006
    Location
    Bloomingburg, NY
    Posts
    4,416

    Default Re: XFX 8600GT comparison - please help!

    I'd go with card A. The amount of memory only really plays into the resolution you're going to be running your games at... and 256MB is good up until a little higher than 1280x1024 from what I know...
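
    That rule of thumb can be sanity-checked with quick arithmetic. A minimal sketch, assuming 32-bit color and three full-resolution buffers (front, back, and depth); note the framebuffer itself is small, and most of a card's memory actually goes to textures, which is why resolution-based rules of thumb are fuzzy:

```python
# Back-of-envelope framebuffer cost at a given resolution.
# Assumes 32-bit color (4 bytes/pixel) and three full-resolution
# buffers: front, back, and depth. Texture memory is not counted.
def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
    """Approximate framebuffer footprint in MB."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(f"1280x1024: {framebuffer_mb(1280, 1024):.1f} MB")  # 15.0 MB
print(f"1600x1200: {framebuffer_mb(1600, 1200):.1f} MB")  # 22.0 MB
```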

  3. #3
    Gaming Console Modder Mitternacht's Avatar
    Join Date
    Nov 2006
    Location
    Pennsylvania
    Posts
    1,294

    Default Re: XFX 8600GT comparison - please help!

    That, and card A has GDDR3. I say card A.

    Quote Originally Posted by Crimson Sky
    I never like get into a battle of wits with an unarmed opponent.

  4. #4
    Fox Furry crenn's Avatar
    Join Date
    Apr 2005
    Location
    In the shadows behind you
    Posts
    4,067

    Default Re: XFX 8600GT comparison - please help!

    My money is on card A.
    Antec Sonata II | Pioneer DVR-212
    Good news! You can follow my website or follow me on twitter!

  5. #5
    . Spawn-Inc's Avatar
    Join Date
    Feb 2007
    Location
    Canada
    Posts
    4,145

    Default Re: XFX 8600GT comparison - please help!

    Card A all the way. Those are nice cards, btw.
    CPU: Q6600 G0 3.5GHz@1.4v (4.2GHz max) / 4790k 4.8ghz @1.265v
    GPU: 9800GTX /GTX780 hydrocopper
    Ram: Samsung 4GB /gskill 16gb DDR3 1600
    Mobo: EVGA-NF68-A1 680i (P32) /AsRock Extreme6
    PSU: Enermax Galaxy 850Watt /EVGA 850 G2
    HDD: OCZ 120GB Vertex4, Samsung evo 840 250GB
    LCD: Samsung 32" LN32A450, Samsung 226BW 22" wide
    Sound: Logitech Z 5500
    CPU & GPU: 3x Swiftech MCR320, 2x MCP655, MCW60 R2, Dtek Fuzion V2, 18 high speed yates @ 5v

  6. #6
    Sunshine Flavored Lollipops Zephik's Avatar
    Join Date
    Oct 2006
    Location
    Spokane, Washington
    Posts
    5,145

    Default Re: XFX 8600GT comparison - please help!

    I've always been told that, for the most part, 256MB is all the memory you'll need, with the exception of these new monster cards, of course.

    So if that's true, then card A is definitely the better deal. Even if it's not true, doesn't the faster 256MB roughly even out with the slower 512MB? Maybe? So then all you need to look at is the core clock, which favors card A, and also explains why it's more expensive.

    Also, since it's not a battle between ATi and nVidia, you don't have to worry about company support and drivers; nVidia has that down pretty well.

    Looks like card A has the better features as well: a beefier heatsink, and dual DVI instead of single DVI with D-sub. Although you don't get that S-Video cable... darn. lol

    It also says it's HDCP/HDMI ready, whereas the other one doesn't, so at least you know this one is guaranteed not to give you any trouble there. Although I don't know what kind of trouble you could get into with the other card, if any at all.

    Card A is the better choice.
    People are stupid; given proper motivation, almost anyone will believe almost anything. Because people are stupid, they will believe a lie because they want to believe it's true, or because they are afraid it might be true. People’s heads are full of knowledge, facts, and beliefs, and most of it is false, yet they think it all true. People are stupid; they can only rarely tell the difference between a lie and the truth, and yet they are confident they can, and so are all the easier to fool.

  7. #7
    . Spawn-Inc's Avatar
    Join Date
    Feb 2007
    Location
    Canada
    Posts
    4,145

    Default Re: XFX 8600GT comparison - please help!

    Quote Originally Posted by Zephik View Post
    Although I don't know what kind of trouble you could get in with the other card if any at all.
    are you talking about the port or an actual cable?

  8. #8
    Sunshine Flavored Lollipops Zephik's Avatar
    Join Date
    Oct 2006
    Location
    Spokane, Washington
    Posts
    5,145

    Default Re: XFX 8600GT comparison - please help!

    Quote Originally Posted by Spawn-Inc View Post
    are you talking about the port or an actual cable?
    I don't know what I was talking about. lol Neither, I guess.

    What makes a card "HDMI/HDCP Ready"?

    I guess I was thinking drivers or something like that. So I guess if you didn't update the drivers on the non-ready card, you might see poorer quality or something. But I don't know if that's true or whatever, so yeah. lol Why do I feel really confused all of a sudden? ><

    Or maybe it's that if you use the HDMI/HDCP, the card gets too hot? So the larger heatsink on card A would make it "ready", since it could handle the heat more efficiently. But, like I said, I really have no idea.

  9. #9
    . Spawn-Inc's Avatar
    Join Date
    Feb 2007
    Location
    Canada
    Posts
    4,145

    Default Re: XFX 8600GT comparison - please help!

    Not sure. I know some of the ATI cards come with a connector that goes from DVI to HDMI; maybe you could take one of those and put it on the nVidia card?

  10. #10
    Sunshine Flavored Lollipops Zephik's Avatar
    Join Date
    Oct 2006
    Location
    Spokane, Washington
    Posts
    5,145

    Default Re: XFX 8600GT comparison - please help!

    Quote Originally Posted by Spawn-Inc View Post
    not sure, i know some of the ati cards come with a connector that goes from dvi to HDMI, maybe you could take one from that and put it on the nvidia card?
    Good question. I'd imagine it would work, but I wonder if it would ACTUALLY work? What I mean is, I'm sure it would display something, most likely proper video just as if you used a regular DVI or D-sub connection, but would it actually display HDMI quality?

