The 7800 was just released as a quick fix for the X850... Only a few more features, but nothing revolutionary.
Originally Posted by Frakk
The 7800 was just released as a quick fix for the X850... Only a few more features, but nothing revolutionary.

maybe, maybe not
http://omid.tomshardware.com/
Artificial intelligence is no match for natural stupidity.
- Unknown
What I get out of that article is that Nvidia sells more video cards than ATI, and that's where they win. I don't know if that's true; Nvidia's last generation, the FX series, was a real burn, and a lot more people bought better ATI cards for less. "Unless ATI can come up with an Extreme Makeover of epic proportions, it risks losing the high-end of the desktop forever. Not because of technology or product, but because it lacks the testicular chutzpah of Nvidia." What they're saying is that it's not the technology but ATI's marketing that will hurt them, which I don't believe. Anyway, my post wasn't about this, but about performance and overall dual-GPU setups. ATI is just a better buy on price/performance, and they have amazing tech support as well.
I respect your interpretation. I thought it was very lucid.
I think packaging might count for other products, but no one who's building their own computer is going to pick the lesser of two cards.
ATI will kick Nvidia's ass now, because they already had the better GPU and now they have taken away Nvidia's advantage of SLI. Also, I saw somewhere that the X900 XT P.E. will have 32 pixel pipelines. ATI also has the advantage of knowing the opposition's performance. My money is on ATI.
"Hey look, the topic!! It's getting away! Quick catch it! *running* Must *running more* get on *panting* TOP...IC!!!! *collapses* "-Temmink
Intel Q9450 @ 3.2GHz
Asus P5E3 Premium WiFi
OCZ Platinum PC3-12800 1600MHz XTC 4GB 7-7-7-24
2 x Powercolor HD4870 512MB in Crossfire
Thermalright Ultra-120 Extreme with Noctua 120mm 1300RPM Pressure Optimised Fan
Windows Vista Ultimate 64bit w/ SP1
I was just wondering, is this the same chip that's going to be used in the PS3?
Nvidia has 32 pipelines as well; ATI cannot afford to stay on 16, so they will have to go to 32, or who knows if they took it a step further with their 90nm fab process. If they did, Nvidia will cry, but they will most likely do 32 pipes. Also, ATI is making unified pipelines in their R500 (Xbox 360 GPU), which is a HUGE advantage for the future, because the processor's power can be used 100%: there won't be resting vertex or pixel pipelines, since each can do both. However:

"It remains to be seen whether ATI will adopt a unified shader approach or a more conventional GPU with R520, but we will place our bets on it being a more conventional GPU for the time being, despite how radical R500, or Xbox 360's GPU, is designed."
I think that says it all. ATI can now settle on a speed, but Nvidia will bring out more models, most likely with higher speeds. I don't think a few MHz will help them, but we will have to see.

"We get the impression that ATI are waiting for final performance numbers on GeForce 7800 GTX before they go ahead and finalise the clock speeds of the top-end SKU based around their upcoming R520. By that time, NVIDIA have the option to release a faster part that will more than likely be labelled a GeForce 7800 Ultra, seeing as they've only used a single slot cooling solution that is constructed entirely from Aluminium – we await the arrival of a dual slot copper based cooler once R520 has 'beaten' GeForce 7800 GTX in a few benchmarks."
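The unified-pipeline point above can be sketched with a toy model. This is purely illustrative (not real GPU code, and the pipe counts are made-up numbers, not actual R500/R520 specs): with dedicated vertex and pixel pipes, spare capacity on one side can't absorb load from the other, while unified pipes can all work on whatever the frame needs.

```python
# Hypothetical illustration of fixed vertex/pixel pipes vs unified pipes.
# All function names and pipe counts here are assumptions for the sketch.

def fixed_split_throughput(vertex_work, pixel_work, vertex_pipes, pixel_pipes):
    """Work units finished per cycle with dedicated pipes:
    idle vertex pipes cannot help with pixel work, and vice versa."""
    done_vertex = min(vertex_work, vertex_pipes)
    done_pixel = min(pixel_work, pixel_pipes)
    return done_vertex + done_pixel

def unified_throughput(vertex_work, pixel_work, total_pipes):
    """Work units finished per cycle when every pipe can do either job."""
    return min(vertex_work + pixel_work, total_pipes)

# A pixel-heavy frame: 2 units of vertex work, 30 units of pixel work.
fixed = fixed_split_throughput(2, 30, vertex_pipes=8, pixel_pipes=16)
unified = unified_throughput(2, 30, total_pipes=24)
print(fixed)    # 18 -- six dedicated vertex pipes sit idle
print(unified)  # 24 -- all 24 unified pipes stay busy
```

Same total hardware in both cases (24 pipes), but the unified design wastes none of it on an unbalanced frame, which is the "100%" advantage being described.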
Originally Posted by Zeus
I thought u said flaccid for a sec...

Sigmund Freud strikes again!
Having had one running as of yesterday morning, I can say it's a little more than a clock revision or just a "few" changes to the 6800. It's smaller, single slot, longer, and much quieter. Speculation is rarely accurate; anything that follows can potentially beat it, having learned what to do differently.
It runs HL2 at 1600x1200 "like butter." I could list all the settings, but I was pushing it pretty hard to see how it behaved. The "optimal overclock" at a little over 400/1.13 dropped to desktop from BF2 and UT2K4 a couple of times, so I expect it's a stability issue at those clocks, not a RAM or pipe issue.
The Luna Demo, LIVE, is VERY impressive. Those at the Nvidia GeForce party and trade shows were likely as impressed as I was.
The bench in 3DMark05, using the optimal driver overclock, was a little over 8600, running on a stock FX-53 (no OC on that yet) and 1GB of PC4000 Corsair in dual-channel mode. The motherboard is a DFI UT LanParty, modded to SLI and SATA II, but I am still waiting on my second 7800 to test in it (GeForce LAN prizes, both).
In BF2, it opens up the 1600x1200 settings, and I tried them at High +2xAA while overclocked. It ran smoothly, but the desktop crash concerns me, so I dropped it back to stock until I get another cooler for it.