
GTX 280/260 Specs

HD 4870 is rumored to be 1.3x as fast as 9800GTX. And in what games? Or just in 3dmark again?

ATI cards always seem to win in 3dmark. They also always seem to lose in real games. We're going to have to wait and see.

I don't think anyone here is a fanboy, and it's pretty ignorant of you to imply that. I don't think there's a single person on these forums who would buy a slower card for more money simply because it's made by one company or the other. ATI guys just sling the fanboy term around as an insult, and it's pretty lame.

I'd never buy an ATI again, even if it were faster, tbh.

A while back my 6800 broke, and I was forced to use an ATI for two months. I had a multitude of problems and would never go back to them again, ever.
 
Hynix 0.8ns. The GTS and GT have Qimonda 1.0ns chips rated at 2000MHz. The GTX has Samsung 1.0ns chips rated at 2200MHz.
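For anyone wondering how those ns ratings line up with the MHz numbers, here's a quick back-of-the-envelope check. It's just a sketch assuming the usual GDDR3 rule of thumb (rated clock = 1 / cycle time, and double data rate doubles the effective figure); actual bins like those 2200MHz Samsung parts can sit above it.

```python
# Rough sketch: map a memory chip's cycle-time rating (ns) to its
# nominal clock and effective (DDR) data rate. Rule of thumb only;
# vendors bin chips above or below this.
def ddr_rating(cycle_time_ns: float) -> tuple[float, float]:
    clock_mhz = 1000.0 / cycle_time_ns   # e.g. 1.0ns -> 1000MHz clock
    effective_mhz = 2.0 * clock_mhz      # DDR transfers twice per cycle
    return clock_mhz, effective_mhz

for ns in (1.0, 0.8):
    clock, eff = ddr_rating(ns)
    print(f"{ns}ns -> {clock:.0f}MHz clock, {eff:.0f}MHz effective")
# 1.0ns -> 1000MHz clock, 2000MHz effective (matches the Qimonda rating above)
# 0.8ns -> 1250MHz clock, 2500MHz effective
```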

Then why not the HD 4800 series? The HD 4870 is known to be 1.3x as fast as the 9800GTX, and the price is in the $250-$300 range. And we already know the HD 4870 X2 beats the GTX 280 in 3DMark Vantage whilst costing less: http://sg.vr-zone.com/?i=5851

^^ Thought I'd post that as none of the Nvidia fanboys seem to have the courage to do so. Plus that's with unoptimized drivers.

Yeah, I'm not a fanboy either. My last card was an ATI X1950 Pro and I loved that one, though it was a poor overclocker. I switch to whatever the better deal is at the time of upgrading, and ATI has always had problems with its drivers. I don't care about 3DMark or 3DMark Vantage; in-game performance is all that matters. And I don't believe ATI can turn the tables with better drivers this time around either. Of course, at that price the HD 4870 looks damn good, but with the drivers holding it back, one really wouldn't feel happy about it (let's hope things change, though). Look at the 2900XT specs: they're fantastic, but then the performance...
 
HD 4870 is rumored to be 1.3x as fast as 9800GTX. And in what games? Or just in 3dmark again?

ATI cards always seem to win in 3dmark. They also always seem to lose in real games. We're going to have to wait and see.

I don't think anyone here is a fanboy, and it's pretty ignorant of you to imply that. I don't think there's a single person on these forums who would buy a slower card for more money simply because it's made by one company or the other. ATI guys just sling the fanboy term around as an insult, and it's pretty lame.

Don't forget that Nvidia cards had consistently been winning at 3DMark Vantage. The fact that they're now getting beaten by ATI says something, I think.
 
ATI cards always seem to win in 3dmark. They also always seem to lose in real games. We're going to have to wait and see.
They lose in some games because Nvidia has its TWIMTBP branding all over them, but there are a lot of games where ATI comes out on top: Source engine titles, Call of Juarez, and I've seen the 3870/3870 X2 perform better in Unreal Engine 3 titles than the equivalent Nvidia cards (GX2/GT), mainly because it's a very shader-intensive engine.

Like you said though, let's just wait and see. I was merely suggesting to Fishman that he didn't have to shell out $650 on a GTX 280, and that the 4800 series is another option at a great price.
I don't think anyone here is a fanboy, and it's pretty ignorant of you to imply that. I don't think there's a single person on these forums who would buy a slower card for more money simply because it's made by one company or the other. ATI guys just sling the fanboy term around as an insult, and it's pretty lame.
I actually meant that as more of a joke. But there wasn't even a mention of that piece of news in this thread so I just thought I'd bring it up.
I'd never buy an ATI again, even if it were faster, tbh.

A while back my 6800 broke, and I was forced to use an ATI for two months. I had a multitude of problems and would never go back to them again, ever.
I don't think that's very intelligent. You should never just stick with one brand and remain oblivious to everything else; that's called being a fanboy. I have owned an X800, an X1900XT, and a 2900XT (although the latter only for a very short time) and I had no problems with any of those cards or their drivers. In fact, I had more trouble with my GT's drivers than I had with all of the ATI cards. Catalyst Control Centre is pretty sweet too.
(let's hope things change, though). Look at the 2900XT specs: they're fantastic, but then the performance...
Why does everyone go on about the R600's specs? Let's start with the 512-bit memory interface: if the GPU doesn't actually make use of all that bandwidth, there is no performance gain, as RV670 proved with its 256-bit bus. Next up, stream processors. 320?! Yes, but remember that ATI did AA and AF through the shaders and not through the ROPs. When you compare R600's TMUs and ROPs to G80's, suddenly it's the Nvidia card that has the awesome specs.
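To put the 512-bit point in numbers: peak memory bandwidth is just bus width (in bytes) times the effective memory clock. A minimal sketch, with the R600/RV670 clocks below being approximate reference values I'm assuming rather than figures from this thread:

```python
# Peak theoretical memory bandwidth in GB/s:
# (bus width / 8) bytes per transfer * effective clock in MHz / 1000.
def peak_bandwidth_gbs(bus_width_bits: int, effective_mhz: int) -> float:
    return bus_width_bits / 8 * effective_mhz / 1000.0

# Assumed approximate reference clocks:
print(peak_bandwidth_gbs(512, 1656))  # R600 / 2900 XT:  ~106 GB/s
print(peak_bandwidth_gbs(256, 2250))  # RV670 / 3870:    ~72 GB/s
```

RV670 gave up roughly a third of that bandwidth and still performed about the same, which is exactly the point: R600 couldn't use what it had.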
 
I don't think that's very intelligent. You should never just stick with one brand and remain oblivious to everything else; that's called being a fanboy. I have owned an X800, an X1900XT, and a 2900XT (although the latter only for a very short time) and I had no problems with any of those cards or their drivers. In fact, I had more trouble with my GT's drivers than I had with all of the ATI cards. Catalyst Control Centre is pretty sweet too.

I hate Catalyst Control Centre. Besides, I am an Nvidia fanboy :D.
 
I hate Catalyst Control Centre. Besides, I am an Nvidia fanboy :D.

QFT. CCC is a mess, and the drivers slow down games that are CPU-bound, MS FSX (and FS9) being very good examples; a 7600GT has no problem outperforming a 3870 there. I'm not an Nvidia fanboy by choice; it's more that I dislike ATI's Windows drivers so much, and most importantly the terrible 3D support on Linux. TransGaming Cedega still doesn't support ATI, AFAIK, and Nvidia users don't need support (not that one can expect support from Cedega in any case).
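The CPU-bound point is easy to model: if the driver adds CPU overhead every frame, a faster GPU can't buy that frame time back. A toy sketch, with all the numbers made up purely for illustration:

```python
# Toy model of a CPU-bound game: the frame takes as long as the slower
# side, i.e. CPU work (game logic + driver overhead) vs. GPU work.
def fps(cpu_ms: float, driver_ms: float, gpu_ms: float) -> float:
    frame_ms = max(cpu_ms + driver_ms, gpu_ms)
    return 1000.0 / frame_ms

# Hypothetical numbers: flight sims hammer the CPU, not the GPU.
print(fps(cpu_ms=28, driver_ms=2, gpu_ms=12))  # lean driver, slower GPU: ~33 fps
print(fps(cpu_ms=28, driver_ms=9, gpu_ms=7))   # heavy driver, faster GPU: ~27 fps
```

So the faster card loses as long as the bottleneck stays on the CPU side.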
 
The Source engine and Call of Juarez are ATI-branded, but ATI runs a much smaller program than Nvidia because they don't have the resources.
 
The Source engine and Call of Juarez are ATI-branded, but ATI runs a much smaller program than Nvidia because they don't have the resources.

Well, they never wanted to spend huge amounts of money on a game just to slap their name all over it.
 
I can't see what's wrong with TWIMTBP if Nvidia pays for the developers' time, because we get better framerates. If they force the devs to write worse code for ATI or remove ATI's advantages, like DX10.1, that shouldn't happen.

Eventually developers optimize their code for whoever has the biggest market share; if ATI delivers kick-ass cards around $200, they have nothing to fear.
Sticking with Valve and giving away Half-Life games with their cards seems to work better than NV's branding.
 
Eventually developers optimize their code for whoever has the biggest market share; if ATI delivers kick-ass cards around $200, they have nothing to fear.

When the biggest market share favors Nvidia, and games are being optimized ESPECIALLY for Nvidia, how in the world do you expect ATI to perform better? It just doesn't make sense. E.g., you can see ATI cards outperform Nvidia cards in Half-Life 2, even though Nvidia DOES have kick-*** cards.

This market-share problem is hurting the gaming industry in the sense that it's harder than ever for consumers to choose which card, or even which CONSOLE, to get, because consoles have exclusive titles too. It's all too messed up and cynical on the companies' part.
 
The problem with TWIMTBP is that it doesn't help make BETTER games; it helps make games that leave nVidia in a better position than ATi, that's it. It only helps the company that's paying the money, and the fanboys who get excited about seeing higher numbers from the nVidia side than from ATi.

They're not going to push developers to lean on texture processing because it's good for the game; they're going to do it because they know nVidia cards are strong at it and it's the major weakness of ATi's. They're not going to discourage developers from using DX10.1 (maybe even "bribe" them not to use it) because it wouldn't be beneficial; they're going to do it because it would be an advantage for ATi. And so on...

TWIMTBP isn't there to benefit you as a consumer; it's there to benefit nVidia as the company that pays the money...
 
When the biggest market share favors Nvidia, and games are being optimized ESPECIALLY for Nvidia, how in the world do you expect ATI to perform better?

By releasing better cards than what NV has to offer. It would be quite disappointing if the new ATI generation couldn't deal with NV's 8800 die shrink.
With those they can improve their market share, since sub-$200 cards make up most of the market. I expect NV to own the upper end, but can they come up with a midrange card in time?
 
The problem with TWIMTBP is that it doesn't help make BETTER games; it helps make games that leave nVidia in a better position than ATi, that's it. It only helps the company that's paying the money, and the fanboys who get excited about seeing higher numbers from the nVidia side than from ATi.

They're not going to push developers to lean on texture processing because it's good for the game; they're going to do it because they know nVidia cards are strong at it and it's the major weakness of ATi's. They're not going to discourage developers from using DX10.1 (maybe even "bribe" them not to use it) because it wouldn't be beneficial; they're going to do it because it would be an advantage for ATi. And so on...

TWIMTBP isn't there to benefit you as a consumer; it's there to benefit nVidia as the company that pays the money...


How's the tinfoil helmet fitting today? Perhaps you need to start a new thread for TWIMTBP conspiracy theories...
 
How's the tinfoil helmet fitting today? Perhaps you need to start a new thread for TWIMTBP conspiracy theories...

He's actually not wrong. Do you think TWIMTBP is just a little label that pops up when games boot? It goes a little deeper than that. Watch the interviews with the CEO of Crytek about how they worked specifically with Nvidia to make the game perform better on their cards...
 
How's the tinfoil helmet fitting today? Perhaps you need to start a new thread for TWIMTBP conspiracy theories...

Nah. If someone else starts one, maybe I'll post in there, maybe not, but I'm not especially interested in opening a discussion about the business practices of one company or another.

But I don't think you need to be paranoid to believe that when a hardware company invests a certain quantity of resources to become an active part of other companies' software development process, it's because they want to take advantage of that privileged position to benefit their own hardware against the competition. ;)
 
Nah. If someone else starts one, maybe I'll post in there, maybe not, but I'm not especially interested in opening a discussion about the business practices of one company or another.

But I don't think you need to be paranoid to believe that when a hardware company invests a certain quantity of resources to become an active part of other companies' software development process, it's because they want to take advantage of that privileged position to benefit their own hardware against the competition. ;)

I LIKE that they work with game developers to make stuff run better on their hardware. AMD could be doing the same thing, and I sort of think they're foolish not to, since their cards would no doubt do better than they do now. The tinfoil hat remark was just because people seem to think Nvidia purposely sabotages AMD's performance in TWIMTBP games, and I don't think that's the case at all :)
 