
GTX 280/260 Specs

There's probably money involved somewhere, which would explain why AMD/ATI is unable to get involved. I don't know anything about Nvidia's business practices, but other companies (like Intel) have been getting into massive trouble for purposely sabotaging their competitors (AMD), so it wouldn't surprise me too much. Like I said, though, I don't know anything about it.
 
Well, nVidia didn't spend chump change making sure they were the name on the box for Crysis, for example. I thought I heard it was a multi-million dollar deal; for some reason $5 million sticks in my head, but I'm not sure.

ATI is out there helping to improve games as well, just not spending money to slap their name on the box and in the intro to the games.

nVidia has lots of money to spend, and ATI doesn't, thanks to AMD.
 
I LIKE that they work with game developers to make stuff run better on their hardware. AMD could be doing the same thing, and I sort of think they're foolish not to, since their cards would no doubt do better than they do now. The tinfoil-hat remark was just because people seem to think Nvidia purposely sabotages AMD's performance in TWIMTBP (The Way It's Meant To Be Played) games, and I don't think that's the case at all :)

One goes with the other: if you work with the developer to favour your rendering methods, then all other rendering methods come second, and are therefore inferior; otherwise there's no point in working with the developers in the first place. So yes, that is exactly what they're doing, just in a more subtle manner.
 
The last few cores from Nvidia remind me of Intel's tick/tock strategy.

G80 was a big jump from the previous gen, but power-hungry and hot. G92 was a much smaller die and very efficient in terms of price : performance : yield compared to G80.

If this trend continues, I think the core after GT200 is going to be the next amazing one, once they shrink it down, improve the thermals, and bump the speeds.

On the other hand, look how long the 8800GTX / 8800 Ultra lasted; they're still good by current standards. So I think the GTX 280 is going to have some nice longevity. We'll see how quickly Nvidia looks to revise this core.

We'll see if they also do something cool with the 512-bit bus, because at this point it's performing similarly to ATI's smaller-bus / faster-memory combination, yet a 512-bit bus costs much more to produce. Put GDDR5 on the 512-bit bus, Nvidia! kthanx!
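Rough peak-bandwidth math, just as a sketch (the GTX 280's GDDR3 data rate is roughly 2214 MT/s; the GDDR5 numbers are assumed examples, not any real card's spec):

# peak memory bandwidth in GB/s = (bus width in bits / 8 bytes) * effective data rate in MT/s / 1000
def bandwidth_gbs(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts / 1000

print(bandwidth_gbs(512, 2214))   # GTX 280 style: 512-bit GDDR3        -> ~141.7 GB/s
print(bandwidth_gbs(256, 3600))   # ATI style: 256-bit + fast GDDR5     -> ~115.2 GB/s
print(bandwidth_gbs(512, 3600))   # the 512-bit + GDDR5 wishlist combo  -> ~230.4 GB/s

So a narrower bus with GDDR5 already gets within striking distance of the 512-bit GDDR3 setup, which is exactly the cost-vs-bandwidth point.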

G92 is still an amazing core; it owns the R600 and, with price cuts, will be able to compete easily with the RV770.
 
I have a hypothetical theory, and that is that nVidia and ATI are both in the wrong.

Nvidia should have stayed with the 8800 cards and really tweaked them so that the drivers could use the full bandwidth of the cards, while also providing HSF upgrades and testing the cards at overclocked settings, then letting those who are afraid of overclocking know what the cards can be pushed to. For instance, you buy a stock card at Ultra speeds but can actually reach the speeds of the new GTX with that upgraded HSF keeping it stable at low temps. Now, I know you can watercool and OC greatly, but not everyone has watercooling, so a higher-CFM HSF would be needed.

My point is: stop making new cards and max out the potential of the current cards available. The drivers are what make the cards operate at these speeds, and stop with the vdroop so we can get the cards the power they need. They could also provide BIOS flashes, like a mobo gets, for better performance with PCIe 2.0 slots. Again, just my opinion. I'm also aware they only make new cards slightly better to get people to buy, and that their business goal is to make more money.
 
Slightly better... yeah, considering the new card is more powerful than two of the old ones in SLI, I wouldn't say "slightly"...
 
They better not be $800 like I just read about.. :screwy: I just got an 8800GT for my second rig for $144. That leaves about $650 for gas for my car.
 
There isn't enough raw crunching power in the G80 core to make full use of the available bandwidth. It's not a driver issue.
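Just to put rough numbers on that balance (peak figures quoted from memory, so treat them as assumptions rather than gospel):

# shader throughput available per byte of memory bandwidth
def flops_per_byte(peak_gflops, bandwidth_gbs):
    return peak_gflops / bandwidth_gbs

print(flops_per_byte(518, 86.4))    # 8800GTX (G80):   ~518 GFLOPS / ~86.4 GB/s  -> ~6.0
print(flops_per_byte(933, 141.7))   # GTX 280 (GT200): ~933 GFLOPS / ~141.7 GB/s -> ~6.6

Even GT200 only raises the compute-per-byte ratio a little; the ceiling on G80 is the shader core, not the memory bus, so no driver can conjure extra performance out of bandwidth that's already sitting idle.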

Looking at just the speed of the GPU is not very wise. There are many other factors that play a part, such as the architecture.

You could put a massive HSF on a G80, but it will still run hot, and heat up your room. Making cooler, faster, more efficient chips is better for everybody.

The 8800GTX/Ultra have been out for 2 yrs. I'm pretty sure they have been maxed out as much as possible. Just ask the DI/LN2 guys.

The G92 chips actually have negative vdroop. As load is applied the vcore goes up.

It is already possible to flash the gfx card BIOS. It doesn't provide PCIe2.0 support, though. Not that it's needed.
 
[Attached screenshot: 3SLI_GTX280_m.png]

From techPowerUp .. Tri-SLI'd GTX 280s + 4GHz Quad scores P21350 3DMarks in Vantage! For comparison, my rig (3.6GHz Quad + a 9800GX2) scores P10010 .. and the current Vantage WR set by the Foxconn Quantum Force Team (basically all the best OCers in the world) @ Computex 2008 with a 5.6GHz Quad + a 9800GX2 is P23286 3DMarks ..

Graphics score wise .. Tri-SLI'd GTX 280s = a 9800GX2 @ 1085/1150 (under LN2)
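Quick ratios on those overall P scores (keep in mind the P preset folds CPU tests into the total, so this isn't a pure GPU comparison):

tri_gtx280 = 21350   # 3x GTX 280 @ stock + 4GHz quad
gx2_stock  = 10010   # my 9800GX2 + 3.6GHz quad
gx2_wr     = 23286   # WR 9800GX2 (LN2) + 5.6GHz quad

print(tri_gtx280 / gx2_stock)   # ~2.13x my rig
print(gx2_wr / tri_gtx280)      # the WR is only ~1.09x ahead despite the 5.6GHz CPU and LN2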
 
Could be a CPU bottleneck? Add 1.6GHz to the CPU on that tri-SLI GTX 280 setup and I bet you get similar or better scores than the GX2 on LN2.
 
The graphics cards are also running stock clocks. OC the GTX 280s and ramp the CPU up to that 5.6GHz and then you'll see something different :)

On the extreme setting, any money says the GTX 280 setup blows the GX2 out of the water.

Don't get me wrong, GX2 is a great card, but let's compare apples to apples here..
 
My thoughts exactly.

I can't wait to see what these cards do when OC'd!
 