
NVIDIA G80 & ATi R600 Info, can you say dual GPU


RedDragonXXX

Senior RAM Pornographer
Joined
Mar 3, 2005
Location
Jacksonville, FL
VR-Zone has some tasty gossip for us today, and I do appreciate a good rumor. They heard that G80 will be ready for launch in June during Computex, and the process technology is likely to be 80nm at TSMC. In a recent statement, NVIDIA said it will be backing TSMC's 80nm "half-node" process, which allows a die size reduction of 19%. We have previously mentioned that G80 is likely to take the Unified Shader approach and support Shader Model 4.0. G80 is likely to be paired with Samsung GDDR4 memory reaching a speed of 2.5Gbps. As for ATi, the next-generation R600 is slated for launch at the end of this year according to the roadmap we have seen, and the process technology is 65nm. It seems the leaked R600 specs that surfaced in June last year are pretty likely accurate.

According to Xpentor, NVIDIA's G80 will make ATI stumble in April. Quad SLI could be implemented on a single card with a two-chip solution, because G80 will reportedly be the first dual-core GPU ever, with support for DirectX 10 and Shader Model 4.0. Development of G80 is also said to have been running at a very intensive pace since NVIDIA's acquisition of ULi. As for the upcoming G71, there will be 32 pipes, an increase in ROPs, and a small bump in core clock.

# 65nm
# 64 Shader pipelines (Vec4+Scalar)
# 32 TMU's
# 32 ROPs
# 128 Shader Operations per Cycle
# 800MHz Core
# 102.4 billion shader ops/sec
# 512GFLOPs for the shaders
# 2 Billion triangles/sec
# 25.6 Gpixels/Gtexels/sec
# 256-bit 512MB 1.8GHz GDDR4 Memory
# 57.6 GB/sec Bandwidth (at 1.8GHz)
# WGF2.0 Unified Shader
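For what it's worth, those headline numbers hang together arithmetically. Here's a quick sanity-check sketch; the figures come straight from the list above, but the FLOP accounting (Vec4+Scalar = 5 components, with a MAD counted as 2 FLOPs) is my guess at how the 512 GFLOPs figure was derived:

```python
# Rumored R600 figures, taken from the spec list above.
core_mhz = 800
shader_pipes = 64
ops_per_cycle = 128          # shader operations per cycle
rops = 32
bus_bits = 256
mem_mhz_effective = 1800     # GDDR4 effective data rate

# 128 ops/cycle * 800 MHz
shader_gops = ops_per_cycle * core_mhz * 1e6 / 1e9
print(shader_gops)           # 102.4 billion shader ops/sec

# Assumption: 5 components (Vec4 + Scalar) * 2 FLOPs per MAD, per pipe
gflops = shader_pipes * 5 * 2 * core_mhz * 1e6 / 1e9
print(gflops)                # 512.0 GFLOPs

# One pixel per ROP per clock
gpixels = rops * core_mhz * 1e6 / 1e9
print(gpixels)               # 25.6 Gpixels/sec

# 256-bit bus = 32 bytes per transfer, at the 1.8 GHz effective rate
bandwidth_gb = bus_bits / 8 * mem_mhz_effective * 1e6 / 1e9
print(bandwidth_gb)          # 57.6 GB/sec
```

So the ops/sec, GFLOPs, fill rate, and bandwidth numbers are all internally consistent with the clocks and widths, which at least suggests whoever wrote the rumor did the math.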

Holy God and everything else that is holy, that's all I can say right now because I'm having trouble finding my jaw :drool: :drool: :drool:

Source: vr-zone/guru3d
 
Dang, looks like the G80 will crush the G71 before it even gets out of the starting gate!

Don't you just LOVE competition!
 
I think nvidia will just release G71 first and wait for ATI's response. If ATI has a great product out, they'll quickly release G80 to knock ATI over; if ATI doesn't have a great product out, they'll probably hold on to G80 until 65nm.
 
I am actually getting sick of the little wars they are having here, they are releasing videocards too fast, leaving the buyer of a new powerful videocard with buyer's remorse, because two months after they buy it, it is outdated. I miss the days when a refresh would happen every six months, not two.
 
Those were the days, when high end was actually high end for longer than 60 days.

http://www.megagames.com/news/html/hardware/atir600specsrevealed.shtml
http://www.tcmagazine.info/comments.php?id=10150


http://www.dvhardware.net/article2593.html
http://www.theinquirer.net/?article=15512

Rumored R600 specs below:

"# 65nm
# 64 Shader pipelines (Vec4+Scalar)
# 32 TMU's
# 32 ROPs
# 128 Shader Operations per Cycle
# 800MHz Core
# 102.4 billion shader ops/sec
# 512GFLOPs for the shaders
# 2 Billion triangles/sec
# 25.6 Gpixels/Gtexels/sec
# 256-bit 512MB 1.8GHz GDDR4 Memory
# 57.6 GB/sec Bandwidth (at 1.8GHz)
# WGF2.0 Unified Shader"
 
The one thing that worries me is where exactly the ceiling is, and when they're going to hit their head on it. It seems the only thing holding them back is the size of Joe Sixpack's power supply. Eventually, we're going to be buying CPUs and motherboards to plug into our video cards, not the other way around.
 
I think it's quite possible that they could do this. I love a good rumor as much as most people, and considering the jump ATI just did, why can't nV?
 
lowfat said:
I am actually getting sick of the little wars they are having here, they are releasing videocards too fast, leaving the buyer of a new powerful videocard with buyer's remorse, because two months after they buy it, it is outdated. I miss the days when a refresh would happen every six months, not two.

Hey, I'm all for advances in technology. Call me a graphics *****, but if we want games to look like real life then this is what it's gonna take. The faster the better, IMO.
 
Let 'em update every two months. That just means I won't have to wait as long for a price drop on that two month old card :D
 
Audioaficionado said:
Let 'em update every two months. That just means I won't have to wait as long for a price drop on that two month old card :D

Quoted for truth.

I hope, truly hope, that more GPUs on the market will lead to cheaper costs. Upwards of $600 for a card is ridiculous. But hey, they sell...
 
lowfat said:
I am actually getting sick of the little wars they are having here, they are releasing videocards too fast, leaving the buyer of a new powerful videocard with buyer's remorse, because two months after they buy it, it is outdated. I miss the days when a refresh would happen every six months, not two.

But then the "older" technology just gets cheaper. You dont always need the best of the best, sure its nice but not needed. I love seeing technology progress and I hope it keeps going at this rate. Its also motivating AMD and Intel to put out chips that can keep up with these cards.
 
I think a lot of people who buy cards want theirs to be the best for a long period, just so they can say "mine is the best out."

It's not like the card gets outdated; it just doesn't perform as fast as the new one.

So you've got an '04 Lexus compared to an '05. Big deal, that '04 will still keep up.
 
So you've got an '04 Lexus compared to an '05. Big deal, that '04 will still keep up.
What people are saying is that it's not between an '04 and an '05 Lexus anymore. It's a Q1 2004, Q2 2004, Q3 2004, Q4 2004, Q1 2005. So now your '04 is four models behind instead of one.

It also encourages more sloppy coding from the software side. With that much power at the top end, they can do whatever the heck they want and it'll run. No more optimizations - you just have to buy bigger.
 
So it's 4 models behind. Big deal, it still keeps up, period.

And sloppy code encouragement? I highly doubt that. If they are sloppy coders, that's one thing, but encouraged by the cards? No.
 
johan851 said:
It also encourages more sloppy coding from the software side. With that much power at the top end, they can do whatever the heck they want and it'll run. No more optimizations - you just have to buy bigger.

I don't think they'll be THAT sloppy about it, but yeah, you have a point.
 