
Future of Graphics Cards....


pockytofu

Registered
Joined
Jan 30, 2003
Location
MN
Heyo,
I was reading this goofy article over at The Inquirer, http://www.theinquirer.net/ (GeForce 6 rumor mill), and it got me thinking.
Seeing as graphics cards keep getting more and more powerful, requiring more power, and becoming mini blast furnaces,
I wonder whether we'll actually have to separate the graphics card from the rest of the PC. I mean, CPUs aren't running any cooler either.
Maybe we'll need two power supplies, or one that's 500+ watts.

OK, other than that: do you guys think I should get the 9700 AIW Pro or wait for the 9800 AIW Pro?

Thanks!
 
Wait for the 9800 Pro AIW to come out... then get a 9700 Pro AIW at a cheaper price.

The 9700 Pro AIW is awesome... I'm sure the 9800 AIW Pro will be great as well, but expensive!
 
Hey pocky... let me try to explain it to you.

Starting with CPUs

AMD released (years ago) their Athlon with the Thunderbird core, on a 180-nanomicron manufacturing process. It needed 1.8V. And recently (not really) AMD released their Athlon with the T-Bred core, on a 130-nanomicron manufacturing process. It needs 1.5~1.65V.

The same happens with video cards. ATI's Radeon 9500/9700/9800 series need extra power because of their 150-nanomicron manufacturing process. The 9600 series doesn't need it, because of its 130-nanomicron manufacturing process. The 9600 isn't as powerful as the 9500, but its default core speed is 400MHz. And the same goes for nVIDIA cards.

I think if ATI made the 9800 on a 130-nanomicron process, it could have the same or higher clock speeds than the 9600 and no extra power would be needed. It would need even less power if the 9800 had DDR-II memory, but DDR-II is very expensive right now. Time will tell...
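
To put rough numbers on that, here's a quick Python sketch using the usual dynamic-power approximation P = C * V^2 * f. The capacitance and clock figures are made up purely for illustration (they're not specs of any real chip), but they show how much the core voltage alone matters:

# Back-of-the-envelope look at why a die shrink plus a lower core voltage
# cuts power draw, using the dynamic-power approximation P = C * V^2 * f.
# All numbers below are illustrative, not specs of any real CPU or GPU.

def switching_power(capacitance_f, voltage_v, frequency_hz):
    """Approximate dynamic (switching) power in watts."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

old_process = switching_power(15e-9, 1.8, 1.4e9)   # hypothetical 180nm-class part at 1.8V
new_process = switching_power(15e-9, 1.65, 1.4e9)  # same clock, lower voltage after the shrink

print(f"old: {old_process:.0f} W  new: {new_process:.0f} W  ratio: {new_process / old_process:.2f}")
# Dropping from 1.8V to 1.65V alone cuts switching power to about 84% of the
# original, before counting any capacitance savings from smaller transistors.

That's basically the point about the 9600 above: shrink the process, drop the voltage, and you can keep the clock up without needing the extra power connector.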
 
If anything, power consumption will become lower as the technology improves, as is almost always the case with technology: it gets smaller and more efficient.
 
DeathAngelLST said:
a 180-nanomicron manufacturing process. It needed 1.8V. And recently (not really) AMD released their Athlon with the T-Bred core, on a 130-nanomicron manufacturing process. It needs 1.5~1.65V.


*ahem* nanometers :\ nanomicrons = ???
 
Mr.Guvernment said:
If anything, power consumption will become lower as the technology improves, as is almost always the case with technology: it gets smaller and more efficient.

Are you sure of that? My Commodore 64 runs off a 20-watt power supply the size of a pack of cigarettes. You should be familiar with the size and rating of modern power supplies.
 
Hmm... future of graphics cards... well, I say the GeForce FX 10000 will come out, and each owner will have one of them in orbit around the Earth (for cooling of course; they could call them the ultimate in quiet computing), with an orbit such that they are always hidden from the sun by the Earth (so it's always cold), and the signal from the card will be sent to your computer via an orbital satellite network using lasers.

Oh yeah, and the lasers would also work as a kind of "Star Wars" defence against other objects in orbit, like the Radeon 20800 Pro.

Wow, I thought wayyy too much for that post...
 
Well... we're already starting to see water cooling come stock on some NV30s, so that might be one of the solutions.
 
Carnil said:


Are you sure of that? My Commodore 64 runs off a 20-watt power supply the size of a pack of cigarettes. You should be familiar with the size and rating of modern power supplies.

Your Commodore's power supply likely only had to power a sub-5MHz processor, a few kilobytes of RAM, a keyboard, a floppy drive, and a tiny display - not multiple hard drives, multiple CD drives, a floppy drive, a number of fans, a network card, a mouse and keyboard, etc.

It's true that power requirements are getting lower for a given speed; however, speeds are increasing, so total power draw is going up.

What did you just say, Crazy InThrees? Power requirements for a given SPEED are lowering. If we were still dealing with 1GHz chips, the power requirements would be way less than the original Athlon's (the first 1GHz chip, right?). Since speeds are increasing, it's sort of a push-pull with power.
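
To put a toy number on that push-pull (same P = C * V^2 * f approximation as above; the voltage and clock figures are invented for illustration, not real chip specs):

# Toy illustration of the "push-pull": per-clock power drops with each new
# process, but clock speeds rise faster, so total draw still climbs.
# Figures are invented for illustration, not real chip specs.

def switching_power(capacitance_f, voltage_v, frequency_hz):
    return capacitance_f * voltage_v ** 2 * frequency_hz

old_chip = switching_power(15e-9, 1.8, 1.0e9)    # hypothetical older part: 1.0GHz at 1.8V
new_chip = switching_power(15e-9, 1.65, 2.0e9)   # hypothetical newer part: 2.0GHz at 1.65V

print(f"per-GHz: {old_chip / 1.0:.0f} W vs {new_chip / 2.0:.0f} W")
print(f"total:   {old_chip:.0f} W vs {new_chip:.0f} W")
# Per gigahertz the newer chip is about 16% more efficient, but because the
# clock doubled, its total switching power is still roughly 68% higher.

So each chip gets more efficient per clock, but the total draw keeps creeping up because the clocks climb faster than the voltage falls.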

Anyway, I was thinking about 'the future of computers' today, and I can only see everything getting much smaller. Eventually, what we consider small form factor will be normal - miniaturization is necessary to achieve the sort of bus speeds we're all hoping for in the future. When you're dealing with transistors measured in nanometers, distances of inches between components will be unacceptable. I see tiny motherboards with tiny expansion slots and sockets for different peripherals, with one unified connector that connects to an all-in-one monitor/keyboard/mouse/network/printer header. Kind of like an SPDIF header or something, if you're familiar with those.
 