
Rant about high end gaming today.


obababoy
Registered
Joined: Jul 25, 2013
Location: Maryland
Maybe I have just had too much coffee this morning, but do any of you feel like video cards are getting too powerful (hot/noisy) for their own good? CPUs have progressed quite well: they are getting more powerful and efficient, and staying cool, while not growing astronomical in size (save for some of the coolers).

I feel like the physical video card design is starting to show its age. Single-slot cards are done, dual-slot is reaching its threshold, and triple-slot covers one or two of your motherboard's PCIe slots. They can't go taller because every case has the same size expansion slot, and they can't go much longer for similar reasons.

To me the future is looking more and more like the R9 295x2 in terms of cooling, where an AIO comes standard with the video card. There isn't much of a way around it unless they seriously reduce power consumption, which translates almost directly into heat output.

On top of that, it seems like developers are getting sloppy with game engine efficiency. I still go back to Crysis: it came out in 2007, and although it took a lot to run, it looked amazing even on lower settings with my Radeon 3870(s). Truthfully, I think multi-platform development has a bit to do with this, and yes, I do have an XB1.

IDK, I'm sure I'll get flamed for this post, but I just don't have any recent wow factors with PC gaming lately and absolutely NONE with the XB1, so I felt entitled to a rant. The last wow was probably playing the new Tomb Raider in 3D while only using a GTX 460 (LOVE THAT ENGINE). Am I alone on all of these things?
 
No, they're not too powerful at all.
It is easy to manage a good temperature and noise level with proper airFLOW through the case.

We're on the cusp of 4K resolution becoming common; we need all the power we can get.
 
We're on the cusp of 4K resolution becoming common; we need all the power we can get.

+1. I think this is one of the main reasons. The "standard" resolution for monitors has increased. These days 1080p monitors are common and cheap enough that having 2-3 isn't out of reach, 1440p/1600p monitors are increasing in popularity, and 4K monitors are the "it" thing to have. Compare this to maybe 10 years ago, when most of my friends at least were rocking 1024x768 or 1280x1024. Even lower-end cards have to be kind of beefy to stand a chance.
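To put rough numbers on that resolution jump, here's a quick sketch of the pixel counts involved (standard resolution math, nothing card-specific):

```python
# Pixel counts for the resolutions mentioned above. 4K pushes roughly
# 10x the pixels of the 1024x768 screens common a decade earlier.
resolutions = {
    "1024x768": (1024, 768),
    "1280x1024": (1280, 1024),
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K (3840x2160)": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# How much more work per frame 4K is versus the old standard:
print(f"4K vs 1024x768: {pixels['4K (3840x2160)'] / pixels['1024x768']:.1f}x")
```

The GPU has to shade every one of those pixels every frame, which is why "standard" resolution creeping up forces even mid-range cards to get beefier.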
 
but do any of you feel like video cards are getting too powerful (hot/noisy) for their own good?
Nope. TDP/power use of cards versus their performance is the same or lower. Think about it: the TDPs of the GTX 480, 580, and 680 are 250W, 243W, and 195W. Now the 780 is back up to 250W, as is the 780 Ti, but performance per watt has gone up considerably. Not to mention we are approaching 4K now, so we need all that power! Things are really getting more efficient.
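The perf-per-watt point is just performance divided by TDP. Here's a sketch using the TDPs quoted above; the relative-performance numbers are made-up placeholders to illustrate the calculation, not benchmark data:

```python
# Performance-per-watt sketch across GPU generations.
# TDPs are the figures quoted in the post; the relative performance
# values are hypothetical placeholders, NOT real benchmark results.
cards = {
    # name: (tdp_watts, relative_performance)
    "GTX 480": (250, 1.00),
    "GTX 580": (243, 1.15),
    "GTX 680": (195, 1.60),
    "GTX 780": (250, 2.10),
}

for name, (tdp, perf) in cards.items():
    print(f"{name}: {perf / tdp:.4f} perf/W")
```

Even when TDP climbs back to 250W, perf/W still rises as long as performance grows faster than power draw, which is the argument being made.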

I disagree with the AIO being the wave of the future. The 295x2 is two 290Xs on one card; it has to dissipate almost 500W of heat, and note that it's still a dual-slot solution.

Gaming thing... sure. I agree there.

I am not sure if you are alone... but I don't agree with what you said about the GPUs considering the facts I mentioned.
 
Maybe I have just had too much coffee this morning, but do any of you feel like video cards are getting too powerful (hot/noisy) for their own good? CPUs have progressed quite well: they are getting more powerful and efficient, and staying cool, while not growing astronomical in size (save for some of the coolers).

CPUs aren't actually getting smaller. Just because there is a die shrink doesn't mean the actual core and PCB size change much.

The war on power efficiency and going green is the number one thing, over performance. The CPU and video card manufacturers have a set of rules and guidelines they have to follow in order to stay below certain tolerances that could possibly be harmful to humans. You can read about that stuff from the FCC.

Otherwise they "could" potentially build hardware that gives off too much radiation perhaps harmful to humans.

So we as the public get stuck with little quads with HT, or 8 cores from AMD. I think it was back in '07 that Intel was showing off an 80-core CPU.

So I suppose for us consumers, we get what they offer and that's that. Otherwise we could cure our own cancer with the amount of radiation a quad-CPU board with 1000-core processors would give off. :rolleyes:
 
I don't think GPUs are getting too powerful; noisy fans are what bother me.
More power is just more better, but we pay for the package we install it in with noise, or we have to watercool them.
 
that could even be possibly harmful to humans. You can read about that stuff from FCC.

Otherwise they "could" potentially build hardware that gives off too much radiation perhaps harmful to humans.

Otherwise we could cure our own cancer with the amount of radiation a quad cpu board with 1000 core processors
I really have no clue what this is all about... LOL!

Don't all electronic devices have to go through this testing anyway? Isn't it contained inside the IHS and the case in the first place? I don't know much about it, but that doesn't sound remotely right... otherwise, I would be glowing when I left work every night (I work in a data center with hundreds of physical servers, most with dual/quad CPUs with 4-12 cores each...).
 
I don't think GPUs are getting too powerful; noisy fans are what bother me.
More power is just more better, but we pay for the package we install it in with noise, or we have to watercool them.

Thanks, that is one of the fundamental things I forgot to mention. All this power and heat is cured... with annoying fans, forcing us to either watercool, buy a full tower for a mini-ITX board (exaggerating), or use headphones.
 
Thanks, that is one of the fundamental things I forgot to mention. All this power and heat is cured... with annoying fans, forcing us to either watercool, buy a full tower for a mini-ITX board (exaggerating), or use headphones.

Here's a simple fix: don't put a 250W GPU in an ITX case that smashes the fans against the side panel :)

The GTX 770 in my Hadron Air barely hits 70°C while gaming.
 
Nope. TDP/power use of cards versus their performance is the same or lower. Think about it: the TDPs of the GTX 480, 580, and 680 are 250W, 243W, and 195W. Now the 780 is back up to 250W, as is the 780 Ti, but performance per watt has gone up considerably. Not to mention we are approaching 4K now, so we need all that power! Things are really getting more efficient.

I disagree with the AIO being the wave of the future. The 295x2 is two 290Xs on one card; it has to dissipate almost 500W of heat, and note that it's still a dual-slot solution.

Gaming thing... sure. I agree there.

I am not sure if you are alone... but I don't agree with what you said about the GPUs considering the facts I mentioned.

Good points on TDP and resolution increases. But to game at 4K we need (subjectively) a 295x2 with its AIO cooler to stay cool and not get obnoxiously loud.

Here's a simple fix: don't put a 250W GPU in an ITX case that smashes the fans against the side panel :)

The GTX 770 in my Hadron Air barely hits 70°C while gaming.

I knew someone would mention my itx build hahaha. Damn you! That is my own problem and not part of why I posted this.
 
I knew someone would mention my itx build hahaha. Damn you! That is my own problem and not part of why I posted this.

Seems pretty directly related to me :shrug:

I've got a 780 on a custom loop, and a 770 in an ITX build. Neither have any issues with temperatures.

TDP hasn't increased for top-tier cards (read: any single GPU card) in a LONG time, they're all 250W or less for past generations AFAIK.

Getting airFLOW to the card keeps it cool, just like it did in the past 5+ generations.
 
Good points on TDP and resolution increases. But to game at 4K we need (subjectively) a 295x2 with its AIO cooler to stay cool and not get obnoxiously loud.



I knew someone would mention my itx build hahaha. Damn you! That is my own problem and not part of why I posted this.
Or, you know, two single-GPU cards like a 290X or 780 Ti...

I can't imagine your ITX build, where you are struggling with GPU temperatures, had anything to do with this... /sarcasm :D :p
 
No, they're not too powerful at all.
It is easy to manage a good temperature and noise level with proper airFLOW through the case.

We're on the cusp of 4K resolution becoming common; we need all the power we can get.

+1

More power = Moar Kh for mining :attn: (usually)

Oh, that and 4K resolutions and ridiculously high refresh rates as mentioned before here :)
 
I have no problems with the size or power usage

If there is a Rant to be Runt it is the pricing ........... WTF

That is where I would start; competing globally is getting cost-prohibitive, and it keeps a lot of great overclockers from competing
 
Or, you know, two single-GPU cards like a 290X or 780 Ti...
I don't buy it. With more terribly inefficient games like Watch Dogs on the rise, we will be running out of the 3GB of memory on the 780 Ti at 4K res. Like most things these days, maybe I'm wrong...

I can't imagine your ITX build, where you are struggling with GPU temperatures, had anything to do with this... /sarcasm :D :p
Whatever!:bang head hahaha
 
...I feel like the physical video card design is starting to show its age....To me the future is looking more and more like the R9 295x2 in terms of cooling and such where an AIO comes standard with a video card.

I believe AMD is "attempting" to change all this with their FM platform. Are APUs the wave of the future? Will all high-end graphics cards have AIO or hybrid water cooling? It's, of course, impossible to say right now, but I believe until they can begin producing MOAR POWA with less heat, that's the way it's going to have to go. A future alternative would be to abandon the PCIe slot for graphics and come up with a new form factor that's not restricted by size. I dunno, maybe an external GPU/MO-RA thing.
 
I don't buy it. With more terribly inefficient games like Watch Dogs on the rise, we will be running out of the 3GB of memory on the 780 Ti at 4K res. Like most things these days, maybe I'm wrong...


Whatever!:bang head hahaha

You realize that a 6GB dual GPU is an effective 3GB card, correct?
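The reason a dual-GPU card's memory is "halved": in CrossFire/SLI alternate-frame rendering, each GPU keeps its own full copy of the textures and buffers, so the usable pool is the per-GPU amount, not the sum on the box. A trivial sketch of that arithmetic:

```python
# In CrossFire/SLI alternate-frame rendering, each GPU mirrors the same
# data, so usable VRAM is the per-GPU amount, not the marketed total.
def effective_vram(total_vram_gb, num_gpus):
    """Usable VRAM when every GPU holds a full copy of the frame data."""
    return total_vram_gb / num_gpus

# The "6GB dual GPU" from the post above: 2 GPUs x 3GB each.
print(effective_vram(6, 2))  # → 3.0
```

So at 4K, a 6GB dual-GPU card hits the same per-frame memory ceiling as a single 3GB card.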
 