
Driving a small monitor vs an HDTV


xilix

Member
Joined
Jun 28, 2006
Location
MA
I was just recently having a debate with someone about this. He's saying that driving a 42-inch HDTV at 1920x1080 takes "more power" for a graphics card than, say, a 23-inch monitor at 1920x1080. My view is that they're both the exact same resolution and therefore take the card the same amount of work to drive; the monitors themselves pull all the "power" they need from their respective power supplies. The pixel count of both displays is the same, the smaller monitor simply has a denser panel, packing the same number of pixels into a smaller space.

My argument is that the graphics card is simply sending a signal, not power, and the card itself is not being stressed any harder simply because a display is physically bigger. What dictates how much stress the card goes through is the resolution and a myriad of features like AA, AF, SSAO etc etc.. not the size of the display.

I've no idea what he's trying to say, but apparently he thinks that the larger the display, the more the card has to do? That makes absolutely no logical sense to me. If I'm wrong, someone please explain.
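
To put some numbers on it, here's a quick Python sketch of the arithmetic (purely illustrative, nothing card- or driver-specific):

# Sanity check: pixel count depends only on resolution, not panel size.
def pixel_count(width, height):
    return width * height

hdtv_42in = pixel_count(1920, 1080)   # 42" HDTV at 1080p
lcd_23in  = pixel_count(1920, 1080)   # 23" monitor at 1080p

print(hdtv_42in, lcd_23in)            # 2073600 2073600 -- identical workloads
print(hdtv_42in == lcd_23in)          # True

# Settings like 4x MSAA multiply the samples the card has to shade/resolve,
# but that multiplier is a render setting, not a function of screen size.
print(hdtv_42in * 4)                  # 8294400 samples on either display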
 
Your friend is an idiot :).


xilix said:
My argument is that the graphics card is simply sending a signal, not power, and the card itself is not being stressed any harder simply because a display is physically bigger. What dictates how much stress the card goes through is the resolution and a myriad of features like AA, AF, SSAO etc etc.. not the size of the display.


Correct. I wouldn't trust this person for advice on minute rice. He'd probably tell you it takes an hour.
 
You are correct, sir. Gaming on my 42-inch LG 1080p, I get the same FPS in all games as on my 23-inch 1080p Dell...

They need to make 1600p TVs damnit!!!!
 
He's an intelligent dude and a good guy; I just think he has somehow gone astray on this particular subject. What's weird is that he agreed with me that the performance (like a benchmark) doesn't change, but at the same time he's saying it stresses the card more, because in his words "that's just the resolution so of course it's the same, but there's still more pixels to deal with". I just don't really understand what he's trying to say at all. It's dealing with 1920x1080 pixels. That IS the number of pixels - physical pixels - on the entire panel. It doesn't matter what size the display is, it still has 2,073,600 pixels. Period.

This all stemmed from me getting Medal of Honor and it blue-screening every 40 minutes or so of gameplay. No other game gives me an issue, including the most taxing ones like Metro 2033, Crysis, etc., all of which I run with everything 100% maxed. MOH is a joke; my card rips through the game like nothing, even with 32xAA on.

One thing I DID notice, though (and this is something not even the most insane games do), is that it will send all four cores to 90+% at times!! Is this just horrible coding or what? The game can't possibly be that demanding.
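
If you want to see exactly how hard it's hitting each core, something like this will log per-core load once a second while the game runs (assumes the third-party psutil package is installed; pip install psutil):

# Print per-core CPU usage once a second; stop with Ctrl+C.
import psutil

try:
    while True:
        # interval=1 blocks for one second and returns % load per logical core
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(" ".join(f"{core:5.1f}%" for core in per_core))
except KeyboardInterrupt:
    pass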

He says the reason for that is that I'm on an Nvidia card and MOH was developed on ATI hardware, so if I had an ATI card (never going to happen) it wouldn't peg my CPU as much. Again, I doubt this, but I guess there is some logic to it, albeit not much. I don't see a game sending more work to my CPU rather than my GPU depending on which GPU I have. That makes no sense, and the only time I see anything like that is with ATI and PhysX, but that's a unique situation with a unique niche feature.

The older I get, the more I feel I'm slipping in my knowledge of things, which is why I'm doubting myself here. I don't want to look like an idiot.
 
Here's some friendly advice: stop taking his word on this too seriously. He obviously doesn't know what he's talking about. ^_^
 