I was recently having a debate with someone about this. He's saying that driving a 42-inch HDTV at 1920x1080 takes "more power" for a graphics card than driving, say, a 23-inch monitor at 1920x1080. My view is that they're both the exact same resolution and therefore take the same amount of work for the graphics card to drive; the monitors themselves pull all the "power" they need from their respective power supplies. The number of pixels on both displays is the same, the smaller monitor simply has a denser panel, fitting the same pixels into a smaller space.
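As a quick sanity check (just a rough sketch in Python with the numbers from my example, not anything measured), the pixel count the card has to render works out identical for both displays:

```python
# Pixel count depends only on resolution, not on the physical size of the panel.
def pixel_count(width, height):
    return width * height

# Hypothetical comparison: 42" HDTV vs. 23" monitor, both at 1920x1080.
tv_pixels = pixel_count(1920, 1080)       # 2,073,600 pixels
monitor_pixels = pixel_count(1920, 1080)  # 2,073,600 pixels

print(tv_pixels == monitor_pixels)  # True -- the card renders the same number of pixels either way
```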
My argument is that the graphics card is simply sending a signal, not power, and the card itself isn't stressed any harder just because a display is physically bigger. What dictates how much stress the card is under is the resolution and a host of rendering settings like AA, AF, SSAO, etc., not the size of the display.
I have no idea what he's getting at, but apparently he thinks that the larger the display, the more the card has to do? That makes absolutely no logical sense to me. If I'm wrong, someone please explain.