
[Table] - Monitor Resolution & Screen Size impacts on perceived Pixel Density


GearingMass

Member
Joined
Jun 24, 2014
Location
TX/CO

--[Disclaimer: I use the phrase "image quality" in this thread as a placeholder to describe having a better viewing experience due to a higher pixel density, all else equal]--

I was looking into the idea of jumping from my current 24" 1080p monitor to 1440p, and I was curious how the increase in pixels would impact my perceived image quality compared to how much harder my GPU would have to work.

I created this chart to help me quantify the data and clear up some misconceptions I initially had.


[Attached image: Pixel Density Worksheet.png]


Initially I was unknowingly calculating pixels per inch [PPI], as that's the term I think we're all familiar with, but I was picturing pixels per square inch [PPI^2] in my mind. That's why I was disappointed when I saw that jumping from a 24" 1080p monitor to a 27" 1440p one was only going to give me an 18% increase in pixel density while costing me 78% more workload on the GPU. This just didn't seem right to me; people rave about jumping from 1080p to 1440p, and while I knew that increasing screen area drops pixel density faster and faster as the screen gets bigger (given a constant resolution), I didn't think it would be that bad.

I realized that the online calculators I was using were calculating linear pixel density, not 2D pixel density. So I found one that did both, double-checked its math by hand, and made the table above.

As far as I can figure, what we'll perceive as a higher image quality, etc., will be based on Pixels per square inch, not pixels per linear inch.
Argument: Hold screen size and aspect ratio constant (say 24" @ 16:9) ---> screen area is held constant. 4K has 4 times the pixels of 1080p in this scenario, spread across the exact same amount of screen area, which naturally means 4 times the pixel density with the 4K resolution. If you're using PPI, however, as you'll see in the table above, it looks like you're only doubling your pixel density and thus image quality, when mathematically you're quadrupling it. That's where PPI^2 comes into play -- it's an accurate measure of the pixel density you'll actually perceive.
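
If it helps to see the arithmetic, here's a quick Python sketch of both measures (just a sketch, and it assumes square pixels, so pixels per square inch works out to PPI squared):

from math import hypot

def ppi(w_px, h_px, diag_in):
    # Linear pixel density: pixels along the diagonal per inch of diagonal
    return hypot(w_px, h_px) / diag_in

def pixels_per_sq_in(w_px, h_px, diag_in):
    # 2D pixel density; equals PPI^2 when the pixels are square
    return ppi(w_px, h_px, diag_in) ** 2

# Hold screen size constant at 24": 1080p vs 4K
p1080 = (1920, 1080, 24)
p2160 = (3840, 2160, 24)

print(ppi(*p2160) / ppi(*p1080))                            # ~2.0 -> PPI only doubles
print(pixels_per_sq_in(*p2160) / pixels_per_sq_in(*p1080))  # ~4.0 -> PPI^2 quadruples
print((3840 * 2160) / (1920 * 1080))                        # 4.0  -> same 4x as the raw pixel count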


Going back to the jump I was talking about: using PPI^2, I'd actually see a 40% increase in perceived pixel density with the 78% increase in GPU workload ---> much better :D
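
Same arithmetic for the 24" 1080p to 27" 1440p jump, again just a sketch with square pixels assumed:

from math import hypot

old_ppi = hypot(1920, 1080) / 24   # ~91.8 PPI  (24" 1080p)
new_ppi = hypot(2560, 1440) / 27   # ~108.8 PPI (27" 1440p)

print(f"PPI:        +{new_ppi / old_ppi - 1:.1%}")              # ~ +18.5%
print(f"PPI^2:      +{(new_ppi / old_ppi) ** 2 - 1:.1%}")       # ~ +40.5% perceived density
print(f"GPU pixels: +{(2560 * 1440) / (1920 * 1080) - 1:.1%}")  # ~ +77.8% more pixels to render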
 
It might be good to also think about other variables like panel type (TN vs IPS) and refresh rate (60Hz, 120Hz, 144Hz). These can have a great impact on your perceived image quality without making your GPU work harder.
 

Oh you're absolutely correct about that. Those factors, along with build quality, out-of-box/calibrated color quality, response time, etc., will certainly all have an effect. That's why I felt it necessary to add the disclaimer at the top: holding all those things equal so that the effect of the change in pixel density alone can be looked at somewhat quantitatively.
Once you start adding those factors back in as you compare different monitors across different companies, price ranges, etc., you have to take the above with a(n even larger) grain of salt and understand that it becomes simply a qualitative piece of the puzzle.
 
I like how you put this into numbers, but the most important thing is how each person's eyesight affects the image. A lot of people can't differentiate the smaller pixels of a 27" 1440p screen from the ones on a 27" 1080p screen. I consider myself fortunate and cursed (I mean literally, my parents cuss at me for my good eyesight) with great eyesight, so for me the move to 4K is a godsend; it is so much easier on the eyes, other than the scaling (I use a 1080p as a second monitor).

I look forward to the day GPUs can push 4K monitors as easily as they push 1440p, but I know a lot of people who will never see the difference. It is interesting how you applied this thought; thanks for the information. 4K might be the new 1440p in a couple of years, but it's still taxing GPUs disproportionately more than the clarity gained from the extra pixel density. I hope I worded that right.

My ideal setup would be three 28" 4K monitors for everyday use, then dropping the resolution to 1440p for gaming; I don't mind one bit being limited to 60 viewable fps. I'd rather have a steady 60 fps across triple 1440p than a choppy 30-60 across triple 4K, but that will all change as monitors and GPUs get better.
 
Something I've discovered since I made this thread is that gaming fps performance does not seem to scale linearly with the total number of pixels.

What I mean by that is this: if fps scaled linearly with the pixels on screen, then a 1440p resolution should only produce 56% of the fps of a 1080p resolution (78% more pixels, so 1/1.78 = 0.56). However, from the benchmarks I've quickly looked at, it appears that the average modern title at 1440p gets about 65-70% of the fps of the same title at 1080p.

What's even more interesting is that the more demanding the title (think Crysis, Witcher 3, Mordor, et al.), the better it does on this ratio. Those titles seem to see 75-80% of the fps at 1440p that they do at 1080p.
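
For what it's worth, here's that observation laid out as a quick sketch; the "observed" ratios are just the midpoints of the rough benchmark ranges I mentioned, not measured data:

pixels_1080 = 1920 * 1080
pixels_1440 = 2560 * 1440

# What perfectly linear scaling with pixel count would predict at 1440p
linear_prediction = pixels_1080 / pixels_1440   # ~0.56
print(f"Linear scaling predicts ~{linear_prediction:.0%} of 1080p fps at 1440p")

# Rough observations from above (midpoints of the 65-70% and 75-80% ranges)
observed = {"average modern title": 0.675, "very demanding title": 0.775}
for title, ratio in observed.items():
    print(f"{title}: ~{ratio:.0%} of 1080p fps, i.e. {ratio / linear_prediction:.2f}x better than linear")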

If anybody has an explanation or thoughts on this, I'd love to hear it.


Note: I will try and update this post with my calculations/sources later when time permits.
 