
Explain CPU and GPU usage at different resolutions


Aldakoopa

KING OF PROCRASTINATION Member
Joined
Jan 11, 2012
Location
North Carolina
I really need someone to explain this to me; I've been reading that lower resolutions put more load on the CPU, and at higher resolutions that load is shifted to the GPU.

Now, either I'm misunderstanding, reading this wrong, finding bad information, or this actually happens and it makes no sense to me.

If you are playing a game at a lower res, why would that strain your CPU more, especially if you have a more-than-capable GPU? It would seem to me that higher resolution = more things for the computer to generate, and therefore more strain on the entire system, and lower resolutions would be just the opposite.
 
Your general gut feeling is correct. The difference is not which one is loaded more, but which one is the bottleneck. At low resolutions the graphics card has an easy time rendering frames, so your CPU speed limits the maximum frames per second. At high resolutions, the GPU becomes the bottleneck because there's a lot more to render.
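The bottleneck idea can be sketched with a toy frame-time model (all numbers here are made up for illustration, not measurements): the CPU spends a roughly fixed time per frame on game logic and draw-call setup, while GPU render time grows with the number of pixels, and the frame rate is limited by whichever stage takes longer.

```python
# Toy model: hypothetical per-frame costs, not real benchmarks.
CPU_MS_PER_FRAME = 8.0       # assumed constant regardless of resolution
GPU_MS_PER_MEGAPIXEL = 6.0   # assumed GPU cost per million pixels rendered

def fps(width, height):
    """FPS is limited by whichever stage takes longer per frame."""
    gpu_ms = GPU_MS_PER_MEGAPIXEL * (width * height) / 1e6
    return 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)

# At 1280x720 the GPU only needs ~5.5 ms per frame, so the 8 ms of CPU
# work is the bottleneck; at 2560x1440 the GPU needs ~22 ms and the
# bottleneck shifts to the GPU.
print(round(fps(1280, 720)))   # CPU-bound
print(round(fps(2560, 1440)))  # GPU-bound
```

With these made-up numbers, a faster CPU would raise the 720p result but barely move the 1440p one, which is exactly the "load shifts to the GPU" effect being described.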
 
So it's not actually that the load is being transferred to the GPU at higher resolutions; it's more that the CPU is doing about the same amount of work no matter what the resolution, and the GPU has to put in the extra effort?
 
Kinda. What you have to understand is that the CPU sets the frame up, handles all the AI and resource allocation, and then passes the parameters to the GPU, which draws the frame. So the CPU does its thing and sends the frame along to the GPU. The larger the frame and the more processing required, the longer it takes for the GPU to finish. At low resolutions the frames are drawn much, much faster, so the CPU has to do a lot more work setting up more frames. Of course, using vSync limits you to 60 FPS, so the CPU strain wouldn't change regardless of resolution, provided that your GPU can handle 60 FPS at the larger resolution.
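The vsync point can be added to a toy frame-time model (all constants are made-up illustrative numbers, and this ignores that real vsync quantizes to whole refresh intervals): capping output at the 60 Hz refresh rate means the CPU never has to prepare more than 60 frames per second, no matter how fast the GPU finishes.

```python
# Toy model with a vsync cap: hypothetical costs, not real benchmarks.
CPU_MS_PER_FRAME = 8.0       # assumed constant regardless of resolution
GPU_MS_PER_MEGAPIXEL = 6.0   # assumed GPU cost per million pixels rendered
REFRESH_HZ = 60              # vsync caps delivered frames at the refresh rate

def fps_with_vsync(width, height):
    """Frame rate limited by the slower stage, then capped by vsync."""
    gpu_ms = GPU_MS_PER_MEGAPIXEL * (width * height) / 1e6
    uncapped = 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)
    return min(uncapped, REFRESH_HZ)

# At 720p the uncapped rate would be ~125 fps, but vsync holds it at 60,
# so the CPU only sets up 60 frames per second.
print(fps_with_vsync(1280, 720))
# At 1440p the GPU can't reach 60 fps in this model, so the cap is moot.
print(round(fps_with_vsync(2560, 1440)))
```

This is why, with vsync on and a GPU fast enough for 60 FPS at every resolution you test, the CPU load stays the same across resolutions.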
 

Ok, this makes a lot of sense, I think I understand now. It's sort of like the GPU is being too fast at lower resolutions for the CPU to keep up with, correct?
 
Right. Or the other way around, the CPU can't feed the GPU fast enough. That's why it's a "bottleneck" - everything's fast enough except the CPU.
 