I'd argue about the need for AA at 4k... I'd say it's more a function of pixel density. As a rough guide, if you can still resolve individual pixels by eye, you still need AA. Similar arguments come up on photography forums, where some claim higher-resolution sensors negate the need for an AA filter, yet anyone with even a basic grasp of sampling theory knows resolution alone can't settle that. So without making any allowance for AA, the VR numbers above don't even come close to 4k60: 233MP/s vs 498MP/s respectively. Not even half. At 1.2x SS (that's 1.2x per axis, so 1.44x the pixels) I make that only going up to 336MP/s. 1440p60 comes in at 221MP/s.
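For anyone who wants to check my arithmetic, here's a quick sketch. I'm assuming the VR figure comes from a 2160x1200 panel at 90Hz (Rift CV1 / Vive class), since that reproduces the 233MP/s number:

```python
# Quick sanity check of the pixel-rate arithmetic.
# Assumption: "VR" means a 2160x1200 panel at 90 Hz
# (Rift CV1 / Vive class), which matches the 233 MP/s figure.

def mp_per_s(width, height, hz, ss_axis=1.0):
    """Megapixels per second, with optional per-axis supersampling."""
    return width * ss_axis * height * ss_axis * hz / 1e6

print(f"VR native:   {mp_per_s(2160, 1200, 90):.0f} MP/s")       # ~233
print(f"VR @ 1.2 SS: {mp_per_s(2160, 1200, 90, 1.2):.0f} MP/s")  # ~336
print(f"4k60:        {mp_per_s(3840, 2160, 60):.0f} MP/s")       # ~498
print(f"1440p60:     {mp_per_s(2560, 1440, 60):.0f} MP/s")       # ~221
```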
At a really abstract level, you could argue that if you need to push a lot of pixels, you tend towards being GPU-limited, and if you want higher framerates, you tend towards being CPU-limited. I think the reality is that most systems sit in the middle ground where both still have a significant influence. It makes sense for games companies to try to balance the load on both. Maybe the CPU bar to reach a 60fps minimum is lower than I think.
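A toy model of what I mean (all the numbers here are made up purely for illustration): treat frame time as the slower of a roughly resolution-independent CPU cost and a GPU cost that scales with pixels pushed.

```python
# Toy model of the CPU/GPU bottleneck trade-off. The costs below are
# invented for illustration, not measured from any real system.

def fps(pixels_mp, cpu_ms=8.0, gpu_ms_per_mp=2.2):
    gpu_ms = gpu_ms_per_mp * pixels_mp
    frame_ms = max(cpu_ms, gpu_ms)  # bottlenecked by whichever is slower
    return 1000.0 / frame_ms

print(f"1080p: {fps(2.07):.0f} fps")  # CPU-limited: GPU has headroom
print(f"1440p: {fps(3.69):.0f} fps")  # roughly balanced
print(f"4k:    {fps(8.29):.0f} fps")  # GPU-limited
```

With those made-up costs, 1080p is CPU-bound, 4k is GPU-bound, and 1440p sits right in that middle ground where shifting budget to either part helps.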
Edit:
All this talk has reminded me of a gripe I had. When I tried to buy a gaming laptop, I was annoyed that they all put too much of the budget into the CPU and not enough into the GPU. My current personal laptop has a 6700HQ and a 970M in it; I would have preferred a 980M paired with a lesser quad core (with or without HT), but no one offered that at the time.