I know that for a standard 60 Hz TV/display anything over 60 fps is meaningless, but I just fear what the future holds for FX CPUs, including my own, now that there are newer titles out there that will drop to 30 FPS CPU-bound on an FX.
Rise of the Tomb Raider is one. The game suffers terribly even with my GTX 980 Ti at 1080p (not my normal resolution, but useful for testing): there are many areas where the game crawls at 30 to 40 fps while my GPU sits at only 50-70% utilization. Crank the resolution up to 3440x1440 and the frame rate stays exactly the same, which is the classic sign of a CPU bottleneck.
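For anyone who wants to sanity-check this on their own rig, here's a rough Python sketch of the same test; it just polls nvidia-smi while you play a heavy area (so it assumes an NVIDIA card with nvidia-smi on the PATH, and the 90% threshold is my own rule of thumb, not a hard rule). If the GPU averages well under full load while the frame rate is tanking, the CPU is the wall.

```python
import subprocess
import time

def gpu_utilization_percent():
    """Query current GPU utilization via nvidia-smi (first GPU, 0-100)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Sample once a second for a minute while standing in a heavy area.
    samples = []
    for _ in range(60):
        samples.append(gpu_utilization_percent())
        time.sleep(1)
    avg = sum(samples) / len(samples)
    print(f"Average GPU utilization: {avg:.0f}%")
    if avg < 90:  # assumed threshold: well below full load
        print("GPU has headroom -> frame rate is likely CPU-bound.")
    else:
        print("GPU near full load -> GPU-bound.")
```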
Looked at some YouTube videos of the FX 6300 vs. the i3 6100 in games like Grand Theft Auto, and frames can drop into the 30s on the 6300 while the i3 6100 holds 50+. That's a big smoothness gap for just a CPU difference. Some other games weren't too bad and the two had similar results, but there are those titles where big open areas with lots of objects just hammer the CPU.
The reason I'm comparing FX 6300s to i3s now is that I recently helped a friend build a tight-budget gaming PC. I put him on a 6300 because I thought, for some reason, that DDR4 was really expensive, and I wasn't familiar with the Skylake i3s and how strong their single-threaded price/performance is. I just assumed that on a tight budget I was looking at AMD. Now that I realize he could have had an i3 6100 build for nearly the same price, I feel guilty. I don't really have any faith in the future of multi-threaded gaming right now; I have a feeling that single-threaded reliance is here to stay. And watching frame rates drop into the 30s, knowing now that for $20 more they could have stayed in the 50s or higher in the same areas based on CPU performance alone... you know?