You guys sure like spending other people's money
Here I am with a rig from 2008 plus an AMD 270 (non X) 2GB and gaming just fine on my 1080p 52" TV. Haven't been around here for a long time, but when did it start that every rig build includes a $600+ video card by default? The games I play aren't bleeding edge, but they are modern and I play with max settings (except AA) and have no issues. Using 1080p and only 1 gaming monitor reduces your graphics processing requirements a great deal. A quality video card in the $150-250 range should be more than adequate. My 270 was $175 2 years ago.
When a person comes for help building a rig at a price point, we build it to that point. If you look at the first build I put together, I actually did try to save a few dollars. Who knows what the financial status of the OP is; maybe $2k is pocket change for him? If the OP came and said "build me a rig for $1k", I'm sure we could have done that as well.
If you aren't playing modern, AAA games, a $150-250 card (GTX 960 / R9 380X) would probably be okay. In modern games, BF3 or later, however, you'll be getting 50 FPS or less on Ultra @ 1080p with those cards (probably 30-40 with your 270).
To max today's games @ 1080p and maintain a smooth frame rate, you need an R9 390 or GTX 970 at a minimum. With the OP's budget, they can afford to bump that up a level (390X or 980) and be ready for 2016's slate of games as well.
http://www.bit-tech.net/hardware/graphics/2013/11/13/amd-radeon-r9-270-review/3
Anything more is just throwing money away.
Can you actually see it, or is it a mental thing?
Oh boy, here we go: FPS less than 60 again. I really don't think people can see it; I have done blind tests.
Varies person to person. I think, even more so than just differences in vision, it's differences in what a user is accustomed to. 10 years ago, I was comfortable with 30+ FPS. 5 years ago, I was comfortable with 50+. Now, I'm comfortable with 60+. As my hardware has improved, I've gotten used to smoother frame rates, and the difference when going backwards is more evident.
Though I can only speak for myself and what I'm comfortable with, the human eye is physiologically capable of detecting differences in frame rates even above 60.
No, the human eye and brain can't tell the difference between 25-30 fps and 60 fps and beyond. QUOTE: "The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually."
Here is the link. https://en.wikipedia.org/wiki/Frame_rate
Movies are still at 24-25 FPS and people can't tell the difference. Also, all HD TV at 1080i ("i" means interlaced) works like this: out of the 60 Hz, first the odd lines are drawn on your screen, then the even lines, which splits the picture into 30 Hz for each field. Can you see that happening? No.
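To illustrate the idea (a minimal sketch, not a broadcast-accurate model): an interlaced signal sends the odd scanlines in one field and the even scanlines in the next, and the display weaves the two half-rate fields back into one full picture.

```python
# Minimal sketch of 1080i-style interlacing: a frame is split into two
# fields (odd scanlines, then even scanlines), each sent at half the
# refresh rate, and the display weaves them back together.
def split_into_fields(frame):
    """frame: list of scanlines. Returns (odd_field, even_field)."""
    odd_field = frame[0::2]   # scanlines 1, 3, 5, ... (0-indexed 0, 2, 4)
    even_field = frame[1::2]  # scanlines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Reassemble a full frame by interleaving the two fields."""
    frame = []
    for o, e in zip(odd_field, even_field):
        frame.extend([o, e])
    return frame

frame = [f"line{i}" for i in range(8)]
odd, even = split_into_fields(frame)
assert weave(odd, even) == frame  # weaving the fields recovers the frame
```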
The only reason people think FPS is so important today, compared to when my Voodoo2 accelerator ran Need for Speed SE so smoothly in Glide at 20-30 FPS, is that when there's blurring, micro stutter, or frame skipping, people buy new graphics cards, and the only thing we test is FPS; with that understanding, a new graphics card always helps anyway. Graphics are so complicated now that FPS alone isn't telling the story anymore.
If you want to take a blind test to see if you can tell the difference between 60 fps and 30 fps, I will do a run-through of BF4 on my GTX 970 at 1080p, locking the frame rate at set values, and I won't have the OSD showing.
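For anyone curious how that kind of lock works, here's a rough sketch of a simple frame-rate cap (not BF4's actual limiter, just the general idea): render a frame, then sleep off whatever is left of the per-frame time budget.

```python
import time

def run_capped(target_fps, frames, render):
    """Run `render` for `frames` iterations, capped at target_fps."""
    frame_budget = 1.0 / target_fps  # e.g. 1/30 s or 1/60 s per frame
    start = time.perf_counter()
    for i in range(frames):
        frame_start = time.perf_counter()
        render(i)  # do the per-frame work
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # burn off the slack
    # effective frame rate over the whole run
    return frames / (time.perf_counter() - start)

# With a trivial render function the cap dominates, so the effective
# rate lands at or just under the target (sleep granularity permitting).
fps = run_capped(target_fps=60, frames=30, render=lambda i: None)
```

Real limiters use busy-wait or high-resolution timers for the last millisecond or two, since `time.sleep` can overshoot, but the budget-and-sleep loop is the core of it.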
However, recent studies have shown that the retina actually juggles when processing information. In 2014, research showed that the human eye can see at various frame rates, varying from person to person.[10] In 2011, a forum post claimed that the retina takes about 5 to 12 milliseconds for an electrical impulse to fire and reset, and that 100 to 1000 rods, depending on where in the retina they are, can fire every 7 milliseconds on average, or around 140 fps.[11] Another website said that the human eye on average can see up to 150 fps. What the average "frame rate" of the human eye is remains unsettled, but studies within the last decade suggest it sees anywhere between 75 and 150 fps, with an average of about 140 fps.
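The "around 140 fps" figure in that passage is just the reciprocal of the quoted 7 ms firing interval; a quick sanity check of the arithmetic:

```python
# Back-of-the-envelope check of the numbers quoted above:
# a rod that fires and resets every 7 ms on average cycles
# roughly 1000 / 7 times per second.
interval_ms = 7
rate_hz = 1000 / interval_ms
print(round(rate_hz))  # ~143, i.e. "around 140 fps"
```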