
Convince me to buy an AMD chip

If you want to build a new HTPC/gaming rig, why are we not talking about thermal and noise issues? THAT is where I would make my arguments. Unfortunately I'm not up on the latest from AMD, but if there were any options that gave you better performance with less power consumption and heat (which usually means less noise ;) ), that's where I would start.

edit: oh ya.. now I remember they didn't get that either... :shrug:
 
Your reliance on artificial benchmarks to determine performance is pathetic. I challenge you to run your 10 favorite games on an i5-2500 or i5-3550 vs. the FX-8350 and show me where you can VISIBLY see a difference in performance. Nobody needs a frame rate above 40 fps. NOBODY. It can't be noticed as better above 35 fps. So your benchmarks are garbage, total manure, and mean less than garbage.
I will show you video and photo editing jobs where the AMD FX CPU takes significantly less time to do the work. That DOES mean a whole lot if you are at work or your time means something to you.
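
If you actually want to run that comparison, here's a minimal timing sketch, assuming ffmpeg is installed and "input.mp4"/"output.mp4" are just placeholder names for whatever clip you test with:

# Time one CPU-bound x264 transcode; run the same script on each machine
# and compare the wall-clock result.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",        # overwrite the output without prompting
    "-i", "input.mp4",     # placeholder source clip
    "-c:v", "libx264",     # software (CPU) encode
    "-preset", "medium",
    "output.mp4",
]

start = time.perf_counter()
subprocess.run(cmd, check=True)
print(f"Encode took {time.perf_counter() - start:.1f} s on this CPU")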

Yes, the CPU doesn't matter nearly as much as the GPU, and an FX-8350 is more than enough for a gaming machine.

However, your facts on FPS are garbage, as the AVERAGE human can notice differences up to 60 FPS. Many gamers claim they can tell a difference above this as well, not so much in smoothness as in reduced motion blur and such.

http://boallen.com/fps-compare.html
http://frames-per-second.appspot.com/
 
Technically the human eye can notice up to 200 fps... the trick is that very few of us have ever seen more than 24 fps (movies/TV are generally 24 fps)... and most monitors don't go above 60 Hz.

But when shown TVs with media playing at 120 fps and 200 fps, the human eye COULD see the difference; above 200 fps the eye can't tell the difference anymore.
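
To put those rates in perspective, here is the frame-time arithmetic behind the numbers being argued about in this thread; the step from 120 to 200 fps is only a few milliseconds per frame:

# Milliseconds per frame at the rates mentioned in this thread.
for fps in (24, 35, 60, 120, 200):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")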


As to the AMD fan in this thread... I can give you a few good reasons to go AMD.

1) you can make a pretty solid HTPC rig for cheap.
2) you won't be supporting Intel
3) since it's cheap, you can overclock with abandon, never fearing for your pocketbook. This means you can do all the fun, funky cooling systems you always wanted to try but couldn't justify the risk.
4) a top end AMD chip is about all you'll need for any game anyway (as of today).
5) More SATA 6 Gb/s ports, more USB 3.0 ports...
6) More software is supporting multi-core chips, making the six- and eight-core AMD parts more viable going forward (see the sketch below).
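
A minimal sketch of what point 6 looks like in practice: a CPU-bound job split across however many cores the chip has, which is exactly the kind of workload where a six- or eight-core FX part can pull ahead of a similarly priced quad-core. The numbers here are arbitrary; it's just an illustration.

# Spread a CPU-bound task over all available cores.
from multiprocessing import Pool, cpu_count

def crunch(n: int) -> int:
    # stand-in for real per-chunk work (encoding, filtering, rendering...)
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 16
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, chunks)
    print(f"Ran {len(results)} chunks across {cpu_count()} cores")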
 
I would purchase IB; it uses less power. Your CPU is still fine; like you said, they have not made games and programs that demand new gear yet.

I know you don't like Intel; however, they have the best thing going for performance and power usage. :cool: :popcorn:


About FPS: you can't see action that is faster than 60 fps; you will miss what happened if the action was that fast, so anything faster than 60 is just a waste. Movies are 24 fps and you can see all the action.

Games nowadays add motion blur so it looks more like real life, and console games are limited to 30 or 60 fps.

Most monitors have a refresh rate of 60 Hz; the rest of the frames from the video card are discarded, so they're never drawn on the screen.
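
As rough arithmetic on that last point, assuming a plain 60 Hz panel with no 120 Hz or adaptive-sync hardware in the picture, anything rendered past the refresh rate never reaches the screen:

# Frames shown vs. discarded on a fixed 60 Hz display.
refresh_hz = 60
for rendered_fps in (40, 60, 90, 144, 200):
    shown = min(rendered_fps, refresh_hz)
    print(f"GPU renders {rendered_fps:>3} fps -> {shown} shown, {rendered_fps - shown} discarded")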
 
You hit the nail on the head.


As for the 200 fps claim above: 120 Hz, probably; 200 Hz, prove it. The point is that 120 Hz monitors are actually few and far between and very expensive. Only one-dimensional rich boys have that kind of money to burn.
 
120 Hz monitors start at $249 on Newegg. That doesn't scream one-dimensional rich boy to me.

As for going AMD over Intel, the only really compelling reason to go Intel for most tasks is power draw. Piledriver still chews up power, which means heat and higher electricity costs. I honestly doubt I could tell, while gaming, if you switched out my 5 GHz 2700K for a stock 8350.
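
Some back-of-the-envelope numbers on that power point, using stock TDPs as a crude stand-in for load draw (real consumption varies a lot, especially overclocked) and assumed usage hours and electricity price; every figure here is just an assumption to show the shape of the math:

# Rough annual running-cost comparison at load.
tdp_watts = {"FX-8350": 125, "i5-3570K": 77}   # nominal TDPs as a proxy
hours_per_day = 4                              # assumed time at load
price_per_kwh = 0.12                           # assumed $/kWh

for chip, watts in tdp_watts.items():
    kwh_year = watts / 1000 * hours_per_day * 365
    print(f"{chip}: ~{kwh_year:.0f} kWh/yr, ~${kwh_year * price_per_kwh:.0f}/yr")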
 

The better-quality 120 Hz monitors go for far more. If I were to go that way it would not be primarily for gaming but to take advantage of resolutions and refresh rates above 1080p standards. Before HD reared its ugly head I had monitors with a 120 Hz vertical refresh rate at about 2100x1600. That was in pre-LCD days, of course. I would definitely appreciate a higher refresh rate, but just as much a higher resolution. I have a 23" Samsung very-high-contrast monitor that ran me over $300 a couple of years ago. Only 60 Hz. What I would want would probably run me a good $500 to $600. Can't justify it.
 
http://www.cameratechnica.com/2011/11/21/what-is-the-highest-frame-rate-the-human-eye-can-perceive/

What is the Highest Frame Rate the Human Eye Can Perceive?

Patrick J. Mineault, a PhD student in neuroscience, has summarized the surprisingly complicated answer on his blog. The phenomenon behind human frame rate perception is flicker fusion: the point at which a flickering image appears stationary. Flicker fusion depends on three variables:

Stimulus Luminance – The brighter the image, the faster the frame rate needed to saturate human perception. Even the brightest stimuli were indistinguishable above 50 fps.

Image credit: Scholarpedia
Stimulus Area – the larger the stimulus area, the faster the frame rate needed to saturate human perception. Human perception response drops off at 60fps. The area concept directly applies to screen size, meaning that higher frame rates may be more useful on a theater screen than a small monitor.

Image Credit: Scholarpedia
Stimulus Location – the farther away the stimulus is from your center of vision, the less likely you are to perceive it.

Image Credit: Scholarpedia
So it would seem from this data that 60fps is approaching the upper limits of human perception. But movie frame rate perception is a harder effect to quantify since the images aren’t really flickering – they are smoothly transitioned from one frame to the next. Nevertheless, it makes sense then that Peter Jackson is shooting the new Hobbit movie at 48fps. Other directors are also investigating creative uses of higher frame rate movie capture. In the segment below, Douglas Trumbull, of Blade Runner fame, shows how high frame rate shots can be used selectively to enhance action shots.

Still not convinced? Go ahead, crank up that frame rate and try some experimentation of your own.


http://www.scholarpedia.org/article/Flicker_fusion
Flicker fusion
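
As a toy check of the thresholds quoted in that article (roughly 50-60 fps before flicker fusion saturates for bright, large stimuli), here is the claim reduced to a yes/no; it is not a vision model, just the article's numbers:

# Is a given frame-rate jump likely to be obviously visible?
SATURATION_FPS = 60  # upper bound suggested by the flicker-fusion data above

def jump_likely_noticeable(old_fps: float, new_fps: float) -> bool:
    """A jump should only be obviously visible if it starts below the
    quoted saturation point."""
    return old_fps < SATURATION_FPS and new_fps > old_fps

print(jump_likely_noticeable(30, 60))    # True: well inside the perceptible range
print(jump_likely_noticeable(120, 200))  # False: both past the quoted limit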
 
Because a wiki isn't the best source, I'll need more than that. I do know that I've seen a number of things that point to 200-400 fps being the top end of the human eye's range.
 

My post is not from a wiki; it's from Patrick J. Mineault, a PhD student.

What do you have from a PhD showing the human eye can perceive 200-400 fps?
 
Ok, enough about FPS. This is an AMD processor argument, remember?

If you want to continue the discussion, create a thread. Thanks.
 
I don't think anyone can convince you to buy an AMD over an Intel; Intel makes the better chips, so that's impossible.

But AMD chips are not bad chips, nor are Intel chips that much better. It's actually give and take: some things are faster or better on Intel, others are faster or better on AMD. Intel wins more often than AMD does, but when Intel wins, is it by such a significant margin?

No, I don't think so. I think AMD is right for the money; performance is comparable to Intel, and the performance gains that Intel does have are not going to be very noticeable. To most people they won't be noticeable at all.

This is a pretty good and well-balanced review, IMO: http://uk.hardware.info/reviews/3314/amd-fx-8350--8320--6300-vishera-review-finally-good-enough

For me, nothing in that review stands out to where I think the Intel chip is the better chip for its money.

The FX-8350 really is not a bad chip at all; I would go as far as to say it's reasonably good, considering the i7-3770K is usually the benchmark it's measured against.
 
Yeah... if you're building a gaming rig, the FX-8350 is about all the CPU you need (that's before overclocking; I've seen some pretty encouraging early reports that it also supports some serious overclocking), in many cases being about as good as an i5-2500K or i5-3570K... which is pretty much the standard for a gaming CPU.

So at the moment, I'd say it's an ideal time to get in on AMD. Is it the best chip? No... but it's at the top end for price/performance in a gaming rig (stock)... and in apps that make use of its 8 cores, it's far faster than pretty much every Intel chip in its price range, and faster than or on par with Intel chips at the top end of the price range in some cases. Granted, not much software makes use of multiple cores yet... but the number of programs utilizing them has been growing rapidly and will only become more common as time goes on.
 