
FRONTPAGE AMD FX-8150 - Bulldozer - Processor Review


How happy are you with AMD FX-8150 price/performance?


I hope you guys know that 32nm isn't a walk in the park, and going smaller than this doesn't get any easier. Certain wall structures of the transistor are getting close to an atom thick. I can't see how AMD can keep up with these price schemes with their CPUs.

Especially since they went fabless. I'll never understand why they did that; it seems like it can only decrease their potential profits. Yes, risk goes down, but the upside goes way down as well.
 
I know, but there's no getting around AMD's main problem, which is that they simply don't have the kind of money to throw at the issue that Intel does.

Good engineering can only go so far; you need the $ for R&D and machinery... No one in the industry is in the position Intel is in. Good for them, bad for their competition.
 
The fact is that soon they (Intel and AMD) won't be able to gain anything by die shrinking. There are physical limits that will be reached, at which point the only way to "grow" the technology is innovation and architecture improvements.

As I see it, Hyper-Threading was pure luck for Intel. When it first came out it seemed like a Hail Mary, as AMD had the crown in both architecture and performance. Why and how they lost that boils down to the fact that Intel has more OEMs buying their CPUs and selling computers. I don't know the numbers at all, but I wouldn't be surprised if Intel sells 100 CPUs for every 1 that AMD does. Thus Intel was able to go brute force by dumping tons of money into R&D and tech. Then we got C2D, C2Q, and then the Core i series.

Thus Intel has the advantage for no other reason than marketing. Even when the P4 sucked, they still sold tons of them. Marketing. When was the last time you saw an AMD commercial on normal TV? I honestly don't think I ever have. The last time I saw an Intel one was yesterday.

Even if Bulldozer were everything we all hoped it would be, AMD wouldn't be in a vastly different position. They would probably have sold more CPUs to enthusiasts, but I don't think that would have impacted the overall numbers very much. Which apparently doesn't matter anyway, because they're still sold out. You can't sell 1,000 if you only have 100.
 

The consumer wants the best price:performance ratio. When retail markup puts a $245 MSRP chip at $275, people will go to the alternative: a 2500K at $179. It appears Intel has won consumer confidence instead of our beloved AMD. Do I hear an Apple buyout of AMD coming soon? Methinks so.
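Just to put rough numbers on that gap (a quick back-of-the-envelope in C; the $275 and $179 street prices are the ones being quoted in this thread, not verified, and it ignores motherboard costs):

Code:
#include <stdio.h>

int main(void)
{
    /* Street prices quoted in this thread -- assumptions, not verified. */
    double fx8150  = 275.0;  /* FX-8150 at current retail markup */
    double i52500k = 179.0;  /* i5-2500K */

    double premium = (fx8150 / i52500k - 1.0) * 100.0;
    printf("FX-8150 premium over the 2500K: %.0f%%\n", premium);
    /* At these prices the FX would need to be ~54% faster overall just to
       match the 2500K on price/performance, before board costs. */
    return 0;
}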
 
Still trying to get the money for a BD
Really, I don't know why AMD sold GlobalFoundries. Had they kept an iron grip on it, they could have had them perfect the 32nm process a bit better than it turned out originally. Also, a die shrink can only carry performance so far, and I personally wouldn't mind BD being ported to a 40nm process for better yields. 32nm is quite small, and I think Intel may run into the problem of electrons suddenly jumping around the transistors into places they are not supposed to be, because surprisingly they CAN do that.

But who knows; TSMC may have perfected their process by now. Also, I highly doubt that Apple will ever buy AMD. Apple has so far been using Intel to drive their machines, and using AMD for their desktops and laptops would mean a complete recode of the OS. And if AMD chips were used just for notebooks, it would be a waste.

I would not say that AMD is behind just yet! All companies have their bad moments, but they will prevail sooner or later. Even if they do fail in the CPU market, they still have the GPUs. Their GPUs are becoming more popular as gamers realize that most of the extras on an Nvidia card don't really give you anything special, since people don't tend to use them. With OpenCL and the number of cores those GPUs have, you would see some pretty god-like performance.

SUPREME ULTRA MEGA EDIT: It seems that all the consumers of the FX-8120 and FX-8150 actually adore the thing, while most of the bad reviews are fakes or come from people who don't own it and are just going off what they read in a review. Hm... really, AMD should try pimping it out as hard as Master Chief and Halo, and I bet they could pull in the large corporations as well. One of the reviewers also said that to see the real potential of the core, you need to build a Project Scorpius system. This means an FX requires a 990X chipset and an HD 6000 series GPU to see its true potential. Any confirmations?
 
Tangletail said:
Also, I highly doubt that Apple will ever buy AMD. Apple has so far been using Intel to drive their machines, and using AMD for their desktops and laptops would mean a complete recode of the OS.

I have no idea about Apple and AMD.
What I know is that Apple has been using Intel CPUs since January 2006, and that PowerPC Macs were Motorola powered.
Mac OS X was developed for Motorola chips and recoded for x86.
You can run a fully functional hackintosh with an AMD chip, so I don't believe any recoding is even needed.
Apple already works a lot with AMD, as they exclusively use AMD Radeon GPUs.
With Apple weighing in at something like 5% of the market, that could be a good thing for AMD...
 
Tangletail said:
One of the reviewers also said that to see the real potential of the core, you need to build a Project Scorpius system. This means an FX requires a 990X chipset and an HD 6000 series GPU to see its true potential. Any confirmations?

Our review used a Scorpius system (Crosshair V Formula, FX-8150, and HD 6970). Our numbers were right on par with everyone else's. On price-to-performance, I concluded it was a good chip at AMD's price of $245. After retail prices came out, I added the caveat that it is in no way worth $280 (unless you just want the newest CPU and are willing to pay the premium).

EDIT - The whole system-naming thing is just a gimmick; I've never seen any tangible added performance from going all-AMD just to get the name. AMD would only hurt themselves if that were the case, because they'd have to write the gain into their GPU drivers, which would hinder performance on Intel systems. I think it's a very safe assumption that they'd rather sell more GPUs to the entire market than artificially limit themselves to prop up the CPU/chipset division.
 
Tbh, no one seems to mention this, but I think it is important: the power draw on that thing is insane. If I bought one, I would have to upgrade my 550 W PSU just to run it with one GPU. Bulldozer and CrossFire might be a nice way to heat the house in the winter, but not much fun in the summer.
 
Lots of people have mentioned power draw. I don't mind it as much as performance, but there are those who value it more than I do. Power draw was listed in the article, and I did mention in the first comment that no one without free electricity would want to fold with it. :D
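For anyone wondering what "free electricity" actually works out to, here's a rough sketch. The 400 W full-load draw and $0.12/kWh rate are illustrative assumptions on my part, not figures from the review:

Code:
#include <stdio.h>

int main(void)
{
    /* Assumed numbers for illustration only. */
    double load_watts     = 400.0;        /* full-load system draw at the wall */
    double rate_per_kwh   = 0.12;         /* electricity price in $/kWh */
    double hours_per_year = 24.0 * 365.0; /* folding around the clock */

    double kwh_per_year = load_watts * hours_per_year / 1000.0;
    printf("Folding 24/7: ~%.0f kWh/year, roughly $%.0f in electricity\n",
           kwh_per_year, kwh_per_year * rate_per_kwh);
    return 0;
}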
 
Tangletail said:
It seems that all the consumers of the FX-8120 and FX-8150 actually adore the thing, while most of the bad reviews are fakes or come from people who don't own it and are just going off what they read in a review.
Bottom line is that most of these reviews are legit, and while you may be having a hard time accepting that, it's the truth. You can go back and see how many other CPUs they have rated to get more perspective; most of these reviewers have been in the business since the K6-2 days and have reliable benchmarking tools. Fact is, the FX underperforms in single-threaded applications despite the high clock speed. My feeling is that AMD fell short of the mark with this processor.
 
Do you think Anandtech is faking results?

http://www.anandtech.com/bench/Product/288?vs=434

Oh, but I forgot, it will work much better with Windows 2028...

EDIT: and that's compared to a 30% cheaper CPU.

storm-chaser said:
Bottom line is that most of these reviews are legit, and while you may be having a hard time accepting that, it's the truth. You can go back and see how many other CPUs they have rated to get more perspective; most of these reviewers have been in the business since the K6-2 days and have reliable benchmarking tools. Fact is, the FX underperforms in single-threaded applications despite the high clock speed. My feeling is that AMD fell short of the mark with this processor.

In Tangletail's defense, I think he is referring to the consumer reviews on sites such as Newegg, TigerDirect, etc., not to the professional reviews to which you are referring. I also second his comment: it seems most (read: not all) who actually have the platform have been pleased with it and are having tons of fun tweaking it. With that said, I really hope AMD can get some improvements on track with Piledriver and successive updates.

Edit:
Also, something interesting: this site is doing extensive testing of BD under Linux and has implemented some of AMD's compiler optimizations in the process. The results seem pretty good, and apparently there are still more optimizations to implement.
http://www.phoronix.com/scan.php?page=article&item=amd_fx8150_bulldozer&num=1
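For anyone wondering what "compiler optimizations" means here in practice, a minimal sketch of the idea (my own example, not taken from the Phoronix article, and their exact flags may differ): GCC 4.6 and later know Bulldozer as -march=bdver1, which lets the compiler target the chip's new instruction sets on ordinary loops.

Code:
/* Build with Bulldozer tuning (needs a GCC that knows bdver1, e.g. 4.6+):
 *     gcc -O3 -march=bdver1 saxpy.c -o saxpy
 * Build generic for comparison:
 *     gcc -O3 saxpy.c -o saxpy_generic
 */
#include <stdio.h>

#define N 1000000

static float x[N], y[N];

int main(void)
{
    int i;

    for (i = 0; i < N; i++) {
        x[i] = (float)i;
        y[i] = 2.0f * (float)i;
    }

    /* A simple SAXPY loop the compiler can auto-vectorize; with
     * -march=bdver1 it is free to use the AVX/XOP/FMA4 encodings
     * Bulldozer supports. */
    for (i = 0; i < N; i++)
        y[i] = 2.5f * x[i] + y[i];

    printf("y[123] = %f\n", y[123]);
    return 0;
}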
 
I have a lot of fun with my PhII.
But no one with common sense can say BD is not a failure. That is a fact.
And I don't believe Tangletail was talking about Newegg...

EDIT: better to leave this discussion. Between AS5 being junk and BD being a success, there might be a cure for bad faith...

EDIT BIS REPETITAS: Nvidia makes great cards. You see my rigs? Not enough money to get a 580... but if I had it, I would grab a pair of those!
 
I'd not call BD a failure. I think it's the first step in the direction they need to go to succeed; however, I don't really intend to upgrade until Piledriver, possibly not until 2013, as what I really want is the successor to Trinity. The next-gen APUs with a fully vector-based GPU component sound like an enormous amount of potential and possibly the biggest leap in general-computing performance I've ever seen.

From that standpoint I believe AMD should ignore x86 floating point altogether and keep trying to improve integer performance. After all, that's the only part that will eventually carry over into their future heterogeneous computing plans.
 
Anybody think this may be a repeat of the Phenom I to Phenom II phase, where everything just got better with the Phenom II?
 
I'd not call BD a failure.
How can it not be termed a failure? Reduced IPC, insanely high power consumption, beaten in most benchmarks by current Intel Sandy Bridge CPUs, and even in some by last-generation Intel CPUs. The only hope is a Phenom I to Phenom II transformation, but by the time that can happen (six months minimum), it will be up against Intel's Ivy Bridge and fall even further behind. AMD's more-cores design philosophy just doesn't cut it with the way software is developed at this time; single-threaded performance is way too low. I was hoping for a better result, but this is bad for AMD and bad for the consumer too.
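To put numbers on that last point, here's a quick Amdahl's-law sketch. The 40% parallel fraction is just an assumption for typical desktop software today, not a measurement:

Code:
#include <stdio.h>

/* Amdahl's law: overall speedup for a workload whose parallel
 * fraction p is spread across n cores. */
static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / (double)n);
}

int main(void)
{
    double p = 0.40;   /* assumed parallel fraction of the workload */
    int cores;

    for (cores = 1; cores <= 8; cores *= 2)
        printf("%d core(s): %.2fx speedup\n", cores, amdahl(p, cores));

    /* Even with all 8 cores busy this workload only gets ~1.5x faster,
     * while a 20% faster single thread speeds up everything. */
    return 0;
}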
 