AMD’s FX-8150 – Bulldozer – Processor Review

Bulldozer. The name is fraught with implications of power. Pure, unadulterated, run-you-over power. Much has been said and debated about AMD’s newest offering over the past year plus. Now, the wait is over and we’re here to bring you the full scoop!

AMD FX CPU Die

The Bulldozer Architecture

AMD has really reinvented the wheel with Bulldozer. It’s a completely different animal than other CPUs out there (we’ll see later whether that’s a good thing or a bad thing). With Bulldozer, AMD has taken a less-is-more approach. Their engineers put two cores side-by-side and considered what could be shared between them.

They decided to share the front end, the floating point unit, and the L2 cache between two cores. It’s not shown in this diagram, but in addition to the per-module L2 cache, all of the modules share a single L3 cache.
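
As a rough mental model of that hierarchy, here is a structural sketch in C. This is our shorthand, not anything from AMD's documentation; the 2 MB per-module L2 and 8 MB shared L3 are the published Zambezi cache sizes.

```c
/* Rough structural sketch (ours) of a Bulldozer module and the Zambezi die.
 * Field names are shorthand; 2 MB L2 per module and 8 MB shared L3 are the
 * published Zambezi cache sizes. */
typedef struct {
    int integer_core[2];  /* two dedicated integer cores per module */
    int shared_frontend;  /* fetch/decode shared by both cores */
    int shared_fpu;       /* one FPU with two 128-bit pipes, shared */
    int l2_kb;            /* 2048 KB L2, shared within the module */
} bulldozer_module;

typedef struct {
    bulldozer_module modules[4]; /* four modules = the "eight cores" */
    int l3_kb;                   /* 8192 KB L3, shared by all modules */
} zambezi_die;
```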

Bulldozer Concept

Bulldozer Realized

The next four slides bore a little deeper into Bulldozer’s shared and dedicated components, adding detail to the diagram on the right above. What may give you pause is the shared floating point core. While most applications lean primarily on the integer (CPU) cores, those that lean heavily on the FPU would seem to be losing out on this deal.

There are still two 128-bit pipes through which to process data. The front end splits 256-bit FPU content into two 128-bit FPU operations and sends them down the pipes, where they are processed at the same time. So, in essence, the shared FPU can execute either two separate 128-bit operations or a single 256-bit operation. What if your application wants to run more than four 256-bit FPU operations at once? With the loss of the extra FPU cores, the inability to split the workload any further points to a potential performance loss.
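
To make the trade-off concrete, here is a minimal C sketch (ours, not AMD's) of the same eight-float add written both ways: as a single 256-bit AVX operation, which Bulldozer's shared FPU services by occupying both of a module's 128-bit pipes at once, and as two explicit 128-bit SSE operations.

```c
/* Illustration only: one 256-bit AVX add vs. two 128-bit SSE adds. On
 * Bulldozer the 256-bit form is split across the module's two 128-bit
 * pipes, so both versions occupy roughly the same FPU hardware. */
#include <immintrin.h>

void add256(const float *a, const float *b, float *out) {
    __m256 va = _mm256_loadu_ps(a);               /* one 256-bit load */
    __m256 vb = _mm256_loadu_ps(b);
    _mm256_storeu_ps(out, _mm256_add_ps(va, vb)); /* one 256-bit add */
}

void add2x128(const float *a, const float *b, float *out) {
    __m128 lo = _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b));         /* low half  */
    __m128 hi = _mm_add_ps(_mm_loadu_ps(a + 4), _mm_loadu_ps(b + 4)); /* high half */
    _mm_storeu_ps(out, lo);
    _mm_storeu_ps(out + 4, hi);
}
```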

Shared Frontend

Dedicated Cores

Shared FPU

Shared L2

Bulldozer is the architecture, but the code name for the CPU we’re looking at today is actually Zambezi, the desktop variation of Bulldozer. All of AMD’s new products use Bulldozer cores on both the server and the desktop sides of the market.

The Die - Bulldozer Modules

The Die Labeled

Power management is a big deal for AMD, especially when it comes to the server market. It never hurts for the desktop side to reap the benefits though, and that’s what we have here. AMD has reduced the idle frequency to a meager 1.4 GHz on the FX-8150 we’re looking at today and reduced the idle Vcore to match. The change between states is effectively instantaneous, as you would expect.

That said, while power isn’t a huge deal to many overclockers, it is worth a mention. Here is how this setup compares to an Intel setup with a 2600K and the same GPU. There is a strong disparity in power consumption; Sandy Bridge is clearly much more power-efficient than Bulldozer.

Test Setup    Idle (Watts)    CPU Loaded (Watts)
i7 2600K      97              158
FX-8150       121             246
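
For scale, here is a back-of-the-envelope sketch of what that 88 W loaded-power gap costs over a year. The duty cycle and electricity rate are our assumptions for illustration, not measured data.

```c
/* Back-of-the-envelope energy cost of the loaded-power gap in the table
 * above. Duty cycle and electricity rate are assumed, not measured. */
#include <stdio.h>

int main(void) {
    const double watts_2600k = 158.0, watts_fx8150 = 246.0;
    const double hours_per_day = 4.0;  /* assumed hours at full load per day */
    const double usd_per_kwh = 0.12;   /* assumed electricity rate */

    double extra_kwh_year = (watts_fx8150 - watts_2600k) / 1000.0
                            * hours_per_day * 365.0;
    printf("Extra energy: %.1f kWh/yr -> $%.2f/yr\n",
           extra_kwh_year, extra_kwh_year * usd_per_kwh);
    return 0;  /* roughly 128.5 kWh/yr -> about $15/yr at these assumptions */
}
```

Distributed computing loads that run 24/7 would multiply that figure several times over.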

AMD has also advanced Turbo Core, which now has a base turbo and a max turbo. All cores can exceed their stock frequency (up to 3.9 GHz in this case) if the chip’s TDP hasn’t been reached. That didn’t seem to happen much in our testing, however. If a stressful multi-threaded load is applied, chances are it will consume the TDP headroom and the CPU will operate at its base frequency (3.6 GHz on the FX-8150).

Lightly-threaded loads do take great advantage of Turbo Core though. With a max turbo of 4.2 GHz, single- to quad-threaded applications (which comprise a lot of consumer applications, sometimes even games) get a healthy 600 MHz boost over the base frequency.
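
Here is a heavily simplified toy model of that behavior as we understand it; the thresholds are our reading of the slides, not AMD's actual power-management logic.

```c
/* Toy model (ours) of Turbo Core frequency selection on the FX-8150:
 * 4.2 GHz max turbo for lightly threaded loads, 3.9 GHz all-core turbo
 * while TDP headroom remains, 3.6 GHz base once the TDP is consumed. */
typedef struct {
    int active_cores;  /* cores currently under load, 1-8 */
    int tdp_headroom;  /* nonzero if the chip is under its TDP budget */
} load_state;

int target_mhz(load_state s) {
    if (s.active_cores <= 4 && s.tdp_headroom)
        return 4200;   /* max turbo: half the cores or fewer are busy */
    if (s.tdp_headroom)
        return 3900;   /* all-core turbo within the TDP budget */
    return 3600;       /* base frequency once the TDP is used up */
}
```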

Power Management

Turbo Core

Eventually there will be eight different Bulldozer SKUs to choose from. There are actually seven different models, but the FX-8120 will be available in both 95 W and 125 W TDP variations.

At launch, though, there will be four Bulldozer CPUs to choose from: the flagship FX-8150, the next-step-down FX-8120, the six-core FX-6100 and the quad-core FX-4100. Telling of AMD’s cut-throat pricing (and, potentially, performance), the top-of-the-line FX-8150 is going to be priced at a very reasonable $245.

FX CPU Lineup

FX CPUs Available at Launch

With the technical demonstration out of the way, we get to AMD’s marketing launch presentation. On the left, they compare the technology of the i7 and i5 families of CPUs with the FX-8150. Let’s have a list, shall we?

  • RAM speed you can dismiss; most of you know by now that Sandy Bridge has a stellar memory controller that runs DDR3-2133 without changing a single setting other than the RAM timings and voltage.
  • The CPU spec comparisons can pretty much be dismissed as ‘we’ll see how it performs’.
  • The second most important comparison in this chart for overclockers and gamers is the fact that AMD has a full 32 lanes of PCIe graphics capability.
  • The single most important spec is that all FX processors will have unlocked multipliers. Fun for everybody, regardless of budget!
  • The slide on the right just reiterates some of these points with bigger text.

2600K Tech Comparison

2500K Comparison

Of course, both sides in the CPU battle like to present benchmark graphs, mostly aimed at showing how their products come out ahead. The slide on the left shows a very favorable comparison to the i5 2500K when gaming with Eyefinity. We’ll have to take their word for that one. We’ll be exploring several games, but will do so using the more prevalent 1080p resolution.

On the right is a slide you’ve seen floating around already from a leaked presentation. We can now confirm it was a real slide and that it accurately reflects Bulldozer’s performance. There is one difference from the leaked version though – in the previous version, WPrime performance was graphed incorrectly. See folks? This is why you take anything – even potentially official leaked slides – with a grain of salt prior to the NDA expiration!

Eyefinity Comparison

Comparison Benchmarks

Now we see a little of how Turbo Core can benefit performance vs. the base frequency. On the right is a gaming comparison between a system running a 980X and one running an FX-8150. I’m not sure what to think about these tests. Sure, it can game at 1080p right alongside a kilo-buck CPU, but so can Intel’s cheaper Sandy Bridge offerings. So take heart: if you’re a gamer who won’t use a thousand dollars’ worth of CPU, you don’t need to spend that much. Whew, it’s a good thing they saved all that money for you, isn’t it?

Turbo Core Benefit

980X Gaming Comparison

Now we get to the exciting part of the presentation – the future! The left slide speaks in generalities – things you’d hope they would do anyway. On the right we get some specifics, and it looks pretty encouraging. You saw in the slide above that the FX-8150 is a worthy competitor to the i5 2500K but didn’t quite catch the i7 2600K. A 10-15% improvement with Piledriver should close that gap to almost nothing. Intel isn’t standing still though, so AMD will have to get that IPC up to boot. As a side note, I like AMD’s architecture naming scheme – large construction equipment!

Coming FX Enhancements

Core Roadmap - Big Machinery!

Well, there you have it folks – the Bulldozer architecture. It’s definitely a big step away from the norm. Can it keep up with Sandy Bridge? We’ll see soon. It is important to consider that AMD has been stressing to reviewers for a while that the FX-8150 is priced to compete with Intel’s i5 2500K, not the pricier i7 2600K.

There are a lot of people out there who expected FX to destroy Intel’s current offering. As you can see above, that’s not where AMD is positioning this chip. Whether that’s a failure on their part is for you to decide – what we’re going to consider is whether this chip performs well enough to justify its price. No more, no less.

The CPU – Now With Water!

AMD sent a nice, large press kit containing the board on which to bench their new CPU, the CPU in the socket, a box for photos and (for some reason or another) a belt buckle. Not too many people I know wear giant belt buckles, but to each their own.

Press Kit Box

Press Kit

Belt Buckle

AMD has switched up their packaging with this generation of processors and will be shipping their FX CPUs in pretty cool looking metal tins. Ours came sans cooler, so we just have an empty box, but this is what they’ll look like on the outside.

AMD FX-Series Box

One very interesting item is that, for the first time to my knowledge, AMD is going to sell a liquid cooling solution with their FX CPUs. They aren’t hiding the fact that it’s an Asetek cooler, and it has a nice, thick radiator to hopefully compensate for being a single-120mm solution. It won’t reach temperatures like you’ll see below with a custom water loop, but it should be a fair sight better than the stock cooler. This unit just arrived today, so there has been no testing yet, but we did snap some photos to show you what to expect.

Water Cooler Box

Water Cooler Box Side

Box Contents

Self-Contained Water Cooling Unit

Coldplate

Select-able RGB LED Logo

Dual Fans

The cooler will be rolling out to select markets “soon” (the US is not first, FYI) so don’t expect to buy it right when you buy a Zambezi CPU. Expected retail, subject to change, is in the $100 range.

Now we bring you the main event. Weighing in at ~2 billion transistors in an area of a mere ~315 mm² is AMD’s Zambezi, the Bulldozer-based FX-8150!

AMD FX-8150 CPU

AMD FX-8150 CPU Pins

Please hold your applause until the review is complete. No, really. Quit clapping. Thanks!

Overclocking for Stability

The brightest spot about Bulldozer is its ease of overclocking. It’s just begging to be clocked. After dialing in very reasonable numbers…

  • 250 MHz bus
  • 2500 MHz CPU-NB and HT speeds (with a minor voltage bump)
  • Rated RAM speed & voltage
  • 1.41 V loaded Vcore

…I started bumping the multiplier and ended up with a very nice 24/7 overclock of 4.75 GHz. That’s 150 MHz higher than I settled on with Sandy Bridge, so consider me pleased.
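
For reference, the arithmetic behind those clocks is just bus speed times multiplier; the 19x and 20x values below are inferred from the 250 MHz bus, not read from BIOS screenshots.

```c
/* Sanity check of the clocks discussed here: CPU frequency = bus clock x
 * multiplier. Multiplier values are inferred, not from the screenshots. */
#include <stdio.h>

int main(void) {
    const int bus_mhz = 250;                  /* bus speed from the list above */
    printf("19x -> %d MHz\n", bus_mhz * 19);  /* the 4750 MHz 24/7 overclock */
    printf("20x -> %d MHz\n", bus_mhz * 20);  /* the 5.0 GHz run shown below */
    return 0;
}
```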

Note that CPUz is reading this CPU incorrectly; it truly is an FX-8150, as you can see in the chip photo above. Temperatures are what you can expect from a custom water loop containing a Swiftech MCR-320 radiator, MCP35x pump (with the 35x top) and an EK HF Supreme Cu block. Temps under load at stock were great, running in the low-to-mid 30 °C range.

4.75 GHz 24/7 Stable

After that I reached 5.0 GHz within ten minutes and ran WPrime 32M without breaking a sweat.

AMD FX-8150 @ 5000 MHz

Since this setup was going to be put under liquid nitrogen, I didn’t explore how far it could be pushed under water. That said, at this overclock it was getting kind of warm (assuming older-architecture temperature rules still apply), and going much farther on this chip without sub-ambient cooling may not be too wise.

Test System, Opponents and Methodology

We’ve got a who’s who of modern mainstream systems for you today. Considering older chips aren’t exactly relevant to current purchasing decisions, we’ve dropped the plethora-of-CPU graphs in favor of easier-to-read-and-decipher graphs with only modern processors.

CPU            AMD FX-8150               Phenom II X6 1100T        Intel i7 2600K             Intel i5 2500K
Stock / Turbo  3.6 / 4.2 GHz             3.3 GHz                   3.4 / 3.8 GHz              3.3 / 3.7 GHz
Motherboard    ASUS Crosshair V Formula  ASUS Crosshair V Formula  ASUS P8P67 WS Revolution   Gigabyte G1 Sniper2
RAM            G.Skill Flare             G.Skill Flare             G.Skill RipjawsX           DDR3-1600
               DDR3-2000 7-9-7-24        DDR3-2000 7-9-7-24        DDR3-2133 9-11-9-24        9-11-9-24
GPU            AMD HD6970                AMD HD6970                AMD HD6970                 n/a

The UEFI used was ASUS version 0813, the most recent version available when the setup was tested, and it was supplied by AMD. The OS for testing was Windows 7 x64 with all updates and patches installed. Not every CPU appears in every graph below; notably, the i5 2500K was (very kindly) tested by our esteemed editor EarthDog, who did not have an HD6970, so there are no 3D/game results for the 2500K.

Benchmark Results

The stock benchmarks were run three times each and the results you see are averaged. The only exceptions were the 3D benchmarks, game tests, and overclocked benchmarks, which were run once each.

The results you see below are graphed relative to the AMD FX-8150's stock performance. This means that the FX-8150's stock results all equal 100% and the other results are graphed as a percentage relative to the FX-8150's performance. So, for instance, if the FX-8150 scored 200 points on a benchmark and the i5 2500K scored 180 points, on the graph the FX-8150 would = 100% and the i5 2500K would = 90%.
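
That normalization is simple enough to express in a few lines; this sketch uses the hypothetical 200/180-point example from the paragraph above, not actual benchmark data.

```c
/* The normalization described above: every score is graphed as a
 * percentage of the FX-8150's stock result. Example numbers are the
 * hypothetical ones from the text. */
#include <stdio.h>

double relative_pct(double score, double fx8150_stock) {
    return score / fx8150_stock * 100.0;
}

int main(void) {
    printf("FX-8150:  %.0f%%\n", relative_pct(200.0, 200.0)); /* prints 100% */
    printf("i5 2500K: %.0f%%\n", relative_pct(180.0, 200.0)); /* prints 90%  */
    return 0;
}
```

(For time-based results where lower is better, the ratio would simply be inverted.)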

Enough talk, let’s bench this thing!

AIDA 64 Benchmarks

First up, we explore the AIDA 64 test suite. These tests were run only at stock to give a comparison of how the chips perform under various testing conditions.

AIDA 64 CPU Tests

Starting with the CPU tests, it seems AMD’s positioning of this chip against the 2500K is accurate. The two chips trade blows throughout the CPU test suite.

AIDA 64 FPU Tests

Floating point performance isn’t looking so good. It’s not that it’s bad compared to the professed competition; it’s that AMD actually lost ground to the Thuban. I mentioned this might happen because of the sharing of FPU cores rather than duplicating the entire core, which would have given the chip a healthy boost with eight FPU cores instead of four. It seems those fears have been realized and the Thuban – with six FPU cores – out-performs Bulldozer. For better or worse, AMD has put all of their performance eggs on the CPU side of the chip.

AIDA 64 Memory Tests

Memory is one area where Bulldozer does quite well for itself. It is within spitting distance of Sandy Bridge (which has a stellar memory controller) for reads and copies, but lags behind a bit in writes. Still, AMD has gained significant ground over Thuban.

2D Benchmarking

We’ll get the bad news out of the way first: AMD gave up on SuperPi. That’s not new; it happened a while ago. SuperPi uses the legacy x87 floating point instructions. There it is again – floating point. Not only did AMD not improve in this benchmark, they lost ground to their older architecture, which isn’t a surprise considering the AIDA FPU tests above. Ground was gained when overclocked, but not enough to call this chip anything close to good at SuperPi.

SuperPi 1M and SuperPi 32M

But wait – Bulldozer focuses on multi-core performance, so it should do well in WPrime, right? Wrong. AMD would say this is an older benchmark. I would say phooey. This processor is supposed to be the epitome of multi-core performance, yet while it still beats its stated competition (the 2500K, remember), it loses ground to the Thuban in a multi-core benchmark.

WPrime 32M and WPrime 1024M

Overclocking the chip (to 4.75 GHz, remember) brings its performance right up next to the i7 2600K’s stock performance. To say running this benchmark was disappointing would be very much an understatement. Overclockers staff members who were shown these results had comments ranging from “there must be something wrong” to simply “this can’t be”.

That said, these results need to come with a bit of a caveat. Believe it or not, in a way they are supposed to look like this. Both SuperPi and WPrime calculations are executed in the FPU cores, and there are only four FPU cores among the four modules, as opposed to eight integer cores. Against the Thuban’s six FPU cores, it makes perfect sense that the older chip out-performs Bulldozer in this benchmark. So while FX is a multi-core powerhouse, as feared, the Bulldozer architecture – which saved die space (and TDP) by sharing one FPU between each pair of cores instead of giving every core its own – hurts performance when performing these calculations.
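
To put rough numbers on that caveat, here is the FPU-pressure arithmetic implied by the core counts above; this is our simplification, as real scheduling is far more nuanced.

```c
/* FPU pressure in a fully FPU-bound, all-threads-busy workload, using the
 * core counts from the paragraph above. A simplification, not a model of
 * real scheduling. */
#include <stdio.h>

int main(void) {
    printf("FX-8150: 8 threads / 4 FPUs = %.1f threads per FPU\n", 8.0 / 4.0);
    printf("Thuban:  6 threads / 6 FPUs = %.1f threads per FPU\n", 6.0 / 6.0);
    return 0;
}
```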

I presented that last paragraph to AMD to get their take; this is what they have to say on the matter:

It’s really more than the FPU as the Bulldozer core can handle two 128-bit FP calculations or one 256-bit. These applications use old extensions (really, yesterday’s tech), while Bulldozer is optimized for the workloads of today and tomorrow. We expect that the applications will be updated with new extensions which will dramatically improve performance (or become obsolete to applications that do get updated). As well, when we begin to see other optimizations like the Windows 8 scheduler, Bulldozer will show to be a great building block for our upcoming products.

Rendering, Video Conversion and Compression

Real world performance. That’s where AMD is hoping to have their moment in the sun. In the case of both Cinebench rendering tests and 7zip compression tests, I think we can happily hand it to them.

Cinebench 10, Cinebench 11.5 and 7zip

The FX-8150 beat out its predecessor and the 2500K at stock, and put a good lickin’ on them overclocked, also beating the stock 2600K. That said, requiring an overclock of over a GHz to beat a stock 2600K is not what you’d call a good sign.

How about some more rendering with some video encoding thrown in?

PoV Ray 3.7 RC3 and x264 Benchmark

Well, PoV Ray is just that – a ray of sunshine through the clouds. At stock it just nudged past the stock 2600K, and overclocking jolted performance up impressively. x264 was a mixed bag, with Pass 1 requiring the overclock to reach parity with the Intel offerings. Pass 2 is much better for AMD, showing stock performance right on par with the stock i7 2600K and then leaping ahead overclocked. What’s important about Pass 2 is that it’s the actual encoding of the video; Pass 1 is just a scanning pass (see the x264 FAQ). So when you’re actually encoding video, the Bulldozer chip comes out smelling like roses.

Now, AMD supplied reviewers with a new version of the x264 benchmark late in the game that was supposed to take advantage of the XOP and AVX extensions. Regrettably, it didn’t make it into testing because the bench system had already been torn down and was being insulated for sub-zero benching. If it is a different build than the one we used (which was the most recent build available), you can expect encoding using those extensions to improve even more.

So in real-world use, Bulldozer isn’t so bad – it’s good even. At stock in each test it is either close to or just below the i7 2600K’s performance and overclocked it overshoots the more expensive i7 by a good bit.

3D Benchmarking

Well, can 3DMark save Bulldozer for the benchmarking community, at least those going for globals? We’ll start with the oldest of the three benched here – 3DMark06.

3DMark06

Um, err… to answer the question: no, it won’t. Yikes. This was just pitiful; I think the results speak for themselves. 3DMark06 does take advantage of multiple cores, but not as much as newer versions. Benchmarking team member dejo has said multiple times that he gets the same scores with HT off as he does with it on, so maybe this isn’t as bad as it looks. Hey, at least the FX-8150 did okay in the CPU test itself – when overclocked to the moon.

How about Vantage?

3DMark Vantage

Nope, not here either. Things are not looking so hot for the DirectX 10 crowd. Vantage can definitely take advantage of all available threads, but sheesh, this isn’t even close – and the Thuban continues to beat Bulldozer.

So…maybe 3DMark11 will look better?

3DMark 11

Ahh, now that we have a heavily GPU-bound benchmark, the Thuban finally gave way to its younger brother, but just barely. Last up in the GPU benchmark family we have HWBot Heaven, another GPU-intensive test (the Thuban was not run on this bench).

HWBot Heaven Benchmark

Well, there’s that – on Heaven, which shows little to no variation between CPUs anyway (obvious from the stock-to-overclocked results), the FX-8150 is dead even with the 2600K.

Those looking for global boints from 3D benchmarks will unfortunately have to look elsewhere; it seems the 2600K is still the go-to for that. Bulldozer did make up ground when overclocked, but not that much. I had expected a greater-than-one-GHz overclock to at least bring the FX-8150 into actual competition with the 2600K, but that’s just not the case, especially considering Sandy Bridge’s hilariously easy overclocking.

Gaming

Now we move on to games, an area where AMD is touting Bulldozer as a top performer. All game tests were run at 1080p with graphics options turned up to the max. We’re aiming for the typical gamer with a good video card and a single 1080p monitor. This is how the chip will perform for most people; we see no value in running low-resolution benches to show something no one will ever experience.

First up, the Stalker: Call of Pripyat benchmark.

Stalker: Call of Pripyat

Here again, Bulldozer continues to trade blows with Thuban. It’s not trading blows with the competition; it’s trading blows with AMD’s old architecture. Overclocked, it gets close to the 2600K’s numbers. There is little difference here as you can see, but the difference is definitely there, and it doesn’t really improve Bulldozer’s outlook.

We’ll test three more games and be done. Aliens vs. Predator is the only one with a Thuban result; the other two weren’t run with the 1100T.

Aliens vs. Predator DX11 Bench, HAWX 2 and Dirt 2

More of the same here. I double-checked the results, and AvP & Dirt 2 did actually lose a hair of FPS overclocked, but that’s small enough to be considered test variation. Put succinctly, overclocking your CPU won’t do much when running those two titles with an HD6970. HAWX 2 does show improvement overclocked, enough to get much closer to the 2600K’s performance.

Truthfully, based on this Bulldozer will be fine in a gaming system. Using one of these chips won’t hurt a gamer in any noticeable way. It also won’t really help a gamer in a noticeable way. If we had a 2500K to test in gaming, it’s safe to assume we would have shown Bulldozer performing right alongside it. I think the best way to describe gaming performance is on par with its stated competition.

Extreme Benchmarking

Ahh, extreme benchmarking – AMD’s saving grace. Before launch, AMD brought several members of the reviewing press out for the tech demo at which they broke the world frequency record. It’s definitely an impressive feat in itself, so I came into this review really excited to get my hands on a chip!

Then I benchmarked it and was somewhat underwhelmed. It did overclock well on ambient cooling though, so there was hope that it would be a great liquid-nitrogen-infused experience. I was not disappointed; the passion was reignited and the insulation process began.

Bulldozer Cold Prep

Some of you may know Giraffe Pot already. It’s huge. With the extension on, it’s darn near the same height as the thermos that was used to pour liquid nitrogen (LN2) into it! This pic is with the thermos at board level.

Giraffe Pot = Thermos Height (almost)

Anyway, I got the software installed and pulled the beast right on down to its full -196 °C because, as we know from the world record attempt, Bulldozer has no cold bug! It got so cold it formed an icicle where the probe emerged from the pot insulation/tape (see the pic above on the left).

Bulldozer Frozen

Bulldozer Frozen Running WPrime 1024M

I love the smell of nitrogen in the evening (seriously though, don’t inhale the stuff; it’s odorless and can asphyxiate you). These were taken while running WPrime 1024M with a full pot.

So, how did it do after about three hours of torture? Not too shabby. I had hoped for more but one or two of the modules in this CPU were much weaker than the first two. Multi-threaded benchmarking could “only” be had in the 6.2-6.5 GHz range. WPrime 1024M passed at 6271 MHz and WPrime 32M passed as high as 6528 MHz.

You can’t see it with CPUz, but CPU-NB and HT were both at 2,500 MHz for the entirety of this cold run. Time didn’t allow experimenting with those as my meager hours were all spent trying to squeeze the most out of the CPU.

WPrime 1024M @ 6271 MHz

WPrime 32M @ 6528 MHz

That was multi-threaded, and at those speeds Bulldozer made a comeback for competitive benchmarking. Regrettably, it’s still not enough to topple an i7 2600K. At 5.4 GHz, my 2600K managed 4.515 seconds in WPrime 32M – more than 0.3 seconds faster at over a GHz less. Looks like we’ll need to be in the 7.0+ GHz range with Bulldozer to make a dent in Sandy Bridge times. As you can see, that’s not necessarily going to be easy.

Thus, we continued hunting for max frequency with single-threaded benchmarks. SuperPi times start getting halfway decent when you run them over 7 GHz – and over 7 GHz it ran. SuperPi 1M was able to complete (barely stable) at a very impressive 7507 MHz.

SuperPi 1M @ 7507 MHz

If you didn’t look too closely at that screenshot, look again – specifically at the memory clocks. That’s right folks, when this memory controller is cold (and it has to be cold!) it can run some really strong memory. In my case it was running at DDR3-2416, but I’ve heard of at least one instance of an FX-8150 running an insane DDR3-3000 at ASUS HQ.

Last, but not least, we have the maximum speed – the highest frequency at which CPUz could save a validation without crashing. It’s the hardest and the simplest “benchmark” all at the same time. Based on my result, and the ease with which AMD binned the few chips used in the world record attempt, I have a feeling this was an ‘average’ chip. Alternatively, it could be the fact that I wanted a living, happy chip after benching it cold – meaning I didn’t put 2+ volts through it like they did. Anyway, the maximum frequency at which this chip could validate was a whopping 7623 MHz!

Check out the validation link too – it did it with the memory even higher at DDR3-2452.

While the scores themselves aren’t exactly show-stopping, the clocks definitely are. On top of that, this was a very fun chip to bench. There really is nothing like filling your pot to the brim (less a bit to allow for boiling without splashing!) and going to town torturing the newest CPU to market.

Final Thoughts and Conclusion

Please read this conclusion to the end. I’ll go ahead and spoil the surprise and say this CPU gets an Overclockers Approved. There, now you have no reason to skip to the logo.

This CPU is a mixed bag, start to finish. Like I said, AMD reinvented the wheel and took a big chance. It took years to get to this point. Their engineers poured blood, sweat and tears into this architecture. It wins in some places and loses in others. Overall, it’s an improvement from their previous Stars-core derived Thuban. In a few places it falls short, but where it counts for most people – real-world applications like rendering, compression and encoding – it’s a strong step forward, competing perfectly in the market segment where AMD’s pricing is positioning it – against the i5 2500K.

That is really the rub for many an enthusiast. Bulldozer has been talked about for a long time. The way a lot of people at overclocking forums talk about it, you’d think it was supposed to be the second coming. It has been argued about, hotly debated and speculated upon forever. Our own Bulldozer Rumors thread has been going strong since January 14 of this year. It is definitely not the Intel-toppler many thought / wished / hoped / begged it would be. Was I a bit disappointed? Absolutely. I bought into the hype just like many of you, and AMD did not produce what a lot of people thought was going to be the return of top-of-the-hill FX.

That, of course, brings us to AMD’s tried-and-true formula: Price-for-Performance. It’s not priced to topple Intel’s higher-end mainstream i7 2600K. At $314.99 shipped, that chip still remains the mainstream enthusiast king-of-the-hill. Every single consumer Bulldozer FX chip will be an unlocked CPU priced at or below $245. Considering the gains seen against the i5 2500K, it’s worth the $25 difference in price.

With all of that said, the overclocker in me is just screaming so what if it just competes with the i5 2500K and doesn’t topple the i7 2600K?! It clocks like a madman. It does its job at the right price for the performance it offers. So, because we try to review based on what a product is relative to what it offers at its price point – and not based on what people with nothing but hopes and dreams expected and/or hoped it would be – it’s safe to call Bulldozer another solid offering from AMD.

Jeremy Vaughan (hokiealumnus)

Author’s Note: Please see the first comment below. There is important information regarding the launch prices for these CPUs. I am changing nothing about the review above, but price point was a huge factor in calling this Overclockers Approved. Seeing the price at $280 when these went on sale made me feel duped. Anyway, regarding CPU purchasing decisions, please see the first comment below the article.

759 Comments:

hokiealumnus's Avatar
After seeing the pricing on these things, I feel the need to add a comment here to go along with my conclusion. Even with all the negativity surrounding the launch, I still think it's a good performer for multi-threaded applications at its $245 MSRP. At the $280 price point they are selling for, no way would I recommend this CPU to anybody. Get the 2600K for a bit more, or save some cash and get a 2500K, which performs nearly as well. This CPU is worth $25 more than the 2500K if you use multi-threaded applications, but it is definitely NOT worth $60 more.

For existing AMD users, if you already have a Thuban, don't bother upgrading unless you want a new toy. Even then, don't upgrade until the price comes down to where it should be.

I'm very disappointed in the current price point of these things. If it's the latest AND greatest, sure, crank the price up at launch...but if it's only the latest, you're just gouging suckers and early adopters. EarthDog does bring up a good point though. While I can't recommend this CPU at $280, AMD is not the one gouging right now, the retailers are. I don't blame AMD for what their retailers are doing, nor do I think AMD themselves set the price too high. At the same time, that's what they're selling for and I can't recommend the CPU at that price point.

Once prices drop to MSRP (or, hey, lower would be even better!), all will be well and I'd recommend this CPU as stated in the conclusion to the article.

---------
I had promised folding results to Shelnutt2, but they couldn't make the review. So, they will be posted in the first post!

Regular SMP work unit - 13698.9ppd

Bigadv work unit - 13859.2ppd

So between 13,500 and 14,000 PPD at stock, which is right where it's positioned - around the PPD of a 2500K. Given the power consumption and the fact that it performs right alongside the 2500K, no person who focuses on distributed computing should touch these with a ten foot pole (unless they have free electricity).
Dolk's Avatar
There will be another article popping up in a couple days to talk about the Architecture of this CPU.
Sammich's Avatar
hmm... well, interesting enough. Saw what I wanted to see, too invested at this point to turn back. I suppose I will be taking a learning experience from this one.
Xterra's Avatar
Looks like we have a price to performance winner again, no?
Route44's Avatar
This is what I was hoping for!
I.M.O.G.'s Avatar
Price performance goes to Intel at the moment with the current Microcenter deal, so long as you have a Microcenter nearby. Check out this thread in our cyber deal forum:
http://www.overclockers.com/forums/s...d.php?t=687633
Dolk's Avatar
@ Xterra

Pretty much
Djak777's Avatar
Seems so. including the normally 60-100 dollar cooler is nice.
Metallica's Avatar
Hmm, not up on the Egg, wonder how long until they have it up.
Llew's Avatar
What type of load temps were you reaching at 4.7? I'm pushing 70c at 4.2 at stock volts... I'm confused as to why my review sample is performing so poorly.

Sorry I'm blind, i see you were in the 30's under load on your swiftech loop. I'm testing on air with the noctua c-12, so I expect my temps to be higher, but not 40deg higher.
Dolk's Avatar
He was still under 60C so still in the range of the current Fire Wall Rule.

We are still seeing the same rules apply with BD as we saw with Thuban and Deneb.
Xterra's Avatar
Microcenter always gives nearly $50-70 in CPU savings... Bulldozer too I assume?
bmwbaxter's Avatar
nice review. a bit of a performance disappointment, but i already ordered mine. so let the good times roll
I.M.O.G.'s Avatar
I hope so! Either way however, the price is right given where the performance drops in.
doz's Avatar
So much for my hope of AMD's repeat of the Athlon XP 1700+ cpu.
Bassplayer's Avatar
Thanks for the review!! I'll be picking one up for some subzero fun.

Come to think of it... we should have organized an event at Microcenter for the launch!
EarthDog's Avatar
I cant say Im jumping ship. My socks are still on, they have not been blown off. Price for what you get though cannot be beat.......outside of that apparent Microcenter deal posted above...
Archer0915's Avatar
I want to see some BOINC numbers. I see no real advantages or reason to move to a new platform if it will not bury an Intel quad or AMD X6. Not beat, but bury.

For me this is a little sad, actually. Come on AMD!

someinterwebguy's Avatar
Nice review.

Looks like I'll grab a BD and other stuff (case, PS, etc) by the end of the month.
Janus67's Avatar
Nice review there men, overall I view it as a disappointment for all of the hype people have created for it (although I never heard much hype by the way of AMD to be honest)

I think that if they made the chip at the $200 mark it would be a sure-fire winner, but at $245-$250 and putting it right between the $200 2500k and $300 2600k, I'm safe to recommend anyone that asks me to spend the extra $50 and get the 2600k, especially as the stock 2600k beat out the overclocked 8150 in a lot of benchmarks. I guess BD for the most part is an improvement over Thuban (which looked to be more of its competitor, rather than Intel), so if you are already invested in an AMD setup and are looking for an upgrade (and to sell off your old CPU) then it wouldn't be a bad choice.

Would love to see more benchmarks from the gaming side, there are a lot more that can be done. Somebody send EarthDog an HD6970 for him to 'review' so that you can update those graphs with those numbers. It is hard to compare with the graphs when we don't have numbers against what it is supposed to compete against (according to AMD)

Would have also liked to see some benchmarks of the 2600k @ 4.5-4.7ghz to compare against the BD overclocked chip.
Devil_Dog's Avatar
meh....

Nice review though.
Djak777's Avatar
So as of now is there any updates planned for release that could increase performance?
Sammich's Avatar
what is newegg waiting on ._.
Dolk's Avatar
I wouldn't be down about what you see. In a lot of ways this is an improvement, in some ways it is not. An explanation of the Architecture itself will come shortly.
SupaMonkey's Avatar
Nice review Hookie,

What about Bulldozer vs i5/i7 performance in crossfire?
terran2k's Avatar
maybe they need to release a bulldozer II? Im almost angry with my disappointment in AMD.
I.M.O.G.'s Avatar
@Djak777: None announced. AMD hasn't said anything about microcode updates planned for performance increases.
wickedout's Avatar
The i7 2700K! Lol! Newegg hasn't put up any BD CPU's yet. Wonder why?

Nice review as always. It's good value CPU, but I'll stick to my i5 2500K!
Sammich's Avatar
Kind of shocked they would go to the extent of mocking Intel, but not really disappointed.
Dolk's Avatar
Increased performance will come with the next generation of the architecture, and with new steppings. Piledriver is due to come out soon after Bulldozer. From what we know it will be next year for Piledriver, but it could be sooner.

Silicon production for these new chips is more advanced and more difficult.
Archer0915's Avatar
If you or hokie could do some DC testing that would be great. I mean what is the point of having 8 cores if they are not all used? Show me some BOINC 8 thread work, some CD conversions and some real encoding with these things. Those should be pointed out guys.

Stop fighting with the normal tests and pull some work on these things damnit.
Djak777's Avatar
good to know thank you both
Xterra's Avatar
Its a true quad core rather than an oct core, with that in mind, positive waves mannnn!
Archer0915's Avatar
Has 8 threads don't it? Have you examined what the module is by any chance?
EarthDog's Avatar
Id be happy to do that and return the card. The 2600k is on the GD65... Hell I'll even pound on it at 5.5Ghz and play.......

Unless that thing has another gear, I dont think the results would be any different.

As far as games go, I dont think anything would be different there either. It may not have the clock for clock ummph, but more clockspeed and cores are not hurting this thing it seems.

<- = for technical knowledge on this thing. Looking forward to hearing more about it with Dolk's architecture breakdown though!

so does a 2600k but nobody calls it an octo core.
Xterra's Avatar
Yeah but from the sounds of it, we won't know how everything works out under the hood until 1) we get a chip of our own or 2) Dolk publishes his architecture article.
Dolk's Avatar
You have to remember that this is not exactly an 8 core CPU; it's actually more like a 4 core CPU with 8 threads.
Black C5 Z06's Avatar
I've only had my Thuban about 6 months now, so no reason to change. REALLY excited to see what Ivy Bridge and Piledriver bring to the table, though.

Still, I'm pretty happy with the performance of BD, especially given the price point.
EarthDog's Avatar
Any chance we could/should add a poll to this thread to get the overall pulse of the results? Like 1-10/1-5 or something...
LancerVI's Avatar
Thanks for an excellent review. I'm pretty new around here and I really like the format. Very good read.

As for BD, I must say I'm a bit disappointed, though I expected the result. Looks like I'll stay with the blue team for now.
Archer0915's Avatar
Each module has 2 cores with an independent L1 correct?
I.M.O.G.'s Avatar
Poll added, please vote so everyone can see what the group thinks.

1 = total disappointment
10 = everything you hoped for
Janus67's Avatar
Side question, if the 8150 is supposed to compete against the 2500k, what is the 8120 there to compete against? It seems that it is within $5-$10 of the 2500k price making it more of a direct competitor.

To the poll, I voted it a 4. Definitely not super impressed, but they tried. Green star for effort?
Dolk's Avatar
First you have to define what a core is. A core contains Fetch, Decode, Execution, Memory, and Write Back sections. BD shares the Fetch and Decode, but not the Execution (apart from FP), Memory or Write Back. So in a sense the module is a single core with a very beefed-up middle part.
Archer0915's Avatar
for now. Man I was hoping for an excuse to spend money.
I.M.O.G.'s Avatar
@Archer: Each module has 2 "cores", and each one has a dedicated L1. Best portrayed on this slide:

Archer0915's Avatar
Logical; it has been a long time since I studied CPU arch. But can it run 8 threads? If so what is the performance when it is pushed in a role where it could use everything it has?
PolRoger's Avatar
The build up and launch of BD seemed so much more (long/frustrating?) compared to SB early this year. I'm glad that it is finally over! I thought early on that I would definitely be picking up a new combo at Micro Center on opening day, but then came the (early preview), and now that it is officially out I'm not feeling so compelled... still might though?
Archer0915's Avatar

Yup those are what I was looking at early on.
muddocktor's Avatar
Heh, haven't finished reading through the article yet; just got to the part where hokie is talking about the water cooler it's coming with. Now I know where all those Asetek-built H70's are going and why Corsair switched cooling OEMs for the H60-H80-H100 series of LCLC units. AMD probably bought all the production capacity of Asetek.
Fineas's Avatar
I love the reviews here. I may not post a lot, but before I make a purchase I check here first.
Dolk's Avatar
When it uses everything it has, it does a pretty good job at it, outside of FP calculations.
ratbuddy's Avatar
Not by a long shot. The review showed the 4.75ghz results for this chip, but what about SB OC'd results? It leaves this thing in the dust, and frankly, I'm shocked that this failure of a chip got the approved stamp.
EarthDog's Avatar
Thats like a Dos Equis commercial...

Stay overclocked my friends!

I.M.O.G.'s Avatar
It can run 8 "real" threads. It isn't like hyperthreading.

The answer to your second question depends on which benchmark is used to measure.

wprime and superpi are very FPU dependent, of which Bulldozer only has 4 FPUs which is why, I think, you see it trading blows more or less in these benches with Thuban which had 6 FPUs. So basically, it could be said that the architecture of Bulldozer is strong enough to overcome the lack of FPUs to perform comparably to the old architecture with 2/3rds the FPU count. There are a few different ways you could frame that, so don't pick that comment apart too much - but hopefully it makes sense.
Archer0915's Avatar
So I would be looking better than X6 and perhaps better than i7 in say WCG or other 8 thread DC work and encoding?
I.M.O.G.'s Avatar
Approved was judged by price/performance/positioning. Look at the stock results compared to comparably positioned products, and maybe you will disagree slightly less.

The initial poll result average is above 5, so that will help measure how on base or off base we are as people continue reading and voting. I understand how you feel though.
doz's Avatar
You cant give a disapproval on such a large ticket item and expect to be part of the next NDA

Seems OC got a good chip as well for overclocking. Many other sites are reporting issues in the range of 4.4 to 4.6ghz saying its tough to get stable even with a high end cooler.
Dolk's Avatar
Hey rat, look at the difference between it and the Thuban, that's why it got an approval stamp. SB is a generation ahead of AMD, but in a single jump AMD got pretty close in a lot of ways.
Janus67's Avatar
It may also be worth noting the power draw differences at load between BD and SB.

They covered it in the [h] review (re-hosted the image, so it doesn't give them ad rev) – huge power draw difference, definitely not winning in the performance/watt area.

For being effectively the same chip, I think the 8150 would get disapproval and the 8120 would get approval for costing $50 less.

I wonder if BD can pass SB in 3d benching while it is cold for globals?
Brolloks's Avatar
Excellent Review Jeremy, very thorough and also balanced, well done man.

I'm not terribly disappointed as I did not really expect BD to beat SB on benchmarks etc, I'm nicely impressed on the high clocks cold, that is way above what SB can get, so I will get one after all to play with. I think it has a fair price point especially with 8 full cores

Jeremy, what was temps under water with CPU at 4.75 Ghz under load?
someinterwebguy's Avatar
So, if I were to run Deep Rybka 4, it would show up as 8 cores to the program?
I.M.O.G.'s Avatar
We covered power draw also, though in less detail. In case you missed it, from the article:
SuperTuner12010's Avatar
Just got home... I see you guys finally got to talk. Seems the Bulldozers are doing better than the previous benches, but still not what some of us expected.

Anyone have some charts comparing to a 965 or 980? I'd like to see how the 4100 stacks up.
Brolloks's Avatar
What are load temps clock for clock with the same cooler, say 4.5 GHz? Now that would make a difference when choosing it for a gaming/24/7 rig over a 2500k/2600k.
doz's Avatar
Not trying to argue with you Dolk, but can you say its a generation behind? I understand that PII was the generation of Core2 and was meant to slug it out there, but it was AMD who delayed and put off this chip for how long? Yeah, technically its a gen behind but will AMD ever catch up?

You can say its a generation behind and its a success because its better than the previous but as long as it was in production it should be. Its already being knocked around by SB which is 10 months old already and about to be killed by SB-E and IB. I know you mentioned PD but come on, I have no faith in AMD getting it out anytime soon.

At least AMD is good about their graphics cards and hopefully will get the 7xxx out by December or January at the latest. Seems that's more their department these days anyways.

And Brolloks, it was reported in HF's report that they had issues keeping temps within reason at 4.6ghz (86c) with a H100 cooler to give you an idea. H100 should be about what a highend air cooler is so that isnt saying a whole lot. A good population of SB chips can get to 4.5ghz without many problems and temps are pretty reasonable.
ssjwizard's Avatar
As AMD pointed out to hokie, and as it's in the review, its performance today != future performance. The capacity for 256-bit instructions is kinda important. AMD, if nothing else, is good at driving tech forward even if they fail to utilize it all properly. Considering the aim of the APU designs they are going to be pushing next year and the idea of heterogeneous computing, the lower FPU count should be quite adequately supplemented by the GPU moving forward.

I'd rank my satisfaction with the new design as a ~7.5; I expected to see more solid right-now improvements. What we're getting is marginal right-now improvements with a potential for big later improvements. IMO adopters of BD should experience pretty solid longevity for their systems.

One thing I've yet to see are CF/SLI comparisons. Considering the number of PCI-E lanes that BD has, it should be capable of some gains in multi-GPU setups.
Janus67's Avatar
Gotcha, I must have skipped over it, although those were just ones at stock, it's helpful to see the scaling when OCed.

Then maybe it would be better to wait for the future when the performance is utilized?
Fineas's Avatar
lol now im going to change my avatar
I.M.O.G.'s Avatar
We could, and they would still send us gear. They didn't pay for my flight and hotel to get briefed on the architecture just to throw us out with tomorrow's trash – I sat in the same room as Kyle, Anand, Charlie, etc... Hell, I got in trouble while at HQ for installing superpi and wprime on their demo machines and they gave me a stern talking-to. A slap on the wrist, but not a "you guys are never welcome back". The demo machines were there, I had a USB stick with the apps... What's an overclocker supposed to think a demo machine is for?

But they know as well as we do, credibility is the most important thing. If we don't have credibility, our articles aren't worth anything. Hokie wouldn't ever lead you guys wrong...

So basically, keep in mind, the Overclockers Approved rating is best compared to the USDA stamp for beef. Overclockers Approved isn't saying you are getting a premium slice of Kobe Beef - its saying the product delivers on the performance one should expect given the price and position in the market. You have to read the review to gather the details on what leads to that conclusion - we don't spoonfeed in the conclusion or by ratings, so that people have to read and decide for themselves. That is really important.
Dolk's Avatar

It is and it isn't a generation behind. This is a new territory for AMD. AMD has always gone by and pushed for more and more cores to help increase performance, now they are utilizing a form of Threading to create a more powerful CPU. If you think about it, adding more cores can only go far in performance. So the next step is to exploit each core to its fullest. AMD has done this by adding in more of a middle area to increase performance.

I'll be talking about this more in depth later, but to put it shortly, AMD is catching up, and it will not be long for them to be up against Intel, hopefully.
ratbuddy's Avatar
So where I live, even if you only run the chip 5 hours a day, it's going to cost you an extra $36 a year over a better performing SB chip. Fold on it, and it's costing you (about 90 watts more, so 24 hours at 90 watts = 2.16kwh more per day, or about $.40 per day, $12 per month) $146 more per year to run in electricity costs alone. Run it in Denmark and that's more like $300 a year.

This is without a doubt the "AMD GTX 480." What a shame.

edit: Almost forgot, in those OC'd results, the 8150 at 4.6 uses 177 watts more than the 2600k at 4.8. 177 watts means in about 5.65 hours, you just sucked down a whole extra kilowatt hour compared to the SB machine. Math says if you fold on that machine, 1550.52 kilowatt hours per year. Where I live, that's about $310.10 extra per year on the ole' power bill, figure about double that in Denmark.

You'd have to be some kind of idjit, or get free electricity, to even consider folding on BD.
doz's Avatar
Thats the problem though. Its really not delivering on price and position in the market. SB chips are priced so well now that they are almost as cheap as BD. Couple that with the fact that an SB system does NOT cost more outside of the chip itself, BD's MUCH HIGHER power consumption, reports of higher temperatures (requiring better cooling; remember, with SB you can pretty much hit 4 to 4.2ghz with stock cooling and be OK), and a price point that really isnt that great.

If it came in at $180 to $200 for the 8150 I might say OK, decent chip at a good price but its not even there. When theres a much better product at the same price I just dont see why anyone would want to get this for a daily rig (I can understand benchers wanting to play, or other people wanting to have fun) when theres another option thats better at close to the same price (I mean really, whats $30 to spring for the 2600k when you are going to be spending $400+ on a mobo/cpu?). Also, you can spring for the 2500k which is almost always $180 now and itll run ALMOST as well in most things (and better in some) than BD's new chip?

Sorry, Im just really disappointed.
xsuperbgx's Avatar
Will it kill me if I eat it?

Good review guys!

I'm gonna get one and take it cold.... Has anyone seen one for sale yet?
Janus67's Avatar
I guess we will have to see how well the FX chips do when under different aftermarket cooling (not just a custom water loop) -- or I guess, more directly, how well the bundled watercooler that comes with the chip performs
Bijiont's Avatar
Unless I am blind, Tiger doesn't have them nor does EGG.
I.M.O.G.'s Avatar
I respect your position.

I would consider SB vs BD board prices also in the total package, without forgetting cooling, as the FX-8150 ships with upper-end air cooling (the watercooling unit).

If you look at the benchmarks there is a lot of trading blows going on outside of wprime and superpi. If you then consider the cost of CPU+Mobo+HSF... Where do things land? The power consumption is a knock against AMD it appears, so that can't be forgotten either. I dunno.

I'm not saying everyone is going to agree, but I'm saying given the total package, what you get seems to be on the mark to me.

EDIT: This is not a rush out and buy it argument from me, I'm just saying it seems alright. I voted 7 in the poll. Partially also because I found SB a terrible bore in regards to overclocking at best, and infuriating at worst. On SB I could run wprime1024 on all threads at 5.3GHz, but I couldn't even run CPUz at 5.4GHz. What the crap is that? I'm a bencher though also, so take that for what its worth.
zizux's Avatar
@DOLK,

Given that this is a significant change is x86 architecture do you think that vast improvements can be made via software (os, app, game, etc) optimization? This is my gut feeling but it will take probably a year before we would see those is my guess.
eviljab's Avatar
i was thinking of updating to a new mobo and BD, but i think i'll just buy a couple of high end SSD III 120gig and raid them. should be about the same price.
eviljab's Avatar
nice review team! i'm sure it took a lot of effort
ratbuddy's Avatar
Look at it from a business point of view, and check my math from a few posts back. Is the TCO even anywhere near the competition? I don't think so, it's way higher.
Seebs's Avatar
I'm torn on this one.

Will I buy BD for some subzero fun? You bet your cookies I will.

Was I expecting more out of it? I was. Especially on the subzero department; I hoped that what BD lacked in ooomph at "normal" speed it would make up for when at -196C, but it really doesn't, or at least not to the level that I had hoped for.

I gotta say though; for all the hype that AMD threw at this chip it surely feels like a very anti-climactic ending to the saga. Kind of like the ending of "The Sopranos"; all that build up just to do a "Fade to black"? Pfft.
Metallica's Avatar
They have been taken down .. weird.
Fineas's Avatar
ok new avatar and signature
Dolk's Avatar
I don't know, its hard to judge at this point.
doz's Avatar
Well Ill keep an eye on it and maybe itll surprise in the end, although I dont see it happening. If they immediately come w/ a price drop I could see it being OK.

That cooler it comes with is really nothing. If a H100 has issues at 4.6ghz keeping it cool, that thing wont do much. As far as board pricing, a little cheaper YES but not really much at all.

Maybe I was just expecting too much out of it. I was really hoping that the 8150 would put away the 2600k and give me a reason to buy it. I dont do serious (or cold) benchmarking so I really have no reason to even try it. Id rather save my money for SB-E upgrade at this point (or 7xxx graphics).
mjw21a's Avatar
Hmmmm, kind of where I expected it to be. It's one of the building blocks on the way to the next gen Trinity APU..... Now thats what I'm waiting for. The fully redeveloped vector based GPU should theoretically blow the socks off anything else once released. From this it's clear that they'll be relying heavily on using the GPU component for compute tasks.

Hopefully the x86 component will improve IPC, and the poor FPU performance will be offloaded onto the GPU anyway so I think I can wait a fair while longer before upgrading. My good old 955BE is still going strong.

EDIT: It will be interesting to see if Trinity can be done right. If so then it has the capacity to be an Intel killer in certain tasks. I expect it to remain a poor performer on integer performance though, and as stated before, the x86 FPU performance won't matter. I wouldn't be surprised if the FPU unit is completely removed in Trinity in favor of offloading everything to the GPU component.
eviljab's Avatar
wonder how sandy 2500K pairs up to BD clock for overclock @ 4.7 Ghz
wickedout's Avatar
They were there earlier. Maybe a AMD recall! But yes it's weird!
I.M.O.G.'s Avatar
TCO doesn't look good when you look at power consumption, and I had read your post. To be fair, it is a worst case scenario estimate, and I don't really know what your local electric rates are compared to mine... But you are right, there isn't any good news on the power consumption side, especially for those running distributed computing.

Most importantly, for those who don't run their processors at full load all the time though, I think there needs to be more extensive power consumption testing. For someone who encodes video occasionally, but the processor spends a lot of time in a near idle state during normal PC usage... I would really like to know what those power consumption numbers looked like. AMD talked a lot about the power gating and being able to turn off various unused parts of the CPU - there could be some bright news there for gamers, regular users, school computers, or typical office workloads.

Under load it's sucking down juice, but as a normal daily driver not running distributed computing, I'd like to read more about power consumption - we certainly didn't have the time or equipment to evaluate that in this review.
ratbuddy's Avatar
That kinda bummed me out about this review.. Two highly OCable chips and only one was benched OC'd.
I.M.O.G.'s Avatar
Compare the stock results, that is pretty much your answer. Except on BD, you would have a bit more flexibility with FSB to tweak your memory OC.

Time constraints, and clock-for-clock with nothing else changed is going to be virtually equivalent to the stock comparison. Hokie's sleeping, but he could comment directly on that, and probably will tomorrow when he has a chance. He only had the chip for a bit over a week, he's got a full-time job and family, had a mother-in-law in town, and was laid out sick for a couple of days. Not making excuses, but with the number of runs and tests, these things do take time - wish we had him as a full-time reviewer so he could drop his day job.
SupaMonkey's Avatar
Can we start the Ivy Bridge rumour thread now?
I.M.O.G.'s Avatar
I believe Archer0915 already started the Ivy Bridge thread, just head over to the Intel section. Doz has the Piledriver rumors thread cooking I think too. lol
ratbuddy's Avatar
Not knocking hokie at all, I'm sure he did the best he could in the time he had available. I just don't like seeing one side OC'd and the other not.

Also, if anyone cares, the Anandtech review went up a few minutes ago.
muddocktor's Avatar
That $100 unit looks to be the Corsair H70. A good cooler, but not as good as the Corsair H80, which is also $5-10 cheaper.

Hokie, good review man! I find the performance to be kind of disappointing, considering that AMD has been working on this for several years. It does OK, and I hope that in the future apps can be updated to better utilize the new arch of BD, but it's kind of meh to me as it sits now. And the power consumption issue is real and bad too. Will be like trying to cool off a Bloomfield, I think.
eviljab's Avatar
BD didn't make that much of a difference overclocked IMO, according to the graphs. And I think it was stated that way as well; however, does Sandy show big improvements @ 4.7 GHz? 4.7 appears to be a high point for the Sandy 2500K, whereas it appears as though BD can go higher but with no real improvement.
Archer0915's Avatar
It sickens me to be honest. Intel has had toooo much dominance for toooooo long. There needs to be an equalizer, and I thought this would be it.
eviljab's Avatar
there needs to be balance in the force eh
Archer0915's Avatar
Yes. I think that this will lead to further advancements but they need to come fast.
Sammich's Avatar
The prices fluctuate so much on every site, what gives .___.
eviljab's Avatar
According to the early poll figures, BD appears to be a disappointment. Only four people rank it 8 or above.
I.M.O.G.'s Avatar
11 people voted under 4, 11 people voted over 6... But 6 people voted "total disappointment" while no one thinks it's everything they hoped for. Poll results are looking like a big fat "meh" so far to me, slanted heavily towards the "fail" side of the spectrum.
SamSaveMax's Avatar
Thanks for the review. Saved some $ from buying a new platform.
As Archer0915 had mentioned, I would like to see how it performs in real world video/encoding scenarios with all 8 cores.
Sammich's Avatar
Meh... too invested at this point. As soon as it hits Newegg I am clicking next-day shipping.
Janus67's Avatar
http://www.anandtech.com/show/4955/t...x8150-tested/8

A good breakdown of gaming benchmarks for the chip. It looks like it hangs in there for the most part at GPU-bound settings (1920x1080/1200), but if you are playing a CPU-bound game, or your resolution is 1680x1050 or lower, it is a huge drop-off from SB.
I.M.O.G.'s Avatar
Do you have any suggestions for what you might consider more real world? PovRay and x264 are real encoding utilities, but we're always looking to expand the testing suite to include more widely accepted and real world tests over things people consider synthetic. Audience opinion on this sort of stuff is as important as anything.
dkizzy's Avatar
BD feels like a stop-gap attempt by AMD. The architectural improvements are certainly welcomed, and I do think these chips have some future scalability to them, but what good does that do AMD with the ever-coveted benchmarks? They should have emphasized to the tech community the relevance of this design and what will benefit the most from it. Sure, a brief tidbit is in the ad, but that's not enough. I'm interested in learning what benefits the new instructions in this lineup have that SB does not. This kind of feels like the Phenom all over again, minus the TLB bug. Piledriver will be the one that has the tweaks, the improved power consumption, etc. With SB-E around the corner, what is the point of going with BD? I guess budget is about the only reason, or 8 VMs. I don't think it's a bust, but it's just lackluster, certainly not a splash. But if AMD was just going for being relatively competitive, I guess they accomplished that. If PD production doesn't get delayed we should see it next year.

*Edit: I got to read more of the review and I can totally see how some older bench programs would not reflect the performance improvement of this architecture. Nevertheless, I'm waiting for PD unless some of you guys put up some great BD deals in the months to come.
Archer0915's Avatar
Hey why were there not more tests based on the real torque the BD should have delivered?

Was it a time limitation?

EDIT: Multitasking tasks are best for multi-core, where you really want to show off prowess. Something like I did in the AthII/PhII comparison, where you load the crap out of the CPU and then run timed tests to see how it handles extreme usage.

SETI tests, WCG tests, F@H tests, a multi-file zip while transferring files and running a virus scan and encoding in the background, to give a feel for the entire package. Open the damn thing up. Don't just play with it. Sure, you have to satisfy the kids who want toys, but some of us will work the crap out of a CPU and system. Some of us actually need and use more than 8 gigs of memory as well. We like to be catered to in reviews.
I.M.O.G.'s Avatar
Torque? I'm not sure what you mean.
Janus67's Avatar
I think a Photoshop render wouldn't be a bad idea (or 3ds Max or something similar).

I would like to see more games in the testing, but I understand that can be difficult to standardize as each reviewer would have to be given/purchase copies of each of the games in the suite.

Wouldn't hurt to have PCMark Vantage or 7 used, either.
Archer0915's Avatar
Torque was a poor word choice on my part. Intel has the high end for speed, but do they have the low-end grunt when the work is piled on?
I.M.O.G.'s Avatar
I think the challenge there is what sort of render? I don't think there is a standard photoshop render benchmark, at least not that I know of. PovRay, x264, 7zip... These are real world apps that include a benchmarking routine. We could develop our own standard render that is reproducible, but we'd need to spend some time figuring that out which hasn't happened yet - or we'd need a member to contribute a reliable and repeatable procedure.

Games aren't hard for us to get test keys for; they have handed those out to us like candy when asked, so we could actually include more of those, I expect. We typically go for representative samples here; adding more titles is something we've talked about doing, but games are typically some of the longer benchmarks. I think more specific game-title tests could make sense, and it's not impossible for us to work them in.
ratbuddy's Avatar
The only suggestion I would make is about the only "heavy CPU" task I do (besides playing MWLL, which is based on Crysis Wars, mostly single-threaded, and wouldn't be anywhere close): taking a FRAPS video recorded at 1080p and using Windows Live Movie Maker to compress it down with the standard 1080p output setting, as would be typical before uploading game footage to YouTube.
neo668's Avatar
Firstly, thank you Hokie for the excellent review. Been waiting months just for this.

Now, please go easy on a noob like me. Is it safe to say that the BD is worth getting because it is a completely UNLOCKED CPU? Or should I just consider the i7 2600? The costs for both systems appear to be about the same, give or take $50.
I.M.O.G.'s Avatar
Reproducible testing. We try not to do anything in a review that we can't repeat identically, or that any reader couldn't reproduce nearly identically.

We could run a synthetic bench that is designed to throw a bunch of different crap at the CPU, but doing any sort of real-world mixture of applications is too error-prone and hurts professional credibility. Throwing a bunch of different apps in, you introduce all sorts of variables like board, RAM, and disk performance - then if you can't do the same exact thing for several iterations on each piece of hardware in the comparison... a considerable fudge factor develops that we aren't comfortable with in an official review capacity.

We probably could do some community-type results with that, which we'd post as a forum reply or discussion, but it's not the sort of stuff we'd want to publish as part of an official review - we aim for as scientific a standard as possible given limited resources and realistic time constraints.
Janus67's Avatar
Fair enough. I think we would have to create our own filter/render package for Photoshop. I just know that a lot of the go-to sites (Ars, Anand, etc.) have a suite with Photoshop/3ds Max included, although I'm not an expert with those pieces of software, so someone would have to explain how to make a render file for it.

Interesting point on the games, I didn't know that it worked that way for those. Want me to do some video card reviews?



On a different note, from what I've read, Windows 7 isn't going to give very good performance for Bulldozer (the task/thread scheduler isn't designed to take advantage of BD), but apparently Windows 8 does/will do a better job when it is released at the end of next year, by which time we should have a whole new chip (Piledriver?) to worry about.
Istariol's Avatar
KYAAAAA... I just saw the socket for FX is AM3+!!! Does that mean it can go in an AM3+ mobo too? Can I use this on my mobo?
I.M.O.G.'s Avatar
I heard that rumor, but I don't know that there is any substance to it. The only scheduler issue is one that exists with the Linux kernel, and its scheduler options have little to nothing in common with Windows. I'm not sure there is any source worth its weight behind the Win7 scheduler performance concerns with BD... I have only heard the rumors though; maybe there's an actual source I haven't seen yet.

On an unrelated note, really interesting results in the polling so far. Thanks to everyone who has already voted.
Janus67's Avatar
If this is your board (according to your sig)
http://www.newegg.com/Product/Produc...82E16813130290

It doesn't look like it, as it needs to be the 990 chipset [correct me if I'm wrong, AMD gurus].


Edit: for IMOG:

http://www.anandtech.com/show/4955/t...8150-tested/11

Take it with a grain of salt, but there may be some hope yet, if you can get Jeremy to install the developer preview of Windows 8.
Archer0915's Avatar
But it is really the only way to tell how they can perform under load. Is it as precise as one benchmark at a time? Absolutely not, and it should not be taken as such. What it does show is where the CPU falls in comparison to the competition.

Take a one-on-one comparison of the 2600K and the 8150. Start both rigs out with a fully loaded disk of the same software, then start both crunching the same data. While that is happening, begin a virus scan in the background, and when all that is going, start a 40-100 gig zip. Now throw in an encoding job on a separate drive and time everything. If the times to final completion are within a few seconds (30), then they are equivalent, but if there is a clear difference, there will be no doubts.
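For what it's worth, the timing half of a test like that is trivial to script; the repeatability IMOG stresses above is the hard part. A minimal Python sketch, with placeholder commands standing in for the real zip/scan/encode jobs:

Code:

import subprocess
import time

# Placeholder workloads - substitute your own compression, scan, and encode
# jobs, and run the identical set on each test rig.
WORKLOADS = [
    ["7z", "a", "archive.7z", "D:\\testdata"],    # big multi-file zip
    ["ffmpeg", "-i", "in.avi", "-y", "out.mp4"],  # encode on a separate drive
]

start = time.perf_counter()
procs = [subprocess.Popen(cmd) for cmd in WORKLOADS]  # launch everything at once
for p in procs:
    p.wait()                                          # block until every job is done
print(f"All workloads completed in {time.perf_counter() - start:.1f} s")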
Istariol's Avatar
AMD FX-6100 Black Edition (3.3/3.9GHz, 95W, 6MB total dedicated L2 cache, 8MB L3 cache, 5200MHz HyperTransport™ bus, socket AM3+)... this is my next processor!! Heheh... would my mobo support this processor? It's also only 95W, rather than the 125W of the Phenom I use... If I can use this without changing mobos again, that's gonna be wowwwwsome and a cheaper upgrade for me!
Istariol's Avatar
Only 990? Hiks!! Damn... if this processor arrives in Indonesia any time soon, I will start saving money for an FX-supported mobo.
I.M.O.G.'s Avatar
If you don't have an AM3+ mobo, then Bulldozer CPUs will only be compatible with your board if the board maker provides a BIOS update.
ssjwizard's Avatar
I'd imagine that even current BD chips could benefit from this, and mind you there are supposed to be FM2 FX CPUs without a GPU, BUT I can see these GPU-assisted designs leading to a combo marketing scheme to push both sides of their business: AMD FX, faster when paired with Radeon HD xxxx.

I will probably be picking one of these up, unless I decide to sell my 990, since it's just sitting in a box until I buy a CPU (the Sempron 140 I got to bench does not count). Given that it comes with that water cooler, I consider this a ~$190 CPU, and for that it performs. No doubt we will be seeing lots of those coolers for sale on forums/eBay as the new super-budget WC setup.
Istariol's Avatar
Mine is only AM3... Well, looks like I have to wait a little longer to upgrade both mobo and processor to the next level, and also wait for the dollar to loosen its grip on our currency. For now, doing both upgrades would take my savings to zero and bleed me again... Well, I have to be happy with what I have now.
SteveLord's Avatar
I was hoping this would be in line with Sandy Bridge performance and AMD would get rid of the trend of releasing CPUs only as fast as Intel's previous models. I can't help but be disappointed when I and others remember their last FX line, which was kickass.

Anyway, have fun, whoever buys and plays with 'em!
Bijiont's Avatar
Now the wait for someone to actually stock the chips.

I am going to still pick one up myself.

1) Already sold my 1090T to a buddy of mine
2) New toys to play with are always fun, even if they aren't the "zomg" best out there.

All in all I knew it was too good to be true that AMD would be on top again with BD, but I really hoped.
SamSaveMax's Avatar
For benchmarking it's a bit time-consuming, but I was thinking along the lines of using applications like Handbrake, ConvertXtoDVD, Photodex or some video editing... etc.
Anyone would hope for more cores and more threads to do the heavy workloads in a short time period. My upgrade is most likely due to that, as I do quite a bit of rendering/encoding of slideshows in HD.
Ironsmack's Avatar

If you're going to buy now or in a couple of weeks - get the 2500/2600. You've got more bang for your $$$. Even if you didn't get the K series, it isn't as power-hungry as BD.
Badbonji's Avatar
I was hoping for more. Whilst it has half the FPU units of a true 8-core, Sandy Bridge also only has 4 FPU units, surely? And it still cannot beat the i7 2600K except in only a couple of the tests in the reviews I have read.

I may not have an AMD system, but I like to see competition against intel as it drives down their prices. I cannot really see that happening, because whilst it is cheaper in the short run, it is far less efficient per watt and will end up costing more in the long term from electricity.

From http://www.bit-tech.net/hardware/cpu...-8150-review/8: the overall score at 4.818GHz is lower than that of a Core i7 920 at 4.04GHz, loses to the 2500K at its stock 3.3GHz, and barely beats the 1100T at 4.2GHz.
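To put a rough number on that long-term electricity point, here is some back-of-envelope arithmetic in Python using the load figures from the review's power table; the rate and duty cycle are assumptions, so plug in your own:

Code:

WATT_GAP = 246 - 158        # FX-8150 vs. i7 2600K loaded, from the review's table
PRICE_PER_KWH = 0.12        # assumed electricity rate in $/kWh
HOURS_LOADED_PER_DAY = 8    # assumed duty cycle

extra_kwh_per_year = WATT_GAP / 1000 * HOURS_LOADED_PER_DAY * 365
print(f"Extra cost per year: ${extra_kwh_per_year * PRICE_PER_KWH:.2f}")
# ~$31/year at these assumptions; a 24/7 cruncher roughly triples that.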
mjw21a's Avatar
Hmmmmm, I think 2013 will be upgrade time for me... I'm keen on seeing how their x86 + vector-based graphics solution goes... Should even things out somewhat, though we need to see AMD improve their integer performance for this to pan out.
Cigarsmoker's Avatar
Disappointing considering all the hype. It barely manages to keep up with Intel's year-old CPU. It's an improvement over AMD's current CPUs, yes, but just barely in some tests. Was hoping for more. I'm sure by 2012-2014 it'll be much better, but where will Intel be? Doesn't really entice me to upgrade from my 955 BE systems just yet.

BTW - great review. Thanks.
Bijiont's Avatar
Just curious, does anyone have a good benchmark of an 8150 overclocked for 24/7 use vs. an 1100T overclocked for 24/7 use?

I see several of the benches show it vs. a stock 1100T, but I just wonder how much of a performance gap there really is when they are both overclocked to moderate values for their design, such as having the 1100T at 4.1GHz and the 8150 at 4.8GHz.

If the gains on the 1100T are much higher than the 8150's, I may go back on my earlier statement... I keep seeing more and more reviews say just how bad the 8150 is.
SamSaveMax's Avatar
I always enjoy the power consumption section of any CPU review I read... but this flipped me out of my seat. 586 watts under load at 4818MHz? Dang! No wonder it comes with an H2O cooler. Too bad they didn't mention the heat output.
Thanks Badbonji for the link.
Cigarsmoker's Avatar
I would think comparing stock to stock is the way to go, as comparing OC to OC is so chip-dependent. Not all 1100Ts are going to hit 4.1 GHz, and it may be dependent on other factors such as memory and HT settings. Same goes for any other chip. It's nice to know that it's a good OCer, but most unlocked chips from Intel or AMD are good OCers these days.

Fact is, when SB came out it BLEW Intel's old CPUs out of the water, even old Intel OCed. In some ways it's not unreasonable to expect new AMD to do the same to old AMD at stock. Otherwise, where's the incentive to upgrade to AMD?
neo668's Avatar
Thank you for your advice.

I must say I'm pretty disappointed with BD after reading all the reviews that have come out - Hokie, Tom's, Anand, etc. I'm in no rush to upgrade, so I'll wait for IB now.

Time to sell all my AMD stock.
Yensen's Avatar
I was shocked when I saw the power consumption figures. It's Pentium 4 all over again. Let's hope their next architecture is as much of an improvement as Core was for Intel.

My poor E6400 will have to keep fighting the good fight until Ivy Bridge is out.
Bijiont's Avatar
I agree; however, several benches show the 8150 overclocked against base-clock chips, which sure shows just how much you have to overclock the chip to beat, say, the 1090 or 1100 at base clocks, but not how it fares if both chips have a mild overclock.

Showing a bench with the 8150 overclocked and just barely beating an 1100T at stock does nothing but show just how bad the 8150 is. I think if the bench reflected both chips under even a mild overclock, the difference would become even more apparent. I guess what I am saying is, the more I read, the more I am tempted just to buy an 1100T, because the 8150 is just that bad.
manu2b's Avatar
Sell them before NYSE reopens!
manu2b's Avatar
Might go this way: a cheap Thuban to replace my Phenom...
Bijiont's Avatar
Ordered an 1100T; I will wait for the second generation, as the 8150 family just isn't going to cut it. Would have stayed with my 1090, but that was already sold.
capttripppp's Avatar
Great review Hokie
To say I am disappointed would be an understatement, but it is what it is...
someinterwebguy's Avatar
Did some thinking, including about getting an i7-980, and have decided, at least for now, to hold off on upgrading until after the holidays.

I haven't had an Intel based system since I had a Pentium II 233 back in early '98, but it looks like I might have one come the new year.

I'm going to wait for Ivy Bridge and see what happens with that before deciding though.

My current system is nice and stable as is (I only reboot to install Windows updates and shut down to clean out the dust filters on the bottom of the case) and nothing feels slow.

I'm just glad I didn't buy anything ahead of time, as much as I wanted to.

Overall I don't consider BD a total failure, but looking at the benches from all the different sites, I definitely wouldn't consider it a success either. Maybe I, and others, expected too much?
Brolloks's Avatar
After all the wait, they could at least have had some in stores on launch day. That's a screw-up IMO; only TD has the 8150 in stock.
manu2b's Avatar
I think a lot of people are bitter, having purchased BD mobos (I almost did in June... BD was "unofficially" planned for early July, remember?).
And now, finally out, it's not even in stock!
WTF are you doing, AMD?
At least they manufacture great GPUs...
EarthDog's Avatar
I think since we saw AMD hold the performance crown with S754 and S939 some years back, we kind of expect them to do it again...at some point.

I am a bit disappointed in the lack of availability of the CPU also... makes no sense to hype this thing only to have no availability on the major sites... WTH?
Rattle's Avatar
I couldn't be more bummed about these new chips if I tried. I was really wanting to go from my i5 setup to this and build one for my GF too. Not gonna happen though. These chips are an embarrassment, especially overclocked...
zitnik's Avatar
Welp, bye-bye BD, hello expensive Corsair cooler. Gonna overclock my 1090T sky high. I kind of want to move to Intel after this debacle, but that would depend on how much I could sell my Sabertooth/1090T for. If I could get $200 for the board and CPU, I'd move to Intel and pay the $300 for the 2600K (I'd use the $200 for the Sabertooth Intel board).
Brolloks's Avatar
At least it saves me the time of setting up the 990FX; it will go straight back to the Egg. Can't run it without a CPU, can I?
PolRoger's Avatar
Last week the Crosshair V Formula went "OOS" at Newegg. I decided to stop by my local Fry's to pick one up, as they showed some still in stock, but when I actually got to the store they also were "OOS", so I went home empty-handed. That now looks like it was a blessing in disguise.
hokiealumnus's Avatar
I'm just getting started this morning, but did need to clear one thing up - the water cooler is NOT included. FX comes with the standard AMD cooler. As stated in the review, the water cooler is expected to cost around the $100 mark and is not available in our market yet.
Janus67's Avatar
As long as the drivers work
(I personally have had only a few issues with ATI cards and/or their drivers, though; just speaking from others' experiences).


So ~$100 for a (probably) rebadged H70, when a better-performing H80 sells for $10 cheaper. Nice!

(Thanks for the clarification, Jeremy!)

I'm interested to see how the chip OCs on a stock cooler now.
manu2b's Avatar


Well, not an Intel fanboy, nor an AMD one, but some posts really make me laugh!
Look at the text in this one:
http://www.overclockers.com/forums/s...postcount=1842
manu2b's Avatar
Never happened to me... But I have been back in the computer world for only one year, lol!
I have a 6950 unlocked with a 20% OC and a 5830 with a 20% OC as well, and they run really fine.
Archer0915's Avatar
Let us not forget one thing: though this was anticlimactic, it was really only made that way by the people looking forward to this processor and pumping themselves up.

The BD is great for a new build and will be equivalent in most things that you can see. Until more time-consuming tests are done, we will not get the big (complete) picture.

Hokie, great job, and I have one question for you: how does it feel? The AthII x4 felt faster than the PhII x4 in normal usage and you could feel the snap, though it was just off the line.

Dolk: What is your opinion of the cache system? I have noticed some tests that are better with no L3 on the PhII CPU and was wondering if this may have similar issues due to the design. That is why I suggested bogging this down and seeing it work. It could be a multitasking monster.
Brolloks's Avatar
Still want to know how load temps compare with the 2600K at the same clock speeds. Given the power package, I'd say the FX must run damn hot.
manu2b's Avatar
Archer, you forgot that:

Quote:
Originally Posted by JF-AMD View Post

IPC will be higher
Single threaded performance will be higher

That is all we can say at this point.

And that:
http://www.youtube.com/watch?v=Qbp6H...&noredirect=1#!

So? How will we handle this much power?
madhatter256's Avatar
Someone put some of these CPUs under bigadv folding and see what kind of points they get!!!

Just read the first few posts :P
HankB's Avatar
Whereas we had been hoping for a home run, I think AMD has provided a base hit. I think there's enough there to keep them in business, particularly if they can provide a cost/performance benefit vs. Intel. (*)

I'm skeptical that the integrated GPU can do anything to prop up FPU performance either. Consider that with the state of the art (CUDA, OpenCL), specialized software is required to perform processing on the GPU. It seems like a pretty big jump to have the processor itself shift instructions over to the GPU. A more likely scenario is that an intelligent compiler could optimize for these units by identifying segments of code that could be executed by the GPU and producing binaries that would move data and code to the GPU and collect the result when finished. Even that seems like quite a stretch, since GPU programming is significantly different from CPU programming.

(*) It seems like AMD has priced the BD to compete on a cost/performance basis with SB. That works for buyers who will compare shelf price between PCs. However the power consumption can negate the initial price benefit due to the cost of electricity to drive the CPU. For server farms, that may also require additional energy for cooling and perhaps even additional cooling. I'm sitting here with a 4 core + GPU system dumping exhaust heat on my leg. On mornings like today (54 F/12 C) that's not a big issue. When the temperature gets near the point where I need to consider A/C, the heat put out by the system makes a pretty big difference in room temperature. In fact, during the summer I cut way back on crunching to moderate this effect. If I had to choose a system today, I would lean towards Intel which would make it my first Intel system since I ran Pentium Pros.

Edit: Another consideration is the compiler used in the benchmarks. Windows benchmarks are most likely compiled using Microsoft's or Intel's compiler. In particular, Intel's compiler has a strong reputation for maximizing performance on Intel chips (and perhaps hobbling AMD chips...). I see that AMD has also produced a compiler that no doubt optimizes code for their processors. I wonder how binaries compiled with the AMD compiler and executed on AMD systems compare with binaries compiled with the Intel compiler and run on AMD and Intel processors. The potential benefit for Windows users seems limited, but if you're running something like Gentoo Linux, where you compile all binaries to begin with, there is potential for significant benefit.
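To illustrate the point above about how specialized that software has to be: even squaring an array on the GPU means writing a separate kernel in OpenCL C and managing buffers and copies by hand. A minimal sketch, assuming the pyopencl package and a working OpenCL runtime:

Code:

import numpy as np
import pyopencl as cl

a = np.random.rand(4096).astype(np.float32)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The GPU-side code is its own tiny program in OpenCL C, not ordinary x86.
prg = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * a[i];
}
""").build()

prg.square(queue, a.shape, None, a_buf, out_buf)  # one work-item per element

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)  # copy the answer back to the host

None of that plumbing happens automatically, which is the crux of the skepticism: until compilers generate it themselves, a shared FPU can't simply "borrow" the GPU.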
madhatter256's Avatar
So, for the bigadv, that's about a 50-minute TPF... My i7 930 was getting a 55-minute TPF @ 3.5GHz...


I have yet to read the rest of the pages to see if you posted up stable overclocks and bigadv folding... so I hope it scales much better...
Neuromancer's Avatar
Nice write up again Hokie!

I am disappointed in the power consumption department; it is about twice what I expected. I am glad, though, that this Thuban I just picked up still has some life left in it; I will probably pass on this first-gen BD and wait for the refresh.

Not worried about the performance aspect so much; we all know Intel rocks at running old code. But AMD has to build for what people want, and that is, by and large, performance.

It just sucks they did so at the cost of perf per watt.


Okay, just found a more reputable source for power consumption (no offense, [H]):
http://www.anandtech.com/show/4955/t...x8150-tested/9

10W higher than Sandy B / 30W lower than previous AMD chips at idle.

230W at load, though, is significantly higher than even Thuban.
Dolk's Avatar
Holy blah, I went to bed with only 4 pages, now there are 9 in a single night? No way am I sifting through all this.

@Archer - The L3 cache is interesting. Each one of the blocks has to hold a copy of what the other blocks hold. This helps in that a module does not have to wait in line to access the L3 cache. The bad thing is that all the L3 blocks must be kept updated all the time. This can cause misses during execution rather than read delays. Depending on how big the penalties are, AMD chose between these two and probably saw that misses cause the smaller penalty.
Archer0915's Avatar
Spin and hype; companies and politicians do it all the time and PPL buy into it. PPL are peeved because they went out and spent money on a board for this CPU and expected a killer. Well, the CPU is fine and good for the price as well.

All I have to say is, I remember the days of the K6/K6-2/3/+, when magazine ads and benches showed the K6 beating the PII. Well, it never really happened. I knew the score, and I never had a PII because they were overpriced IMHO.

Today it is a little different; the price field is about level and it is all about performance.
capttripppp's Avatar
I put a post on FB to AMD about the lack of supply; so far, no response.
manu2b's Avatar
Do you really think the price is fine? Here in Europe, it is 249... the 2600K is 279 and the 2500K 195.

That's not nice! Who would buy a BD? If you need a heavy multi-thread CPU, it is because you make money out of your rig (I think, but might be wrong), and it will be a 2-3 year investment.
In that case you don't buy a BD, you go for a "real" 8-core CPU. IMHO...
mxthunder's Avatar
Tiger Direct has them for $260.

I will not be buying. Will wait a while and see if they come out with an 8170, etc., or wait for this "Piledriver".
MattNo5ss's Avatar
It looks like FX is on the fence between our Meh and Approved ratings. I highlighted what I think is related to Bulldozer.

I'm disappointed in the performance after all the hype, as are many other people. Performance is okay, but the power consumption to get that "okay" performance is huge... I will not be buying one... With the 2500K $50 cheaper for about the same performance, the 2600K only $50-65 more for better performance, and Intel's MUCH better performance per watt, it would be hard for me to suggest an FX CPU to someone, too.
Archer0915's Avatar
Yes, I think the performance is good (not great) and it does offer up competition, so the price is fair. Forget about us here, and read my sig.

We have not yet seen what they can do under extreme conditions either.

Don't forget PPL bought the P4 when the Athlon was crushing it. We are not the mass market, and PPL buy what they are told is good and what gets the pretty packaging.
Theocnoob's Avatar
They have a right to spin and hype, but I mean... come on.
It's one thing to say 'our new stuff is coming soon and it's going to be great'. But if you actually hire the "This summer!/Coming soon from the director of..." guy to do the voice over... one has flashbacks to being blown away by T2 and Jurassic Park and The Matrix.

We were promised this:
http://www.youtube.com/watch?v=-4SlhJZiCXQ

And we got this:
http://www.youtube.com/watch?v=iDe4v318f64

EarthDog's Avatar
I wonder if a certain someone, no longer a member here, just rolled over in his 'grave'...
Archer0915's Avatar
Sadly we will not be the final judge. PPL who think a benchmark is a stain on a bench will be the deciders.

We should know better, and if I had not had to pick this Intel system up for some work and testing I did, I might have also bought a board for BD.
capttripppp's Avatar
Ha, the Lego version was pretty good though.
manu2b's Avatar
Exactly, and PPL are told to buy Intel.
I have had quite a few laughs since this morning, but inside I am sorry.
Sorry, because why would OEM builders put AMD in their "mainstream" rigs when Intel does better? PhII will sooner rather than later be EOL. i3 and i5 are cheap, perfectly match most people's needs, and have Intel branding.
Pro users will go to Intel for real multi-core.
Big companies will go with Intel servers: have you seen the power consumption of these 8150s? I doubt that server versions of the chip will be much better, and when you get to TCO, that counts A LOT.
So yes, I am sorry to see no competition.
Somehow, I hope PPL will be fooled by AMD commercials and buy FX chips, but as you stated, PPL buy what they are told...

EDIT: If I were loaded, I would have bought one, for the fun of overclocking it and to support AMD. But I am not, and I go for the best bang for the buck.
MattNo5ss's Avatar
That one is over my head... I don't get the reference, but I'm glad you got a laugh...lol.
Neuromancer's Avatar
In all honesty, it is AMD's flagship product: it clocks VERY well, has a strong IMC, and outperforms the Thubans.

If it has Phenom/Thuban desktop snap, that's a thumbs-up as well.



It is not better than SandyB, uses too much power, and the price is a little high compared to the competition (not much, though...).

Also, remember people are saying that there is no price difference on boards, but there still is. Granted, 990FX is carrying a bit of a price premium at the minute, which brings them closer (possibly SLI licensing?), but generally speaking, with the same feature sets, AMD boards are still cheaper.

Newegg listing excluding 890FX and P67 (because if we are excluding last-gen AMD, we exclude last-gen Intel):

990FX $139-239
Z68 $80-360

990FX does not have the ultra-low-end boards available ATM, but Z68 wins with more than a dozen boards that cost more than the most expensive AMD. Hell, even an ASRock board costs more than the top-of-the-line AMD.
Archer0915's Avatar
Man, it is a state-of-the-art CPU and has 8 cores. Intel only has four cores in the same price range. The 8150 automatically clocks up to higher speeds than the competition as well.

Do you think they will let you install PCMark on their floor-model PC? No! More cores + more speed equals a phat system in the eyes of the consumer. And now they can Facebook and check email at light speed. Oh, they can also brag to their friends because they have 8 cores.
Brolloks's Avatar
Here are 45 P67 boards, all under $200, and all will bring an SB close to 5 GHz on air if it is a good chip. SB boards are very affordable and generally cheaper than 990FX boards:
http://www.newegg.com/Product/Produc...D&Pagesize=100
Neuromancer's Avatar
They have 8-core laptops now?
hokiealumnus's Avatar
Yep, IMOG summed it up nicely. It sucks, but that's how it is. We had this chip for 12 days before the review had to be published. Inside that time all of that happened, plus this isn't my actual job, which of course takes up most of every day. If this were a full-time gig, these could have more content, but we do the best with the time we have. Don't forget that in addition to benchmarking, we have to graph the results (which is more time-consuming than you might think!) and write these things - this one was ~4,500 words. Hopefully it still made for a well-rounded review.

That said, I also think stock comparisons still have plenty of value. Stock clocks are actually pretty close between the chips, with all of them between 3.3 and 3.6 GHz. IMHO it's fair to say that dead-on clock-for-clock comparisons aren't really necessary, because the variation in closing those 300 MHz wouldn't be all that great.

As far as benchmarks go, I tried to come up with a comprehensive test suite, covering quantitative results as best I could in three (and a half) categories: The Benchmarks, The Real World and The Games, with the half being AIDA64. Because the graphs are condensed it might not look as though there are very many benchmarks, but here are the numbers:

Benchmarks: 9
SuperPi 1M & 32M, WPrime 32M & 1024M, 3DMark 06, Vantage & 11, HWBot Heaven DX9 & DX11
Real World: 5
Cinebench R10 & R11.5, 7zip, PoV Ray, x264
Games: 4
Stalker, AvP, HAWX 2, Dirt 2
AIDA: 1, but really 13. They're fast, but all have to be run three times, as do all 2D benchmarks.

Folding@Home results were also obtained and posted in the first comment post, in case anyone missed it. I didn't take the time to editorialize when that was posted (I sleep too!), but my $.02 on DC performance with these: no one is going to touch them with a ten-foot pole unless they have free electricity.

We're always open to adding more and/or changing around benchmarks. They have to be benchmarks, though. Archer has an interesting take on testing with lots of stuff going on, but that's not a repeatable, quantitative, single test. You could never get that process precisely repeatable down to the second, so any value would be lost except for a subjective "this feels faster when multitasking". We need quantitative results.

Re: PCMark 7, I did run it and will post a screenshot when I'm able. The result was not included because of all the storage tests - I had neither the time nor the motivation to format the exact SSD used in this review, reinstall Windows on the Intel system, and see how it performed there. PCMark is great and all, but it loses a lot of value because of practical considerations. That said, I'm happy to post up the result for what it's worth; give me a little bit of time to pull it and post it.

There are always those that will disagree with the logo we put on. Ratbuddy & doz (and I'm sure others), I completely see your points and your views are understandable. Looking at our ratings explanations, this is what "Approved" means:

I would recommend this chip with a clear conscience for the reasons I went through in the conclusion to the article. Approved is very broad. A product pretty much has to screw the pooch not to get it, which is why it is always imperative that people read the reviews. Everything is in there: good, bad and ugly. I even implored people to read the conclusion through rather than just looking at the logo. We can only open the door; the readers must walk through it. Only through reading will true enlightenment be obtained. Like that? A little Matrix with a little Gandhi-esque speak.

Thanks to all for the kind words and all of the feedback too, much appreciated. We wouldn't be here writing these things if it weren't for you reading them!
Janus67's Avatar
Fixed that for you.

I guess the question comes down to: how well does a lower-end Z68 board overclock in comparison to a similarly priced/marketed lower-end 990FX board? I have to say that one nice part about SB is the ease of overclocking; to do so you have to change 2 things: vcore and multiplier [until you get into PLL-override territory]. Even that is easy to explain/walk someone through, instead of dealing with random other ratios, bclk, and multipliers (I don't have a problem with it, but joe-newbie would get confused, especially in comparison to SB).



Edit: ^ @ Hokie, thanks for going into the explanation. I think it is weird to have a binary rating system (Meh vs. Approved; I remember talking with Matt about it on the way back from the Philly benching party). As you said, it has to be a total screw-up to not get an "Approved" rating, meaning it would probably have to not work, or be ultimately the slowest chip of this generation and downclock instead of overclock.
EarthDog's Avatar
I spent my thanks, so QFT.

The value really isn't there at that price point either... with motherboards from the two camps.
manu2b's Avatar
Time will tell, and I hope you're right.
If so, Intel prices will drop and SB-E/IB will be launched on time. Plus, it will bring money to AMD and allow them to invest more in R&D.

David and Goliath myth again.
capttripppp's Avatar
Good play Sir, good play
Archer0915's Avatar
No, I actually know PPL that will spend over $1G on a desktop to do that.

It is about feeling good, man. Why do PPL drive a car with $3000+ in rims and tires? They do not need it, but it makes them feel good.
MattNo5ss's Avatar
Yeah, FX performance is good when only considering AMD products; it's an upgrade, but not a huge upgrade. The power consumption would still come into play too, though. Board prices really aren't that much different; good-OCing, CFX/SLI-capable SB boards can be had for ~$180, which isn't bad at all.

I still think FX is on the fence leaning toward Meh based on the rating definitions and what I highlighted in my previous post.
someinterwebguy's Avatar
This is why I ALMOST went and bought an i7-980 + mobo. Just to look and see 12 threads under Windows Task Manager...
wingless's Avatar
It reaches 2500K folding performance at a higher price point. This isn't too bad, as it overclocks decently. Power draw under load is crap. We can likely blame GlobalFoundries for that. All in all it isn't a total loss, but it sure isn't good.

Let's hope server performance, which it was made for, doesn't fail miserably.
manu2b's Avatar
That's what I am afraid of: servers need low power consumption.
Lord_of_Decay's Avatar
Not the performance I was hoping for. Time to continue working on an SB build.
EarthDog's Avatar
Can you explain how a smaller process, which usually uses less power, can be blamed on the fab plant?
MattNo5ss's Avatar
FX8150 for $280 at NewEgg

No way I could suggest one to someone when the 2600K is only $20-35 more...
Archer0915's Avatar
Could just be a leaky chip revision; that would explain all the heat and the ability to take the volts.
EarthDog's Avatar
OK, educate me. Isn't a leaky chip part of the architecture and not the silicon/fab?
hokiealumnus's Avatar
Completely agree; they just priced it out of competition. At $245, I'm ok with them. At $280, definitely not.
SuperTuner12010's Avatar
A quick question for you guys: what about gamers? How does the FX compare to our Thubans and Denebs?

A lot of people want to know how the FX compares to their current AMD setups. When can we see benches comparing it to the 9xx and 1100T?
Dolk's Avatar
FYI, on the server side, the Bulldozer CPUs will have more advanced power gating on the cores. Not sure how much better it will be, but it should result in better power savings. Also, the BIOS may not fully support the CPU's power gating just yet. We could see improvement in the future.
Dolk's Avatar
@SuperTuner12010

All the tests were run on a 990FX system, so you can see the comparison of the 1100T and the FX8150. Pretty much, if you have a Deneb you can go for the FX8150; for Thuban users, you probably will not see a difference. But I wonder how it would look with games like Tanks or Total War.
Archer0915's Avatar
Well, they could be designed that way, yes. If it is a crappy fab setup for the rev, then it is a facility issue, not a design issue. Think TWKR.
Eldonko's Avatar
Nice job, glad to see some sub-zero in there. Talk about an epic fail for AMD though; this is almost laughable. Not sure why anyone would buy BD over a 2500K or 2600K now, let alone when IB and socket 2011 come out...
SuperTuner12010's Avatar

Reason I ask is that in most current games a 970 will beat an 1100T, except for those better optimized for multi-core. But then again, I believe the 2500K kills Denebs and Thubans in all games. And if the 8150 can match the 2500K in games, why not?
EarthDog's Avatar
Ahhh... OK... so assuming Intel used this fab plant, their chips should yield the same results?
bmwbaxter's Avatar
Part of it is they might not know better. Also, for some like me, it wasn't buying BD over SB, but getting bored of SB and getting BD too.
SuperTuner12010's Avatar
Well, I already have an AM3+-capable board. Spend $280 on an 8150, or $500+ for a new 2600K or IB and a new board, when I wouldn't see any increase in performance in any environment I use? That's what it's going to come down to. Not everyone wants the top of the line when what they have is already more than enough.
Archer0915's Avatar
Not necessarily. Every arch is different (you already know that), and depending on the actual design, process, materials and QC, they might be able to produce perfect Intel chips.

I am going to go out on a limb here and say it is because of the module design. I am probably wrong, but sometimes I am right.
EarthDog's Avatar
My gut tells me that it's the design more so than the fab. TWKR proves that to me, since the regular chips weren't as leaky.
Janus67's Avatar
From what I have read, it looks like in very GPU-limited situations it performs right on par with the others. In CPU-limited situations it doesn't do that great.

http://www.anandtech.com/show/4955/t...x8150-tested/8
Dolk's Avatar
Wait, Archer, ED, what are you talking about? It's hard keeping up with everything here.
Eldonko's Avatar
OK, fine, if you have an AMD board already I can see buying BD, but if you are building a system from scratch, why would anyone build a BD system? Other than BD being new and interesting, or having a preference toward AMD, I can see no reason.

I was really hoping to see BD come out with performance at least ON PAR with a 2600K. For the benches most interesting to me, it is not even in the same ballpark.
hooflung's Avatar
That is simply not true. Here is the scenario: you want a gaming rig that can also be a workstation where you virtualize all your development environments. If you want VT-d, then there are only a few good choices. The 2500 and 2600 (non-K series) or a 6-core i7 are the only good Intel options. The Phenom II x6 is the only other good option.

So it is pointless, in this scenario, to buy a 2600K or even a 2500K. It's all about performance per $. AMD clearly wins. You get gaming performance that is very similar to a current x6, sometimes even as good as a 2500K, and great thread performance for virtualization duties.

AMD has a clear win. You can dual-boot a Win7/Xen Server machine and have a beast. And if you don't care about VT-d, you still get 8 cores for VirtualBox loving.
Archer0915's Avatar
Well, I am thinking of all the issues they have had with this CPU and the 32nm fab that it uses: http://www.anandtech.com/show/4894/a...obal-foundries

The actual design may not call for such high voltage; they may simply have raised the bar to get these out to market. I think the TWKR processors were simply rejects that would run at higher voltage. I think the BD is also one step away from being a reject.
Archer0915's Avatar
I have some work to do. I will just hang out.
Mario1's Avatar
Looks promising, as far as the price/performance ratio goes.
Also looks like Intel won't be releasing Ivy soon, since there's obviously no competition coming from AMD.
Let's hope Piledriver will make an AMD comeback and kick some ***!
Great review, hokie!
EarthDog's Avatar
The extreme power consumption under load... WHY? Design or process issues?
Dolk's Avatar
Well, the cores now have DOUBLE the execution units, L1 and memory... That tends to eat a lot of power.

Intel only beefs up their front end while leaving the rest of the core the same.

Yes, this uses more power and resources, but as you can see, the multitasking aspects of this CPU are paying off.

As for the fab itself, there are so many different parts of the silicon design process that it could really be anything if it's leaking power. I'm not going to guess on that end. I would need one of my professors to help me out.
EarthDog's Avatar
So as it stands, it's in the design and not the fab... until we can get more information.
Dolk's Avatar
Indeed. Like I have said several times, I am writing up a paper on the architecture itself, to hopefully describe the complications and the wonders of this architecture.

The architecture itself is new, different, but old all the same. AMD tried something out for the first time and it paid off pretty well. Most of you do not see it, but if you look past the results and really look at what AMD has done, you may be able to see what I see.
Janus67's Avatar
?

Oh, we already have a smiley for that



I can't wait to read through your paper though; are you doing it for a class as well?
EarthDog's Avatar
I'm hoping your article will show the technical side and be elementary enough to help those without your technical background "see what you see"...

As it stands performance-wise, it's just not for me as a bencher. When the price gouging stops and it comes back to MSRP, then it's worth taking a look at for a daily driver, if I wanted to save some money.

I'm just not the type of person to marvel at the technology that makes a widget. I'm the type of person that enjoys using that widget... like most users are. They don't care about the marvel of the guts of this chip; they just care how it performs.
Archer0915's Avatar
I think the answer will come with revisions. If there is no drop in power consumption but an increase in clocks (average air OC), then it is the design; but if it burns considerably less power with the same OC, then it was the fab.
EarthDog's Avatar
OMG... polls updating threads in USER CP...please turn off. This thread is just constantly lit up!!!
redrumy3's Avatar
lol they can see who voted omg run
Nathan0490's Avatar
Do we have any idea if the 8120 will clock within range of the 8150 (4.6-5GHz)?

If so, I can't justify the extra $60 if I can just OC it nearly the same. They are both unlocked and both have the same cache.
Janus67's Avatar
Looks like we have 3 people that love this chip so far
hanleychan, Mario1, redrumy3

For whatever reason
redrumy3's Avatar
They overclock like a beast; who wouldn't love them?
Archer0915's Avatar
Well, I would love it if I needed an upgrade and had a board for it. Hell, I was totally disappointed, but I would still have one just to check out the brute-force power.
EarthDog's Avatar
You already have the same brute-force power with your 2500K (I kid, I kid!).

I gave this thing a 7... only because of the MSRP pricing.
Sammich's Avatar
I went ahead and ordered mine with next-day from Newegg. I come out of this with a learning experience; AMD was formerly my "people's champion," offering gaming power @ an affordable price. If I had seen this outcome months ago I would have just gone with a 2500K... who wouldn't?

But it's my fault for holding on to that Phenom II 555 for so long; several people told me if you wait for the best you'll be waiting forever. Oh, and that marketing campaign from AMD, poking fun at Intel? Why... why would you do this? Judging by the graphs, AMD expects to be on par with the 2600K by 2014? Le sigh, lesson learned. At least I can run a decent livestream now; maybe I can get some AdSense going from that to pay off the processor.

Oh, and I gave it a 6. To be honest it's a strong 7, but at this point AMD has no business taunting Intel. It's like that kid at the red light in his Honda Civic revving his engine with eyes locked on a Ferrari driver.
terran2k's Avatar
Okay, so what needs to be done for this chip to be all it can be?
Archer0915's Avatar
Dude, I just think there is more to these things than has been revealed. I might buy one just to push it to its limits if nobody else will. I am just one of those PPL who can't believe this CPU has no real redeeming qualities.

I refuse to accept that this is just an incomplete processor (hint: Win 95) waiting for a patch: Piledriver.
hokiealumnus's Avatar
Clock it as high as it will clock and remain stable. The good news is it's a great clocker.
Dolk's Avatar
@Terran, that is a complicated question. Something I hope to figure out here soon.
redrumy3's Avatar
I wish I had the money to grab a setup for the 8150 and clock it as high as I could. 8 cores @ 5GHz+... sexy.
terran2k's Avatar
Yeah, I understand AMD said it was meant to work at high speed, so comparing clock-for-clock with Intel probably isn't the best. What about the next revision; does it need a stronger FPU or something?

Sheesh, talking about the next rev when it just got released today.
Nathan0490's Avatar
Consumers only get air cooling with the BD; the WCing that OCF received is only a prototype. Japan will be the first market to receive the WCing with their BDs.
bmwbaxter's Avatar
Wow, I bought it and only gave it a 4. I am a sucker for moar hardware.

You must have been feeling generous.
someinterwebguy's Avatar
I'm still just going to sit on the fence until January or so. If I was still rocking my quad core (which I just gave away to a family member), I would jump at the 8150 in a second, but with my current setup, not so much. I gave it a 7, as it's not horrible, just not much of an upgrade for me.
EarthDog's Avatar
With proper pricing, it has its place. As it stands priced now at Newegg: 1.
Nathan0490's Avatar
"AMD FX Processors Go Head To Head | Competitive Performance For AMD's New Eight Core"
http://www.youtube.com/watch?v=8rDwX...layer_embedded
Mario1's Avatar
I voted 10, because it overclocks like a champ!
I'm currently writing in OCF, right?
bmwbaxter's Avatar
+1. Newegg priced it out of even really being a consideration.
Janus67's Avatar
So does a socket 775 Cedar Mill CPU, but I wouldn't give it a 10 if it doesn't have the performance to be competitive in benchmarks against even a stock competitor.
Dolk's Avatar
Probably the best way to describe what this chip can actually do, and where the future is going.
MattNo5ss's Avatar
This is also "The Performance Computing Community"
Archer0915's Avatar
I hate to do this, but :quicksync:

Now put that up, because it is built into the CPU. I don't want the crap marketing stuff; I want to see some real work, damnit, and some real comparisons.

I am not saying we do not have good info, but some of us (as I said before) work with our systems and do not just bench them.
Janus67's Avatar
Then I think you need to find an easily reproducible, quantitative way to get a number/score in order to get what you are looking for. You can't have a subjective 'does it feel snappy' view, as any number of things can contribute to that. And it has to be pretty much automated, it can't involve manually opening 6 things and having each of them run, as it has to be able to be done over and over with different chips, different installs, and different hardware to get rid of variables.
Archer0915's Avatar
My post was specific, and it pertained to that link, which did not use all of the i5 processor. It contains a GPU, you know, and that GPU is wicked bad.

I do completely understand what you are saying and I agree, but this post had nothing to do with that, and your complaint is actually a little off. It can be done.

I can run a canned benchmark and get different results every time. I can run tests while the CPU is heavily loaded as well.
hokiealumnus's Avatar
Heh, did you guys see the pretty significant error in there? They said it was out-performing a 980X in Cinebench 11.5. The CPU actually run and scored was a 2500K, which they do state in the configs at the end; someone screwed up. Wonder if they'll fix that...
Dolk's Avatar
Didn't catch that part. Going up against the 980x is a bit difficult. I'm still very impressed by that CPU.
hokiealumnus's Avatar
As promised - PCMark 7 at stock: 4188. Warning, it's a big screenshot.
Mario1's Avatar
Overclocking an Intel DX series CPU from 29 to 33MHz still gains you performance, so I'm technically on point.
Mario1's Avatar
Sorry to disappoint you, but we're not twin brothers that think identically.
I still worship my Celeron 330j.
xsuperbgx's Avatar
Could you run a WPrime with one thread per module and then 4 threads on 2 modules? I am curious as to what the results would be.
hokiealumnus's Avatar
Already did that, experimenting earlier on Dolk's suggestion. Windows 7 isn't smart enough to assign the threads to the right modules, and you lose time. Several seconds in WP32.
Dolk's Avatar
Superbg, we did that as one of the first rounds of tests. It actually came back with some strange results. I can post those when I get back to my main computer. Or hokie can post them.
hokiealumnus's Avatar
Digging... found.

Stock time with 4 threads and affinity set to 1,3,5,7 gets 15.351 sec for WP32M. Changing the 2nd WP process that comes up after you run it for the first time to that affinity gets 14.828 sec. Changing them both to affinity 0,2,4,6 does not change anything. Setting affinity to all cores and running with 8 threads gets 9.083 sec.
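For anyone who wants to repeat that experiment, the affinity juggling can be scripted rather than set by hand in Task Manager. A rough Python sketch assuming the psutil package (the benchmark path is a placeholder); on the FX-8150, logical CPUs 1,3,5,7 land one thread on each of the four modules, while 0,2,4,6 is simply the other core of each pair:

Code:

import subprocess
import psutil

ONE_PER_MODULE = [1, 3, 5, 7]  # one logical CPU from each Bulldozer module

proc = subprocess.Popen(["wprime.exe"])  # placeholder benchmark executable
psutil.Process(proc.pid).cpu_affinity(ONE_PER_MODULE)  # pin one core per module
proc.wait()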
SteveLord's Avatar
Just wanted to share this for a chuckle. Can you spot what is wrong with this Newegg Bulldozer combo deal?

http://www.newegg.com/Product/ComboD...t=Combo.739582
Theocnoob's Avatar
Video link removed - sorry, lots of language there that we don't allow here. -hokie

Hitler finds out about the official FX 8150 benchmarks.
Archer0915's Avatar
ASUS Rampage III Black Edition LGA 1366 Intel X58 SATA 6Gb/s USB 3.0 Extended ATX Intel Motherboard
bundled with:

1x AMD FX-8120 Zambezi 3.1GHz Socket AM3+ 125W Eight-Core ...

I need one of those.
Sammich's Avatar
oh crap BD is LGA1366 now? uhhhhhhhhhhhhhh


EDIT: http://www.newegg.com/Product/Produc...tent-_-text-_-

Oh wow.. Newegg is out of stock; mine has shipped though :O
SteveLord's Avatar
Perhaps it runs better in a 1366 board. nyuk nyuk nyuk
MattNo5ss's Avatar
Read down a little ways...
I guess this combo is for people with two systems, or two friends trying to save money.
xsuperbgx's Avatar
Interesting. Thanks!
Archer0915's Avatar
You mean I can't force that CPU in the socket, slap a HSF on it, and go? BUMMER
Fineas's Avatar
Wow, I thought it was going to be in the $240 range. Not sure it's enough of a savings over the 2600K for me.
Sammich's Avatar
Twas $302 after shipping
Metallica's Avatar
I'm not sure what to think. For the price they are at currently, they should offer better performance. I'm not horribly let down, but I was hoping it would be within ~10% of the 2600K and cost $250. It seems it's more like ~20-30% behind, and it costs $279. Meh.

I'm thinking the 6100 right now is a better deal. $324 for the 6100 and UD3 combo on the Egg.

Even though the 2500K can be had at $180ish, the motherboards are quite expensive if you want a higher-end one. I like how the AMD counterparts are $220ish for top of the line, while it's $300ish on Intel's side.

With that said, I think if you need high-end performance, Intel is still the way to go. With Ivy Bridge coming in 4-5 months, and it being cross-compatible with SB, grabbing a 2600K and replacing it with IB later seems like the best deal.

The 8150 is just way too damn expensive for the performance it offers. I'm interested to see some benchies on the 6100, as it's drastically cheaper and seems to be the same CPU with one less module. I'm leaning toward the 6100 as your best bet for a mid-high range gaming rig, while Intel still dominates for users who need A LOT of CPU power.

My .02

EDIT: I'd like to see some benchies comparing the 6100 and the 2500K, actually.
MattNo5ss's Avatar
Anyone notice the "bulldozer fail" tag someone put on this thread?

I didn't say you can't... you can do whatever you want and have fun doing it

While you're at it, get one of these:
GoD_tattoo's Avatar
Good thing about SB is there really is NO NEED for the top-of-the-line boards. They mostly all overclock the same, since it is multiplier-based. There isn't a real need to buy those $300 boards when the most common ones sell for around $180-200.

Also, after looking at the reviews, I doubt a BD would be your best bet for anything related to gaming, unless I missed something. It barely edged out the 2500K in a few tests. At $179 I'd rather have the CPU power and gaming ability. But that is just my opinion.
Ronbert's Avatar
Looks like this chip isn't the dragon slayer we all hoped for. It's a nice chip given its price/performance ratio, but I feel a little less guilty about buying my 1090T now.
Archer0915's Avatar
Was this with an OC, and what video card? That score is a tad low.
I.M.O.G.'s Avatar
"PCMark at stock" typically means non-OC. Have you been drinking again Archer?
Archer0915's Avatar


No, just multitasking. And evidently not doing well at it
MattNo5ss's Avatar
The GPU is an HD 6970 based on the shader count, and he used an HD 6970 in all other tests.
hokiealumnus's Avatar
MattNo5ss is correct, video was a 6970.
Archer0915's Avatar
Cool. Well, I am happy with my little score for now. I just hope AMD follows suit with something similar to QuickSync on Piledriver. They have the technology, and I just do not understand why it is not the case with the A series.
Badbonji's Avatar
Talk about handpicked (not that I am surprised, it being AMD's own video); first it goes against the 980X in a GAME test at a GPU-bound resolution, then it manages to beat a 2500K in a multithreaded benchmark (it wins, sure, but then it costs more), and then it shows a couple of benchmarks where it can keep up with a 2600K (from other reviews I have read, the majority of the time it loses by quite a bit).

Not even to mention the fact that the CPU uses much more power under load and will end up costing more in the long run. Even though it cannot park modules in Windows 7, it doesn't seem to save much power in Windows 8 from core parking: http://www.tomshardware.com/reviews/...x,3043-23.html

Except for people who prefer AMD or want something new and exciting to bench, I cannot see this chip selling too well.
Janus67's Avatar
Is that your Intel chip @ 4.9GHz, or is that showing the performance increase of a BD @ 4.9GHz from stock?
manu2b's Avatar
Lol, that's my rig vs. a stock 8150

EDIT: trying to find some 8150 @ 4.9 results, back in a minute
EDIT 2: that's the only one I found. Going to put up a screenshot of a stock 2600K
SuperTuner12010's Avatar
That's what I was trying to figure out... Wasn't sure what I was looking at.
Archer0915's Avatar
Guys, I feel flames coming. We are all a little astonished here, but some PPL are locked into this. I have not been on my best behavior in this thread, but let's not pick. Let us maintain a higher level of maturity here; we are not grade schoolers.
muddocktor's Avatar
Hells yeah, you can make it work!!! Just break out the dremel and cut off all those pesky pins and make pads out of them, then cram that Zambezi into an X58 board!!!!

I wonder how many of these bundles Newegg is gonna eat?
EarthDog's Avatar
Interesting poll results so far...

~40% = Disappointed (1-3)
~47% = Meh (4-7)
~13% = Everything I hoped (8-10)
Metallica's Avatar
The same can be said about the lower-end AM3+ boards.

The 2500K can only be had at $180 if you are near a Microcenter. So motherboard/CPU for the 6100 would be around $330ish, while the 2500K/motherboard would be around $410ish. That's also going with the higher-end AM3+ board. For a mid-range AM3+ board, the two would be around $280ish.

Depending on how the 6100 holds up to the 2500K as far as games go, I would assume it would be around the same as the 8150, as most games aren't using more than 6 cores.

So having near-2500K performance while saving around $100-$120 seems like a good deal to me. Plus they seem more fun to play with

EDIT: Having *hopefully* near-2500K performance. I guess I'll have to wait a few days to see some 6100s.
manu2b's Avatar
I think the guy has an SSD and a stock 6950... not sure.
Mario1's Avatar
What's up with Newegg pricing it at $280?
manu2b's Avatar
Hey Archer, cool
I think we are ALL astonished, including myself.
And what astonishes me the most is having been taken for a m....n by AMD marketing, you know?

That's all they deserve, sorry!
EarthDog's Avatar
This is normal gouging upon release. It happens with CPUs, GPUs, well, everything here really... It can be found for $20 over MSRP at TigerDirect.
muddocktor's Avatar
Thing is, you don't need a high-end SB board to kill this somewhat laughable attempt to make Intel worry in the mid-priced CPU category. I have a $110 Biostar P67 board running a 2500K at 4723 MHz. And I can push it higher than that, but why, for 24/7 usage?
Metallica's Avatar
This is true. I'm just hoping pricing will go down, closing the gap in price/performance.

If I lived near a Microcenter like I used to, I'd already have a SB system xD I could have my sister buy it and ship it... or just wait out a few months and see what happens
I.M.O.G.'s Avatar
Newegg had it well above MSRP and they're already out of stock... Until they can get enough inventory in, they'll probably keep the prices high.
Mario1's Avatar
I have a question for hokie/Dolk:
How did you guys get AMD to send you the board & the CPU?
Looks like they don't send free stuff to random guys, so how did that happen?
Janus67's Avatar
They are reviewers for Overclockers.com, which requested review units to be sent to us (them). And with OCF's reputation and history, AMD was willing to send the unit(s).

That's how they get the majority of the things that get reviewed, if I'm not mistaken, which makes it much more difficult for joe-schmo to say "i can haz free chip to review plz?"
Hardin's Avatar
I feel bad for AMD. This was supposed to be something special, and it turned out to be mediocre. This is not a processor for gamers, especially for games that are single-threaded, like World of Tanks.
ScrewySqrl's Avatar
They were involved in the record-setting overclock earlier. This is a fairly well-known review site; several hundred were sent out for that.
Mario1's Avatar
Gosh, I wish I was a reviewer... Wouldn't mind getting the newest hardware available for FREE for doing reviews that I love..
Black C5 Z06's Avatar
You gotta start somewhere. Start writing reviews about stuff you already have and submit them to websites as a freelancer. Move your way up the food chain and eventually you'll be able to get free stuff.

But you have to be two things: good at making your thoughts clear through the written word, and objective.
Archer0915's Avatar
There is hella work in it, and most reviewers go in the hole on stuff, some spending as much or more than what they actually get.

When I do a review I generally get less than minimum wage if you count the hours spent against the value of the product. Hell, that does not count the pile of things you need to keep around for a while in case you need to follow up.
Mario1's Avatar
The current value of the Bulldozer is $280, the water cooling unit costs ~$100 too, and no idea about the Crosshair V, but my guess is that it's around $300.
So yeah, $680 worth of products isn't THAT bad, plus I'm pretty sure hokie loved reviewing it.
Archer0915's Avatar
Yeah, it is fun, but at times you can get something and then find out they want a faster-than-expected turnaround. That would be fine if it was all one did, but most of us have a life and some still have a wife.

Many sleepless nights as well, when you have kids and the AC will not cool your lab/shop in the summer.
Brolloks's Avatar
It is not easy being a reviewer, especially turning out the quality reviews hokie does. It is done under tight deadlines, and a lot of editing goes in before publishing. Plus, he makes no profit from it; whatever he sells from it he re-invests in equipment to do other reviews, hence the reason he could do extreme cooling in his reviews. LN2 ain't cheap.
I.M.O.G.'s Avatar
If he would actually sell anything. Hokie is a packrat, and mostly he needs to be, because in the next review he'll need the parts to re-run the comparisons.

Many sites use databases of past results to do comparisons - that's how they can compare a dozen different components in a single graph. We never re-use old scores - we rerun all the tests to ensure there aren't other OS, update, or driver differences and the scores are legit.
Mario1's Avatar
Nobody said it's easy (at least I didn't).
I do respect all of the reviewers in here, since I know it must be hard for them, but why does every review have to be considered some kind of 'job'?
I'm pretty sure hokie was eager to share this review with us, since there's so much hype going around Bulldozer and all, so he must have had a good time.
You're making it look like it's a bad thing to do... It isn't like AMD forced anyone to review anything. He decided to do it because he has a fetish for hardware; don't we all?
dkizzy's Avatar
Guys, is there any incentive to get an 8150 over an 8120? If they're both unlocked, then I don't see the point in spending more on essentially the same exact thing. Let me know your thoughts!
Mario1's Avatar
No point in getting the 8150 if the only difference is the stock clock.
I.M.O.G.'s Avatar
Possibly binning between the 8150 and 8120; it depends on whether the 8120 clocks as well or not. I'd compare it to the 1090T vs 1100T, probably a similar situation. The FX-6100 is probably the best deal though, as far as bang for buck.
Archer0915's Avatar
Nobody said it was not enjoyable. It is an investment, and it can give you satisfaction.
dkizzy's Avatar
Yeah, it will be interesting to see how the binning turns out. Does anyone know if the 6100 is designed to be just three modules, or does it have a disabled/deactivated module on the die?
Metallica's Avatar
It's not just about getting free gear and reviewing it. Hokie has earned his way to the opportunities he gets. Getting to his status is expensive. You have to start somewhere, and you do not get anything for free. That "$680" is a lot less than I'm sure he's spent in his years here at OCF. Also, having the knowledge to do a review takes time. Anyone can become a "reviewer". But gaining respect is the hard/expensive part. In time it may become rewarding, but if they are getting free gear, there is a reason for it.

I know of people who buy gear, review it, submit it, sell it for 20% less than they bought it for, and get the next best thing to review. They never get free stuff. Maybe when they gain some reputable respect, they will.
Metallica's Avatar
This is what I was getting at. Although it still may not be better bang/buck compared to Intel. Waiting on prices to settle and seeing some benchies will determine that.
Black C5 Z06's Avatar
I would love to see benchmarks comparing the 6100 and the 1100T, since they both have the same stock clock and they are both the same price (on Newegg, at least).
Mario1's Avatar
Yeah... everyone PAYS to get respected..
Janus67's Avatar
Doing things like that can make it difficult for a reviewer when their own money is on the line; by that I mean having an unbiased opinion. People HATE to be told that they wasted their money, and will fight/argue till their face turns purple that they made a good purchase, to give themselves some self-verification that they didn't do something wrong.

I think it is easier to do an unbiased review when you receive something for free, but sometimes (look at video game reviewers) publishers/companies expect a certain score/review given past history/etc, which leads to occasional issues.

In a way, yes, be it through time or money (and since time = money), it is true. Hokie could be doing a lot of other things with his time that aren't writing a review (effectively for free), including making money, and for sure under most circumstances that would put him farther ahead. But if you put the time and effort in, you can become a respected member of almost any community.
Metallica's Avatar
Don't twist around what I said. Yes, when it comes to reviewing, it does take money/time.
Brolloks's Avatar
Exactly, spot on there.
At Mario... Reviewing is great, but it is time consuming; compare it to a job if you will. Also, once you get a few under your belt, you raise expectations and are asked more frequently. I have done one mobo review, and believe me, it is not easy; fun, yes, and rewarding once you're done.
Anyone can start. Why don't you do one and see how it goes?
hokiealumnus's Avatar
Yes, I'm a packrat, but in my defense many of the items that come to me are engineering samples that can't be sold. Ask Brolloks, he really wants my 6990 but I can't sell it. Also can't give it away, b/c I need two video cards and that's one of two. Each ES has a unique serial number that can be tracked. No way am I endangering our site's ability to get cutting-edge hardware to bring to our readers by selling ES hardware to make a buck. It's not worth it.

Re: the re-running benchmarks clarification - I didn't retest everything on the 2600K, IMOG, so it's not re-tested for every review. Only when there are strongly relevant changes (like a service pack or something) would I do that. I am definitely a packrat though. The only things I've sold are a case and a GPU, to fund the other things Archer is speaking about. Most recently that meant a Dewar for the Bulldozer review.

Mario, it may sound like a lot, and it is an absolute blast so don't take what I'm going to say wrong, but reviewing is work. A lot of it. Work the cost of the hardware doesn't surpass. Heck, I don't need half the hardware I have. I don't game, and I don't encode, render or do any of that other stuff much at all, if any. I probably have one of the world's fastest exclusive web-browsing machines.

I do this for the love of the hardware. Because I enjoy putting my words to paper and hopefully having people benefit from them. But if you start out just doing it to get free stuff or profit (hah!) you'll burn out quickly. I know you're not saying it was a bad thing or not work, just thought I'd throw that out there in response to the quote above.
Mario1's Avatar
I can give you a fine example that this isn't necessarily true:
TiN broke the world frequency record with extremely old hardware; later on he got offered a job by EVGA in Taiwan and is now living happily over there.
Metallica's Avatar
I believe I remember you back in 2009 constantly buying/selling motherboards for review. I'm sure you know what I'm talking about. Remember that 920?
Brolloks's Avatar
Oh boy, I was obsessed at one stage. None of those were formal or got published, but I spent hours and very late nights doing benchmarks on video cards and CPUs and writing them up.
Archer0915's Avatar
And a lot of it.

Not to mention it is hard not to sound biased at times, and sometimes you actually find a product is great, but because of some of the many variables involved you have to find a way to express that view and still not give the product a good rating.
Mario1's Avatar
Yeah man, no hard feelings.
I understand that there is much more to it, but what I meant was that free hardware as a bonus is something good, since you're not just getting recognition from us - the readers - but also from big companies sending you review samples.
EarthDog's Avatar
It's not a bad thing to do. I think we all do this because we enjoy new hardware, and getting to keep (most of) it is certainly a wonderful perk. But make no mistake, it is time consuming to complete reviews, especially (as already mentioned) the quality reviews that hokie does here.

Archer is about right with most single-product reviews: it comes out to less than minimum wage. Not to mention that when sold, you can't get near the new price. Most of the reviewers also pour money back into the site to help with various activities.

Also, note that some of us, hokie and I at least, have full-time jobs as well as families. So whatever free time we may have can easily be taken up with reviews.

It's a delicate balance... that I wouldn't trade for anything.
Janus67's Avatar
I'm sure that him breaking the CPU-Z world record was helpful, but he has been in the overclocking/benchmarking game for quite some time, working with Kingp!n and others with incredible soldering/modding skills. The sheer amount of money and time that he has spent doing this stuff got him where he is, not simply breaking a CPU-Z record (and being part of a sponsored team didn't hurt either, I'm sure).

That said, I will be doing my first review for the site here (it's in progress), working with MattNo5ss to do it. I've already put more hours into it than its retail price, but it definitely is fun to do.
Brolloks's Avatar
Take myself as an example: I actually don't like formal reviews, as then it feels like a job with deadlines, etc.... This is my hobby; I like playing with hardware and "reviewing" it for the benefit of others and satisfying my appetite to test new stuff. It does cost me a fortune, as I don't get sent free stuff, but that is OK with me and I will continue to test and play with hardware.
Metallica's Avatar
And then sell to me at a discounted price, after I know said hardware is a gem

That 920 is still kickin'. I wish I wouldn't have sold it. My boss, who I sold it to, has run it at stock ever since.
Brolloks's Avatar
Yeah, I have kicked myself silly for selling gem chips and then later looking for more.... Kentsfield and Nehalem were just crazy stages.
Mario1's Avatar
TiN has had 80% of his hardware sent out by companies, and he never considered it a job as far as I'm aware.
His primary job before going to Taiwan was building custom things (electronics, obviously) for people.
He had already turned down a decent amount of offers due to not being able to get a visa, but EVGA somehow worked things out and he was able to leave for Taiwan.
That's the basic story. If you want to finish this convo, then feel free to add me on skype:its.gucci or MSN:shigalbigal@gmail.com, since I don't believe this is the right place to talk about random people that have nothing to do with the review.
I.M.O.G.'s Avatar
I have an FX-6100 available locally, 50L of LN2, and I just found out there's a beta BIOS for the Asus CIVE... On my way to pick up gear, then set up an OS, and I should have some more results this evening.
hokiealumnus's Avatar
Live. Stream. Post on front page. Do it NOW!
Brolloks's Avatar
Sounds good Matt, looking forward to some frosty pics and 7Ghz + results
Mario1's Avatar
+1 on this one, I'd love to see you stream the benching too!
capttripppp's Avatar
Great! I was considering picking one of these up for myself so it will be cool to see how it performs. +1 for livestream
Badbonji's Avatar
Just to let you know, legionhardware have reviewed the quad core and hex core if you want to see the numbers. Although the quad is emulated by disabling half of the modules, this is exactly what it will consist of and should give an accurate representation.

http://www.legionhardware.com/articl...fx_4170,3.html

Unsurprisingly, the Bulldozer quad and hex lose out to their last-generation counterparts. Even worse is that the FX-4170 uses 50W more at load than the 2600K!
boucher91's Avatar
Who bought my 8150 off of the Egg....
Came home from work, sat down to order, and they're out of stock....
waaaaaaaaaaaaaaaaaa
zitnik's Avatar
Sticking with my PII X6; my next upgrade won't be the 8150 like I thought. I can see Ivy Bridge on the horizon
Black C5 Z06's Avatar
Sweet!!!
I found that review moments after I posted here. Very disappointing.
Yup. I'm going to wait until Piledriver is out and see how IB and PD compare to one another.
Looks like I'll be upgrading my GPU this holiday instead. Oh well, suits me just as well.
SuperTuner12010's Avatar
That's what I was looking for. Looks like the 980BE is still going to be the best budget bang for the buck for gamers...
Xterra's Avatar
$275 price point... I retract my previous point on price to performance... that took the wind out of my sails.
wingman99's Avatar
I voted wrong; wish I could change my vote.
bmwbaxter's Avatar
I can attest to how much work a review is. I have only done one, and it was fun but way harder than I thought at first. The actual testing was the easy part; it was all the writing and formatting and editing that goes in before it even reaches the front page that was hard.

So again I want to say great job on this review, hokie
I.M.O.G.'s Avatar
The FX-6100 was $199 before tax from Microcenter.

The 2500K had a $70 instant discount, making it $179 before tax.

So retailers are inflating MSRP, and Intel bottomed out SB prices. At these prices right now, I don't know what normal person would buy this chip...
Mario1's Avatar
2500K prices didn't drop in Bulgaria..
P.S. Do a livestream if you're benching the 6100, please.
I.M.O.G.'s Avatar
Targeting 9pm Eastern for the livestream. Need to drop an image on the drive and flash the CIVE BIOS.
Apht3rThawt's Avatar
Funny thing, I guess I'm too old for hype or no hype anymore. The lack of PR from AMD said to me, "this chip is not what people are expecting." Last year I was ready for the 990FX and a new CPU. As time went on, I could see that Bulldozer had water in the diesel fuel. Sorry guys, really, but there is a limit to the way we manufacture, and our engineering is not where it could be. That's why we still make cars with gas engines after more than a hundred years and burn coal to run our computers.

Sad deal, but at least something new to bench. Let's see 8GHz.
Artas1984's Avatar
AMD Bulldozer is more disappointing than the original AMD Phenom, the AMD HD 2900 XT, and the AMD 700 series chipsets all combined.

Nuff said.
I.M.O.G.'s Avatar
8GHz. That would be nice.

I haven't run this board since our Microcenter event. I have frozen the crap out of it a couple dozen times. It has literally had puddles of water sitting on it, because I forgot to tell a friend that it wasn't conformal coated - vaseline is the only insulation it has ever had.

So the board has been through some crap, but it has taken two Phenom IIs to 7GHz, and those chips really did not want to get there.

I am a dangerous man tonight, though. This FX-6100 only has one hope of making it to sunrise tomorrow, and that's with 8GHz under its belt. So long as it keeps scaling with voltage, the voltage will keep going up.
Janus67's Avatar
Well it looks like there aren't any CPU-Z uploads on HWBot yet, Matt. So go for the gold
Apht3rThawt's Avatar
Awesome deal, I will be tuned in. Good luck and make us all laugh.
Mario1's Avatar
Could you please link us to the livestream in this topic or something when it starts?
2:47AM here, so yeah, no idea when it's going to be 9PM EST.
Janus67's Avatar
1hr and 15 minutes from now.

Odds are he will link it here and have it streaming posted in the benchmarking section.
doz's Avatar
Well, I decided I'd try it out just for a review, even though it's a waste of time and money, but Fry's won't have it in stock till the weekend at the earliest, my local store said. So much for a launch.
Black C5 Z06's Avatar
An hour from now.
netmask's Avatar
Come onnnnnnnnnnnnnn already.....
bassnut's Avatar
Yup... same boat here. They figure not till the end of the month. I might wait till Jan for the 8170... wondering what tweaks or changes they might have in store for it by then. But then again, knowing me, if I see one at Canada Computers I just might drop a dime... or two.
I.M.O.G.'s Avatar
Sweet! Gold cup! lol

Looking more like 10PM EST, got stuck in traffic on the way back from Microcenter.

Livestream will be up at www.overclockers.com/overclockers-live and I'll update this thread, and start a new one to embed the stream there.
trents's Avatar
I expect we'll see better performance from BD as the manufacturing process is refined, new steppings come out, and software is optimized for it. I also expect newer, more robust AM3+ motherboard chipsets will evolve to compensate for the extra power draw BD seems to be imposing upon other hardware components. More robust chipsets could help lower temps which seem to be higher than expected. Don't jump to too many firm conclusions just yet.
Black C5 Z06's Avatar
I really have high hopes (I don't know why I allow myself to) for PD.
buffalowings's Avatar
guys... to me, a product is a flop when it doesn't even generate enough interest for me to bother saving up my pennies...
Ironsmack's Avatar
I'm not sure if this is allowed in the forums; I apologize if it's not.

Take it with a grain of salt...

http://www.hardwareheaven.com/review...roduction.html
Robert17's Avatar
I read the review early this A.M.; good review. But like you, Trents, after mulling this over all day during a 300-mile drive mostly in a downpour, I'm thinking that more hardware and software to take advantage of BD will come forth over the next few months, some probably already near tape-out, and there will be some more positive reviews then. But for now, my personal tick must await the next tock, or something.............
Ivy's Avatar
Sadly, I have to say that even if the new Bulldozer seems able to catch up to most Sandy Bridges in many respects, or even pull ahead in spots, the downside is its massive TDP. Well, I mean, that has to be expected with 8 cores; it isn't gentle on smaller systems anymore, and it's almost a small supercomputer if it keeps going like this. A Sandy Bridge will mostly deliver the same performance at half the TDP, which clearly shows that AMD still isn't fully up to the task on CPUs.

Sure, those who want to save some bucks and have big systems that can handle such a TDP may be happy with this alternative CPU; however, SFF and notebook/laptop users especially will have to stay with Intel, that's almost certain.

I mean, I already wondered: what, 2 billion transistors? The flagship of the Nehalem series had "only" 1 billion, which was still a lot. And as we know, TDP is usually driven by four things:

1. Transistor count (twice as many usually means roughly twice the TDP; see the formula below)
2. Manufacturing process and overall architecture (as well as the nm node)
3. Stepping
4. Voltage

Now, all of those are about the same as Sandy Bridge except the transistor count, right? What else is there to say. Anyway, at least AMD users can now mostly catch up on raw performance, which is at least some good news.
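
For reference, the textbook first-order model for dynamic CPU power (a standard approximation, not anything from AMD's documentation) ties those factors together as:

P_dyn ≈ α · C_eff · V^2 · f

where α is the switching activity, C_eff is the effective switched capacitance (which grows with transistor count), V is the core voltage, and f is the clock. Voltage enters squared, which is why a die shrink normally comes with a voltage drop to keep power flat; Gautam makes a similar point further down the thread.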
trents's Avatar
Speaking of notebooks, I see where A6 laptops are now coming out. Anyone tried an A6 laptop/notebook yet?
mjw21a's Avatar
Did you check out the Toms Hardware Guide review? TDP issue will mostly be resolved with Windows 8.
neo668's Avatar
But Win 8 is supposed to come out only late next year.
Nathan0490's Avatar
Now that review is interesting... hmm.
SuperDave1685's Avatar
Sigh.... I was expecting a lot more from AMD this time around.... looks like I'll settle for a 980x or 990x in January or so once I get my tax return back. By then, their price hopefully will have gone down. From the benchmarks I've seen, its not worth buying a completely different mobo, PSU, and cpu, when I can just pop in a 980/990x into my board and get better performance.

wingman99's Avatar
That review looks equal in gaming.
Ivy's Avatar
First, I have no clue how they are going to reduce the TDP by a large margin; if they do, it may come from optimizations around technology that isn't currently enabled in Win 7. Although I don't know why they don't just add it in an update; it can't be impossible, since they update the OS all the time, and all it would take is enabling more support, not completely enabling a disabled CPU. Windows has always supported AMD, and of course I'm happy about that, because the market lives on competition. What Mac was doing isn't to my liking, and reason enough for me not to use it anymore (just to send a clear signal that I'm not going to support such a strategy). Anyway, as a gamer and video user I don't truly need Mac OS, so I do fine using Windows, even if I have to pay for it (although the hardware cost is 10 times higher on high-end systems).

And second, Win 8 will also need some time to establish itself in a fixed, well-working condition, so we can't expect that before the end of 2012. That means a full year before the AMD CPU works at a lower TDP, and it's just not very advisable to jump to AMD quickly in that case.

I mean, don't get me wrong, I'm sure some users won't have an issue with the higher TDP and may have fun using the new AMD CPU. For some workloads it surely performs great, and it has high OC capability, which isn't a given on CPUs with that many cores (the more cores, the harder it is to get an OC stable). At least in that regard AMD offers a fair deal of possibilities, and it can surely be fun to use. But it's simply still not on par with Intel, so I can't give it more than an average 5-ish rating. AMD made big improvements in performance, at least; that has to be acknowledged, and at least AMD users can live with it without feeling like an aged snail. Finally, some performance for those users; that is indeed a good improvement.
]-[itman's Avatar
I'll throw some more fuel on the fire:
http://www.xtremesystems.org/forums/...-Threaded-Perf.

It's interesting. I'm not entirely convinced of his conclusion, as I think there is more at play than what he thinks, but very interesting nonetheless.
Ivy's Avatar
The link is apparently about the fact that disabling half the cores doesn't halve the performance; you keep much more than half. That's not very surprising, considering the cores aren't the only limitation; a core is just one part of the CPU, and by no means the only thing that matters for performance. So, even with half the cores disabled, the other areas of the CPU are still fully enabled; we may lose some cores, but indeed the performance doesn't drop by half in that case.

Maybe performance goes up a bit once some optimizations are done to the OS and programs too, which currently isn't really the case, because everything is fully optimized for Intel. So I'd rather stay careful about premature judgments on that front.

Overall, I may be less disappointed than much of the user base, because I never expected the new AMD CPU to be an Intel killer or even on par with it; AMD won't reach such a state out of the blue, that's not a realistic expectation. I just notice improvements, and truly huge ones (compared with the old FX CPUs), and that's what I expected, so I'm happy in an average way: it's exactly what I expected, no more and no less.
Badbonji's Avatar
http://www.tomshardware.com/reviews/...x,3043-23.html

You say resolved, but under full load it will still use the same power in Windows 7 or Windows 8. The only benefit I see here is core parking, and even that only saves roughly a couple of watts per module (8W lower with 3 modules parked). Not sure how it will be able to reduce the usage to anywhere near Sandy Bridge, though.

Also, instead of parking a module (which I guess is simpler), where both cores within that module are disabled, wouldn't it make more sense to disable one core per module in lightly threaded tasks?
So for the 8150 loaded with 4 threads, instead of losing 2 FPUs by shutting down 2 entire modules, it would stop the sharing of the FPU between cores, and each core would have its own FPU, like a normal quad core (something you can already approximate by hand, as sketched below).
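
In the meantime, something close to that can be forced from user space by pinning one worker thread to the first core of each module. A rough Win32 sketch (my own illustration; it assumes even-numbered logical CPUs map to the first core of each module, as in the wprime affinity tests earlier in the thread):

#include <windows.h>

static DWORD WINAPI worker(LPVOID arg)
{
    /* FPU-heavy work for one thread goes here. */
    (void)arg;
    return 0;
}

int main(void)
{
    HANDLE threads[4];

    for (int i = 0; i < 4; i++) {
        /* Create suspended so the affinity is set before any work runs. */
        threads[i] = CreateThread(NULL, 0, worker, NULL, CREATE_SUSPENDED, NULL);
        /* Pin thread i to logical CPU 0, 2, 4, or 6: one per module. */
        SetThreadAffinityMask(threads[i], (DWORD_PTR)1 << (2 * i));
        ResumeThread(threads[i]);
    }

    WaitForMultipleObjects(4, threads, TRUE, INFINITE);
    return 0;
}

Each thread then gets a whole module's FPU to itself, which is the same effect Badbonji describes, minus the power savings of actually parking the idle cores.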
elari20's Avatar
Very disappointed and s(m)ad ...

I told many of my friends to wait on their upgrades till BD hit the market. And now you give me THIS???

2600K, here I come.
Gautam's Avatar
No fab would willingly attempt to produce a leaky chip (except in some very rare cases like the TWKR... which I believe were simply binned for, not specifically made).

I don't have time to elaborate atm, but keeping all things constant (including voltage), a die shrink will lead to increased power. The reason it usually does not is that AMD and Intel generally decrease voltage along with process technology nodes. That isn't quite the case with BD.

However, I'd mostly pin the massive power draw on the sheer size of the chip.
hokiealumnus's Avatar
I'm not sure if someone posted this link already or not (apologies if so), but it seems a new BIOS has enabled turning individual cores on & off. It also comes with some very interesting results, which are posted @ XS. This is similar to what Dolk had me attempting to do from Windows by isolating the FP cores. Based on that it looks like the scheduler needs a bit of work. If AMD could get Windows + the CPU to properly assign loads, FX could actually show a marked improvement in lighter-threaded loads.
EarthDog's Avatar
That's certainly good news. I didn't read the link (yet), but will these scheduler changes hit W7, or will we have to wait for W8?
3line's Avatar
Well, I was never personally impressed with the AMD hype machine. I said this in the rumors thread after I saw the power draw on the ES (jesus christ).

You don't just leapfrog back into relevance after being buried, performance-wise AND efficiency-wise, by the industry leader ever since Core 2. To be honest, I am a bit disappointed, as I had thought this thing would at least be a decent drop-in replacement for my Phenom II.

I voted a 3.
trents's Avatar
Well, I guess we know now why there were so many delays in the BD release. There were engineering problems from the get-go, and it still really wasn't ready for release when it finally happened.
Archer0915's Avatar
So from what I see in those tests, the cache is what is causing the hit when the module has both cores active. Wow, I feel vindicated. I went out on a limb when I called that with no backup.
manu2b's Avatar
I wonder which is worse: releasing a somewhat "weak" CPU, or just cancelling it and launching PD when it's ready?
Archer0915's Avatar
What I feel is really bad, sad, and sickening about this entire situation is the "wait for Win 8 to fix the cache deficiencies" line. Hell, now MS not only has to deal with their own crap, but they have to patch and SP for a damn CPU.

All I have to say is "WE" the enthusiast community hyped this more than AMD ever thought about doing.
hokiealumnus's Avatar
They couldn't have cancelled this if they wanted to. Too much time & money went into designing it. Their only saving grace was price. At $245, it's not a bad chip. At $280 it's just not worth it.
Archer0915's Avatar
The arch. is awesome. The cache algorithm and cache sharing seem weak. I would think that there could be a cache optimization in the very near future.
manu2b's Avatar
Well, the cost is in PR and marketing.
If I am not wrong, PD is on the same train as BD for R&D, which would imply that the time and money invested in BD are not wasted.
Archer0915's Avatar
Yeah, this can be a good thing. AMD has released a good product that will get the job done. It was a leap and has some deficiencies, but it is in no way a crap CPU. It will get the job done, and many of us see the potential for the future, as you seem to.

Hey, is there a review for media/home server use yet?
manu2b's Avatar
And you know what, Archer? After the instant disappointment, I wonder if I won't finally upgrade my PhII to an 8150...
Archer0915's Avatar
Well, considering there is a lot of unrealized potential in the design that has not yet been tapped (and may never be), I cannot see what it would hurt. I have bought plenty of things to play with, and this is no different.
manu2b's Avatar
I'll just wait a bit, expecting a price drop on both the CPU and the mobo.
PolRoger's Avatar
Maybe more like...

8150 ~ $200/$225
8130 ~ $175/$200

I'm kind of expecting Intel to slot their announced i7-2700K SKU in at the current 2600K price and then bump down the prices of the 2600K/2500K to keep pressure on AMD.
Archer0915's Avatar
Why? At this point PPL will pay more for the Intel processors.
manu2b's Avatar
Yep, at this price range, I would get a combo 8150+mobo+7970 as soon as the gfx card is out.
EarthDog's Avatar
Prior to the BD release, Intel stated it would be more expensive than the 2600K. Seeing this, though, that seems more logical.
zitnik's Avatar
I just looked at some more reviews; apparently OC'ing the 8150 to 4.7GHz puts it toe to toe with the 2500K, trading blows, barely beating it in some things, with the 2500K barely beating BD in others. AMD did say 5GHz on air is easily achievable. So do you count that as a win? It can beat the CPU it's up against when it's OC'd to 5GHz.
Archer0915's Avatar
What speed was the 2500K at?
Archer0915's Avatar
But it will get better. It must get better
MattNo5ss's Avatar
If it takes a 5GHz BD to beat a stock 2500K, then that's almost as far away from a "win" as possible...

Think about the power consumption of a 5GHz BD -vs- a stock 2500K...
hokiealumnus's Avatar
Do you have any links? We also ran a 2500K (in 2D), and the FX-8150 was toe-to-toe with it when both were at stock, through everything except SuperPi. If you're talking about gaming, that may well be correct from what I've seen, but for 2D our 4.75GHz FX-8150 beat a stock 2500K handily.
EarthDog's Avatar
LOL, no. Like fractions: do to one side what you do to the other... Overclock that 2500K and that chip is right back where it started.

Not to mention, the info you cite is a bit off from our testing...
PolRoger's Avatar
For many people, price is important. An average Joe at Best Buy getting advice from a salesperson on new comps asks about pricing and finds the Intel systems higher... many may/will go with AMD, since stock clocks are similar, and they figure it surely must be good enough, and AMD now has 8 cores. AMD is striving to increase market share with the FX line, and Intel could counter this by making their offerings even more cost-competitive.

On the other hand, maybe Intel will decide to maintain their current margins, because they don't really view AMD and the new FX line as a competitive threat.
GoD_tattoo's Avatar
From the reviews I've read, this thing gets boiling hot at those speeds, not to mention the power usage. So no, not even in my opinion. I'd rather have a cool-running CPU that #1) I paid less for and #2) costs less to run, than a hot-running, more-expensive-to-buy, and more-expensive-to-run CPU...
I.M.O.G.'s Avatar
Just requoting this... Some references in this thread are making the performance out to be worse than it is. I haven't seen a case where an OC'd FX-8150 does not beat a stock 2500K, including in our testing.

Zitnik, got links for what you are reading? I'm not challenging you; I'd just like to read what you read.

EDIT: By the way, wow, this poll took a turn after the first few hours. For the first few hours after the reviews, the vote was almost even between positive and negative... Since then, the disappointment seems pretty overwhelming!
SSDconvert's Avatar
Had to log in to congratulate on this one. Funny

zitnik's Avatar
Yeah it was in gaming. I will find one of the reviews and post it...
zitnik's Avatar
Alright, here's one.

http://hexus.net/tech/reviews/cpu/32...es-stand-tall/

On this one I was a bit off; it comes close to the 2500K/2600K but doesn't beat either of them, except that it beats the 2500K in 3DMark Vantage.

I should have clarified, so that's my mistake: it gets close to a stock 2500K at 4.7GHz, and may be able to barely beat it in some things. There was another review I looked at, though, which may be why I misinterpreted the performance, where it has the 8150 beating the 2500K in most games, pretty handily too.

In this review they have the stock 8150 beating the stock 2500K in almost all of the games...? http://www.hardwaresecrets.com/artic...-Review/1402/1
EarthDog's Avatar
The Hexus link is borked for me...

Outside of Dirt 3 and Deus Ex, it's within 1-2 FPS... a margin of error in most of those tests. Nobody 'plays' 3D11, which is core-dependent (scores higher with more cores) anyway.

EDIT: and that was at stock speeds... I'm lost.
Brolloks's Avatar
Works for me.... How is it that an 8-core CPU at 4.7GHz gets beaten by a quad at stock in the Hexus gaming benchmarks? That does not look right.
zitnik's Avatar
I want to see a comparison of it running Crysis 2, BF:BC2, RAGE, and Metro 2033
Archer0915's Avatar
I saw some of the things I wanted to see (zip testing) but I want more.
zizux's Avatar
Isn't BD supposed to run mem at 1866? Or is it just that it supports it?

I've seen quite a few reviews where they were running slower RAM.
hokiealumnus's Avatar
Ours ran 1866. It takes no tweaking other than setting the proper divider, RAM voltage & timings.
RADIO_ACTIVE's Avatar
I want the belt buckle
a c i d.f l y's Avatar
Me too.

I'm also curious how well it performs 256-bit FPU operations (or where or how this could be implemented in existing software), considering this is something Intel chips lack entirely. And I'm curious how much the XOP and FMA4 instruction sets will alter performance once software or the environment (Win7) is updated to incorporate these instructions.

I'm already picturing a quad-processor setup, each with 32 cores (16 modules), with quad-channel memory. The VM capacity per rack is making my head hurt.
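
For anyone wanting to poke at those instructions today, the FMA4 ops are already exposed through compiler intrinsics. A minimal sketch (my own illustration, not from the review; builds with GCC using -mfma4 -mavx, and only runs on a Bulldozer-class chip):

#include <stdio.h>
#include <x86intrin.h>

int main(void)
{
    __m256 a = _mm256_set1_ps(2.0f);
    __m256 b = _mm256_set1_ps(3.0f);
    __m256 c = _mm256_set1_ps(1.0f);

    /* FMA4 fused multiply-add: r = a*b + c over eight floats at once.
       Bulldozer's front end splits this 256-bit op into two 128-bit
       halves, one per FMAC pipe in the module's shared FPU. */
    __m256 r = _mm256_macc_ps(a, b, c);

    float out[8];
    _mm256_storeu_ps(out, r);
    printf("%f\n", out[0]);  /* prints 7.000000 */
    return 0;
}

Whether XOP/FMA4 ever matter outside hand-tuned code depends on compilers and software vendors picking them up, which is the same chicken-and-egg problem the thread keeps circling.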
Theocnoob's Avatar
I actually really wanted that tin. Then I realized there's a frickin' hole in the side of it for the CPU IHS to peek through.

A CPU that doesn't live up to the hype, and a tin you can't even keep cookies in. I wonder if the belt buckle works.
Archer0915's Avatar
I think that part will be there for software down the road to take advantage of. Perhaps in 10 years you can dig a BD out of a landfill and watch it fly.

I bet it could work great for custom security software on a custom OS, and for hacking and cracking passwords and security with the proper setup.

Perhaps cracking z10 security easily?
statikregimen's Avatar
I voted it a 7 out of 10. Why? Because it's just not as bad as everybody is making it out to be.

There are 2 key areas keeping it from competing:
- Single-threaded performance
- TDP

The thing that could make up for those two:
- Lower the price

That's it. They have a decent product here that, with some refining of the architecture over the next year, could easily pull ahead. They're just charging about $30 too much for what they have right now.

I bought an 8120, due to the high price of the 8150 and the fact that I can just OC the 8120 far above and beyond 8150 stock speeds. That will make it a great bang-for-buck upgrade from my 1100T. Sadly, however, until they do drop the prices, this will not be as sweet a deal for most people - most definitely not for new builds.
manu2b's Avatar
Why 7/10? Because of the price AMD will charge in 6 months? When Ivy Bridge is out at the 2xxxK-series pricing?
Archer0915's Avatar
I honestly think if the poll were redone it might change. Some of the initial votes were votes of emotion, not logic. I really had my feelings hurt, because this was not what we expected, because of the false hope we built up and the lies we told ourselves. We hate being wrong, and I think that is why there is such a negative showing in the poll.
statikregimen's Avatar
I'm not sure what you mean... I just said they SHOULD lower the price, and I most certainly hope they do, a lot sooner than 6 months from now. Beyond that, the reason I say 7/10 is that it is a decent product with some shortcomings. If I felt they had knocked it out of the park, I would have said 10/10, which they did not. But I do not feel it's really all that disappointing either. It's just my opinion, though :P
I.M.O.G.'s Avatar
That's very similar to the way I look at it, and I think it's fairly accurate. I also don't care about power usage on my systems, as most of the time they are sitting idle, and at idle Bulldozer is doing pretty well.

However, I'm unsure about the comparison between the 1100T and FX-8150. I already have a 1090T, and I don't know that the performance shown in the benches compels me to upgrade... If I didn't already have a 1090T, I could see the potential interest in the FX-8150 being greater.

What would you state as your reason for upgrading to an FX-8150 from an 1100T? Not criticizing, just curious about the perspective of someone who has already made the jump... New/shiny counts for a lot in my book, so I'm not going to judge your answer.
statikregimen's Avatar
You misread - I'm getting an 8120 that I'm hoping to get up to around 4.4GHz on air, with a hefty northbridge overclock (a critical, and often overlooked, aspect of overclocking AMD chips). So for $220 I'll be outpacing a stock 8150, paying around $50 less, easily beating my 1100T in all areas, and having a good time, since I get to tweak and play more ^^ I've exhausted the overclocking potential of my 1100T, and to me overclocking is every bit as fun as the games I play when I'm done

EDIT: my 1100T has topped out at 4.1GHz w/ a 3.0GHz northbridge and 2.0GHz DDR3... It's blistering fast compared to what it is at stock, so it will be a very interesting (but I feel very doable) challenge to get the 8120 to really bulldoze my current scores!
I.M.O.G.'s Avatar
Thanks - I didn't mean to say 8150 there; you were talking about the 8120, but I slipped up. Appreciate the insight.

I identify with your reasoning. Actually, I don't have time for games, because the spare time I do have I want to spend tweaking and seeing what sort of performance I can squeeze out.
AngelfireUk83's Avatar
Think I'll keep reading some more on how BD is going from users on here, though I was a little worried that my old 945 is just as good as BD. I'll probably still upgrade, but now in January, maybe March/April. My 945 still has some life left in it; I just need a new board now, as mine (below) fried.
Theocnoob's Avatar
If an 1155 setup thoroughly throttles an 8150 about the face and body, a 2011 setup is going to be like a (forgive the term) bulldozer running over an ant. I wonder why AMD did not simply choose to nix this iteration in favor of waiting for Piledriver, investing more money in that direction, and saying Bulldozer had been cancelled?
manu2b's Avatar
Really, no aggressiveness there!
I'm just saying that you could have given 7/10 in 4-6 months' time, because of decent pricing, which is not the case right now.

BTW, as I said in some other post, I might get one, never mind the poor performance, just to torture it and see what it has in its guts!
Archer0915's Avatar
For the same reason Ford no longer sells '77 Mustangs. Change the body style and it goes from SOSDD to NSND.

It was time for a refresh for the OEM market, dude. I mean, how can you keep selling 3.2GHz processors to the same PPL? They already have one, you know.

Put a shiny 8-core in there and they "CHARGE IT" (Flintstones)

AMD will probably not lose money over this. I mean, it is 8 cores and the Intel 4-core costs more money. They run both boxes, see no difference, and get the sales pitch to boot.
Seebs's Avatar
^^^^ This... Oh, and I fixed that last part for you, Archer.

What most people that visit this site and others like it forget is that we are the minority target market for these chips. Even if all of us decided to boycott BD and not buy a single chip, AMD would still sell millions of them to "normal" people through the OEM market. These things will sell without a problem at all of the Best Buys, CompUSAs, Microcenters, Circuit Citys, etc. of the world... People that buy "prebuilt" computers don't give a damn about how BD stacks up against SB in benchmarks... They just want to go into the store, hand their credit card over to the cashier, and go home to unpack their brand new computer. They will be sold whatever the sales people at the store want them to buy. Period. And a smart salesman will move BD faster than you can blink with the "It's like having 8 computers crammed into one tower" line. I would bet money on that.
a c i d.f l y's Avatar
I wish they hadn't called it an 8-core processor. At least call it an 8-thread, 4-module chip, like Intel calls their 8-thread, 4-core chips due to Hyperthreading. HT takes up transistor space; so does an extra integer core per module. Conceptually they've achieved the same end result.

Would you trash millions of dollars in research, development, and production and not get at least *some* return? The chip performs between a 2500 and a 2600 at stock, save a couple of single-threaded applications that are arbitrary in relation to real-world use. AMD themselves stated, "This chip has features that we will be building a platform from." Though I'm convinced their platform doesn't relate to the desktop platform in the slightest. Also, someone is buying: Newegg sold out of 8150s in 24 hours.
Archer0915's Avatar
Yes, and those PPL will go home happier than any Intel OEM retail buyer, because none of those Intel guys have anything that can match what they have. I mean, the most they can have is 6 cores, and what the hell is HT?
hokiealumnus's Avatar
Someone elsewhere asked about LN2-vs-LN2 comparisons, since this thing clocks like a madman under LN2. This was my response:

It depends on what you're measuring. The only real reason to go LN2 is for HWBot benches (IMHO). We already know AMD just sucks at SuperPi (and PiFast, for that matter). That leaves WPrime, which Bulldozer is also horrible at, because it lost FP cores relative to Thuban and really suffers next to a 2600K. Doesn't matter what the max multi is. Here are just my personal results:

Time          Frequency    CPU
4sec 890ms    6523 MHz     FX-8150 (WP32M wouldn't run any faster on my chip; full LN2 pot)
4sec 708ms    5644 MHz     Phenom II X6 1100T BE (also LN2, full pot)
...and the real kick in the teeth...
4sec 515ms    5407.3 MHz   Core i7 2600K (on... wait for it... water)

These things are fun to play with and do well at certain tasks, but don't buy one to go for HWBot global points.

It is good at what it's good at. Regrettably, that does not include the older benches used at HWBot.
I.M.O.G.'s Avatar
I haven't followed the latest quarterly reports out of AMD, but I believe in 2009 they were on a tear of 9 consecutive quarters in which they posted a loss. That's been a couple of years now, so maybe they aren't doing that every quarter any longer... but the odds are likely not on AMD's side.
manu2b's Avatar
Yes, but they had 10 of them
Brolloks's Avatar
Wow, 1 GHz lower than the BD and the 2600K still cuts the banner
The PhII did pretty well, I'd say
statikregimen's Avatar
Well, since then they've come out with their APUs, which they're selling as fast as they can make them. So their earnings are positive now, but still below expectations, due to supply issues at GlobalFoundries. AFAIK, they're also doing very well in the GPU market, but I have not seen any specific data to back that up. Once they get the supply issues, and more importantly the management issues, ironed out, I think they'll be back in the game as a corporation... even more so if they can fix up Bulldozer's single-threaded performance/IPC for Piledriver, but they have a lot of work to do there
Xterra's Avatar
And a huge lack of corporate direction. Part of AMD's problem is not being able to retain a leader. Leadership is everything in this business and AMD has had little to none.
mjw21a's Avatar
I still believe they should have retained Dirk Meyer. The company getting back to where it is now is primarily down to his leadership. This new guy is an unknown.
Xterra's Avatar
Samuel J. Palmisano was an unknown IBM executive who single-handedly restructured IBM and got them back into the green... way into the green. It's possible for AMD too. They just need the right stuff. I mean, AMD's board can't be as bad as Yahoo's.
neo668's Avatar
I feel it's the perfect time now for Intel to kill off AMD, if it so wishes. AMD surely didn't give itself much of a fighting chance.
buffalowings's Avatar
HP, AMD, Yahoo... they're all going down due to **** poor leadership... they need a savior like Steve Jobs (sarcasm... he wasn't destined to be the next Jesus, people), just a genius businessman
Cigarsmoker's Avatar
They could, but that would be bad for Intel. Without AMD, Intel would effectively be a monopoly. You know what the US gov does to monopolies.

Nope, the way I see it, Intel is happy to let AMD have 10% or less of the sales. They only need to innovate enough to stay a generation ahead of AMD.
Kado's Avatar
I agree with Cigarsmoker.
Xterra's Avatar
Besides, I see Apple taking over AMD in the next 2 years anyway.

I wish Microcenter would give me a HUGE deal on an 8150...
Ohioviper's Avatar
AMD really angered me. I'm done with them. Intel from now on, unless they can pull something out of their rear to fix BD.

Mind the language please. -hokie
Artas1984's Avatar
Imagine if Intel cut their prices by some 15-25% on all their current processors: AMD doomed.
Archer0915's Avatar
I do not see a takeover, but perhaps a partnership or merger. Now that Jobs (RIP) is gone, it may be a possibility. But then again, if PD does not show that it can readily handle the Intel monster, it may just be a pipe dream. Apple left Motorola, did they not? Why?
someinterwebguy's Avatar
I don't think it would be in Apple's best interest to do business with AMD (partnership/merger-wise). They charge a premium for their computers and, since they use Intel CPUs, they can rightfully claim to have fast machines. Perhaps they might arrange something with AMD for the low end of the market, say the Mac Mini, and perhaps a special deal on graphics cards, but other than that I don't see it happening.
EarthDog's Avatar
Quite the contrary. If they used AMD chips and prices stayed the same, that = higher profit margins... OR they could actually LOWER the price and be more competitive in the pricing of their units.
someinterwebguy's Avatar
Apple can spin anything positive, true, but it's easier to use Intel and say, at least semi-seriously, "The fastest computer $9,000 can buy!!!!", even if one could build it themselves for 1/3 of that.
hokiealumnus's Avatar
Why are we talking about Apple & AMD in a review thread? It's probably time to steer this one back on topic, folks.
someinterwebguy's Avatar
Fair enough. Anybody try OC'ing one with a Noctua NH-D14?
freeagent's Avatar
I'm a bit disappointed. They've been talking about Bulldozer for years. At least 5? I had high expectations until I saw a screen cap a month or so ago; it was running at 4.6GHz, which is awesome. But looking at the numbers it put down, my stock 970 rocked it. I was hoping it was all lies, but apparently it's not. I'd be pissed too if I'd bought hardware to support this chip. Even I felt misled by their advertising... it was like they had a secret weapon, but no, it was all just smoke and mirrors and a crazy hype machine.
DaveB's Avatar
I have been looking for a reason to build an AMD system since 2007, and I still don't have one. Except for some fun with the Asus PC-DL dual-Xeon setups, I built almost exclusively AMD single- and dual-Opteron setups from late 2003 until mid 2006. But pretty much nothing AMD since Core 2 came out, and BD is an even bigger disappointment than Barcelona was a few years back. Looking at the AMD roadmap, there is no hope that I can see for them as a serious challenger to Intel.
muddocktor's Avatar
Nice to see ya around, Dave.

I was pretty much in the same boat in that time frame too, almost exclusively AMD. But it seems like when they were owning Intel with Socket 939, they sat on their butts and read the headlines instead of lighting a fire under the R&D section by giving them more funds to further develop the processors. And now, instead of investing in R&D, they'd rather trash benchmarks (BAPCo) because they show how badly their newest and bestest product sucks.

I voted a 5 on the poll, but now I wish I had waited. The more I think about it, the more I think I rated this processor line about 3 numbers too high. I sure am glad I decided to just go ahead and buy a 2500K for the crunching farm last month instead of waiting for SnoozeDozer to come out.
Archer0915's Avatar
Not just the benchmark trashing either. The claim that Win8 will make a difference is preposterous, because it adds $100+ (OEM) to the cost of a system. I mean, unless you just let things sit until the OS comes out.

I honestly have noticed some degradation ever since AMD started implementing the L3 cache. Perhaps there is a read/write-through issue on top of the latency and algorithm issues. I really do not understand these issues, because it seems Intel does fine.

The reason is that Intel perfected the architecture, and perhaps there are coding nuances that just take better advantage of Intel. Perhaps AMD needs a custom compiler for the code and some sort of call translator.
Super Nade's Avatar
I don't understand how AMD can claim that the current OS is not optimized for a newer processor. Shouldn't the newer processor be backwards compatible? Why would they release something this underwhelming so late in the day? Really puzzling decision making.

BTW, Linux benchmarks are out at Phoronix. The result overview table is at OpenBenchmarking. Compare with the i5 2500K.

The author states the following. Take it as you will.

EarthDog's Avatar
That 2500K link is an SSD review? And what benchmarks are those in the other link?
Super Nade's Avatar
I was just digging up some Linux benchmarks (not familiar with any of them). It seems like these guys just ran the Phoronix Test Suite. Definitely not an apples-to-apples comparison.
terran2k's Avatar
I'm disliking AMD as a company now; the tactics and marketing they've employed lately just rub me the wrong way.
Intel isn't making me like Intel; AMD is making me like Intel. Go figure.
I'll always recommend my company buy Intel servers now.
PolRoger's Avatar

The lack of supply at launch surely leaves much to be desired!
Theocnoob's Avatar
Step 1) Assemble engineering team, try to make awesome CPU

Step 2) Hype CPU

Step 3) Realize CPU is not coming along so delay in attempt to fix

Step 4) Steal underwear

Step 5) Delay CPU, trash leaked benchmarks, attempt to fix CPU

Step 6) Realize you can't fix it because all of your good engineers have left, and the ones remaining are angry at the trickle of resources they are given.

Step 7) Release (terrible) massively hyped and delayed CPU to absolutely devastated and shocked public.

Step 8) Transfer all power to forward shields and claim that NOW the reason the CPU is bad is because it's ACTUALLY meant for Windows 8.

Cough "If" it proves to be false that Win 8 is better for FX, what will they say? It's meant to run on Win 9? It only works in the fifth dimension where it renders rainbows and fairy dust?

Never has the term "EPIC FAIL" had a more fitting application.

Nerd blurb:
AMD knew this CPU was crap in May. What they should have done is release the 9-series chipset, because they needed it, and call it AM3. That's it. Cancel Zambezi, pray that Piledriver will fix it. OR bail on the pure CPU game entirely and make APUs, which seems wise at this point because with things like SB-E around the corner, AMD truly has no chance of catching up to Intel in pure CPUs any more. It just isn't going to happen IMO.

We've come a long way since P4 vs FX. We hit the frequency wall six years ago. Silicon computing is very, very mature; we're approaching the point where we will literally no longer be able to shrink the process any more (somewhere around 5 to 8nm?). Heat dissipation is an ever-increasing concern (look at the average size of an aftermarket heatsink from 2002 to 2011). Much like the 'they all look the same' airliner situation, two to four engines under the wings stuck to a metal tube, it's almost gone as far as it will go. You can only tweak the same shape for efficiency so much.

If Netburst, Barcelona, and Zambezi are good for anything, it's showing us that over the last decade the diminishing returns have become so obvious, and the requirements to truly innovate and excel so great (and they will continue to increase), that building 'the next' CPU becomes exponentially more difficult every time. Lately, the diminishing returns have become terrifying. Look at what we're having to do to keep up with Moore's law, and how the 'gospel' interpretation of that law is starting to go into some kind of Bizarro universe. Zambezi has something like 2 or 3x the transistor count of Sandy Bridge and double the cache. And it's a fail. Amazing.


990FX should have been nothing more than 'the ultimate AM3 chipset'. That's it.


Issue #5821 with FX was poor yields at the wafer level in manufacturing. That means limited launch supply.
Ivy's Avatar
Ehm, yes, yes... not too impressive.
Anyway, it's not entirely true that the 2500K at stock beats BD in every case; in some massively multithreaded applications BD can have the edge. It's still no good that it can't even stay on par with the 2600K on most occasions. Sigh...
For gamers it's not much use though, because games still lack massively multithreaded support; most games don't support more than 6 cores or threads as of today.

I still want to add: what did people expect? AMD going face to face with the currently best SB? That was a dream almost too big to come true. Intel made such huge advancements while AMD barely improved since the old single-core days (when they certainly were totally on par with Intel, but that's the past). Anyway, don't forget what AMD did for ATI, and that they surely helped ATI maintain a very strong position. The question is only whether they will be able to bring their CPUs back into true competition, where they belong.

Without AMD/ATI we would have GPUs priced at up to $1500, and 90% of that cash would go directly into the funds of the monopolist company and not to the poor Chinese workers... The whole Intel strategy is a total rip-off, to be gentle, and it's surely too much to charge $1000 for a 6-core CPU (not server-grade 10+ core). However, as long as there is no competition they can do whatever they want, and I will even buy it, because I hate the SB concept even more. It's only here because they can sadly afford to act like they do. I would never buy an SB without 6+ cores, but they made it as a side step to milk more out of the cash cow.

Anyway, yeah, you can't use BD in smaller systems... just too much TDP. Who would be able to cool that down, unless it's a super-sized freezer PC? And of course, up to 90W more is equal to all the light bulbs I have at home, because most of them are either LED or halogen bulbs using "only" 18W; not more than 90W in total.
Super Nade's Avatar
A tad harsh, but I essentially agree with everything you have said (i.e. this series is not competitive as of now).
Ivy's Avatar
I think there won't be much improvement on Win8; that's truly a fake statement. Seriously, a CPU either works or it doesn't, so why would a single currently-disabled option make such a huge difference? Lots of excuses... just never pin it on the OS; that's something they should avoid. MS at least was still supporting it and can't adapt to every new design in an instant. In that regard they have to pick their own noses and look for responsibility inside their own building; that's clearly the main issue here. Not the OS or anyone else, even if the support may not be at the highest grade yet.

Now all they can do is be honest about their issues and try to please everyone supporting them with nothing but the truth.
I.M.O.G.'s Avatar
Stuff like this is bunk. Both companies are garbage depending on where you look.

Fair market practices? Intel slashes SB prices by $70 at Microcenter right at BD launch; that is a pretty obvious tactic... but nothing compared to the monopoly claims they've paid millions to settle.

Every marketing department pumps their product claims. AMD is no more guilty than Intel, except Intel products have the R&D behind them to actually deliver on many claims recently.

So to be clear, I'm not defending AMD or Intel here, but making the point that turning this into an AMD vs Intel thing is pointless. Stick to criticizing the products and our comments usually make more sense than when we start making arbitrary criticisms about the way the companies do business... It's big business, and it's dirty in countless ways.
ratbuddy's Avatar
One thing I haven't heard in a while is "Bulldozer is really meant for servers though!"

Am I off my rocker in thinking that these will be terrible for servers due to the heat and power consumption? Was the whole server argument just another flavor of "wait for Windows 8, then Bulldozer won't stink?"
Theocnoob's Avatar
I think that no matter how you slice it, AMD marketing has gone from inept to just plain lying. (They forgot to inform the public that the original FX was better than the P4.) Now they say that FX is the best thing since margarine.

http://www.amd.com/us/products/deskt...ges/amdfx.aspx

^AMD's FX product page
Full of overly ambitious claims... ahem... in my opinion.


"AMD FX Processors give you more bang for your buck with aggressive performance" (Ya if you overclock the hell out of it maybe. Bang for your buck I think not).

Intel, on the other hand, has never really said its products did... anything. They tell you how many cores it has, then they say something meaningless that sounds warm and cuddly, like a politician would. "Visually smart computing." What the hell is that even supposed to mean? As someone who is informed, I ignore it; as a layperson, it sounds like it's good but the exact meaning must be going over my head, so I'd better buy it.

http://www.intel.com/content/www/us/...processor.html
^
Intel's i5 product page.
Full of gibberish and nonsense.
But nonsense is not a lie.
Ivy's Avatar
I don't see much reason; they could just as well get the new IBM POWER7-based stuff if they hate Intel, and that would perform better than the AMD stuff. Face the truth... IBM is now Intel's opponent in servers, not AMD.

I dunno how much TDP a server can handle, but considering that they need a huge number of CPUs it won't be gentle, and they surely have to watch TDP. It makes a difference whether a CPU needs 90W more or less; on a server there are so many CPUs that you will easily reach several thousand watts of additional power consumption, which is just a mess in my eyes. The additional consumption is so much that you could easily heat your room with it while it's 0C outside. Well, you could say at least I don't need to heat my server room anymore, but that only works in a very cold country. So any country to the south is eliminated from the server list.

I am sorry for too much of my truthful view.
Super Nade's Avatar
If it were marketed as a server CPU, Phoronix would definitely have gotten a sample, as the *nix crowd has a very competitive presence in the server market. I guess the only recourse is to give it a spin with the Win 8 Alpha/Beta and see what happens?
Ivy's Avatar
The new AMD BD is, in my view, best used by mainstream consumers with big systems who are OCing; anyone else might not be happy. It's also the only way to make it barely competitive with the cheap 2500K. Intel made an unlocked part for a reason: to totally destroy their opponent, since an unlocked multi was a long-time advantage of AMD while Intel usually only enabled that option on their $1000 CPUs (not exactly that much, but that's how it felt). Say the 2500K/2600K were priced like Gulftown, and only the 2600K were unlocked at $1000. What would happen? Yes, we know it! AMD would be getting massive support. That had to be avoided, which is why Intel played it so shrewdly. Now that Intel has adapted in this manner, AMD is in big trouble, because they can't lower their price much further; no matter what they try, they need at least so-and-so many bucks per chip. So it's hitting them very hard.
satandole666's Avatar
So I take it the classifieds won't be flooded with cheap SB parts?

Damn.
Ivy's Avatar
Those are always cheap; it's a baby CPU specially created to upset power users, because it barely exceeds Gulftown but is much cheaper. Did I need something like that? I've never seen such a weird situation. They are delaying their own advancement because there's no competition anymore. The power users are all happy comparing it to a totally overpriced Gulftown, which could be much cheaper if Intel were willing, but why give a "better" option? I'm already confused enough about why the heck an old Gulftown is more expensive than an SB. Well, because many users don't know what to buy anymore and would pay twice over not to get such an SB baby ("if you do something, do it right" mentality). Soon you can throw those old SBs away because of even better SBs; have fun. But no worry... Intel will sell you a new one every time it happens.

AMD is much gentler in its marketing than that, but it simply doesn't help when it can't compete...
neo668's Avatar
In a way I'm glad BD sux. I save some money this way. All I have to do now is replace the 840 I'm using with an 1100T. Instantly, I get BD performance at a fraction of the cost.
rollincoaster's Avatar
lol, guess that's a good way of looking at things. I was waiting for them to mature myself anyway.
xrror's Avatar
Bah, the FX series isn't supposed to be a price/performance anything

AMD should have kept the Black Edition marketing for these, and reserve the FX badge for if/when they can get a Bulldozer stepping/Piledriver into halo range again.

Of course, this is the same marketing that used the Radeon name to push mid-grade DDR3 modules too

Oh well, here's hoping GloFo can get their 32nm process yielding some higher clocks soon.
EarthDog's Avatar
"main consumers" (does that mean non enthusiasts??) Di not overclocking sooooo....
DaveB's Avatar
Since the server versions will be clocked slower, I can't see them being winners in the server market either.
muzz's Avatar
Wow...........what a disappointment.
That's about all I can say without going off the edge and really bashing this POS.
HankB's Avatar
I think the TDP is likely to be an issue for servers as well. It represents a double hit. First off, the servers would require more power to run, and perhaps even bigger PSUs. In addition to that, they could require additional cooling. I'm sure those costs factor into server purchases.
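To put rough numbers on that double hit, here's a back-of-envelope sketch in C. The ~88 W load delta comes from the review's wall-power table; the socket count, duty cycle, and electricity rate are purely my assumptions for illustration:

Code:
#include <stdio.h>

/* Back-of-envelope server power cost. The ~88 W load delta comes
 * from the review's FX-8150 vs 2600K wall-power table; sockets,
 * duty cycle and electricity rate are assumptions for illustration. */
int main(void) {
    double extra_w_per_socket = 88.0;
    int    sockets            = 32;
    double hours_per_year     = 24.0 * 365.0;
    double usd_per_kwh        = 0.10;

    double extra_kwh  = extra_w_per_socket * sockets * hours_per_year / 1000.0;
    double extra_cost = extra_kwh * usd_per_kwh;

    printf("extra energy: %.0f kWh/year\n", extra_kwh);              /* ~24,668 */
    printf("extra cost:   $%.0f/year (before cooling)\n", extra_cost); /* ~$2,467 */
    return 0;
}

And that is before the second hit, the added cooling load, which scales with the same wattage.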
someinterwebguy's Avatar
I've already advised my friend/client who was interested in the BD server CPUs to let me look into Intel offerings instead. After reading several articles I linked him to, he wholeheartedly agreed.

Perhaps things might change down the road, but I'm not holding my breath.
Robert17's Avatar
I found this to be an interesting read concerning AMD:
http://www.overclockers.com/forums/s...688547&page=25
Archer0915's Avatar


I just do not get it.

The lower-end BD for $15 more than the 2500K? Something is strange here.
bmwbaxter's Avatar
Intel I guess is taking a page out of AMD's book and undercutting them to get more sales.
Archer0915's Avatar
But the total system costs are almost the same.

All I can say is I hope this enthusiast blunder works out to be an OEM wonder or AMD may start to go under and their assets put asunder.
bmwbaxter's Avatar
AMD still has the GPU market and it is doing fine there.

Also, I have a good feeling about the OEM market for BD. Acer, HP, Dell: none of the OEMs have to pay an inflated price for BD like us, since they buy direct, so it should be cheaper than an Intel setup for them. Also, 8 cores will sell better to the average consumer than 4c/8t from Intel; before I got into computers I had no clue what hyper-threading meant. So OEMs should be good.
Archer0915's Avatar
I agree; I just thought the poetry sounded good.

Hyper-threading is the stitch they use in Mexico on Wrangler jeans, is it not?
bmwbaxter's Avatar
and Calvin Klein.

I think some of the people here have forgotten that 'we' enthusiasts make up a tiny portion of the market, and the general public doesn't know anything for the most part.

Am I unhappy with how it performs? YES! But do I think this is the end of AMD? No!
g0dM@n's Avatar
Me personally... I'm very upset with BD. I upgraded from my awesome Gigabyte 790XT board to an AM3+ board so that I was locked and ready to swap my 1100T for a BD. Well, it turns out that was completely useless b/c I am not buying a BD. It's not even a true upgrade from my 1100T...

I've never, EVER been upset with AMD... until now. I was never biased, but always gave AMD a slight edge b/c I like rooting for the underdog. Not only that, but AMD always gave me great value, keeping their motherboard prices down and delivering the performance I was paying for.

I hate to say it, but I regret not moving to 1155 when I had the chance... at this point I'm going to sit this one out one more time and just hang onto my system while it's doing fine.

I do hope that this new architecture is just the start of something new/better... only time will tell.
Archer0915's Avatar
Don't worry, IB and SB-E are coming, and they are based on current, working performance monsters. I figure this slap from the enthusiast community (kick in the nutz) might be what AMD needs to get their house back in order.
bmwbaxter's Avatar
I think the APU market is where AMD is better suited. Since they already have the GPU know-how, they might as well capitalize on it. I think the APU was a far better product than BD.
Robert17's Avatar
http://www.xbitlabs.com/news/cpu/dis...er_Fiasco.html

I managed to foul up my last post; here's what I was referring to.
SuperDave1685's Avatar
So when I read the review I was like



And then I was like

Apht3rThawt's Avatar
Server farms in Iceland. We know that BD produces heat: 240 watts at load at stock speeds, over 500 watts OC'ed. It sucks at single-threaded applications and is designed for multithreading, which most consumer apps aren't. It fits the G34 Opteron socket as a drop-in. Professional and workstation apps use multithreading, and in fact many are n-threaded, meaning as many threads as cores, or modules. AMD is depending on future compilers to take advantage of BD/Zambezi's multithreading; Windows 8 will have a better time with BD than current OSes. The architecture of BD is such that it is a future chip, albeit with some problems. Server farms are being located in cold climates to take advantage of colder ambient temperatures, so heat can be cycled off without so much active cooling. It may in fact be a good choice for servers, but not the low-level, home-style servers most of us think of. On the one hand it sucks as a consumer chip; on the other, it will probably be a great server chip for parallel processing, and AMD did put the GPU/CPU engineers together. BD may get better as they fine-tune the stepping and compilers catch up, but right now we see the problems for us consumers. It will depend on future software compilers to make BD into what it can be.
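For what "n-threaded" means in practice, here's a minimal Windows sketch (my own illustration, not anyone's shipping code): ask the OS how many logical CPUs it sees (8 on an FX-8150, one per integer core) and spawn exactly that many workers.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <windows.h>

/* "n-threaded" in practice: query the logical CPU count and spawn
 * exactly that many workers. Minimal sketch, not production code. */
static DWORD WINAPI worker(LPVOID arg) {
    printf("worker %u running\n", (unsigned)(UINT_PTR)arg);
    return 0;
}

int main(void) {
    SYSTEM_INFO si;
    GetSystemInfo(&si);
    DWORD n = si.dwNumberOfProcessors;   /* 8 on an FX-8150 */

    HANDLE *threads = malloc(n * sizeof *threads);
    for (DWORD i = 0; i < n; i++)
        threads[i] = CreateThread(NULL, 0, worker,
                                  (LPVOID)(UINT_PTR)i, 0, NULL);
    WaitForMultipleObjects(n, threads, TRUE, INFINITE);
    free(threads);
    return 0;
}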
Xterra's Avatar
I tend to agree that BD mostly took a step backwards in order to move forwards. AMD over-hyped the chip, and their vacuum-sealed NDA destroyed consumer confidence by not being able to *beat* Intel. This isn't an engineering failure; it's a flawed marketing and business model that, unfortunately, came from the top.
DaveB's Avatar
The i5 2500K has been available at Microcenter stores for $179 since they came out. So, no undercutting required.
bmwbaxter's Avatar
The 8150 was always priced above the 2500K for a reason. It is a "better" chip... if you have software coded for more cores and an operating system that fixes AMD's screw-up.
Archer0915's Avatar
http://www.techreaction.net/forums/s...&postcount=315

If true, I am still peed at AMD, but I will be happy with the competition.
neo668's Avatar
Thanks for the link.

"By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks."

If you can get a 40%-70% improvement that would be very good news.
hokiealumnus's Avatar
I put exactly zero credence in that article. You know why?

That dude doesn't have a freaking clue what he's talking about. You know how many "ad-dollars" Intel spends around here? Go ahead, guess. None. Nada. Zilch. Same with many hardware sites. Intel doesn't exactly need to throw around ad money to get recognized.

The CPU performed poorly in benchmarks, but did well in real-world tests. I outlined all of that from both sides and made it perfectly clear, as did many other sites. This dude is nothing but a fanboy with no proof. His 'proof' is that it performs well in real-world (read: rendering/encoding/etc) benchmarks. Well thanks dude, we know that.

I'll eat my hat if there is ever a Windows patch that produces "40-70% more performance". If there ever is a patch, then it may improve performance, but no Windows patch can turn water into wine.
Archer0915's Avatar
Hey, hold fire. I am just the messenger; I did not write the message. I am very skeptical of this myself; I did say "if true".

Personally, I think all of the testing done by everyone who had these processors for review was great, and even though all tests may not have been to everyone's satisfaction, when you have a time limit there is only so much you can do. When things trend toward the crap pile, just looking for something that shines can call the reviewer into question and cause them to lose credibility.

I agree with your views on this.
hokiealumnus's Avatar
I do just want to clear this up for everyone that sees my post - absolutely none of my rant was aimed at Archer. He's just the messenger and has nothing at all to do with the guy that posted that. Please don't misconstrue my post as having anything at all to do with Archer, it was all to the dude that wrote that nonsensical post.

Better?
Seebs's Avatar
\Off topic...
LOL... You best be careful hokie... Archer owns lots of guns and knows how to use them; he's been getting ready for the zombie apocalypse for a long time now.
/Off Topic

And on the subject of BD and that article Archer pointed to. I have Win8, does anyone want to send me BD and the "miracle" patch? I'd be happy to test it out for you guys.
Theocnoob's Avatar
The "It's meant for Win 8" thing is only the latest in a string of boy who cried wolf style moves.

Lie, delay, lie, delay, delay, cringe, release, make people cry, lie again.
vdgamer's Avatar
Is the only difference between the 8150 and the 8120 the stock core clocks, or is there something else? Because the 8150 is $60 more on Newegg.
buffalowings's Avatar
guns you say?.. hmm...any bows?
Theocnoob's Avatar
Major difference in cache. Like 3MB I think.
Also the 8150 contains 20% more broken dreams.
bmwbaxter's Avatar
Stock core clock is basically it, but since the 8150 is clocked higher it is 'probably' binned a bit higher. You might still get an 8120 that overclocks higher than an 8150; it is mostly luck of the draw.

EDIT: @ ocnoob - they have the same amount of cache. Both have 8MB L2 and 8MB L3.
vdgamer's Avatar
So the 8120 is the better buy, without question. What about the 6100? How does it compare to the Thuban 1100T and 1090T? They're priced very evenly.
dihartnell's Avatar
Hell, I hope they don't sell too many. I'll have to get to higher ground so when the polar ice caps melt, I'll be OK. Global warming could be an AMD plot.
bmwbaxter's Avatar
Well, if the Thubans beat an 8150 then they should easily beat a 6100.
Archer0915's Avatar
Cool! I knew it was not aimed at me, but I did not want anyone who did not click the link thinking it was me.
Archer0915's Avatar
Fiberglass and Al Grass hoppers, field, blunt, bullet and broad heads all the way up to 275gr. Also do gigs, swords, spears and splosives Yall come, ya hea? We gona gits dem zombees
Apht3rThawt's Avatar
Archer, you crack me up, buddy. Ben der, dun dat, got da wife-beater. Bring em on!

Seriously folks, think AMD64. First out of the gate with 64-bit instruction sets; Intel had to follow suit. This may be more of the same. Looks goofy, but may kick butt. AMD says they are working with MS on the Win8 implementation. Could be they are just ahead of the curve. Compilers have to push work through both 128-bit pipes in a module to get 256-bit processing; each pipe handles only 128 bits. As a consumer CPU it may not be the cat's meow, or it may get better as software catches up. Only time will tell. I would say stick with BD if you have one; otherwise it may be best to wait and see.
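To make the 128/256-bit point concrete, here's a minimal intrinsics sketch (my illustration only, not AMD's code): the same 8-float add written as one 256-bit op, which BD cracks into two 128-bit halves across both pipes, or as two independent 128-bit ops.

Code:
#include <immintrin.h>

/* Same 8-float add two ways. On BD, the 256-bit version is cracked
 * into two 128-bit halves occupying both of the module's FP pipes;
 * the 128-bit pair could instead come from the module's two cores,
 * one per pipe. Compile with -mavx. */
void add256(float *dst, const float *a, const float *b) {
    _mm256_storeu_ps(dst, _mm256_add_ps(_mm256_loadu_ps(a),
                                        _mm256_loadu_ps(b)));
}

void add128x2(float *dst, const float *a, const float *b) {
    _mm_storeu_ps(dst,     _mm_add_ps(_mm_loadu_ps(a),     _mm_loadu_ps(b)));
    _mm_storeu_ps(dst + 4, _mm_add_ps(_mm_loadu_ps(a + 4), _mm_loadu_ps(b + 4)));
}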

I was curious. Anybody gaming with the new Zambezis? How are they on games?
dihartnell's Avatar
Totally agree. Scrapped my plans to upgrade to Bulldozer. Gonna build an HTPC around the 3850 APU and socket FM1 instead.

Might go ahead and get a 980BE as a holding CPU until Bulldozer sorts itself out or Piledriver turns up.
neo668's Avatar
Please correct me if I'm wrong. Reading Hokie's review I had the impression that the 1100T was slightly slower than the 8150, i.e. the 8150 is the superior CPU, though only marginally. Somebody please put me straight. Thank you.
dihartnell's Avatar
Most of the time it appeared it didn't. Even if it was marginally better, why drop extra money for it?
neo668's Avatar
I've already given up on BD. I'll be getting the 1100T instead to replace the 840 I am using now. I won't need to get a new mobo and will therefore save a bit. So if the comment that the 1100T is better than BD is true, I'd be very, very happy.
bmwbaxter's Avatar
I went through your article again and couldn't find where, or if, you had mentioned which BIOS you used for the testing.

If you could let me know which one, that would be great.
hokiealumnus's Avatar
You had the correct takeaway. When benchmarking (in, say, SuperPi or WPrime), the 1100T scores better because it has two more floating point cores, but when you actually use the CPU (rendering/encoding/compression), the FX-8150 is absolutely superior. At stock it scored as well as the 2600K and gave it a good whoopin' overclocked.

So if I were buying to upgrade from a dual- or quad-core chip, I'd go with the FX over the 1100T. If I already had a 1090T or 1100T I'd stick with what I had. It's worth it if the upgrade is a jump (but NOT at $280...it's worth it at $245), but not if the upgrade is only from a Thuban. That just wouldn't be a strong enough difference to justify the upgrade.

0813 - It was the most recent available from ASUS & AMD at the time. I think it's the most recent on their site now too.
Theocnoob's Avatar
Really?

So they're charging SIXTY BUCKS for one bin higher? Should be 30 max.
This just gets worse...
neo668's Avatar
Thank you Hokie. Great to hear from the Master himself.

I think an upgrade to the 1100T from the 840 should be a bit of a jump. As I won't be moving to a new platform my only cost would be the price of the 1100T. I don't feel the whole move to the new BD platform would be worth it at the present moment.
Theocnoob's Avatar
I agree. The 840 is a (Rana?) core rather than a Deneb, right? It's too bad they don't have FX working on 8XX like they said they would.
Super Nade's Avatar
I would have to disagree with your assessment. The 8150 is an inferior chip to the 2500K, no matter how you dice it. You (AMD) don't create a problem and then propose a mythical solution that allegedly alleviates it.
neo668's Avatar
The 840 is a Propus. If BD worked on the 8xx chipset I would at least get the 6100.
vdgamer's Avatar
But if we take the overclockability of the 6100 into account and say that a realistic 24/7 OC for it would be 4.5-4.8 GHz, and around 4 GHz for the 1100T/1090T, which one would be the better performer at those clocks? The 6100 is 32nm, so it runs cooler and is more power-efficient too.
bmwbaxter's Avatar
Thanks.

Asus just doesn't show FX as a supported CPU yet on their website for the CHVF.
hooflung's Avatar
Does the 2500K or any K series have more than 4 real cores and VT-d (IOMMU)? No. So for certain things the K's are not, and cannot be, superior CPUs.

For developers like me, only the normal 2500 or 2600 fit the bill, and they lose out to the X6's and BD chips every time. Not everyone is looking for the fastest gaming machine or video encoder.
Archer0915's Avatar
Though it is an 8-core CPU, those cores have much less compute power than the iX w/HT processors. Yes, highly threaded software can take advantage of it, and those extra threads should be stronger than HT, but let us not forget that HT is a different scheme on Intel: one powerful core running 2 threads instead of 2 meh cores running 1 thread each.
Theocnoob's Avatar
I think the takeaway on BD at the moment is that, if anything, the 8-core, once (very) overclocked, is a good performer, despite being overpriced and a power hog. But only if the app uses 8 threads effectively.

The 6-core is a bit of a debacle. I'd say if you wanted six cores, stick with the devil you know and get a 1055T or a 1090T. I've seen 1055Ts at $140-145 lots of times; that's a good price for that CPU. Beyond that, I'd say it's the 8120.

Just being 32nm doesn't make things run cooler. Bulldozer may be 32nm, but it is far from power-efficient and far from cool-running. The whole thing is still kind of 'tripping me out'.
Super Nade's Avatar
Why limit comparisons of performance (or extrapolations) to either cores or clocks? Architectural nuances are completely irrelevant if there is no performance advantage to be garnered. This sounds eerily similar to Intel's Itanium argument, on how those chips were vastly superior to AMD's offering if a strict set of criteria were met. Itanium was a flop.

What kind of developing/coding do you do? Just curious. My work is mainly with scientific simulations (ODEs and FFT), which more often than not are limited by memory, before the CPU.
tangletail's Avatar
I haven't given up on Bulldozer just yet. When software catches up, it may dominate. But if not, who cares; I'll happily waste money on a company that was not scared of taking a huge risk. I will run it into the ground when I get the money to purchase one!

All those naysayers need to realize how the programs were compiled. Intel is only dominating because it has the most-used instructions, AND Windows 7 was optimized for its logical cores (try using it on XP; it's not going to do so well :P). The hype may have been a disappointment, but don't knock the core as ****ty or amazing until you actually own one. Judge it for yourself and actually buy the thing before believing the reviews. Everything on the computer happens in nanoseconds; you are not going to notice a four or five frame difference, or 400 points for that matter.

Price per performance: while it may be 20 dollars more than an i5 2600K, which is faster than Zambezi, that's four more real cores at five bucks each in comparison. If you are buying this for gaming, yeah, go with the i5 then... but if you do more than that, like I do, I will pay the extra twenty dollars for 80% of the performance and more REAL cores than Intel's flagship.
ratbuddy's Avatar
Wait, you'll pay $20 more for 80% of the performance? Hell, I got a 286 to sell you, it's like .2% of the performance, that's gotta be worth thousands
wingman99's Avatar
They are not real cores like Intel's flagship has.
mjw21a's Avatar
A shame we don't have Windows 8 beta benchmarks. As I understand it, Win 7 doesn't know how to handle BD's CPU resources.

I think I'll wait for this info before bothering with anything.

That's a pretty dumb thing to say. There are at least as many real cores in an 8150 as there are in an i7; hyper-threading gives 4 extra virtual cores. That said, AMD never should have listed the CPU as having 8 cores. Each module rather has two pseudo-cores; they're not fully fledged cores.

I'm waiting for Windows 8 and Piledriver before I buy. My current rig is perfectly capable of playing current titles at max detail.
hokiealumnus's Avatar
Sorry, but what's the point of Windows 8 benchmarks at this time? AMD will be at least one generation (Piledriver) ahead before that is even out in beta form. The earliest we'll see Win 8 is the 4th quarter of next year. While I do understand wanting to see if it fixes Win 7's scheduling issues, it seems like less than a great use of time at this point, since actually being able to use it that way is so far off.

EDIT - Re: Intel's flagship, I presume he was referring to the 980X/990X, which has six "real" cores, each containing integer and floating-point components, which would mean more FP cores than BD. At least I think that's what he's getting at; he'd have to answer that one though.
tangletail's Avatar
Perhaps you misunderstood me. Keep your 286; I will only pay a penny for that :/. You probably did not even read my entire post. The 20 dollars gets you more cores, which some of you might say are not real. But yes, I do need those cores.

The approach AMD has taken is, yes, debatable. But the integer cores still count as real cores, because that is where most of the actual processing typically takes place. The i7 does not have that many real cores; it has logical cores, which are basically a duplication of certain sections of the core without added resources. The reason I would buy a Bulldozer over Intel is that this design was made for splitting work and processing it simultaneously, instead of all cores getting work done when they can. Why is this an issue for me? I work with 3D rendering, and if all but a few buckets are finished on a hyper-threaded CPU, with each bucket on a separate core, then render times effectively go up by some ridiculous amount, because the program will not divide the remaining squares further; you would hit an endless loop, which triggers the i-series timing glitch. And god, is that annoying when it happens.
mjw21a's Avatar
Hokie, I think it would be a good indicator of future performance. As I usually factor future performance and upgradability into my purchasing decisions, I believe this has relevance...

It would tend to indicate whether the way Windows handles the architecture is the source of the performance woes, or whether it's simply a crappy architecture.
wingman99's Avatar
That is the flagship I was referring to. And BD shares instruction fetch and decode stages, floating-point units, and the L2 cache.

Thanks.
doz's Avatar
Still waiting on Windows 8 benchmarks... everyone keeps talking about them, yet I haven't seen them (or did I miss them? If so, I apologize).
ratbuddy's Avatar
I sure did. I still can't believe you'd pay more for less performance, I don't care if there's 8 cores or 50. If they can't beat the competition, they shouldn't be charging MORE money.
statikregimen's Avatar
This is the only one I know of: http://www.tomshardware.com/reviews/...x,3043-23.html

Realistically, a Windows scheduler improvement would only help in lightly threaded applications. Anything that utilizes the full 8 cores is still going to run into the normal bottlenecks of shared resources. Single-threaded apps wouldn't really benefit at all, I don't think.
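What a scheduler fix amounts to can be approximated by hand today with affinity masks. A minimal sketch, assuming logical CPUs 0/1, 2/3, etc. pair up as modules (an assumption about how Zambezi enumerates, not a Windows guarantee):

Code:
#include <windows.h>

/* Keep two busy threads on different modules so they don't share a
 * front end and FPU. Assumes logical CPUs 0/1, 2/3, etc. are module
 * pairs; that is an assumption, not something Win 7 promises. */
static DWORD WINAPI busy(LPVOID arg) { (void)arg; for (;;) {} }

int main(void) {
    HANDLE t = CreateThread(NULL, 0, busy, NULL, 0, NULL);
    SetThreadAffinityMask(GetCurrentThread(), 1 << 0);  /* module 0 */
    SetThreadAffinityMask(t,                  1 << 2);  /* module 1 */
    Sleep(5000);   /* let both spin briefly, then exit */
    return 0;
}

A smarter scheduler would do exactly this pairing (or its opposite, packing for power) automatically.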
hokiealumnus's Avatar
The only one I've read is Anand's very short gaming comparison, where it did show an improvement. Check out the conclusion page of his review.

EDIT - I see Tom's did too. Doesn't look like too much of a change moving to Win 8, based on those two sites.
Xterra's Avatar
The Windows 8 performance increase seemed like a pipe dream, and now it's turning into one.
SupaMonkey's Avatar
When I first voted in this poll I voted 5, but after reading for the last week I want to change my vote to 1. I was going to buy a Bulldozer, but now I'm going to wait and buy an Ivy Bridge.
bmwbaxter's Avatar
Win 8 is still just a beta, so the final product might be much better.

But that doesn't really help anything, since BD is still a failure for most of us now.
devilDogbert's Avatar
After all that hype, this is seriously disappointing. And I just got a new AM3+ mobo to boot...

SupaMonkey's Avatar
To be fair, there wasn't any hype by AMD.
bmwbaxter's Avatar
Well, there was an 8.429 GHz CPU validation and the Operation Scorpius comics. But besides that, there was only an airtight NDA.
SupaMonkey's Avatar
Yea, fair enough. I forgot about the comics; I only remember not knowing ANYTHING until the release date.
hokiealumnus's Avatar
The non-AMD hype is (IMHO) why this CPU came out to such a bad reception.

I still think it's a good performer for multi-threaded applications at its $245 MSRP. At $280, no way would I recommend this CPU to anybody. Get the 2600K for a bit more or save some cash and get a 2500K to perform close to as well. This CPU is worth $25 more than the 2500K if you use multi-threaded applications, but it is definitely not worth $60 more. If you already have a Thuban, don't bother upgrading unless you want a new toy. Even then, don't upgrade until the price comes down to where it should be.

Actually, I think I'll edit that into my first comment so people see it at the bottom of the review.

EDIT - Added that to the first comment. I will also add an author's note at the end of our review pointing to that comment. I'm very disappointed in the price point of these things. If it's the latest and greatest, sure, crank the price up at launch...but if it's only the latest, you're just gouging suckers and early adopters.

EDIT II - Done.
EarthDog's Avatar
But WHO is doing the gouging? Not AMD. They set their price at $245...so should we hold AMD accountable or the retail chain?
ZapTap's Avatar
I agree. And the price should go down. but only time will tell
hokiealumnus's Avatar
That's true. No, we shouldn't hold AMD accountable for it. I've edited the comment to reflect the following:

So while you're right that we can't hold AMD accountable for their resellers' gouging, it's still selling at those prices and should not be purchased at that price point. Once retailers bring prices down to a reasonable point, Bulldozer has my blessing.
muddocktor's Avatar
Hell, you can't even find any 8150 or hardly any 8120 procs in stock right now either. To me, it looks like not only is the performance low, but they can't even manufacture enough to do a decent hard launch of these furnaces.
vdgamer's Avatar
Well, let's hope Ivy Bridge delivers. As of right now it seems like the only reasonable upgrade for me; not that my 920 @ 4 GHz is slowed down by anything, I just wanted BD to live up to all the hype.
Xterra's Avatar
Of course not, but considering the inflation of the MSRP, no one can land a BD chip for $245.
pinkfloyd48's Avatar
I wish they'd hurry up and restock; it's supposed to be a cold winter.
bassnut's Avatar
NCIX has them in stock now... 8120s, 6100s and 4100s.

http://www.ncix.com/products/index.p...inorcatid=1400
g0dM@n's Avatar
Ridiculously bloated prices... no way would I pay that...
Neuromancer's Avatar
Hah hah this reminds me of an old joke.

An Elderly couple goes to a restaurant and after they finish their meal the conversation went as follows.

Waiter: "How was everything tonight?"
Old Man: "Absolutely terrible, the food was cold, i think the fish was rotten, the vegetables were burnt and I found hair in everything!"
Old Woman: "Yes, and the portions were too small!"


So the commentary so far is: it sucks, no one should buy it, it's worthless... and why can't I find it in stock?

ratbuddy's Avatar
Think it's just morbid curiosity. Yields must really stink on those things.

Meanwhile, Ivy Bridge has entered mass production...
tangletail's Avatar
Why are the prices so freaking bloated anyway? The prices are all different. Newegg sells the top model for $274. NCIX is like $287 plus shipping, if they charge it. Amazon may be around that point too. And AMD set the price point for the 8150 at $245?!

Dang you retailers for feeding off the demand D<


Because no one seems to want to buy a chip for themselves if the reviews say it's crap. I unfortunately don't have the money for one just yet. But I do intend to buy one, unless AMD does to BD what they did to the Phenom I (which we could all say was the worst thing on silicon since the first Pentium). I honestly don't see the point of believing a review to the exact dot just because it says the chip is a few points, fps, or seconds behind a 2700.

The difference is minuscule and honestly you will hardly notice it. Though I would prefer that AMD fix BD up, I am not really disappointed.
Apht3rThawt's Avatar
The price is per thousand. That is usually the way they set prices for OEMs, not retail. $245,000 per 1000 units. Get 999 of your friends to go in on a batch and you get it for $245. That is still pretty high though, IMO.

Here, fresh off the press. All about Bulldozer. http://arstechnica.com/gadgets/news/...ting-debut.ars
Janus67's Avatar
There is a very large difference in performance/watt and heat generation though, especially in comparison to the 2500K and 2600K.

I still don't think I would be able to recommend Bulldozer at $245-250 when it only sometimes competes with the 2500K. If it were around $200, I would consider it for people who heavily favor AMD and are only going to be doing heavy multitasking (no single-threaded applications).

Oh, and I searched the entire thread and didn't see anyone use this term. I shall now coin this chip *drum roll*



Dulldozer



Looks like Dulldozer will be Phenom I style; may as well wait for Piledriver, which we can hope will be akin to a Phenom II and finally have the performance to beat a Phenom II.
wingman99's Avatar
I'll coin this chip BadDozer.
I.M.O.G.'s Avatar
The stated price is not per thousand. The MSRP is for a PIB (processor in a box). $245 is what AMD suggests. Retailers charge what they want based on availability and demand. They know many early adopters aren't that price sensitive, and I would guess availability is limited. By Christmas I would expect to see prices drop closer to msrp.
bmwbaxter's Avatar


the price isn't per thousand units.


Also, why does everyone think that because Bulldozer isn't as perfect or efficient as we had hyped it up to be, this is the end of AMD? AMD still has the GPU market in a good place, and BD isn't all that bad. In the OEM market it is likely to sell well, and that is a much larger portion than us. AMD also has the Llano APUs going into the mobile market, and from the reviews I have read they are excellent!

Before someone accuses me of being an AMD fanboy, save your time. I am far from an AMD fanboy; I have up until this point only owned Intel and NVIDIA products. Yes, I bought BD, but I purchased everything after the NDA was lifted, so I am not some poor sucker trying to make myself feel better about money already spent. I knew the performance and bought BD anyway.
wingman99's Avatar
bmwbaxter, how do you like your BD? Is it doing well for you? I read in one post that it's having trouble gaming.
bmwbaxter's Avatar
I don't game, I bench. Gonna be running it under a full pot of LN2 next weekend, going for a nice CPU-Z validation.

Just from using it, though, I haven't noticed any downsides to the CPU. For everyday use it is just fine; that's why I am confident it will do fine in the OEM market.

As for how it does gaming, check this out. It's not terrible, but it's not great; it performs about where it is priced. Power consumption is going to be its biggest problem, I think.
Apht3rThawt's Avatar
Sorry guys, when they set a price it is for OEMs, and the price was per thousand. Here's a screenshot of Intel and AMD prices; notice the 1k tray prices. That's 1000 pieces. Why jump on me when you can see for yourself? I've been in this a long time; I've put in over 500 CPUs myself. Like a megabyte-and-a-million kind of thing.

Janus67's Avatar
A couple key parts of the Ars article linked above that I agreed with:

That is pretty much what I have been thinking. People keep saying that this chip is made for the future; the future is going to depend more on how much, or whether, developers move away from single-threaded coding and write code that scales to multiple cores. It also seems that Piledriver would/could be a much better solution, as each core should have its own FPU to assist in calculations (or put them off to a discrete card... although for games that isn't much of an option unless you had a card akin to a PhysX card just for calculations).

tangletail's Avatar
I honestly don't understand WHY developers have not moved on to scaling yet. This technology has been out for years and has pretty much been perfected, yet only 3D software and some design suites actually use all threads.

It is honestly not hard to code for, and the compilers will do most of the work for you. I can understand that most of today's basic applications won't see much benefit. But heavy work like games, development, computing, etc.? I can understand that finding errors could be troublesome when work spreads across processors, but damnit, move on to the future! Technology advances double every year, but software ALWAYS remains the same, or changes every five or six years.
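How little "coding for scaling" can take, as a minimal sketch (my own illustration): with OpenMP, one pragma fans a loop out across every core the OS reports.

Code:
#include <stdio.h>
#include <omp.h>

/* One pragma is all it takes to spread this loop across every
 * core the OS reports. Build with: gcc -O2 -fopenmp scale.c */
int main(void) {
    enum { N = 1 << 20 };
    static float data[N];
    float sum = 0.0f;

    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        data[i] = i * 0.5f;
        sum += data[i];
    }
    printf("threads available: %d, sum: %g\n", omp_get_max_threads(), sum);
    return 0;
}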
Apht3rThawt's Avatar
On the CPU price per thousand: if I'm reading the price list wrong, let me know. At $245 per unit in thousands, a price of $275 gives a markup of ~12%; that seems about right. I never saw anything that said the price was MSRP. Point it out to me.
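Quick sanity check on that markup, which comes out the same either way you read the price list:

Code:
#include <stdio.h>

/* Markup math. $245 is AMD's figure (whether you read it as MSRP or
 * as the 1k-unit tray price); $275 is the street price quoted here. */
int main(void) {
    double amd = 245.0, street = 275.0;
    printf("markup: %.1f%%\n", (street - amd) / amd * 100.0);  /* ~12.2% */
    return 0;
}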

You would think that with the cool reception to FX/Bulldozer they would hit the lowest price point they could, but they have a specific formula for pricing CPUs. It has to do with consumer welfare (money to spend), competition, quality of product, inventory of previous models, product life, return on investment and so forth. Here's a rundown. A bit much, I know, but it shows how complicated the pricing of CPUs is.

http://econ-www.mit.edu/files/6981
ftp://128.151.238.65/fac/MSONG/paper...-song-rev3.pdf
Xterra's Avatar
$275 for BD or $179 for a 2500K. It's purely embarrassing.
Apht3rThawt's Avatar
Yeah, the price is high. They (AMD) are talking with Microsoft about Windows 8 and better performance with FX/BD. Hopefully something will be worked out. At the present time the CPU is like a time machine; it may work better in the future. Part of the problem is that it was largely designed by machine, not custom-made like Sandy Bridge, and the lack of humans shows a lack of art. AMD doesn't have the money to hire a lot of top-notch engineers, nor did they have the time to put into it. Other companies with more money hire the best engineers: Google, MS, even Facebook. Those companies have the best antivirus engineers too, leaving third-party companies to hire what's left. GlobalFoundries had their problems, and it left AMD holding the bag (of CPUs).

"Now I am become Death, the destroyer of worlds." The Bhagavad Gita, a great story. Oppenheimer quoting Lord Krishna quoting Shiva.
tangletail's Avatar
One would think that with machines there would be less error in the process, and smaller parts placed with more precision. But that needs to be worked out at GlobalFoundries. Maybe if AMD actually made their own CPUs for a change they would be better???

But I can see why they can't. Their location would raise public concern, and the amount of pollution would definitely rile up the neighbors in Arlington, Texas.
bmwbaxter's Avatar
I am pretty sure AMD owns GlobalFoundries...

EDIT: as in, AMD is GlobalFoundries' parent company.
wingman99's Avatar
AMD sold off GlobalFoundries.

Quote:
GlobalFoundries Inc. is the world's third largest independent semiconductor foundry, with its headquarters located in Milpitas, California. GlobalFoundries was created by the divestiture of the manufacturing side of AMD on March 2, 2009

LINK:http://en.wikipedia.org/wiki/GlobalFoundries


What Does Divestiture Mean?
The partial or full disposal of an investment or asset through sale, exchange, closure or bankruptcy. Divestiture can be done slowly and systematically over a long period of time, or in large lots over a short time period

Read more: http://www.investopedia.com/terms/d/...#ixzz1bTxkKIS5
bmwbaxter's Avatar
I stand corrected. Thanks for the info.
ratbuddy's Avatar
Be careful. It's a bad idea to start reciting that as if it's proven fact; it's anyone's guess whether that interview is real or whether the guy in it actually knows what he's talking about. I also have my doubts that Intel didn't use any automated layout tools for Sandy.
Apht3rThawt's Avatar
Of course Intel did. But you have to agree that at this point Intel builds a cleaner-running CPU. I think part of the reason is the engineers. I used to know an engineer at Intel who worked on one of the CPU projects. If he was typical of Intel talent, I can see how they design good chips. Plus, Intel is very aggressive and spends (a lot more) money on talent and R&D.

Disclaimer; I'm not an Intel fanboy. My daily driver is AMD.
g0dM@n's Avatar
I always try to favor AMD to keep the underdog in business and b/c I always get good value. I lean towards Intel in situations like this, but I already committed to AM3+. I still can't get over how upset I am that I invested in an AM3+ board. I didn't have to leave my 790XT board behind, as it was very stable for my needs.

Anyway, I love Windows 7 and don't want to switch, but if at some point in the future BD + 8 becomes a good combo, I may just settle for it.
Xterra's Avatar
If Piledriver somehow does not give the desired yield, I can see Apple swooping in for the kill and purchasing AMD. It's very embarrassing that AMD has to go to TSMC to fulfill market demand because its main fab, GlobalFoundries, is lacking in almost every regard. This, I suppose, explains the sacking of a couple of executive VPs a month ago.
terran2k's Avatar
So is it poor production yield from GlobalFoundries that's causing the BD performance issues? Is it supposed to run in the 4 GHz+ range by default? I know I've read that it's supposed to run in the 4.2-4.7 GHz range.
Apht3rThawt's Avatar
I understand it is 32nm wafer production that is putting the crimps on BD. AMD has a WSA (wafer supply agreement) with GlobalFoundries that specifies clean, usable wafers. GlobalFoundries is ~2 years behind Intel in its ability to make clean 32nm wafers. Intel is transitioning to a smaller process and will put AMD even further behind. AMD was buying any 32nm wafers from GF, but has since required only clean wafers, hence the bottleneck.
wingman99's Avatar
Intel has transitioned to 22nm.
Bobnova's Avatar
Intel is transitioning. The as-yet unreleased SB-E is still 32nm.
mjw21a's Avatar
Thing to remember: AMD has always been at least one full process shrink behind Intel. Problem is, this time around Intel has Tri-Gate tech as well, so I think it's fair to say they're at least 1.5 years ahead of AMD on manufacturing once they move across to 22nm.

Heck, as far as I know the best AMD can offer is 28nm on TSMC tech, and that's only for their GPUs...
Dolk's Avatar
I hope you guys know that 32nm isn't a walk in the park, and going smaller than this is not any easier. We are close to being only a few atoms thick in certain wall structures of the transistor. I can't see how AMD can keep up with these price schemes on their CPUs.
ratbuddy's Avatar
Especially since they went fabless. I'll never understand why they did that; it seems like it can only decrease their potential profits. Yes, risk goes down, but the upside goes way down as well.
mjw21a's Avatar
I know, but there's no getting around AMD's main problem, which is basically needing to have the amount of money to throw at the issue that Intel has.

Good engineering can only go so far, you need the $ for R&D and machinery..... No one in the industry is in the position Intel is in. Good for them, bad for their competition.
wingman99's Avatar
I have a BIOS for my motherboard supporting Intel's 22nm CPUs.

LINK:http://www.gigabyte.us/products/prod...?pid=3863#bios
Apht3rThawt's Avatar
Sorry, hard to keep up. Check out these papers. Quantum circuits: your next PC on a pinhead.


http://pubs.acs.org/toc/nalefd/0/0
http://www.sciencedaily.com/releases...1206085833.htm
zizux's Avatar
The fact is that soon they (Intel & AMD) won't be able to gain anything by die shrinking. There are physical limitations that will be reached, at which point the only other way to "grow" the technology is innovation and architecture improvements.

As I see it, Hyper-Threading was pure luck for Intel. When it first came out it seemed like a Hail Mary, as AMD had the crown in both architecture and performance. Why and how they lost that boils down to the fact that Intel has more OEMs buying their CPUs and selling computers. I don't know the numbers at all, but I wouldn't be surprised if Intel sells 100 CPUs for every 1 that AMD does. Thus Intel was able to go brute force by dumping tons of money into R&D and tech. Then we got C2D, C2Q and then the i's.

Thus Intel has the advantage for no other reason than marketing. Even when the P4 sucked, they still sold tons of them. Marketing. When was the last time you saw an AMD commercial on normal TV? I honestly don't think I ever have. The last time I saw an Intel one was yesterday.

Even if Bulldozer was everything we all hoped it would be, AMD wouldn't be in a vastly different position. They would have probably sold more CPUs to enthusiasts, but I don't think that would have impacted the overall numbers very much. Which apparently doesn't matter anyway, because they are still sold out. You can't sell 1000 if you only have 100.
Xterra's Avatar
The consumer wants the best price : performance ratio. When the retail markup puts a $245 MSRP chip at $275... people will go to the alternative: a 2500K @ $179. It appears Intel has taken consumer confidence instead of our beloved AMD. Do I hear an Apple buyout of AMD soon? Methinks so.
tangletail's Avatar
Still trying to get the money for a BD.
Really, I don't know why AMD sold GlobalFoundries. Had they kept an iron grip on it, they could have had them perfect the 32nm process. Also, a die shrink can only carry performance so far, and I personally wouldn't mind BD being put on a 40nm process for better yields. 32nm is quite small, and I think Intel may run into the problem of electrons suddenly jumping around transistors into places they are not supposed to be, because surprisingly they CAN freaking do that.

But who knows; TSMC may have perfected their development. Also, I highly doubt that Apple will ever buy AMD. Apple so far has been using Intel to drive their machines, and using AMD for their desktops and laptops would mean a complete recode of the OS. As well as the fact that if they were used just for notebooks, it would be a waste.

I would not say that AMD is out of it just yet! All companies have their bad moments, but they will prevail sooner or later. Even if they do fail in the CPU market, they have the GPUs. Their GPUs are becoming more popular as gamers realize that most of the stuff on an NVIDIA card does not really give you any special luxuries, since people don't tend to use it. With OpenCL and the number of cores they have, you would see some pretty god-like performance.

SUPREME ULTRA MEGA EDIT: It seems that the actual owners of the FX-8120 and FX-8150 adore the thing, while most of the bad user reviews are fakes, or people who don't own it running up there after reading a review and going "LOL, do not buy." Hm... really, AMD should try pimping it out as hard as Master Chief of Halo, and I bet they could pull in the large corporations as well. One of the reviewers also said that to see the real potential of the chip you need to build a Project Scorpius system; that means pairing the FX with a 990X chipset and an HD 6000-series GPU. Any confirmations?
manu2b's Avatar
I have no idea about Apple and AMD.
What I know is that Apple has been using Intel CPUs since January 2006, and that PowerPC Macs were Motorola-powered.
Mac OS X was developed for Motorola chips and recoded for x86.
You can run a fully functional hackintosh on an AMD chip, so I don't believe you would even need a recode.
Apple already works a lot with AMD, as they exclusively use AMD Radeon GPUs.
With Apple weighing in at something like 5% of the market share, that could be a good thing for AMD...
hokiealumnus's Avatar
Our review used a Scorpius system (Crosshair V Formula, FX-8150 & HD6970). Our numbers were right on par with everyone else's. Price-to-performance, I concluded it was a good chip for AMD's price of $245. After they came out, I added a caveat that it is in no way worth $280 (unless you just want the newest CPU and are willing to pay the premium).

EDIT - The whole system naming thing is just a gimmick; there is no tangible added performance from using all AMD to get the name that I've ever seen. AMD would only hurt themselves if that were the case, because they'd have to write the gain into their GPU drivers, which would hinder performance on Intel systems. I think it's a very safe assumption that they'd rather sell more GPUs to the entire market than artificially limit themselves to support the CPU/chipset division.
manu2b's Avatar
Do you think Anandtech is faking results?

http://www.anandtech.com/bench/Product/288?vs=434

Oh, but I forgot, it will work much better with Windows 2028...

EDIT: that's compared to a 30% cheaper CPU.
SupaMonkey's Avatar
Tbh, no one seems to mention this, but I think it is important: the power draw on that thing is insane. If I bought it, I would have to upgrade my 550W PSU just to run it with one GPU. Bulldozer and CrossFire might be a nice way to heat the house in the winter, but not much fun in the summer.
hokiealumnus's Avatar
Lots of people have mentioned power draw. I don't mind it as much as performance, but there are those that value it more than me. Power draw was listed in the article and I did mention in the first comment no one without free electricity would want to fold with it.
storm-chaser's Avatar
Bottom line is most of these reviews are legit, and while you may be having a hard time accepting that, it's the truth. You can go back and see how many other CPUs they have rated to get more perspective, but most of these reviewers have been in the business since the K6-2 days and have reliable benchmarking tools. Fact is, the FX underperforms when it comes to single-threaded applications despite the high clock speed. My feeling is that AMD fell short of the mark with this processor.
]-[itman's Avatar
In tangletail's defense, I think he is referring to the consumer reviews on sites such as newegg, tigerdirect, etc., not to the professional reviews to which you are referring. I also second his comment in saying that it seems most (read: not all) who actually have the platform have been pleased with it and are having tons of fun tweaking it. With that said, I really hope AMD can get some improvements on track with Piledriver and successive updates.

Edit:
Also, something interesting: this site is doing extensive testing of BD under Linux and has implemented some of AMD's compiler optimizations in the process. The results seem pretty good, and apparently there are still more optimizations to implement.
http://www.phoronix.com/scan.php?pag...ulldozer&num=1
manu2b's Avatar
I have a lot of fun with my PhII.
But no one with common sense can say BD is not a failure. That is a fact.
And I don't believe Tangletail was talking about Newegg...

EDIT: better to leave this discussion. Between AS5 being junk and BD being a success, there might be a cure for bad faith...

EDIT BIS REPETITAS: nVidia makes great cards. You see my rigs? Not enough money to get a 580... but if I had, I would grab a pair of those!
mjw21a's Avatar
I'd not call BD a failure. I think it's the first step in the direction they need to go to succeed; however, I don't really intend to upgrade until Piledriver, possibly not until 2013, as what I really want is the successor to Trinity. The next-gen APUs with a fully vector-based GPU component sound like one enormous amount of potential, and possibly the biggest leap in general-computing performance I've ever seen.

From that standpoint I believe AMD should ignore x86 floating point altogether and keep trying to improve integer performance. After all, that's the only part that will eventually carry over in their future heterogeneous computing plans.
Angry's Avatar
Anybody think this may be a repeat of the Phenom I to the Phenom II phase where everything just got better with the PhenomII?
muddocktor's Avatar
I have actually been thinking very much along those lines myself. At least, I hope it works out that way for AMD.
DaveB's Avatar
How can it not be termed a failure? Reduced IPC, insanely high power consumption, beaten in most benchmarks by current Intel Sandy Bridge CPUs, and even in some by last-generation Intel CPUs. The only hope is a Phenom I to Phenom II transformation. But by the time that can happen (6 months minimum), it will be confronted with Ivy Bridge from Intel and fall even further behind. AMD's more-cores design philosophy just doesn't cut it with the way software is developed at this time. Single-threaded performance is way too low. I was hoping for a better result, but this is bad for AMD and bad for the consumer also.
Xterra's Avatar
I'd call it a botched marketing campaign and an engineering triumph. With that said, the consumer market will make BD into a failure because it doesn't beat Intel on white paper.
ratbuddy's Avatar
Sorry, but I need clarification.. How does a chip that uses more than double the transistors and draws way more power yet performs worse equal an engineering triumph?
Dolk's Avatar
Simple: AMD did something that no other company has attempted, and they executed pretty damn well. This could have been a lot worse than it looks if they had stuck with their older architecture revision of BD.
ratbuddy's Avatar
Yeah but the old one worked better
Dolk's Avatar
Not really. The old one was outdated. Module Architecture is exactly the same thing as HT Architecture, but different in every way.

Like a Gold Fish and a Cat Fish are the same but different.
Janus67's Avatar
So what you are saying is that Bulldozer is a bottom feeder/glass licker like a catfish?



manu2b's Avatar
I can walk on my head and spout rhetoric....
Dolk's Avatar
Funny... What I'm saying is that AMD is going to play Intel's game of core manipulation.

I wrote an article explaining all this.
Janus67's Avatar
George, you know I'm just playing with you. I'll give that article a read tonight if I have some time, I'm really interested in how it all works.
bmwbaxter's Avatar
The consumer market is the only thing that will keep BD from being a failure.

"8 cores" will sell better than 4c+4t. It doesn't matter if it is slower; the average user won't know that, and they definitely will not know what hyperthreading is.

As for OEMs, they are not going to be paying an inflated price for BD, so I would expect it to perform admirably for them.
someinterwebguy's Avatar
If I didn't already have a nice, stable OC'd 1090T-based system, I would still end up buying a BD CPU - once the prices come down to realistic levels. This, despite my plans (coming to fruition soon) to build an Intel-based rig.

As others have noted, OEMs will easily sell BD simply because they can say "It has 8 cores while Intel's CPUs only have 4/6".

Most people buy computers without doing much research, let alone scouring forums/test sites for benchmark data.

When I gave a friend my old AMD Phenom II X4 940-based system, I told her "It has 4 cores running @ 3GHz and 8GB of RAM, etc." Her reply: "What does that mean in English?" "It's really fast!" I answered. That was enough for her, and arguably most others.
Khan's Avatar
Having sold consumer electronics for a number of years: if the price is right, BD will sell well. I remember when the P4 came out, it would still outsell AMD x64 systems because of Intel marketing, even if you explained that the AMD was actually faster and more efficient than the P4.

Power consumption is not a factor for the average consumer; they never turn power savings off and they never leave their PC on 24/7. BD is actually pretty fast for everyday use and will most likely be much faster than what they are replacing, leaving happy customers anyway.
tangletail's Avatar
The consumer market causing the downfall of BD is only an effect. The cause is the benchmarks done on the CPU, which affect the decisions of customers who still have the mentality that benchmarks are everything and a CPU must be better than all the others. I actually realized this recently when I began to ponder the real concepts of a benchmark, and it completely blew my mind. Seeing how a Core 2 Duo and an Athlon X2 still kick butt in the game world, I see how it works.

If you surf the net a bit, you will find that people will immediately say BD is a fail core. None of them has actually bought it, but they have seen the benchmarks, and they are looking at it as what it was supposed to be, and not what it is.

Explain to someone that a benchmark covers workloads you will rarely if ever run, and should not be the only guideline when buying a CPU, and they will throw a fit. Tell someone that an average FPS figure speaks to speed stability and not speed as a whole, and they will scream. Tell someone that in real-world scenarios a certain chip will pull through just fine, and in your peer's mentality the benches still say otherwise. Benchmarks are seen as a marketing method by some average consumers, who ignore them completely. For the more experienced customers, who have been in the game for a while, the benches mean everything. So who typically makes the good decision?
ratbuddy's Avatar
Benchmarks are not just a 'marketing method,' they are a measurement of the performance of the processor at certain tasks. The informed consumer will look at these performance measurements, along with price, cost of ownership (power use and mobo price), and possibly brand preference.

The Athlon X2 does not "kick butt" in the game world, at least not in the games I play. IPC is king, and Sandy is the only way to go for me. Yes, it wins in benchmarks, because it also wins for my actual uses, and not by a small margin. It's simply disingenuous to claim that looking at benchmarks is not an important part of the CPU selection process. Would you really suggest that folks in the market for a new processor just go with whichever one comes in the nicest-looking box?

I'll quote this part again, because it's important:

Looking at what it actually is... Overpriced, underperforming. Mince words all you like, but that's a simple fact. The 8150 should sell for about $200 based on how it performs.
Apht3rThawt's Avatar
Marketing and what's on the shelf are what sell computers. Does it play games? Does it play videos on my 52-inch TV? Even those points are losing out to consoles, internet appliances and TVs that hook up directly to the internet. Seriously, how many people do you know that build their own? Other than techs and my family, I don't know anybody else who does. I was told that in Boise, Idaho, Newegg will be my friend because there are hardly any places to get parts, and that's a fairly big town. 90% of a computer tech's work is to run interference on viruses and build networks, not build systems. Somebody come up with stats for custom builds as opposed to off the shelf. Netbooks, tablets, consoles... most consumers are fine with those choices, not with whether the CPU is 6-core or 8-core. Price and function.
mjw21a's Avatar
Not really; the problem is calling it an 8-core when it's not in reality an 8-core. Marketing won out over common sense, it would appear.

Likewise, it's the server workloads where it will truly shine.
Janus67's Avatar
But it won't, unless heat output and power draw aren't an issue, and in a data center they are always an issue.
Apht3rThawt's Avatar
Iceland my friends, Iceland. Other cold climates areas are also drawing data center interest. Central Oregon has a Facebook data center in Prineville, climate was a factor; long, cold winters. Basically, just pump ambient air through the servers and leave the windows open.
Xterra's Avatar
Most data centers filter their air

On another note, power draw in terms of efficiency, not heat, is the problem in data centers. Using less power on all machines in house saves money, and by making hardware more eco-friendly, data centers can maximize gains and potentially add more machines within their previously allotted power limits.

Investing in "cold climate" data centers is a moot point. Using already existing data centers in regional metropolitan areas is more cost effective than building a data center in the Arctic Circle just because one of two enterprise CPU manufacturers has high power draw. If anything, that logic would encourage data centers to adopt Intel products over AMD products.
Dolk's Avatar
All you Data center people read before continuing the argument:

http://realworldtech.com/page.cfm?Ar...WT062011114950

And don't just read the title and come back, read the whole thing.
Xterra's Avatar
When I read this I nearly keeled over. I <3 George.
EarthDog's Avatar
I work in a Data Center...

Anyway, to discuss the context about data centers: I recently completed a study of our own data center environment. This included using ASHRAE standards. As I'm sure most know, there is a huge push for the 'green' effort. Because of this (and other factors), temperature thresholds have increased over the past few years, even since that 2008 publication I linked (there is a 2011 one available). With that said, Apht3rThawt is right in that some data centers are beginning to use ambient outside air in order to curb cooling costs. You do not have to move to Iceland either, but the cooler the environment, the less you have to spend on HVAC. And certainly, like any other intake to a data center, this air needs to be filtered and evacuated.

It's going to cost more money to keep an 85C processor at 30C than it is to let that bad boy run at 85C... no?
Janus67's Avatar
That is a pretty interesting read. In the data center that we have at the OSUMC I haven't seen anything there that would appear to be watercooled (doesn't mean it doesn't exist, but I don't think they have any applications that require that sort of server to begin with).
EarthDog's Avatar
Our mainframe used to be watercooled. Then they figured out that it was more expensive to run and maintain that watercooling than it was to let the ambient air do its thing.

Oh, and what about the LOE (Level of Effort) and cost to watercool 150 physical servers? How the heck do you route tubing through racks? Not to mention the much higher risk compared to air. It's just not a viable option in today's data centers to use water. But cooler ambient air is getting huge financial backing, and in cooler environments it's easily viable to drill a large hole in the wall and pump ambient, filtered air into the system to take the load off of the internal HVAC.
Xterra's Avatar
Or have chips that run natively cooler to begin with. I just don't see the argument for having hotter chips in a data center when there are cooler alternatives that run faster. How can BD excel in server workloads when it runs hotter and gets outperformed by cooler chips to begin with?
Apht3rThawt's Avatar
Of course they do. Facebook is building a data center above the Arctic Circle in Sweden. We're going to see more of those, I'm betting. Colder air is drier also.

http://gizmodo.com/5853819/facebook-...hilling-effect
g0dM@n's Avatar
While it's awesome to use arctic air for cooling, heat is heat... the CPUs and other electrical parts are still producing the same exact amount of heat into the planet. It doesn't matter where you have it producing heat. Now, power generated by dams is another thing...
Apht3rThawt's Avatar
That's a discussion for another thread/forum, but I live in an area with lots of dams, and I miss the fish. A dam upstream means less water downstream. I was flooded last spring, and one reason was irrigation dams downstream backing the water up. The river I live on (the Silvies River, Oregon) doesn't even get to the lake it used to feed. Nor can I float it because of irrigation gates, fences across it, and such. Also, they are putting in 300 windmills on Steens Mountain, an incredibly rich and sensitive desert area, with a huge transmission line running across beautiful and unique landscape. No, the answer is to use less power, but with 7 billion people on the planet that expect TVs and computers there is no escape. I feel bad with my multiple 500-watt computer systems running.

Right now I can feel the heat from my system as I type this. High-TDP CPUs/GPUs suck and we need better. You're right, heat is heat. My next project is a low-TDP system.
]-[itman's Avatar
Yes, but I believe the point is that you then don't have to use so much energy in cooling the things. If you take a server you can cool with ambient air in a cold environment and move it into a warm environment, you are (for sake of simplicity) doubling your energy use because you have the same energy produced by the server, and now you need the same amount of energy to cool the server because of the lack of cool ambient air.

As far as Bulldozer goes (back to topic), we actually don't know how well Interlagos is going to perform. Yes, we have an idea from Bulldozer, but to me it seems that where Bulldozer does do well is exactly what a lot of servers need. Plus, if the chip's large power use is in fact seriously affected by an immature process, then having lower-frequency chips should help a lot with leakage, so with more cores in a heavily multi-threaded environment it may turn out to be a good option. I'm not saying it is, I'm just saying I want to see Interlagos in a server environment before I pass judgement.
Cyrix_2k's Avatar
read below
It takes a lot of extra energy to remove the heat generated by the servers, since A/C is nowhere near free. Back at my prior job, we used to have a "datacenter" of maybe 30 populated racks. It took something like 10 tons of A/C (~35kW of heat removal) running 24/7 to keep it cool. Move to a cold environment and that power bill goes away - that's a large cost saving as well as a huge environmental win.
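For scale, here is the arithmetic behind those numbers (a back-of-the-envelope sketch; the COP value is an assumed typical figure for vapor-compression A/C, not a measurement from that room). One ton of refrigeration is about 3.517 kW of heat removal:

```python
# Rough sketch: converting A/C tonnage to heat removal and electrical draw.
# 1 ton of refrigeration = ~3.517 kW of heat removal; the COP is an assumption.
TON_TO_KW = 3.517

tons = 10
cooling_kw = tons * TON_TO_KW            # ~35 kW of heat being removed
assumed_cop = 3.0                        # assumed typical A/C efficiency
wall_draw_kw = cooling_kw / assumed_cop  # ~12 kW of electricity at the wall

kwh_per_year = wall_draw_kw * 24 * 365   # running 24/7
print(f"~{cooling_kw:.0f} kW removed, ~{wall_draw_kw:.0f} kW drawn, "
      f"~{kwh_per_year:,.0f} kWh/yr just on cooling")
```

Even with an efficient unit, that is on the order of 100,000 kWh a year spent just moving heat out of one small room - exactly the bill that free cold outside air makes disappear.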
ZapTap's Avatar
But what he's saying is that if you're going to build in the Arctic, there's still no reason to use Bulldozer. You could have even lower cooling costs with the Intel chips that use less power and put off less heat. So there is absolutely no reason to use Bulldozer as it is now in a datacenter environment.
Apht3rThawt's Avatar
The Intel Xeon TDP is up to 135/150 watts. AMD Opterons range up to 115 watts TDP, slightly less than the Xeon. Interlagos may have a slightly lower TDP than current Bulldozers, according to AMD, allowing more CPUs in a smaller footprint.
Xterra's Avatar
Many consumers seem to be under this spell...
tangletail's Avatar
Deleted my post because it showed up as a double for some reason.

Anyways, I have found something interesting. Verified buyers have confirmed that BD outperforms the i7 2600 in many instances, but fails at single threading. My hypothesis is that they built the Scorpius system, OR that FPS averages were insignificant and the FPS range was used to compare. Or someone saw that the performance was neck and neck with an i7 2600 after seeing what AMD intended it to be.

Also, I am not up to speed on the terms B2 or B3. Could someone tell me what those are? I'd rather not trust the wiki.
ratbuddy's Avatar
Scorpius is just marketing lingo, and most of the professional reviews used it anyway.

You said 'verified buyers' which makes me think of newegg reviews, so I went and looked. They are mostly the same excuses you hear all over.. "Wait for Windows 8!" "This will be faster than SB when software catches up!" "Bulldozer is from the future!" and so on. It's a load of rubbish, sorry to say.
Cyrix_2k's Avatar
True, but the difference in heat output per server would be minimal. Stability and performance will be the deciding factor - and it looks like bulldozer does perform well when given an appropriate workload.

BTW, I'm trying to get one to play around with at work if they ever come back in stock. It seems like a decent and very cheap way to setup a VM server. However, at home, I'm looking at a 2500k or 2600k for my new rig. I'd really like to go AMD, but their single threaded performance is pretty bad compared to sandy bridge (Also, note that my sig is out of date - I'm running a 3.0ghz c2d at the moment)

:edit: Assuming newegg hasn't cancelled the order, we should have an fx-8120 in the office this week!
EarthDog's Avatar
Actually, if you had a quad hex setup times however many servers, that's pretty damn significant.

There is no way I would get BD in a data center... even if I had the right load for it. Just not worth it.
zizux's Avatar
The server versions of AMD's CPUs always end up with slower clocks and less power consumption. Then the HEs come out with even less consumption and heat at the same or close to the same clocks.

I have yet to see an opty 6200 available. But I haven't looked for a little while.

They will probably be a good option but we won't know for sure till they are out.
EarthDog's Avatar
Riiight, but if it's using the BD architecture, I would imagine it's still going to use more power than Intel's server chips.
Bobnova's Avatar
Really depends on the clocks they use for the server chips.
If it runs in the mid-2GHz range like many server chips, they can likely drop the voltage a lot compared to the mid-3GHz range. Dropping the voltage makes crazy huge changes in power draw.
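To put rough numbers on that (a first-order sketch; the voltage/clock pairs below are illustrative guesses, not measured FX figures): dynamic CPU power scales roughly as C·V²·f, so voltage counts twice.

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# The voltage/frequency points are illustrative guesses, not measured FX values.

def relative_power(v: float, f_ghz: float, v_ref: float, f_ref_ghz: float) -> float:
    """Dynamic power relative to a reference voltage/frequency point."""
    return (v / v_ref) ** 2 * (f_ghz / f_ref_ghz)

# Hypothetical desktop point: 3.6 GHz @ 1.30 V
# Hypothetical server point:  2.3 GHz @ 1.05 V
ratio = relative_power(v=1.05, f_ghz=2.3, v_ref=1.30, f_ref_ghz=3.6)
print(f"Server point draws ~{ratio:.0%} of the desktop point's dynamic power")
# -> ~42%; the squared voltage term is what makes the drop so dramatic
```

That squared term is why a mid-2GHz server part can land at well under half the desktop chip's dynamic draw even though the clock only fell about a third.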
zizux's Avatar
@EarthDog Possibly, but in my early testing BD is good for virtual desktops and virtual machines in general. At least it is better than hyperthreading.
tangletail's Avatar
BD can run at its stock speed at a lower voltage with the auto-volt turned off.

But a server processor is designed to be efficient, period. The way it handles data streams and whatnot is unique compared to a faster desktop CPU. Though how did the discussion land here?
manu2b's Avatar
I remember when I used to work for Apple, and then Sun Microsystems: TCO, power saving and "going green" were BIG. It is a major benefit when choosing a server.
What counts the most for a customer buying a server is how much data per dollar it can handle during its lifetime.
Period.
DaveB's Avatar
Sixteen-core Interlagos chips are composed of two BD eight-cores interconnected via MCM, similar to Magny-Cours, using the same G34 socket. It seems that only the 6272 2.1 GHz and 6276 2.3 GHz 16-core versions are currently available. Clock speeds are that low because AMD needed to limit thermal design power to maintain compatibility with the current generation of G34 socket servers. IMO, they need to get up to 3.0 to 3.3 GHz to be competitive.
g0dM@n's Avatar
Did anyone else notice the 8150 came in stock on Newegg then disappeared quickly last night? I think the price was still at $279.99...
hooflung's Avatar
I don't think that is a fair assumption really. You don't need IPC to be on par with a desktop machine on a server. Your bottleneck is generally I/O and then compute threading.

There is nothing different about the I/O on the Opty BD platform, as you can just plug them into current Magny-Cours boxes if you get a BIOS update. And what type of workloads run on servers? Well... VMs that house well-threaded application servers are usually par for the course. J2EE servers, .NET web services... etc., etc.

For stuff like PHP-based sites, the more OSes you can run, the better your Nginx server can distribute load. Magny-Cours don't run that fast and they are usually a great option. Nothing has changed here for the worse. It's actually gotten better for AMD.

It is intel that has to hit higher clocks to compete, not the other way around.
Cyrix_2k's Avatar
ok, that's you... however, I've seen several instances where others are looking at BD just for servers and my own office jumped on one.
true
BD is great for server-class workloads, which can often be easily distributed among cores. The BD server I just put together is actually for running VMs, but our web & SQL servers also tax the processor pretty heavily among a boatload of threads. Even though BD has a high TDP, it's still a lot less costly per unit of work, since it destroys single-CPU systems and often dual-CPU systems - and the dual-CPU systems consume a LOT more energy.
the "green" movement really depends on the organization, but the last sentence is 100% true.
EarthDog's Avatar
I'm sure there are a few instances where it's OK. But when you are talking about life-cycle management and replacing hundreds of physical servers, most data center managers worth anything would turn and run from BD. In a lot of cases the extra power consumption and heat just are not worth it compared to a similarly performing, less power-hungry, and cooler-running chip.
zizux's Avatar
There is a significant difference between typical IT guys buying a server, and a data-center.

The typical guy looks at what he can get for his current budget, and that is usually about as far as it goes; sometimes there is a TCO convo, but in my experience it doesn't happen often.

DCs are more concerned with operational expenses than others are; as such, I think that is where EarthDog is coming from. To a point it makes total sense.

Till you realize that you can get 64+ cores in 1U of rack space. For virtualized environments this can mean higher consolidation ratios per rack per watt than other platforms. Some workloads do fine with hyperthreading; some require real cores to be happy. In those cases BD makes sense. Although an 8-core version of the previous generation would have been better.
EarthDog's Avatar
And here is where that argument falls short. I can get hexes with the same performance from Intel using less power and heat... I too can cram 48 threads in a box (4 hexes with HT).

But unless it's blade technology, no way are you cramming 64 cores (where did you get 64 cores?) in a 1U unit.
zizux's Avatar
4 sockets * 16-core CPUs = 64

http://www.newegg.com/Product/Produc...82E16816101317

Not only that, but Intel hex CPUs with HT aren't cheap. AMD's stuff is / was cheaper per core.

Like I said, it depends on workload. Some things don't like hyper-threaded cores, especially when virtualized. If real core count matters (and it does and will for some tasks), then AMD's CPUs are a good solution.
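For what it's worth, the density math looks like this (a rough sketch; the TDPs are the figures quoted earlier in this thread, the 130 W hex Xeon SKU is an assumption, and threads-per-watt says nothing about per-thread performance):

```python
# Rough rack-density sketch for a hypothetical 4-socket 1U box. TDPs come from
# figures quoted in this thread (115 W Opteron; the ~130 W hex Xeon is an
# assumed SKU). Threads-per-watt ignores per-thread performance entirely.

def box(sockets: int, cores_per_cpu: int, threads_per_core: int, tdp_w: int):
    cores = sockets * cores_per_cpu
    threads = cores * threads_per_core
    watts = sockets * tdp_w
    return cores, threads, watts

for name, (cores, threads, watts) in {
    "4x 16-core Opteron 6200": box(4, 16, 1, 115),
    "4x 6-core Xeon with HT":  box(4, 6, 2, 130),
}.items():
    print(f"{name}: {cores} cores, {threads} threads, {watts} W "
          f"({threads / watts:.3f} threads/W)")
# Opteron box: 64 cores / 64 threads / 460 W (~0.139 threads/W)
# Xeon box:    24 cores / 48 threads / 520 W (~0.092 threads/W)
```

On raw thread density per watt the Opteron box comes out ahead; the counterargument, per EarthDog, is that each of those threads does less work, which this arithmetic deliberately ignores.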
GoD_tattoo's Avatar
Link says quad 8 or 12 core processors. 4x12=48

Either way, I would love to see the cooling solution used to keep those 4 cool. That alone would pay for a few more Intels. I would imagine the price spent on cooling would negate the savings on the processor. I can't see the advantage of having 48 hot-running, less efficient cores over more expensive, cooler-running, more efficient cores. But that's my opinion.
zizux's Avatar
Opty 6200s are available in 16-core, and in theory anything that supports the 6100s should just require a BIOS flash. But we shall see.

Also, quit thinking the thermal output from the desktop CPUs will be what is seen from the server ones.
Cyrix_2k's Avatar
FWIW, I've worked in an infrastructure group that both handled its own "datacenter" and was contracted to handle much larger, true datacenters. I currently work at a smaller firm that rents rackspace, and we don't care about thermal output or power consumption, as that is billed at a flat rate. What we do care about is raw processing horsepower per U and the cost of the server; BD is a clear winner here. Even in the case where we are directly responsible for power & A/C, the difference in TDP isn't large enough to warrant passing up BD as a solution.

BTW, from what I've seen, the BD server chips are supposed to be 16-core, so a quad-CPU server will be 64 cores - and that is VERY much worth looking at in a server environment.
Matt180's Avatar
Been following this topic for a while now. Still can't make up my mind whether to get a Bulldozer or not, though it seems most places sell out quick. From what I've been reading in reviews, it's not living up to its hype.

Would it be worth upgrading to this over a 965 X4 BE CPU?
Khan's Avatar
Personally I'd spend a little more and get an 8120 or stick with what you have.
manu2b's Avatar
It depends on what you do with your computer.
If gaming, an OC to 3.8/4GHz plus a nice GPU will do better than upgrading the CPU.
If using software like Photoshop, or doing heavy rendering or encoding, for sure a better CPU will help.
When my girlfriend needs to do heavy PS work, she uses the 2600K and I put the 6950 in the AMD rig. I see NO difference while gaming.
I would wait for March 2012 and grab an Ivy Bridge, or wait a little bit more and see what Piledriver has in its guts.
Xterra's Avatar
I love the logistics at AMD... :O
Bobnova's Avatar
In the staggeringly subjective "feel of Windows" test, I like the 8150 better than my 2600K; it's second only to my 980X in the all-time Bobnova's Favorite 24/7 Chip Ranking.
The 980X is going to be hard to beat.
mjw21a's Avatar
I'm rather happy with my P-II X6, since I got the chip for $60 I couldn't really go wrong.
manu2b's Avatar
And I prefer my 955BE to my SB as well.
g0dM@n's Avatar
I've heard some people say that; the last thing I read was someone who moved from an i5 2500K to the FX-8150 saying that while games may have dropped a tiny bit (negligible), the feel of W7 and its multitasking was better... I'd say it all depends on the user and what he/she does.

I don't multitask at home the way I used to, so I'm not sure it's worth upgrading from an X6 1100T, though I am still thinking about it. My 1100T runs like a champ at 4GHz with barely a voltage bump (haven't even tried more).

Now, if I popped in an 8-core at work, I'd shine... but I'm happy that I got my hands on an i5-2400 and 8GB of RAM at work (it pays to be in I.T. and have the goodies).
Bobnova's Avatar
Been using it a couple days now as my 24/7 box (I was afraid it'd murder my install; it didn't), and it definitely seems to be better. Running BOINC with Rosetta and LHC@H plus a GPU bitcoin miner and Pandora (especially Pandora; I hate the new Pandora, bring back Flash!), I ran into slowdowns with the 2600K that haven't popped up with the 8150.
Why? No clue.
mjw21a's Avatar
Pft, still in IT here, however they don't replace stuff here till it breaks.... Which is good, because they keep giving people these god-awful HP thin clients to run virtuals instead. Completely ***** for support work.

I think if my poor old Core2Duo dies, I'll build myself a rig and avoid telling anyone. Can never go to a virtual, too restrictive. Doesn't pay to be IT in govt.
g0dM@n's Avatar
But I don't multitask that much, so not sure if it's worth it for me... hmmm
Our desktops and laptops are leased so we have to replace hundreds of machines every quarter. It's funny when users complain about "how come I have to change my computer" b/c little do they know they are getting a much more powerful computer.
Janus67's Avatar
http://arstechnica.com/business/news...astrophe.ars/1

Server benchmarks are in. Not looking good here either.
MattNo5ss's Avatar
No talk of power consumption in a server performance article? If they had mentioned power consumption, it would have looked even worse.
Frakk's Avatar
I'm not going to be upgrading from my 1090T until AMD come up with something that beats the 2500K convincingly and costs around the same or less.

If that's not going to happen within the next two years, I will be moving to Intel in two or three; I can't afford to have both an Intel and an AMD. While I like and respect AMD for what they have done in the past and continue to do now in terms of value, they just don't seem capable of getting themselves on top again.

My Phenom, for example, is little more than a reworked K8. The K8 always was and still is good, but it's old. Where is the innovation? The new technologies?

Come on AMD, sort yourselves out, please.
bmwbaxter's Avatar
Bulldozer is new innovation: a total rework of their core design. Also, it is on a 32nm fab process, which is new technology for them at least.
Frakk's Avatar
OK, point taken, but the performance difference between it and mine needs to be detected with a microscope (yes, slight exaggeration there).

It begs the question: what was the point of it?

I know it's new, so who knows, a bit more work and v2 might surprise us all and eat 2500Ks alive. But it does not seem that will happen.
bmwbaxter's Avatar
The point was a step in a certain direction. It had to be taken sometime and only time will tell if it was the right direction.
Shelnutt2's Avatar
I think you make a good point here. Intel did this same thing back with the P4s. The only difference was Intel sort of hid these transitional steps, because they continued pushing the P4 while they first worked on Dothan and then Yonah as steps toward the Conroe architecture. Dothan and Yonah were solely in the mobile segment, minus the few server-based versions and the ASUS adapter. Were Dothan and Yonah better than Prescott and Cedar Mill? Sure, on a clock-for-clock basis, but there were many other issues and refinements (some minor, some major) before Intel got to Conroe. AMD didn't have this luxury of working on two architectures at the same time. AMD is already stretched in resources and has Bobcat to worry about alongside Bulldozer. I think the real indication of whether or not this was the right road will be Piledriver. If it's a "flop" then this architecture might not be the best. However, if Piledriver shows good improvements similar to Conroe over Yonah, then it'll have been a good move.
Frakk's Avatar
That makes sense, and I'm a little wiser about the whole thing now.

I hope it comes good for them
SupaMonkey's Avatar
Do we know when piledriver is supposed to be released?
hokiealumnus's Avatar
Best thing we have so far is "2012". No month or quarter mentioned to my knowledge.
g0dM@n's Avatar
I have an FX-8150 sitting in my room. I'm wondering if I should replace my 1100T or just flip the 8150 at cost... very eager to try it, but I don't want to disappoint myself.

I too was willing to wait for Piledriver since my 1100T honestly kicks butt...
hokiealumnus's Avatar
The FX-8150 is a lot of fun to play with, especially if you go cold. It depends on what you do with it as to whether it will disappoint you. If your plan is to run standard HWBot-style benchmarks on ambient cooling and compare it to your 1100T, I'd sell the thing. If you want to see how it works with productivity and the like (video/audio editing and encoding, photo manipulation, compression...things like that) as well as freeze it just to see how far you can go...then you won't be disappointed.
g0dM@n's Avatar
Honestly, I don't care much for benchmarks anymore. I just always strive for a "snappier" computer...
hokiealumnus's Avatar
Well, Bobnova says it's second only to his 980X, so perhaps it would be worth giving it a try.
PolRoger's Avatar
I was disappointed about BD when the reviews officially came out on launch day, but I still decided to get a new AMD combo so I could try something different from the Intel offerings I've been running over the past few years.

I'm also looking forward to Piledriver... might even take a shot at an 8170 in the hopes of possibly getting a better-clocking sample...

I would go ahead and try it out. You might find that when overclocked you can make up for the 1100T's IPC advantage. I don't have much trouble running mine at 4.5/4.6GHz "daily" on air, although some samples can run hot... I see you are on water already.

If you don't like your 8150 you can always sell it off and go back to your 1100T... at least you will then know first-hand, and your curiosity about BD will have been answered.
marjamar's Avatar
Been a month now working with BD on a CHIVE MB. Quite a few hiccups to contend with using the chipset on this MB, and also the less than stellar cooling I've been having. It seems the BD is capable of overclocking, but for the life of me I can't keep it anywhere near the temps I want.

Since this is the last day of my exchange window with Newegg, I ordered a replacement.

We'll see if this is any different.

-Rodger
hokiealumnus's Avatar
What are you doing to cool it? They run pretty warm no matter what individual chip you have. On top of that, support with the CIVE is flaky at best. It would probably be a better setup for you if you sold the CIVE and got a CVF (or some other 990FX board that's not the Gigabyte 990FXA-UD7) to replace it.
marjamar's Avatar
Maybe so. I've had 3 different water coolers on it so far. First was an Antec Kuhler 920 (pump quit), then a Corsair H60 (too small), now a Corsair H80; it still runs too hot at anything over 4GHz or so with the Prime95 torture test. I don't think the CHIVE could really contribute to too-high temps though, just limit stability and OC'ing to BD's full potential.

I use Arctic Silver 5 and have (as always) been very careful to use just enough to almost reach the edges of the CPU when the cooling device is installed. On each of these coolers I'm seeing socket temps reach up to 80C and core temps hit 70C under Prime95 with a stable OC @ 4.678GHz. So whether it's just the way the BD does things temp-wise, or maybe a bad chip, or even a not-good-enough cooler, I'm not sure. But I was sure that this was my last day to exchange the CPU, so I had to get that done.

I am looking at possibly buying a new MB, but I've been hearing rumors that there may be some support chip changes coming to help this BD work better, so I will most likely wait a bit and see if this is true or not before upgrading.

-Rodger
I.M.O.G.'s Avatar
I would stick with your board for daily overclocking needs. Support is poor on it for extreme OC, you can't disable cores, and I had trouble disabling throttling on the CIVE, but other than that I think it's fine for daily use on BD. Mainly, unless there are other upgrades/features you want specifically, I don't think the cost of a new mobo is worth it for you.

There won't be any updates that change BD performance.
hokiealumnus's Avatar
My point about the board sort-of applies to socket temp, which may be inaccurate. I'd trust Coretemp personally. That doesn't sound too bad for that overclock under Prime load with that kind of cooler. I was hitting core temperatures of ~65C with 1.41V at 4.76GHz on a full water loop (Swiftech MCR320, MCP35x, EK Supreme HF Cu). I'm just saying don't expect much to change with a new chip.

EDIT - IMOG has tried the combo, if he says the CIVE is good for daily use, then that's a good enough endorsement for me.
marjamar's Avatar
Yeah, I wouldn't disagree with either of you. Just thought I'd like to see if temps would be lower with a different BD. On a side note, I also have a Delta "turbine rocket jet-drive" fan coming in today to see what 6000 RPM, 230 CFM and 24 mm/H2O static pressure will do to my radiator (and possibly cooling).

-Rodger
I.M.O.G.'s Avatar
Felt I should clarify my thoughts. I have badmouthed Bulldozer on the CIVE, but as far as I remember, my issues would only affect extreme DICE/LN2 benching, really. Still, those are valid strikes on a board of this caliber, and disappointing, as a proper BIOS would fix what issues do exist. As is, those issues forced me to buy a CVF, costing me a couple hundred bucks. That's why I have made some grumpy CIVE comments in the past.
marjamar's Avatar
The whole BD thing and all that's related to it is not so good for anyone. I just wish AMD would have thought through their choices a bit more before committing to a few of them. I'm still a bit shocked my memory timings/transfer rates are worse than on the 1090T, and with the high current/temps this chip exhibits, I know things are just not right on the inside for a minimal increase in some areas over the 1090T.

I guess since I just finished building an i5-2500K setup, I'm a little miffed to see what is possible with better-thought-through design.

-Rodger
marjamar's Avatar
Installed the Delta QFR1212GHE-PWM 120mm Case Fan -- WOW!

The reviews I read are right on about jet turbine sounds. One of my companies builds R/C flying wings, and we use all kinds of electric power plants, turbines being one of them, and this muffin fan is about as close to a turbine sound as I've ever heard in ANY fan of this type.

This fan so far seems to be the ticket to get temps down on this CPU. I've done a little testing so far, and I can see a 10C drop in socket temps and about a 5C drop in core temps (under the Prime95 torture test) on my BD. The biggest thing, though, is that running Prime95 for over 30 minutes, the CPU didn't clock down at all. Before installing this fan, it was clocking down after about the first 5 minutes or so, and then every couple of minutes for maybe 15 seconds at a time.

More testing will be done over the weekend, so I'll maybe start a new thread then. The new CPU will be in today as well, so after I test this one enough to get hard numbers, I'll install the new one and test it as well for comparison.

I would like to see 5Ghz stable, but I still don't hold too high a hope for that.

BTW -- I've set a custom profile in AI Suite II's "Fan Xpert" to quiet this thing down when high-performance cooling isn't necessary. I set the curve so normal workloads don't raise the fan above idle (1950 RPM), so it's at about the same noise level as it was with the 2 H80 stock fans on the radiator, with BIOS fan speeds set to NORMAL.

-Rodger
marjamar's Avatar
Here's a link to the NEW THREAD where I explained the final outcome of my upgrade to BD on the CHIVE.

Sorry for the extra click to read about it.

-Rodger
Cyrix_2k's Avatar
Soo, the office BD has been an absolute beast so far. This is hands down the best chip for the money when it comes to building a budget VM server.
hokiealumnus's Avatar
Thanks for sharing! What did it replace?
g0dM@n's Avatar
I'm curious to know as well.
RGone's Avatar
A big plus 1.
Cyrix_2k's Avatar
Two HP DL360 G4s with dual 3.4GHz Xeons (each). Those suckers were LOUD.
hokiealumnus's Avatar
I bet that cost a pretty penny in its day. Bet the BD server didn't even approach that. Glad you're happy with it.
djangry's Avatar
Sending my FX back.
It is a black-screen machine.
Buying any X6 Phenom IIs I can find.
The secret is out: this chip is junk.
The industry needs to come clean;
the FX is not ready for prime time.
Breaks my heart.
hokiealumnus's Avatar
Well, that sounds like a hardware problem. You can complain about floating point performance with plenty of justification, but 'black screen machine' indicates a problem with the system. They function just fine in my and many others' experience. They just don't necessarily function as strongly as expected.
bmwbaxter's Avatar
^^^ +1

The black screens are not a product of your FX processor. Something else is wrong. You may have gotten a bad chip, but not every FX processor is a black-screen machine.
djangry's Avatar
I will say it again with love:
there is a power issue with the FX series chips.
The hotfix has been pulled,
the transistor count lowered. Why? Power issues.
I did not come here to throw stones;
I chose djangry because "AMD fanboy" is usually taken.
You guys are the hot rodders, I value your suggestions,
but the FX is a dog... can any of you recall the Phenom I TLB bug?
You bet it's a hardware problem, and when Newegg takes a product back after two months and tells me I'm the second caller today with BSODs...
Just lucky I guess.
Please don't attack me; I have built computers since 2000, it is how I earn my living.
AMD forever, but the FX is crap for now.
I am attacking the problem, none of you personally.
Maybe this isn't a site for adult discussions.
djangry's Avatar
Hulk,
I bet you every press kit is tested to the max: mobo included, cooler included, CPU cherry-picked... belt buckle (haha).

The real world is something else.....
With love and respect......djangry
Khan's Avatar
Well, I apologize; guess I woke up on the wrong side of the bed this morning. The FX does have its problems, but like others have said, you might have just gotten a bad chip on that one, or it could be incompatible memory, but then it would probably just not POST if it was.

As for the lower transistor count, I believe that was a marketing mishap and they gave the wrong transistor count at launch from what I have read.
Bobnova's Avatar
It's not the FX chip in general, it is your setup specifically.

My FX8150 works fine.
Two BSOD issues out of the thousands of chips that Newegg sold is hardly a concern, especially when other components have a far higher "failure" rate out of the box.

"Failure" is in quotes because the #1 cause of DOA parts, as far as I can tell, is user error.
djangry's Avatar
Thank you again.
The problem with the transistor count is important...
because the mobo companies counted on that in their designs.
Why would you lower transistor counts? Power issues, leakage, etc.
This is very important, and I think a lot of people are not putting 2 and 2 together... not you per se, just in general.
Peace
Bobnova's Avatar
When was the transistor count lowered? Why do you believe it to be an issue that it was?
djangry's Avatar
Dear Bob,
I'm glad yours works,
I will never build a fx box for any of my customers- until I'm sure this puppy is stable.
Compute in peace.
djangry's Avatar
They changed the whole chip. Good god, man, that is serious!
They are changing fabs; something happened. When is anyone's guess.
djangry's Avatar
Never ever have I had a bad chip from AMD!!!!!

I sent one or two back just for my own peace of mind on troublesome builds.
But I will admit like a man:
it was never the AMD chip that was the root of the problem.

So now they make shoddy chips in Africa or wherever their fab is?
Not buyin' it.
Flawed design, or failure to update mobo manufacturers about changes,
is all I can think of.
I don't have all the answers, that's why I'm here,
but there is a problem that is developing and coming to light.

Bashing my skills, setups and hardware is not cool.
Hotfixes pulled, hardware manufacturers just finger-pointing and passing the buck; that's crapola.
We need AMD. I am concerned about this issue.
And there is an issue!
Even if you diehards don't believe it.
Time will tell..... one love.
Be excellent to each other!
bmwbaxter's Avatar
First thing, there is an edit button you can use so that you don't have to keep making double and triple posts.

They have not changed fabs, the # of transistors has not changed. There was merely a marketing mistake.
djangry's Avatar
What an error on my part, to seek info from such a jaded bunch of flamers.
Go melt your sockets.

How about someone with something positive to say about this issue?
That's why I came here. I think some of you need a little cooling.

I was just trying to warn people and get some feedback.
You guys just want blood; too much gaming perhaps.
I just want my computers to run right, not pick fights all day.

way... later
I.M.O.G.'s Avatar
Please try not to insult each other. Your posts are a bit passionate, so maybe a little thicker skin is OK to hold a good discussion. bmwbaxter was just lending some input; he wasn't flaming you.

Their fab is not in Africa. GlobalFoundries has fabs in Germany, Singapore, and NY to my knowledge.

As for the transistor count, AMD PR initially claimed almost twice the transistor count as what they later corrected the number to be. I started a thread on this at the time, something titled "AMD can't count their own transistors", because it was such a ridiculous screw-up to make. That was a PR statement; I believe mobo manufacturers received the correct numbers for specifications. Over half the AMD PR team got canned after launch, and this screw-up could have been related.

It is one thing to report you've had problems with a chip, djangry, but you should understand that many of us hang out here every day and hear real user input about their experiences every day. A lot of members have tried out the FX series to find what they think for themselves, and I haven't heard any comments like yours from actual users - some people aren't happy with the performance, others are stoked, but no one is saying the chips don't work.

It sounds like you had a problem with the platform. Whatever it was, you haven't reported any troubleshooting steps to give anyone confidence in your evaluation. We can only work off what we know, and you're trashing the chip without telling us what components you used, what chipset you were on, or what BIOS version you were running... We don't have anything to go on, except that one guy out of hundreds we've heard from is ticked off about how Bulldozer worked for him.

By the way, I think Bulldozer performance flopped. I think the transistor screw-up was ridiculous. But I've had an FX-6100 at 7.7GHz and an FX-8120 at 8.23GHz, and I've beat the crap out of them with voltage - they have been fun chips for me. I had a better experience on the 990FX chipset than I did on the 890FX chipset - support on 890FX chipsets was crap, and that's the fault of BIOS design and motherboard manufacturers. The 990FX support was everything it should be, in my experience.

This isn't a flame, no need to get defensive... just sharing. Look above at the replies; no one is flaming you. (One guy did, but our staff took care of that and he apologized - so take it easy a little bit and be more descriptive, and your responses will be taken more seriously than rants.)
hokiealumnus's Avatar
I'm sorry, what? The person that flamebaited you had their post removed and they were given an infraction minutes after it was typed (if you'll go back to the prior page, you'll notice that post doesn't exist).

Currently the only person with a problem or an axe to grind in this conversation is you, sir. No one is flaming you. Several of us, including myself, think you are incorrect in your assessment. Our own experience shows your problem is not one due to the FX processor line as a whole but to your particular system. Yes, you could have received a bad CPU, but with as many people as come through here, we'd know if it was as systemic as you seem to be implying.

You say you wanted to warn people; you did. You say you wanted feedback; you've received it. You seem to be upset because the feedback isn't what you expected.

Regardless, one more post like your last post insulting the members of this forum and you'll soon find it difficult to do so again.
djangry's Avatar
Sorry!
I didn't mean to cross that line.
Thank you, IMOG, for the best response so far.
Hokie, I didn't mean to rile you.

It just seemed like when you call a company, they tell you you're the only one
with an issue.
Then you google it and find you're not alone.

I have no axe to grind with anyone.

But I have spent a month swapping parts and mobos and memory and power supplies and vid cards and cases to get this puppy to run.
Just lucky I guess.....
djangry's Avatar
Dear clockers,

I'm waiting for a call back from the AMD tech dept.
before I send my 8150 back to Newegg.
I will let you know what they have to say about all this.
I did find a few sites really bashing this product, but to be fair to them and you/us... I will see what they have to say about the black screen issues
I have been dealing with.
Thank you for some of your advice on this matter.
Love and kisses, your humble narrator.
Frakk's Avatar

Thanks, please let us know how that goes. Also, some sites are simply infested with trolls who use the age-old AMD vs Intel mudslinging as a vehicle to get a rise out of people.

The best and most effective way to deal with them is the cold shoulder; that's the opposite of what they're looking for.
djangry's Avatar
One more....
I bet you half of the problem might be the mobo makers just slapping an FX
sticker on the box, so to speak.
As IMOG suggests, the higher the platform, the better the result.
I'm going to guess that they should have changed sockets for this new line, a bummer as that is (socket love has always been pretty good with AMD).
But the new, complex power staging might not be doable on the lower-number platforms.
That's just my guess, as I'm still waiting for them to respond.
I'll let you know......
Khan's Avatar
There are a few reviews on here that put the chip close in performance to the Intels, and it beats them in some things, especially programs and games that use all the cores. I think most people's disappointment came from the hype that it would blow Intel away. I think it's in integer math and single-threaded performance that Intel really has the upper hand.

However, most people that own it, from what I've seen, also say that the FX desktop feel is better than that of the comparable Intels.
djangry's Avatar
I concur.
I have always thought AMDs were faster around the desktop; maybe not rendering a Photoshop filter, but overall quicker with the browser, opening files, etc.
Real-world feel: benchies are one thing, boot and general environment speed is another.

Don't laugh....
I just can't help myself (during all this I had a Gigabyte micro-ATX 754 board recapped, a la the capacitor plague).
That little junky 2.2 754 with one gig of flaky Ballistix RAM is running the Win8 64 developer version and the new Ubuntu via the Wubi installer, with some free bootloader I found on CNET or one of the others.
It is surprisingly fast... it shouldn't even have enough CPU or RAM, but it's still kicking butt.
Ahh, ancient aliens....
Bobnova's Avatar
WinRAR. The FX absolutely stomps in WinRAR.
Frakk's Avatar
Yeah, that much is true. I do go back and forth between my own rig and an SB one I use when I'm not at home.

And it feels no different, other than actually pulling things up. Getting things loaded up and going, it can feel like it's behaving like a moody teenager; kind of an unusually long-armed, trouser-waist-around-ankles, delayed, slow... oh, do I have to? I can't be bothered, I hate you!!!!

When I go back to mine it's like it's on some sort of sugar rush: you want that? BANG.. here.. oh, and that? BOOM... more? What now? I'm ready... give me something to do, I have this... do you want this??????? more more more!!!!!!!!!
djangry's Avatar
Oh well,
no call from AMD.
Funny, they said within two hours my rep would be calling me back;
all I got was a ticket number... haha.
The joke's on me; I call back, they no open any mo'.

This is not the AMD I remember.
They did send me an email ad about some enterprise server cloud nonsense.

Hey, you, get offa my cloud.... just trying to lighten the mood a little.

Meanwhile my RMA is ticking away.
If any AMD company reps cruise this site, you're looking a little sad. Or maybe it's just me that's sad.

I will try one last time tomorrow.
If they can't be bothered.....
then I will turn my back on an old friend.
djangry's Avatar
Dear clockers,
My story is at an end; I forced AMD to talk to me.
After my rep spoke to two techs,
they concluded: thermal issue inside one of the stages.
Please RMA chip.
I will complete my builds with X6 PIIs for now and try this again when...
the platforms, the chips and Win7 or 8 have it together.
Thanks for putting up with me.
Love, your humble narrator.

Some corrections... when I said Africa was the fab, that was in error.
I meant GlobalFoundries is owned by an Arab emirate state.
And I believe AMD is doing some fab switching.
I might also mention you guys supercool your 8150s,
and might relate your black screens to your overclocking.
A parting word after studying this issue:
this chip is inherently flawed in its construction.
A bad batch perhaps; AMD claimed the chip either works or it doesn't, and claimed my issue rare.
But after surfing and studying the chip's layout and talking to some engineers,
this is just the beginning of a wave of returns.
This chip has internal heat issues!!!
That is why the press kit was shipped with a water cooler!
AND IT IS WHY TRANSISTOR COUNTS WERE LOWERED!!!!!
The press corps fired to cover their as@#!
PUT IT TOGETHER...!
It is why other sites are AMD bashing.
This is a heartbreaker for me!
I will never build a sloggy Intel Sandy or Ivy Bridge rig.
I will wait until the industry fesses up on this issue and fixes the mystery machine... Scooby.
Frakk's Avatar
Perhaps, I don't know. AMD have money issues, fab issues, staffing issues, issues issues issues........

It's only the success of their GPUs that's keeping them afloat; their GPU division is probably subsidizing their CPU division.

This CPU was AMD's last chance. I think its performance issues are exaggerated; the AMD bashing you speak of has largely become a trolling hobby.
Even if Piledriver becomes a good CPU, those trolls will still find ways to bash it: find the one thing wrong with it and then pound on it relentlessly.

I'm not in a mindset of avoiding Intel at all costs; if PD can't cut it for me, I will switch.

Either way, I do think AMD are now doomed no matter what happens; a very loud minority will not let them pick themselves back up again in market share.
djangry's Avatar
I think AMD will recover; they have always been the underdog!
Except for the glory days of Socket 939, when they whipped Intel for three years straight
and Intel had to stop being fat and lazy. AMD made them a better company...
and we all benefited from that.
I didn't mean to scorch Intel earlier; they're OK.
I have one in my Mac and it's all right, I suppose.
Just not my style... slow, steady, and powerful in the render is not what I'm after right now.
I just like the way a screamin' AMD chip works around the desktop,
and they make good audio rigs when set up right.

We just need to remember Phenom 1.
All was well when P2 hit.
I think that's what's going on right now: P1 all over again.
Frakk's Avatar
P1, yes indeed, I know where you're coming from.

We will have to wait and see, late this year, if they stick to the PD roadmap.
djangry's Avatar
On a happier note, the latest BIOS, B.80, stabilizes Click BIOS on the MSI 990FXA-GD80.
Finally it's not jumpy!
mjw21a's Avatar
Well, I'll be upgrading when PD comes out... I don't need it, but I do want it. I always believe in skipping at least one CPU design, and whenever AMD comes out with a new architecture it often takes another chip revision to get it right.

I actually think they've done pretty well this time around, no TLB bug or anything, and the architecture shows plenty of promise. This is AMD's first ground-up redesign since the original Athlon/K7, after all; K8 and K10 were just evolutions of the K7 architecture. It will be interesting to see where this goes.
Frakk's Avatar
Yes, David alluded to this in another thread...

The architecture may not have delivered yet, but with a few more tweaks here and there?

We will see
djangry's Avatar
Update... Bulldozer compatibility.
Well, some of you guys were right about platforms for the FX. I am humbled.
Make sure your board supports 125 W chips... one board I was trying would only work well with lower-wattage CPUs.
It would run a 125 W Phenom II X6, but MSI recommended only 95 W CPUs,
playing it safe. It definitely wouldn't run my flaky FX-8150.
I personally think they changed the recommendation on their site, but I have no proof.
Nowhere on Newegg was it mentioned that the board only supported 95 W CPUs.
I called MSI; they said it could run 125 W CPUs, but didn't recommend it.
I asked if they made the change for stability, but they never called back.
When I bought my Dragon rig (an MSI 790-something) with a 140 W X4, it was plastered everywhere that the board supported 140 W CPUs; now you have to really dig to be sure.
Just because the box says Phenom and FX doesn't mean anything.

On the Intel side, a client of mine bought a Sandy Bridge Dell business rig (cost over a grand, easy), and it had serious power issues and was returned, as was my client's money (from Dell).
What is going on? Do I have to buy fussy ASUS boards again? They always give me a hard time with memory compatibility and longevity.
Hopefully the Gigabyte AMD board I bought lights up. It is at least rated for 125 W CPUs.
Obviously, always check the compatibility lists on the manufacturer's site, but these days it's more important than ever.
By not double- and triple-checking, I suffered a lot of grief and cash-ola.

Also, the heat sinks on the 95 W AMD chips are a true joke.
I had to use a stock cooler that came off one of my 125 W X6s.
Smart Fan was my saving grace to stop it/them from sounding like hair dryers.
Cool'n'Quiet and Turbo are bad news; at least that has been my experience.
Measure twice, cut once, and make sure your tape measure doesn't have the numbers on the bottom.
Tool
hokiealumnus's Avatar
Thanks for the update, djangry!

Re: ASUS & Memory - In my experience, ASUS has some of the most comprehensive memory control & compatibility on the planet. If you look at G.Skill's high-end kits, ASUS boards show up first for compatibility with others to follow. The trick is you absolutely must set at least the first four timings, the command rate & voltage properly. The rest can usually survive on auto, but if you don't set those five things to spec you're in for a headache.
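To make that concrete, here's a hypothetical example (illustrative numbers only; always use your own kit's rated specs from its label or XMP profile). For a DDR3-1866 9-10-9-28 1.5 V kit, you'd manually enter tCL = 9, tRCD = 10, tRP = 9, tRAS = 28, set the Command Rate to 1T (or 2T if it won't train), and set the DRAM voltage to the rated 1.50 V. The remaining sub-timings can usually stay on Auto.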
mak1290's Avatar
I have heard rumors on the net that AMD will soon launch revised versions of the Bulldozer FX series, as the current versions dissatisfied many.
So when do the new revised FX Bulldozer series CPUs (8120, 8150) actually arrive?
hokiealumnus's Avatar
Again, where are you hearing this? I'm pretty attuned to CPU rumors and have never heard of such a thing.

...and to what flaws are you referring? The CPU didn't perform so great, but that's not a flaw; it's the design.
Frakk's Avatar
+1. Please back up what you're saying with some links if you have them. IF AMD are going to make some improvements to FX chips, we would all like to know about it.

It's not inconceivable; they made some pretty good improvements going from Phenom I to Phenom II because no one was happy with the first one.
ratbuddy's Avatar
He might be thinking of Piledriver...
djangry's Avatar
Well, some time has passed since I last stirred things up.

I wish to know how the other members are making out with the FX series,
especially the eight-core. Any black screens, thermal issues, etc.?

I was reading some old Maximum PC articles online, and they said some guy found a bug in the FX in 2011 (DragonFly something?). The article was vague and old, so I will post no link.
I found some other sites that were just screaming matches about the FX vs. the 1100T.

I also notice MSI making V2 revisions of the 990FXA-GD80, possibly to account for the FX's, shall we say, "special needs".

I wrote to AMD this morning asking them not to drop the ball, and to get their head out of the cloud (so to speak) and back to the enthusiast retail end of things.

I had one FX chip; it gave me grief and I returned it. I then bought every 1100T I could find, because I had bought so many mobos... and I have been happy ever since.
But we need AMD to keep Intel flying right, and to keep prices down and clocks up.

So I would like to know how the FX is doing in the real world; please let me know how your FX builds are panning out.
Thanks,
Your humble narrator.
Bobnova's Avatar
I ran that chip under heavy loads 24/7 from new; I finally shut it down three weeks ago for the summer.
While running those heavy loads, I used it a lot for daily stuff as well as gaming.

I never had any issues. At all.
djangry's Avatar
Check out this article from Softpedia...

http://news.softpedia.com/news/Ex-AM...g-227816.shtml

Revision time...
hokiealumnus's Avatar
That article was from October 2011 and was well publicized at the time. If a new revision hasn't come yet (and with Piledriver reportedly coming this year), I highly doubt we'll see a new Bulldozer revision.