Comments On Nehalem Comments


The last few days, I’ve been speaking about Bloomfields, and not all too kindly.  Some have found this baffling, if not a bit blasphemous. 


To summarize what I’ve been saying:

—Bloomfields don’t do a whole lot for a whole lot of people, and given the additional cost of the platform, people ought to think twice before spending that extra money, especially when the economy doesn’t look so hot. 

—So far, the OCing results, as measured by overclocked GHz, have been kind of underwhelming compared to Penryns, further reducing the bloom on Bloomfield.  The rather greater difficulty in the mechanics of overclocking isn’t an encouragement, either.

—Maybe Intel’s vision of an overclocking audience composed of rich kids/kids-at-heart paying superpremium dollars for what seems to be evolving into a managed experience isn’t in the long-term interest of overclocking (if only because there will be far fewer overclockers), and should be resisted rather than embraced.

So far, the comments I’ve seen fall into these categories:

It’s a much better server chip! Well, that’s nice, but if I’m not serving anything, what is that to me?  I don’t want something that’s much better at what I don’t do; I want something that’s much better at what I do.

It’s a great gaming system when you give it three video cards!  I see.  The fault is not the CPU, the fault is being so softcore in fueling the beast.  Just buy your video cards in bulk and you’ll be fine!  

I guess it doesn’t dawn on some that buying video cards in bulk isn’t a terribly common event; you might even call it elitist, even among gamers.  Even worse, if you actually look at the tests being mentioned, the big advantages claimed only apply at relatively low resolutions.  Maybe I’m weird, but I wouldn’t buy three video cards just to get a lot more essentially useless FPS at 1024X768.  Once you get to 1920X1200, and especially 2560X1600, though, the Bloomfield difference drops down to practically nothing: 10%, if even that.  If even three GTX 260s get bottlenecked at 2560X1600, that will certainly be the fate of any cheapskate who buys a single video card over the next few years.

Maybe the answer is more video cards.  Maybe, someday, every gamer will have 32 video cards each.  And there will be three gamers in the whole world playing each other.  This doesn’t strike me as a good long-term strategy.

But it does overclock!  Even around 50%! I’m afraid some straw men are getting slaughtered.  I didn’t say it didn’t overclock at all; I said it didn’t overclock enough.  Some might say, “50% isn’t enough for you?”  I say, “How much faster is irrelevant; how fast is what matters.”  If you had a 10MHz processor that you could overclock to 110MHz, that’s a 1000% overclock.  Would you want that running your current machine?  Of course not.

Pricing being equal, I couldn’t care less about overclocking percentages.  I’m concerned about the final result, which is the top speed/the most work being done.  If CPU #1 does what I want it to do 10% faster than CPU #2, but CPU #2 overclocks 10% more, that’s a pretty close contest; but if CPU #1 (and its platform) costs a lot more than CPU #2 and its platform, I think that’s the tie breaker for most people.
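The arithmetic behind that point can be sketched in a few lines of Python. The 10MHz/110MHz figures come from the example above; the 3000MHz part is a hypothetical stand-in for a modern chip, not a benchmark result:

```python
# Overclock *percentage* is meaningless on its own; the final clock speed
# is what matters. All numbers here are illustrative, not measured data.

def final_clock(base_mhz, oc_percent):
    """Effective clock speed after an overclock of oc_percent."""
    return base_mhz * (1 + oc_percent / 100)

# The example above: a 1000% overclock on a 10MHz part still only
# reaches 110MHz.
print(final_clock(10, 1000))    # 110.0

# A hypothetical 3000MHz part at a "mere" 30% overclock lands near 3900MHz,
# far ahead despite the much smaller percentage.
print(final_clock(3000, 30))
```

A big base clock times a modest percentage easily beats a tiny base clock times a spectacular one, which is why the percentage alone tells you nothing.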

But Bloomfields render and encode so well! Yes, they do; those seem to be the only things they do much better than Penryns, and if you do either a lot, hey, go for it, as I already said.  But don’t assume that because you do something, everyone else does, too.

We’re still very early in this ballgame, and we may see over the course of the next few weeks/months some better reasons why Bloomfields aren’t a bad buy, which is an excellent reason why most of you should wait to see them.  For now, though, the defenses I’m seeing seem a little weak.



About Ed Stroligo 95 Articles
Ed Stroligo was one of the founders of Overclockers in 1998. He wrote hundreds of editorials analyzing the tech industry and computer hardware. After 10+ years of contributing, Ed retired from writing in 2009.

