
Need advice for buying a CPU on a BUDGET

I really hope you didn't just use a PSU calculator... Let's look at realistic numbers.

50W for the system, 50-70W average for the CPU (a 2500K) in any given game, 130W TDP for the HD 7850. ~250W total. Voilà. My 2500K at 4.5GHz and a GTX 580, yes, a 580, pulls 350W from the WALL at 'full load' during BF3. 431W is laughable for a GPU that uses less than half the wattage mine does.
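In code form, the back-of-envelope math here, as a minimal sketch (the wattages are the rough estimates above, not measurements):

```python
# Back-of-envelope system power estimate. The per-component wattages
# are the rough figures from the post above, not measured values.
components_w = {
    "rest of system (board, RAM, drives, fans)": 50,
    "i5-2500K, typical gaming load": 60,   # middle of the 50-70W range
    "HD 7850, board TDP": 130,
}
total_w = sum(components_w.values())
print(f"Estimated DC load: ~{total_w} W")  # ~240 W, i.e. roughly 250 W
```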
 

Interesting, I've never seen such a round number "pulled from the wall". Now what's the efficiency of your PSU? Because what you pull from the wall is NOT the actual power consumption of the computer. Let's assume for the sake of argument that its efficiency at that level is 85% and your "350W" number is correct. That means, taking your number, your system is actually consuming about 298W (350 × 0.85 = 297.5). Now let's look at the power consumption of a GTX 580:
http://www.legionhardware.com/articles_pages/inno3d_geforce_gtx_580_oc,11.html
System idle = 168W (for the lowest 580 on the list) and 451W while running Furmark. I'll even throw you a bone and say that while you're running BF3 you're not pulling as much as you would running Furmark, so we'll be generous and shave off 50W for a total power consumption of 401W and an at-the-wall draw of 472W (401 / 0.85 = 471.8). So, everything else being equal, you're saying the 2500K uses 104W LESS than an i7 965 at stock speeds. I have to call bunk on that one.
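A quick sanity check of that math, as a sketch using the 85% efficiency assumed above:

```python
# Converting between "at the wall" (AC) draw and what the PC actually
# consumes (DC), using the 85% efficiency assumed in the post above.
EFFICIENCY = 0.85

def dc_load(wall_watts):
    """DC power the system consumes for a given wall draw."""
    return wall_watts * EFFICIENCY

def wall_draw(dc_watts):
    """AC draw at the wall for a given DC consumption."""
    return dc_watts / EFFICIENCY

print(dc_load(350))    # 297.5 -> the "350W from the wall" claim
print(wall_draw(401))  # ~471.8 -> the Furmark-minus-50W estimate
```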

Here's a little hint for you: TDP doesn't = power consumed (unless you're the guys at NVidia drafting up the paperwork). Both Intel AND AMD agree that, quote, "it's a measurement of waste heat output from a chip". Only NVidia seems to be coming up with this new definition of, quote, "TDP is a measure of maximum power draw over time in real world applications". So simply taking the "TDP" values for all of your components and adding them together is not going to give you an accurate measurement of power consumption (which is what I suspect you did to come up with the "350W").

Also, you ALWAYS use figures for MAXIMUM power draw when deciding which PSU to get. Kind of ridiculous to choose a 350W PSU simply because MOST of the time you will be under that; the moment you "stress" the system you'll run into problems. On top of that, most PSUs reach their maximum efficiency somewhere between 50-65% load, which is why, if your actual power consumption is, say, 340W, you ideally WANT a PSU rated at 550W to put you in the optimal efficiency range.
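Put as a sketch, the sizing rule described above (the 50-65% band is the range quoted in this thread, not a universal spec):

```python
# Sizing rule from the post above: size for MAXIMUM draw, then pick a
# rating that puts that draw inside the ~50-65% band where most PSUs
# hit peak efficiency. Band values are the ones quoted; real curves vary.
def suggested_psu_rating(max_dc_watts, band=(0.50, 0.65)):
    lo, hi = band
    # Any rating between max/hi and max/lo keeps full load in the band.
    return max_dc_watts / hi, max_dc_watts / lo

low, high = suggested_psu_rating(340)
print(f"{low:.0f}-{high:.0f} W")  # 523-680 W, hence the 550W suggestion
```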

Now, as a side note, I was talking to the OP about power consumption because he was talking about possibly shifting to an FX-8150 and wondering if the PSU he picked would be sufficient. Next time, read a little more carefully.

http://www.bit-tech.net/blog/2010/11/11/what-does-tdp-mean-nvidia/
 
I never said to get a 350W PSU. I'm saying realistic numbers are just that. He won't be running Furmark or IBT constantly now, will he? Yes, I'm well aware TDP doesn't equate to power consumption. Actually I idle at 90W. HX650 PSU. I'd be running around 300W, yes. I know that. I don't need a lecture on power consumption. Even maxed out, he's barely going to hit 300W with an OC'd 2500K and an HD 7850. That was my example. With an FX chip, that's entirely out the window. And your charts are based on system draw, not individual draw. I owned an i7 930 at 4GHz. Bloomfield easily pulled over 200W overclocked and stressed. It was rather inefficient.

Point is: apart from a select few, who's going to run their system at 100% load all the time? The OP has already specified it's a gaming rig. And even then, to test stability you'd only run IBT OR Furmark. It's rare to run both at the same time.

His PSU, regardless, is plenty for both variants. Your calculation with the PSU calc is overestimated, as most of those calcs are. I don't have 'links' to prove myself simply because I'm on my phone and not bothered. But I assure you, unless he throws some crazy volts at that FX chip, he's not going to break 400W.
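The 'crazy volts' caveat is the real variable here: dynamic CPU power scales roughly with frequency times voltage squared, so overvolting blows the power budget up fast. A minimal sketch, with illustrative baseline numbers (not measured FX figures):

```python
# Rough dynamic-power scaling: P is roughly proportional to f * V^2,
# which is why "crazy volts" matter far more than clock speed alone.
# The baseline numbers are illustrative assumptions, not measurements.
def scaled_power(p_base, f_base_ghz, v_base, f_new_ghz, v_new):
    return p_base * (f_new_ghz / f_base_ghz) * (v_new / v_base) ** 2

# e.g. assume ~125W at stock 3.6GHz/1.30V, pushed to 4.5GHz at 1.45V:
print(f"{scaled_power(125, 3.6, 1.30, 4.5, 1.45):.0f} W")  # ~194 W
```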

http://www.techpowerup.com/mobile/reviews/AMD/HD_7850_HD_7870/24.html

Here. Just for you. That HD 7850 isn't going to be burning out any half-decent PSU anytime soon...

Besides, Furmark is a horrible program that puts a very unrealistic load on a GPU. Frankly it shouldn't be used at all. Run some benchmarks or a loop of a game's benchmark sequence for stress testing. Furmark is just... yuck.
 
Ok... so I haven't built my PC, I can't tell, I barely know how to calculate lol.

Well, the calculator told me it's 300+ to 400, not even close to 500.

I will have a 550, so I think I'll be safe.

Back to the case question.

The case comes with 3 fans, and I plan to go up to 6 (the maximum), but there are only 4x 4-pin Molex. How then? I heard some people say you can plug one of the fans into the mobo, the system fan pin/socket; I do see it. They said if so you will have to control fan speed via BIOS; some said it's automatic based on temperature and the mobo will do the work.

If you go direct via the PSU it's 100% fan speed, they said.
 
If you plug the fans into a Molex, they'll run at 100%. A Molex can supply way more power than a fan needs, so you can plug multiple fans into a single Molex connector.
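To put rough numbers on that headroom (a sketch; the fan and connector ratings are assumed typical values, so check the label on your own fans):

```python
# How many fans can share one Molex line? Typical 120mm case fans draw
# around 0.1-0.3A at 12V, while a Molex peripheral connector's 12V pin
# is commonly rated around 5A. Both figures are assumed typical values.
FAN_AMPS = 0.3          # worst-case-ish draw per fan
MOLEX_RATING_AMPS = 5.0

print(int(MOLEX_RATING_AMPS // FAN_AMPS))  # ~16 fans' worth of current
```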

There are also smaller 4-pin and 3-pin connectors that plug into your motherboard. Depending on your motherboard, you may be able to control both, or only the 4-pin.

A 3-pin fan works like this: one wire is power, one wire is ground, and one wire reports how fast the fan is spinning. You control this fan by lowering the voltage going to it. Some motherboards have this option, some don't.

The 4-pin is a PWM fan. This uses a slightly different/more efficient/more complicated way of controlling the fan speed. Most motherboards only have one connector of this type, usually meant for the fan on the CPU heatsink.
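To picture what "the BIOS controls it based on temperature" means in practice, here's a minimal sketch of the kind of fan curve a board applies; the temperature and duty points are made-up illustrative values:

```python
# A simplified BIOS-style fan curve: run a floor speed below t_low,
# 100% above t_high, and ramp linearly in between. For a 4-pin fan the
# output maps to PWM duty cycle; for a 3-pin fan the board would scale
# the supply voltage instead. All numbers are illustrative.
def fan_duty(temp_c, t_low=35, t_high=70, floor=0.30):
    if temp_c <= t_low:
        return floor
    if temp_c >= t_high:
        return 1.0
    ramp = (temp_c - t_low) / (t_high - t_low)
    return floor + (1.0 - floor) * ramp

for t in (30, 50, 70):
    print(t, f"{fan_duty(t):.0%}")  # 30 -> 30%, 50 -> 60%, 70 -> 100%
```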
 
I see. Well, looking closely at this mobo, there's one for the CPU heatsink, it's written CPU FAN, and another is a SYSTEM FAN if I'm not mistaken, all the way on the other side, where all the PCIe slots are.

What do you mean plug multiple fans into a single Molex connector? I thought 1 Molex connector = only 1 fan? Unless there is like 1x Molex split into 2x Molex.

Also, as I said, this Enermax NAXN 550W only comes with 4 Molex =(

The case is able to go up to 6; maybe I can't add in more fans unless I can plug them into the mobo. I will have to check.
 
Uh, these only come as 3-pin to Molex; that's, I think, a 100% must because our mobo won't have that many headers to plug into.

So it doesn't have a 1x 4-pin Molex split into 2x 4-pin Molex?

You mean I need something like this??

http://ecx.images-amazon.com/images/I/31XJJqnuUvL.jpg

Also, the way you explained the 3 wires, ground, live, signal/sensor, it's just like in a car!
That's what I recently learned lol, how to check one of the car sensors =)
 
Guys, once again another problem zz. If you guys are getting annoyed about this I won't blame you, I am too lol.

So as you all know, I plan to switch from an i5 3450 + Gigabyte H77M-D3 to an FX 8120 + Gigabyte 970A-D3 instead.

The problem now is, people are having problems running the FX 8 on this mobo, the 970A-D3. Some said the VRM is bad and tends to overheat, causing the PC to freeze and crash.

Also, some said it has 4+1 phase power, which is bad.

This guy here never even OC'd it and it's causing him problems already...

But he also said that if you get the Rev 1.3 version of this mobo then it's OK.

http://www.overclock.net/t/1278065/...er-load-in-games-and-prime95-not-a-temp-issue

Mobo: http://www.gigabyte.com/products/product-page.aspx?pid=4215#ov

So what should I do? Go back to the i5 3450 + H77M? (Will this mobo be a problem for the i5 Ivy Bridge too?)

Or

Go with a lower-power CPU like the FX 6100??

Saving money for a better mobo is not quite the answer... because it's at least 30-50 dollars more, which is a lot here... but if it's the same problem even then, I'll go for the i5 with the H77 board. I guess I will have no choice then zzz.
 
PSU calculators really give huge numbers.

My computer, measured by a Kill-A-Watt while gaming, does not even hit 650W. Outervision? MINIMUM 842W. Yeah, sure.
 
= =??

But I was talking about the VRM, I'm still quite confused... does the VRM have anything to do with your system lacking power?

OR

Is it that it controls the CPU voltage or something, and since this 970A board has a lousy VRM and the FX 8120 takes up a lot of power, in the end it causes problems zzz

But if you can confirm it's 100% no problem and safe, then I guess there won't be any problem; I will go with this board. Though I did compare the MSI 970A-G46; it's the same price as this Gigabyte 970A-D3 Rev 1.0, but it seems to have a better heatsink for the VRM section. But if it was the Rev 1.3 version, then it's the same for the Gigabyte.

MSI 970A

http://www.msi.com/pic/product/five_pictures7_2501_20111215093233.jpg

Rev 1.3 Version Gigabyte 970A-D3

http://www.techfresh.net/wp-content/uploads/2012/07/Gigabyte-GA-970A-D3-REV1.X-ATX-Motherboard.jpg
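On the VRM question above: the VRM is the circuitry that steps the PSU's 12V down to the ~1.3V the CPU core actually runs on, so it's not about the PSU lacking power; the worry is heat in the VRM itself. A rough sketch with assumed numbers:

```python
# Why "4+1 phase" matters: the VRM converts the PSU's 12V down to the
# CPU's core voltage, and the CPU's current is split across the phases.
# Fewer phases = more amps (and heat) per phase. Numbers are rough
# assumptions: TDP as a stand-in for real draw, a typical Vcore, and
# 4 core phases (the +1 usually feeds the CPU-NB).
cpu_watts = 125   # FX-8120 125W TDP
vcore = 1.3
phases = 4

amps_total = cpu_watts / vcore
print(f"~{amps_total:.0f} A total, ~{amps_total / phases:.0f} A per phase")
# ~96 A total, ~24 A per phase -- and overclocking pushes that higher,
# which is why a board with weak, unheatsinked VRMs can overheat.
```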
 
Nvm, I have decided to stick with my old plan, the 3450 + H77M. It's been decided, I will be fine with this.

Also, since many of you said the FX series is bad for gaming zz, even an AMD fan like me has lost faith, for now lol, not till they come up with a new CPU!
 
The FX at stock is 'bad' for gaming in a world where the CPU doesn't matter. Go grab a Phenom II X4 965 and cool it down, get it to 5GHz and ball. It's 109 - 15 on newegg.ca so probably cheaper wherever you are. You know you want a better video card. ;)
 
But usually when you guys say something is "bad" for gaming, as in how?

Because uh, I myself am not sure what kind of gamer I am. I'm a person who just plays almost anything, FPS, RTS, RPG, anything is good, but I want decent gameplay, 0 lag, totally 0, high graphics if possible, no need for Ultra, at lowest Medium, that's all.

Will the CPU be bad for gaming in my case?

I know for sure when it comes to benchmarking it matters a lot, but for gaming is it a big effect?

The sad thing is they don't carry old stock here; once it's sold out they only bring in the latest.

So we don't have the Phenom II here anymore.

Also, I do have the intention to OC, but I'm not sure if I will. Another thing is I'd have to dump at least 100-200 bucks more (my money) getting a better board for OC.

Knufire will understand my story, so that's why I'm sticking with this. Maybe in the future when I get a bit more money I will change the CPU and mobo again, and maybe then I will learn how to OC =)

For now I need a new PC to face all the new games coming, since my current one is totally not working so well.

Can't wait to play Far Cry 3!
 
It's not that it's bad for gaming, it's simply that the Phenom II is better. Since we're performance-focused... we tend to ignore the less optimal options and focus on always getting the best price/performance.

If you can't get a Phenom II, then you don't really have a choice but to get the FX; you'll be fine.
 
Nah, no need, since as I mentioned I might change to a CPU and mobo that's able to OC; maybe by then they will have a new CPU out =)

So I will just stick with Intel for the moment and feel how it is so great that everyone says it to be lol.

But some tend to say Intel is good for work, while in comparison AMD is better for gaming...

Though I must say Intel is a lot more expensive and they release a freaking lot of versions of their CPUs. They release quite often too, making the old ones outdated easily; well, maybe not outdated, but at least no longer the latest.

It's funny though, back then I always liked Nvidia and Intel since they were the first ones I got in my PC.

Then later I found out about Radeon and AMD CPUs; I like the Radeon GPU and AMD CPU instead lol. Well, now AMD owns both!
 