
Best Intel processor for 1155 socket under $200

There is no such thing as future-proof; it's simply not possible with how fast new technologies are arriving.
 
There is no such thing as future-proof; it's simply not possible with how fast new technologies are arriving.

It's certainly more of a thing today than it was 10-15 years ago. Tech still comes out fast, but not as fast as it used to. A CPU from 2008 (the i7-920) is still viable for gaming, and so is a GPU from late 2010 (the GTX 580). That's pretty good at six and four years respectively.

Compare that to the situation in 2004 vs. 2008. In 2004 we had the single-core Pentium 4; in 2008 we had the quad-core i7. What a massive jump. The Pentium had no hope of keeping up with the i7. Look at graphics cards: we went from ATi 9800 Pros to HD 4890s. Massive, massive leap.

Today, things progress more slowly. We're still seeing "rapid" (though not as rapid as in years past) increases in GPU performance and efficiency, but on the CPU side things have slowed.
 
In 2008 a CPU from 2004 was plenty serviceable too (Athlon 64 X2 chips with two cores... there were also Pentium D dual-cores out as well, but AMD ruled the roost). On GPUs, on the other hand, I would agree... though a stock GTX 580 with its paltry 1.5 GB of VRAM would hiccup on modern titles at 1080p, so...
 
It's certainly more of a thing today than it was 10-15 years ago. Tech still comes out fast, but not as fast as it used to. A CPU from 2008 (the i7-920) is still viable for gaming, and so is a GPU from late 2010 (the GTX 580). That's pretty good at six and four years respectively.

Oh, yes. One can easily game on a CPU and GPU from that era (2008-2010-ish).

I've played Titanfall with an Intel Core 2 Quad Q6600 (2007) and an Nvidia GTX 570 (2010) and noticed no problems with performance. I had all settings maxed out and was still getting a minimum of 40FPS at 1080p. The only problem I did notice was that the GTX 570 has a tendency to run quite hot when running at maximum settings in high-action scenes, and does exceed 80°C on occasion (with the default fan profile).
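
For keeping an eye on those temps while playing, here's a minimal sketch that polls the GPU once a second. It assumes the Nvidia driver's nvidia-smi tool is installed and on your PATH, and the 80°C warning threshold is only an example, not a recommendation:

```python
import subprocess
import time

WARN_AT_C = 80  # example threshold; tweak to whatever margin you're comfortable with

def gpu_temp_c():
    """Read the current GPU core temperature (Celsius) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

while True:
    temp = gpu_temp_c()
    note = "  <-- running hot" if temp >= WARN_AT_C else ""
    print(f"{time.strftime('%H:%M:%S')}  {temp} C{note}")
    time.sleep(1)
```

Run it in a second window while gaming and you can see whether the default fan profile is actually keeping up.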

I'd say it's not really a question of future-proofing a system. It's more a question of how long one can go without having to replace that system and build a new PC (be it due to being too slow to cope with newer software, or just plain not having appropriate instruction sets to run new software). Or to put it another way, how much performance one can put into a PC so that they can put off having to upgrade for a longer period of time.

No PC is fully future-proof. All of them will have to be replaced at some point. Eventually circuitry breaks down and ceases to function (though that takes 20+ years as far as I know). Components will tend to go bad, eventually.
 
Oh, yes. One can easily game on a CPU and GPU from that era.

I've played Titanfall with an Intel Core 2 Quad Q6600 (2007) and an Nvidia GTX 570 (2010) and noticed no problems with performance. I had all settings maxed out and was still getting a minimum of 40FPS at 1080p. The only problem I did notice was that the GTX 570 has a tendency to run quite hot when running at maximum settings in high-action scenes, and does exceed 80°C on occasion (with the default fan profile).

I'd say it's not really a question of future-proofing a system. It's more a question of how long one can go without having to replace that system and build a new PC (be it due to being too slow to cope with newer software, or just plain not having appropriate instruction sets to run new software). Or to put it another way, how much performance one can put into a PC so that they can put off having to upgrade for a longer period of time.

No PC is fully future-proof. All of them will have to be replaced at some point. Eventually circuitry breaks down and ceases to function (though that takes 20+ years as far as I know). Components will tend to go bad, eventually.

You're talking about a four-core Q6600. I'm talking about a single-core Pentium 4, horrible architecture and all. Titanfall would roll over and die on a P4 system. Also, I don't feel Titanfall is the best example, as it uses the Source engine, which isn't very demanding on the CPU.
 
Yeah, but it won't be good if I can't buy a new one!

Why would it not be good if you can't buy a new CPU?

I've found quite a few good deals on used/pre-owned PC parts that worked very well: CPUs, motherboards, memory, hard drives, power supplies, cases. I've built entire PCs with pre-owned parts.

As far as answering your question, though: a Core i5-3570 or 3570K is about the best you can get in your price range for the LGA1155 socket. You could also get a 2500 or 2500K, but why would you when you can get a 3570 or 3570K for around the same price?

A 3570/3570K has a higher CPU frequency than the 2500/2500K, and CPUs based on the Ivy Bridge architecture typically perform better at the same clock speed than their Sandy Bridge predecessors.
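
For a rough side-by-side, something like the sketch below; the clock figures are quoted from memory, so double-check them against Intel ARK before buying:

```python
# Rough comparison of the two unlocked LGA1155 K parts discussed above.
# Figures from memory -- verify on Intel ARK.
candidates = {
    "i5-2500K": {"arch": "Sandy Bridge (32nm)", "base_ghz": 3.3, "turbo_ghz": 3.7},
    "i5-3570K": {"arch": "Ivy Bridge (22nm)",   "base_ghz": 3.4, "turbo_ghz": 3.8},
}

for name, spec in candidates.items():
    print(f"{name}: {spec['arch']}, {spec['base_ghz']}/{spec['turbo_ghz']} GHz base/turbo, unlocked multiplier")
```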

I looked up others, but the 2600, 2600K, 2700K, 3770, and 3770K are all well outside of your price range (most are $200+ on the current used market).

You're talking about a four-core Q6600. I'm talking about a single-core Pentium 4, horrible architecture and all. Titanfall would roll over and die on a P4 system. Also, I don't feel Titanfall is the best example, as it uses the Source engine, which isn't very demanding on the CPU.

You were talking about gaming on a Core i7-920. That was the part I quoted, and the part I agreed with.

I didn't say anything about doing any of that on a single-core Pentium 4.

And I was agreeing with you, and now you're telling me I'm wrong?

Well, Titanfall is the most taxing game I play; the only other one is COD: MW2, which is much older.
 
ARM

ARM has a long way to go to catch up to top-end x86. Personally, I think "Wintel" has at least 10-15 years left in it.

Wintel will be around for much longer than that, but the tipping point where it's viewed as a legacy product might come sooner.

ARM-designed chips currently sell 25x more units than Intel's, which purely in terms of economies of scale is huge.

There is a PowerVR graphics chip that is supposed to offer console-style quality due for release next year, and ARM is just about to release its new 64-bit range.

There is no direct comparison, as ARM is just a chip designer and Intel still makes 53x the revenue.

That might work against Intel, though, as ARM's network of licensing designs to firms who in turn license fabrication is proving to be far more diverse and rapid.

1 in 3 kids in the UK has a tablet, 2 in 3 use one at home, and usage is currently tripling each year.

ARM might not replace the Wintel desktop, but what we use might not be a desktop.

Have a look at this and realize you can still make the same money on products that cost 25x less when you sell 25x more.

http://www.androidauthority.com/tegra-k1-in-depth-look-331548/
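
To put toy numbers on that last point, here's a quick back-of-the-envelope sketch; the unit counts and prices are made-up placeholders, purely to show the arithmetic:

```python
# Toy illustration of volume vs. price: placeholder numbers, not real ASPs or shipments.
intel_units = 1_000_000          # pretend Intel ships 1M chips...
intel_price = 250.0              # ...at $250 each
arm_units = intel_units * 25     # ARM licensees ship 25x the units...
arm_price = intel_price / 25     # ...at 1/25 the price

print(f"Intel-style revenue: ${intel_units * intel_price:,.0f}")
print(f"ARM-style revenue:   ${arm_units * arm_price:,.0f}")  # same total
```

The real margins per unit are obviously very different; this only shows why raw unit volume by itself doesn't settle the argument either way.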
 
To clarify: Ivy was the tick of Sandy, and Broadwell will be the tick of Haswell (a tock).

Edit: Tock = new arch. Tick = die shrink.
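
Laying out the cadence described above (process nodes added from memory, so treat them as approximate):

```python
# Intel's tick-tock cadence: tock = new microarchitecture, tick = die shrink of the previous one.
cadence = [
    ("Sandy Bridge", "tock", "new microarchitecture on 32nm"),
    ("Ivy Bridge",   "tick", "22nm shrink of Sandy Bridge"),
    ("Haswell",      "tock", "new microarchitecture on 22nm"),
    ("Broadwell",    "tick", "14nm shrink of Haswell"),
]

for name, phase, what in cadence:
    print(f"{name:<12} {phase:<4} {what}")
```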
 
Wintel will be around for much longer than that, but the tipping point where it's viewed as a legacy product might come sooner.

ARM-designed chips currently sell 25x more units than Intel's, which purely in terms of economies of scale is huge.

There is a PowerVR graphics chip that is supposed to offer console-style quality due for release next year, and ARM is just about to release its new 64-bit range.

There is no direct comparison, as ARM is just a chip designer and Intel still makes 53x the revenue.

That might work against Intel, though, as ARM's network of licensing designs to firms who in turn license fabrication is proving to be far more diverse and rapid.

1 in 3 kids in the UK has a tablet, 2 in 3 use one at home, and usage is currently tripling each year.

ARM might not replace the Wintel desktop, but what we use might not be a desktop.

Have a look at this and realize you can still make the same money on products that cost 25x less when you sell 25x more.

http://www.androidauthority.com/tegra-k1-in-depth-look-331548/

ARM is the firm that designs the chips, yes. The architecture they are using is from the '80s, possibly older; I'm not old enough to really know much about it. RISC is what ARM is using for the architecture, and the problem back in the day was getting clock speeds up with it. I am not sure why this was the case versus Intel's x86 architecture. It was, however, a good competitor around the 386/486 era, used in command-line Linux boxes or DOS, and there was also a Windows NT 4 RISC kernel as well. The big thing for ARM is not needing a lot of die space just for the CPU cores to get work done, unlike current Intel/AMD x86 CPUs.

For ARM-designed CPUs, ODMs can choose from different graphics processors as well. There might be more than what I list, but we have three to choose from: the first being the one you talked about, PowerVR (they also did desktop graphics for a short time); the second is Nvidia with its Tegra line of ARM-based CPUs; and the last, also from ARM, is the Mali cores. As I said, there might be more, but those are the ones that come to mind when I have seen ARM specs being posted for next-gen phones/tablets.

Now, if you really want to venture out and use ARM for your day-to-day computing, you do have options. You can buy dev kits for the Tegra platform, and you may find images for Android as well as Linux. The ODROID kits, not really dev kits, offer a lot of performance for the price versus some other ARM kits out there, if you're looking for more of a gaming-type ARM setup; that said, gaming on ARM and those kinds of graphics still have a ways to go. The last specs I saw for the Tegra platform had a graphics processor with 192 CUDA cores, which might put it around 2-3 generations behind, since I didn't find any info on which Nvidia architecture they were implementing. I will also say I haven't looked at specs much lately on newer hardware; more real life going on, so tech stuff has taken a back seat.
 
ARM is the firm that designs the chips, yes. The architecture they are using is from the '80s, possibly older; I'm not old enough to really know much about it. RISC is what ARM is using for the architecture, and the problem back in the day was getting clock speeds up with it. I am not sure why this was the case versus Intel's x86 architecture.

RISC = simple instructions, simple and efficient CPUs, and shifting most of the optimization burdens to the compiler
CISC = complex instructions, complex and inefficient CPUs, and shifting most of the optimization burdens to the hardware designer

RISC CPUs can often be clocked faster and have higher instructions-per-clock given the same die area.

RISC is old, and CISC is equally old. They have been competing for decades, and neither has been proven superior to the other, though most CPU architects nowadays agree that RISC is better suited to most applications.

Almost all programmers who have programmed in x86 assembly will agree that it really sucks as an instruction set architecture, with the tiny register set, instructions that only work on certain registers, and obscure instructions. x86-64 solves many of these problems, but it still needs to remain backward-compatible.

In the old days, compilers sucked at optimizing and programmers often wrote in assembly, so it was beneficial to have a higher-level assembly language to make programmers' jobs easier. Nowadays, with very good compilers, RISC comes out ahead because it gives compilers more flexibility for optimization (instruction scheduling for maximum use of ALUs, etc.). A good compiler can do a much better job at optimization than a good hardware designer, because the compiler has much more information about the program at its disposal.

In fact, all modern Intel CPUs are internally RISC. They have a frontend that translates CISC instructions into an internal RISC instruction set, and that's what actually gets run. The CISC-ness of x86 is definitely a drawback at this point in time.
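
A toy way to picture that frontend translation; this is purely illustrative (real decoders emit binary micro-ops, not strings, and the mnemonics here are made up):

```python
# Toy sketch of the frontend: one "complex" x86-style instruction gets cracked
# into a short sequence of simple, RISC-like internal micro-ops.
MICRO_OPS = {
    # read-modify-write on memory -> load, add, store
    "add [addr], reg": ["uop_load   tmp <- [addr]",
                        "uop_add    tmp <- tmp + reg",
                        "uop_store  [addr] <- tmp"],
    # register-to-register add is already "RISC-shaped"
    "add reg1, reg2":  ["uop_add    reg1 <- reg1 + reg2"],
}

def decode(instruction):
    """Return the internal micro-op sequence for a (made-up) x86-style instruction."""
    return MICRO_OPS[instruction]

for insn in MICRO_OPS:
    print(f"{insn}  ->  {' ; '.join(decode(insn))}")
```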

ARM CPUs are very modern designs with sophisticated branch prediction, cache management, superscalar execution, etc. They achieve higher performance and lower power usage per transistor than Intel chips. They just don't make big chips because there isn't a market for it.
 
To clarify: Ivy was the tick of Sandy, and Broadwell will be the tick of Haswell (a tock).

Edit: Tock = new arch. Tick = die shrink.

I just happened to be reading this earlier today, LOL. I have not budged off my Ivy setup because this ASRock Z77 OC Formula is the most awesome board I think I have ever had. I just love it. I will go forward with an upgrade of board, chip and memory when Broadwell comes out sometime in 2015. So that makes me a "Tick" guy :D

http://www.techradar.com/us/news/co...know-about-the-latest-core-processors-1251904

Let's be clear - the Haswell refresh isn't a die-shrink of the existing 22nm Haswell microarchitecture. That will have to wait for Broadwell. So in Intel's tick-tock processor cadence the new releases have gone from Sandy Bridge (tock), Ivy Bridge (tick), Haswell (tock) and Broadwell (tick).
 
I just happened to be reading this earlier today, LOL. I have not budged off my Ivy setup because this ASRock Z77 OC Formula is the most awesome board I think I have ever had. I just love it. I will go forward with an upgrade of board, chip and memory when Broadwell comes out sometime in 2015. So that makes me a "Tick" guy :D

http://www.techradar.com/us/news/co...know-about-the-latest-core-processors-1251904

Odd that Broadwell is a tick but supports a brand-new memory type, DDR4. They did that with socket 775 and C2D/C2Q going from DDR2 to DDR3, but the memory controller was not on-die at that point.
 