
Intel Core i3, Core i5, and Core i7 GPUs


zzzzzzzzzz (Member, joined Apr 7, 2009):
I have read that Intel Core i3, Core i5, and Core i7 integrate a GPU with the CPU.

Can the GPU be turned off?

If the GPU can be turned off and is turned off, should power consumption of the Intel processor be much less than when the GPU is on?

I would think that if the GPU part of the processor can be disabled, the CPU cores could be clocked higher than with the GPU enabled.
 
It's actually just the LGA1156 i3s and i5s, aside from the i5 750. You will need a new H55 (or similar chipset) board for the dual-core integrated-GPU chips. I believe most of these boards currently allow you to disable the on-chip graphics. Hey Zeus is the first person that comes to mind who has experience with these chips; maybe he will chime in before long and give you more details.
 
The GPU can be disabled, but since it's on the same package as the CPU, the power savings are presumably not great over basic 2D idle (and any discrete GPU is going to use more). However, Gigabyte specifies that the maximum supported memory with the integrated GPU is 1600, and that reaching 2200 and above requires a CPU without one (not just disabled). Also, a beta BIOS has added a GPU clock adjustment, but I have not tried it.
 
The i5 750 has lower idle power usage than the lower-end i5s/i3s with on-package video. Though depending on the setup, you may be able to lower idle power by choosing certain parts, i.e. RAM/GPU/HD.

Will you be OCing the system?
 
With all advanced CPU functions enabled in BIOS/CMOS setup, Gigabyte-Intersil Dynamic Energy Saver reports power consumption by the CPU at idle of 2.061W, reduced to 0.393W when DES is turned on.
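For scale, those DES figures work out to only a small annual saving. A quick back-of-the-envelope sketch (the electricity rate is an assumed example value, not a figure from this thread):

```python
# Back-of-the-envelope check of the DES idle figures quoted above.
idle_w_des_off = 2.061   # W, CPU idle draw with DES off (reported)
idle_w_des_on = 0.393    # W, CPU idle draw with DES on (reported)
rate_per_kwh = 0.12      # $/kWh, assumed example electricity rate

saved_w = idle_w_des_off - idle_w_des_on
kwh_per_year = saved_w * 24 * 365 / 1000   # W -> kWh over a year
cost_saved = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.1f} kWh/year saved, ${cost_saved:.2f}/year")
```

At those idle numbers the difference is on the order of 15 kWh a year, i.e. a dollar or two at typical residential rates.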
 
With all advanced CPU functions enabled in BIOS/CMOS setup, Gigabyte-Intersil Dynamic Energy Saver reports power consumption by the CPU at idle of 2.061W, reduced to 0.393W when DES is turned on.
What is "DES"?
 
will you be ocing the system?
Possibly. I was hoping that the CPU itself could be overclocked substantially (perhaps via a multiplier adjustment) after the GPU part of the processor is disabled and no longer drawing power.

I am also trying to take into account the net present value of the cost of power consumption.
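The NPV idea above can be sketched by discounting each year's electricity cost back to today; all inputs here are assumed example values, not figures from this thread:

```python
# Net present value of a recurring yearly power cost.
annual_cost = 10.0     # $/year electricity cost (assumed example)
discount_rate = 0.05   # 5% per year (assumed example)
years = 5

# Discount each year's cost back to the present and sum.
npv = sum(annual_cost / (1 + discount_rate) ** t for t in range(1, years + 1))
print(f"NPV of {years} years of power cost: ${npv:.2f}")
```

With these inputs the NPV comes out below the simple 5 x $10 total, which is the whole point: money spent on power later counts for less than money spent today.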
 
However, Gigabyte specifies that the maximum supported memory with the integrated GPU is 1600 and that reaching 2200 and above requires a CPU without one (not just disabled).
What should be the units for the memory numbers?
 
Possibly. I was hoping that the CPU itself could be overclocked substantially (perhaps via a multiplier adjustment) after the GPU part of the processor is disabled and no longer drawing power.

I am also trying to take into account the net present value of the cost of power consumption.

Clarkdale can easily be overclocked to 4GHz with the IGP enabled or disabled. I've had a few runs with the IGP enabled at 4.9GHz with no stability issues.
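For reference, most Clarkdale chips have locked upper multipliers, so overclocking is typically done by raising the base clock (BCLK). A quick sketch of the arithmetic, assuming a hypothetical 25x multiplier (check your own chip's multiplier):

```python
# Core clock = BCLK x multiplier on LGA1156.
stock_bclk = 133.0   # MHz, LGA1156 stock base clock
multiplier = 25      # assumed example multiplier, not from the thread

stock_ghz = stock_bclk * multiplier / 1000
target_ghz = 4.0
needed_bclk = target_ghz * 1000 / multiplier   # MHz of BCLK required

print(f"Stock core clock: about {stock_ghz:.2f} GHz")
print(f"BCLK needed for {target_ghz} GHz at {multiplier}x: {needed_bclk:.0f} MHz")
```

At a 25x multiplier, a 4GHz target means roughly a 160MHz BCLK, which also raises the uncore and memory clocks along with it.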
 
Clarkdale's memory controller is in the GPU die, so even when disabled it will draw some power.

I have a post linked in my sig that compares all these CPU specs if you are interested, I find it helpful because it's all too confusing now.
 
What should be the units for the memory numbers?

MHz

Clarkdale can easily be overclocked to 4GHz with the IGP enabled or disabled. I've had a few runs with the IGP enabled at 4.9GHz with no stability issues.

Good to know. But what about over time?

Also, the previously mentioned GPU clock adjustment made available by Gigabyte apparently goes up to 850MHz from the original 733MHz (bridging the gap to the i5-661's 900MHz).
 
Who cares, they're 100 dollar chips :)

Yeah, but reliability can be worth more than the cost of the parts. Aside from that, consider that the same amount could buy two cheaper CPUs instead of one higher-performance CPU in the first place. Maybe it's just me, but overclocking is more about giggles than value these days.
 