Electricity consumed = watts of heat generated.
The temperature of the chip has remarkably little to do with how many watts of power it is drawing, however. Core temps depend on a variety of things, including but not limited to: the watts being consumed, the core design, the location of the sensors in the core, the thermal interface between core and heatspreader, the thermal interface between heatspreader and heatsink, the heatsink itself, the airflow over the heatsink, the motherboard temp, the number of pins in the socket, the material the socket pins are made of, and so on.
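Most of the factors above can be thought of as thermal resistances in series between the die and the air. A minimal sketch of that idea, with made-up resistance values (real chips are more complicated, but the chain model captures why the same wattage can land at very different temps):

```python
# Sketch: steady-state core temp as ambient temp plus power times the total
# thermal resistance of the die -> heatspreader -> heatsink -> air chain.
# All resistance values here are invented for illustration.

def core_temp(power_w, ambient_c, resistances_c_per_w):
    """Core temp for a series chain of thermal resistances (C per watt)."""
    return ambient_c + power_w * sum(resistances_c_per_w)

# die-to-IHS TIM, IHS-to-heatsink TIM, heatsink-to-air (hypothetical numbers)
chain = [0.125, 0.0625, 0.0625]
print(core_temp(130, 25, chain))  # prints 57.5
```

Change any one link in the chain (a worse TIM, a smaller heatsink, less airflow) and the reported temp moves, even though the watts being consumed haven't changed at all.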
An excellent example is comparing the temp of a 140w PhII 965 to the temp of a 130w i7 950 on the same cooler. The PhII draws 10 more watts but runs significantly cooler.
Why?
Core temp probe placement and differences in how well each die moves heat out to the heatspreader.
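That comparison can be sketched in numbers. With the same cooler on both chips (so the external resistance is identical), a chip drawing more watts can still report lower core temps if its internal die-to-heatsink path is better and its sensors read differently. Every resistance and offset value below is invented purely to illustrate the point, not measured from real hardware:

```python
# Hedged illustration of the PhII 965 (140 W) vs i7 950 (130 W) comparison.
# Numbers are made up; the point is that reported temp depends on the
# internal thermal path and sensor placement, not just on watts drawn.

def reported_temp(power_w, ambient_c, internal_r, cooler_r, sensor_offset_c=0.0):
    """Reported core temp: ambient + power * (internal + cooler resistance),
    plus a sensor placement offset (all hypothetical values)."""
    return ambient_c + power_w * (internal_r + cooler_r) + sensor_offset_c

COOLER_R = 0.125  # same heatsink on both chips (C/W, made up)

phii = reported_temp(140, 25, internal_r=0.125, cooler_r=COOLER_R)
i7 = reported_temp(130, 25, internal_r=0.1875, cooler_r=COOLER_R,
                   sensor_offset_c=5)

print(phii, i7)  # prints 60.0 70.625
```

The hypothetical 140 W chip reports cooler than the 130 W chip here, even though it is dumping more total heat into the room.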
In my personal opinion, a qualifier needs to be attached any time anybody says "hot cpu" or "runs hot". The 140W CPU draws more power and puts out more heat, since heat output is measured in watts. Cooler core temps don't mean it puts out less heat than the 130w i7; core temps actually have little to do with heat production.
The 3770k is guaranteed to be a lower heat production CPU.
The 3770k may well have higher core temps for a given heat output. In fact I would be very surprised if the temps were not higher for a given heat output.
If we're all lucky, the lower heat production will be a large enough factor that they don't constantly overheat.