I got some complaints after writing this piece about hyperthreading.
To quote one:
“Many applications are written single
threaded and will by themselves not see increased results from a
hyperthreaded processor. But as soon as you run two applications (like
viewing a web page while doing some other task: downloading a file, viewing
a DVD, etc.) you have better overall performance. This is where the
improvement is. The newer benchmarks are working to measure this.”
Let’s look at this:
Hyperthreading: Intel’s Mini-Me
The idea behind hyperthreading is derived from the notion of multiple processors. The reason why people want to use multiple processors is very simple: two (or more) heads are better than one.
In a sentence, the difference between multiprocessing and hyperthreading is the difference between twins and the combo of Dr. Evil and Mini-Me.
To understand Intel’s Mini-Me better, we must first look at what twins can and cannot do well.
Two Heads Are Not Always Better Than One
You can deploy two heads in two different ways.
If you want to get a single task done faster, you need to get the two heads to work on the same task at the same time. This involves splitting up the task between the two heads and coordinating their efforts so that they together get the task done faster than if just one had been doing it.
A good example of such a task is grocery shopping. Two people can split up the list, pick up half the items each separately, and meet at the counter and get this done faster than if just one did it.
However, many tasks are not amenable to being split up in a way so that two people can get it done faster than one. Driving to the supermarket doesn’t take less time when you have two drivers rather than one.
Most computing tasks are (or at least are programmed to be) more like driving to the supermarket than shopping there. In those cases, multiprocessing (or multithreading) does you little good.
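The shopping-versus-driving distinction can be sketched in code. This is a minimal Python illustration with invented function names and numbers; note that CPython’s GIL means threads won’t actually speed up CPU-bound work, so the point here is the structure of the split, not a benchmark:

```python
from concurrent.futures import ThreadPoolExecutor

# "Shopping": the work splits cleanly -- each worker takes half the
# list, and the partial results combine at the end.
def partial_sum(items):
    return sum(items)

def split_sum(items):
    half = len(items) // 2
    chunks = [items[:half], items[half:]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        return sum(pool.map(partial_sum, chunks))

# "Driving": each step needs the result of the one before it, so a
# second worker has nothing to do until the first is finished.
def drive(start, steps):
    position = start
    for step in steps:
        position += step   # step N depends on step N-1
    return position
```

The second function is the common case the article describes: however many workers you have, only one can make progress at a time.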
However, like a single person, a computer can do more than one task “at one time.” In either case, nothing is really being done simultaneously; person and computer alike just shift rapidly from one task to another as needed.
It’s like broiling a steak and boiling potatoes “at the same time.” You don’t actually do anything to both at the same time, you take care of one, then the other.
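That rapid back-and-forth can be sketched as a round-robin loop over tasks. This is a toy Python illustration of time-slicing, with the steak-and-potatoes analogy standing in for real work:

```python
# Each "task" is a generator that yields after one small step, so a
# single cook can hop between them -- nothing happens to both dishes
# at literally the same time.
def broil_steak():
    for side in ("first side", "second side"):
        yield f"flip steak: {side}"

def boil_potatoes():
    for minute in range(3):
        yield f"check potatoes: minute {minute + 1}"

def time_slice(*tasks):
    """Round-robin over the tasks until every one is finished."""
    log = []
    queue = list(tasks)
    while queue:
        task = queue.pop(0)
        try:
            log.append(next(task))   # do one step of this task
            queue.append(task)       # not done -- back of the line
        except StopIteration:
            pass                     # this task is finished
    return log
```

The log comes out interleaved, one step at a time, which is all “at the same time” really means for a single cook or a single processor.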
In today’s environment, multiprocessing is more commonly used to figuratively put one person in charge of the steak, and the other in charge of the potatoes.
For this particular task, it doesn’t help much because one person can handle the task well enough without undue strain. Two people (or processors) do you little good.
There are situations, though, in which each of several tasks requires the full attention of one person.
If you’re cooking dinner for ten people, and you also need to greet and entertain guests at the same time, a single person is going to have problems getting all of these things done on time. It’s much better to have one person do the cooking, and another do the greeting.
Even before the guests show up, if the house has to get straightened up a little, you’d still be better off with a second person doing the last-minute tidying up rather than one person doing it all.
The same holds true for processing.
We can see that multiprocessing (or hyperthreading) only really helps under certain specific circumstances, namely, when there’s at least one task which can keep one processor fully occupied, and there are also other things to do.
Those who swear by multiprocessing describe the improvements as ranging from minor (Windows operates more smoothly while a major task is being completed) to major (two major tasks can be done well at once). It’s relatively rare that they say they can get a single task done faster (and in those cases, they’re running software programmed to take advantage of two processors at once).
There’s certainly nothing wrong with these things, but that’s not what the average computer buyer expects.
More often than not, he assumes that twice as much power means he can get a particular task done twice as fast. That’s not usually the case at all, which leaves an unhappy camper with extra bills and little extra (in his view) to show for it.
Just like Mini-Me is a mini-version of Dr. Evil, hyperthreading is a mini-version of multiprocessing. Rather than exact twins, you have a little twin that tries to squeeze in to help the big guy out.
Just like Mini-Me, hyperthreading sometimes helps, often makes little difference, and sometimes just gets in the way.
Any boost in CPU power shown by a benchmark like Sandra may be technically accurate as a measurement of the total potential power of Dr. Evil and Mini-Me combined. However, if you just look at the number and assume you’re buying a bigger stronger Dr. Evil capable of faster malevolence, you’re not going to be too happy when you see Mini-Me tagging along instead.
At this point in time, while we have a very good idea what multiple processors can and cannot do, we’re not so sure about Intel’s Mini-Me. He certainly will cost you less to feed and maintain than a twin, but he’s also capable of less, even under the best of circumstances. We know he helps sometimes, at least appears to be useless other times, and actually messes up Dr. Evil at least sometimes.
Nor are we too sure Mini-Me is up to handling some of those situations that twins handle very well. For instance, twins can handle two simultaneous major tasks very well; it’s at least questionable whether Mini-Me can.
The key in reviewing will be to try to thoroughly check out what Mini-Me can and cannot do, and see if he’s more help than bother to you for the price.