There’s an article about an allegedly extremely pre-production Tejas sighted in Taiwan.
According to x86-secrets.com, it’s so pre-production, it’s pre-generational. They show a picture of a Prescott, and it sure does look identical to the “Tejas” being displayed.
Whether or not this particular report is a matter of gullibility, there has been some sort of odd multi-core Intel CPU floating around out there somewhere: Taiwanese mobo makers are planning for 150-watt Intel CPUs, and Intel does plan dual-core desktop processors for 2005.
So the picture may be false, but the concept isn’t.
We mentioned the likelihood of this before, and pointed out that each core in a dual-core processor will probably be slower than one might otherwise expect: a dual-core chip is more likely to pair two 3 or 4GHz cores than two 5 or 6GHz cores.
There’s one big problem with these things, though.
Wanted: Software Support
If dual core is to become a standard in 2005 (and apparently both AMD and Intel are going to go this route), it is going to require a lot more software support than it has received in the past. Serious dual-processor support has only been found in a handful of applications.
Either that, or a dual-core CPU is going to be no more useful than a DP system is on a desktop today. Yes, DP owners, I know that operations can be smoother with two heads rather than one, and there’s something to be said for that. However, that’s not exactly what most people seeing systems advertised as 6GHz will be expecting. They’re looking for faster, not smoother. Telling them that they can burn a DVD and play Doom III at the same time smoothly is unlikely to impress most of them.
The headline is going to read, “6GHz Processor Runs No Faster Than 3.5GHz Processor,” not, “6GHz Processor Runs Much Smoother.”
Intel has paved the way a bit with Hyperthreading, but it’s not like everyone has jumped on that bandwagon, either.
Changing software to take advantage of two processors is fairly involved most of the time, and sometimes it’s impractical. Some software tasks are like going to the bathroom: you don’t go twice as fast if you bring somebody along. 🙂
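The bathroom analogy has a textbook formulation the article doesn’t name: Amdahl’s law, which says the serial fraction of a task caps the speedup no matter how many cores you add. A minimal sketch (the function name and example numbers are mine, purely illustrative):

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: speedup = 1 / ((1 - P) + P / N).
    # The serial part (1 - P) never shrinks, so it caps the gain.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A task that is only 50% parallelizable gains just ~1.33x from two cores...
print(round(amdahl_speedup(0.5, 2), 2))   # 1.33
# ...and a fully serial "bathroom" task gains nothing at all.
print(amdahl_speedup(0.0, 2))             # 1.0
```

Which is exactly why a 2x3GHz chip won’t look like 6GHz to most buyers.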
Even when a task can be profitably split up and the work shared, it’s not just a matter of recompiling code; it’s a matter of rethinking and rewriting it. And if tasks get done fast enough using one processor, a big rewrite to make good enough twice as good will probably be a low priority.
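To make that “rethinking and rewriting” concrete, here is a minimal Python sketch of what the restructuring looks like: the programmer has to explicitly carve the work into independent chunks and farm them out before a second core can help at all. All names here are illustrative, not from any real application (and on CPython, a genuinely CPU-bound job would need processes rather than threads to dodge the GIL):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles an independent slice -- no shared state to untangle.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=2):
    # Split the input into one chunk per core, then combine the results.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

Even in this toy case, the single-processor version (`sum(x * x for x in data)`) had to be torn apart and reassembled around chunking and worker management, and the answer only comes out the same because squaring one element never depends on another.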
The role of software in determining how fast the hardware will be is often underestimated. Here, it will truly be a case of the tail wagging the dog.