There’s a review of the SiS 648 here.
This looks like a real launch, unlike the P4X333. There’s already one out there on Pricewatch.
The pattern that you see in this review will very likely remain the same through the rest of this year, up to and including the first generation of dual DDR boards.
SiS and Via boards will officially support the latest non-memory standards; Intel will trail. The same goes for items like memory speed and FSB: Via and SiS will officially support the latest standards while Intel won’t, though Intel’s equipment will work at those speeds anyway.
Benchmarking done by “official” standards will show the Via and SiS chipsets ahead, probably by 2-4% in office-type benchmarks, closer to 10% in gaming benchmarks.
I don’t think it’s going to be as easy as that.
I’m not talking about nonperformance factors. I think the real-world (as opposed to typical website benchmarking) performance differences will be a lot closer, and might even put Intel ahead.
The reason is that most websites nowadays run benchmarks like they’re PC Magazine: at official settings. By definition, overclockers don’t do that.
If you buy a SiS or Via board because it’s 10% faster than an Intel board in some gaming benchmark, the gap at the settings you actually use is unlikely to be anywhere near as wide.
Put another way, seeing benchmarks run at 133/200 for a SiS and 133/166 for an Intel 845G isn’t going to mean much if you plan to run your system at 150/200. The size of the gap will likely be quite different.
You may say, “It will be just a few percentage points.” Sure, but a few percentage points is what people are basing their buying decisions on. If you buy SiS or Via because of a 10% difference in some benchmark, does the decision change if the difference is actually 5%? 3%? 2%?
More Than One Way To Get A Number
That’s fairly clear-cut and testable by a single person. There’s another factor far less clear-cut and testable that is likely to be just as important.
Numbers aren’t everything. Stability counts, even if numbers are everything to you.
I don’t mean stability in the sense of “how many times does Windows crash a day” (though that’s certainly very important in the minds of many).
By stability, I mean “I can run this processor stably at 2600MHz on Board A but only at 2400MHz on Board B.”
You’re never going to find that out if the review just runs at spec (and even if they did, it’s almost certainly wrong, see below for more).
Let’s assume for argument’s sake that a SiS 648 board does 10% better “officially.” Let’s presume 6% of that gap goes away if you run both boards at the same settings (this kind of change is very likely).
Let’s presume you can run your processor 200MHz faster on the Intel board (this is a good deal iffier, but a few people who have gone from SiS to Intel boards have reported much bigger changes). That should be good for about 5%, more than covering the remaining 4%.
Now the Intel board in reality is slightly better.
So instead of buying a system that’s 10% faster, you actually end up with one a tiny bit worse. That’s because you went with a number that was absolutely scientific, absolutely provable, and absolutely wrong. Why? Because your number didn’t take all the relevant factors into account, and the factors that got ignored just got up and bit you in the ass.
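The arithmetic of that scenario can be sketched in a few lines. Every figure below is the hypothetical one from the argument above, not a measurement:

```python
# All figures are the hypothetical ones from the scenario above, not measurements.
official_gap = 0.10       # SiS leads Intel by 10% at "official" stock settings
settings_recovery = 0.06  # portion of the gap that vanishes at equal settings
remaining_gap = official_gap - settings_recovery  # ~4% still favors SiS

# Suppose the extra 200MHz of headroom on the Intel board is worth about 5%.
intel_clock_bonus = 0.05

net_intel_advantage = intel_clock_bonus - remaining_gap
print(f"Net Intel advantage: {net_intel_advantage:+.0%}")
```

Under those assumptions, the board that loses the official benchmark by 10% ends up about a point ahead in practice.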
There is at least a reasonable possibility over the next six to nine months that we will see the spectacle of the review sites all saying one thing about PIV mobos while all the real overclockers say the opposite. Forget about hanky-panky; this could well happen for 100% legitimate reasons. The review sites will be doing one thing, the overclockers something else.
But then what’s the use of review sites to overclockers?
Can We Stop The Intel Ad Now?
Take a look at the PIV categories in our CPU databases that get the traffic. Not too many SiS and Via boards up near the top, are there?
You may say, “Once these new SiS and Via boards show up, they’ll show up near the top, too.” Oh ye of too large faith. Will they? At the very least, it’s something to check out before you buy, isn’t it?
I can’t say for sure that an Intel board can make up a 10% gap, but based on the evidence we have so far, it’s hardly a wacky notion. Narrowing the gap through higher settings is practically a given, and stability may close the rest. At the very least, these are items that should be looked into.
The problem is you’ll never know the possibility even exists if you just look at the typical comparison nowadays.
If You Get Extremetech Benchmarks, You Might As Well Go To Extremetech For Them
I mean really, if the place you go to just runs benchmarks at spec, there’s no difference between Hip Cool Attitude Website and Extremetech. They’re doing exactly the same thing.
Now if you’re running your system exactly at spec, that’s fine and dandy. Are you?
It’s Not Just A Matter of Overclocking, Either
This is not a matter of Ed Stroligo or Overclockers.com playing Mighty Mouse, singing “Here we come to save the day,” then overclocking a few boards. That would be just as bad, though for entirely different reasons.
The reason for this is rather simple. Manufacturers set up their testing mechanisms to ensure, one way or the other, that the vast majority of the products meet a certain standard. Either through design or testing, those items that don’t meet that standard get eliminated, so practically all variance on the low end gets eliminated. That’s why components almost always work at their rated speeds.
When you overclock, you usually (though not always) exceed the point where variance is eliminated. That’s part of the reason why people with the same components get different results.
When you’re in the realm of natural variance unchecked by human intervention (i.e., weeding out lower-performing CPUs), you can’t speak about certainties. Only statistical likelihoods, and you can’t do that with a sample of one (or even a handful).
Even when it looks like you can, you’re actually not. For instance, it’s probably pretty safe to say you can run any late model 1.6A Northwood at 2.13GHz. All that means is that Intel either doesn’t make or allow any CPU that can’t do at least 2.13GHz.
You can’t say the same about a 1.6A doing 2.6GHz, though. Now we’re in the realm of statistics, because we’re in the realm of natural variance.
I have two 1.6A PIVs here. I can’t get one past 2.48. The other can do 2.67 with relatively little fuss. The first is on the middleish left-hand side of the bell curve, the second is probably more towards the right of the bell curve.
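A quick simulation shows why “any late-model 1.6A does 2.13GHz” and “a 1.6A does 2.6GHz” are statements of very different kinds. The bell curve below is invented for illustration (the mean and spread are my assumptions, not measured data):

```python
import random

random.seed(0)

# Hypothetical population of 1.6A Northwoods: assume maximum stable clocks
# are bell-curved around 2.5GHz with a 0.12GHz spread. Illustrative only.
population = [random.gauss(2.5, 0.12) for _ in range(10_000)]

frac_213 = sum(clock >= 2.13 for clock in population) / len(population)
frac_260 = sum(clock >= 2.60 for clock in population) / len(population)

# 2.13GHz sits far down the left tail, so essentially every chip clears it;
# 2.60GHz sits to the right of the mean, so only a statistical minority does.
print(f"Reach 2.13GHz: {frac_213:.1%}")
print(f"Reach 2.60GHz: {frac_260:.1%}")
```

The first number is a near-certainty because the low end of the curve has been cut off by the manufacturer’s own standard; the second is a roll of the dice on where your particular chip landed on the curve.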
And that’s with everything else being the same. Add in just one new variable (e.g., a different motherboard), and you greatly complicate the picture.
A PC consists of many components that can affect how far you can go. So the overclocking results from just one system are virtually useless as a predictor of how your system will overclock, even if you use exactly the same components as the system tested.
Precise and Wrong vs. Sloppy and Right
One person overclocking with one system will give a nice precise number as to the degree of overclock. It will almost certainly be wrong for the vast majority of people.
To get a better idea, you need to see what a ton of systems do. Unfortunately, there’s nothing “scientific” about it. Take a hundred systems, and you’ll have a hundred different configurations handled a hundred different ways.
Very unscientific. The results will also be very imprecise; looking at, say, our CPU database involves a good deal of guesswork and guesstimating. The answer won’t be a neat little number, it will talk about likelihoods and unlikelihoods.
But guess what? For all its mechanical, ritualistic failings, it will probably be more accurate and reflect reality better than any review. Do you know why? Because the value of a much larger sample population properly analyzed far outweighs the variances within that sample population.
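The sample-size point can be demonstrated with another small simulation, again using an invented bell curve of maximum overclocks (the numbers are assumptions, purely illustrative). Compare how much a single reviewer’s board bounces around versus the median of a hundred messy user reports:

```python
import random
import statistics

random.seed(1)

# Illustrative only: assume maximum stable overclocks are bell-curved
# around 2.5GHz with a 0.12GHz spread.
def sample_median(n):
    """Median max overclock from n independently drawn systems."""
    return statistics.median(random.gauss(2.5, 0.12) for _ in range(n))

# Repeat the "experiment" 1,000 times: one reviewer's single board
# versus a database of 100 user reports.
singles = [sample_median(1) for _ in range(1_000)]
crowds = [sample_median(100) for _ in range(1_000)]

print(f"Spread of single-board results: ±{statistics.stdev(singles):.3f} GHz")
print(f"Spread of 100-report medians:   ±{statistics.stdev(crowds):.3f} GHz")
```

The single board gives you a precise-looking number that wanders all over the curve; the sloppy pile of a hundred reports settles within a few dozen MHz of the truth, even though no two of those hundred systems were configured alike.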
I Came Here For Overclocking, Not Statistics
Sorry, guy. Statistical probability is as closely tied to overclocking as stink is to sh**. You can ignore it or refuse to believe it, but it still affects you. It’s like the guy who believed that sh** didn’t stink, put a turd in his pocket, went to the dance, and wondered why nobody wanted to dance with him.
There are people who sincerely believe the earth is flat, too. It’s certainly their opinion. I’m sure it makes their lives mentally simpler. They certainly have a right to that opinion. What they don’t have is the right to be considered right.
Sorry, but variance in this field is as real as your CPU. So are statistical probabilities. Reality is often complicated and messy and can’t be summarized in a nice easy package.
Do I know for sure that Intel boards on the whole will overclock better than SiS or Via boards? No, but based on prior history, it’s not an unreasonable proposition to be tested through the experience of many people, not just one. That’s a lot smarter than just ignoring it.
What We’re Going To Do
If someone else beats us to it, all well and good, but we’re likely to pick up a SiS board and do a little testing just to demonstrate how much of a “real” gap there is between it and an Intel board in a real life overclocking situation.
The change in performance due to equivalent settings will probably be accurate enough to rely upon, but I wouldn’t consider any difference in the degree of overclocking I personally find to be nearly so reliable. For that, I’ll keep an eye on our databases and other sources of mass data and see what emerges.
It will be hard to do. It won’t be precise. It will require some personal judgment.
But it’s worked for us before, better than any testing we’ve done, and we prefer accuracy over precision.