Can you/did you run tests with REAL applications and see what happens? Would "real" applications show the same behavior when changing the ID?
(your picture is blocked at my office so I can't see your response - host the images here if you would be so kind)
Haven't figured out how to host an image through Overclockers yet. The picture I posted is from my Excel chart showing AIDA results and how MASSIVELY they change. I'm only done testing benchmarks so far and haven't gotten to "real world" applications or games yet (they will be forthcoming).

So far I've seen huge differences with AIDA, even to the point that when I set the vendor string to "Bubba Hotepp" I get the message "the benchmark is not optimized for your CPU". That means it's selecting optimized code paths based on which vendor string or model number it detects, NOT on which feature flags it reads from CPUID. That, IMHO, makes it completely unreliable.

I'm seeing suspicious results in PCMark 7, 3DMark06, 3DMark 11, and PassMark PT7, among others. IF they aren't doing the same thing as AIDA (which I suspect is a result of them being built with Intel's compiler and/or libraries), then even the fact that the results change as much as they do, regardless of the reason, makes them unreliable as a comparison of CPU performance.