
Do we lose Windows optimizations with mobiles in desktops?


c627627

c(n*199780) Senior Member
Joined
Feb 18, 2002
c627627 said:
What potential are we talking about being lost if Windows doesn't display AuthenticAMD as the system's CPU?


dands42 said:

From the AMD WEB Site:

"Microsoft optimized the DirectX 8.0 interface for Windows XP specifically for the AMD Athlon XP processor. The AMD Athlon XP processor’s innovative QuantiSpeed architecture helps propel Windows XP application performance to a new level. To unleash the rich features found in Windows XP, AMD and Microsoft worked together to optimize applications like Media Encoder 8.0 for AMD’s 3DNow! Professional technology."

Does it have this automatically, or does Windows have to know that an AMD Athlon XP is present?


So what he said, does it have this (and anything else optimized for AMD) automatically or does Windows have to know that an AMD CPU is present?





The beginning of the thread centered on speed. However:

"AMD and Microsoft worked together to optimize applications like Media Encoder 8.0 for AMD’s 3DNow! Professional technology."

So in other words, only specific applications may be affected.

Detected and undetected AMD CPUs are obviously both as fast at the same settings; we're just trying to find out whether they would affect specific applications differently, and if so, which ones and how.



EDIT:

Gnufsh said:
I think that an important question is: Do the apps themselves check at runtime to see if optimized instructions are supported, or do they just ask the OS?
 
I wonder if there is an option in Windows XP somewhere to force AMD optimizations... hmm...
 
With the K6-2+ and K6-3+ mobile CPUs, many of the improvements over the standard K6-2 couldn't be implemented if the BIOS wasn't capable of recognizing the CPU.

So it could be that you may not be getting all the CPU has to offer.

But that was a very different situation. The plus series of the K6 were a final facelift on an otherwise obsolete CPU. AMD never made an equivalent CPU available for desktops.

These chips are mobile versions of a still widely used and competitive CPU. I'm sure the PowerNow! features aren't fully implemented in desktop use, but who needs that?

Some benchmarking of desktop Bartons vs. mobiles in otherwise identical systems ought to settle the question.
 
I am willing to take this on. Here is what I can offer. Next week is spring break, so I will have all kinds of time on my hands. The processors I have available are an 1800+ and a m2600+. I can underclock the mobile and compare them. If you guys would like me to do this, just send me a PM to remind me.
 
I can do whatever you guys want me to do to these things that you think will make it a fair and useful study. I will let you guys argue that. I can't do anything until Monday; by then I hope we should have a consensus on how to do this.
 
I'd be interested in the results of the back to back comparison of those two.

But to have truly valid results, I feel you would have to have the closest counterpart to a mobile Barton as possible. This would mean same PR, and as close as possible on the date code.

I'm sure some member of this board could do this.

It wouldn't completely settle the question, because results could vary from one motherboard make and model (and whether or not the BIOS programmer saw fit to include any possible optimisations for a mobile CPU) to another.
 
I would love for someone to send me a desktop 2600 to test, but I am not holding my breath. I doubt there is anyone out there who will send one to someone they don't know.
 
I have a desktop Barton 2500 and a mobile Barton 2500 right here, hehe, and I can tell you that in 3DMark01 they are equal if you run them both at 200x11.
 
 
Of course, the question is not whether they're the same CPU, the question is how does Windows recognizing "AuthenticAMD CPU" affect certain applications.
 
I doubt it does either. Does a laptop suffer from not having performance optimizations? As long as WinXP (or whatever OS) realizes that it's got an AMD CPU and loads the correct drivers and such for it, there should be no difference whatsoever.
 
I would think it's possible for them to make the CPUs so they don't work as hard in laptops, so they don't get as hot, which in turn would be the reason they are lower-wattage CPUs.
 
johan851 said:
I doubt it does either. Does a laptop suffer from not having performance optimizations? As long as WinXP (or whatever OS) realizes that it's got an AMD CPU and loads the correct drivers and such for it, there should be no difference whatsoever.

But my hunch is that Windows relies on the BIOS detecting the CPU properly. A laptop with a mobile Barton will undoubtedly have a BIOS that will recognize it properly.

This isn't necessarily the case with a desktop motherboard. A motherboard manufacturer might take the attitude of "why would someone run a mobile CPU in a desktop?" and not go to the extra trouble of writing any additional code necessary for proper recognition.

With the K6-2+ and Socket 7 motherboards, most manufacturers were good sports, and wrote BIOS that would properly recognize them. This was in spite of the fact that they were not approved for desktop use by AMD.

Some were not so accommodating. FIC really dragged their feet on K6-2+ recognition for the VA-503+. They even stated in their FAQs that it wasn't for desktop use, so they were not going to release a BIOS for it. They finally relented to popular pressure, stating that since some people in some parts of the world may not be able to buy the right CPU, they would support it.

We all know examples of hardware, such as video and sound cards, that will work with some sort of generic driver, but will often work far better with the proper driver. Being able to load the proper driver depends on the OS's ability to recognize the hardware for exactly what it is.
 
Because CPU-Z and other CPU detection programs read the CPUID code from the CPU, while Windows probably relies on the BIOS for CPU detection.

The BIOS in my laptop does not, for some reason, enable SSE on my mobile Athlon XP (SSE must be enabled on AXPs by setting a bit, I believe in the MSR). Most BIOSes do this, but apparently no one thought to include such a performance-enhancing feature in my BIOS. There are ways to enable it from within Windows, but the average consumer is not going to know about them, and so will miss out on extra performance from SSE-optimized programs. On the other hand, my Linux kernel reads my CPUID on boot, determines that the CPU does support SSE, checks to see if SSE support is enabled, and enables it if it isn't.

edit: I think that an important question is: Do the apps themselves check at runtime to see if optimized instructions are supported, or do they just ask the OS?
 