OK, my real question is: what performance difference would there be between a 266MHz FSB Barton 2600+ running at 166MHz x 13 = 2158MHz, and a 333MHz FSB Barton 2600+ at 166MHz x 13? I'm not sure if they have different sized caches or not, but set cache size aside. I mean, couldn't AMD come out with a 500MHz FSB chip tomorrow just by making its default values 250MHz x a lower multiplier? My 266 T-Bird will do over 225MHz FSB.
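To make the arithmetic in the question concrete, here's a minimal sketch (my own illustration, not anything AMD publishes). It assumes the Athlon's bus is double-pumped, so the marketed "FSB speed" is twice the actual bus clock, and the core clock is just actual bus clock times multiplier:

```python
# Sketch of Athlon XP clock arithmetic (assumed model, not official spec):
# the EV6 bus transfers data twice per clock, so a "333 FSB" part actually
# runs a 166 MHz bus clock; core clock = bus clock x multiplier.

def core_clock_mhz(bus_mhz: float, multiplier: float) -> float:
    """Core clock in MHz from the actual bus clock and CPU multiplier."""
    return bus_mhz * multiplier

def rated_fsb_mhz(bus_mhz: float) -> float:
    """Marketed (effective) FSB rating for a double-pumped bus."""
    return bus_mhz * 2

# The numbers from the question:
print(core_clock_mhz(166, 13))  # 166 x 13 = 2158 MHz
print(rated_fsb_mhz(166))       # a 166 MHz bus is sold as "333 FSB"
print(rated_fsb_mhz(250))       # a 250 MHz bus would be sold as "500 FSB"
```

Under this model, a 266 and a 333 part both set to 166MHz x 13 end up at the same 2158MHz core clock; the only difference would be cache (if any) and how far each chip's silicon happens to overclock.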
I'm asking this just to understand this stuff better. I'm not really seeing why default FSB speeds matter; you can make them whatever your mobo/memory can take. I guess the 333MHz core should overclock higher than the 266MHz core, based on the L12 (wire trick) mod results, but that's about all I see.
I was also kinda looking at getting a 400MHz FSB Barton when they get below $150 (a ways away, I know).
What are your thoughts on this?