
Excavator core FX coming soon?

They do have that rep. The first gen Slot A Athlons, on the other hand, outperformed same-clocked Slot 1 P3s most of the time.

The Thunderbirds were competitive but ran way too hot, with no initial thermal protection. The Palominos wouldn't scale and had some serious yield issues. The Thoroughbreds fixed that, and the Bartons built on that with more cache. The next chips, the Athlon 64s, were really their golden era before Intel introduced the Core 2 Duo. They never really enjoyed the same performance advantage again. The 1st gen Phenom sucked, though the 2nd gen chips were rather nice.

Looking back, the Bulldozer architecture is the only ground-up design which was a complete dud. K5, K6, K7 and K8 were all competitive at their respective price points. Mind you, during the K6 era it was hard to beat Cyrix chips on value. The M3 sold for $26, the mainboard only slightly more.
 
I think you mean P4, not P3. The P3 was beating AMD at the same clock, but AMD quickly went above Intel's clocks. Actually the P3 was beating the P4 up to 2.4GHz+. Later Intel built the Pentium M on roughly the same architecture.
AMD / Socket A was much better for most users because of the much lower price of both CPUs and motherboards. The P4 was also slower, so AMD started using that "+" rating, like the 2600+ which ran at 1.8GHz and was almost as fast as a P4 2.6GHz.
Socket A, 754 and 939 were great series, and for home users they were much better and much easier to OC than anything Intel could offer. After 939 AMD had nothing interesting, as Intel released Conroe which had maybe 30-40% higher performance than the older series. Somewhere around then AMD got stuck on old designs, as AM2 was not much better and was not as good for OC. Phenoms were something of an improvement, but Intel already had good quads. When Intel moved to Core i, AMD was still on Phenoms; when Intel moved to the 2nd gen of Core i, AMD was still selling Phenoms ... and later released Bulldozers which were slower clock for clock than Phenoms and generated more heat ... that was like 5 years ago.
 
Good info, Woo!!
I remember when the Phenom Agena B-series stepping was advertised as the "first true quad core" as it was launched just before Core 2 quads. That was a main selling feature of this AMD processor. Shortly after, Core 2 stomped AMD because of the TLB issues on Agena processors. That TLB erratum really hurt AMD because they had a good thing going....

I started delidding on Agena cores. The Phenom 9850BE was my first de-lid (for soldered CPUs). It helped greatly with temps but was by no means a cure for the TLB erratum. I never saw stability past 3GHz.
 
Good info, Woo!!
I remember when the Phenom Agena B-series stepping was advertised as the "first true quad core" as it was launched just before Core 2 quads. That was a main selling feature of this AMD processor. Shortly after, Core 2 stomped AMD because of the TLB issues on Agena processors. That TLB erratum really hurt AMD because they had a good thing going....

Core 2 quads were already out (Kentsfield, Q6600 etc.). They called Phenoms the first true quad cores because the C2Qs were a multi-chip module design (essentially two dual-core CPU dies), unlike the Phenom CPUs. Penryn (or Yorkfield for the quad cores), the die shrink of Kentsfield, came out around the same time as the Agena Phenom CPUs, and even then the Phenoms were slower than Kentsfield.
 
Still, two dual cores in one package were a better idea than the 2-threaded modules of FX ;) If I'm right, then now only Kabini and one more APU (I don't remember its name) are real quad cores on the AMD side. I will say it again ... if AMD could at least make a sandwiched Kabini with 8-12 real cores, a double memory controller and additional PCIe lanes, it would be a great CPU. Looking at Kabini performance at this low clock, it's a pretty fast chip ... just low frequency.
 
Also, the AMD XP Thoroughbreds ran a 166 FSB double pumped (333MHz) up until 1.7GHz / 2100+, when they hit a wall. AMD added an extra layer to relieve congestion and was able to scale further. When AMD came out with the Barton core (200 FSB double pumped to 400), Intel had already released the P4 @ 2.5GHz. With this new release, Intel moved from the 100 FSB quad pumped (400MHz) to the 133 FSB (533MHz). Later, @ 3.0GHz, they moved to a 200 FSB (800MHz quad pumped), which totally put them ahead of the AMD XP. When Intel hit the thermal wall with the P4, the Israeli R&D section of Intel came to the rescue. The P III was dropped @ 1.2GHz for the P4 NetBurst uarch, but it still had life left in it. Maximum PC magazine took a Pentium M mobile CPU and pitted it against the AMD FX-53 and the P4 EE edition. When the chips were OCed, the Pentium M beat both :) The Pentium M uarch was based off the P III uarch.
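For anyone keeping score on those bus figures, the effective rate is just the base clock times the pump factor, and peak bandwidth is that times the 64-bit (8 byte) bus width. A quick sketch (my own numbers, not from the magazine article):

# Effective FSB transfer rate and rough peak bandwidth for the buses mentioned above.
# effective MT/s = base clock (MHz) x pump factor; bandwidth = MT/s x 8 bytes (64-bit bus)
buses = [
    ("Athlon XP Thoroughbred", 166, 2),  # double pumped -> 333 MT/s
    ("Athlon XP Barton",       200, 2),  # double pumped -> 400 MT/s
    ("P4, 400 bus",            100, 4),  # quad pumped   -> 400 MT/s
    ("P4, 533 bus",            133, 4),  # quad pumped   -> ~533 MT/s
    ("P4, 800 bus",            200, 4),  # quad pumped   -> 800 MT/s
]

for name, base_mhz, pump in buses:
    mt_s = base_mhz * pump        # million transfers per second
    gb_s = mt_s * 8 / 1000        # 8 bytes per transfer
    print(f"{name:24s} {base_mhz} MHz x{pump} = {mt_s} MT/s, ~{gb_s:.1f} GB/s peak")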

Thank You For Your Time
In Reading My Post
 
A 3GHz P4 with the 800MHz bus was about as fast as a 3000+ AMD Barton and later the single core 3000+ A64 (these had a lower clock than Socket A). What was also weird, Northwoods were usually faster than Prescotts while using less power. Prescotts had some design flaws which were corrected later.

I remember when I was preparing a computer for a press article about the fastest PC on the market. I decided on the FX-55 Clawhammer (there was no faster CPU that day) while a competing company built a P4 3.46 EE based PC. We got a list of benchmarks to prepare the computers for the test. In the end the FX-55 was faster in every game, 3DMark etc., but the P4 EE was faster in Office applications and one more test. The editors published results from Office and similar applications only and added one 3DMark at the end, so readers would think the P4 was much better. In the summary they said the P4 PC was better overall but the AMD FX would be better for home use. Looking at all the results, AMD won ~80% of the tests and the whole PC was still ~30% cheaper. Simply put, they could keep the P4 PC while we didn't want to let them keep the AMD/FX (it cost about $5k back then). Let's say not much has changed in most editorial offices since then ;)
 
I sold my AM3+ over a year ago so I don't care. If they release something good then maybe I will buy it. I don't like AM3+ anyway, as there are various issues with heat etc. The only board I liked was the CHV. Actually one of the best boards I've had (on a modded BIOS, or maybe a newer version which could fix the RAID problems ;) ).
 
Double-dis. ^^^^

If it's still based on / usable with AM3+ I might try it, but I'm not prepared to upgrade to an entirely new socket just yet.
Trying to get the most life from what I have now.

I tried newer-ish Intel stuff. It's fast. BUT

unzipping is slower. I found installing a new OS and unzipping files for installation happened faster on AMD systems.
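If anyone wants to put actual numbers on that, a rough timing sketch like the one below works on both boxes (the archive name is just a placeholder, use whatever big zip you have lying around):

# Rough wall-clock timing of extracting a zip, for comparing the two systems.
import time
import zipfile

ARCHIVE = "test_files.zip"  # placeholder: any large-ish archive

start = time.perf_counter()
with zipfile.ZipFile(ARCHIVE) as zf:
    names = zf.namelist()
    zf.extractall("unzip_test")  # scratch output folder
elapsed = time.perf_counter() - start

print(f"Extracted {len(names)} files in {elapsed:.1f} s")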


I sold my AM3+ over a year ago so I don't care. If they release something good then maybe I will buy it. I don't like AM3+ anyway, as there are various issues with heat etc. The only board I liked was the CHV. Actually one of the best boards I've had (on a modded BIOS, or maybe a newer version which could fix the RAID problems ;) ).

Yeah, my Formula-Z is tits. Much better than even their Sabertooth board, which does really well for most people. Never had an issue with RAID on AMD other than picking the proper driver for it, whereas Intel is pretty much plug and play with an MS OS. That may change with W10, but I doubt that lol.
 
Pretty sure the 1st gen Athlon was marginally ahead. Then Intel released the P4, and funnily enough a couple of revamped P3's (Tualatin core) which scaled up to 1.4GHz and handily outperformed the 2.4GHz P4's. The problem is that most of the early P4's were saddled with slow RAM when that architecture was optimized for high-bandwidth RAM. I'm not arguing the P4 was any good; it was a pig. Just that Intel saddled it with slow single data rate RAM in many systems due to the prohibitive cost of Rambus memory. It wasn't until 266MHz DDR that things looked a bit better.

Anyway, my poor old FX will continue in use until it won't run current games at a decent rate. Generally I just update the video card. Can't be bothered upgrading any more.
 
Maybe someone here could enlighten me on this. I have read on forums that whatever shortcomings the FX CPUs have in Windows are mitigated in Linux, and that the Linux kernel is much better at using multiple cores. Can anyone elaborate on this?
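For reference, the topology the scheduler works from can be dumped from sysfs. A minimal sketch (standard Linux paths, nothing FX specific assumed; on FX the two integer cores of a module typically show up as siblings):

# Dump the CPU topology the Linux kernel exposes through sysfs.
import glob
import os

def read(cpu_dir, name):
    with open(os.path.join(cpu_dir, "topology", name)) as f:
        return f.read().strip()

cpus = sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*"),
              key=lambda p: int(p.rsplit("cpu", 1)[1]))

for cpu in cpus:
    print(f"{os.path.basename(cpu)}: "
          f"package={read(cpu, 'physical_package_id')}, "
          f"core={read(cpu, 'core_id')}, "
          f"siblings={read(cpu, 'thread_siblings_list')}")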
 
Pretty sure the 1st gen Athlon was marginally ahead. Then Intel released the P4, and funnily enough a couple of revamped P3's (Tualatin core) which scaled up to 1.4GHz and handily outperformed the 2.4GHz P4's. The problem is that most of the early P4's were saddled with slow RAM when that architecture was optimized for high-bandwidth RAM. I'm not arguing the P4 was any good; it was a pig. Just that Intel saddled it with slow single data rate RAM in many systems due to the prohibitive cost of Rambus memory. It wasn't until 266MHz DDR that things looked a bit better.

Anyway, my poor old FX will continue in use until it won't run current games at a decent rate. Generally I just update the video card. Can't be bothered upgrading any more.

Athlons had higher clocks, but clock for clock they were slower than the P3 because the P3 had a much larger and faster cache. AMD was the first to reach the magical clock of 1GHz, on a Slot A CPU.
When the P3 passed 1.13GHz, Intel was focused on the P4 with overpriced Rambus memory, which wasn't really as fast as expected. Higher clocked P3s were overpriced in those days and hardly available. Intel was manufacturing them in limited quantities and trying to push the P4 some more. Everyone knew the P3 was faster, but since barely anyone could buy them, everyone moved to the P4.

The first comparisons of FX to the P4 were funny. It was scaling like an 8-threaded P4 ... high clock but poor performance. Pentium D wasn't any better, but Conroe changed a lot. It was the first CPU in a long time which gave a real performance boost and made it worth moving to a new platform. It was also reasonably priced compared to earlier generations.

Many people say AMD made a mistake buying ATI, but if they hadn't made that purchase there would be no AMD right now. For years only the ATI part of the company has been bringing in profits and actually counting on the market.
Right now AMD is trying to find its place. They are moving to SoCs, as Intel is not so strong there. They are also trying to push the IGP, as they probably have the best option on the market. The main issue is that the higher APUs are still really slow compared to their wattage. Mobile users can live with slower graphics but they want a faster CPU. AMD is offering them fast graphics they don't need and a slow CPU which uses more power than the competition. They just can't win this way.
Right now they delay every new series while Intel releases new products in small steps, but most potential customers have no idea about hardware, so Intel can advertise any new technology as a breakthrough and the product will sell itself. AMD has one big problem here: many customers don't trust them. Instead of earning that trust with good products released more often, they release something like FX every 3-4 years and then try to sell it when barely anyone wants it.


Yeah, my Formula-Z is tits. Much better than even their Sabertooth board, which does really well for most people. Never had an issue with RAID on AMD other than picking the proper driver for it, whereas Intel is pretty much plug and play with an MS OS. That may change with W10, but I doubt that lol.

Many AM3+ boards had issues with RAID on early BIOS releases, but ASUS didn't care to fix it for almost a year, even on the CHV. The CHV had no RAID ROM. It only had an option to enable RAID, but you couldn't adjust anything, so RAID without any cache options etc. In the end performance was 30-50% lower than expected. The first fix was made by one of the users, who added the MSI 990FX RAID ROM to the ASUS BIOS. ASUS made it work about half a year later, even though many users were complaining.
At the same time Gigabyte botched the RAID options on their 990FX UD series. I was writing guides on how to solve it, as even Gigabyte staff weren't sure and support always answered that everything was fine with their boards. Simply put, to enable any caching and get correct RAID0/10 you had to enable RAID 5 and change some more options; later you could enable caching in the OS. Without these changes RAID performance was about 50% of what it should have been, and regardless of the number of SSDs used the controller couldn't pass ~600MB/s. I got it to beat Intel controllers, running 3 SSDs up to 1250MB/s, while the Intels couldn't pass 1.05-1.1GB/s because of hardware limitations.
 
Well, either way I'm watching to see how they go. If Zen has a cost advantage I may get one
 
After reading more about the new Zen, I hope AMD tempts us with an FX release of the Excavator core for AM3+. IMO this would be an easy way of making some extra cash. When the Zen core gets released, it will require an entire upgrade. There is no mention of a DDR3 controller, so this would be > 1090FX chipset???? Using socket AM4????
The uarch for the new Zen will require a NEW: mobo > RAM > CPU cooler, and it looks like they will combine the AM3+ & FM2+ sockets. This is the same thing they did waaaaaaay back with the 939 & 754 sockets, when they combined them into the AM2 socket.

zen.jpg
 
I already sold most of my DDR3. I kept only kits good for benching. Next year, except for the cheaper chipset versions, all new boards will support DDR4. With the current price drops DDR3 is not much cheaper than DDR4. DDR4 has more advantages than the manufacturers are telling us. I mean, who cares if it's 1.2V or 1.35V when the wattage difference is like 0.5W. It only counts in laptops, where everyone is cheating anyway, lowering costs rather than improving performance. What counts is the higher internal bandwidth, which gives hope for higher performance without raising the number of channels. Fewer channels = lower cost = higher popularity = more cash for AMD ... if they do it right.
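Rough math on that 0.5W (my own estimate, assuming dynamic power scales roughly with voltage squared and a loaded DIMM sits around 2-3W):

# Rough per-DIMM power savings going from 1.35V DDR3L to 1.2V DDR4,
# assuming dynamic power scales roughly with voltage squared.
DDR3_V, DDR4_V = 1.35, 1.2
for dimm_watts in (2.0, 2.5, 3.0):  # assumed DDR3 DIMM power under load
    ddr4_watts = dimm_watts * (DDR4_V / DDR3_V) ** 2
    print(f"{dimm_watts:.1f} W DIMM -> ~{ddr4_watts:.1f} W, saves ~{dimm_watts - ddr4_watts:.2f} W")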

AMD just has to design a new socket because of the new architecture, but I hope it won't be LGA. LGA is a total fail ... it's much easier to fix bent CPU pins than anything that happens to the motherboard socket. I just hate what Intel did with these LGA sockets, telling everyone it's much better for power delivery etc. In reality it's much worse from the user's side, but for Intel it means lower RMA costs. There have been many more issues with bad contact on LGA sockets than anything that happened with standard CPU sockets for many years before. Everyone who has had burned CPU pads or socket pins knows what I mean. In most cases damage due to bad contact = lost warranty, even when it's not the user's fault. Somehow AMD/FX can still run fine at 300W+ while for Intel it's a problem even though their CPUs have a much lower TDP than their older series? I just see no other reason than money savings.
 
I already sold most of my DDR3. I kept only kits good for benching. Next year, except for the cheaper chipset versions, all new boards will support DDR4. With the current price drops DDR3 is not much cheaper than DDR4. DDR4 has more advantages than the manufacturers are telling us. I mean, who cares if it's 1.2V or 1.35V when the wattage difference is like 0.5W. It only counts in laptops, where everyone is cheating anyway, lowering costs rather than improving performance. What counts is the higher internal bandwidth, which gives hope for higher performance without raising the number of channels. Fewer channels = lower cost = higher popularity = more cash for AMD ... if they do it right.

AMD just has to design a new socket because of the new architecture, but I hope it won't be LGA. LGA is a total fail ... it's much easier to fix bent CPU pins than anything that happens to the motherboard socket. I just hate what Intel did with these LGA sockets, telling everyone it's much better for power delivery etc. In reality it's much worse from the user's side, but for Intel it means lower RMA costs. There have been many more issues with bad contact on LGA sockets than anything that happened with standard CPU sockets for many years before. Everyone who has had burned CPU pads or socket pins knows what I mean. In most cases damage due to bad contact = lost warranty, even when it's not the user's fault. Somehow AMD/FX can still run fine at 300W+ while for Intel it's a problem even though their CPUs have a much lower TDP than their older series? I just see no other reason than money savings.

I have to agree LGA is bad for the end user. Too fragile.
 
I hope they do release one more chip for AM3+. The slides AMD presented at the beginning showed a 10%-15% performance increase with each new release. I'm not sure if the currently released 83xx's are anything other than a few tweaks of the Vishera core. If we look on the REALLY BRIGHT SIDE, we could be getting a CPU with 30% better performance over the Vishera core and a 45% increase over Bulldozer.
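Rough back-of-the-envelope on where those figures land, assuming each architecture step is worth 10-15% and counting two steps past Vishera and three past the original Bulldozer (my assumptions, not from the slides):

# Compound the per-generation uplift quoted above (10-15% per step).
low, high = 1.10, 1.15

# Two assumed steps past Vishera (Steamroller, Excavator):
print(f"vs Vishera  : +{(low**2 - 1) * 100:.0f}% to +{(high**2 - 1) * 100:.0f}%")
# Three assumed steps past original Bulldozer (Piledriver, Steamroller, Excavator):
print(f"vs Bulldozer: +{(low**3 - 1) * 100:.0f}% to +{(high**3 - 1) * 100:.0f}%")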
 