
AMD goes 32nm SOI with Fusion GPU-CPU

I don't get you guys at all. The A64 X2 era lasted well over three years on the same architecture, with improvements every so often. Would you really compare the performance of the first 3800+ with a 6400+? Phenom was released in 2008, and 2011 is three years later. The basic architecture may remain the same, but I doubt the performance will any more than the X2's did in that time frame.

Adding the GPU while shrinking the die is enough of a major step without changing the core architecture as well. But once the GPU-CPU is coming off the line correctly, I would expect another major leap in core design - that would be consistent with AMD's practice of the last decade or so. What's the deal ...? :shrug:
 
Watts are watts: a computer drawing 200 W DC will be drawing about 240 W at the wall with a decently efficient PSU.
You're thinking of amps, which change with voltage (20 amps at 10 V is 200 watts; 2 amps at 100 V is 200 watts).
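As a quick sanity check, the arithmetic above can be sketched in a few lines. The 83% PSU efficiency figure here is an assumption picked to match the "200 W DC, about 240 W at the wall" claim, not a measured value:

```python
# Power (watts) is voltage times current, so the same 200 W
# can come from very different volt/amp pairs.
def power_w(volts: float, amps: float) -> float:
    return volts * amps

assert power_w(10, 20) == 200.0   # 20 A at 10 V
assert power_w(100, 2) == 200.0   # 2 A at 100 V

# Wall draw for a PC pulling 200 W DC through an ~83% efficient PSU
# (the efficiency number is an assumption for illustration).
dc_load_w = 200.0
psu_efficiency = 0.83
wall_draw_w = dc_load_w / psu_efficiency
print(round(wall_draw_w))  # prints 241, i.e. about 240 W at the wall
```

The point of the amps examples is that current scales inversely with voltage for a fixed power, which is why quoting "10 W at 12 V becoming 2 W at the wall" makes no sense: wall draw is always higher than DC draw, never lower.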

Thank you for clearing this up. I didn't know what he meant by a 10 W machine drawing 2 W, but I was pretty sure his rant was based on wrong numbers.

The less power these things draw, the less expensive they are to own in the long run, the less heat they output, and the longer the battery life, for laptops anyway. I'm hating the trend of using that headroom for performance, though, since laptops don't exactly last very long on a charge.
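To put a rough number on "less expensive in the long run", here is a back-of-the-envelope running-cost sketch. The usage hours, electricity rate, and both wattage figures are assumptions for illustration, not figures from this thread:

```python
# Rough annual electricity cost for a PC at a given wall draw.
# All inputs are assumed: 8 hours/day of use at $0.12 per kWh.
def annual_cost_usd(watts: float, hours_per_day: float = 8.0,
                    usd_per_kwh: float = 0.12) -> float:
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

big_rig = annual_cost_usd(240)    # a midtower drawing ~240 W at the wall
low_power = annual_cost_usd(35)   # a low-power integrated box
print(f"${big_rig:.2f} vs ${low_power:.2f} per year")
# prints roughly $84 vs $12 per year under these assumptions
```

Under these assumptions the gap is tens of dollars per year, which is real money over a machine's lifetime but hardly dramatic, which is roughly the argument both sides of this thread are making.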

I think they are beginning to see more sales of laptops, so they are going for that market, and since the money is in budget PCs, that is what they are building for: integrated solutions for value. It might save them money if they can provide a dual solution on one chip, since they already make and sell both separately. This also forces PC builders to buy both a CPU and a GPU (not NVIDIA) from AMD, meaning more money for them. What the consumer gets is probably less power draw, a smaller form factor, and a solution with fair (Phenom II X4?) performance that was meant to work together from the start.

I have no idea, after months of not following up, what Intel's equivalent of an IGP on the chip is going to be, but Intel might lose out a little since their IGPs are substandard compared to AMD's.

Eventually I think this is heading in the direction of using the GPU for personal computing, made more efficient by putting the two right next to each other. They should really look into the hybrid CrossFire thing; that will up the sales of ATI video cards so they get more market share. Cornering the market with CPU+IGP, optional discrete graphics, and improving their CPUs should prove quite interesting.
 
AMD Llano will house up to four processor cores based on the Phenom II architecture, paired with 4MB of L3 cache, while the integrated GPU will be based on Radeon HD 5000 technology sporting DirectX 11 support.



Did everybody move in with the Keebler elves and now live in a 2x3-foot room? And I didn't get the memo?

It's all well and good to have mini-ITX boards for people who think they're cool-- but when manufacturers actually start selling ULV/micro form factor desktop boards, i.e. Ion/Atom boards, and people actually shell out for PCs with performance equal to machines from 5+ years ago, if not older... something's wrong.

I'm going to make this its own post, but it fits here too, so apologies in advance for the double post.

We do not NEED this stuff.

Atom/Ion is the laptop platform we've been waiting for. It's finally here. Once the Atom gets a little faster, you've got the ideal machine. Plays movies properly, even in HD, gets the job done, low power, low heat. Bam.

Explain to me why you need an 8" cubed desktop? Please? Anybody? Give me a logical reason why you need a desktop that draws like 10W DC at 12V which works out to like freakin two watts out of the wall?

Do you know that an average PC drawing 200W in windows at 12V is drawing about the same out of the wall as a fairly dim lightbulb? Is that any reason to switch to a piddling slow machine? No. Just turn one light off in your house. Keep the good computer.

Do you live inside a hollow tree? Can you extend your arms without bumping into something while at home? Then you don't need an 8" cubed tower. I don't care who you are or where you live. You can fit a midtower, and you can afford the tremendous thirtyish watts it draws out of the wall.

Here we are, again, with this CPU/GPU in one package. Great for laptops, again-- especially at 32nm.

But we all know it's going to live in a lot of desktops too- and that's just ridiculous.

All of these absurd industrial designs with touchscreens and laptop components-- what's wrong with people? Can't fit a proper midtower in your kitchen but want a computer there? Get a laptop- then you can move it later.

I just don't get it. People think it's Star Trek or something all of a sudden.

Just because you don't have a use for something doesn't mean nobody else does. This is how technology progresses. People said computers were useless, the internet was useless, and so on. Nothing ever changes if nobody tries to do anything different.
 
I don't get you guys at all. The A64 X2 era lasted well over three years on the same architecture, with improvements every so often. Would you really compare the performance of the first 3800+ with a 6400+? Phenom was released in 2008, and 2011 is three years later. The basic architecture may remain the same, but I doubt the performance will any more than the X2's did in that time frame.

Adding the GPU while shrinking the die is enough of a major step without changing the core architecture as well. But once the GPU-CPU is coming off the line correctly, I would expect another major leap in core design - that would be consistent with AMD's practice of the last decade or so. What's the deal ...? :shrug:

The same architecture wasn't used for three years when the X2s came along.
There were Brisbanes and Windsors, remember? The Windsors seemed to get more done per clock but with a larger die, and the Brisbanes had a faulty temp sensor that was doubly frustrating.
 
This is just going to be tied to cloud computing anyway, and we are all going to buy our PCs one Gigabyte at a time. Efficiency over performance.
 
No, there's a difference there, and that's what I was getting at with the "K8 is K8 is K8" thing.
They aren't the same. They are very much different.
 
32nm dual-core Fusion showcased today ...

:)

http://www.zdnet.com/blog/computers/computex-2010-amd-demonstrates-fusion-apus/2701?tag=nl.e539
AMD showed two demonstrations of a “low-power Fusion APU,” which presumably refers to a dual-core Ontario. The first showed its ability to play a demanding DirectX 11 game, Rebellion’s Aliens vs. Predator. Bergman noted that this was the same game AMD used last fall to demonstrate the capabilities of its high-end discrete GPU. “Can you imagine getting performance of that quality in a netbook this size?” he asked the audience while holding up a standard netbook that could accommodate an Ontario APU.

The second demonstration showed the performance in Internet Explorer with and without APU acceleration using a Browser Flip test in the latest IE 9 Platform Preview. The performance increased from 2-3 frames per second without acceleration to about 60 fps using the APU.
 