
FRONTPAGE AMD A8-3850 APU Review - Llano for Desktop

Question: Why does CPU-Z read the core speed as 3693 MHz while Core Temp 0.99.8 reads 1704.3 MHz? Should the speed be the same in both programs? :shrug:

Turns out the 25x multi was just wonky and I needed to use the 26x multi to take it as far as it would go. I'll post up the LinX screenshot and one benchmark...the rest I'm still running, and I'll publish a Llano/Lynx overclocking article when I'm done.

[Attachments 96947 and 96948: LinX screenshot and benchmark result]

Not too shabby clocks for a $135 CPU on air. The 142 MHz bus works out to 852 MHz on the IGP. I could push it to a 144 MHz bus, but the CPU wouldn't go that far without dropping to the bad mojo multi. I'd rather sacrifice a few GPU MHz and keep the CPU clock the 142 MHz bus gives me. :D
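For anyone following along, the clock math is just reference clock times multiplier. A minimal sketch in Python, assuming the 142 MHz bus and 26x CPU multi from above plus the A8-3850's 6x IGP ratio (600 MHz at the stock 100 MHz bus):

[code]
# Effective clock = reference clock (bus) x multiplier.
# Assumed values: 142 MHz bus and 26x CPU multi from the post above,
# plus the A8-3850's 6x IGP ratio.

def effective_clock(bus_mhz, multiplier):
    return bus_mhz * multiplier

bus = 142.0
print(f"CPU: {effective_clock(bus, 26):.0f} MHz")  # ~3692 MHz
print(f"IGP: {effective_clock(bus, 6):.0f} MHz")   # 852 MHz
[/code]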


:comp:
 
Core Temp's detection is incorrect, simple as that. I've been working with its author, and he has a properly functioning version (I've used it, but it's a debug build and not very screenshot friendly, plus I doubt he'd want a screenshot of it out there), but he hasn't released it yet.
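For what it's worth, both readings from the question above divide cleanly by the ~142 MHz bus, so the gap looks like the wrong multiplier being applied rather than the chip actually running slower. A rough sketch (treating the multiplier misread as my own assumption):

[code]
# Back out the multiplier each tool's reading implies, assuming both
# are multiplying the same ~142 MHz reference clock.
bus = 142.0  # MHz, from the overclock posted earlier

for name, reading_mhz in [("CPU-Z", 3693.0), ("Core Temp 0.99.8", 1704.3)]:
    print(f"{name}: {reading_mhz} MHz -> ~{reading_mhz / bus:.1f}x implied multiplier")

# CPU-Z implies ~26x, matching the multi actually set; Core Temp implies
# ~12x, consistent with a detection bug rather than a real clock drop.
[/code]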
 
Quoting a post from Dolk in the Bulldozer rumor thread because of its pertinence to this APU. Well put as always Dolk!


You can read about the full details of the Microarchitecture here: http://realworldtech.com/page.cfm?ArticleID=RWT062711124854

Basically what happened is this:

The CPU got a huge die shrink and tossed out the L3 cache, but increased the L2 to 1MB. A couple more bits were added here and there to increase the speed and throughput of the CPU. After that, the rest of the space was used to house a Cayman GPU with two bus lanes: Onion and Garlic. Onion is attached to the IFQ, and Garlic is attached to the IMC. The GPU is made to be programmable and talkative to the CPU. If the GPU can find data in the CPU's memory banks before it is flushed, it will retrieve it and use it. Most of the time this information is stored in DDR3, but the GPU will always probe the L1 and L2 sections first.

Each core got an NMOSFET upgrade so that it can be turned off to save power. Three PCIe 2.0 lanes were put on so that communication with other PCIe devices can happen without the use of a northbridge.

And that's about it. Llano is not a huge leap forward, but it's a step in the right direction and it works fantastically. There are still a lot of improvements that can be made with the GPU and CPU talking to each other; we should see those with Trinity. Trinity will be the Bulldozer-based APU that houses the new 7-series AMD GPUs.

...and thus the reason for requiring tight-latency memory. You don't want to have high latency when running a GPU off your system memory unless the MHz makes up for it, like DDR3-2400 or something. Basically, it's not necessarily wise to run CL9 DDR3-1866 over CL7 DDR3-1600. This review split the difference, running DDR3-1866 at CL8, which is reasonably affordable and available. :)
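To put rough numbers on that tradeoff: first-word CAS latency in nanoseconds is the CL cycle count divided by the memory clock, which is half the DDR data rate. A quick sketch using the kits mentioned above:

[code]
# First-word CAS latency in ns: CL cycles / memory clock, where the
# memory clock is half the DDR3 data rate (two transfers per clock).

def cas_latency_ns(data_rate_mts, cl):
    memory_clock_mhz = data_rate_mts / 2
    return cl / memory_clock_mhz * 1000  # cycles -> nanoseconds

for name, rate, cl in [("DDR3-1600 CL7", 1600, 7),
                       ("DDR3-1866 CL9", 1866, 9),
                       ("DDR3-1866 CL8", 1866, 8)]:  # the review's compromise
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")

# DDR3-1600 CL7: 8.75 ns
# DDR3-1866 CL9: 9.65 ns  (more bandwidth, but higher absolute latency)
# DDR3-1866 CL8: 8.57 ns  (more bandwidth without giving up latency)
[/code]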
 
I had a question about the Gigabyte motherboard in the review. I already bought one and am very excited about the build. However, the box has me a bit confused where it says dual graphics. I purchased a 6670 to go with it, but I was wondering whether the mobo has a second GPU onboard the way some of the mobile platforms do?
 
No, it only uses the APU's graphics; there is no additional GPU onboard. Your 6670 was the right way to go. :thup:
 
Good to know. I'll get to check these out in Dallas this weekend, but my Fusion box won't be up and running for several months; I'm still building the case.
 
Can Llano display on the DVI and VGA ports simultaneously?

I was reading the specs for an MSI board when I came across this little snippet:
* HDMI and DVI-D can't output simultaneously.

While my A8-3850 build would be plugged into a TV for movies, I still want to have a second display for desktop use.

Otherwise I'll need to buy a discrete card.
 
Looking around, I found this on a Gigabyte spec list:
Note: When Dual-link DVI enabled, the remaining display output ports will be disabled.
It seems like when you want to use higher resolutions, it disables the other ports. Unfortunately I can't test whether the DVI and VGA outputs work simultaneously because the APU is on its way to another person to review another motherboard.
 
"My only gripe, and it’s minor, is video connectivity. I’d like to see HDMI and/or DisplayPort outputs to go along with the VGA and dual link DVI outputs."

WTF? It has HDMI and DisplayPort.
 
Whoa...that was supposed to be deleted from the review. I specifically remember thinking about that. Good catch; it will be pulled in a couple of minutes. Sorry!

EDIT - Fixed, with an author's note where that once appeared. Thank you!
 

No problem. You had me worried when I read that; I just bought the same board for an HTPC, but I haven't installed it yet. I grabbed the box and started reading, thinking maybe I'd stuffed up and bought a board without HDMI! :eek:
 
It's a great HTPC solution, you should definitely enjoy it. What case are you using?
 
Origen S14V. Expensive, but it has to look nice in the lounge. I'm still waiting for delivery, but from what I hear they're very well made.
 