
AMD and a lost chance?


Evilsizer

Senior Forum Spammer
Joined: Jun 6, 2002
I feel AMD lost a bit in the DIY market with the Ryzen 3 2200GE and the Ryzen 5 2400GE. With prices in line, say $130 to $150 for the 2200GE and around $200 for the 2400GE, both are CPUs I would go after for a low-power/low-TDP build in a stock computer: low power usage, low noise, and a strong on-die GPU versus Intel's Iris 650, as seen below.
Intel Iris 650 vs Vega 8
Intel Iris 650 vs Vega 11

I really wish I knew why they were not released at retail. I did find OEM CPUs on eBay in the UK: the 2200GE is $160-ish, not far off from what I would expect to pay for a 35 W quad core with Vega 8. The 2400GE they have listed as well is a bit more than I would think it is worth at retail at $230-ish, another 35 W quad core with a higher clock speed and Vega 11. I will say I'm not shocked at the performance difference of AMD's Vega versus Intel's Iris 650. The only thing I can't figure out is why, when it comes to 4K disc playback on your PC, only the Intel is able to play it back.
What are the minimum system requirements for Ultra HD Blu-ray movie playback?

I also came across this; I find it interesting that they are showing more Vegas in the lineup, and I wonder if those are on-die or standalone.
https://www.notebookcheck.net/Vega-...cs-650-vs-Vega-8_8470_7655_8144.247598.0.html
I am thinking Vega 3/6/10 are on-die rather than standalone, but at this point who knows. NV low-end cards now cost $100, which wasn't the case not that long ago; the GT 1030 at $80-ish to $100+ isn't worth it versus going with a GTX 1050. If AMD had some kind of low-end card now, even based off Vega 3/6/8/10/11, they would have a pretty good winner on their hands, though for some it would make no sense: if they already had an AMD APU, the onboard graphics would be just as fast or not much slower, depending on the standalone card's specs.
 
65 W vs 35 W: I want a 35 W CPU even if it is slightly slower in clock speed. I must have left something out earlier; that was what I was getting at about a lost chance with the 35 W 2200GE and 2400GE.
 
They are listed as 35/65 W TDP, but in reality it's the same chip, maybe with a different voltage range; if you get the "standard" series you can set it up the same way. I had my 2200G in a really small box, undervolted and overclocked.

I find these APUs to be slow in any games. I won't even mention Intel. If anyone is thinking about gaming, even at low details, then an APU is just a bad idea. We can compare it to the GT 1030, but that's not hard competition; it's also really bad for games. I'd like to see something at least as good as a GTX 1050 Ti. Maybe in the next gen.
 
The only thing I can't figure out is why, when it comes to 4K disc playback on your PC, only the Intel is able to play it back.

The link you provided mentions the need for Intel SGX, presumably as part of the copy protection/DRM code. There may or may not be an AMD equivalent to that, but it is up to the software creators to implement it if so. Is there alternative software from another source that may not be affected similarly?
 
They are listed as 35/65 W TDP, but in reality it's the same chip, maybe with a different voltage range; if you get the "standard" series you can set it up the same way. I had my 2200G in a really small box, undervolted and overclocked.

I find these APUs to be slow in any games. I won't even mention Intel. If anyone is thinking about gaming, even at low details, then an APU is just a bad idea. We can compare it to the GT 1030, but that's not hard competition; it's also really bad for games. I'd like to see something at least as good as a GTX 1050 Ti. Maybe in the next gen.

A 4.0 GHz 4C/8T APU with GTX 1060 6GB-quality onboard graphics for $350 would answer a TON of entry-level needs. Price-wise it seems reasonable, especially given the next generation of GPUs coming soon and CPUs in another year. Doubtful to happen, however, until it becomes irrelevant, because then AMD and Nvidia would lose money on discrete GPU options in the entry-level market, and AMD and Intel would lose money on selling slightly higher-priced CPUs without onboard video. $40 extra going that route doesn't sound like much until you multiply it by tens of thousands a year.
 
They are listed as 35/65 W TDP, but in reality it's the same chip, maybe with a different voltage range; if you get the "standard" series you can set it up the same way. I had my 2200G in a really small box, undervolted and overclocked.

I find these APUs to be slow in any games. I won't even mention Intel. If anyone is thinking about gaming, even at low details, then an APU is just a bad idea. We can compare it to the GT 1030, but that's not hard competition; it's also really bad for games. I'd like to see something at least as good as a GTX 1050 Ti. Maybe in the next gen.
Maybe things have changed with video decoding versus back in the day when it was NV PureVideo vs AMD's VP engine? With current hardware video decoding for BD or 4K BD, etc., there is little to no difference between AMD, Intel, and NV on the lower end or even the higher end. I recall days where, if you wanted the better video decoding engine, you had to get the higher-end video cards or wait another year for the midrange to get that engine. Anyway, I was mostly thinking of such an application for video playback more than gaming.

Though I'm still at a loss as to how you can make a 65 W part turn into a 35 W one. To me, anyway, it doesn't seem to add up that just dropping the voltage and CPU speed can get a 65 W part down to 35 W.

The link you provided mentions the need for Intel SGX, presumably as part of the copy protection/DRM code. There may or may not be an AMD equivalent to that, but it is up to the software creators to implement it if so. Is there alternative software from another source that may not be affected similarly?
More or less what I was alluding to: why doesn't AMD have support for 4K playback in Windows 10?

A 4.0 GHz 4C/8T APU with GTX 1060 6GB-quality onboard graphics for $350 would answer a TON of entry-level needs. Price-wise it seems reasonable, especially given the next generation of GPUs coming soon and CPUs in another year. Doubtful to happen, however, until it becomes irrelevant, because then AMD and Nvidia would lose money on discrete GPU options in the entry-level market, and AMD and Intel would lose money on selling slightly higher-priced CPUs without onboard video. $40 extra going that route doesn't sound like much until you multiply it by tens of thousands a year.
This will never happen; the amount of power needed by a 1060 6GB is way more than a 4C/8T AMD CPU. Not to mention there are a lot of other things in the way, like needing GDDR5 and a 1060 core plus a CPU core on one die. The failure rate alone in manufacturing would make this the most costly CPU ever made, if they did it.
 
Though I'm still at a loss as to how you can make a 65 W part turn into a 35 W one. To me, anyway, it doesn't seem to add up that just dropping the voltage and CPU speed can get a 65 W part down to 35 W.

TDP is thermal design power. It doesn't mean the same thing as CPU wattage. In the same way, Intel marks whole product lines with the same TDP: mobile i3/i5/i7 can have the same TDP, and desktop i3 and i5 can have the same TDP. How, when some have 2 cores, some have 4 cores, and the default voltage is the same? Simply put, all of them can use the same cooling solution and are designed to hit the same maximum temperature per core under load.
Ryzen was designed as a 30-something-watt chip. At lower voltage it will draw ~30 W; each small voltage increase causes much higher wattage/heat. It's possible to lower the TDP to 35 W when the CPU runs at a lower voltage or its voltage range is lower. The Ryzen 2200G/2400G goes up to 1.45 V; I bet the GE has a lower maximum voltage, but I can't find info on how high it goes.

The 2200G during full load on CPU+GPU draws ~120 W, measured when I was preparing reviews. I could still use a 95 W TDP cooler and temps were quite low.
 
Yeah Woomack, I'm not looking at power usage, but figured that was a side benefit of using a lower-TDP CPU. I was just thinking that for what I wanted to do there were other options out there; lower TDP means a smaller heatsink, and even getting away with less radiator for a water-cooling setup.

I'm comparing what I want to something I kind of have, but it's outdated: the second rig in my sig, using a 35 W mobile Intel CPU, came with a heatsink that looks like something you would use on a chipset. That is kind of my goal now for building PCs: less gaming, smaller, and less heat.
 
Since Ryzen was meant to be a low-power CPU, it has great efficiency at a lower voltage. I just don't know how low the voltage of the GE chips is, as I doubt there are any significant changes compared to the standard G series.
One thing that is misleading is the TDP, as you could think that a ~100 W PSU is enough. I was testing my 2200G on an 85 W external PSU. During tests it was randomly shutting down, so I measured the real wattage, and when both CPU and GPU were loaded I could see over 120 W. I could lower the voltage and make it run at up to 90 W under full load without any stability issues, with all cores at 3.7 GHz.
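As a rough back-of-envelope check of those numbers (assuming the readings were taken at the wall and the brick is about 88% efficient; neither of those is stated above):

```python
# Rough check of why an 85 W brick shuts down under a ~120 W reading.
# Assumptions (not from the post): the wattage was read at the wall,
# and the external brick is roughly 88% efficient.
BRICK_RATING_W = 85        # DC output rating of the external PSU
BRICK_EFFICIENCY = 0.88    # assumed AC-to-DC conversion efficiency

def dc_load(wall_watts: float, efficiency: float = BRICK_EFFICIENCY) -> float:
    """Estimate the DC-side load implied by a wall-power reading."""
    return wall_watts * efficiency

for wall in (120, 90):     # stock vs. undervolted readings from the post
    dc = dc_load(wall)
    verdict = "over" if dc > BRICK_RATING_W else "within"
    print(f"{wall} W at the wall -> ~{dc:.0f} W DC, {verdict} the {BRICK_RATING_W} W rating")
```

Either way, the headline TDP clearly isn't the number to size the PSU by.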
 
I don't know about the AMD ones, but on the Intel side, the lower-power models generally have lower clocks, and implicitly lower voltages, than the standard parts. Between them it can add up to a good reduction.

I did try a low-power build before: an i3-4150T (35 W TDP) and a 750 Ti (60 W TDP), and under gaming loads the peak system power was 100 W. Just as well, as I was using a PicoPSU that was rated not much above that on the 12 V rail (with the remaining capacity going to the 3.3 V and 5 V rails). If I wanted to, I could easily limit the power on the GPU side. It could probably be done on the CPU too, but I'm less sure how, other than manual underclocking.
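For reference, a rough 12 V budget for that build might look something like the sketch below; the rail rating and the board/drive overhead are assumed round numbers, not figures from the build, and everything is treated as 12 V load for simplicity.

```python
# Rough 12 V budget for the build above (i3-4150T + GTX 750 Ti on a PicoPSU).
# The rail rating and board/RAM/SSD overhead are assumed round numbers.
RAIL_12V_RATING_W = 110    # assumed usable 12 V capacity of the PicoPSU

budget_w = {
    "i3-4150T (TDP)": 35,
    "GTX 750 Ti (TDP)": 60,
    "board/RAM/SSD/fan (assumed)": 15,
}

total = sum(budget_w.values())
for part, watts in budget_w.items():
    print(f"{part:30s} {watts:4d} W")
print(f"{'worst-case total':30s} {total:4d} W of {RAIL_12V_RATING_W} W available")
```

The measured 100 W peak comes in under that worst-case sum, since the CPU and GPU rarely both sit at their TDP at the same instant.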

I do wonder, if I were to revisit this today, whether an AMD APU would match or exceed this by itself.
 
Since Ryzen was meant to be a low-power CPU, it has great efficiency at a lower voltage. I just don't know how low the voltage of the GE chips is, as I doubt there are any significant changes compared to the standard G series.
One thing that is misleading is the TDP, as you could think that a ~100 W PSU is enough. I was testing my 2200G on an 85 W external PSU. During tests it was randomly shutting down, so I measured the real wattage, and when both CPU and GPU were loaded I could see over 120 W. I could lower the voltage and make it run at up to 90 W under full load without any stability issues, with all cores at 3.7 GHz.

I don't know about the AMD ones, but on the Intel side, the lower-power models generally have lower clocks, and implicitly lower voltages, than the standard parts. Between them it can add up to a good reduction.

I did try a low-power build before: an i3-4150T (35 W TDP) and a 750 Ti (60 W TDP), and under gaming loads the peak system power was 100 W. Just as well, as I was using a PicoPSU that was rated not much above that on the 12 V rail (with the remaining capacity going to the 3.3 V and 5 V rails). If I wanted to, I could easily limit the power on the GPU side. It could probably be done on the CPU too, but I'm less sure how, other than manual underclocking.

I do wonder, if I were to revisit this today, whether an AMD APU would match or exceed this by itself.


Yeah, I wouldn't think that about the PSU per se, but to me TDP has more to do with the heat output of the CPU itself. That's why I went with an i3-370M in the Jetway NF98 setup: a single SSD plus BD drive and one fan ran fine on a PicoPSU (the highest-wattage one they had). Now maybe AMD is different, but I'm still not sold on it; it has been a long while since I really looked at the differences in CPUs, and not just AMD vs Intel, mind you. Years ago, Intel used different transistors for lower-TDP CPUs than whatever they were using for higher-clocked desktop parts and servers with more cores, which to me means more than just a voltage and CPU speed decrease; that alone doesn't add up to about a 54% decrease in TDP. Now maybe new manufacturing techniques/materials let them use the same stuff, and better binning in some instances can yield lower-TDP or lower-voltage CPUs, but I still feel there is more to it.

I'm just more disappointed that AMD will only sell these to OEMs.
 
Years ago, Intel used different transistors for lower-TDP CPUs than whatever they were using for higher-clocked desktop parts and servers with more cores, which to me means more than just a voltage and CPU speed decrease; that alone doesn't add up to about a 54% decrease in TDP.

On the first part, I must have missed that. Anything more specific so I can go have a look for it?

I wouldn't underestimate the effect of just clock and voltage adjustment. Maybe this is something to try practically rather than on a theoretical level. For clocked circuits, power can be proportional to the clock speed. Actually, that leads to another question I never got answered: for a resistive load, power is proportional to the voltage squared. Does this also hold for switched semiconductors? I'm not so sure, as they have a non-linear voltage-current characteristic, but is the eventual dissipation resistive anyway? Wish I had paid more attention at university now. Either way, working lower down the curve leads to some potentially big savings in power, if that's what you're optimising for.
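For what it's worth, the usual first-order answer is yes-ish: switching logic is commonly modelled as dynamic power ≈ activity × capacitance × V² × f, plus a leakage term that tracks voltage and temperature rather than clock. A minimal sketch of that scaling, with made-up constants and operating points chosen purely to show the ratios:

```python
# First-order CMOS switching power: P_dyn ~ a * C * V^2 * f (plus leakage).
# The constant below lumps activity factor and switched capacitance together
# and is a placeholder chosen only to illustrate the scaling, not a real value.
def dynamic_power(v_core: float, freq_ghz: float, a_c: float = 10.0) -> float:
    """Relative dynamic power for a given core voltage and clock."""
    return a_c * v_core ** 2 * freq_ghz

base = dynamic_power(1.35, 3.7)    # hypothetical "65 W class" operating point
low = dynamic_power(1.10, 3.2)     # hypothetical "35 W class" operating point
print(f"baseline ~{base:.0f}, lowered ~{low:.0f}, ratio {low / base:.2f}x")
# ~0.57x from voltage and clock alone - most of the way from a 65 W to a 35 W rating.
```

Leakage and fixed platform power mean real chips don't fall quite as fast as the formula suggests, but voltage is clearly the big lever.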
 
Here is an overclocking table from an 8700K.

What we can take away from this is that wattage doesn't change much with clocks (in this case less than 10 W for 600 MHz), but with voltage, ~16 W for 0.05 V is easy to see. I'd imagine 0.1 V and another 100 MHz of difference could be 25 W+, depending. That is nearly the difference between a 35 W CPU and a 65 W CPU. Add binning into it and we can see how things fit together pretty easily.

That said, Intel defines TDP as power consumed at base frequency... which these chips never sit at in the first place.
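A quick sanity check of that ~16 W figure against the same first-order V²·f relation, using assumed round numbers for the baseline voltage and package power (not values read from the spreadsheet):

```python
# Sanity check of the "~16 W for 0.05 V" observation using P ~ V^2 * f.
# Baseline voltage and package power are assumed round numbers.
v0, v1 = 1.25, 1.30        # a 0.05 V bump at a fixed clock
p0 = 180.0                 # assumed package power at v0

p1 = p0 * (v1 / v0) ** 2   # frequency held constant, so only V^2 scales
print(f"estimated increase: {p1 - p0:.1f} W for a 0.05 V bump")
# ~15 W, in the same ballpark as the ~16 W seen in the table.
```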

[Attached image: 51ghz spreadsheet.jpg]


EDIT: Yes, I know this is an AMD thread, but I'm trying to show the relationship between voltage and clocks here. AMD may of course vary, but I think the point has been made.
 
ED,
Are we looking at Intel TDP being defined differently than AMD TDP, or is it the same for both? If it is just CPU thermals, then how can you figure that out based on speed and voltage reduction without some kind of thermal probe, much less a way to look at the voltage/current going to the CPU? Your post is still insightful, but it leads me to this: without a 2200GE or 2400GE to compare to a 2200G and 2400G, you'd have to set a fixed clock and the same voltage and measure with a Kill-A-Watt meter to see if it's just the CPU speed/voltage. One question I'm wondering about based on the table: there is no defined CPU LLC, so how much of a difference would that make? The table shows that power usage doesn't increase much in some spots: going from 4.3 GHz to 4.4 GHz shows a 3 W increase, then 1 W more to 4.5 GHz, then 145 W at both 4.6 GHz and 4.7 GHz. It seems a 100 MHz increase or decrease is hard to measure, if at all, at some speeds at the same voltage. You are right, though; maybe the combination of die binning with a lower CPU speed as well as CPU voltage could make up a 54% decrease in TDP. It's just hard for me to think that's enough for that kind of drop.

mackerel,
Yeah, a long time ago I remember reading, or someone posting, that Intel had low-power transistors they used for laptop and low-power parts. Possibly not true after all, then?
 
This thread illustrates a great point about TDP. I think when we talk TDP, some people don't realize it's directly dependent on two factors: clock speed and voltage. These are the two aspects overclockers can use to manage TDP and thermal output. For my small HTPC build, Woomack's idea comes to mind: undervolt and overclock. It's an interesting paradox, to say the least.
 
ED,
Are we looking at Intel TDP being defined differently than AMD TDP, or is it the same for both? If it is just CPU thermals, then how can you figure that out based on speed and voltage reduction without some kind of thermal probe, much less a way to look at the voltage/current going to the CPU? Your post is still insightful, but it leads me to this: without a 2200GE or 2400GE to compare to a 2200G and 2400G, you'd have to set a fixed clock and the same voltage and measure with a Kill-A-Watt meter to see if it's just the CPU speed/voltage. One question I'm wondering about based on the table: there is no defined CPU LLC, so how much of a difference would that make? The table shows that power usage doesn't increase much in some spots: going from 4.3 GHz to 4.4 GHz shows a 3 W increase, then 1 W more to 4.5 GHz, then 145 W at both 4.6 GHz and 4.7 GHz. It seems a 100 MHz increase or decrease is hard to measure, if at all, at some speeds at the same voltage. You are right, though; maybe the combination of die binning with a lower CPU speed as well as CPU voltage could make up a 54% decrease in TDP. It's just hard for me to think that's enough for that kind of drop.

mackerel,
Yeah, a long time ago I remember reading, or someone posting, that Intel had low-power transistors they used for laptop and low-power parts. Possibly not true after all, then?
This board was left on auto. LLC is simply an option that will mitigate vDroop and doesn't really play a role in the END load voltage (though it can affect idle voltage... which we aren't really testing here). For example, if I set 1.2 V and under load get 1.15 V, it doesn't matter if I leave LLC off and reach 1.15 V with droop, or enable it and reach 1.15 V. It's still 1.15 V no matter how you get there (offset/LLC/vcore itself).

We also have to consider that the power readings are from a Kill-A-Watt. Not exactly the most accurate, but good enough for (most of) our purposes. We can take away that there isn't much difference from clocks alone, looking at the big picture.
 