
A thought

@ RGone, sadly yes, we are a dying breed, and perhaps AMD are thinking more about servers than they are about us. AMD's acquisition of SeaMicro for $334 million is a pretty good clue to their direction.

As long as the chips they want to sell me do what I want them to do, I don't actually care about AMD's true calling in this age of cloud computing, nor do I care if they are 20% per core behind Intel on the desktop.

That 20% is surplus to requirements, and I have more cores to play with.

Last night I was having one of those BF3 rounds where everything was going my way: no hackers, and everyone working as a team. I was deep into a round on Wake Island, thoroughly enjoying myself, easily pushing 70 FPS on Ultra (minus MSAA) all round long, when I was rudely interrupted by a *bing* noise. I hit Escape and Alt+Return and was reminded that I was re-encoding Iron Sky from Blu-ray to DVD in AVS VE so my mother could watch it. The bing was it telling me it had finished, and it was the only noticeable sign of that workload running in the background.

AMD underperforming? I don't think so. :)

I don't care about SuperPi; I don't care about Intel's i3 vs. some AMD chip in some cherry-picked game running at mobile-phone resolution... I don't care about any of that stuff anymore.

All I care about is: what will the chip do for me?

Bulldozer is too warm and too power hungry, but that's really all that's wrong with it.

If AMD can fix that at the right price, they have sold me an 8-core chip.

And I think more and more people are actually sick and tired of synthetic benchmarks, which don't really tell them anything, which is why Bulldozer has not burnt AMD.
:thup:

glass_half_full.jpg
 
That is the best glass half full reference ever!

And Frakk, "AMD underperforming? I don't think so"... BINGO!
That post was so spot on that it's crazy. I run my games at 1080p without issues while running Vent, a Vent server, six Firefox tabs, EVGA Precision X, HWMonitor, and sometimes Outlook, all at once.

Shoot, I was running The Old Republic on one monitor and Force Unleashed on the other, with all that in the background. Not a stutter.
 
Okay man, pretty well said. However, that would place you in a not overly large group of educated consumers. The problem is that your education is now hands-on, and that is not the situation for most buyers, as review sites continue to review in the manner that has been around for years: synthetic benches.

Now I don't really know if BD burnt AMD or not. I know it pretty much fried my need to have a BD and put me looking at the next round of BD, called PD. Too much heat and not a lot of performance in single-threaded benches is what turned me off BD.

I was right here when the first BDs came in. HOT is all I can say. That, and about 400 MHz lower in general overclockability than the review sites, which were all furnished the CHV to test with, seemed able to manage with that top-of-the-line motherboard.

I was, by then, tight on money and did not need what BD was selling. If I find I need a new rig after PD arrives, my mind will be made up right here in the forums. Review sites will have only the most minor of effects; the struggle or glory of real users will influence my buying decision. Now I know that is strange, but that is how it has been for about four years now. RGone...ster.
 
One cannot change how reviewers review products; it's just easier to use the usual canned stuff...

Does that hurt AMD? Possibly, I don't know. I don't want to pile on predictions here, but I think PD will run cooler and use less power; there may even be a 10% clock-for-clock performance improvement. Tom's seem to think it's 15%, which is great.

If that is so, we will see then how many take notice of those reviews. If I may make another prediction: I think it will sell significantly better than BD.

AMD will never catch up with Intel in this sector, certainly not in terms of revenue, but they don't need to. Their revenue is a fraction of Intel's, yes, but their outgoings are also just a fraction.

As long as they can go on making a profit, they will always be here giving us what we want.
 
It's not that it is easier. The "synthetic" benchmarks give numbers that allow sites and users to compare.

If I say: "The bulldozer completes this winrar benchmark 25% faster than a 2500k" you have a comparison number.
If I say "The bulldozer is definitely faster at winrar" you do not have any basis for comparison.
Worse, if I don't use synthetics, you're stuck with "BD feels pretty quick in general, some stuff is definitely slower while a few things are faster".
That's useless. It's like PSU reviews that use a computer system for a load and OCCT for "ripple" testing. Completely and totally useless.
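
To put a number on that, here is a tiny C sketch of the arithmetic; the timings in it are invented purely for illustration, not measured results for any CPU.

#include <stdio.h>

int main(void)
{
    /* Hypothetical, illustrative timings for the same fixed WinRAR job. */
    double secs_2500k     = 100.0;
    double secs_bulldozer =  80.0;

    /* "25% faster" read as throughput: (100 / 80) - 1 = 0.25, i.e. 25%. */
    double percent_faster = (secs_2500k / secs_bulldozer - 1.0) * 100.0;
    printf("Bulldozer is %.0f%% faster on this job\n", percent_faster);
    return 0;
}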
 
A bit more WinRAR, a bit less SuperPi and 1024 x 800 gaming. :)
 
Seriously though, how often do you rar/unrar things large enough that 20% is a meaningful difference? You're talking >1GB at that point.
The only stuff I've ever seen that is that large and RAR'd are things downloaded in a great many smaller parts from dubious websites...

Most of the gaming reviews I've seen have been either 1680x1050 (my native res) or 1920x1080 (or those and more); those are the resolutions that make sense to test, as well.
 
This is true, but I still see plenty of the 720- and 800-pixel-height stuff too.
 
Where there is one there are others; there is a reason for that WinRAR result.

What about Photoshop? That's something I use. Handbrake, iTunes, etc.: they're free, they're good... I might use them.

These are apps people use. :)

And look around: most people have a GTX 550 Ti / 560 Ti / 570 or a 6870 / 6950 / 7850 / 7870 driving one 1080p monitor and the latest games, and those people want to know what the CPU does for them.

Here's an idea.

Why not divide the review into "everyday real life" and those with money to burn on SLI GTX 690s and five screens?

And for the benching crowd you can have a separate section too, with SuperPi and all that.

Be more targeted.
 
The benchmarking suites use the same functions, though. For encoding, the suite takes a sample file and encodes it, or takes a sample file and decompresses it. How is that actually different from doing it in the application? The test is the same every time that way, even if it isn't using, for example, WinZip to do it. This goes back to the testing Bubba is doing, though. Perhaps WinZip runs better/worse on AMD/Intel than PCMark 7 does in how it renders/encodes/decompresses...

Using the actual application but a different "sample" file in every review would also render comparisons across reviews impossible.
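
For what it's worth, the "fixed sample" idea is simple to sketch: same input, same work, one elapsed-time number out. Here is a minimal, hedged C example; the summing loop is only a stand-in for whatever a real suite would compress or encode, not any suite's actual routine.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define SAMPLE_SIZE (10 * 1000 * 1000)   /* the same "sample" every run */

int main(void)
{
    double *sample = malloc(SAMPLE_SIZE * sizeof *sample);
    if (!sample)
        return 1;
    for (long i = 0; i < SAMPLE_SIZE; i++)
        sample[i] = (double)(i % 97);    /* deterministic contents */

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0); /* POSIX wall-clock timer */

    double sum = 0.0;
    for (long i = 0; i < SAMPLE_SIZE; i++)   /* the fixed workload */
        sum += sample[i];

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (double)(t1.tv_sec - t0.tv_sec)
                + (double)(t1.tv_nsec - t0.tv_nsec) / 1e9;

    /* The elapsed time is the number two reviews can actually compare. */
    printf("checksum %.0f, elapsed %.4f s\n", sum, secs);
    free(sample);
    return 0;
}

Run the same build on two machines and the two elapsed times are directly comparable, which is exactly what swapping in different sample files would break.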
 
Maybe, just maybe, the consumer should be able to think a bit and say "Oh hey, iTunes, I use that" and "Oh hey, SuperPi, I don't use that"?
Or "These 2xGTX690 results don't really apply to me, I'll skip them".

If the consumer doesn't have the brains to do that they might as well go to Best Buy, really.

Speaking as a reviewer, I like my reviews to be comparable to each other. That means that all conditions other than the condition being tested must be identical in every review, or as close to identical as possible. They also need numbers.
Most reviews are sectioned into game benchmarks and "synthetic" benchmarks; the good ones are, anyway.
When you see a lower resolution being tested with a CPU, that is rather specifically to test the CPU: at high resolutions the graphics card becomes the bottleneck and the CPU makes very little difference.

Lastly, both of the benchmark links you gave have the i3 stomping in some places and the A10 in others (while drawing more power), so I'm not sure what your point is there.
 
They are things people might use; at least then they can make "real", informed decisions based on their own personal preferences.
 
No, seriously, what is your point? That Tom's Hardware has the best reviews?
That Tom's uses benchmarks that involve "real world" stuff, while other people don't?
 
My intention is not to insult anyone; it's just a suggestion, and I don't expect anyone to agree with me. They can draw their own conclusions from what I'm saying, and I don't think I can make what I'm saying any more clear.

So I think it's about done.
 
Frakk, I couldn't have said it all better myself.

To all of those talking about "synthetic" benchmarks, I have only one response to that.



Hardware and all settings are exactly the same; the only thing changed was the CPUID vendor ID string (i.e. from "VIA VIA VIA" to "GenuineIntel", "AuthenticAMD", etc.) and the CPU model/family number.
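
For anyone curious what an application actually sees, here is a minimal C sketch (x86, using GCC/Clang's cpuid.h helper) that just reads the vendor ID string back. It only reads the string; making an application see a different one, as in the test above, takes extra tricks this sketch does not attempt.

#include <cpuid.h>    /* GCC/Clang helper for the CPUID instruction */
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;     /* CPUID leaf 0 not available */

    /* Leaf 0 returns the 12-byte vendor ID in EBX, EDX, ECX order,
       e.g. "GenuineIntel" or "AuthenticAMD". */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    printf("CPUID vendor ID: %s\n", vendor);
    return 0;
}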
 
The point of that testing, though, guys, is to place the load on the CPU. Higher resolutions do not do that. ;)

DERP, that would be real-world instead of synthetic. Didn't think about that part of it.

Edit: Bubba, what is the table you have there?
 
Can you / did you run tests with REAL applications and see what happens? Would "real" applications show the same behavior when changing the ID?

(Your picture is blocked at my office so I can't see your response; host the images here if you would be so kind.)
 