
Tech Report published their microstuttering analysis. (Updated with PCPer.com data.)

PCPer made a similar review. I think those sites are linked, since both were published at the same time and the reviews are very similar.

http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Testin

There is a list of reviews, and they will even publish 7850/7870 and many other multi-GPU configurations with this setup to test the REAL FPS. Read these articles; you won't be disappointed and you will learn A LOT.

In short, they record the entire gameplay session on a secondary system with a special capture card and a program that adds a watermark next to each frame. Then they can run the recording through some analysis. FRAPS records its framerate at the beginning of the display chain, but it seems that a lot of frames can be dropped later on, before they ever reach the screen.
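The extraction step they describe basically boils down to counting how many scanlines each watermark colour occupies in the captured video, since each colour change marks a new rendered frame. A rough sketch of that idea (NOT the actual FCAT extractor; the constants, names, and runt cutoff here are my own assumptions):

```python
# Rough sketch of the capture-side analysis (not the real FCAT tool).
# Assume we already extracted, from the captured video, how many
# scanlines each overlay colour occupies per rendered frame.

CAPTURE_HZ = 60          # refresh rate of the capture card (assumed)
LINES_PER_FRAME = 1080   # scanlines per captured frame at 1080p
RUNT_THRESHOLD = 21      # frames under ~20 scanlines count as "runts" (assumed cutoff)

def observed_fps(scanlines_per_rendered_frame, duration_s):
    """Count only frames large enough to actually be visible on screen."""
    visible = [n for n in scanlines_per_rendered_frame if n >= RUNT_THRESHOLD]
    return len(visible) / duration_s

# Example: 8 frames were rendered (what FRAPS would count), but two are
# runts dropped in the display chain, so the observed FPS is lower.
lines = [540, 540, 5, 535, 540, 540, 8, 532]           # scanlines per frame
duration = sum(lines) / (LINES_PER_FRAME * CAPTURE_HZ)  # seconds of video
fraps_fps = len(lines) / duration
real_fps = observed_fps(lines, duration)
print(f"FRAPS-style FPS: {fraps_fps:.0f}, observed FPS: {real_fps:.0f}")
```

This is why the FRAPS number and the observed number can disagree: FRAPS counts every frame handed to DirectX, while the capture side only sees what actually made it onto the screen.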

[Attached charts: BF3 FRAPS FPS vs. observed FPS at 1920x1080 and 2560x1440]


This has nothing to do with fanboyism; I was in fact one click away from buying 7970s to run in CrossFire to improve the FPS and IQ of my Surround (Eyefinity) setup. I think a second 670 will do it until AMD can fix this.

After reading all this, you know that the FPS numbers are in fact wrong, mostly with AMD multi-GPU setups, since in a lot of titles a bunch of frames are dropped. FRAPS will show 80-100 FPS, but in fact maybe only 50-70 FPS actually reach the screen.

Some games, like Skyrim, seem to run OK without dropped frames, but I don't play Skyrim! :(
 
I think I will wait this one out for a while.

Using Nvidia software to measure in a specific way that puts Nvidia ahead of AMD.

You also have a reviewer that needs controversy to give this 'new' way of testing validity.

New software/hardware interfaces. They admit that there were problems getting it to work at all.

I'm not saying it's not true, but there are enough red flags here to wait for the other shoe to drop.
 
The computer that processes the image won't really know whether it's AMD or Nvidia pushing the frames; it only records the gameplay and the added watermark. Then they can tinker with all the recorded data. For a while now I've heard from HardOCP that in a few titles AMD CFX seems to give a worse gameplay experience; in simple words, they need 80 FPS to deliver smooth gameplay where Nvidia delivers smooth gameplay at 60 FPS. This just proved it: the AMD driver seems to drop frames more often than not.

I'm aware that Nvidia developed the tool, but they spoke with AMD, and AMD never said "oh, it's an Nvidia tool, we are right, they just want to f**k us up." No, AMD said yes, we are working on this problem and it should be fixed in a few months. And I truly want to believe it's going to be only a few months.
 
From the article.

NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file.


AMD is working on new drivers that will give the end user a choice.

LINK


As I said, I will wait to see the final results of this.
 
That makes two of us; I'm also waiting for more!

I was really in the process of buying two 7970s instead of a second 670. I also like to try each solution; my choice isn't sealed with Nvidia just because I already have a 670. A card is easily sold. I paid $310 for this 670 and I'm sure I can resell it for nearly what I paid. 7970s are kind of cheap compared to 680s, and sometimes they even sell at 670 prices.

I'll wait... If I wait too long, I'll simply go with the next generation, GTX 7xx or HD 8xxx.
 
Another site has tested FCAT:

http://www.guru3d.com/articles_pages/fcat_benchmarking_review,1.html

@soulcatcher
guru3d said:
It is a set of tools that derives from the NVIDIA performance laboratory. Now please don't throw objectivity and subjectivity concerns at us as yes, this methodology is coming from NVIDIA. Let me state upfront that pretty much all software and scripts can be read out as source code and the simple truth is that the FCAT benchmark method can't see the difference in-between AMD and NVIDIA graphics cards as we look at rendered frames, we are not measuring inside the graphics card(s).
 
I like that article much better. It actually reads like an impartial report.

Isn't Everybody Overreacting?

Stuttering, measuring anomalies... isn't everybody overreacting? Well yes and no. Small anomalies rendered and which you can experience on screen always have been a part of a graphics card and your overall game experience. For years now we have had that, for years now most of you have not been bothered by it. Aside from a small group enthusiast end-users and analysts, that is the primary context you need to keep in mind when it comes to FCAT measurements, really.

Yeah, average FPS is still (in my opinion) the most important denominator in terms of determining how fast a game can be rendered. Now that doesn't mean I am disqualifying frame time or what I like to call frame experience measurements, contrary. Frame experience measurements in my mindset will help as an extra tool and data-set to show you the relation of render performance versus what you see on screen. Frametime measurements are a tool to detect anomalies that we never really measured. So it is more a question of what can we accept when analyzing anomalies and what not, because some people will totally freak out if they see a couple of latency spikes in a chart. Realistically you'll be hard-pressed to notice it; heck, one big massive scary spike in a chart could even be something as simple as a game scene change. Frame Time / Frame Experience measurements however are becoming a part of Guru3D test and benchmark methodology. It will sit next to what we have always shown you, average FPS, as average FPS we still consider to be the best measurement we can fire off at you if you are asking the question "how fast is my graphics card". But an extra data-set that can detect anomalies obviously is great to have and show.

The other ones read like modern media, fear and doom make people watch.
 
You guys REALLY need to listen to the PCper.com podcast.

The "Nvidia software" was made by the same guys who made RivaTuner. From what I understand, it's the EXACT same type of overlay that MSI Afterburner uses.
And from reading the article, that's almost exactly what they did.


And Guru3D seems to have done pretty much exactly what PCper.com has been doing. It's even the exact same capture card. They even ran into the same issue with the PCIe-based SSD being too slow, which is why Ryan ended up using the two Thunderbolt RAID arrays with SSDs.

That probably reads like I'm saying Guru3D ripped off PCper's technique/research/procedure... and in a way I am. Give credit where credit is due.
Guru3D did well by using their own overlay and still getting the same results, which dismisses the whole "ZOMG, software made by Nvidia!" objection.

HardOCP's SLI vs. CrossFire reviews have talked about the experience they get from each for quite some time now, saying they prefer the SLI experience over CrossFire.

But back in the late 9800GX2 days, there was a lot of talk about "microstutter" as well. I'm sure it's always been there. Nvidia just fixed it before AMD did, and until now AMD didn't have a reason to. They've already projected a fix by June.
 