I have to do this from an iPad, so I apologize for any mistakes. I was wondering if someone could give a quick explanation of how power management on these more modern cards works. I was out of computers from 2004-2011, came back into it blind and built a new machine, and haven't kept up since. I'm looking to build another, but this power management stuff really has me turned off on upgrading the video cards.
Most of us probably remember how the old-school cards worked. Ti400, 9800 Pro, etc. Your card ran at max 100 percent of the time. You didn't get robbed of any processing power, because the card wasn't smart enough to decide you didn't need any more.
The power management stuff is nice when you're not playing games. Saves on noise, heat, etc. But I'm extremely turned off by the way it works in-game, so I'm trying to see if maybe I'm misunderstanding how it works. This has been my experience:
Started with a single 6970 playing BC2. Got annoying FPS drops here and there, and the ATI software showed the card running at 70%. Why not 100%? If it had been running at 100%, perhaps the FPS drops wouldn't have been an issue. Added a second 6970 just for ****s and giggles. FPS increased a bit, but nothing major, with both cards still not running near max.
BF3 with the two 6970s, same thing. It was good, but some maps would get pretty rough FPS drops. Checked a log of the GPUs, and what do you know, both running at 60-70 percent, if I remember right.
So if I understand this correctly, which I very well might not: the cards had more processing power available, but the power management decided it wasn't necessary, therefore not letting me take full advantage of the product, all in the name of keeping heat, the power bill, etc. down?