
What's a good 6950 cooler upgrade?

It makes sense that ArcheAge would cause problems; it's based on CryEngine 3 (the Crysis 3 engine), so I'd guess it puts a lot of load on your 6950.
Do you have good ventilation inside your case?

But Dead Island is based on the Chrome engine and, although DI isn't fantastically optimised, it shouldn't put that much hurt on the card.

What settings are used for DI?

This is for a "sing & dance" at 1920x1080. Note that ShaderPath must be set by hand and video.scr then flagged as read-only.

Code:
!Resolution(i,i)
!WindowOffset(i,i)
!Monitor(i)            // -1 primary monitor
!BPP(i)
!FSAA(i)               // Full Screen AntiAliasing, 0 - none, higher is better
!TexBPP(i)
!TextureQuality(s)     // VeryLow, Low, High
!Filtering(s)          // Bilinear, Trilinear, Anisotropic, AnisotropicTrilinear
!GammaFloat(f)
!MaxRefresh(i)
!Shadows(s)            // Low, High
!ShadowMapSize(i)
!SpotShadowMapSize(i)
!Lightmaps()
!Fullscreen()
!VSync()               // enable vertical sync
!MaterialQuality(i)    // lower is better
!WaterQuality(i)       // lower is better
!GrassQuality(i)       // lower is better
!FXQuality(i)          // higher is better
!FXFadeLevel(i)        // 0-4 (lower is better)
!EnvQuality(s)         // FullDynamic, RareDynamic, Static
!ShaderPath(i)         // 0 - 1.1, 1 - 1.4, 2 - 2.0, 3 - 3.0, 4 - x360, 5 - 4.0, 6 - ps3
!PostProcess(s)        // Simple, Normal
!DisplayDeviceType(i,f,f,f,f)  //device type (LCD TV etc.): 0 - default
!Curves(s)             // curves texture

Resolution(1920,1080)
Monitor(0)
BPP(24)
FSAA(2)
TexBPP(32)
TextureQuality("High")
Filtering("AnisotropicTrilinear")
Fullscreen()
Shadows("High")
ShadowMapSize(1024)
SpotShadowMapSize(1024)
Lightmaps()
GammaFloat(0.98)
MaxRefresh(60)
MaterialQuality(2)
FXQuality(3)
FXFadeLevel(1)
WaterQuality(2)
GrassQuality(0)
EnvQuality("FullDynamic")
ShaderPath(5)
VSync()
DisplayDeviceType(0,0.000000,1.000000,1.000000,1.000000)
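
If you'd rather script the read-only step than click through Explorer, here's a minimal sketch. The path is a placeholder; point it at wherever your Dead Island profile actually keeps video.scr.

Code:
import os
import stat

# Placeholder path -- adjust to your actual Dead Island profile folder.
scr = r"C:\Users\You\Documents\DeadIsland\out\Settings\video.scr"

# Set the read-only attribute so the game can't overwrite the
# hand-edited ShaderPath() value on the next launch.
os.chmod(scr, stat.S_IREAD)

# To edit the file again later, restore write access with:
# os.chmod(scr, stat.S_IREAD | stat.S_IWRITE)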
 
The inside is where the heatsink is. Cleaning the outside will do nothing for temps. I'm saying clean the heatsink, which is under that shroud, BEFORE you buy the cooler, to see if that helps.
 
The default fans on the HAF-X. Next month I want to get a few more fans as well, but I need to know which ones to get: I have to maintain positive pressure while still getting fans that will actually do something.

The outside of the card is clean and the inside I will blow out with compressed air when I go to install the new cooler.

I'm also buying the cooler because I don't want a card whose fan I have to ramp up to 85% to stay at safe temps.

That Arctic Cooling GPU cooler: is it a two-slot or a three-slot cooler? Just wondering if I have to move my sound card further down.

The Arctic Cooling one I have takes up three slots w/the stock fan assembly (which I never used).

Outlaw Star, I never saw that anime; was it any good? What was it about?
 

I liked it. It was a sci-fi action/comedy. Have you seen Cowboy Bebop? Both series were created by the same guy. It's like that. Kinda starts out slow but it picks up quickly.

As for the heatsink: I'm afraid to do this before the cooler comes, but I will take the card apart, look it over, and blow it out. It's just that I have big hands, so I try to do all my work at once; less chance for my hands to mess something up.

I still need to run ArcheAge later today. I went to see if upgrading the driver from 13.9 to 14.4 helped (I doubt it). I also want to log the temps, voltages, and currents for all hardware in HWiNFO; I want to be thorough in my investigation. I'll post the log here... I just need to get some coffee for my nerves.
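
For what it's worth, HWiNFO can log its sensor readings to CSV, and pulling the peaks out afterwards is easy to script. A minimal sketch, with the filename and column headers as guesses; match them to what your HWiNFO actually writes:

Code:
import csv

# Guessed column headers -- copy the exact ones from your log.
COLS = ["GPU Thermal Diode [°C]", "GPU VRM Temperature 1 [°C]"]

peaks = {c: float("-inf") for c in COLS}
with open("hwinfo_log.csv", newline="", encoding="latin-1") as f:
    for row in csv.DictReader(f):
        for c in COLS:
            try:
                peaks[c] = max(peaks[c], float(row[c]))
            except (KeyError, ValueError):
                pass  # missing column or non-numeric cell

for name, peak in peaks.items():
    print(f"{name}: peak {peak:.1f}")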
 
Settings for DI? I had them maxed at 1600x1200 since that's my personal favorite resolution.
 
Yeah well, the primary gaming monitor is an Acer GD245HQ, 120 Hz 3D for some games (not for DI though, DI is a bit weird in 3D... but Dead Space in 3D, now that's awesome), and the Acer kinda likes its native 1920x1080.

I wonder if there will be decent, fast 4K gaming monitors around Christmas/New Year. I'd like one, but currently they're far from mature or practical.

I mean... "split screen" 30 Hz left + 30 Hz right... it's like having a flashback to the late '70s: "wow, this is so much more awesome than Pong, dude... look how they show that approaching Death Star by interlacing the image."


Anyways... [/hijack]
 
Well, the native resolution for my monitor is 1920x1080; I personally just prefer 1600x1200. It's big enough that I can make out everything on screen and high enough to give good visuals, but it doesn't produce that crunched image the widescreen resolutions do.
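
For what it's worth, 1600x1200 is 4:3 while the panel is 16:9, so one of the two is always being scaled or pillarboxed; the quick arithmetic:

Code:
# Aspect ratios of the two resolutions under discussion.
for w, h in [(1920, 1080), (1600, 1200)]:
    print(f"{w}x{h}: aspect {w / h:.3f}")
# 1920x1080: aspect 1.778  (16:9 -- the panel's native shape)
# 1600x1200: aspect 1.333  (4:3 -- stretched or pillarboxed on a 16:9 panel)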

=========================

OK, I have the troublesome GPU removed and sitting on top of my desk, wrapped in an anti-static bag that I kept from when I bought my two 6950s (I regret buying them now). So my 2nd 6950 is in the computer now. I've been looking at some videos online; I have to remove all the screws on the back, and there are supposed to be screws holding the connection plate in too, right?

EDIT: Also, I have my GPU in the top PCI-e slot on my Crosshair V Formula motherboard and my sound card in the 2nd slot from the bottom. Will I have enough space to fit the cooler (Arctic Cooling Accelero XTREME Plus II)? It looks like it's roughly the same size as the stock cooler, though pictures are known to deceive.
 
You'll probably have to find a new home for the sound card.

Even if it would fit in the slot below, it would (partially) block the fan(s).

Also, unless you need extremely high-quality sound, like for tinkering with music/composing, there's nothing wrong with the built-in audio chip. OK, the ALC is maybe not the best out there, but if it's for explosions & bullet impacts, it does fine.
 

I don't think the bottom slot is good. I remember having to move it from that slot over a year and a half ago. *Sigh* And I liked my card too...
 

Need to go through the specs/manual of your MB... the bottom slot is probably "shared".
For example, a board has 7 slots, but as soon as you use the 3rd PCI-E slot, the 4th/5th PCI slot is disabled. They've been doing that with SATA also: 6 or 8 SATA ports, but the last two are shared with the eSATA ports, meaning you can have either an internal hard drive OR an external eSATA drive on that port, not both.

Could be something similar here, the last slot/port being disabled if all the PCI-E slots are "filled".

After all, the 990FX only supports TWO PCI-E x16 links... so how do they get the 3rd & 4th PCI-E slots to work? By some trickery in detection & fallback speeds, of course. Looking at the board, the slots are as follows, top to bottom:
PCIE_X16
PCIE_X1
PCIE_X8/X1
PCI
PCIE_X16/X8
PCIE_X4

They are probably doing some trickery with the slots; see pages 43-44 of the MB manual.
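
To make the lane-sharing idea concrete, here's a toy model; the slot names, pairings, and widths below are illustrative guesses, not the Crosshair V's actual routing (the manual has the real table):

Code:
# Toy model: electrically paired slots split a fixed lane budget when
# both are populated. Slot names and pairings here are hypothetical.
FULL_WIDTH = {"PCIE_X16_1": 16, "PCIE_X8": 8, "PCIE_X16_2": 16, "PCIE_X4": 4}
SHARED_PAIR = {"PCIE_X16_1": "PCIE_X16_2", "PCIE_X16_2": "PCIE_X16_1"}

def link_width(slot, populated):
    """Effective link width for `slot`, given the set of occupied slots."""
    partner = SHARED_PAIR.get(slot)
    if partner and partner in populated:
        return FULL_WIDTH[slot] // 2  # both halves of the pair fall back
    return FULL_WIDTH[slot]

print(link_width("PCIE_X16_1", {"PCIE_X16_1"}))                # 16: alone
print(link_width("PCIE_X16_1", {"PCIE_X16_1", "PCIE_X16_2"}))  # 8: shared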
 

No, the port just died. The sound card had been in it for close to a year and a half (with the two 6950s briefly, then the single-GPU setup) when one day Windows froze and I lost sound on reboot. I moved it up one slot and it's been fine ever since.

That Gelid cooler that Mr.Scott posted: http://www.performance-pcs.com/cata...oduct_info&cPath=54_196_692&products_id=29262

Could I get that on my GPU without having to ditch my sound card? I really don't want to get rid of it if I can help it.
 
The Gelid is still a 2.5/3-slot cooler... and it won't change the fact that the sound card would be blocking air to the fans.

Wanna do CrossFire, with the sound card between the two GPUs...?
Put the top one under water, leave the bottom one on air (stock, Arctic, or Gelid).

If only one GPU, move it below the sound card: put the GPU in the X8 or the bottom X16/X8 slot.
 

If you remove the heatsink, I think the thermal pads on the VRAM have to be replaced, as well as the thermal pads on the VRMs (for the VRMs this might even be especially important).
 
I don't know what the deal with Dead Island is, but when I played ArcheAge the reported board temp was only 79 °C. I updated my drivers to the latest non-beta... What are the max temps for the entire chipset?

Just my $0.02 here: 79 °C is quite cool IMO for an AMD GPU.

I've run mine mining (270X, 7850, 6850) at 85+ °C for weeks without issue. :shrug:
 

Yep, that bothered me too. But MSI Afterburner only reads the general board temp; I would've had to be reading the other temps as well, which HWiNFO provides. I'd have to wait for ArcheAge's Closed Beta 3, but I'm just gonna get a cooler and put it on. For now I'm gonna open the card up and check for dust, though I'll be honest, I'm not expecting to find anything significant.

Yup, unless they're totally destroyed. :thup:

What would destroy them: wear over time, or me removing the shroud and cooler?

Also, do coolers come with replacement thermal pads, or do I have to order those separately?
 
MSI Afterburner reads the core temp, not the board temp. HWMonitor wouldn't show that either. I'm not sure I've ever seen any GPU report board temps. Motherboards do, however.

The coolers come with the pads in most cases. Just read what it comes with before buying. ;)
 
OK, I've removed all the screws on the back, the two screws on the connection plate, and an extra screw that was holding the connection plate to the card. I still can't lift the shroud off, I don't see any more screws, and I'm afraid to force it open for fear of damaging the card. So what's the trick?


HWiNFO gives me several GPU temps... If these are not board temps, then what are they? They can't all be core temps, can they?

Under ATI/AMD Radeon HD 6950: Internal

GPU Thermal Diode
GPU TS0 (DispIO)
GPU TS1 (MemIO)
GPU TS2 (Shader)

Under ATI/AMD Radeon HD 6950: CHiL/IR PMBus - GPU Core

GPU VRM Temperature 2

Under ATI/AMD Radeon HD 6950: CHiL/IR PMBus - GPU VRAM

GPU VRM Temperature 1

Also, why don't the VRM temps move beyond 25 °C? Does the 6950 just not have sensors for that part of the card? Because neither of my 6950s seems to record those temps.
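
One way to tell whether a reading like that is live or just a stuck/unsupported sensor is to check whether it ever moves over a logging session. A minimal sketch, again assuming a HWiNFO CSV export and guessing at the column header:

Code:
import csv

COL = "GPU VRM Temperature 1 [°C]"  # guessed header -- match your log

vals = []
with open("hwinfo_log.csv", newline="", encoding="latin-1") as f:
    for row in csv.DictReader(f):
        try:
            vals.append(float(row[COL]))
        except (KeyError, ValueError):
            continue

if vals and max(vals) - min(vals) < 1.0:
    print(f"{COL} never moves ({vals[0]} °C): likely no real sensor")
elif vals:
    print(f"{COL} ranges {min(vals)}-{max(vals)} °C: sensor looks live")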
 
None of those are board temps, right.

Some 6950s may have those VRM sensors, but not all.
 