
Linux GTX 970 questions


RoXQi3x
Member · Joined May 5, 2012 · Location: Norway
So, I'm currently enjoying two Ubuntu builds with high-end Nvidia cards, but I'm really a Linux noob and also a high-end graphics noob. I'm used to AMD Tahiti/Hawaii on low-end mobos and even GPU extenders - because even PCIe 1x didn't really hurt the PPD. Then I got excited about the 970 and Ubuntu and ran out to buy a new mobo I already knew, without looking too closely at the PCI Express needs of Nvidia cards.

Anyways, I would like a Linux way of seeing the current PCI Express speed. The Nvidia X Server Settings panel gives some nice numbers, but only the max PCIe speed, not the current one - at least not that clearly.
GPU-Z in Windows shows the real deal - there's the 1x extender I got going!

How can I see the current PCIe link speeds (4x/8x/16x) in Linux?
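
(A couple of terminal commands that should show the current link, for reference - a sketch assuming pciutils and the NVIDIA driver's nvidia-smi are installed; the 01:00.0 bus address is only an example and needs adjusting to the card in question:)

Code:
# Current vs. maximum PCIe generation and link width, per GPU, from the NVIDIA driver
nvidia-smi -q -d PCIE

# Kernel's view of the link: LnkCap is the card's maximum, LnkSta is what it negotiated
sudo lspci -vv -s 01:00.0 | grep -E "LnkCap:|LnkSta:"

# The same via sysfs (find the bus address with: lspci | grep VGA)
cat /sys/bus/pci/devices/0000:01:00.0/current_link_speed
cat /sys/bus/pci/devices/0000:01:00.0/current_link_width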

And btw, I did say questions - I have managed to overclock the first GPU with coolbits, but how can I get that option on the other cards?

[Attached screenshot: PCI_GUPPIE4.jpg]
 
Below is a copy of the current xorg.conf file from one of my computers, with the coolbits overclock enabled on 2x GTX 970. The config was originally set up with a pair of 750 Ti cards and earlier drivers, but this is a copy of the file currently running. Hope this helps.

I don't remember the details of how I got it working, so I have been replacing the default file with this working file as the systems are upgraded. I recall editing the xorg file per the various suggestions found via Google, but I still had issues until I connected monitors to both cards at the same time; then I was able to get it working on the 2nd card. It continued to work after disconnecting the 2nd screen. I do not have voltage adjustments, but the 970s run well at ~1500 and the 750 Tis at ~1400.

My OC settings have to be manually entered after rebooting, so if someone knows how to make them stick, please help.


Code:
# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 343.13  (buildd@lgw01-30)  Mon Aug 11 19:50:14 UTC 2014

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 340.46  (buildmeister@swio-display-x86-rhel47-03)  Wed Sep 24 14:38:35 PDT 2014

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" 1024 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "IBM T560"
    HorizSync       31.5 - 60.0
    VertRefresh     56.2 - 75.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "DELL E153FP"
    HorizSync       30.0 - 61.0
    VertRefresh     56.0 - 76.0
EndSection

Section "Device"

#   Option         "Coolbits" "8"
#   Option         "Coolbits" "12"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 750 Ti"
    BusID          "PCI:1:0:0"
EndSection

Section "Device"

#   Option         "Coolbits" "8"
#   Option         "Coolbits" "12"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 750 Ti"
    BusID          "PCI:2:0:0"
EndSection

Section "Screen"

#   Option         "Coolbits" "8"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Coolbits" "12"
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-0"
    Option         "metamodes" "nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"

#   Option         "Coolbits" "8"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "Coolbits" "12"
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-0"
    Option         "metamodes" "nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
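
On making the OC settings stick: the same attributes can be set from the command line with nvidia-settings, so one common approach is to put them in a small script and run it as a startup application once X is up. A minimal sketch, assuming Coolbits is already enabled as above; the offsets and fan speed are example values only, and the performance-level index (the [3]) can differ between cards and driver versions:

Code:
#!/bin/bash
# Re-apply overclock and fan settings after login (example values only)
nvidia-settings \
  -a "[gpu:0]/GPUGraphicsClockOffset[3]=100" \
  -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=400" \
  -a "[gpu:0]/GPUFanControlState=1" \
  -a "[fan:0]/GPUTargetFanSpeed=75" \
  -a "[gpu:1]/GPUGraphicsClockOffset[3]=100" \
  -a "[gpu:1]/GPUMemoryTransferRateOffset[3]=400" \
  -a "[gpu:1]/GPUFanControlState=1" \
  -a "[fan:1]/GPUTargetFanSpeed=75"

Adding that script to Startup Applications (or whatever runs at login on the desktop in use) should re-apply the settings after each reboot.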
 

I tried copying this file, and it crashed my Zorin setup. Had to copy the backup back over it.

If I'm reading it right, you need 2 monitors plugged in (one for each card) at boot to get this to work?
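
For anyone else swapping the file in, keeping the old config around means it can be restored from a text console (Ctrl+Alt+F1) if X won't start. A sketch, assuming the usual Ubuntu location of /etc/X11/xorg.conf:

Code:
# Back up the working config before replacing it
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak
# ...and roll back if the new one breaks the desktop
sudo cp /etc/X11/xorg.conf.bak /etc/X11/xorg.conf
# then restart the display manager (lightdm on stock Ubuntu 14.04) or simply reboot
sudo service lightdm restart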
 
Yes. I think I had one screen connected to each card when booting, then went into nvidia-settings and detected the screens. I only have 2x GPU in the hosts. I was not able to get the manual edits to work with just a single screen, most likely due to not getting the cfg file format correct.

Sounds like chuckerants got coolbits working.

edit: uploaded a copy of xorg.conf with coolbits working on 2x GPUs, Ubuntu 14.04. I had to add a .txt extension to upload the file, so rename it back to xorg.conf.
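
For reference, nvidia-xconfig can also generate a config with a screen and Coolbits on every GPU in one go, which may avoid the plug-a-monitor-into-each-card step (a sketch; results can vary by driver version, and it overwrites /etc/X11/xorg.conf, so keep a backup):

Code:
# Write a new /etc/X11/xorg.conf with a Device/Screen section per GPU
# and the Coolbits option set on each of them
sudo nvidia-xconfig --enable-all-gpus --cool-bits=12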
 

Attachments

  • xorg.conf.txt (3 KB)
Mine is working. All I did was copy HayesK's file into the config file, though I only have one monitor connected. I have the fan on 75%, and I "think" I may even be OCed.
 
I only have a single screen attached to each 4-way KVM. There are 3 KVMs, all with different screens (2x 15" and 1x 22"), and all using the same xorg file. Currently have 8x GTX 970, 6x GTX 750 Ti, 2x GTX 670 and 2x GTX 660 Ti cards running.
 