  1. #1
    Senior Member KillrBuckeye
    Join Date: Jan 2005
    Location: Livonia, MI

    Question about using a gaming resolution that differs from desktop resolution.

    I want to use 1152x864 resolution for a certain game, but my desktop resolution is 1280x1024. When I change to 1152x864 while playing a game, the image on my CRT monitor (19" Mitsubishi Diamondtron) becomes small, i.e. it does not use the entire viewable area of the screen. I realize that I can adjust my monitor so the image covers the whole screen, but my question is whether my monitor will "remember" its settings in 1280x1024 mode. I don't want to mess with my monitor settings until I know that my settings (geometry, size, etc.) at desktop resolution will be safe.

    I believe that my previous video card allowed me to adjust the size and geometry of the image on the monitor via software controls. I briefly looked for such a feature with my current card (GeForce 6800GT 128MB w/ Coolbits registry edit), but I couldn't find one. Is there such a feature with the nVidia driver/Coolbits that allows one to save different geometry, size, etc. settings for each resolution mode? Thanks in advance for any help you can offer.
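
    For anyone curious how the mode switch itself happens: below is a minimal, untested sketch of the standard Win32 ChangeDisplaySettings calls a game (or a small launcher) can use to switch to 1152x864 and then restore the desktop mode afterwards. The 1152x864 mode and 32-bit colour depth are assumptions to match this thread, and the sketch does not touch geometry or size settings at all.

    /* Minimal sketch (untested): switch to a game resolution with the Win32
     * ChangeDisplaySettings API and restore the desktop mode afterwards.
     * Build with a Windows toolchain; links against user32. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE desktop = { 0 };
        desktop.dmSize = sizeof(DEVMODE);

        /* Remember the current (desktop) mode so we know what we started from. */
        EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &desktop);
        printf("Desktop mode: %lux%lu\n", desktop.dmPelsWidth, desktop.dmPelsHeight);

        /* Describe the game mode we want: 1152x864 at 32 bpp (assumed). */
        DEVMODE game = { 0 };
        game.dmSize       = sizeof(DEVMODE);
        game.dmPelsWidth  = 1152;
        game.dmPelsHeight = 864;
        game.dmBitsPerPel = 32;
        game.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

        /* CDS_FULLSCREEN makes the change temporary; nothing is written to the registry. */
        if (ChangeDisplaySettings(&game, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL) {
            fprintf(stderr, "Could not switch to 1152x864\n");
            return 1;
        }

        printf("Running at 1152x864... press Enter to switch back.\n");
        getchar();

        /* Passing NULL restores the default mode stored in the registry (the desktop mode). */
        ChangeDisplaySettings(NULL, 0);
        return 0;
    }
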
    KillrBuckeye

    Scarlet: i7 3770K @ 4.4 GHz (1.19V), Gigabyte GA-Z77X-UD5H, 16GB (2x8GB) GSkill Ripjaws DDR3 1600, eVGA 8800GT, OCZ Vertex 3 120GB, WD 750GB Black, Samsung EcoGreen 2TB, SB Audigy 2, Corsair 450VX, Dell 2005FPW + ASUS VH226H, Windows 7 Ultimate 64-bit

    Gray: Q6600 G0 @ 3.4 GHz (378*9, 1.31V), Abit IP35-E, 2x2GB Corsair XMS2 PC2-6400, HD 5450, OCZ Agility 60GB SSD, Corsair CX500

    http://www.killrbuckeye.com

  2. #2
    Inactive Pokémon Moderator JigPu
    10 Year Badge
    Join Date: Jun 2001
    Location: Vancouver, WA
    A CRT should have enough memory to hold separate settings for several (if not every) resolution. Unless your CRT uses analog controls instead of digital ones, it should be fine to adjust and tweak it to fill the screen.

    JigPu
    .... ASRock Z68 Extreme3 Gen3
    .... Intel Core i5 2500 ........................ 4 thread ...... 3300 MHz ......... -0.125 V
    2x ASUS GTX 560 Ti ............................... 1 GiB ....... 830 MHz ...... 2004 MHz
    .... G.SKILL Sniper Low Voltage ............. 8 GiB ..... 1600 MHz ............ 1.25 V
    .... OCZ Vertex 3 ................................. 120 GB ............. nilfs2 ..... Arch Linux
    .... Kingwin LZP-550 .............................. 550 W ........ 94% Eff. ....... 80+ Plat
    .... Noctua NH-D14 ................................ 20 dB ..... 0.35 °C/W ................ 7 V


    "In order to combat power supply concerns, Nvidia has declared that G80 will be the first graphics card in the world to run entirely off of the souls of dead babies. This will make running the G80 much cheaper for the average end user."
    "GeForce 8 Series." Wikipedia, The Free Encyclopedia. 7 Aug 2006, 20:59 UTC. Wikimedia Foundation, Inc. 8 Aug 2006.

  3. #3
    Member mblue
    Join Date: Nov 2004
    Location: York, PA
    Shouldn't you just adjust it in the game's video options? Then when you exit the game, your desktop resolution should go back to its original setting. ^^^What the hay is CRT memory?


    MSI P6N-Sli, E8400 C2D @ 3.6 GHz, 2 x 2 GB of DDR2 800 Geil, EVGA 9800GTX+ 512 SLI, 22" WS and Vista Ultimate X64, SP1

    MSI P6N-Sli, E2160 @ 3 GHz, 2 x 1 GB of DDR2 800 Geil, ATI X1950XTX, 19" LCD and Vista Ultimate X64


    MSI P6N-Sli, E2160 @ 3 GHz, 2 x 1 GB of DDR2 800 Geil, Asus 3850-512, 19" LCD and Vista HP X64

  4. #4
    The memory that holds the settings you select (colour, brightness, and so on) in your CRT is also used to hold the display adjustments for different resolutions. Strictly speaking, those stored adjustments only apply to one card at a time, since the output frequencies can differ from card to card and may not match the timings the monitor has memorized (this happened more in the old days than it does now). That's why moving a monitor to another card sometimes means readjusting it: the monitor is remembering the settings you chose for the other card.

    That software adjustment thing just changes the aforementioned output frequencies (cards put horizontal and vertical frequencies in the signal) to match the monitor, instead of the other way round. I think.
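
    For what it's worth, those horizontal and vertical frequencies fall straight out of the mode timings the card sends. A small sketch of the arithmetic, assuming the standard VESA timing for 1152x864 @ 75 Hz (108 MHz pixel clock, 1600 total pixels per scan line, 900 total lines per frame):

    /* Sketch: the pixel clock and the total (visible + blanking) size of each
     * scan line and frame determine the horizontal and vertical frequencies
     * the monitor has to lock on to. Numbers assume the standard VESA timing
     * for 1152x864 @ 75 Hz. */
    #include <stdio.h>

    int main(void)
    {
        double pixel_clock_hz = 108000000.0; /* 108 MHz dot clock          */
        double h_total        = 1600.0;      /* total pixels per scan line */
        double v_total        = 900.0;       /* total lines per frame      */

        /* Lines drawn per second = pixels per second / pixels per line. */
        double h_freq_khz = pixel_clock_hz / h_total / 1000.0;

        /* Frames drawn per second = lines per second / lines per frame. */
        double v_freq_hz = (pixel_clock_hz / h_total) / v_total;

        printf("Horizontal: %.1f kHz\n", h_freq_khz); /* prints 67.5 kHz */
        printf("Vertical:   %.1f Hz\n",  v_freq_hz);  /* prints 75.0 Hz  */
        return 0;
    }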
