
Resolutions not available


shadokatnj

New Member
Joined Jul 8, 2011
Hello,

I have a Toshiba 46in rear-projection HDTV with HDMI input.
I have a Windows XP HP 2300 with 2GB RAM and an ATI Radeon 5750 graphics card, with the highest DirectX version available.

After installing the graphics card with drivers, I can get either 720p (1280x720 @ 60Hz) or 1080i (1920x1080 @ 30Hz) working fine in Windows: playing movies, watching TV on my TV tuner card, etc. Windows looks kinda crappy in 1080, but movies and TV always look great. My issue is when I play games. When I load up COD or Unreal or Borderlands, every time, the maximum resolution the games say I have is 480p (720x480). Every single game. And I can't figure out why the games aren't picking up the resolution that Windows is currently using.

What's more annoying is that my laptop (Win7, dual core with 4GB RAM and a Radeon 3200 (or 2300, I forget) graphics accelerator) loads HD resolutions perfectly fine when I connect it to my TV. I can play Borderlands in 720p or 1080, or whatever I want (obviously it's slow because the graphics accelerator can't handle the game), but those resolutions at least appear as options.

It's hard for me to believe that my laptop's weak graphics accelerator can push those resolutions to a game but my Radeon 5750 can't. Ideas?

I tried reinstalling drivers and such. No dice. Also tried adding the resolution to the registry with PowerStrip. Still no dice.

How does a game even pick up resolutions?
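(For context: a DirectX 9 era game typically builds its resolution menu by asking the driver for its fullscreen mode list, not by reading the current desktop setting. A minimal C++ sketch of that query, assuming the DirectX 9 SDK headers and a MSVC build:

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // This mode list is what a game's video-options menu is built from.
    // If the driver only reports TV-safe modes here, every game maxes
    // out at 720x480 no matter what the desktop is set to.
    UINT count = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT, D3DFMT_X8R8G8B8);
    for (UINT i = 0; i < count; ++i) {
        D3DDISPLAYMODE mode;
        if (SUCCEEDED(d3d->EnumAdapterModes(D3DADAPTER_DEFAULT,
                                            D3DFMT_X8R8G8B8, i, &mode)))
            std::printf("%ux%u @ %u Hz\n", mode.Width, mode.Height, mode.RefreshRate);
    }
    d3d->Release();
    return 0;
}

That mode list comes from the driver, which is why the same card can report different things to Windows and to games.)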

I want to get into overclocking; I just want to get things squared away and working first.
 
Simply put: analog-type TV outputs only have resolutions up to about 720x480; that is their max.
Scan converters for TV-out will usually only go so high in resolution before they start panning, so even high-end D-to-A conversion would stick you at some lower resolution, and anything over ~720x480 is going to have been interpolated. The output over "video" (composite) or S-Video will always still be ~720x480. So maybe the laptop does a different conversion and a nicer interpolation, so you can view it on any TV.

So that seems like your stickler. Then there's cloning (which forces the same resolution on both displays), which display is primary, detection of the TV, and all that stuff:
what is on, what is off, and how the video card is being trapped based on what it thinks its capabilities are.

Now you know, and if you need more help just ask again, but I think that is the box you're finding the video trapped in. Usually you can have your cake and eat it too, but it might mean switching something for your different uses, like turning off an adapted monitor (and not cloning) when you want to run high-res digital or VGA.

The HDMI output should do what HDMI does (as you know), so it is some other analog-TV-like adaptation, connection, detection, or setting in the video card's software.
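One way to see exactly which modes the driver is exposing to Windows is to enumerate them yourself and compare that list against what the games offer. A minimal C++ sketch using the Win32 EnumDisplaySettings call (just a diagnostic aid, not a fix):

#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    // Walk the mode table the display driver reports to Windows.
    // If the HD modes show up here but games still only offer 480p,
    // the clamp is in what the driver reports for fullscreen modes.
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
        std::printf("%lux%lu @ %lu Hz, %lu bpp\n",
                    dm.dmPelsWidth, dm.dmPelsHeight,
                    dm.dmDisplayFrequency, dm.dmBitsPerPel);
    return 0;
}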
 
Hey,

Thanks for the reply. I realize I probably wasn't that specific in my question. All the video is being sent over HDMI; nothing is converted to analog, and no analog TV-out is involved. HDMI on my desktop's video card to HDMI on my TV, and the same for the laptop.
 
Yes, I re-read it and figured that, except for the word TV tossed in.

I tried to think of other reasons, but the 720x480 thing stuck, because that is what used to happen when we were trying to display on normal TVs.
So:
Was there ever an old analog TV connection made?
Did you check for all the things like that?
Any adaptations on the card that could head in that direction?
Anything in the "TV detection"? Did you look through the CCC really well?

Removing drivers does not always remove "settings"; leftover setting items can stay in the registry (see the sketch below for where they live).
If you use a driver sweeper, I think it shreds even the settings items (Guru3D has a good driver cleaner).
Did you check for cloning of monitors?
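If you want to eyeball what a driver left behind, per-adapter display settings live under the Video key in the registry. A small C++ sketch that just lists the adapter subkeys there (the exact leftover value names vary by driver, so treat this as a starting point for poking around, not a cleaner):

#include <windows.h>
#include <cstdio>
#pragma comment(lib, "advapi32.lib")

int main()
{
    HKEY key;
    // One GUID subkey per display adapter Windows has set up.
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                      "SYSTEM\\CurrentControlSet\\Control\\Video",
                      0, KEY_READ, &key) != ERROR_SUCCESS)
        return 1;

    char name[256];
    DWORD len = sizeof(name);
    for (DWORD i = 0;
         RegEnumKeyExA(key, i, name, &len, NULL, NULL, NULL, NULL) == ERROR_SUCCESS;
         ++i, len = sizeof(name))
        std::printf("%s\n", name);

    RegCloseKey(key);
    return 0;
}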

Then maybe go to Device Manager, open the Monitors section, show hidden devices, and uninstall ALL the monitors there, then reboot (or reinstall the one monitor again).
While you're in Device Manager, make sure you check which video card is shown there.

What else:
Make sure no "refresh rates" are set way too high; set the card's refresh rate for the monitor to 60Hz for testing (see the sketch below).
Run through a quick DxDiag (DirectX diagnostic) check and see what it is saying in there.
Ah, and for games, it might be some "compatibility" mode set.
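You can also ask the driver directly whether it will accept a given mode without actually switching to it. A minimal C++ sketch using the Win32 ChangeDisplaySettings test flag (1280x720 @ 60Hz here is just an example target):

#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth        = 1280;   // example target: 720p @ 60Hz
    dm.dmPelsHeight       = 720;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // CDS_TEST asks the driver to validate the mode without applying it.
    LONG r = ChangeDisplaySettings(&dm, CDS_TEST);
    if (r == DISP_CHANGE_SUCCESSFUL)
        std::printf("driver accepts 1280x720 @ 60Hz\n");
    else
        std::printf("driver rejected the mode (code %ld)\n", r);
    return 0;
}

If the driver rejects a mode here, no game will be able to offer it either.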
 
Hmm. I have definitely looked through CCC well, but I haven't tried some of those other suggestions. AND you are correct: previously I did have an analog TV connected. So I will go through each suggestion you made and see how it goes. Give me a day or two and I'll report back my findings. Thanks for the help again!
 