
HELP: Windows 7 and 720p 32" panel


g0dM@n

Long story short.

32" Element 720p HDTV (1080i compatible). Native 1366x768.
HD4870 512MB Radeon with the LATEST CCC/drivers

1366x768 is never an option, so far as I have seen, on any OS. Closest I've seen was 1360x768, but that ends up with a few fuzzy verticals.

On Windows XP, the monitor looked great at 1280x768:
We had small sections cut off on the edges, but the image was crisp. Both VGA and HDMI would do this.

On Windows 7:
I CAN'T GET IT TO WORK RIGHT! The only thing that looks crisp is 1024x768, with huge areas cut off. If I set 1280x768, the TV still squeezes it into a 1024x768 box, so it ends up fuzzy.

Anyone have a clue? I've been messing around for a while and even tried PowerStrip (no pro at it).
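For what it's worth, the fuzzy verticals at 1360x768 are most likely just non-integer scaling: the TV has to stretch 1360 source columns across a 1366-column panel, so a column gets doubled or blended every couple hundred pixels. A rough back-of-the-envelope sketch in Python (assuming the set stretches the image to fill the panel rather than centering it):

    # Panel is 1366 px wide; desktop is 1360 px wide.
    panel_w = 1366
    source_w = 1360

    scale = panel_w / source_w   # ~1.0044 panel px per source px
    extra = panel_w - source_w   # 6 columns have to be invented somewhere
    spacing = source_w / extra   # roughly one blended column every ~227 px

    print(f"scale={scale:.4f}, {extra} extra columns, one blurred column every ~{spacing:.0f} px")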
 

I'm rocking a 720p Samsung 32" LCD.

My rig is in my sig (still running VGA, not even HDMI :D).

I don't have a bit of a problem watching "Blu-ray" movies, x264, surfing the web, or anything.

Screen resolution is 1360x768, 60 Hz, 32-bit color (run off the GTS 450).
 
Well, my brother's is 1366x768, but it's overscanning when I choose it and I can't adjust the overscan :(
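For a sense of scale, consumer sets historically overscan somewhere in the 2.5-5% range, so the desktop loses a few dozen pixels off every edge and the rest gets rescaled. A rough guess at the numbers in Python (the 5% figure is an assumption, not this particular TV's actual overscan):

    # Hypothetical 5% overscan on a 1366x768 desktop.
    w, h = 1366, 768
    overscan = 0.05                  # assumed fraction; varies by set

    lost_w = int(w * overscan / 2)   # ~34 columns hidden per side
    lost_h = int(h * overscan / 2)   # ~19 rows hidden top and bottom
    print(f"hidden: ~{lost_w} px left/right, ~{lost_h} px top/bottom")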
 
Older 720p panels truly are a PITA!! I hate how badly they shake hands with PCs.
 
how's it hooked up to the TV?

I tried both VGA and DVI to HDMI.

I used PowerStrip (the application) and was able to get 1366x768 set in Windows, and the TV recognized it as 1366x768, but it's overscanned. With ATI's CCC, you cannot adjust overscan over VGA, only over a digital connection (stinks for me).

With digital I was able to get resolution options all the way to 1920x1080 (most likely because the TV is 1080i compatible)...

The TV is only able to display ONE resolution as a truly CRISP setting, which is 1024x768. Since the TV is a 720p panel, 768 is the vertical resolution you really want, so that the scan lines map onto the panel 1:1, which is what gives you a crisp picture. That's why 1280x768 and 1366x768 should just work, and the only difference should be how much of the screen is filled toward the outer (left and right) edges. So at 1024x768 you get a crisp picture in the middle of the screen with a nice chunk cut off the outer edges.

When the PC had WinXP, 1280x768 worked, and a much smaller piece was black on the outer edges, which was VERY bearable for my brother. 1366x768 would never work on WinXP either...

So I tried both VGA and DVI to HDMI. :( When I set 1280x768 (which used to work on XP), the TV then detects 1024x768 and jams it into a 1024x768 box, so the image is not crisp.

Both the TV and the computer can be set to 1366x768, but they never actually agree on an exact 1:1 match at any resolution EXCEPT 1024x768... if any of this makes sense.

Note: to get 1366x768, you HAVE to use PowerStrip.
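To put numbers on that "crisp in a box" behaviour: if the set really maps its one accepted resolution 1:1 onto the middle of the 1366x768 panel, a 1024-wide image leaves about 171 unused columns per side, while 1280 would only leave about 43 (which matches the XP experience). And cramming a 1280-wide desktop into a 1024-wide box is a 5:4 resample, exactly the kind of non-integer mapping that kills sharpness. A quick sketch in Python (assuming a centered 1:1 mapping for whatever the TV accepts natively):

    panel_w = 1366

    def border(source_w):
        # unused panel columns on each side if the source is shown 1:1, centered
        return (panel_w - source_w) // 2

    print(border(1024))   # 171 px of black per side (the "huge areas cut off")
    print(border(1280))   # 43 px per side (the old XP setup)

    # The Windows 7 problem: a 1280-wide desktop crammed into the 1024-wide box
    print(1280 / 1024)    # 1.25 -> every 5 source columns share 4 panel columns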
 
In my experience with older TVs, the trick was the Hz. Some do not like the default of 60 Hz; some need lower or higher, depending on the TV. Do not ask what utilities I used, as this was some time ago and my memory continues to degrade. :sly:

Not sure if it will help your issue (which appears to be resolved :thup:), just figured I would mention it.
 
It's not fully resolved. It's still not a 1:1 match. We are running 1366x768 in Windows and the TV detects it as 1280, so the 1366 is smushed down to 1280... not too bad, but it loses that crisp native-resolution feeling. I personally cannot EVER use an LCD at anything but its native res, to be quite honest.

So... I did try messing with the Hz by using PowerStrip, but man, there are SO MANY other options in there!! I was scared to try them lol...
Any pointers?
 
Yes, be 100% sure it's working before you hit OK. :D

Honestly, it has been so long since I have played around with a TV/monitor exhibiting this issue, but I do recall that if you set the wrong Hz, or go too high/low, it will POOF'r on you. My suggestion would be to go VERY slow either way (60 Hz default, try 58/62). Also, if you happen to have the old TV manual, or can dig it up somewhere, it _usually_ has the information on what Hz the TV can support, which helps take the mystery out of it, just LESS fun. :thup:
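If it helps demystify what PowerStrip is actually changing: the refresh rate is just the pixel clock divided by the total horizontal and vertical timings (active plus blanking), so nudging 60 Hz to 58 or 62 at fixed timings only moves the pixel clock by a few MHz. A rough illustration in Python (the timing totals below are plausible placeholders for a 1360/1366-wide mode, not this TV's documented limits):

    # refresh (Hz) = pixel_clock / (h_total * v_total)
    h_total = 1792   # assumed total width including blanking
    v_total = 795    # assumed total height including blanking

    def pixel_clock(refresh_hz):
        return refresh_hz * h_total * v_total

    for hz in (58, 60, 62):
        print(f"{hz} Hz -> {pixel_clock(hz) / 1e6:.1f} MHz pixel clock")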
 
Lower Hz may destroy the TV? Oh boy... lol, cuz I did try some random #s at some point... well, I was basically going through all of the profiles in PowerStrip. You know, some were like 1366x768 PDP (something like that), 1366x768 (Sony), 1366x768 (LCD), 1366x768 (Analog), then there were even 1368x768 and 1360, and 1280... I worked with all of the 768 options. I know the most important seems to be the horizontal #s with HDTVs, as they usually give the crispest image.
 
This may be a dumb suggestion, but have you looked in the on-screen menu of the TV to see if there is some setting that is conflicting with the computer?
-Greg
 
There's not much in there, and it's too damn bad that the IR receiver on the TV doesn't work, so I can't even use the remote to test aspect ratios and such. :(
 
Sorry to hijack this thread, but it seemed to be on the track of my problem too.

I have a Samsung 26in TV. If I connect my Acer Revo 3700 (NVIDIA Ion) via HDMI directly to the TV, it detects it perfectly as a 1360x768 (native) PC connection at 60 Hz.

I recently bought a "Neet" HDMI switch, not a cheap one; it does HDCP and came recommended.

When I try to use my Revo through the switch, it doesn't detect the native resolution and tries to use 1080i, which looks just awful.

Clearly, the switch is blocking something, possibly EDID detection. My question is: is there something I can do to make my resolution stay the same when using the switch? I have tried setting up a custom resolution using the NVIDIA tool, but my TV just tells me that the resolution isn't supported.

Thanks for any help you might give!
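If you can grab an EDID dump from the TV with and without the switch in the path, you can see whether the switch is actually passing the native 1360x768 mode through. The preferred timing lives in the first detailed timing descriptor, bytes 54-71 of the base EDID block. A rough parser sketch in Python (edid.bin is a hypothetical dump file from whatever EDID reader you use):

    # Decode the first detailed timing descriptor of a 128-byte EDID block.
    with open("edid.bin", "rb") as f:
        edid = f.read(128)

    dtd = edid[54:72]
    pclk_mhz = int.from_bytes(dtd[0:2], "little") / 100.0   # stored in 10 kHz units
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)

    refresh = pclk_mhz * 1e6 / ((h_active + h_blank) * (v_active + v_blank))
    print(f"preferred mode: {h_active}x{v_active} @ {refresh:.1f} Hz ({pclk_mhz:.2f} MHz)")

Comparing that output with and without the switch should show whether the switch is swallowing the TV's EDID or substituting its own.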
 