
TV won't run 100hz!


Cown

Member
Joined
Jun 9, 2002
Location
Denmark
Hi Guys

I recently bought a Samsung LE40A866S1 TV and now want to connect it to my computer.

http://www.samsung.com/dk/consumer/...XE/index.idx?pagetype=prd_detail&tab=features

The TV has what Samsung calls 100Hz/200Hz Motion Plus, and from what I can see it should have no problem running 1920x1080 at 100Hz?

I'm running Windows 7 64-bit on my main rig with an Asus 4890 GFX and Windows 7 32-bit on my laptop with an Nvidia 8600M GS. (Both are updated with the latest drivers.)

I've tried connecting both of these machines, but neither of them will go above 60Hz. I've also tried making my own monitor driver with PowerStrip, but still no luck.

The Nvidia and ATI control panels won't allow me to force 100Hz either.

Does anyone have any clue why I can't get it to work?

Much appreciated!
 
It's not a monitor, it's a TV.

Some HDTVs are DAMAGED by certain VESA timings and don't recommend resolutions above 1366 or 1600. Check the supported PC timings, if any, in your owner's manual.
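
You can also check what the set actually advertises to the graphics driver: dump its EDID and decode the preferred timing. Here's a minimal sketch in Python, assuming you've saved a 128-byte EDID dump as tv_edid.bin (the filename is just an example; you can grab a dump with a tool like Monitor Asset Manager on Windows, or read /sys/class/drm/*/edid on Linux):

```python
# Decode the preferred (first) detailed timing descriptor from an EDID dump.
def parse_first_dtd(edid: bytes):
    dtd = edid[54:72]  # first 18-byte descriptor block
    pixel_clock_hz = int.from_bytes(dtd[0:2], "little") * 10_000
    if pixel_clock_hz == 0:
        return None  # not a timing descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank = dtd[3] | ((dtd[4] & 0x0F) << 8)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank = dtd[6] | ((dtd[7] & 0x0F) << 8)
    refresh_hz = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh_hz

with open("tv_edid.bin", "rb") as f:  # hypothetical dump file
    edid = f.read(128)

mode = parse_first_dtd(edid)
if mode:
    print("Preferred mode: %dx%d @ %.2f Hz" % mode)
```

On a set like this you'll almost certainly see 1920x1080 at roughly 60Hz as the preferred mode, which is exactly why the drivers refuse to go higher.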
 
I've read the manual, and apparently it won't allow anything above 60Hz, but the product specs on the web page say it can run 200Hz in some settings, but it won't!
 
None of those 120/240/whatever-Hz TVs can actually accept a signal that high. They just use interpolation and other gimmicks.
 
+1... no 120Hz or 240Hz TV will accept anything higher than 1920x1080@60Hz. Your best picture is going to be achieved by turning OFF all those stupid interpolation gimmicks, as they just cause serious artifacts and input lag.
 
With a PC, yes, because the TV doesn't properly understand the signal. But watching hi-def cable with a good signal, an upscaled DVD, or a Blu-ray, 120/240/400/600Hz interpolation makes a freaking MASSIVE difference.

Anyway, OP, what we're getting at is this:

Your TV, WHATEVER Hz it claims, is a 60Hz monitor. It will only TAKE a 60Hz signal (or 24Hz). It isn't like a 120Hz monitor, which has TWO 60Hz engines. It has a SINGLE 60Hz engine and can accept signals only up to 60Hz. It then interpolates (adds ghost frames) up to X times per second: a 120Hz TV interpolates an extra 60 frames into a 60-frame signal to make it smoother and try to prevent motion blur.

It's basically post-processing. It's not 60+Hz input capable, but it can take a 60Hz signal and OUTput a simulated 120Hz.
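
To make the "ghost frames" idea concrete, here's a deliberately naive sketch of the principle in Python/NumPy: doubling a frame sequence by inserting a 50/50 blend between each pair of neighbours. Real TVs use motion-compensated interpolation rather than a plain blend, but the idea of synthesizing in-between frames the source never contained is the same:

```python
import numpy as np

def interpolate_double(frames):
    """Naive frame-rate doubling: insert a 50/50 blend between each
    pair of neighbouring frames (stand-in for the TV's motion engine)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(((a.astype(np.uint16) + b) // 2).astype(np.uint8))
    out.append(frames[-1])
    return out

# 60 input frames -> 119 output frames: one synthesized frame per gap.
# (Tiny frames here just to keep the demo cheap to run.)
src = [np.random.randint(0, 256, (108, 192, 3), dtype=np.uint8) for _ in range(60)]
print(len(interpolate_double(src)))
```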

Please refer to the accepted VESA input resolutions and timings for PC input (whether HDMI/DVI or VGA) in your TV's manual. You will see nothing over 60Hz listed. You probably won't even find 1920x1080 officially supported as a PC resolution; more likely 1600x something.
 
Now that makes sense :) Thanks! The VESA table lists 1920x1080 at 60Hz, and that's what I'm using atm, so I'm guessing everything is OK!

Thanks!
 
I don't think you can get much better than a true 1920x1080@60Hz at 32-bit from a computer... HD cable and Blu-rays all have some form of video compression. The signal from your computer is completely uncompressed.
 
The signal over HDMI is not necessarily compressed. The signal on DVI and HDMI is electronically identical. What's "compressed" is the video, be it on cable, a DVD, or a Blu-ray. Cable is compressed to about 5-10 Mbit/s (25 if your cable company is really good). I don't recall the compression rate for Blu-ray; it's a very lightly compressed, high-resolution MPEG-4. You could hardly call it compressed at the bitrate Blu-ray comes at.

The "signal" itself is not compressed. The content you get across that signal may be compressed depending on the source (all the porn and videos you watch on your computer are compressed; your desktop itself is not compressed, but it is made up of compressed items like the JPG you use as a background and the icons you use...).

Anyway, it's not a compressed signal format. No such thing. HDMI and DVI video signals at the same refresh rate/resolution are identical. The CONTENT they transmit may or may not be compressed, but the two are not interdependent, nor are they related.

Example:

You can use an HDMI/DVI adapter to hook your PC to the TV, OR you can use a video card with direct HDMI out. This does not create or affect compression.
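
To put rough numbers on the signal-versus-content distinction: the raw video crossing an HDMI or DVI link dwarfs the compressed stream that produced it. A quick back-of-the-envelope in Python, assuming 24-bit color (the Blu-ray figure is the format's 40 Mbit/s video ceiling):

```python
# Raw 1080p60 video bandwidth on the link vs. typical compressed bitrates.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 1080p60 video: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

for name, mbps in [("cable HD", 8), ("good cable HD", 25), ("Blu-ray video max", 40)]:
    print(f"{name}: {mbps} Mbit/s -> the raw link carries ~{raw_bps / (mbps * 1e6):.0f}x that")
```

Whatever the content was squeezed down to, the pixels on the link come out at the same uncompressed rate.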
 
So, that being said, wouldn't that go against what you said above about how

With a PC, yes, because the TV doesn't properly understand the signal. But watching hi-def cable with a good signal, an upscaled DVD, or a Blu-ray, 120/240/400/600Hz interpolation makes a freaking MASSIVE difference.

What I'm trying to get at is that, OK, the end output from your HD set-top box / Blu-ray player / DVD player / HTPC is HDMI (aka DVI with digital audio), but it's the source material that's compressed... 1080 cable TV is HIGHLY compressed, Blu-rays have their dithering, and an upscaled DVD is even worse... but a computer pumping out 1920x1080@60Hz at 32-bit is going to be the purest, crispest form. How effective frame interpolation is depends highly on the quality of the source, so a computer, being the best source, should get the best result. The problem is that frame interpolation sucks: it creates tons of input lag due to the processing of frames, and it can have BAD artifacts... look up the triple ball effect (TBE) on Google.
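
The input-lag complaint is easy to quantify: to synthesize an in-between frame, the TV has to hold at least one future frame before it can show the current one, and every buffered frame at a 60Hz input costs a full frame time. A quick sketch (the buffer depths are hypothetical examples, not measurements from any particular set):

```python
# Each frame buffered for motion interpolation delays the picture by one
# input frame time. At a 60Hz input that's ~16.7 ms per buffered frame.
input_hz = 60
frame_time_ms = 1000 / input_hz

for buffered in (1, 2, 4):  # hypothetical buffer depths
    print(f"{buffered} buffered frame(s): +{buffered * frame_time_ms:.1f} ms input lag")
```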
 
No man, that's not necessarily true. My 600Hz plasma has ZERO input lag (as measured in GH: Metallica on Xbox) and runs all my games WAY smoother to my eye than the 60Hz LCD. Frame interpolation sucked on older TVs, and it still sucks somewhat on cheaper TVs, but on a HIGH-end LCD HDTV, or even just a decent plasma HDTV, frame interpolation works absolutely phenomenally now. Post-processing has come a long way.

Here's an example. The cable company I have kind of sucks, but I have limited choices, and the signal is not very good. On my 60Hz LCD, basketball loses little bits of detail here and there: facial details, little hairs on arms against a brightly lit background, the wood grain on the court in a specific area, the exact outline of a shadow or reflection. The 600Hz subfield drive on my plasma keeps every single detail on screen even when it disappears on my other TV. It's ridiculously good. It also makes GTA Liberty City Stories on Xbox look at least 10FPS smoother, with no noticeable tripling/doubling and no input lag. Same goes for any game I've played on it.

A computer pumping out 1920x1080x32/RGB is THE SAME as, say, your Xbox or PS3 dashboard, assuming you have the console set to 1080p and the RGB color palette (HDTVs usually expect YCbCr color rather than RGB, which is why most HDMI sources default to it).

The CONTENT is what's compressed. Your PC will have compressed and uncompressed content, and more and less compressed content (game vs. desktop vs. DivX movie), as will your PS3 (DVD movie vs. Blu-ray movie vs. game, etc.).

The SOURCE (Xbox/PS3/PC/DVD player/Blu-ray player) does NOT in itself imply compression.

CONTENT is compressed. The SOURCE of that content is NOT. I don't know how else to explain that. You can't compress a source.

Just because you're getting a compressed cable signal (let's say 8 Mbit/s) doesn't mean that, if your cable company wanted to feed it to you, your cable box and HDMI cable couldn't handle a 30 Mbit/s signal. See what I mean?
 
I definitely agree with you... the source, whether it be an Xbox or a PC, isn't compressed, but the content is. I dunno if it's just me being picky as hell, but every time I go to, like, Fry's and look at the highest-end 240Hz TV, it always looks really ****ty as far as motion... it can be silky smooth at one point, then stutter for a split second every now and then, and that ruins it.
 
Are they running it off a splitter?

I watched Transformers for a while on a top-of-the-line 240 or 480Hz Samsung LED/LCD and it was the smoothest thing I've ever seen. Then I left the store and went home to my inferior TV.
 