TVs run at 60 interlaced fields per second, which combine into 30 full frames per second and produce a bit of motion blur. And in most cases, the actual source media for your TV (DVDs, VHS tapes, actual over-the-air broadcasts) has motion blur baked in too (ever pause a high-speed action sequence on a DVD? Notice how blurry it is?)
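To make the fields-vs-frames arithmetic concrete, here's a minimal Python sketch of "weaving" two interlaced fields into one full frame. The weave_fields helper and the toy field sizes are made up for illustration, not taken from any real video library:

```python
import numpy as np

def weave_fields(even_field: np.ndarray, odd_field: np.ndarray) -> np.ndarray:
    """Interleave two half-height fields into one full frame.

    Each field holds alternating scanlines, so two 240-line fields
    (captured 1/60 s apart) combine into one 480-line frame -- which
    is why 60 fields/s works out to 30 full frames/s.
    """
    h, w = even_field.shape
    frame = np.empty((h * 2, w), dtype=even_field.dtype)
    frame[0::2] = even_field   # even scanlines (lines 0, 2, 4, ...)
    frame[1::2] = odd_field    # odd scanlines  (lines 1, 3, 5, ...)
    return frame

# Toy example: two 240x640 grayscale fields -> one 480x640 frame.
even = np.zeros((240, 640), dtype=np.uint8)
odd = np.full((240, 640), 255, dtype=np.uint8)
print(weave_fields(even, odd).shape)  # (480, 640)
```

Because the two fields are snapshots taken 1/60 of a second apart, anything moving between them smears slightly when they're woven together, which is part of where that blur comes from.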
Computers update the screen one entire frame at a time, which gives no sense of blur. Combine that with full-frame rasterizers that don't blur either, and you get 60 completely discrete frames per second. They aren't blurred together; each one is a crisp snapshot of the motion. That's why you can perceive the difference more easily on a monitor than on a TV.
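To see why those discrete frames feel different, here's a hedged sketch of faking camera-style blur in software by averaging adjacent frames. The temporal_blur helper and the toy frame data are hypothetical, just to show the idea:

```python
import numpy as np

def temporal_blur(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Average each frame with its predecessor to fake motion blur.

    Discrete rendered frames share no information between updates;
    blending adjacent frames smears moving objects across them,
    roughly approximating what a camera's open shutter records.
    """
    blurred = []
    for prev, curr in zip(frames, frames[1:]):
        # Blend in float to avoid uint8 overflow, then convert back.
        blurred.append(((prev.astype(np.float32) + curr) / 2).astype(prev.dtype))
    return blurred

# Toy example: a bright pixel moving one column per frame.
frames = []
for x in range(4):
    f = np.zeros((1, 8), dtype=np.uint8)
    f[0, x] = 255
    frames.append(f)
for f in temporal_blur(frames):
    print(f[0])  # the moving pixel now smears across two columns
```

Without that blending step, each frame shows the pixel at exactly one position, which is the "all discrete" look you get from a typical game rasterizer.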
What's really interesting is watching a normal TV side by side with an HDTV running in a non-interlaced video mode (480p or 720p). The demonstration I saw was Pirates of the Caribbean on DVD, played at the same time index on two identical DVD players plugged into two identical TVs. One was running in "standard" mode, one was in HD 480p mode...
The HD mode was incredibly smooth; the skeletons crawling over the ship deck in the final fight scene moved so fluidly it could almost make you motion sick. That's because HDTV in non-interlaced mode is running at a true 60 frames per second, and you can tell the difference quite noticeably.