Ferg (Jul 07, 2001 08:27 p.m.):
The human eye can only see 24 fps; movies use 24 fps for that reason. So if you can get around 40 fps, then even in the most intense scenes you should stay above 24 fps and your eye will never see the difference.
Actually, this is absolutely NOT true. People have different rates of perception, just as they have different quality of eyesight, and you absolutely CAN see more than 24 frames per second. More importantly, when you're interacting with a game you become much more aware of how responsive your computer is. If you're just sitting back watching a game of Q3a (or anything else), chances are you couldn't tell the difference between 30 and 90 fps. BUT when you're actually playing, you can tell whether your fps is lower or higher by how sluggish the controls feel.
Up around 50-60 fps or so you generally stop noticing. 24 fps is great for movies, but for games it is HORRIBLY sluggish. I know that when my frame rate drops into the mid-twenties, I can absolutely tell the difference from when it is higher.
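To put rough numbers on that sluggishness: the time between frames (and hence the floor on how long your input can take to show up on screen) is just 1000 ms divided by the frame rate. A quick back-of-the-envelope sketch in Python (my arithmetic, not anything from the original post):

    # Time between frames at various frame rates; this is the minimum
    # delay before an input can possibly appear on screen.
    for fps in (24, 30, 60, 90):
        print(f"{fps:3d} fps -> {1000.0 / fps:5.1f} ms per frame")

At 24 fps each frame hangs around for almost 42 ms, versus under 17 ms at 60 fps; that's a difference you feel in the controls even if you can't consciously see it.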
I agree that fps isn't everything, but I think you're not taking into account that more powerful graphics cards can run at higher resolutions. I couldn't care less whether my card was turning out 60 or 100 fps, but I COULD care whether it could run at 1600*1200 at a decent frame rate, instead of, say, 1024*768.
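For a sense of scale, 1600*1200 is about 2.4 times as many pixels as 1024*768, so holding the same frame rate takes roughly 2.4 times the fill rate. A quick sketch (plain arithmetic, my numbers rather than anything benchmarked):

    # Pixels per frame and the raw fill rate needed to hold a target fps.
    target_fps = 60
    for w, h in ((1024, 768), (1600, 1200)):
        pixels = w * h
        print(f"{w}x{h}: {pixels:,} pixels/frame, "
              f"{pixels * target_fps / 1e6:.0f} Mpixels/s at {target_fps} fps")

So a card that can spare 100 fps at 1024*768 is, all else being equal, down around 40 fps at 1600*1200; that's exactly where the extra horsepower goes.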