I understand your points, and I appreciate you taking the time to be part of this discussion. But I'd like to point out that in my very first post I wasn't referring to the loss of performance in synthetic benchmarks; that was only mentioned in passing in the first few sentences. I was referring to the loss of performance in actual games. I'd also like to say that, no, I do not use v-sync; it's in fact forced off in the nVidia control panel, simply because I cannot reach even 50FPS in most of my games with my CPU at only 3.0GHz (stock speed). My refresh rate on my LCD, when I still had it, was 75Hz; on my current CRT (a stand-in until I get a new LCD, since my previous one died recently) the refresh rate is set to 100Hz at most resolutions and 85Hz at the higher ones (1600x1200, 1600x1024, 1280x1024, and some others).
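For anyone wondering why leaving v-sync off matters when FPS is already below the refresh rate: with plain double-buffered v-sync, a frame that misses a refresh window gets held until the next one, so the on-screen rate snaps down to integer divisors of the refresh rate. Here's a minimal sketch of that arithmetic; the frame times are made-up numbers, purely for illustration:

```python
# Illustration only: how double-buffered v-sync quantizes FPS to
# integer divisors of the refresh rate. Frame times are hypothetical.

def vsynced_fps(frame_time_ms: float, refresh_hz: float) -> float:
    """FPS actually shown with double-buffered v-sync: the GPU must
    wait for the next vertical blank, so a frame taking slightly longer
    than one refresh interval ends up occupying two intervals, etc."""
    interval_ms = 1000.0 / refresh_hz
    intervals_needed = -(-frame_time_ms // interval_ms)  # ceiling division
    return refresh_hz / intervals_needed

# At 100Hz, a frame taking 21ms (about 48FPS raw) is held for
# 3 refresh intervals -> 33.3FPS on screen, worse than v-sync off.
print(vsynced_fps(21.0, 100.0))  # 33.33...
print(vsynced_fps(9.5, 100.0))   # 100.0 (frame fits in one interval)
```

So with raw frame rates stuck in the 40-60FPS range, v-sync at 100Hz would only make things worse.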
There's technically no need for v-sync, since the bottleneck created by the CPU at stock speed is so restrictive that, as I said and will repeat, I'm barely seeing even 60FPS in most of my games at resolutions like 1600x1200 (with or without AA/AF, it makes no difference). And it's not just Source-based games; it's many others, on many engines, some of them OpenGL. If I had to list my whole games collection I would, but I don't feel like it. I can say that there are exactly 37 titles installed on my HDD, and that my very first post and the ideas in it were based on tests I ran on 14 of them since I bought my GTX 285 about two weeks ago.
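To make the bottleneck point concrete, here's the usual rough model: each frame costs some CPU time and some GPU time, and the slower of the two sets the frame rate. That's why raising resolution or AA/AF barely moves the numbers when the CPU is the limit. All the millisecond figures below are hypothetical, just to show the shape of it:

```python
# Rough model of a CPU-bound game: frame rate is set by whichever of
# CPU or GPU takes longer per frame. All timings are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work are pipelined:
    throughput is limited by the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0  # fixed CPU cost per frame (game logic, draw calls)

# GPU cost grows with resolution/AA, but FPS barely changes while
# the CPU remains the bottleneck:
for label, gpu_ms in [("1024x768, no AA", 6.0),
                      ("1600x1200, no AA", 12.0),
                      ("1600x1200, 4xAA/16xAF", 18.0)]:
    print(f"{label}: {fps(cpu_ms, gpu_ms):.0f} FPS")
# All three print 50 FPS: a faster GPU can't help until the CPU cost drops.
```

Which is exactly what I see in practice: the settings barely matter, the ceiling stays the same.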
To be honest I don't care much about 3DMark scores, except out of pure curiosity. But I did mention 3DMark06 in a later post, in reply to someone else, saying that if he really wanted to he could run comparisons in 3DMark06 to get a basic idea of the points in my first post. I know you said my points were valid "anyway", but as I said, I wasn't referring to 3DMark06 at all; I don't actually "play" 3DMark06, and I don't care much about it.

I'm simply baffled and absolutely disappointed to see that when I play, for instance, Warhammer 40,000: Soulstorm, which runs on a now quite aging engine, my FPS drops into the low 10s (I can prove it whenever you ask). It makes me wonder whether I actually left my previous GPU installed, whether I ever even received my GTX 285, or whether it was all a dream. Is that normal? No, it isn't. Is it the fault of the engine being old? Is it only the fault of the bottleneck? I don't know, and I shouldn't have to know; my "job" isn't to know all of that. I'm not their engineer, I'm just a gamer, and the only thing I want is that if I buy a $400 or $500 component focused on processing graphics (and in the future even physics, and God or Intel/nVIDIA knows what else), I should not only hope for but actually get a minimum of good performance: not just "decent", but actually good.