I guess I read something different than you did... The 2GB card's memory ran 200MHz slower, first of all, so in most games the few frames it lost could easily be recovered if the clocks were bumped.
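Just to put rough numbers on that clock argument: here's a back-of-envelope sketch, assuming FPS scales roughly linearly with memory clock (optimistic, since games are rarely pure bandwidth-bound). The clock figures below are hypothetical placeholders, not from the review.

```python
def clock_adjusted_fps(fps, clock_mhz, target_clock_mhz):
    """Linearly rescale an FPS result to a different memory clock.

    This is an upper-bound estimate: it assumes the game is fully
    memory-bandwidth-limited, which real games rarely are.
    """
    return fps * (target_clock_mhz / clock_mhz)

# Hypothetical example: a card benching 29 FPS with memory at 800 MHz,
# estimated at 1000 MHz (a 200 MHz bump, as in the comparison above):
estimate = clock_adjusted_fps(29, 800, 1000)
print(f"{estimate:.1f} FPS")  # ceiling if the deficit were pure bandwidth
```

In practice the real gain would land somewhere between the original number and this estimate, which is why equal-clock comparisons matter so much.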
GRID picked up 11 FPS at the highest res (29 vs 40), which could certainly mean the difference between "barely playable" and "actually playable".
In Crysis, the 2GB card was 15% faster at every resolution until 1920x1200, where the gap narrowed to about 8%, presumably due to GPU saturation. I'm not even sure why they tested at "medium" details when every other game in the list was at "uber max" -- that's kinda crap. But oh well...
I'm not going to argue that 2GB is worth it, but I think there's sufficient information here to suggest that a 2GB card could certainly see a benefit -- so long as the manufacturer makes the comparison fair and keeps the clocks at the same speeds.