At 60 FPS, you'd need to load 133 MB of new data to the card per frame to saturate PCIe 2.0 x16 (8 GB/s ÷ 60 frames). Even a 600 MB/s SATA III SSD couldn't sustain that: you'd spend 13.3 seconds reading the drive for every single second of screen time. Even if you loaded everything into RAM in advance, you'd need 480 GB of data to saturate the bus for one minute. That'd take 13:20 of loading time from the disk (and you'd need somewhere to put that 480 GB of RAM, too).
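The arithmetic above can be checked in a few lines (a rough sketch; the 8 GB/s and 600 MB/s figures are the nominal rates quoted above, not measured numbers):

```python
# Back-of-the-envelope check of the bandwidth numbers above.
PCIE2_X16 = 8_000  # MB/s, nominal one-direction bandwidth of PCIe 2.0 x16
SATA3_SSD = 600    # MB/s, optimistic sequential read for a SATA III SSD
FPS = 60

per_frame = PCIE2_X16 / FPS                 # MB of fresh data per frame to saturate the bus
read_ratio = PCIE2_X16 / SATA3_SSD          # seconds of disk reads per second of playback
one_minute = PCIE2_X16 * 60 / 1000          # GB needed to saturate the bus for one minute
load_time = one_minute * 1000 / SATA3_SSD   # seconds to pull that much data off the SSD

print(per_frame)   # ~133.3 MB/frame
print(read_ratio)  # ~13.3 s of reading per second of screen time
print(one_minute)  # 480 GB
print(load_time)   # 800 s, i.e. 13:20
```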
A PCIe 3.0 bus might be useful for a 16- or 24-drive RAID controller full of SSDs, but your GPU is still nowhere close to needing it. The GPU is capable of using the bandwidth; there's just no application that requires it, nor any storage system that can actually deliver that volume of data to the card that fast.
Interesting. Are there any kinds of bitmap or high-resolution images around 133 MB per frame that, when rendered, could run as an HD movie? For example, 1080p video usually plays pretty well on a dual-core computer. If there were a heavier processing need, I'm curious whether 4×128 GB RAM modules will one day be able to handle it. (It seems like yesterday there were 128 MB RAM modules; my first computer had 8 MB of RAM as 4×2 MB, and today 4×2 GB isn't too uncommon, so 128 GB modules seem like they'll be out before long.) I'm also thinking about animation and Quadro/FirePro cards.
https://secure.wikimedia.org/wikipedia/en/wiki/WQSXGA#WHUXGA_.287680.C3.974800.29
"WHUXGA, an abbreviation for Wide Hex[adecatuple] Ultra Extended Graphics Array, is a display standard that can support a resolution up to 7680×4800 pixels, assuming a 16:10 aspect ratio. The name comes from the fact that it has sixteen (hexadecatuple) times as many pixels as a WUXGA display. A WHUXGA image consists of 36,864,000 pixels (approximately 37 megapixels). A monitor of 7680×4320 would also qualify as a WHUXGA display. UHDTV video requires a display of similar resolution (7680×4320) for properly displaying UHDTV content, which is 16 times the pixel count of the 1080 ATSC HDTV video standard."
Edit 2: so if that resolution is 16 times the pixel count, maybe an 8-core or 16- (or 32-) core computer could process it?
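For scale, the size of one uncompressed WHUXGA frame can be worked out directly (a sketch; it assumes 4 bytes per pixel, i.e. 32-bit RGBA, which is an assumption on my part):

```python
# Rough size of one uncompressed WHUXGA frame and the bandwidth of a 60 FPS stream.
width, height = 7680, 4800
bytes_per_pixel = 4  # assumption: 32-bit RGBA

pixels = width * height                     # 36,864,000 pixels (~37 MP)
frame_mb = pixels * bytes_per_pixel / 1e6   # ~147.5 MB per frame
stream_gbps = frame_mb * 60 / 1000          # ~8.8 GB/s at 60 FPS

print(pixels, frame_mb, stream_gbps)
```

Interestingly, ~8.8 GB/s uncompressed lands right around the PCIe 2.0 x16 figure from the answer, so a raw 37-megapixel 60 FPS stream is about the scale of workload that would actually saturate the bus.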
Also
"According to CSIRO, in the next decade, astronomers expect to be processing 10 petabytes of data every hour from the Square Kilometre Array (SKA) telescope.[10] The array is thus expected to generate approximately one exabyte every four days of operation. According to IBM, the new SKA telescope initiative will generate over an exabyte of data every day. IBM is designing hardware to process this information.[11]"
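The two figures in that quote are roughly consistent with each other, which a quick check shows (using decimal units, 1 EB = 1000 PB):

```python
# Sanity check on the SKA data rates quoted above.
pb_per_hour = 10
pb_per_day = pb_per_hour * 24          # 240 PB per day
days_per_exabyte = 1000 / pb_per_day   # ~4.2 days per exabyte

print(pb_per_day, days_per_exabyte)
```

So 10 PB/hour gives about one exabyte every four days, matching the CSIRO figure; IBM's "over an exabyte a day" would imply a rate several times higher.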
https://secure.wikimedia.org/wikipedia/en/wiki/Exabyte
I'm just putting together some ideas. Maybe there will be a practical use somewhere.
Edit: I guess if the human eye can't tell the difference, it might not matter much while viewing, but during editing or pausing it might be useful for zooming in.