
GTX 770 vs. AMD 6970: more similar than not?


magellan

Member
Joined: Jul 20, 2002
I've been looking to go green (back to NVIDIA) after a long hiatus and was looking at TechSpot's GPU
specifications, so I decided to compare my Diamond 6950 (flashed to 6970 and overclocked) to the GTX 770:

SPECS                 GTX 770 (KFA2 LTD OC)      6970
Shading units         1536                       1536
TMUs                  128                        96
ROPs                  32                         32
SMX count             8                          —
Pixel rate            33.5 (38.5) GPixel/s       26.6 GPixel/s
Texture rate          134 (154) GTexel/s         79.7 GTexel/s
FP32 compute          3,213 (3,693) GFLOPS       2,550 GFLOPS
Memory bus width      256-bit                    256-bit
Memory clock          1753 MHz                   1425 MHz
Memory bandwidth      224 GB/s                   160 GB/s (182.4 GB/s with overclock)
GPU core speed        1085 MHz (1254 MHz)        1035 MHz
PCIe                  3.0 x16                    2.0 x16
Memory                2 GiB to 4 GiB             2 GiB
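
As a rough sanity check, the throughput rows in this kind of spec sheet are normally just products of unit counts and clock speeds. The formulas below are the usual textbook ones, and the clocks I plug in (~1046 MHz for the GTX 770, ~830 MHz and 1250 MHz for the 6970 column, with GDDR5 moving 4 transfers per memory clock) are assumptions chosen so the arithmetic reproduces the listed throughput figures; they are not the overclocked core speeds shown in the table and aren't quoted anywhere in the thread:

# Sketch: reproduce the theoretical throughput figures from basic unit counts.
def theoretical_specs(shaders, tmus, rops, core_mhz, mem_mhz, bus_bits, transfers_per_clock=4):
    gflops = 2 * shaders * core_mhz / 1000                        # FMA = 2 FLOPs per shader per clock
    gtexel = tmus * core_mhz / 1000                               # one texel per TMU per clock
    gpixel = rops * core_mhz / 1000                               # one pixel per ROP per clock
    gbps = mem_mhz * transfers_per_clock * bus_bits / 8 / 1000    # GDDR5: 4 transfers per memory clock
    return gflops, gtexel, gpixel, gbps

# GTX 770 at an assumed ~1046 MHz reference clock, 1753 MHz memory:
print(theoretical_specs(1536, 128, 32, 1046, 1753, 256))  # ~3213 GFLOPS, 134 GTexel/s, 33.5 GPixel/s, 224 GB/s

# 6970 column at the ~830 MHz core / 1250 MHz memory the listed numbers imply
# (at the 1425 MHz listed memory clock, bandwidth works out to the 182.4 GB/s overclocked figure):
print(theoretical_specs(1536, 96, 32, 830, 1250, 256))    # ~2550 GFLOPS, 79.7 GTexel/s, 26.6 GPixel/s, 160 GB/s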

I don't know what the SMX count refers to or what advantages it brings
to the table. I think I'm going to wait on 780s with more VRAM before
upgrading.
 
The GTX 770 is essentially a rebadged GTX 680. Therefore, these GTX 680 benchmarks can be used for comparison.

You don't need more than 3 GB of VRAM for a single 1080p/1200p screen; it would be a waste. Since GTX 780s dropped from ~$660 to ~$530 last week, I highly suggest one as a possible upgrade path.
 
I guess the answer to the thread title is yes... but it's not as if you can remotely compare the two, though. The architectures of the cards/cores work so differently that you just can't do that.

SMX clusters are essentially groups of shaders. In this case, each cluster contains 192 shaders.
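
To make the "can't compare them directly" point concrete: both cards land on the same 1536 shading units through very different layouts. The Kepler numbers follow from the SMX description above; the Cayman breakdown (24 SIMD engines of 16 VLIW4 units) is an assumption added here, not something posted in the thread:

# Sketch: where each card's 1536 "shading units" figure comes from.
kepler_smx, shaders_per_smx = 8, 192           # GTX 770 (GK104): 8 SMX x 192 CUDA cores
cayman_simds, units_per_simd, vliw_width = 24, 16, 4   # 6970 (Cayman): assumed 24 SIMDs x 16 VLIW4 units

print(kepler_smx * shaders_per_smx)                    # 1536
print(cayman_simds * units_per_simd * vliw_width)      # 1536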

I think, as was discussed in your other thread, you're about nuts to buy a 780 for 1600x1200 or whatever low resolution you have going on there. A 770 will easily best a 6950, regardless of your apples-and-oranges comparison up there.
 
I don't know if I would buy a new GPU with only 2 GB of VRAM, because even at 1080p BF4 has been using 2.2 GB on my 7970. I'm not certain if this is due to a memory leak, since the first 5 minutes of gameplay use about 2040 MB, then as the game continues it soars up to 2356 MB. I made the mistake on my last build of buying a 1 GB 560 Ti, thinking it would be enough for 1080p gaming in late 2011. The first day I played BF3 on ultra I slapped myself for being stupid and swore never to make that mistake again.
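
If you want to tell a leak from a plateau, logging usage over a whole session is more telling than spot checks. Purely as a sketch, and only for an NVIDIA card since it shells out to nvidia-smi (so it won't work on the 7970 above, where something like Afterburner's logging is the equivalent):

# Sketch: log dedicated VRAM use every 30 seconds to see whether it keeps climbing.
# Requires an NVIDIA driver that ships nvidia-smi; stop with Ctrl+C.
import subprocess, time

while True:
    used_mib = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    ).decode().strip()
    print(time.strftime("%H:%M:%S"), used_mib, "MiB")
    time.sleep(30)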

What you are comparing is apples and oranges because of the differences in architecture. The GTX 770 will hands down blow away your flashed 6950 in every game and benchmark.
 

I concur. IMO, his options are the R9 280x, R9 290x, GTX 770 4 GB, or GTX 780.

However, I'd not get the 4 GB 770...the last gig will likely be forever alone (unused...lol).
 
I remember when the 4870 2 GiB models were available and everyone said no game would ever use that amount of VRAM.

The GTX 680 benchmarks that Darknecron linked to show two games that would double my FPS (BF3 being one) over my 6970, but in Crysis and Metro: Last Light the difference would be less than 50%, even at the low resolutions I'm currently playing at.

Since I still use XP 32-bit for EAX games, AMD isn't an option for me.
 

My video card runs perfectly at the clock speeds of a 6970, is running a 6970 BIOS, and is identified as a 6970 by GPU-Z and windoze.

Would a GTX 770 be able to run 8xAA, 16xAF in Crysis 3, Far Cry 3, or Metro: Last Light, even at 1600x1200?

I don't want to buy a GTX 770 now, then have to buy another video card when I go to 1920 x 1200.
 
I doubt it in Crysis 3, perhaps in FC3, and no chance in hell in Metro: LL. If your goal is to run such settings (most of which are beyond standard ultra settings), then a 780 may be the best choice (or a 290X).
 
The lower the res though, the more AA you need to make it look better. 8xAA may be needed for some at the lower res he currently plays at.
 
I upgraded from 1440x900 to 1080p just earlier this year. 8x has absolutely no visual improvement over 4x. I used to take screenshots of the same scene and compare them in MS Paint (way back when I was figuring out what settings did what). You literally have to zoom in 300% to see the difference lol.
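
For anyone who wants to quantify that instead of eyeballing zoomed screenshots, a per-pixel difference of the 4x and 8x shots does it. A sketch assuming Pillow is installed, both shots are the same resolution from an identical camera position, and the filenames are just placeholders:

# Sketch: measure how much two AA screenshots actually differ.
from PIL import Image, ImageChops, ImageStat

a = Image.open("scene_4xaa.png").convert("RGB")   # placeholder filenames
b = Image.open("scene_8xaa.png").convert("RGB")

diff = ImageChops.difference(a, b)                # per-pixel absolute difference
changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
total = diff.size[0] * diff.size[1]

print("pixels that differ: %d/%d (%.2f%%)" % (changed, total, 100.0 * changed / total))
print("mean per-channel difference:", ImageStat.Stat(diff).mean)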
 
lmao! Yeah, I'm trying to do this on my phone while at work and it trips out if I try and edit a post. I just leave it for you guys to humour me. lol
 
As I said, SOME can notice. You happen to not fall in that group. :thup:
 
For Crysis 3, you need at least a 780, as ed mentioned. For FC3, the 770 should do.
 
I've always read to buy as much GPU as you can afford, because for the past 5 years (10 years?) it seems like the GPU is far and away the greatest determining factor in gaming benchmarks. I've "upgraded" in the past and ended up with no noticeable difference in performance (when I went from a 9800 XT 256 MiB to an X850 XT 256 MiB, for example).
 

How many games did you make this comparison with? One?
 