
FRONTPAGE AMD Launches R9 290 Series Graphics Cards

I play many games at 30 FPS (mainly on console), and some of them were a better gaming experience than any of the 100 FPS PC games I've ever had. A game doesn't become good through FPS; it becomes good through gameplay and common sense, by a clear margin. I'm not even sure those people truly enjoy gaming; perhaps what they truly enjoy is how smoothly the machine runs their games.

I remember when I was very young I couldn't afford a good PC, and some games ran at 20 FPS, but I had a lot of fun no matter how badly the PC performed... I was just happy that my machine could run the game at all. Performance was out of reach, but gameplay was always with me, and joy too. So, unfortunately, I don't agree with you.

If anyone would stop playing a game at 40 FPS, a nice and very playable value, I don't think they're a real gamer. But that's just my own definition; other people may have other definitions of "real gamer", and I won't change mine, because with all my heart and soul I know myself, a real gamer, and my joy. I know hundreds of RPGs and other games from around the world on more than ten systems, and I feel I've had more joy playing them at "bad" performance than ten of the "high performance" gamers put together.

As for Tomb Raider, you just need to disable TressFX and the FPS will eat breakfast. The loss of that eye candy is actually small; it's not what makes the game good, just marginal eye candy meant to stress the hardware... (losing half the FPS just for hair simulation is terribly inefficient). Still, I see close to no difference playing it at 40 vs. 60 FPS, because as long as the game is bad, it won't shine even at 1000 FPS; performance will be no use at all. However, I enjoy Skyrim at 40 perfectly fine. 60 is even better, but the difference in joy is marginal; an RPG doesn't need insane FPS. All it takes is a great game and a lot of fantasy.

Actually, anything above 100 FPS is a waste of time even to talk about. I don't think any human can tell the difference between 100 and 1000 FPS; if someone claims otherwise, that belongs to their own beliefs, not proven fact. Most humans can't even tell the difference between 60 and 100 FPS, because 60 is close to the "perfect value" for delivering a smooth engine without any loss. More important than that is a stable frame rate without micro-stutter and sudden drops. I usually cap my FPS at 60, because any value higher than that is no use to me; it just stresses the hardware for no reason.
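(For what it's worth, a 60 FPS cap like that is easy to picture in code. A toy Python frame-limiter loop, purely illustrative and not how any particular game or driver implements it: render, then sleep off whatever is left of the 1/60 s budget.)

import time

CAP_FPS = 60
FRAME_BUDGET = 1.0 / CAP_FPS  # ~16.7 ms per frame

def run_capped(render_frame, seconds: float = 5.0) -> None:
    """Render as usual, then idle for the rest of the frame budget."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()  # the actual work
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # rest the hardware instead of stressing it

# e.g. run_capped(lambda: None, seconds=1.0) runs ~60 iterations per second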
 
To each their own, I guess. But I don't own a console for that exact reason. You can't say the 290 or 290X is overkill at 1080p, though, just because you don't mind 40 fps and screen tearing. For a good gaming experience at 1080p, imo, with 60 fps and very few drops below it, a 290 or a 780 are great cards.
 
I get no screen tearing, not even at 30 FPS. Screen tearing is usually a software issue, not hardware based, but I can perfectly understand that you always point at the hardware; it's just how you behave.
 
That's a native ability of any modern piece of hardware. When it comes to V-sync issues, the problem is usually located in the software. But as I said, I don't own a single PC game with V-sync issues. In the past 5 years the only game where I truly had some bad tearing was on the Xbox 360, and that's because it used bad software; the Xbox 360 has a native V-sync ability, but that doesn't mean bad software can't "disable" it. The reason was that it was a cheap PC port, carried over to console without love. With the release of a DRM-free GOG version I stopped playing it there, and nowadays that game runs on my PC... so actually not a single game still in use is lacking V-sync. :D
 
I agree V-sync is a software setting that depends on your monitor's refresh rate. If you have a 60 Hz monitor and your FPS is 70, you will get screen tearing. If you stay within 30-60 fps, you will not have any.
 
I don't understand what you're really getting at. I haven't run a single game on my PC without V-sync for years. I cannot stand tearing. I also cannot stand 30 fps. So if my card cannot keep up 60 fps at least 80% of the time, I don't find the gameplay experience enjoyable. That's not me blaming hardware for software issues; it's just the fact that certain cards, like the 280X, do not provide enough muscle for the gameplay experience that I find enjoyable. It comes down to personal tolerances for things like FPS and screen tearing. Someone who doesn't like tearing or 30 fps should not buy a 280X for 1080p, not if they want to run maximum settings. If you want to sacrifice settings, or put up with tearing and drops to 30 fps, then by all means save the money and get a 280X.

Everyone's preference is going to be different. You can't just go and say a card is overkill when you're basing that on your not-very-demanding needs.
 

People don't need to deal with screen tearing by buying new hardware; you just need to enable V-sync and the card will not run over your monitor's refresh rate.

I don't have screen tearing because I run adaptive V-sync.
 
That's really not a true statement. When running adaptive V-sync, as soon as your fps drops below 60 (for a 60 Hz display), the software turns off V-sync. That immediately introduces tearing. (Tearing happens both at fps above the refresh rate and below it.)

So if you can't maintain 60 fps with your hardware, you will experience tearing when your fps drops. Or you can use normal V-sync and you will eliminate tearing, but your fps will jump straight down to 30 fps when you can't maintain 60 fps.

Both scenarios are terrible IMO; that's why I buy a graphics card capable of maintaining 60 fps over the majority of my gaming.

Luckily we have G-Sync coming, which will entirely eliminate tearing and fps halving. But sadly only for Nvidia buyers.
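For anyone following along, the behavior being described boils down to a simple rule. A minimal Python sketch (hypothetical frame-loop logic, not NVIDIA's actual driver code; plain_vsync_fps is my own illustration of the 60-to-30 jump):

REFRESH_HZ = 60

def adaptive_vsync(current_fps: float) -> str:
    """Sync while the GPU keeps up with the refresh rate; drop the sync below it."""
    return "vsync_on" if current_fps >= REFRESH_HZ else "vsync_off"  # off -> tearing returns

def plain_vsync_fps(current_fps: float) -> int:
    """Classic double-buffered V-sync snaps to whole divisors of the refresh rate."""
    for divisor in (1, 2, 3, 4):
        if current_fps >= REFRESH_HZ / divisor:
            return REFRESH_HZ // divisor  # 60, 30, 20, 15...
    return REFRESH_HZ // 4

print(adaptive_vsync(72), adaptive_vsync(45))  # vsync_on vsync_off
print(plain_vsync_fps(55))                     # 30 -- the "jump straight down to 30 fps" above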
 

That is not true, and if you had my hardware you could see that you are incorrect. Screen tearing only happens when your video card sends more frames than your monitor can handle, causing tearing.

The artifact occurs when the video feed to the device isn't in sync with the display's refresh.

You don't have to deal with screen tearing, so you don't understand that adaptive V-sync works great; Nvidia would not have made that setting if it did not work.

When the video drops below 60 it is in sync with the monitor. How do you think it works?
 
You aren't listening. Go look it up. Screen tearing happens both when your fps is above and when it is below your refresh rate (V-sync off).
Adaptive V-sync doesn't get rid of screen tearing if your card is only pushing out 45 fps; in fact, adaptive V-sync doesn't do anything at all at 45 fps. I don't need your hardware. I have 2x 670s and a 650 Ti Boost. I have plenty of experience using adaptive V-sync since it came out. I understand exactly how it works. You do not.
 
I've got a single 280X so far, overclocked to 1175/6500.

I play a lot of BF4, and at 1080p on 64-player servers, with FOV 120, resolution scale 100, everything on ultra and 4x MSAA, I can do 60+ fps and sometimes more, but it drops down to 40 too many times, so I need another 280X to top the game off and hit 60+ fps all the time.

If I play on high with 2x MSAA I get 60+ fps all the time and never see it drop under 60. So for those last ultra settings I will buy another 280X. ;)
 
Yes, I'm listening. You don't understand that you don't need V-sync at 60 FPS and under, because the monitor syncs with the video card; it does not use vertical sync, it uses the refresh rate. That is why I don't get tearing at 48-60 FPS.

You need to read up on how monitors sync with video cards.

Tearing

http://www.tweakguides.com/Graphics_9.html
It is an unfortunate fact that if you disable VSync, your graphics card and monitor will inevitably go out of synch. Whenever your FPS exceeds the refresh rate (e.g. 120 FPS on a 60Hz screen), or in general at any point during which your graphics card is working faster than your monitor, the graphics card produces more frames in the frame buffer than the monitor can actually display at any one time. The end result is that when the monitor goes to get a new frame from the primary buffer of the graphics card during VBI, the resulting output may be made up of two or more different frames overlapping each other. The onscreen image may appear to be slightly out of alignment or 'torn' in parts whenever there is any movement - and thus it is referred to as Tearing.
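To make the mechanism in that quote concrete, here's a rough Python timing sketch: with V-sync off, a tear shows up whenever the buffer flips while the monitor is mid-scanout. The numbers are illustrative, not measurements.

REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ  # one full top-to-bottom refresh

def tears_per_second(fps: float) -> int:
    """Count buffer flips that land mid-scanout over one second."""
    frame_time = 1.0 / fps
    tears, t = 0, frame_time
    while t < 1.0:
        phase = (t % SCANOUT) / SCANOUT  # where in the scanout the flip lands
        if 1e-6 < phase < 1 - 1e-6:      # mid-scanout flip -> visible tear line
            tears += 1
        t += frame_time
    return tears

for fps in (45, 60, 120):
    print(fps, "fps ->", tears_per_second(fps), "tears/sec")
# 45 fps still tears (just not on every refresh); only an fps locked to the
# refresh avoids it -- which is exactly the point being argued in this thread.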
 
Screen tearing happens whenever your fps is not synced to your refresh rate, at lower than 60 fps as well as above it. It's much worse if your fps is a lot higher than your refresh rate; in those scenarios you can get two or three tears on screen at a time. At 45 fps you will still get tearing, just not in every frame. In order to have no tearing, the fps has to be in sync at 60 Hz, or the same frame must be displayed twice: 30 fps.

Tl;dr: tearing happens below 60 fps even when using adaptive V-sync. That is not really debatable.
 
Adaptive V-sync works just as STH describes: once it gets under 60 FPS, it shuts off.

NVIDIA's adaptive V-sync does shut itself off below 60 FPS. It does so to reduce the stuttering introduced by normal V-sync, which clamps frames to fixed fractions of the refresh rate (45/30, etc.). V-sync does tend to add some input lag, though; in my limited use of it, adaptive V-sync minimizes that, but does not eliminate it.

Tearing can occur above or below the refresh rate; it is just most visible above it. It happens below the same way as above: the panel starts to refresh the image (using the old frame), and then halfway through it gets the new frame and starts drawing that instead.
 
Anyway, a little more on topic: I'm really considering picking up a 290 for benching purposes. It's so cheap for such great performance. Anyone have benchmarks under water yet?
 

From ocn:


Valley Extreme HD -- Sapphire r9 290 -- 1150/1575 -- stock bios/volts -- stock cooling


Sapphire r9 290 @ 1150/1550 Firestrike -- 12348 gpu score -- stock volts and bios -- stock cooling
http://www.3dmark.com/fs/1121788

All the water-cooled scores I can find are from 290X cards, but they already have a custom GPU Tweak for voltage control, and you can also add +100 mV in Afterburner beta 16 with a command-line tweak. AB support will be ready soon; looking forward to it. (The Asus software is buggy and the interface is quite horrid.)

"You can alter voltage on it even with current MSI AB beta by sending commands to VRM via MSI AB command line:

To set +100mV offset:
MSIAfterburner.exe /wi4,30,8d,10

To restore original voltage:
MSIAfterburner.exe /wi4,30,8d,0

Use it at your own risk"

http://forums.guru3d.com/showthread.php?t=382760&page=6
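If you're scripting benchmark runs, those two quoted commands are easy to wrap. A minimal Python sketch, assuming Afterburner's default install path (AB_PATH is my assumption; adjust it for your system, and the same use-at-your-own-risk warning from the quote applies):

import subprocess

AB_PATH = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"  # assumed install path

def set_offset_plus_100mv() -> None:
    """Apply the +100 mV offset (the /wi4,30,8d,10 command quoted above)."""
    subprocess.run([AB_PATH, "/wi4,30,8d,10"], check=True)

def restore_stock_voltage() -> None:
    """Restore the original voltage (the /wi4,30,8d,0 command quoted above)."""
    subprocess.run([AB_PATH, "/wi4,30,8d,0"], check=True)

if __name__ == "__main__":
    set_offset_plus_100mv()
    input("Benching at +100 mV -- press Enter to restore stock voltage... ")
    restore_stock_voltage()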

You can get an Asus card for the best compatibility with GPU Tweak: no need to flash BIOSes to get voltage control, and it has the best compatibility with the modded PT1 BIOS. (PT3 is not advisable, since it seems to wreck cards eventually; better to wait for proper AB support for more hardcore benching.) Most PT3 users report black screens that are a pain to troubleshoot, and it mostly gets worse when you touch memory clocks.

EDIT: We need an R9 290X/290 club thread here...I don't wanna hang out at ocn so much once I get mine :p
 
Wow, those are some impressive numbers for a 400 dollar card. The highest I've seen a 780 Ti score in Valley is 86. I plan on buying both cards for benching purposes at some point. I haven't bought ATI since a bad experience with the 6990. What's the best manufacturer for extreme voltage control? Asus? How about hotwire support?
 

For LN2, the MSI Lightning... For air and water, right now I'd say a reference Asus card (at least among current offerings; we have yet to see which non-reference design will be the best this gen).
 