
Video Card for DaVinci Resolve


rainless

Old Member
Joined
Jul 20, 2006
https://www.pugetsystems.com/labs/a...VIDIA-GeForce-RTX-3080-3090-Performance-1903/ I know that this shouldn't even be an issue... but somehow it is. I haven't really thought much about graphics cards in years. I usually get the most reasonably priced used NVIDIA card I can find... and so far, ten times out of ten, that has been enough for whatever I've needed to do over the past ten years. (I don't think I've bought a brand-new video card since the EARLY 2000s.) But... as everyone here knows... this is an unusual year. We've got the 3080 out at an unprecedented price/performance level, and the 3070 soon to come.

After reading this article: https://www.pugetsystems.com/labs/a...RandAMDRadeonRX5700XTperforminDaVinciResolve?

...and another that covered the 1080 Ti... I became convinced that there simply wasn't enough of a performance difference, in the DaVinci Resolve video editor mind you, for me to care about anything beyond an RTX 2060 Super. (Literally a 1 fps difference between that and an RTX 2070 Super.)

But then I saw their updated chart (up top) with the 3080... and that gave me pause.

Not because of the huge performance increase so much (though I'm sure that is a very nice thing...)

...but because of the PRICE!

I don't think I can swing an RTX 2060 Super... or even a used 1080 Ti... for anything under 350. That's just the reality. And if I pay that much for one now... the value of that card is just going to RAPIDLY decline as soon as the 3080 is more widely available.

This places me in a weird dilemma as I have projects to edit NOW...

So the most logical thing (which simultaneously strikes me as insane) would seem to be spending twice as much as I was looking to spend and getting the 3080. Or just scraping the bottom of eBay for a used 1080 Ti and holding onto that until the 3060s come out. (I mean... how low could a 1080 Ti POSSIBLY go on eBay?)

International Superstar TeamRainless would normally just get the 3080... but with the release... next month, mind you... of both the PS5 and Xbox Series X... this is shaping up to be an unusually expensive year... So I'm trying to save a little money wherever I can.

What say you?
 
OP
rainless
If you plan on upgrading anyways (or not, even) then perhaps a used 1070 Ti would make sense? Or a 1660 Ti new or used?

Personally, I think I may force myself to pull the trigger on a used 1070 Ti. They're on eBay for less than $250 BIN. https://www.ebay.com/itm/EVGA-NVIDI...1-KB-KR-GAMING-ACX-3-0-Black-8GB/274533988581

I'm sure hunting around yields a better price.

Yeah, they're closer to 300 here in Germany... and I just got a 50 euro gift card for a major electronics store here... which makes them about even with the 2060 Super. So I think I may go that route.

Trying to decide between this one:

https://www.mediamarkt.de/de/produc...per™-1-click-oc-8gb-26isl6hp39sk-2574153.html

and this one:

https://www.mediamarkt.de/de/produc...al-evo-v2-oc-8gb-90yv0dz0-m0na00-2621744.html

Major difference (besides 15 bucks) seems to be that the Asus has an extra HDMI port and looks like garbage. :D

Other than that... I would expect everything to be the same.

I haven't used dual displays in like a loooooong time. But... that said, I do have like a hundred monitors lying around here. So for the extra 15 bucks it would be nice to have the option. (For reference, KFA2 is the same as Galax.)
 
OP
rainless
Well... it's settled. I'm picking up the 2060 Super tomorrow. (Very CONVOLUTED route... I wanted to buy it online using my 50 euro coupon, which... it turns out... is only good in the store... Sigh... At least that means I can just go to the store and get it tomorrow instead of waiting until Monday.)
 

Ben333

Folding for Team 32!
Joined
Feb 18, 2007
Please let us know how that works out! The 2060 is also on my list of considerations.
 
OP
rainless
Ben333 said: Please let us know how that works out! The 2060 is also on my list of considerations.

I've done EXTENSIVE research that seems to point to that card. So I'd implore you to skip the 1070 Ti at anything over 200 bucks.

My main concern is how much my 9600KF processor may be a limiting factor... but it seems to run fine for everything Resolve and Fusion do until I add effects (like "Fast Noise") at 4K.

Don't know whether the extra 3GB of the 1080 Ti would make much difference...

This page may be useful to you:

https://www.richardlackey.com/the-best-gpu-davinci-resolve-nvidia-amd-2020/

As well as the two I placed above. But, of course, I'll provide a first-person account once I've had time to check for improvements after tomorrow. ;)
 
OP
rainless
Ben333 said: Please let us know how that works out! The 2060 is also on my list of considerations.

As promised... Here are the results:

[Attached image: castle_render2.jpg]

First off: I feel it's important to note that my problem has been COMPLETELY solved just by upgrading from the GTX 960 to the RTX 2060 Super. I didn't even change the driver or anything. (I'm assuming Windows 10 downloaded the new driver automatically, as it took quite a while before the system finished booting.)

As for the photo above: what you see is a photo I took of a castle, covered in fog created in Fusion using an effect called "Fast Noise." This is what effectively ground my system to a halt with the 4GB GTX 960. (The timeline would crash IMMEDIATELY the moment I hit play. I couldn't even directly render it without dropped frames.) I suspected that the GPU was simply running out of memory... and I did receive an "OUT OF MEMORY" pop-up error for the very first time. I was secretly hoping that it wasn't also the i5 9600KF slowing me down... thank goodness it wasn't.
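For anyone wanting to confirm an out-of-VRAM suspicion like this before buying a new card: on NVIDIA hardware you can watch memory usage with the driver's `nvidia-smi` tool while scrubbing the timeline. Below is a minimal, hedged Python sketch (not anything Resolve-specific) that queries used vs. total VRAM; the sample numbers in the example are made up to illustrate a nearly full 4GB card.

```python
import subprocess

def parse_vram(csv_line: str) -> tuple[int, int]:
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output into (used_mib, total_mib)."""
    used, total = (int(field.strip()) for field in csv_line.split(","))
    return used, total

def vram_usage_mib() -> tuple[int, int]:
    """Query the first GPU's VRAM usage (requires an NVIDIA driver with nvidia-smi)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram(out.splitlines()[0])

# Hypothetical example: a 4GB card nearly full during Fusion playback
used, total = parse_vram("3890, 4096")
print(f"{used} / {total} MiB used")  # -> 3890 / 4096 MiB used
```

If the used figure sits at the total whenever playback crashes, running out of VRAM is the likely culprit rather than the CPU.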

Second: the extra 4GB of VRAM completely solved my problem. As an additional bonus... my render time for that clip (UHD 4K h.264) went from 2:33 down to just 40 seconds. I'm kind of over-the-moon about that... (I'd expect similar results in DCI 4K... which is what I actually shot in.)
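For scale, that render-time improvement works out to nearly a 4x speedup. A quick sketch of the arithmetic, using the times quoted above:

```python
def speedup(before_s: int, after_s: int) -> float:
    """Ratio of old render time to new render time."""
    return before_s / after_s

# 2:33 with the GTX 960 vs 0:40 with the RTX 2060 Super
before = 2 * 60 + 33   # 153 seconds
after = 40
print(f"{speedup(before, after):.1f}x faster")  # -> 3.8x faster
```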

Below are the BRAW Speed Test results.

[Attached images: braw-960gtx-speed test.jpg, braw-rtx2060-speed test.jpg]

Needless to say... I think I'm hooked on benchmarks again!

Can't wait to see what the scores for the 3000 series will be...