
EVGA 2080 Ti Black Edition Gaming first impressions


Brando

Member
Joined
Jan 9, 2006
Just got this thing installed and ran a few tests to see how this ghetto $999 card compares to the $1500 "real" ones. Here's my girl in her new home for the first time.

thumbnail_image000000.jpg

This is how it sits at idle. A bit warm at 53°C, but it makes no noise and seems to be fine.


EDIT: It looks like I was starving my case of airflow a bit and heat was building up. I turned my case fans up from 450 RPM to 590 RPM and temps on everything dropped by about 10°C with no noticeable noise. The card idles at 41°C now.

2080ti idle.JPG


Here's a stock-settings run of Superposition at 4K to set a baseline. Max temp was 74°C, with the core starting out at about 1740MHz and settling around 1675MHz after heating up for a while.

2080ti stock sposition 4k.JPG
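A quick way to quantify that droop from 1740MHz down to 1675MHz is to log the card during the benchmark and summarize afterwards. A minimal sketch, assuming the log was captured with `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1 > run.csv`; the sample values below are made up for illustration:

```python
def summarize_run(csv_text):
    """Summarize a 'graphics_clock, temp' CSV log from nvidia-smi."""
    clocks, temps = [], []
    for line in csv_text.strip().splitlines():
        clock_mhz, temp_c = (int(v) for v in line.split(","))
        clocks.append(clock_mhz)
        temps.append(temp_c)
    return {
        "start_clock": clocks[0],   # cold clock at the start of the run
        "end_clock": clocks[-1],    # settled clock after heat soak
        "max_temp": max(temps),
    }

# Hypothetical sample: cold start, then the clock drooping as temps climb.
sample = "1740, 48\n1725, 65\n1695, 72\n1675, 74"
print(summarize_run(sample))
# → {'start_clock': 1740, 'end_clock': 1675, 'max_temp': 74}
```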


Here's another 4K run after running the EVGA X1 auto overclock with the power and temp sliders maxed. It ended up at +193. Clocks started at 1920MHz and ended up in the mid/upper 1800s after heating up.

2080ti pow-temp max mem stock.JPG

After messing with the RAM a bit I settled on +1250, or 16,500MHz. Trying 17,000MHz caused artifacts.

2080ti +193 +1250 max sliders.JPG
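For what it's worth, the +1250 offset lining up with 16,500MHz is consistent with the tool applying the offset to the double-data-rate clock, so it counts twice toward the effective rate on top of the 2080 Ti's stock 14,000MHz. That doubling is an assumption on my part, but the arithmetic checks out:

```python
# Assumption: the OC tool's memory offset is applied to the DDR clock,
# so it counts twice toward the effective (marketing) rate.
STOCK_EFFECTIVE_MHZ = 14_000  # 2080 Ti GDDR6 stock effective rate

def effective_mem_clock(offset_mhz):
    return STOCK_EFFECTIVE_MHZ + 2 * offset_mhz

print(effective_mem_clock(1250))  # → 16500, the stable setting
print(effective_mem_clock(1500))  # → 17000, where artifacts appeared
```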

And last but not least is another 4K Superposition run with +193/+1250 and sliders maxed again, but this time with fans at 100%. That took the max temp from about 80°C to about 70°C and kept the clocks around 1900MHz for the most part. A good cooler should be able to keep it over 2000MHz under load, no problem.

2080ti +193 +1250 max sliders max fan.JPG
 
Looks like when you're in an actual game the clocks stay higher. In 1440p ultra BFV it was around 2025 to 2050MHz most of the time. In BFV 4K multiplayer it was over 1900MHz the whole time. A good cooler will make this thing fly. Most likely going to grab an Arctic Accelero Xtreme IV.


4k ultra clock speed after heating up
bf5 4k ultra.jpg
 
I would say the ray tracing is pretty cool if it's on ultra; otherwise I barely notice it, so why bother. At that point DLSS is required, so hopefully it gets less blurry soon. For the moment 4K ultra in DX11 is the best-looking mode with acceptable frame rates for multiplayer, but 1440p is still better if you really want to win.
 
I like ultra ray tracing in BF V too, and I'm getting 60+ FPS, but it's hard for me to tell when DXR is enabled. I game at 1080p, so DLSS is out for me; too blurry.
 
Go somewhere with lots of shadows and reflections and toggle between low/ultra. Definitely better. I'll try to make some screens. The ultra RT looks deeper or something; hard to explain.
 
What I would like to see is a screenshot with DXR enabled vs. disabled. :)
 
Ask and GN provides, sort of. It's Metro Exodus instead of BFV, with lots of on-screen comparisons. This is the qualitative comparison Steve did; there's another one for benchmark numbers (I'll link to the benchmark video).


Benchmark numbers
 
Battlefield V's DXR ray tracing does not do shadows; it improves reflections only. I spent an hour in BF V looking at reflections with DXR on and off, and the improvement is so slight that a screenshot wouldn't show it.
 
I made a vid in 4K with ShadowPlay, but YouTube won't take it because it's too long. Now I can't log in to the GeForce Experience gallery to trim the video for some reason, even though it worked last night. Wtf. I'll think of something.

Edit: meh, the YouTube vid got compressed to hell after upload, so I scrapped it. @wingman you're probably right about shadows; I didn't look at it super scientifically, but it looked good to me with everything cranked. I'll try some on/off pics later. I've been trying to nail down some issues first.

I think there may be something going on with the overclocking software, but I'm not 100% sure. I tried switching to MSI Afterburner, but it remembered my clocks from before and added onto them when I tried to OC. I was getting crashes in BFV, and I noticed it was trying to boost over 2200MHz; it had remembered my OC from Precision even though Precision was uninstalled. It took some experimentation with GPU-Z and Heaven to see that the clocks were still set even with no OC software running. I think I got it back to square one, but there's been some strangeness; Windows tried to repair itself at least once during a restart. I'm starting to think the drivers are conflicting and need a fresh Windows install or something.
 
Problem possibly solved. I think EVGA Precision X1's auto overclock goes too high. It was actually playing games with an occasional stutter, and I thought it was basically a solid OC, but it was way too high. Even though the clock speed was leveling out in game after heating up, it was trying to boost over 2200MHz for a second every time a game opens or Windows starts. At least that's the theory, since I saw it boost super high for a few seconds while the temp was still low, right as I tried to open BFV.

I set everything back to baseline and did a manual OC a bit at a time, starting low and creeping up, alt-tabbing out of the game and watching the in-game clock speed until it was at what I know the card can do. In Afterburner this takes +100; EVGA Precision's auto OC had it at +193. The other main difference is that Afterburner can only do +1000 on memory, while Precision let me go to +1250.

I think it was the core speed, because last night and again this morning I was able to come out of sleep mode with no problem. Time will tell, but at the moment it looks like there's nothing wrong with the card. On the contrary, it should be a good clocker with better cooling, and even with this one it still does pretty well while keeping under 80°C.
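The "creep up a bit at a time" procedure is a linear search; the same idea can be framed as a bisection over the offset range, which needs fewer stress-test passes. A sketch under that framing; `is_stable` is a caller-supplied stand-in for a real test pass (run Heaven/Superposition at that offset and watch for crashes or artifacts), not a real OC API, and the step granularity is an arbitrary choice:

```python
def find_max_stable_offset(is_stable, lo=0, hi=200, step=5):
    """Bisect for the highest core offset (MHz) that passes a stress test.

    is_stable(offset) must run a real stress pass at that offset and
    return True/False; here it is a hypothetical stand-in.
    """
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2 // step * step  # snap to step granularity
        if is_stable(mid):
            best = mid
            lo = mid + step
        else:
            hi = mid - step
    return best

# Hypothetical example: pretend the card is stable up to +100.
print(find_max_stable_offset(lambda off: off <= 100))  # → 100
```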
 
That's how the cards work, Brando. At 55°C or so, you will see it start to slowly drop boost bins. Power limits also come into play.

Typically, the OC scanners (at least X1) leave some meat on the bone in my experience.
 
I understand how boost works on modern GPUs, but this was kind of extreme. Everything I read said that the auto OC in Precision X1 is conservative and stable with room to grow, so I never considered that it might be overclocking to almost double the extra MHz it should need for my current in-game clocks. For example:

Afterburner manual OC of +115 gets me to the mid-1900s to low 2000s in game, and everything starts up normally.

Precision auto OC of +193: in-game speed settles to about the same mid-1900s to low 2000s, but it spikes way higher when starting a game (or Windows, apparently?) at idle temps before heating up, and sometimes won't start the game or display Windows on start/reboot/wake because of how high it's trying to go before it settles into load temps and clocks.
 
I don't recall ever seeing a + value that high, honestly. I don't recall what our FE 2080 Ti was offhand (it's in the review though). Across the several cards we have tested in this generation, the scanner always left some meat on the bone.

Try running the MSI AB OC Scanner and see what it says.
 
I realize it's a totally different GPU, but I seem to recall some Pascals being "golden" and popping higher than usual speeds, too. It seemed kind of random and not always on a top end "binned" card. I've hit 2265 MHz with mine (and ran Heaven at that speed). Nature of the beast? I would imagine the silicon lottery is as real for GPUs as it is for CPUs.
 
Thanks for the review. Very interesting.

The non-OC result confuses me because it's quite close to my watercooled, overclocked 1080 Ti (link). The result after OC is insane, though. How do you like this version? Does it overclock well compared to others? What about the cooling?
 
The cooler is better than I expected. My old 1080 Ti Founders ran around 80°C at stock speed; this card has headroom to OC a worthwhile amount without going past 80°C, and the RAM boosts up pretty well. It has Samsung VRAM, which is apparently a good thing. I've read that overclocking past a point with these cards is pointless, and if that's true this would be the "value" option to get, since it's going to be pretty close. If I could get a beefed-up 2080 Ti for an extra $50 like the old days I would, but now they want an extra $300-$500, so to hell with that.
 
I see. If you're trying to replace the cooler, don't pass up on the Kraken G12. It's pretty decent even when using a 120mm rad.
 
Can anyone else with a 2080 Ti tell me if the actual in-game boost clock sits lower at high resolutions on their card? I noticed that if I play BFV at 1080p it stays at a solid 2040MHz the whole time. At 1440p it stays at around 2000MHz or just over. At 4K it dips down into the 1900s. I'm thinking this is a power-limit thing, since the temps are about the same in every scenario. Do the cards with custom PCBs and higher power limits do this? It's kind of a bummer to lose 100MHz at 4K, right where you need it the most.
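One way to check the power-limit theory without guessing: NVML exposes a throttle-reason bitmask (readable via `nvmlDeviceGetCurrentClocksThrottleReasons` in pynvml). A hedged sketch of decoding that mask; the bit values below are the ones defined in nvml.h, and reading the live mask obviously requires an NVIDIA GPU and driver:

```python
# Throttle-reason bits as defined in nvml.h (subset).
THROTTLE_REASONS = {
    0x01: "GPU idle",
    0x02: "applications clocks setting",
    0x04: "SW power cap",          # the suspected culprit at 4K
    0x08: "HW slowdown",
    0x20: "SW thermal slowdown",
    0x40: "HW thermal slowdown",
}

def decode_throttle_mask(mask):
    """Turn an NVML throttle-reason bitmask into readable labels."""
    return [name for bit, name in sorted(THROTTLE_REASONS.items()) if mask & bit]

# On a live system you would obtain the mask with pynvml, roughly:
#   import pynvml
#   pynvml.nvmlInit()
#   handle = pynvml.nvmlDeviceGetHandleByIndex(0)
#   mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
print(decode_throttle_mask(0x04))  # → ['SW power cap']
```

If "SW power cap" shows up during the 4K runs but not at 1080p, that would confirm the hunch directly.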
 