
4070 SUPER

I had to look up what that even was. Linux thing. SteamOS doesn't have that problem, does it? So maybe it matters in a tiny niche, but it's totally irrelevant for the overwhelming majority.
SteamOS/Steam Deck run on AMD graphics, so no, it does not have this issue. Nvidia are ***** and are happy to support the HPC world with CUDA support on Linux, but they treat desktop users as second-class citizens. Which is crazy, because I paid full price for their GPU just like everyone else. It's not a Windows-only world anymore, so it's time for them to move on and support the most common and most used (just not for desktop) operating system in the world.
 
It's not a Windows-only world anymore, so it's time for them to move on and support the most common and most used (just not for desktop) operating system in the world.
In what world? You must be talking Enterprise? Enterprise isn't using gaming cards. I'm really confused, like straw-man-type confused...

Because 2.9% doesn't scream (nor whisper) common and most used...
 
In what world? You must be talking Enterprise? Enterprise isn't using gaming cards. I'm really confused, like straw-man-type confused...

Because 2.9% doesn't scream (nor whisper) common and most used...
I am talking everything. Sitting at my desk I have 2 phones, 2 tablets, one touchscreen device, and a Raspberry Pi, all running Linux. In my rack only a single device runs on Windows, and that is because it runs a specific Windows-only DX9 professional application I can't run as a VM (technically we do have it running in a VM, but it's a PITA).

All the security cameras on my house run Linux firmware; my Rokus run Linux; if it has an embedded CPU/MCU, then chances are it's running Linux.

If it's IoT, it's probably running Linux.

Linux desktops are not common, though growing.
 
In the context of this thread, we're looking at gaming GPUs so the focus is leaning heavily towards that.

I'm not up to date on SteamOS, but I'm almost curious enough to download it and try it on an NV system to see if RT works. Steam Deck is only one use of it, even if likely the main one.
 
I am talking everything. Sitting at my desk I have 2 phones, 2 tablets, one touchscreen device, and a Raspberry Pi, all running Linux
We are not, lol. The thread is about a 4070 Super, lol. So let's be clear... Linux on these items is by your choice, correct?

The phones come with Android or Apple. Tablets too (or Windows). No clue what the touchscreen device is, and the Pi is a Pi; that's about the only thing you listed that's native to Linux. Neither IoT nor those devices will use a gaming graphics card, right?

As this is a thread about a desktop gaming graphics card, I find the merits of your talking points largely irrelevant within the scope as well. Straw man argument, sorry.
 
We are not, lol. The thread is about a 4070 Super, lol. So let's be clear... Linux on these items is by your choice, correct?

The phones come with Android or Apple. Tablets too (or Windows). No clue what the touchscreen device is, and the Pi is a Pi; that's about the only thing you listed that's native to Linux. Neither IoT nor those devices will use a gaming graphics card, right?

As this is a thread about a desktop gaming graphics card, I find the merits of your talking points largely irrelevant within the scope as well. Straw man argument, sorry.
I don't think it's a straw man, but given that you and I have different views on what counts as an influencing data point, I can see why you are thinking this way.

I am dismissing all of Nvidia's features because they refuse to make them available to me, so I am judging them only on what they do give me, which is raster performance. Dollar for dollar, AMD walks away with the win at every price point next to Nvidia for what I can buy and use today.

You are using RT/DLSS, as those features are available to you, and the advantage to Nvidia over AMD with those enabled is clear. If you are looking for the most RT performance per dollar, Nvidia wins; if you are looking for the most performance per watt, Nvidia wins.

So I'm not really arguing that people should use Linux; I'm using it to define and defend why I don't/can't value RT and DLSS, so they are not factors in my decision-making process.

I will also argue that FSR is the superior technology because it runs on both AMD and Nvidia cards, runs on both Windows and Linux, can be implemented at lower cost than DLSS, and can be turned on for games that don't even have support for it with only minimal quality loss in things like menus and HUDs. DLSS is often only added when Nvidia specifically pays for it to be added to a game, and I would prefer to improve quality in other ways rather than scaling up, but that is again me.

Again, I don't think that people who have Nvidia cards made bad choices, nor do I think people should not buy them. I simply think calling the crown for overpriced midrange cards that are/have been VRAM-limited based on RT performance alone is not great for everyone. We need to present the numbers fairly and then weigh them based on each person's specific requirements.

Raster is IMO the baseline, since it is all games across all operating systems; then you can add value if your specific set of games gets RT or DLSS, and you can add specific value if your power is expensive and you need efficiency over raw performance. I add additional value based on operating system support, but that is again specific to me, and everyone who owns a Series X or PS5. There's a rough sketch of that weighting below.
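To make that weighting concrete, here is a minimal sketch in Python. It is purely illustrative: the function, the prices, the FPS figures, and the power numbers are all assumptions made up for the example, not benchmark data.

```python
# Hypothetical value scoring: raster $/frame is the baseline, then
# user-specific adjustments. Every number here is made up for illustration.

def effective_dollars_per_frame(price_usd, raster_fps, rt_fps=None,
                                cares_about_rt=False, watts=220,
                                kwh_price=0.0, hours_per_week=10):
    """Lower is better: effective cost per frame for this specific user."""
    # Count RT performance only if this user can actually benefit from it.
    fps = rt_fps if (cares_about_rt and rt_fps) else raster_fps
    # Rough 3-year energy cost; only matters where power is expensive.
    energy_usd = (watts / 1000) * hours_per_week * 52 * 3 * kwh_price
    return (price_usd + energy_usd) / fps

# A raster-only user (e.g. on Linux) vs. a user who plays RT games
# and pays $0.40/kWh; card prices and FPS below are hypothetical.
raster_only = effective_dollars_per_frame(499, raster_fps=105)
rt_user = effective_dollars_per_frame(599, raster_fps=100, rt_fps=80,
                                      cares_about_rt=True, kwh_price=0.40)
print(f"raster-only user: ${raster_only:.2f}/frame")
print(f"RT user:          ${rt_user:.2f}/frame")
```

The point is only that the same two cards can rank differently once you weight the numbers by what a given user can actually use.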
 
I don't think it's a straw man, but given that you and I have different views on what counts as an influencing data point, I can see why you are thinking this way.

I am dismissing all of Nvidia's features because they refuse to make them available to me,
I think the sooner you understand that the subject of the thread you're posting in has nothing to do with Linux, IoT, and 2.9% of the desktop population, the better off we'll all be. Your points are valid, just not within the scope of this thread and helping the OP pick a card. Surely we appreciate hearing how you feel about them, but the Linux stuff can be put to rest. :)
 
I think the sooner you understand that the subject of the thread you're posting in has nothing to do with Linux, IoT, and 2.9% of the desktop population, the better off we'll all be. Your points are valid, just not within the scope of this thread and helping the OP pick a card. Surely we appreciate hearing how you feel about them, but the Linux stuff can be put to rest. :)
Which is why I said raster is the baseline, and with additional information about what games or applications are being used, we can then see if any other features offer value. I use Linux as an example where AMD offers me more value, because I can use RT/FSR and they support Wayland.

The benchmarks posted previously were specifically RT, so unless that fits the application, it may not actually provide a valid comparison point. And again, this comes from a user of both chipsets; I have 3 Nvidia cards and 2 AMD cards in active use currently. And one of the Nvidia cards is in use specifically because of the feature set they offer that AMD does not for professional applications.
 
"This RTX 4070 Super review focuses on a detailed comparison with the closest priced GPU from AMD- the RX 7800 XT. Which GPU should you buy- the RX 7800 XT or the RTX 4070? Let's find out by testing out a ton of the latest games at 1080p, 1440p, and 4K, but using a variety of graphics settings including ray tracing, upscaling, FSR 3 and DLSS 3 frame generation, path tracing, and more!"

TL;DR: nothing to think about here; buy the 4070S, unless you play at 4K with high settings + RT, because of the 12GB of VRAM

 
TL;DR: Get the 4070S, unless you absolutely don't care about RT. lol, the 7800 XT becomes "viable" (GamersN) that way.

The 7800 XT needs a change to be chosen over the 4070 Super - HUB

The conclusions on both are quite telling. They like the 4070S over the 7800 XT: "a no-brainer" from HUB.
Yeah, seems like a price drop is needed. The 7800 XT should have always been branded the 7700 XT; AMD really hurt themselves with that move, same as with the 7900 XT and 7900 XTX.

But for someone who can't use any of the Nvidia features, that extra $100 (and soon to be more, once AMD drops prices) really cuts down on that lead.

HW Unboxed also had some nice price-per-frame analysis that shows the price advantage, but also the feature weakness, of AMD vs Nvidia.
 
I can get a 4070 Ti right now for $769. And the price is going to drop even more, IMO. https://www.newegg.com/zotac-geforce-rtx-4070-ti-zt-d40710j-10p/p/N82E16814500546?Item=N82E16814500546
 
Yeah, seems like a price drop is needed. The 7800 XT should have always been branded the 7700 XT; AMD really hurt themselves with that move, same as with the 7900 XT and 7900 XTX.
People have said the same thing about the Supers, lol. But that's always been NV's plan... a bit of an Intel 'tick-tock' cadence.

But for someone who can't use any of the Nvidia features, that extra $100 (and soon to be more, once AMD drops prices) really cuts down on that lead.
You and that 2.9% on Linux. :p

I can get a 4070 Ti right now for $769.
Or you can pay $170 less ($599 for the 4070 Super) and still get very close to 4070 Ti performance. I'd only buy a 4070 Ti if the price were a lot closer to the 4070S... or I needed the VRAM for 4K gaming on a midrange card.
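As a quick back-of-the-envelope check (the performance gap here is a guess for illustration, not a benchmark result):

```python
# Price premium vs. performance premium: 4070 Ti at $769, 4070S at $599.
# The ~15% performance delta is an assumption, not measured data.
ti_price, super_price = 769, 599
price_premium = ti_price / super_price - 1   # about 0.28, i.e. ~28% more money
perf_premium = 0.15                          # assumed ~15% faster (hypothetical)
print(f"price premium: {price_premium:.0%}, perf premium: {perf_premium:.0%}")
```

Paying roughly 28% more for maybe 15% more performance only makes sense if you specifically need the extra grunt or the VRAM.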
 
I can get a 4070 Ti right now for $769. And the price is going to drop even more, IMO. https://www.newegg.com/zotac-geforce-rtx-4070-ti-zt-d40710j-10p/p/N82E16814500546?Item=N82E16814500546

The 4070 Ti is not the same as the 4070 Super, so just be aware of what you're paying for.

You and that 2.9% on Linux. :p
That 2.9% is Linux's share of desktop users*

You are a Linux user too, just not on the desktop and (probably) not for gaming.
 