Hi,
I am having a chat about GPUs. I was positive on the 7900 XTX: it's about 6% slower than the 4090 while being 37% cheaper (specifically CoD MW at 4K, rasterised). I think that's a good showing. In some games the XTX is a fair bit slower than the 4090; obviously those games suit, or have been optimised for, the Nvidia architecture. Some games are poorly optimised for PC because they were console ports. Even though the PC, Xbox and PS share AMD64/x64 CPUs and RDNA GPUs, no one says "hey Microsoft/Intel/AMD/Nvidia, sort out The Last of Us on PC"; it's "hey Mr Game Developer, fix the game to run well on PC". I don't understand why they blame AMD when a game doesn't run as well on RDNA 3 as it does on Ada Lovelace. It's down to the programmer. What am I missing? What don't I see/understand? I'm trying to understand these people...
For example, I am writing an application that targets a 4K, 55-inch display for command, control and communications of a production, via a web browser. It will not run well on a tablet because the resolution and screen size are too small. Until Apple supports 4K on a 55-inch tablet, it isn't going to run well. At a later date, when the client is willing to spend the money, I will write them a native iOS app. I don't blame Apple; I accept it's down to the way I wrote the software. When the users pay for the tablet optimisation, they will have it.