I recall Windows' cache being a simple file cache, though I gotta admit, the last time I really looked into it was back in the XP/7 days, when I was still messing around with 32-bit OSes and trying to get more than 4 GiB of RAM to actually be useful (creating RAM disks, or trying to use the RAM above 4 GiB as cache). And the write cache was more of a "buffer" than a "cache", though I understand one could easily argue that a buffer is a cache. Real write caching was left to the SATA chipset on the motherboard and the drivers available for it; remember the days when certain chipsets were coveted for having certain features enabled by specific drivers? I recall being bummed out that some of my motherboards couldn't do write caching on drives, even though they had the same exact chipset as other boards that could, or hunting down a very specific driver version for said chipset that would allow enabling write caching.
If I remember correctly, it was actually Windows Vista that introduced SuperFetch (along with ReadyBoost), which did keep track of file usage to keep the most-read data from system files ready in a dedicated cache, but normal usage of the free-RAM cache was still a dumb cache, first-in-first-out. I don't know if they enabled block caching in the regular free-RAM cache, as that would require additional RAM to keep track of which blocks are in the cache, compared to the normal file cache, which just kept track of which files (tracking filenames needs much less additional RAM than tracking blocks).
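To put rough numbers on that overhead argument, here's a back-of-the-envelope sketch. All the sizes below (metadata bytes per entry, file count) are my own illustrative assumptions, not actual Windows cache-manager internals:

```python
# Rough comparison of cache-bookkeeping overhead:
# block-granularity vs file-granularity tracking.
# Every figure here is an illustrative assumption.

GiB = 1024 ** 3
KiB = 1024

cache_size = 4 * GiB           # RAM set aside for the cache
block_size = 4 * KiB           # typical block/cluster size
bytes_per_block_entry = 64     # assumed metadata per cached block
bytes_per_file_entry = 256     # assumed metadata per cached file (name + extents)
files_cached = 5000            # assumed number of distinct cached files

blocks = cache_size // block_size                    # 1,048,576 blocks
block_overhead = blocks * bytes_per_block_entry      # 64 MiB of bookkeeping
file_overhead = files_cached * bytes_per_file_entry  # ~1.2 MiB of bookkeeping

print(f"block tracking: {block_overhead / 1024**2:.0f} MiB")
print(f"file tracking:  {file_overhead / 1024**2:.1f} MiB")
```

Even with generous per-file entries, file-level tracking comes out a couple orders of magnitude cheaper than per-block tracking for the same cache size.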
Even in Windows 10, running with 32 GiB on a previous system build, I could easily see the difference in the Windows free-RAM cache when gaming with a game small enough to fit in the free RAM that Win10 would use for cache. First run of the game would be slow to load, and once in-game, loading any new data would be slow too, because of course it's the first time it's being read off a mechanical hard drive. Subsequent reads, after exiting the game and loading it up again, would be much faster, since pretty much the entirety of the game was cached in RAM at that point. However, the moment I ran some other game, even one significantly smaller than the previous game (which appeared to be entirely in RAM cache, judging from load times), and then went back to that larger game I was playing moments ago, load times were back to mechanical-drive speeds (till it was, of course, all read back into cache again).
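That anecdote is exactly the behavior a plain first-in-first-out cache produces: hits never refresh an entry's position, so even a recently and frequently read game gets pushed out by whatever was read after it. A minimal sketch (the block names, counts, and capacity are all made up for illustration, not anything Windows-specific):

```python
from collections import OrderedDict

class FifoCache:
    """Minimal first-in-first-out block cache: when full, the
    oldest-inserted block is evicted, no matter how recently
    or how often it has been read since."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()  # insertion order = eviction order

    def read(self, block_id):
        if block_id in self.blocks:
            return "hit"   # FIFO does NOT move an entry on a hit
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)  # evict the oldest insertion
        self.blocks[block_id] = True
        return "miss"

cache = FifoCache(capacity_blocks=10)

# First run of the big game: all misses, fills the cache.
for i in range(10):
    cache.read(f"big_game_block_{i}")

# Re-running the big game now hits entirely in cache.
assert all(cache.read(f"big_game_block_{i}") == "hit" for i in range(10))

# Play a much smaller game: only 4 blocks, but FIFO still evicts
# the big game's oldest blocks to make room.
for i in range(4):
    cache.read(f"small_game_block_{i}")

# Back to the big game: its earliest blocks are gone again.
print(cache.read("big_game_block_0"))  # miss — back to the mechanical drive
```

With an LRU policy the repeated hits would have kept the big game's blocks "fresh", which is why a dumb FIFO feels so much worse in exactly this back-and-forth gaming pattern.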
It's one of the reasons I've been a long-time PrimoCache user: to reduce game load times from mechanical hard drives. As much as SSD sizes have increased and prices have come down, they still don't compare to the size and price per GB of a mechanical drive. So just as you said, I don't use PrimoCache on my NVMe or SSD drives, but I do use it on my mechanical drives. I have two 4 TB drives dedicated to games, paired up with a 256 GB and a 512 GB SSD, along with a couple of GiB of RAM each for caching. The only time I notice longer load times is when a game has been recently updated and the cache hasn't refreshed yet (new file = new data in new blocks, even if it's the same filename as before), or when I haven't played a game in a long time, so it's no longer in cache.