
Games drive & PrimoCache

For spinners, it is a huge boost. For SSDs, I found all the RAM drives and other tricks quite pointless. I only wonder what access time you get, and since it's fully cached, you can switch CDM to SSD mode (it's in the settings tab).
It's one of the things where more RAM is helpful, but nowadays barely anyone uses HDDs.
 
If I'm reading this right, Primocache is configured for RAM cache only? Not sure that is so different to how Windows would do it out of the box, unless it has different cache pre-loading and also acts on writes. While I've never used it, I understood the typical scenario for Primocache is to throw in an SSD tier too. Much bigger capacity than RAM, so chances are you'll hit the spinners less.
 
Wonder if someone that does video editing has used that? Say on a server that can take 512 GB+ of RAM; I don't really look at servers, but I swear I saw one with 1 TB of RAM. If the file fits entirely in RAM, that's near-instant access. It should be faster than using a PCIe SSD as far as access time when jumping through the video timeline.
 
The benefit of a RAM disk over just using RAM as a read cache is niche. Maybe if you have very specific files you need to read and write to a lot, a RAM disk makes sense. Gaming and video editing are very read-heavy, so OS caching is fine. Most video file bitrates are far lower than SSD transfer rates. Even if you're jumping around, the access latency is low enough to be insignificant. During playback it is sequential, so no problem. If you're encoding, the limit on speed is usually the processing, not file IO.
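For a sense of scale, here's a quick back-of-the-envelope comparison (the bitrate and throughput numbers are assumed typical values for illustration, not measurements from any particular setup):

```python
# Rough arithmetic: video playback bitrate vs. typical drive throughput.
# All figures below are assumptions for illustration only.
video_bitrate_mbps = 100                      # a fairly heavy 4K consumer file
video_rate_mb_s = video_bitrate_mbps / 8      # ~12.5 MB/s

drive_mb_s = {
    "7200 rpm HDD (sequential)": 150,
    "SATA SSD": 500,
    "NVMe SSD": 3000,
}

for name, mb_s in drive_mb_s.items():
    print(f"{name}: ~{mb_s / video_rate_mb_s:.0f}x the playback bitrate")
```

Even the slowest case has a comfortable margin over the playback bitrate, which is why encoding tends to be limited by the CPU/GPU rather than file IO.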
 
If I'm reading this right, Primocache is configured for RAM cache only? Not sure that is so different to how Windows would do it out of the box, unless it has different cache pre-loading and also acts on writes. While I've never used it, I understood the typical scenario for Primocache is to throw in an SSD tier too. Much bigger capacity than RAM, so chances are you'll hit the spinners less.

Windows' cache is a "dumb" file cache, or more specifically a last-read file cache. Windows simply caches whatever file was last read from a drive until it runs out of space in its cache, then evicts the oldest read file. It doesn't care about which files are frequently accessed, just which file was last read from a drive. This is fine for gaming if you have enough free RAM to cache all the files, but it doesn't help if your system is RAM limited and the game is constantly reading from files (a very good chance frequently read files will be evicted from cache), and/or the files in question are larger than the available free space in the cache. Many modern games consolidate all the individual game files into one large game file (for various reasons), which can easily be tens of gigabytes for a single file, so whether Windows caches such a file depends wholly on whether you have enough free RAM to cache the ENTIRE file. Even with 64 GiB of RAM that can easily be a problem, what with the OS, background programs, AND the running game itself taking heaps of RAM before whatever is left over goes to caching. Forget about Windows caching any multi-gigabyte files on a system with 8 GiB of RAM or less.
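To illustrate the behaviour being described, here's a deliberately crude toy model of a whole-file, evict-the-oldest cache (my simplification for the sake of the argument, not how the Windows cache manager is actually implemented; the file names and sizes are made up):

```python
from collections import OrderedDict

# Toy whole-file cache with oldest-first eviction, showing why one huge
# game archive can defeat it when it doesn't fit in the free RAM.
class FileCache:
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.used = 0.0
        self.files = OrderedDict()            # filename -> size in GB, oldest first

    def read(self, name, size_gb):
        if name in self.files:
            return "cache hit (RAM)"
        if size_gb > self.capacity:
            return "never cached: file bigger than free RAM"
        while self.used + size_gb > self.capacity:
            _, old_size = self.files.popitem(last=False)   # evict the oldest read file
            self.used -= old_size
        self.files[name] = size_gb
        self.used += size_gb
        return "read from disk, now cached"

cache = FileCache(capacity_gb=8)              # e.g. ~8 GiB of free RAM
print(cache.read("big_game.pak", 40))         # never cached: file bigger than free RAM
print(cache.read("small_game.pak", 6))        # read from disk, now cached
print(cache.read("small_game.pak", 6))        # cache hit (RAM)
```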

PrimoCache is a "smart"(er?) cache. It runs at the block level, not the file level, and works by keeping the most frequently read blocks of a drive in the RAM/SSD cache, so it doesn't read an entire file into cache like Windows does, just the blocks that are read most often. This means a game that has all its data in one giant file can still benefit from the caching mechanism. It also means the most frequently accessed blocks are kept in cache and are less likely to be evicted. Hence, the longer PrimoCache has been running while you use the system, the more data it collects on which blocks are accessed most frequently, and the more it helps with access/read speed. It can also work in a tiered fashion as you mentioned, using both RAM and an SSD to speed up a mechanical spinner. PrimoCache can also cache writes, unlike Windows, and defer the writes until later or until the system is idle, or you can disable write caching in PrimoCache entirely (read-only caching).
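In very rough terms, the difference looks something like this toy frequency-based block cache (the block granularity, eviction rule, and numbers here are my own illustrative guesses, not PrimoCache's actual algorithm):

```python
from collections import Counter

CACHE_BLOCKS = 4           # deliberately tiny cache, measured in blocks
read_counts = Counter()    # block number -> how many times it has been read
cached = set()             # block numbers currently held in RAM

def read_block(block_no):
    """Serve a read, updating the cache, and report where it came from."""
    read_counts[block_no] += 1
    if block_no in cached:
        return "hit (RAM)"
    cached.add(block_no)
    if len(cached) > CACHE_BLOCKS:
        # Evict the cached block with the fewest reads, not simply the oldest one.
        coldest = min(cached, key=lambda b: read_counts[b])
        cached.discard(coldest)
    return "miss (spinner)"

# Two hot blocks inside one giant .pak file stay cached even though the file
# as a whole would never fit in RAM, and a burst of one-off reads can't push them out.
for block in [10, 11, 10, 11, 10, 900, 901, 902, 10, 11]:
    print(block, read_block(block))
```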
 
I did say "not that different" to Windows. I'm sure there are differences, and I wasn't about to pick at every little detail. But the response did make me look it up. I found this:

It looks like Windows is a kind of block cache, not a file cache. That page mentions mapping in 256 KB slots. Primocache might have a "better" algorithm to determine what to keep or evict, but at a high level it isn't so different. Also, Windows can cache writes too, but this is always a bit of a danger area. The longer you wait to write, the bigger the chance of data loss should the system malfunction for any reason, so it is a balance of performance and safety. At least for gaming specifically, writes aren't a big deal. About the only heavy write usage will be updates and patches, but they're kinda one-off and not performance critical.
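The write side of that trade-off, sketched out (purely illustrative; the flush-delay knob and class here are my own invention, not PrimoCache's or Windows' actual interface):

```python
import time

# Toy write-back buffer: writes land in RAM first and reach the disk later.
# A longer flush delay means snappier bursts, but more unflushed data at risk
# if the machine loses power before the flush happens.
class WriteBackBuffer:
    def __init__(self, flush_after_s=10.0):
        self.flush_after_s = flush_after_s
        self.pending = []                          # (timestamp, data) not yet on disk

    def write(self, data):
        self.pending.append((time.time(), data))   # "instant" from the program's view

    def flush_due(self):
        now = time.time()
        due = [d for t, d in self.pending if now - t >= self.flush_after_s]
        self.pending = [(t, d) for t, d in self.pending if now - t < self.flush_after_s]
        return due                                 # a real cache would write these to disk

buf = WriteBackBuffer(flush_after_s=10.0)
buf.write(b"autosave contents")
# Anything still sitting in buf.pending is lost if the system dies right now,
# which is exactly the performance-vs-safety balance described above.
```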

If you can afford to allocate RAM for Primocache to use, you can afford to let Windows use the same amount, so like for like I don't see any advantage there. I still feel that Primocache's strength is using an intermediate storage tier as cache. Recent SSD pricing is cheaper than ever. Slap in a 512 GB or 1 TB SSD as a cache and you might even forget you had a bigger-capacity HDD behind it.
 
I recall Windows being a simple file cache, though I gotta admit, the last time I really looked into it was back in the XP/7 days when I was still messing around with 32-bit OSes and trying to get more than 4 GiB of RAM to actually be useful (creating RAM disks, or trying to use RAM above 4 GiB as cache). And the write cache was more of a "buffer" than a "cache", though I understand one could easily argue that a buffer is a cache. Real write caching was left to the SATA chipset on the motherboard and the drivers available for it; remember the days when certain chipsets were coveted for having certain features enabled by specific drivers? I recall being bummed out that some of my motherboards couldn't do write caching on drives even though it was the exact same chipset as on other boards, or hunting down a very specific driver version for said chipset that would allow enabling write caching.

If I remember correctly, it wasn't until Windows Vista that Microsoft introduced SuperFetch (and ReadyBoost), which did keep track of file usage to keep the most-read data of system files ready in a dedicated cache, but the normal use of the free RAM cache was still a dumb cache, first-in-first-out. I don't know if they ever enabled block caching in the regular free-RAM cache, as that would require additional RAM to keep track of which blocks are in the cache, compared to a plain file cache that only tracks which files are (much less RAM is needed to track filenames than blocks).
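For a rough sense of why block tracking costs more metadata RAM than file tracking (every number here is an assumption just to show the scale, not something measured from Windows or PrimoCache):

```python
# Back-of-the-envelope: RAM needed to track cache contents per block vs. per file.
drive_bytes = 4 * 1024**4        # a 4 TiB drive
slot_bytes = 256 * 1024          # 256 KB mapping slots, as mentioned earlier in the thread
bytes_per_entry = 32             # rough guess at the size of one tracking record

block_entries = drive_bytes // slot_bytes
file_entries = 200_000           # plausible file count for a big game drive

print(f"per-block tracking: {block_entries:,} entries "
      f"-> ~{block_entries * bytes_per_entry / 1024**2:.0f} MiB of metadata")
print(f"per-file tracking:  {file_entries:,} entries "
      f"-> ~{file_entries * bytes_per_entry / 1024**2:.1f} MiB of metadata")
```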

Even in Windows 10, running with 32 GiB on a previous system build, I could easily see the difference in the Windows free-RAM cache when gaming with a game small enough to fit in the free RAM that Win10 would use for cache. On the first run the game would be slow to load, and in-game loading of any new data would be slow too, because of course it's the first time it's being read off a mechanical hard drive. Subsequent reads, exiting the game, and loading it up again would be much faster, since pretty much the entirety of the game was cached in RAM at that point. However, the moment I ran some other game, even one significantly smaller than the previous game (which appeared to be entirely in RAM cache judging from load times), and then went back to that larger game I was playing moments ago, load times were back to mechanical-drive speed (until it was, of course, all read back into cache again).

It's one of the reasons I've been a long-time PrimoCache user: to reduce game load times from mechanical hard drives. As much as SSD sizes have increased and prices have come down, they still don't compare to the capacity and price per GB of a mechanical drive. So just as you said, I don't use PrimoCache on my NVMe or SSD drives, but I do use it on my mechanical drives. I have two 4 TB drives dedicated to games, paired with a 256 GB and a 512 GB SSD, along with a couple of GiB of RAM each for caching. The only time I notice longer load times is when a game has recently updated and the cache hasn't refreshed yet (new file = new data in the blocks, even if it's the same filename as before), or I haven't played a game in a long time so it is no longer in cache.
 
I filled up my 16-bay JBOD with $15 4TB SAS drives, with some cold spares, a couple of years back. I keep checking all the time for any lucky deals like that on larger drives, as I'm already 60% full on my NAS. :sneaky:

Cheap mechanical drives, even at new prices, are also why I try to maximize the RAM on every main system I build, because I always try to dedicate whatever I can spare to cache. :beer:
 