I'm a developer, and over my 30+ years I've consulted on a number of projects, sometimes with PhD candidates, on others similar to yours in scope and requirements, though not in the field of biomedical research.
You've received some good information thus far. There are specialized workstation and server products with high RAM capacity, and some of your requirements may depend on how the software operates... perhaps you only need to view the images?
Are you familiar with virtual memory?
Are you familiar with paging or deblocking?
What resolution are the monitors you're using for this work?
What software are you using for this research?
What is the storage solution? (If there are so many as 100 images of the size you give, that's a significant database compared to most desktop workstations.)
I'm happy to keep this discussion as public as you prefer; other members may find that fascinating, but feel free to PM me if you'd like a private exchange, more detailed information or, perhaps, someone experienced and qualified to help.
At present prices, RAM sufficient to hold one full image, assuming the file size you mention isn't compressed, would cost perhaps $3,000 to $4,000, but there are catches involved. As others mentioned, there are a few boards intended for servers and workstations that support up to 256 Gbytes, and I assume you've investigated hardware purpose-built for image processing at this scale, priced upwards of $30,000. In my experience these kinds of projects run for a year or two, so leased machines used to be the common choice. I assume you realize it should be possible to assemble capable hardware for much less, and that has never been more true than it is now.
Yet, you might not require that much RAM, if the software is suitably designed. At 350 Gbytes, assuming 8 bits per channel (24-bit RGB) is sufficient, and a roughly square aspect ratio (it might even be a circular photographic target), I estimate your images are about 350,000 pixels wide (and similarly tall). That estimate is probably close enough for the observations that follow.
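A quick sanity check of that estimate, using the assumed figures above (350 Gbytes uncompressed, 3 bytes per pixel, roughly square):

```python
# Back-of-the-envelope estimate of the image dimensions.
# All figures are assumptions from the discussion, not measured values.
image_bytes = 350e9          # ~350 Gbytes uncompressed, as stated
bytes_per_pixel = 3          # 8 bits per channel, 24-bit RGB

total_pixels = image_bytes / bytes_per_pixel
side = total_pixels ** 0.5   # roughly square image assumed

print(f"~{side:,.0f} pixels per side")  # ~341,565, i.e. about 350,000
```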
There is no monitor beyond approximately 30,000 pixels (width) to my knowledge, and such devices are enormously expensive. You might have 4K monitors, which would be assistive, or standard 1920 x 1080. No doubt you are scrolling through 100 to 200 "columns" by 100 to 200 "rows" around the source imagery, either to view, process (intelligent analysis perhaps) or copy. I would assume at least 3, possibly 9 monitors under one workstation's control would be appropriate.
If you were using 4K monitors, each requires about 24 MBytes for a full display, and 9 active monitors would require under 256 Mbytes of RAM.
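For reference, the framebuffer arithmetic behind those figures (assuming 3840 x 2160 panels at 3 bytes per pixel):

```python
# Rough framebuffer arithmetic for 4K displays (assumed resolution and depth).
width, height, bpp = 3840, 2160, 3
per_monitor = width * height * bpp     # ~24.9 Mbytes per full display
nine_monitors = 9 * per_monitor        # ~224 Mbytes, under 256 Mbytes

print(per_monitor / 1e6, nine_monitors / 1e6)
```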
Appropriately constructed software can page, or deblock, from the source material. That is, it loads the 3 x 3 grid of regions around the current view position so that scrolling is fluid, which works out to about 2 Gbytes for 9 monitors. Add other buffers and control information, and an application could perform reasonably well in less than 16 Gbytes of RAM.
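The paging idea can be sketched in a few lines. This is an illustrative toy, not the actual software: screen-sized tiles are fetched through a caller-supplied loader on demand, and the least-recently-used tiles are evicted once the cache is full, which is all "paging" amounts to here.

```python
from collections import OrderedDict

class TileCache:
    """Minimal LRU cache of image tiles (illustrative sketch only)."""

    def __init__(self, max_tiles, load_tile):
        self.max_tiles = max_tiles
        self.load_tile = load_tile       # callback: (col, row) -> tile data
        self.tiles = OrderedDict()       # insertion order tracks recency

    def get(self, col, row):
        key = (col, row)
        if key in self.tiles:
            self.tiles.move_to_end(key)  # mark as most recently used
        else:
            if len(self.tiles) >= self.max_tiles:
                self.tiles.popitem(last=False)  # evict least recently used
            self.tiles[key] = self.load_tile(col, row)
        return self.tiles[key]
```

A 9-monitor viewer along the lines above would hold 81 tiles (9 per display) and call `get` for the 3 x 3 neighbourhood of each view position as the user scrolls.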
This relies on disk paging performance. As one scrolls any, or all, of the 9 displays, the data source must keep up with any reasonable user-interface demand. A typical paging step retrieves 3 new adjacent blocks from the source image (while dropping the 3 that trail the scrolling direction), a demand of merely 75 Mbytes. Typical modern rotational hard disks sustain about 120 Mbytes per second, meaning one could scroll an entire screen in under a second without ever really noticing a delay.
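The arithmetic behind that claim, with the assumed figures spelled out:

```python
# Can a rotational disk keep up with scrolling one display?
# All numbers are the assumed figures from the discussion.
tile_mb = 24.9             # one 4K screenful at 24-bit colour
new_tiles_per_scroll = 3   # fresh tiles entering the 3 x 3 window
demand_mb = new_tiles_per_scroll * tile_mb   # ~75 Mbytes per screen scroll
disk_mb_per_s = 120        # sustained rate of a typical modern HDD

print(demand_mb / disk_mb_per_s)   # well under one second per full scroll
```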
That is, on a well tuned machine.
There are also the obvious features of zooming, which involve processing, indexing and caching. If one assumed RAM-only operation of the software, a typical 350 Gbyte image would require an additional 100 to 200 Gbytes to support this feature.
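That overhead estimate is consistent with a zoom pyramid, assuming each coarser level has half the width and height of the one below, so the levels above the base form a geometric series summing to about one third of the base size:

```python
# Zoom pyramid overhead: each level holds 1/4 the pixels of the one below,
# so the extra levels sum toward base/3 (assumed pyramid scheme, not
# necessarily what any particular package does).
base_gb = 350
levels_overhead_gb = sum(base_gb / 4**k for k in range(1, 20))

print(levels_overhead_gb)   # ~116.7 Gbytes, inside the 100-200 Gbyte range
```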
I assume you may require multiple layers to be processed, representing depth of the subject you're studying, but it's likely that layered image processing focuses on small regions rather than the entire 350 Gbyte source image.
If a local region of the image were loaded onto a pre-staged work surface of SSDs, the result would be much faster.
Supporting active scrolling of all 9 displays would, of course, push demand toward 750 Mbytes per second, which SSDs in RAID, or other high-performance solutions, can supply.
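The aggregate figure follows directly from the per-display demand worked out earlier (assumed figures again):

```python
# Worst case: all nine displays scrolling a full screen at once.
per_display_mb = 75    # ~3 new tiles per display, from the earlier estimate
displays = 9
aggregate_mb_per_s = per_display_mb * displays   # 675 Mbytes/s

print(aggregate_mb_per_s)   # near the ~750 Mbytes/s figure quoted
```

A single rotational disk at ~120 Mbytes/s clearly cannot serve that; a small SSD RAID set can.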
In other words, you may not require an unusually high amount of RAM, depending on just how your software requirements stress the various systems in a more typical machine. You may merely require customized software on a robust machine.
That may be simpler than it sounds. It also happens to be the focus of my work over several decades (high performance applications, drivers, 3D rendering engines, etc).
You do require a robust machine, there's no doubt. 32 Gbytes could work if your requirements are simple enough, 64 Gbytes would serve better, and I think 128 Gbytes would be luxurious. That is, unless the software is NOT well tuned for this particular task; then you may well need 1 Tbyte of RAM.
Exactly where the focus of the cost of the equipment should be placed is a matter of design, software and possibly some creative solutions which help to curb costs.
Unless I completely misunderstand what you require, $3,000 is not best spent on RAM. It's probably better put toward large SSDs as temporary working storage (like Photoshop's scratch disks), with perhaps $600 on RAM.