In this environment, since Kirk's fine book addresses the right concepts and patterns,
you would almost certainly want some form of CMS (Content Management System) optimized
for large files (videos, scans, etc.) and rapid keyword-based retrieval.
Kirk's concepts and rules are still valid; simply apply them to your own requirements and needs.
For instance, some of us build applications for finance (e.g., building a bank),
while you seem to be building a hospital. Same architectural concepts, same foundations, but different buildings.
Would you walk into a bank with chest pain?
I wouldn't call 50 MB large; a 50 MB file easily fits into memory. Now, if you try to load 20 such files at the same time, you might run into a problem. I would only call data "large" when it exceeds the memory limit of the Java process, which is roughly 2 GB on a 32-bit OS. On 64-bit Java, this limit is much larger.
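To see whether a file of a given size would even fit, you can compare it against the heap the JVM actually has. A minimal sketch (the 50 MB figure is just the example size from above; actual headroom depends on your -Xmx setting and current allocation):

```java
// Sketch: compare a file's size against the heap headroom of this JVM.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxHeap = rt.maxMemory();                 // upper bound (-Xmx)
        long used = rt.totalMemory() - rt.freeMemory(); // currently allocated
        long available = maxHeap - used;

        long fileSize = 50L * 1024 * 1024;             // e.g. one 50 MB file
        System.out.println("Max heap:  " + (maxHeap / (1024 * 1024)) + " MB");
        System.out.println("Available: " + (available / (1024 * 1024)) + " MB");
        System.out.println("50 MB file fits: " + (fileSize < available));
    }
}
```

Note that `available` is only an estimate; garbage collection can reclaim part of `used` at any time, so treat it as a rough sanity check, not a guarantee.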
I think the problem is that you are handling more files than your application can hold at one time. You would need to limit the number of files you keep in memory at any moment, based on the memory available to your VM. If your application's needs exceed what one VM can handle, consider these two options:
a. Increase the VM heap size. If you are using a 32-bit VM you cannot go much beyond 3 GB, so you would need to migrate to a 64-bit VM.
b. Cluster your application across multiple machines, so the files are spread over several VMs.
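Before scaling out, the simplest fix is often the one described above: cap how many files are in memory at once. A minimal sketch using a `java.util.concurrent.Semaphore` (the class name `BoundedLoader` and the cap value are illustrative assumptions):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.Semaphore;
import java.util.function.Consumer;

// Sketch: allow at most N files' contents in memory at any moment.
public class BoundedLoader {
    private final Semaphore permits;

    public BoundedLoader(int maxInMemory) {
        this.permits = new Semaphore(maxInMemory);
    }

    // Loads the file, hands the bytes to the consumer, then frees the
    // permit so the next waiting thread may load its file.
    public void process(Path path, Consumer<byte[]> consumer) throws Exception {
        permits.acquire();                       // blocks once the cap is reached
        try {
            byte[] data = Files.readAllBytes(path);
            consumer.accept(data);
        } finally {
            permits.release();
        }
    }
}
```

With, say, `new BoundedLoader(4)`, twenty concurrent requests for 50 MB files would hold at most ~200 MB of file data at once instead of ~1 GB; the rest simply wait their turn.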