Well, if you read an 8MB file into a byte array, it takes 8MB to store the bytes plus a trifling amount (a few bytes of header) to record the fact that it's a byte array: less than 1% of a gigabyte of heap. But if you unpack those bytes into some kind of object structure that represents the contents of the file, you will need considerably more memory.
How much more? It depends on your objects. But the basic point is this: if your strategy is to load entire files into memory as object structures, you are asking for trouble. Next week somebody will produce a 10MB file, next year a 50MB file, and so on. File sizes grow faster than available memory, so eventually that strategy will fail to scale.
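The usual alternative is to process the file as a stream with a fixed-size buffer, so memory use stays constant no matter how large the input gets. Here is a minimal sketch in Java (the class and method names are my own; the original question doesn't specify a language):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamingExample {
    // Process an input stream in fixed-size chunks. Memory use is bounded
    // by the buffer size, regardless of how large the underlying file is.
    static long process(InputStream in) throws IOException {
        byte[] buffer = new byte[8192]; // fixed 8 KB working buffer
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            // In a real program, handle buffer[0..read) here
            // instead of merely counting the bytes.
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // A 1,000,000-byte in-memory stream stands in for a large file.
        byte[] data = new byte[1_000_000];
        long bytes = process(new ByteArrayInputStream(data));
        System.out.println(bytes); // prints 1000000
    }
}
```

The same pattern applies to parsing: a streaming parser (e.g. SAX rather than DOM for XML) lets you handle each piece of the file as it arrives and discard it, instead of building the whole object graph at once.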