People used to put everything in a single code module. In fact, many early programming languages had limited support, if any, for doing it any other way.
The results tended to be a big, unmaintainable mess - hence the term "spaghetti code", coined for the tendency of such modules to weave logic all over the place, much like the strands on a plate of spaghetti.
The first major disciplinary attempt to address that problem was the development of subroutines (sometimes known as functions). Subroutines allowed breaking code into modules whose internals were basically "black boxes", meaning that you could (theoretically) completely re-design and re-code the insides of those modules and the rest of the program would remain unaffected by - in fact, ignorant of - the changes.
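As a quick illustration of the black-box idea (a hypothetical example of mine, not from any particular program): callers depend only on a subroutine's signature, so its insides can be rewritten at will.

```cpp
#include <cstddef>

// Callers see only this declaration - the subroutine's "contract".
long sum(const int *values, std::size_t count);

int main() {
    int data[] = {1, 2, 3, 4};
    // The caller neither knows nor cares how sum() works inside.
    return static_cast<int>(sum(data, 4)); // exits with 10
}

// The implementation is a black box: it could be completely
// re-designed (unrolled, vectorized, made recursive) and no
// calling code would need to change - or even know.
long sum(const int *values, std::size_t count) {
    long total = 0;
    for (std::size_t i = 0; i < count; ++i) {
        total += values[i];
    }
    return total;
}
```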
That was a step up, and it had the bonus effect of creating re-usable code instead of having to put everything together from scratch each time, but spaghetti code was still a problem. Often there was one big main module and lots of little function modules; the main module was usually still very messy, and sometimes so were some of the subroutine modules.
The next disciplinary attempt, arriving around the 1970s, was the banishment of the infamous "goto" statement and its replacement with 3 logical building blocks: linear code (statements executed in sequence), logical selection (if/then or select/case), and loops (do-while, repeat-until, for-each and so forth). That got rid of a lot of the logic weaving (spaghetti), but even then modules tended to become large and unwieldy. I once encountered a COBOL program where a single "IF" statement and its sub-clauses ran to 4 pages of 66 print lines each. And in COBOL, a statement terminator was a single ".", which was especially perilous - can you spot the dot?
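For a concrete look at those three building blocks, here is a tiny C++ sketch (the names and values are mine, purely illustrative):

```cpp
#include <iostream>

int main() {
    // 1. Linear code: statements run one after another.
    int total = 0;
    int limit = 5;

    // 2. Loop: repeat a block until a condition fails.
    for (int i = 1; i <= limit; ++i) {
        total += i;
    }

    // 3. Logical selection: pick a branch based on a condition.
    if (total > 10) {
        std::cout << "total is large: " << total << "\n";
    } else {
        std::cout << "total is small: " << total << "\n";
    }
    return 0;
}
```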
The idea of creating "building blocks" and giving them self-contained properties and behaviors dates back to about 1960, but until the arrival of the Smalltalk programming language, it was mostly confined to graphics management and simulation languages. Smalltalk was perhaps the first general-purpose language based on object-oriented programming.
In the early 1980s, Bjarne Stroustrup at AT&T Bell Labs modified a C compiler front-end to support object-oriented programming. He called it "C with Classes", and later "C++". AT&T made it part of their licensable software product offerings, and I purchased a license and ported it to the Commodore Amiga computer system. This was a case of being in the right place at the right time. The Amiga was the first 32-bit machine* with a totally flat address space, and the AT&T C++ translator wasn't designed to be as kind to hardware as 1960s-era compilers were. That meant 16-bit processing was right out, and the segmented memory model used on the Intel 80286 and earlier Intel systems was an absolute nightmare. I could go into the sordid details, but until Microsoft started running OSes in flat memory spaces, C++ products for the DOS/Intel platform were a problem.
C++ is only optionally object-oriented, in that you could still write "flat" programs in C++ that were essentially the same code as straight C, but you could also define object classes. By the early 1990s, however, James Gosling had adapted the concepts to a similar language that was truly object-based - the "Oak" language that eventually became Java.
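To make that hybrid nature concrete, here is a minimal sketch (names are mine, purely illustrative): a single C++ source file holding both flat, C-style code and a class.

```cpp
#include <cstdio>

// "Flat", C-style code: this compiles the same as straight C.
int add(int a, int b) {
    return a + b;
}

// Object-oriented code in the same file: data and behavior
// bundled together in a class.
class Counter {
public:
    void bump() { ++count_; }
    int value() const { return count_; }
private:
    int count_ = 0;
};

int main() {
    Counter c;
    c.bump();
    c.bump();
    std::printf("%d %d\n", add(2, 3), c.value()); // prints "5 2"
    return 0;
}
```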
Writing modular code isn't just beneficial to programmers, although now that computers are cheaper than programmers, the more productive you can make programmers, the better. The modularity afforded by making things object-based also allows the compiler to produce more efficient code by taking advantage of locality of reference. Basically, this amounts to keeping as many of the resources in active use as possible within short reach, since at the raw machine-language level, far-away stuff typically requires longer, slower machine instructions to access and more base address registers to find it.
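As a rough sketch of locality of reference (my own example, with hypothetical names): packing the data a piece of code actively works on into one contiguous region means it stays "within short reach" instead of being scattered across far-flung addresses.

```cpp
#include <vector>

// A particle's actively-used fields live side by side in memory,
// so updating one particle touches one small, nearby region.
struct Particle {
    double x, y;    // position
    double vx, vy;  // velocity
};

// Iterating a contiguous vector visits memory in order, letting
// the compiler and hardware exploit locality of reference rather
// than chasing pointers all over the address space.
void step(std::vector<Particle>& particles, double dt) {
    for (Particle& p : particles) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
    }
}

int main() {
    std::vector<Particle> ps(1000, Particle{0, 0, 1, 1});
    step(ps, 0.016);
    return 0;
}
```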
Up until about 1983, most compilers were fairly inefficient; the rule of thumb was that high-level language code would run about 10% less efficiently than hand-coded assembly (machine) language. Around then, however, optimizing compilers started being the rule rather than an extra-cost exception. My first experience with such a compiler was IBM's Pascal/VS compiler, which could juggle its internal objects around every time it re-compiled, giving, in effect, several days' worth of hand-rewriting in a matter of milliseconds. About the same time, we were working with the COBOL compiler for Honeywell's mini-computer line (which I often joked was the only COBOL compiler suited to writing OS code in, a la C). In both compilers, you had to keep your modules fairly small, since the optimizer allocated only a few address registers per module, and on IBM mainframes a base register could only address a range of 4096 bytes of RAM.
So, in short, object-oriented programming (and modular programming in general) is preferred as being more efficient for both people and computers.
There are other approaches, such as Functional Programming, but we'll leave that for another day.
====
*The Commodore Amiga and the Apple Macintosh both used the Motorola MC68000 CPU, but the Amiga OS and its official compilers used it in 32-bit mode, whereas the Macintosh used it as a 16-bit processor. So even though the Mac hardware was capable, the Apple OS wasn't set up for it.