In the beginning...
Regarded as the "father of the computer", Charles Babbage completed a small working model of his first difference engine in 1822. Built with the top technology of the time, physical gears, levers, and switches served as the machine's CPU. Programming the difference engine was entirely mechanical, and Babbage soon realized a more approachable programming interface was needed; his later Analytical Engine introduced punch-card programming. That machine had storage and more complex physical switching, but its inputs and outputs were finite and 1:1.
Mechanical computers continued to be developed and improved upon for over 100 years, with advancements often driven by the wars of the time. In 1936, Alan Turing changed the direction of development with his theory of a universal machine capable of carrying out any algorithm. While development of finite electro-mechanical computers would continue through World War II, Turing's theories would eventually give rise to fully electronic, digital stored-program computers, and everything changed.
As previously mentioned, inputs and outputs on early computers were physical and finite. A "program" was entered by the operator through re-wiring, punch cards, or physically moving levers. Output required a separate "storage" area, usually a set of base-10 dials that held the result of the computation. In the beginning, there was virtualization: it was the operator.
In 1946, Turing published his paper and design for the Automatic Computing Engine. The design centered on the speed and size of the computer's built-in memory, and his proposal included a (for the time) large amount of high-speed memory, an implementation of subroutine calls, and an early form of programming language.
The first stored-program computers built upon Turing's design were finite; however, less than a decade after Turing's paper, virtual memory management (VMM) was born in 1956. Although this early implementation required custom hardware and was error-prone, the 1:1 boundary between computing resources and capability was broken.
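To make that idea concrete, here is a minimal, purely illustrative sketch of the core trick virtual memory performs: translating a program's virtual addresses to physical memory through a page table. The page size, table contents, and fault handling below are hypothetical choices for illustration, not the 1956 hardware design; the point is the indirection, which lets many virtual pages be backed by fewer physical frames.

```python
# Illustrative sketch of virtual-to-physical address translation.
# PAGE_SIZE and the page table contents are hypothetical, chosen only
# to show the 1:1 boundary being broken: more virtual pages than frames.

PAGE_SIZE = 4096  # bytes per page (a common modern choice)

# Hypothetical page table: virtual page number -> physical frame number.
# Virtual page 2 is deliberately unmapped to demonstrate a page fault.
page_table = {0: 7, 1: 3, 3: 7}  # two virtual pages share frame 7

def translate(virtual_address: int) -> int:
    """Translate a virtual address to a physical address."""
    vpn = virtual_address // PAGE_SIZE    # virtual page number
    offset = virtual_address % PAGE_SIZE  # offset within the page
    if vpn not in page_table:
        # A real system would load the page from backing store here.
        raise LookupError(f"page fault: virtual page {vpn} is not resident")
    return page_table[vpn] * PAGE_SIZE + offset

print(hex(translate(0x1234)))  # virtual page 1 -> frame 3 -> 0x3234
print(hex(translate(0x3050)))  # virtual page 3 -> frame 7 -> 0x7050
```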
In 1969, IBM perfected virtual memory management in its mainframes. Although a blip in the timeline of computer history, this was a historic moment. Virtualization would soon become a commodity, and in little more than a decade, computers would no longer require huge rooms of equipment. From this point forward, computing and virtualization technology would grow exponentially.
Just three years after IBM's advancement, in the early 1970s, manual data entry and processing gave way to word processing. In 1982, Intel released the 80286 processor with a protected mode, enabling x86 computers to begin taking advantage of virtual memory.
After the 286, the rest is history. We all carry and wear computing devices that leverage virtualization, built on principles developed 80 years ago for physical switching computers.