Virtualization is one of today's hot IT topics, but the reality is that, in the general sense, virtualization has been around for what seems like forever.
I define virtualization as the abstraction of logical resources away from physical resources.
For example, NFS (the Network File System) allows you to access your data from any system, not just the one that happens to be physically connected to the disks on which the data is stored.
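As a sketch of that abstraction (the server name `fileserver` and the paths here are illustrative, not anything from a real deployment):

```shell
# On the NFS server: export a directory to clients
# (on Linux, an entry like this goes in /etc/exports)
#   /export/home  client.example.com(rw,sync)

# On any client: mount the remote filesystem over the network
mount -t nfs fileserver:/export/home /home

# Applications just see /home; the physical disks live elsewhere
ls /home
```

The application layer is none the wiser about which machine actually holds the disks — that's the logical/physical split in action.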
The X Window System allows you to display applications on a different host from the one where they are running.
VNC (Virtual Network Computing) is one of many technologies that allow you to access a complete desktop from a system other than the one where it's running.
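Both mechanisms amount to pointing a display at a different host. A minimal sketch, with illustrative hostnames:

```shell
# X11: run an application on remotehost, display it locally
# (ssh -X forwards the X connection and sets DISPLAY for you)
ssh -X remotehost xterm

# VNC: start a virtual desktop on the remote system...
ssh remotehost vncserver :1

# ...then connect to that whole desktop from anywhere
vncviewer remotehost:1
```

In both cases the screen you're looking at and the machine doing the work are decoupled — another abstraction of logical resources from physical ones.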
The various types of Virtual Desktops supported by X11 window managers allow you to manage and display applications independently of the physical screen that you may be sitting in front of.
Solaris Zones allow you to construct an operating system instance that, while any particular instantiation is tied to a piece of hardware, is logically distinct. It's relatively trivial to pack up a Zone and redeploy it on new hardware, and the application layer doesn't notice. Again, a level of abstraction from the physical.
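The pack-up-and-redeploy step looks roughly like this (the zone name `webzone` and the paths are illustrative, and exact flags vary by Solaris release):

```shell
# On the old host: halt the zone and detach it from this system
zoneadm -z webzone halt
zoneadm -z webzone detach

# Copy the zone's filesystem to the new host
# (a ZFS send/receive is another common approach)
cd /zones && tar cf - webzone | ssh newhost 'cd /zones && tar xf -'

# On the new host: recreate the configuration from the detached
# zone's own manifest, then attach and boot it
zonecfg -z webzone create -a /zones/webzone
zoneadm -z webzone attach
zoneadm -z webzone boot
```

The applications inside the zone come back up on entirely different hardware without being reinstalled or reconfigured.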
Interestingly, the current virtualization fad doesn't quite work the way you would expect based on the definition I started with. Looking at the various virtual machine technologies currently in vogue, while the virtual machines are abstracted from the underlying hardware, the abstraction presented to the user is that of physical systems. This doesn't introduce anything new; it just gives us the same old IT infrastructure we've always had, replicated one layer up in the stack. Useful, but not revolutionary.
And the cloud doesn't really change that, because all you're getting with a cloud is an extra layer (and a whole new charging model) underneath your virtualization layer. Everything may be virtual, but in many cases all we're doing is finding new ways to implement the physical.