Some recent press has referred to virtualization and browser-based applications as threats to the operating system. As an IT Manager - having lived through a few technology waves - I think the arguments being made are unrealistic.
Today's Windows machines have Flash ROM that contains the BIOS code - the low-level functions required to start the machine and figure out where the actual boot code resides. When they boot up, they generally flail around until they find a boot image - most often on a hard drive, but it could be on a network drive or a CD.
If we go to the extreme of putting a hypervisor on the machine, then we can run virtual machines (various operating systems) on top of the hypervisor without loading Windows or Linux first.
This is great for flexibility, but you'll notice that it doesn't reduce your licensing costs with Microsoft or whomever. It also doesn't reduce the security vulnerability of your box - in fact, it multiplies the threat surface by the number of VMs running at once! You might be able to use a snapshot to restore quickly if you are compromised, but that's about the only security benefit.
The problem is that virtualizing an O/S just brings you more of the same. All the software-related management costs come along with each instantiated machine, plus some additional complexity from running the VMs themselves.
But there is an opportunity here for a different kind of virtualization to make an impact.
A couple examples might make this clear.
1) Take a look at how BitTorrent streams files to multiple recipients. The network traffic consists of tiny slices of the file that come to you from many different directions. The BT application breaks the file into slices for shipment, and the client reassembles them into a recognizable file. What is created is close to an internet SAN - but a fairly slow one. (A short sketch of the slicing idea follows this list.)
2) SETI@home, and other applications like it, make use of spare compute time on internet-connected computers. This "free" CPU resource is used to solve very large problems.
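To make the slicing in example 1 concrete, here is a minimal sketch in Python - purely illustrative, with a made-up piece size and none of the .torrent metadata or peer protocol a real client uses - of breaking a file into hashed pieces and reassembling them in index order no matter how they arrive:

import hashlib

PIECE_SIZE = 256 * 1024  # hypothetical piece size; real torrents vary

def slice_file(path):
    # Break a file into numbered pieces, each paired with a digest for verification.
    pieces = {}
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(PIECE_SIZE)
            if not chunk:
                break
            pieces[index] = (hashlib.sha1(chunk).hexdigest(), chunk)
            index += 1
    return pieces

def reassemble(pieces, out_path):
    # Write pieces back out in index order, verifying each digest first.
    with open(out_path, "wb") as out:
        for index in sorted(pieces):
            digest, chunk = pieces[index]
            assert hashlib.sha1(chunk).hexdigest() == digest, "corrupt piece"
            out.write(chunk)

In a real swarm the pieces in that dictionary would be arriving from many peers at once; the point is just that arrival order doesn't matter as long as each piece can be verified and indexed.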
Now what if network speeds reach the point where we can store the boot image for our PCs in the "cloud"? Perhaps you would keep a locally cached copy of the boot image and refresh it every time you start up the machine. We might be able to produce an O/S that is never "installed" on a PC at all. The machine could simply run BOOTP - if there were a server waiting to provide the software load.
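As a sketch of what that cached-copy refresh might look like (the URLs and paths here are hypothetical, and a real scheme would need signed images rather than a bare hash check), the machine could compare its local copy against a manifest published in the cloud and pull a fresh image only when they differ:

import hashlib
import urllib.request

# Hypothetical locations - no such service exists today.
MANIFEST_URL = "https://boot.example.com/manifest.sha256"
IMAGE_URL = "https://boot.example.com/os-image.bin"
LOCAL_IMAGE = "/var/cache/os-image.bin"

def local_digest(path):
    try:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()
    except FileNotFoundError:
        return None  # no cached copy yet

def refresh_boot_image():
    # Re-download the boot image only if the published copy has changed.
    published = urllib.request.urlopen(MANIFEST_URL).read().decode().strip()
    if local_digest(LOCAL_IMAGE) != published:
        data = urllib.request.urlopen(IMAGE_URL).read()
        assert hashlib.sha256(data).hexdigest() == published, "download corrupted"
        with open(LOCAL_IMAGE, "wb") as f:
            f.write(data)
    # The firmware would then hand off to the (now current) local image.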
Well... Of course, replacing everyone's home PC with a BOOTP-loaded machine is impractical - not least because it would mean Verizon and Comcast getting into the O/S provisioning business. And there is the little problem of the O/S coming down in the clear via TFTP.
Yikes!
But what could happen is that we create a storage network that runs on the excess capacity of machines that share their bandwidth. Home machines would then run a modified BOOTP over something like BitTorrent, and the boot loader would come over the wire.
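A rough sketch of that hybrid follows - purely illustrative, since the peer list, the piece digests, and the fetch_piece call are all assumptions, and anything real would need the piece list signed by a trusted publisher so you know whose boot image you are assembling:

import hashlib

# Hypothetical swarm state: a trusted (signed) list of piece digests, plus
# peers believed to hold copies of the pieces.
PIECE_HASHES = ["ab12...", "cd34...", "ef56..."]  # placeholder digests
PEERS = ["10.0.0.5", "10.0.0.9", "10.0.0.17"]

def fetch_piece(peer, index):
    # Stand-in for a BitTorrent-like piece request to one peer.
    raise NotImplementedError("transport layer not sketched here")

def fetch_boot_image():
    # Assemble the boot image from whichever peers answer, verifying every piece.
    pieces = []
    for index, expected in enumerate(PIECE_HASHES):
        for peer in PEERS:
            try:
                chunk = fetch_piece(peer, index)
            except Exception:
                continue  # that peer didn't answer; try the next one
            if hashlib.sha256(chunk).hexdigest() == expected:
                pieces.append(chunk)
                break
        else:
            raise RuntimeError("no peer could supply piece %d" % index)
    return b"".join(pieces)

The verification step is what makes the idea plausible at all: you don't have to trust any individual peer, only the signed list of digests.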
Well - how is this better than what we have today?
I think anything you load at boot time like this would have to be fairly small. It could be about as complicated as Mac OS 6 - a basic, stable O/S, but with a browser built in. And if the O/S is never altered by installing programs, and everything runs in a browser window, then the threat of infecting the software on the machine is greatly reduced (since that software would be replaced at every boot anyway).
This needs a lot of refinement, of course, but I can see where it might be headed!