More Virtualization

I've been thinking about code distribution. I like native code, and I'm continually impressed by how far "native" capabilities (e.g., pixel shaders and fantastic low-level optimization) have diverged from what the web can execute and display.

When I get around to thinking about it, it seems like the core flaw of .NET and Java on the desktop was the idea that these problems could and should be solved in user space, and that nobody had to build any hardware or new OS technology to make things work.

I had one conversation in the past week about how I think "apps should be VMs", and communication across processes should be done through authenticated channels. I should be able to download an app, try it out, and uninstall it, with no trace left on my PC, no matter how bad the app is. A decent scheduler and resource monitor should be able to take care of the rest: don't use all my disk time, don't use all my RAM, etc.
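To make the scheduler/resource-monitor part concrete, here is a minimal sketch in plain POSIX C (my own illustration, nowhere near a real VM): launch an untrusted app in a child process with hard caps on its memory and CPU time. A real system would also meter disk bandwidth and route all inter-process communication through authenticated channels.

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/resource.h>
    #include <sys/wait.h>

    /* Cap the child "app" before exec'ing it: RLIMIT_AS bounds total
     * virtual memory, RLIMIT_CPU bounds CPU seconds. The kernel, not
     * the app's goodwill, enforces both. */
    static void apply_limits(void) {
        struct rlimit mem = { 256u * 1024 * 1024, 256u * 1024 * 1024 }; /* 256 MB */
        struct rlimit cpu = { 10, 10 };                                 /* 10 s   */
        setrlimit(RLIMIT_AS, &mem);
        setrlimit(RLIMIT_CPU, &cpu);
    }

    int main(int argc, char **argv) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <app> [args...]\n", argv[0]);
            return 1;
        }
        pid_t pid = fork();
        if (pid == 0) {             /* child: becomes the untrusted app */
            apply_limits();
            execvp(argv[1], &argv[1]);
            perror("execvp");       /* only reached if exec fails */
            _exit(127);
        }
        int status;                 /* parent: the "resource monitor" */
        waitpid(pid, &status, 0);
        return WIFEXITED(status) ? WEXITSTATUS(status) : 1;
    }

Kill the process tree and throw away its private view of the filesystem, and the uninstall really is one click; the hard part is everything this sketch leaves out.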

Then I had another conversation at SIGGRAPH with some guys working on "Larrabee" at Intel, about how we might soon have GPUs executing native x86 code. And I was thinking, oh, an almost general-purpose massively parallel chip that's naturally firewalled from messing with the OS? This sounds rather like a VM, again. Flash today lets me write pixel shaders that "tunnel" into the hardware--remote code, but a limited subset. What if I could download a native x86 "pixel shader game" and run a big percentage of it on my GPU?
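For flavor, here is what a native "pixel shader" might look like as ordinary C: a pure per-pixel function mapped over a framebuffer. The names and the toy pattern are mine, purely illustrative; the point is that a Larrabee-style part could schedule exactly this kind of loop across dozens of x86 cores while staying firewalled from the host OS.

    #include <stdint.h>
    #include <stddef.h>

    /* Per-pixel "shader": given coordinates and a time parameter,
     * return a packed 0xAARRGGBB color. Pure math, no OS access --
     * exactly the kind of code that is easy to firewall. */
    static uint32_t shade(int x, int y, int t) {
        uint8_t r = (uint8_t)(x ^ y);           /* cheap XOR pattern */
        uint8_t g = (uint8_t)((x + t) & 0xFF);
        uint8_t b = (uint8_t)((y - t) & 0xFF);
        return 0xFF000000u | ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
    }

    /* "Dispatch": map the shader over every pixel. Each GPU-resident
     * x86 core would take a stripe of rows and run them in parallel. */
    void run_shader(uint32_t *fb, int width, int height, int t) {
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                fb[(size_t)y * (size_t)width + x] = shade(x, y, t);
    }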

These thoughts are probably oversimplified for someone who actually does security for real, but a native-code sandbox that works in the browser (there are many) or even a full "light VM" environment seems entirely feasible for someone who builds big OSes.

User perception of this? Installs are "one click", uninstalls are "one click". Code just works again, though sharing data still needs more specific solutions.

And I guess this leads to another thought: if Microsoft had shipped Vista as some sort of "app VM" where apps could install into sandboxes with no UAC prompts or annoyance, we'd think of them as magical, as technology geniuses. But I think where Vista fails is in doing this halfway...in establishing a secure boundary, but making the user deal with it. Virtualization & amazing sandboxing could potentially defer these decisions--antivirus software could analyze what an app was doing in its VM, and users could make informed decisions about trusting an app, more or less.

I've run across a number of problems where a fix "at the core" makes everything else simpler. I think the success of VMs (e.g., VMware) in the enterprise proves that this is all technically feasible, and I think virtualization at a massive scale should be the way we deal with untrusted apps.

1 comment:

  1. What initially excited me about Java applets was "hey, look, code on the wire!" but what killed them for everybody was the creaky way they behaved in reality. Even the horrific mess that is "Modern AJAX" feels smoother most of the time.

    A microkernel for safely running native code from multiple sources, and giving each "enough" hardware, is something I think the web still needs. Go to.
