That's the image out of the Canon S90, natively.
It took me back 15 years to my college dorm room, when I had the original black & white QuickCam. For that old camera (~1994), Connectix had chosen a lens that allowed more light and sharpness for less money, but at the cost of terrific barrel distortion.
Back then, I posted some ancient code to do a bilinear warp of the form r' = r + ar^3 + br^5, a polynomial approximation that can fix most lens distortion. The code caches the warp map once, then applies it to video in realtime. I'm sure it barely compiles anymore, because the hardware it targeted is really ancient.
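The idea can be sketched in modern terms. This is a minimal Python/NumPy version rather than the original code: it precomputes a warp map from the radial model r' = r + ar^3 + br^5 (radii normalized so r = 1 at the image corner, which is an assumption of this sketch, not something the old code necessarily did), then resamples a grayscale image through that cached map with bilinear interpolation. The coefficients a and b here are hypothetical; in practice they'd be tuned per lens.

```python
import numpy as np

def build_warp_map(width, height, a, b):
    """Precompute source coordinates for the radial model r' = r + a*r^3 + b*r^5.

    For each destination pixel, compute where in the distorted source image
    to sample. Radii are normalized so r = 1 at the image corner.
    """
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    y, x = np.mgrid[0:height, 0:width].astype(np.float64)
    dx, dy = x - cx, y - cy
    rmax = np.hypot(cx, cy)                 # corner radius, for normalization
    r = np.hypot(dx, dy) / rmax
    scale = 1.0 + a * r**2 + b * r**4       # equals (r + a*r^3 + b*r^5) / r
    return cx + dx * scale, cy + dy * scale

def warp_bilinear(img, map_x, map_y):
    """Resample a 2-D (grayscale) image at the cached coordinates, bilinearly."""
    h, w = img.shape
    x0 = np.clip(np.floor(map_x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(map_y).astype(int), 0, h - 2)
    fx = np.clip(map_x - x0, 0.0, 1.0)      # fractional offsets within the cell
    fy = np.clip(map_y - y0, 0.0, 1.0)
    top = img[y0, x0] * (1 - fx) + img[y0, x0 + 1] * fx
    bot = img[y0 + 1, x0] * (1 - fx) + img[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

Building the map is the expensive part; once it's cached, each frame costs only the bilinear lookups, which is what made this feasible on mid-90s hardware.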
Today you'd probably want a much better interpolation than bilinear, but this ran in realtime on video back in 1996, which seemed pretty cool at the time.
Really, all lenses should have these kinds of corrections applied automatically in the processing pipeline. It would benefit weight, sharpness, and cost, and the software isn't that hard to write, either.
The dpreview article mentions that optical corrections for barrel distortion tend to introduce mustache distortion, which even the most expensive lenses can't fix perfectly.
It seems to me like lenses should have some of these distortion measurements built-in, and the data could come through the processing pipeline directly.
My impression is that DxO and Photoshop CS5 keep external databases of this information, which is nice for the moment. It doesn't necessarily make sense to talk to a network service just to process a RAW file, but I suppose it makes this sort of correction possible right away rather than waiting 10 years for metadata standards to adapt.