Image Deblurring via Sensor Data
Neel Joshi, Sing Bing Kang, C. Lawrence Zitnick, and Richard Szeliski of Microsoft Research presented a paper at SIGGRAPH 2010 in Los Angeles introducing image-deblurring algorithms that use inertial data gathered by an Arduino board (three gyroscopes and a three-axis accelerometer) to compensate for the blur introduced by camera movement during exposure. Their article can be downloaded directly from Microsoft Research's website, where you will also find the slideshow they used to present their work and examples of their correction algorithms applied to real photographs.
As mentioned in this article at PC Magazine, which was our source:
The four researchers named in the study managed to construct their hardware sensor package completely off-the-shelf, using a combination of one three-axis accelerometer, three gyroscopes, and a Bluetooth radio all wired to an open-source Arduino controller.
“Our method is completely automatic, handles per-pixel, spatially-varying blur, and out-performs the current leading image-based methods,” reads the accompanying paper.
“To the best of our knowledge, this is the first work that uses 6 [degrees-of-freedom] inertial sensors for dense, per-pixel spatially-varying image deblurring and the first work to gather dense ground-truth measurements for camera-shake blur.”
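To give a rough feel for the idea (this is not the authors' actual pipeline, which estimates a per-pixel, spatially-varying blur), here is a minimal single-kernel sketch: integrate gyroscope samples into an approximate camera-shake point-spread function, then deconvolve with standard Richardson-Lucy. The small-rotation-to-pixel-shift mapping and all parameter names are illustrative assumptions.

```python
import numpy as np

def shake_psf(gyro_samples, focal_px, dt, size=15):
    """Trace an approximate blur kernel (PSF) from gyroscope readings.

    gyro_samples: (N, 2) angular velocities (pitch, yaw) in rad/s.
    Small rotations are mapped to image-plane shifts of roughly
    focal_px * angle -- a deliberate simplification for illustration.
    """
    angles = np.cumsum(gyro_samples * dt, axis=0)  # integrate rate -> angle
    shifts = focal_px * angles                     # approximate pixel shifts
    psf = np.zeros((size, size))
    c = size // 2
    for dx, dy in shifts:
        x, y = int(round(c + dx)), int(round(c + dy))
        if 0 <= x < size and 0 <= y < size:
            psf[y, x] += 1.0                       # accumulate dwell time
    return psf / psf.sum()

def _psf_fft(psf, shape):
    # Embed the centered kernel for circular (FFT) convolution.
    pad = np.zeros(shape)
    ph, pw = psf.shape
    pad[:ph, :pw] = psf
    pad = np.roll(pad, (-(ph // 2), -(pw // 2)), axis=(0, 1))
    return np.fft.rfft2(pad)

def convolve_psf(img, psf):
    return np.fft.irfft2(np.fft.rfft2(img) * _psf_fft(psf, img.shape),
                         s=img.shape)

def richardson_lucy(blurred, psf, iters=30):
    """Plain Richardson-Lucy deconvolution with periodic boundaries."""
    est = np.maximum(blurred, 1e-6)                # positive initial estimate
    flipped = psf[::-1, ::-1]                      # adjoint of the blur
    for _ in range(iters):
        conv = convolve_psf(est, psf)
        ratio = blurred / np.maximum(conv, 1e-8)
        est = np.maximum(est * convolve_psf(ratio, flipped), 0.0)
    return est
```

A constant angular velocity, for instance, traces a straight streak kernel; feeding a synthetically blurred image and that kernel to `richardson_lucy` recovers a visibly sharper estimate. The real system additionally uses the accelerometer and an energy-minimization step to refine the sensor-derived motion.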