Computer scientists at MIT and Google have unveiled a clever new algorithm that removes window reflections from photos - producing both a clearer photo of the subject you actually wanted to capture and, remarkably, a separate image of the reflection itself.
The same algorithm can also remove raindrops on a window, and even fences, if you shoot a subject through one.
The way it works is by taking a short video clip (of just a couple of seconds) and comparing select frames to fill in the blocked parts. The first step is to perform edge detection on each frame and use RANSAC (an iterative model-fitting method) to compare edge motions against a reference frame taken from the middle of the sequence. The interpolated flow fields are then warped to the reference image to remove the unwanted material. Translated into English, what this basically means is that the frames are compared, and the algorithm uses the relative motion of different components - the parallax between near and far layers - to figure out what is in the background and what is in the foreground, then separates the two.
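To make the RANSAC step above concrete, here is a minimal sketch - not the authors' implementation - of how RANSAC can pull apart two motion layers from point correspondences between two frames. The synthetic data, the pure-translation motion model, and all names here are illustrative assumptions; the real method fits richer flow fields to edge pixels.

```python
import random

# Hypothetical correspondences: point (x, y) in frame A matched to frame B.
# Background points share one dominant translation; the reflection layer,
# sitting at a different depth, moves with a different (parallax) translation.
background = [((x, y), (x + 5.0, y + 0.0))
              for x in range(0, 50, 5) for y in range(0, 30, 10)]   # 30 points
reflection = [((x, y), (x + 12.0, y + 3.0))
              for x in range(0, 40, 8) for y in range(0, 20, 10)]   # 10 points
matches = background + reflection

def ransac_translation(matches, iters=200, tol=1.0, seed=0):
    """Fit the dominant 2-D translation by RANSAC: repeatedly hypothesize a
    model from one random sample, keep the hypothesis with most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (ax, ay), (bx, by) = rng.choice(matches)
        dx, dy = bx - ax, by - ay  # translation implied by this one sample
        inliers = [m for m in matches
                   if abs((m[1][0] - m[0][0]) - dx) < tol
                   and abs((m[1][1] - m[0][1]) - dy) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers

# First pass: the dominant motion (here, the background layer).
bg_model, bg_inliers = ransac_translation(matches)
# Second pass on the leftovers: the obstruction layer (reflection).
rest = [m for m in matches if m not in bg_inliers]
fg_model, fg_inliers = ransac_translation(rest)

print("background motion:", bg_model, "supported by", len(bg_inliers), "points")
print("reflection motion:", fg_model, "supported by", len(fg_inliers), "points")
```

Once each pixel (or edge) is assigned to a layer this way, the two layers can be warped and aggregated across frames separately - which is how the method recovers both the clean photo and the reflection image.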
You can see all of the complicated maths behind it in the published paper.
What is particularly interesting is that although the clips were shot on phones (an HTC One M8 and a Samsung Galaxy S4), the number crunching was done on a fairly powerful computer and took around 20 minutes. So could we see this technology arrive in some future version of Android? Surely eventually, but we would expect it to be more likely to show up in Google Photos, silently processing your photos in the cloud.