This Video Streaming Breakthrough Could Lead to Smart Glasses You Might Actually Want to Wear

By Andrew Liszewski

One of the many inconveniences that doomed wearable products like Google Glass and Snap’s Spectacles was the need to charge them every night. Streaming video and data requires a lot of power, and that takes a tremendous toll on a tiny battery. But engineers at the University of Washington say they’ve found a way to stream hi-def video that uses 10,000 times less power than current technologies.

Eventually, the new approach to wirelessly broadcasting video signals could lead to devices that don’t need a power source at all. Security cameras, for example, could be mounted almost anywhere, and video-streaming smart glasses could wind up almost indistinguishable from a regular pair of Ray-Bans as other components, not just the batteries, could be removed from the process—or at least relocated.

Streaming video usually involves taking raw data from an image sensor, compressing it to minimise the file size, and then broadcasting it wirelessly. Compression reduces the amount of data that has to be transferred, but it also requires additional power-hungry processing hardware. The researchers at the University of Washington instead broadcast the raw image data straight from the camera sensor, which requires anywhere from 1,000 to 10,000 times less power, and rely on a nearby smartphone to handle all of the image processing and compression.
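The difference between the two pipelines can be sketched in a few lines of code. This is purely illustrative: the function names are hypothetical, and `zlib` stands in for a real video codec like H.264. The point is only where the compression work (and its power cost) lands.

```python
import zlib

def compress(frame: bytes) -> bytes:
    """Stand-in for video compression (real cameras use codecs like H.264)."""
    return zlib.compress(frame)

def conventional_camera(raw_frame: bytes) -> bytes:
    # Conventional pipeline: compress on the camera itself, which needs
    # power-hungry processing hardware, then transmit the smaller payload.
    return compress(raw_frame)

def backscatter_camera(raw_frame: bytes) -> bytes:
    # UW-style pipeline: the camera broadcasts the raw sensor data as-is...
    return raw_frame                      # no on-camera processing

def nearby_smartphone(received: bytes) -> bytes:
    # ...and a nearby smartphone handles the compression after the fact.
    return compress(received)

frame = bytes(range(256)) * 100           # fake raw sensor data
# The same compressed result ends up existing either way; only the location
# of the processing, and therefore the camera's power draw, differs.
assert nearby_smartphone(backscatter_camera(frame)) == conventional_camera(frame)
```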

In a paper presented at the Symposium on Networked Systems Design and Implementation, the engineers detail how the individual pixels in a camera sensor can be connected directly to an antenna instead of a processor. When photons hit the sensor, each pixel drives the antenna with a signal whose length encodes its brightness: shorter signals indicate the pixel detected a minimal amount of brightness, while longer signals indicate brighter results, allowing the smartphone receiving the data to translate the signals back into actual images. These subtle signals piggyback on other wireless signals already bouncing around a room, enabling them to be picked up by a smartphone, or some other device, as far as 14 feet away.
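The brightness-to-signal-length mapping described above can be sketched as a toy encoder and decoder. The linear mapping and the constants here are illustrative assumptions, not details from the paper.

```python
MAX_BRIGHTNESS = 255          # 8-bit pixel value
MAX_PULSE_US = 10.0           # assumed longest signal, in microseconds

def encode_pixel(brightness: int) -> float:
    """Map a pixel's brightness to a signal duration (brighter -> longer)."""
    return (brightness / MAX_BRIGHTNESS) * MAX_PULSE_US

def decode_pulse(duration_us: float) -> int:
    """Recover the brightness value from a measured signal duration."""
    return round((duration_us / MAX_PULSE_US) * MAX_BRIGHTNESS)

# A dim pixel produces a short signal; a bright one, a long signal. The
# receiving smartphone measures the durations and rebuilds the image.
frame = [0, 64, 128, 255]
durations = [encode_pixel(p) for p in frame]
recovered = [decode_pulse(d) for d in durations]
assert recovered == frame
```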

That’s a gross simplification of the complex technology behind this breakthrough, but in real-world testing, the researchers say they were able to broadcast a 720p hi-def signal at 10 frames per second from a customised camera, which is remarkable quality given that this approach is still in the early research phases.
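For a sense of scale, it's worth estimating how much raw pixel data a stream like that involves. The arithmetic below assumes an 8-bit greyscale 720p frame, which is our assumption for illustration, not a figure from the paper.

```python
# Back-of-the-envelope bandwidth for 720p at 10 frames per second.
# The 8 bits-per-pixel greyscale assumption is ours, not the paper's.
width, height, fps, bits_per_pixel = 1280, 720, 10, 8

pixels_per_second = width * height * fps                 # 9,216,000
raw_mbps = pixels_per_second * bits_per_pixel / 1e6
print(f"{raw_mbps:.1f} Mbit/s of raw pixel data")        # 73.7 Mbit/s
```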

In addition to having terrible battery life, wearables like Google Glass simply looked bulky and awkward. Snapchat’s Spectacles were a little better in terms of form factor and aesthetics, but seem to have died an equally predictable death. Ditching the need for bulky batteries sitting on your face is a huge step towards developing smart glasses consumers might actually want to buy and wear, but at the moment there’s unfortunately no timeline for when this technology might be tested and adopted by the companies who’ve already failed in this space. [University of Washington via EurekAlert!]
