Researchers Develop Tiny Depth Sensor Inspired by Spider Eyes

Your smartphone might have a few different depth-sensing technologies for features like face unlock and portrait mode photos. The exact method of measuring distance to a subject varies, but all of those methods may one day be replaced by a new type of sensor inspired by nature. A team of Harvard researchers has designed a new 3D sensor using the same technique as a jumping spider.

Most depth-sensing systems in use today rely on stereo vision (multiple sensors a set distance apart) or projected light (IR illumination). A jumping spider has eight eyes, but it doesn't use stereo vision as humans do to estimate distance; it doesn't even have the brainpower to process vision the way we do. Instead, each eye uses a multilayered retina to produce images with different degrees of blur depending on distance. As a result, a jumping spider can judge the distance to its prey with remarkable accuracy across a wide field of view.

The Harvard team used this as a model for its new “metalens” sensor, which can calculate distance without any traditional optical elements. It doesn’t have layers like a spider eye, but it does split light to generate two differently de-focused images on a photosensor. This is known as “depth from defocus.”

Of course, the key to the jumping spider’s hunting prowess is the way its nervous system interprets the blurred images as a depth map. The team developed an AI-powered version of that, too. Data from the metalens feeds into a custom algorithm that compares the split images. It can then generate a real-time depth map that tells you how far away your target is. Like the vision processing of the jumping spider, this process is highly efficient. You don’t need any bulky sensors or powerful CPUs to generate the distance map. The metalens sensor used in the experiment is only three millimeters across.
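The paper itself doesn't disclose the team's algorithm in this article, but the core idea of depth from defocus can be illustrated with a toy sketch: given two images of the same scene with different amounts of defocus, compare local sharpness at each pixel to infer which focal plane the pixel is closer to. The function name and the variance-of-Laplacian sharpness measure below are illustrative choices, not the Harvard team's method.

```python
import numpy as np

def depth_from_defocus(img_a, img_b, window=9):
    """Toy depth-from-defocus sketch: compare per-pixel blur between two
    images of the same scene captured with different focus settings.

    Returns a relative depth map in [0, 1]; values near 1 mean the pixel
    is sharper in img_a, i.e. closer to img_a's focal plane.
    """
    def local_sharpness(img):
        # Discrete Laplacian via shifted differences (high response = sharp detail).
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        # Box-filter the squared response to get a local blur-energy estimate.
        pad = np.pad(lap ** 2, window // 2, mode="edge")
        out = np.zeros_like(img, dtype=float)
        for dy in range(window):
            for dx in range(window):
                out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out / (window * window)

    s_a = local_sharpness(img_a.astype(float))
    s_b = local_sharpness(img_b.astype(float))
    # Relative sharpness ratio serves as a proxy for depth ordering.
    return s_a / (s_a + s_b + 1e-12)
```

In a real system, the per-pixel blur difference would be calibrated against the lens geometry to recover metric distance rather than just a relative ordering; the metalens's advantage is that it captures both defocused views through a single flat optic in one shot.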

The researchers see potential for metalens depth sensing in self-driving cars and robots. Rather than a few cameras spread around a vehicle paired with complex algorithms to generate depth maps, a larger number of tiny metalenses could quickly and cheaply tell the computer how far away everything is. The technology could also come to phones, replacing bulky multi-sensor 3D platforms like Apple's Face ID and Google's Face Match.
