Researchers Develop Tiny Depth Sensor Inspired by Spider Eyes

Your smartphone might have a few different depth-sensing technologies for features like face unlock and portrait mode photos. The exact method of measuring the distance to a subject varies, but they might all one day end up being replaced by a new type of sensor based on nature. A team of Harvard researchers has designed the new 3D sensor using the same technique as a jumping spider.

Most depth-sensing systems in use today rely on stereo vision (multiple sensors a set distance apart) or projected light (IR illumination). A jumping spider has eight eyes, but it doesn't use stereo vision like humans do to estimate distance; it doesn't even have the brainpower to process vision as we do. Instead, each eye uses a multilayered retina to capture images with different degrees of blur depending on distance. As a result, jumping spiders can determine the distance to their prey with incredible accuracy across a wide field of view.

The Harvard team used this as a model for its new "metalens" sensor, which can calculate distance without any traditional optical elements. It doesn't have layers like a spider eye, but it does split light to generate two differently defocused images on a photosensor. This technique is known as "depth from defocus."

Of course, the key to the jumping spider’s hunting prowess is the way its nervous system interprets the blurred images as a depth map. The team developed an AI-powered version of that, too. Data from the metalens feeds into a custom algorithm that compares the split images. It can then generate a real-time depth map that tells you how far away your target is. Like the vision processing of the jumping spider, this process is highly efficient. You don’t need any bulky sensors or powerful CPUs to generate the distance map. The metalens sensor used in the experiment is only three millimeters across.
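The article doesn't publish the team's algorithm, but the core idea of depth from defocus can be illustrated with a minimal sketch: given two views of the same scene with different amounts of blur, regions that stay sharp in the less-defocused view are closer to that focal plane. The sketch below (in Python with NumPy, using a Laplacian response as the focus measure and a box blur as a crude defocus model, both assumptions, not the researchers' method) turns that comparison into a per-pixel depth cue.

```python
import numpy as np

def focus_measure(img):
    """Per-pixel focus measure: absolute response of a 4-neighbor Laplacian."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return np.abs(lap)

def box_blur(img, passes):
    """Crude stand-in for optical defocus: repeated 5-point averaging."""
    for _ in range(passes):
        img = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    return img

def relative_depth(view_a, view_b, eps=1e-9):
    """Depth cue in [0, 1]: closer to 1 where view_a is the sharper image."""
    s_a = focus_measure(view_a)
    s_b = focus_measure(view_b)
    return s_a / (s_a + s_b + eps)

# Synthetic textured scene standing in for the photosensor data
rng = np.random.default_rng(0)
scene = rng.random((64, 64))

# Simulate the two split images: one mildly and one heavily defocused
near_view = box_blur(scene, 1)
far_view = box_blur(scene, 4)

depth = relative_depth(near_view, far_view)
print(depth.mean())  # averages above 0.5: the scene is sharper in near_view
```

A real implementation would use a calibrated blur model and smoothing to turn this raw cue into a metric depth map, but the comparison of two defocused images is the essence of what the spider's retina, and the metalens, provide.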

The researchers see potential for metalens depth sensing in self-driving cars and robots. Rather than a few cameras spread around a vehicle feeding complex depth-mapping algorithms, a larger number of tiny metalenses could quickly and easily tell the computer how far away everything is. The technology could also come to phones in the future, replacing bulky multi-sensor 3D platforms like Apple's Face ID and Google's Face Match.
