We’ve Almost Gotten Full-Color Night Vision to Work

Scientists at the University of California, Irvine, have experimented with reconstructing night-vision scenes in color using a deep learning algorithm. The algorithm works from infrared images invisible to the naked eye: humans can only see light waves from about 400 nanometers (what we see as violet) to 700 nanometers (red), while infrared devices can detect wavelengths up to one millimeter. Infrared is therefore an essential component of night vision technology, as it allows humans to “see” in what we would normally perceive as total darkness.

Though thermal imaging has previously been used to color scenes captured in infrared, it has shortcomings of its own. Thermal imaging uses a technique called pseudocolor to “map” each shade on a monochromatic scale to a color, which results in a helpful yet highly unrealistic image. It doesn’t solve the problem of identifying objects and individuals in low- or no-light conditions.
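The pseudocolor idea is just a lookup: each monochrome intensity is mapped to a fixed color on a palette. A minimal sketch, using a hypothetical blue-to-red ramp rather than any real thermal camera's palette (commercial devices ship richer maps such as ironbow), might look like this:

```python
# Toy pseudocolor mapping: look up each monochrome intensity in a fixed
# color ramp. The blue-to-red blend here is an illustrative palette, not
# one used by any particular thermal imager.

def pseudocolor(intensity):
    """Map an 8-bit intensity (0-255) to an (R, G, B) tuple."""
    t = max(0, min(255, intensity)) / 255.0
    r = int(255 * t)        # hotter pixels trend toward red
    b = int(255 * (1 - t))  # cooler pixels trend toward blue
    return (r, 0, b)

def colorize(gray_image):
    """Apply the palette to a 2-D list of intensities."""
    return [[pseudocolor(v) for v in row] for row in gray_image]
```

Because every pixel with the same intensity gets the same palette color regardless of what the object actually looks like, the result is informative about temperature but unrealistic as a visual scene.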

The scientists at UC Irvine, on the other hand, sought to create a solution that would produce an image similar to what a human would see in visible-spectrum light. They used a monochromatic camera sensitive to visible and near-infrared light to photograph color palettes and faces. They then trained a convolutional neural network to predict visible-spectrum images using only the near-infrared images supplied. The training process resulted in three architectures: a baseline linear regression, a U-Net-inspired CNN (UNet), and an augmented U-Net (UNet-GAN), each of which was able to produce about three images per second.
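To make the baseline concrete: a linear regression for this task learns a direct linear map from near-infrared intensity to each visible color channel. The sketch below is a toy, pure-Python illustration of that idea on per-pixel values; the feature choices and data are hypothetical, and this is not the UC Irvine team's actual model.

```python
# Toy baseline: ordinary least squares predicting each visible color
# channel (R, G, B) from a single near-infrared intensity per pixel.
# Illustrative only; the published models operate on whole images.

def fit_channel(nir, channel):
    """Fit channel ≈ slope * nir + intercept by least squares."""
    n = len(nir)
    mean_x = sum(nir) / n
    mean_y = sum(channel) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(nir, channel))
    var = sum((x - mean_x) ** 2 for x in nir)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def fit_rgb_from_nir(nir, rgb_pixels):
    """One regression per color channel; returns three (slope, intercept) pairs."""
    return [fit_channel(nir, [p[c] for p in rgb_pixels]) for c in range(3)]

def predict(nir_value, params):
    """Map one NIR intensity to a predicted (R, G, B) tuple."""
    return tuple(s * nir_value + b for s, b in params)
```

A per-pixel linear map like this cannot use spatial context, which is why the convolutional U-Net variants, which see neighborhoods of pixels at once, were expected to produce more realistic colorizations.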

Once the neural network produced images in color, the team—made up of engineers, vision scientists, surgeons, computer scientists, and doctoral students—provided the images to graders, who selected which outputs subjectively appeared most similar to the ground truth image. This feedback helped the team select which neural network architecture was most effective, with UNet outperforming UNet-GAN except in zoomed-in conditions.

The team at UC Irvine published their findings in the journal PLOS ONE on Wednesday. They hope the technology can be applied in security, military operations, and animal observation, and suggest it could also help reduce vision damage during eye surgeries.
