When one considers the types of equipment shipped up to the International Space Station, there’s a certain list of default equipment that makes sense. Spare parts? Check. Food and medical supplies? Check. Some manner of entertainment options? Check. Eleven-pound robot head? Not so much. But IBM has visions of how such hardware might be useful, and it’s developed such a device — the Crew Interactive Mobile Companion (CIMON for short).
Meet Wilson, Watson, Cimon
Cimon is technically powered by what IBM calls “Watson” technology, but it has a unique mission on the ISS. It will work with German astronaut Alexander Gerst to run some crystal experiments, solve a Rubik’s Cube, and conduct a “complex medical experiment” using Cimon as a flying camera.
All of this sounds fairly rote, but some of the other functions are more interesting. Cimon is also intended to serve as a colleague to on-board astronauts, including working through prescribed checklists in coordination with its "teammates." There's also talk of Cimon improving safety by warning of impending failures before astronauts might spot them on a control board.
Here’s how IBM describes Cimon’s ability to learn:
AI gives the space assistant text, speech and image processing capabilities, as well as the ability to retrieve specific information and findings. These skills, which can be trained individually and deepened in the context of a given assignment, are developed based on the principle of understanding – reasoning – learning.
Watson speech and vision technologies helped train CIMON to recognize Alexander Gerst, using voice samples and images of Gerst, as well as "non-Gerst" images. It also used the Watson Visual Recognition service to learn the construction plans of the Columbus module on the International Space Station so it can easily move around. CIMON also learned all the procedures needed to help carry out the on-board experiments. Experiments sometimes consist of more than 100 different steps, and CIMON knows them all.
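The Gerst versus "non-Gerst" training IBM describes is, at its core, a standard binary image-classification setup: learn from labeled positive and negative examples, then assign new inputs to the closer class. Here is a minimal, self-contained sketch of that idea using a nearest-centroid classifier over toy feature vectors. This is not IBM's actual Watson pipeline; the 2-D vectors stand in for the embeddings a real vision service would extract from images.

```python
# Minimal sketch of binary classification in the spirit of the
# Gerst / "non-Gerst" training described above. NOT the Watson
# pipeline; toy 2-D vectors stand in for real image embeddings.
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(gerst_samples, non_gerst_samples):
    """Summarize each class by its centroid."""
    return {"gerst": centroid(gerst_samples),
            "non-gerst": centroid(non_gerst_samples)}

def classify(model, sample):
    """Label a sample by its nearest class centroid."""
    return min(model, key=lambda label: distance(model[label], sample))

# Toy training data: one cluster per class.
gerst = [[1.0, 1.1], [0.9, 1.0], [1.1, 0.9]]
non_gerst = [[-1.0, -0.9], [-1.1, -1.0], [-0.9, -1.1]]
model = train(gerst, non_gerst)
print(classify(model, [0.95, 1.05]))  # near the "gerst" cluster
```

A production system would replace the centroid step with a trained neural network, but the train-on-labeled-examples, classify-new-inputs loop is the same.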
Cimon is a long way off from demonstrating what experts refer to as "strong" AI — progress in that field hasn't really budged, despite the widespread adoption of AI as a marketing term — but we think there's real potential for advances here.
One area where AI advances could truly revolutionize human capabilities is in the exploration of the solar system. Human exploration of objects beyond the moon is a difficult problem for many reasons, but the need to protect and preserve human life across an interplanetary journey ranging from months to years is one of the most significant issues. Most of NASA's greatest exploration breakthroughs have been delivered by satellites and rovers controlled from Earth, but these systems have clear limits. Communications have to traverse at least three "hops" — telemetry from the vehicle to Earth, a command from Earth back to the vehicle, and a confirmation from the vehicle to Earth again — before NASA or one of the other space agencies knows that a specific action has completed successfully.
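The cost of those hops is easy to put in numbers. A quick back-of-the-envelope calculation of light-speed delay over Earth-Mars distances shows why round-trip command loops are so slow; the distances used here are approximate published Earth-Mars ranges.

```python
# Back-of-the-envelope signal delay for the three-hop exchange
# described above (telemetry down, command up, confirmation down).
SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_s(distance_km):
    """Seconds for a radio signal to cover distance_km at light speed."""
    return distance_km / SPEED_OF_LIGHT_KM_S

# Earth-Mars distance varies from roughly 54.6 million km at closest
# approach to about 401 million km with Mars on the far side of the Sun.
for label, distance_km in [("closest", 54.6e6),
                           ("average", 225e6),
                           ("farthest", 401e6)]:
    one_way = one_way_delay_s(distance_km)
    three_hops = 3 * one_way
    print(f"{label}: one-way {one_way / 60:.1f} min, "
          f"three hops {three_hops / 60:.1f} min")
```

At an average Earth-Mars distance the three-hop exchange alone takes over half an hour — before anyone on the ground has even looked at the data.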
The idea of a rover or satellite as intelligent as, say, R2-D2, is still entirely within the realm of science fiction. But it’s not crazy to imagine a successor to Curiosity that’s equipped with the sensors and tools it needs to reach its own conclusions about which rock faces might be optimal for drilling, or which patches of ground might yield revealing soil samples. Initially, of course, these conclusions would be checked and triple-checked by scientists on the ground. But over time, as the probe proved itself, it might be possible to support the mission with fewer personnel, allowing for other projects to be pursued.
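On-board target selection of the sort imagined above could start out as something quite simple: score each candidate site from sensor-derived features and rank the results for ground review. The sketch below illustrates that idea; the feature names, weights, and site data are all invented for illustration, not drawn from any real mission software.

```python
# Hypothetical sketch of on-board target ranking: score candidate
# drill sites by a weighted sum of (invented) sensor-derived
# features, highest score first. A real probe would use far richer
# models and would still defer final calls to ground teams at first.
SITE_WEIGHTS = {
    "mineral_signal": 0.5,    # spectrometer hint of interesting composition
    "surface_flatness": 0.3,  # safer, more reliable drilling
    "reachability": 0.2,      # within the arm's work envelope
}

def score(site):
    """Weighted sum of a site's feature values."""
    return sum(SITE_WEIGHTS[k] * site[k] for k in SITE_WEIGHTS)

def rank_sites(sites):
    """Candidates sorted best-first by score."""
    return sorted(sites, key=score, reverse=True)

candidates = [
    {"name": "rock_face_a", "mineral_signal": 0.9,
     "surface_flatness": 0.4, "reachability": 0.8},
    {"name": "rock_face_b", "mineral_signal": 0.6,
     "surface_flatness": 0.9, "reachability": 0.9},
    {"name": "soil_patch_c", "mineral_signal": 0.7,
     "surface_flatness": 0.8, "reachability": 0.3},
]
for site in rank_sites(candidates):
    print(site["name"], round(score(site), 2))
```

The interesting engineering problem isn't the ranking itself but choosing and validating the features and weights — which is exactly where the scientist-in-the-loop checking described above would come in.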
Meanwhile, the inevitable adjustments made to each probe once it began its exploration could be folded back into improvements to future probes. While some improvements would obviously be mission-specific, changes to instrument data integration or core adjustments to boost reliability and stability over the long term could drive a virtuous cycle over time.
That’s an awful lot of weight to put on the initial deployment of a single robot head, but it’s not a crazy idea. Maybe in 15-20 years we’ll be fielding space probes with self-directed exploration capabilities, accompanied by modest human oversight.