About a year ago, IBM began prominently promoting Watson as a potential means of fighting cancer. IBM has been working on this technology for years, and by last summer felt confident enough to start advertising it. Follow-up reporting, however, indicates that IBM’s Watson advertisements may have been more than a little premature.
A report from Stat, based on internal IBM documents, describes "multiple examples of unsafe and incorrect treatment recommendations" from Watson. Some failure rate is to be expected in any project, particularly one that pairs a technology as new as AI with a disease as hideously complex as cancer. A full discussion of the types of cancer is far beyond the scope of this article, but one of the things that makes cancer so difficult to treat is that there isn't just one type of cancer; there isn't even just one type per organ. Organs can develop cancer in different ways, and different cancers have different characteristics, requiring different treatment plans.
In short: Cancer is fiendishly complicated. That’s probably why IBM wanted to announce that it’d put Watson to work on cancer research in the first place; it’s a textbook example of a “big problem” that AI could solve. But it may also explain why some of the doctors who have worked with Watson seem less-than-enamored of it.
“This product is a piece of shit,” Gizmodo reports a doctor at Florida’s Jupiter Hospital said to IBM, according to the documents reviewed by Stat. “We bought it for marketing and with hopes that you would achieve the vision. We can’t use it for most cases.”
The gap between the advertising and the reality suggests that IBM's marketing department either did its job very well or not nearly well enough (where you come down on that question may depend on whether your email address ends in @ibm.com). Gizmodo has previously reported on the growing divide between IBM's rhetoric and Watson's actual capabilities, and these latest reports lend that critique considerable weight.
One critical point to be made here is that Watson isn't treating patients on its own. It's used to offer potential feedback and is available for doctors to consult when they're stymied by a problem or potential issue. The specific error in question was Watson's recommendation of bevacizumab for a lung cancer patient with a history of severe bleeding who was being treated with chemotherapy. Bevacizumab (brand name: Avastin) is known to cause bleeding as a side effect and is only used in some lung cancer patients. It was definitely not called for in this case.
Part of the problem may lie in how the system was trained. Instead of being trained on real patient cases, Watson was reportedly trained on hypothetical scenarios. But this could just as easily be a dodge. Cancer treatment is hard, and any system in which a doctor (human or artificial) must weigh data to arrive at an appropriate treatment is going to be wrong some of the time. The real questions here are how often Watson is wrong, and how much success IBM is having in training the system to be right.
It is not particularly surprising that Watson, an AI that's existed for a handful of years, has thus far made little progress against a disease that has plagued mankind since our earliest existence. There's going to be a learning curve, and it's going to be steep. And let's be honest: cancer treatment is probably the wrong place for a company to look for a big splash. Google's ability to make a restaurant reservation with an AI is a more practical near-term goal than building an AI that can flawlessly detect or treat cancer, even if the latter is far more valuable in terms of its impact on human life. AI could still revolutionize medicine, but given the complexities of the human body, it's not surprising that unlocking that potential is taking a while.