
By MICHAEL MILLENSON
“Dr. Google,” the nickname for the search engine that answers hundreds of millions of health questions every day, has begun including advice from the general public in some of its answers. The “What People Suggest” feature, presented as a response to user demand, comes at a pivotal point for traditional web search amid the growing popularity of artificial intelligence-enabled chatbots such as ChatGPT.
The new feature, currently available only to U.S. mobile users, is populated with content culled, analyzed and filtered from online discussions at sites such as Reddit, Quora and X. Though Google says the information will be “credible and relevant,” an obvious concern is whether an algorithm whose raw material is online opinion could end up as a global super-spreader of advice that’s wrong or even dangerous. What happens if someone is searching for alternative treatments for cancer or wondering whether vitamin A can prevent measles?
In a wide-ranging interview, I posed those and other questions to Dr. Michael Howell, Google’s chief clinical officer. Howell explained why Google initiated the feature and how the company intends to ensure its helpfulness and accuracy. Although he framed the feature within the context of the company’s long-standing mission to “organize the world’s information and make it universally accessible and useful,” the increasing competitive pressure on Google Search in the artificial intelligence era, particularly for a topic that generates billions of dollars in Search-related revenue from sponsored links and ads, hovered inescapably in the background.
Weeding Out Harm
Howell joined Google in 2017 from University of Chicago Medicine, where he served as chief quality officer. Before that, he was a rising star in the Harvard system thanks to his work as both a researcher and a front-line leader in using the science of health care delivery to improve care quality and safety. When Howell speaks of consumer searches related to chronic conditions like diabetes and asthma or more serious issues such as blood clots in the lung – he’s a pulmonologist and intensivist – he does so with the passion of a patient care veteran and of someone who’s served as a resource when illness strikes friends and family.
“People want authoritative information, but they also want the lived experience of other people,” Howell said. “We want to help them find that information as easily as possible.”
He added, “It’s a mistake to say that the only thing we should do to help people find high-quality information is to weed out misinformation. Think about making a garden. If all you did was weed things, you’d have a patch of dirt.”
That’s true, but it’s also true that if you do a poor job of weeding, the weeds that remain can harm or even kill your plants. And the stakes involved in weeding out bad health information and helping good advice flourish are far higher than in horticulture.
Google’s weeder-wielding work starts with digging out the searches that shouldn’t surface the feature in the first place. Even for U.S. mobile users, the target of the initial rollout, not every query will prompt a What People Suggest response. The information has to be judged helpful and safe.
If someone’s looking for answers about a heart attack, for example, the feature doesn’t trigger, since the query could signal an emergency.