Artificial intelligence is making major advances in healthcare, even providing psychotherapy and helping shorten the timeline for creating new drugs, science journalist David Pogue reported Sunday at the Purdue University Northwest Sinai Forum in Westville.
“It is in every aspect, every corner of the healthcare system,” he said.
“ChatGPT is now the single largest provider of mental health care in the United States,” Pogue said.
He offered a demonstration onstage, posing as someone who was deeply depressed. “If you want to talk to someone about it or you just need a little distraction, I’m here for you,” the chatbot responded.
“I’m especially depressed about AI. Is there anything I can do as a citizen to fight against AI?” Pogue prompted.
“Just stay involved and stay in conversations about how AI is used,” the bot responded.
Could you create a limerick about that? ChatGPT does so instantly. Translate it into Italian? No problem.
What happens if a patient tells a mental health chatbot he’s thinking about shooting up a mall or school, an audience member asked Pogue.
Human psychotherapists are required by law to report it to police, and Pogue expects the chatbot’s owner would follow the same law.
Wearable devices like Apple Watches and Fitbits also have a big potential upside for individuals.

“Quietly, these things have turned into massive biotech-absorbing gadgets,” Pogue said.
They see you when you’re sleeping, they know when you’re awake. They check your heart rate, shining LED light into your skin to read your pulse. When you’re sleeping, they detect tiny movements to determine which stage of sleep you’re in.
“Now it’s started to warn you about things that might be starting to happen,” Pogue said. That includes detecting anomalies like atrial fibrillation. “AFib leads to stroke, and that’s bad,” Pogue said. “It can literally tell you, we detect this. Go check it out.”
“This thing has saved so many lives, hundreds of people a day,” he said. “If you know, you can take blood thinners and you won’t get the problem.”
A month ago, Dr. Apple Watch introduced a sleep apnea warning, Pogue said. “All these warnings are on your Apple Watch right now,” he said, showing a slide with a long list of conditions the watch warns about.
Pogue shared a clip from a CBS News story he reported. Stanford researcher Michael Snyder is looking into how watches could detect diseases, including anemia and type 2 diabetes. “We’re looking into research into how it can detect cancer right now,” Snyder told Pogue.
A study found that smartwatches could detect COVID three days before recognizable symptoms appeared. Imagine if the government issued smartwatches to everyone, Pogue suggested.

Based on data the devices are already gathering, it should be possible to deliver highly personalized medicine, he said.
A watch could use pollen information to advise asthma patients to do their workouts inside that day. Or even note that your stress level goes up 65% when you visit Cincinnati, where your in-laws live.
AI chatbots outperformed physicians in diagnosing illness, a 2024 study found, with evaluators preferring ChatGPT’s responses in 78% of cases, Pogue said.
What about AI’s use in the pharmaceutical world? “This is the part that’s going to fry your brain,” Pogue said.
“Most of the world’s worst diseases all come from the same fundamental problem, and that is malformed proteins,” he said. “They help digest your food, they help carry your oxygen.”
Unlocking the structures of all 200 million known proteins was “the holy grail, the entire universe,” Pogue said. AI managed to do what humans couldn’t.
Google realized what a gold mine it had with these results. Instead of selling the data to pharmaceutical companies, though, Google made it open source, available to everyone, because the data could save so many lives.
It takes 15 years and billions of dollars to create a new drug. That’s why there aren’t drugs specifically designed to address orphan diseases that affect a small number of people. It just isn’t profitable for companies to do so.
“Suddenly, that’s plausible to do,” Pogue said.
A drug for lung disease is in its second round of clinical trials after being identified by AI just 13 months ago, he said. Prior to AI, that was unheard of.
“Really, AI in health care is the real story” that most journalists are missing, he said.
Another study Pogue cited involved the Massachusetts Institute of Technology scanning the lungs of 28,000 young, healthy patients. Five years later, they were scanned again, and a certain number of them had developed lung cancer. Researchers then tested whether AI could predict, from the initial scans alone, who would go on to develop the disease. AI was 95% accurate.
AI can be used to predict the likelihood of individuals developing Alzheimer’s and other diseases, Pogue said.
However, AI isn’t perfect.
“We already know there’s a data bias,” Pogue said. “80% of all electronic health records are for people of European descent. They’re white people.” The data for people with dark skin is so scant that it skews the information fed into the “black box” from which artificial intelligence draws its conclusions.
Geography is another barrier. “There are huge parts of the world where they don’t have high-speed internet,” Pogue said, so data from those areas isn’t feeding artificial intelligence.
Data privacy is another issue. “We have to worry about hackers. We have to worry about consent and whether your data is being sold” to feed into the black box, Pogue said. “Nobody gave permission for Google to take this data.”
Accountability is another issue. Software is sold to a hospital and used by a doctor. If something goes wrong with treatment based on the AI’s advice, who do you sue?
Then there’s the adoption gap, in which some people simply don’t trust AI’s involvement in healthcare. “The doctors themselves don’t know how it works,” he said, which isn’t reassuring to those patients.
But then, nobody knows exactly how AI works, not even OpenAI co-founder Sam Altman, Pogue said.
AI has other potential pitfalls outside the healthcare field, with AI summaries in Google search results sometimes offering false information, such as suggesting that adding nontoxic glue to the sauce can help cheese stick to a pizza.
Another fear Pogue has is that deepfake videos will be used to skew election results, with candidates made to appear to say things they never said and voters unable to discern what’s real and what isn’t.
But for healthcare, so far, the results are promising.
“AI is reshaping the world as we know it. Healthcare is no exception,” PNW Sinai Forum Executive Director Leslie Plesac said.
Doug Ross is a freelance reporter for the Post-Tribune.
