
Robotic fingers that take a pulse and detect a lump.
Chatbots that generate more empathetic responses than trained clinicians.
Algorithms that predict the risk of developing heart disease or cancer.
Former Microsoft CEO and philanthropist Bill Gates insists artificial intelligence will replace doctors and teachers within the next decade. Rosanne Oggoian, DO, BS, a Chicago Medical School (CMS) assistant professor and clinical skills course director, disagrees.
“There’s nothing like having a human who has some life experiences make a connection with you and say, ‘I am so sorry you’re going through that,’ or ‘Tell me more about that,’” Dr. Oggoian said. “The human connection opens up communication channels — something AI, at least at this point, cannot do.”
Scans and tests can’t replace the hands-on physical exam, the look in the eye, the hand on the shoulder, all fundamental to the practice of medicine.
“Patients expect their doctor to examine them, talk to them, listen to them — and they expect them to touch them, too,” Dr. Oggoian said. “Physical exams promote trust in the patient–doctor relationship. You can’t replace that human touch.”
But AI-powered brains and fingers are moving fast to try. Researchers at the University of Science and Technology of China have created a robotic “finger” capable of tactile sensing and safely performing routine elements of the physical exam. But is replacing human touch with AI really the point?
As a student at CMS, Lalima A. Hoq, MD ’97, MPH, FACP, learned from textbooks. When she entered practice, she used paper charts. Today, she’s an informatics physician. She is medical director of wellness informatics for the Cedars-Sinai Medical Network in Los Angeles, a role akin to that of a technology therapist.
“I help practicing clinicians learn how to make technology work for them,” Dr. Hoq said. “I help them use technology very intentionally as part of their workflow to balance both their human and professional wellness and to preserve the core of medical practice, which is the service of humanity.”
AI tools can free up clinicians to see patients in a smarter way, said Dr. Hoq, who uses AI applications to “take away the silly work” — clerical tasks that “don’t help you be a better doctor.”
Studies show that while the average length of a primary-care visit has held steady for decades, at roughly 13 to 24 minutes, the time doctors spend on clerical work and documentation in the electronic medical record (EMR) has increased dramatically.
“AI is the first sliver of hope for reducing that burden so we can do the thing the very dynamic brain can do better than a computer can do, which is forming a true, lasting connection with a person,” Dr. Hoq said. “That person understands that you care about them — not their numbers or their outcomes or anything else but them as a person.”
Ambient voice technology can summarize, frame and chart progress notes in the EMR following a patient exam. AI algorithms can improve diagnostic accuracy and reduce human error. Machine learning models can identify early signs of disease. Dr. Hoq, an internist, embraces these tech advances in support of improving patient outcomes.
“I’ve been practicing for 25 years,” she said. “That’s 25 years of seeing and recognizing diseases. But every person is different. AI draws from things that I might not see immediately because I don’t happen to look in the right way. AI doesn’t have that experience bias.
“But I can connect with the person who’s sitting in front of me, because I know them at an intellectual and emotional level and through our shared experience. That’s a relationship that AI can’t replicate or reproduce.”
At least not yet. Researchers are exploring the field of “Emotion AI,” also called affective computing, the brainchild of an MIT researcher and a field that dates back to 1995. The goal is to improve human–machine interactions through AI technologies that can understand and respond to human needs and emotions — and potentially simulate human emotions.
Health-professions educators are grappling with concerns that increasing reliance on technology and digital tools will erode the focus on empathy, active listening and the careful use of senses, including touch, so crucial to the physical exam and doctor–patient relationship. Cory Krebsbach, BFA, CHSE, director of simulation programming for RFU’s North Chicago campus, doesn’t think that is happening. When EMRs were introduced in clinical skills training, some feared students would be distracted. They weren’t. Comfortable with technology, members of the digital generation shared scans and X-rays and other content from simulated EMRs to educate their patients.
“They knew that their greatest resource was the patient in the room,” Mr. Krebsbach said. “There’s no piece of technology that could ever replace human-to-human communication and interaction. The future is in finding a way to leverage technology to enhance interpersonal communication.”
Students and clinicians are using simulation, including high-fidelity manikins and virtual reality (VR), to practice procedures and decision-making in a safe and controlled environment. An early adopter of simulation technology, Rosalind Franklin University also uses a human modality of simulation training: standardized patients (SPs), or patient actors, who follow scripted scenarios to help train and evaluate medical students in clinical and physical exam skills.
Mr. Krebsbach, a former SP who worked at healthcare institutions throughout Chicagoland, likes hybrid simulations that marry machine and human. SPs might be outfitted with a tracheotomy task trainer that provides students with immediate sensory feedback. Students can use a SimScope — a stethoscope with electrodes or sensors programmed with different heart sounds. Programmed sensors can also be built into the clothing of SPs. Students hear the telltale signs of tachycardia, but they’re still able to interact on emotional and social levels with an SP, as opposed to controlled interactions with a manikin, which can be programmed to respond to students through a remote audio device.
Mr. Krebsbach, who admits to skepticism over whether technology can replace the highly responsive training provided by SPs, recently tried out VR simulation-based learning when he donned an Oculus headset.
“I was bedside with a patient who was having trouble breathing,” he said. “I could use the controls to pick up an oxygen mask or a stethoscope. The patient’s spouse on the other side of the bed was growing agitated. I could speak directly to the avatar and its responses were very quick — as long as I said the right thing. But is it able to capture all of the nuances of emotions? That’s where you’re not getting that true authentic sort of interaction you would get from a human being.”
VR is good at teaching skills. Follow the steps for diagnosing an asthma attack, and the user is quickly assessed and graded. But it is not yet able to offer the kind of human empathy and compassion Mr. Krebsbach experienced as an SP.
In one such encounter, a student delivered a devastating diagnosis to Mr. Krebsbach, “the patient,” who learned he had end-stage pancreatic cancer.
“I’ll never forget the student,” Mr. Krebsbach said. “She was very patient and gave me space. She allowed me to process in that moment. She came up with a plan. And then she asked if she could give me a hug. It was such a human thing to do.”