According to the director of Health Innovation Programs at Arizona State University's College of Nursing and Health Innovation, the integration of artificial intelligence into health care is inevitable.
“AI has been around for quite a while, but its ability to be effective in the health care arena is just beginning,” said Rick Hall, who also serves as a clinical professor and director of the recently established health- and wellness-centric entrepreneurial HEALab.
Chris Yoo, HEALab advisory board member and founder of Phoenix-based health care data company Systems Imagination, agrees with Hall and saw the potential to capitalize on the abundance of local experts in the field — both within ASU and the local community — by holding a pop-up event focusing on the burgeoning role of AI in health care today.
In association with Systems Imagination, HEALab will host “Artificial Intelligence: On the Edge of Health Innovation,” from 11 a.m. to 5 p.m. Friday, April 6, at the A.E. England Building on ASU’s Downtown Phoenix campus.
“Arizona and ASU have all of the same raw materials and expertise that any other place — like Boston or San Francisco — that promotes itself as a big biotech hub does,” Yoo said. “We hope this event will help people see that and build on the type of community that will put Arizona on the map as a place where bioscience and AI are happening.”
Instead of longer-form, conference-type panels, Friday’s “pop-up” event will feature shorter, 30-minute talks by speakers on various aspects of AI in health care, such as policy application, privacy and ethics. The event is free for ASU faculty, staff and students, and $15 for members of the public.
College of Nursing and Health Innovation Clinical Assistant Professor Heather Ross will be speaking at the event about concerns related to developing a workforce to manage ubiquitous health care monitoring as well as algorithmic bias. She is part of a research group at ASU that is testing a wearable device that aims to reduce hospital readmission rates of heart-failure patients by detecting pre-symptom signs of heart health-related incidents.
“I’m super excited that they’re putting on this event because there’s so much exciting work being done on AI throughout the many schools and colleges at ASU and beyond,” Ross said.
ASU Now sat down with Ross, Hall and Yoo to get a crash course on AI health care ahead of Friday’s event.
Editor’s note: Responses have been edited for length and clarity.
Question: How are we seeing artificial intelligence being incorporated into health care?
Rick Hall: We’re going to start seeing big data used a lot more for diagnoses. I read an article about a year ago talking about AI and robotics, and the top five jobs they’ll affect the most. One of the top five was doctors. I don’t think AI will replace doctors, but certainly it could make diagnoses more efficient: how quickly, and potentially how accurately, we can diagnose. Because essentially what doctors are doing is running algorithms themselves to diagnose patients.
Chris Yoo: Systems Imagination Inc. is a bootstrapped company that started about four-and-a-half years ago knowing that there was a problem in the health care sciences industry about what to do with all this data being generated. My cofounder had a theory about designing a system based on what the human mind does really well, which is to imagine. So we built a computational model around that concept, and did some technology development to create computer systems that could think on their own, that could imagine (that’s where our company name comes from). Now we help institutions dealing with data overload identify insights in that data.
Q: What are the benefits of integrating AI into health care?
Heather Ross: The technologies we have available in health care right now are generating so much data, there’s absolutely no way a human can cope with it all. Machine learning (I prefer the term “machine learning” to “AI”; it’s not as sexy but it’s basically the same thing) is enabling us to sift through that information that we can now measure from people — whether through wearable devices or implantable devices or genome testing; any number of things — and make sense of it so we can say, “Ah, there’s a problem here or a trend here that could be connected to a problem.” In some cases, before it even becomes a problem.
One of the things we know about congestive heart failure is that before patients experience symptoms, such as fatigue or puffy ankles, their resting heart rate will start to go up. So monitors that use machine learning and artificial intelligence see these pre-symptom signs of worsening heart failure and alert patients or health care providers so they can take action and make medication changes before it becomes a problem. I think that’s one of the biggest opportunities of machine learning, particularly for managing chronic conditions.
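The trend detection Ross describes — watching for a sustained rise in resting heart rate before symptoms appear — can be illustrated with a simple baseline-plus-threshold rule. This is a minimal hypothetical sketch, not the algorithm used by any actual device; the window sizes and the 7 bpm threshold are made-up illustrative values.

```python
from statistics import mean

def flag_rising_resting_hr(readings, baseline_days=14, recent_days=3, threshold_bpm=7):
    """Flag a sustained rise in resting heart rate.

    readings: daily resting heart rates (beats per minute), oldest first.
    Compares the average of the most recent days against a longer baseline
    window; a stand-in for the pre-symptom trend detection described above.
    """
    if len(readings) < baseline_days + recent_days:
        return False  # not enough history to establish a baseline
    baseline = mean(readings[-(baseline_days + recent_days):-recent_days])
    recent = mean(readings[-recent_days:])
    return recent - baseline >= threshold_bpm

# Example: two weeks around 62 bpm, then three days trending upward
history = [62] * 14 + [70, 72, 71]
print(flag_rising_resting_hr(history))  # True: recent average well above baseline
```

A real monitor would use a learned model rather than a fixed threshold, but the structure is the same: establish what is normal for this patient, then alert on deviations before symptoms emerge.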
Yoo: In my view, one of the greatest things about medicine today is the ability to collectively come up with an answer, this so-called “collective wisdom,” where many experts come together and come up with a solution. It works really well with complex diseases. If AI can help accelerate the way we collectively come up with answers to problems, that will help speed up and make better health care more accessible to everyone. AI has the capability to democratize health care information, and I think that is very valuable.
Q: What are some of the challenges of integrating AI into health care?
Ross: One of the challenges has to do with the ways that we deliver health care services and the workforce that it takes to get these types of ubiquitous monitoring technologies implemented into the existing health care system and patterns of how we deliver health care. It takes time to educate patients about how to use these technologies appropriately, and it also raises a lot of concerns for clinics as to who is going to take on the responsibility of monitoring these devices. Another potential problem is algorithmic bias. Every machine learning algorithm is written by a human, and every human approaches algorithm writing with some kind of bias, whether intentional or unintentional. So it may focus on things that aren’t really that important or it may miss things that are really important.
Yoo: One of the main challenges is how do you know AI is right? We need to be able to trust in the answers that AI gives us with health care data. Throughout history, we’ve always trusted doctors to tell us something’s wrong and here’s how to fix it. But when a computer does it, how do you trust it? That’s part of the validation that all of us as an industry need to do.
Hall: Some really bright minds are really afraid of what AI could do in the future, people like Elon Musk. He has said that he believes AI is dangerous. People were afraid of drones not that many years ago. There were filibusters on the Senate floor trying to avoid having drones used in policing technologies and things like that. So there’s a real fear of the unknown and how technology and robots and AI could get out of hand — and that’s a healthy fear, to be honest.
I think part of the fear of AI is related to privacy, and in the health care sector, particularly the sharing of data and information involving medical care. And if we get to a point where we’re utilizing technology to help make diagnostic decisions, certainly we’re removing the human element to that, which can be scary to a lot of people. What we need to be doing as we’re creating these technologies and working on related policy is accounting for that missing human element and figuring out how we make provisions for that.
Q: How do we address those fears?
Hall: I think some of these fears will go away over time because technology integration is already happening in our daily lives. For example, I was texting with friends trying to discuss where and when to have coffee. Once we agreed on all that, I went to my calendar to put it in. I typed in one letter and it immediately filled out the rest. It was a little bit scary at first, but over time I got a little more comfortable with it because it became a convenience. And I think over time we’ll begin to consider certain technologies as necessary. At ASU, we already have dorms built with Alexa in them. The generations that are coming are going to be more comfortable with it, because it’s part of their regular lives. We just need to figure out how to work through those fears and make sure we’re creating the necessary protocols and safeguards.
Q: The cardiologist Dr. Eric Topol visited ASU last March to deliver the McKenna Lecture about digital health care. He talked about virtual doctor visits, bandages that measure vital signs and smartphone apps that diagnose diseases. How far away are we from those kinds of technologies becoming the norm in health care?
Ross: We’re very close, meaning that there are devices that are being tested right now. In a very low-tech way, we already have things like forehead monitors that read your temperature, which have been available for years. Blood pressure and respiration monitors are coming very, very soon. Half of it is here and half of it is being tested, but it’s right around the corner. A startup company called HealthTell that grew out of ASU research is already looking at technology that has the potential to enable health monitoring using a single drop of blood. It’s still years away from being commercially available, but it’s being actively developed right now.
Hall: We’re already carrying and wearing devices on a regular basis that are used for health in some ways. I recently heard of a woman who found out she was pregnant through her Fitbit. She noticed her resting heart rate was higher and asked about it on a Fitbit web forum, where a nurse told her it was possible she could be pregnant. I think that kind of thing will probably start happening more often if we have a regular understanding of what our vitals are — things like blood pressure and heart rate, which are measurable but we don’t check on a regular basis. But our smart devices can tell us those measurements easily.