
HealthDay In-Depth
The AI Revolution: For Patients, Promise and Challenges Ahead

HealthDay News
By Anne Harding
HealthDay Reporter
Updated: Jun 10th 2020


WEDNESDAY, June 10, 2020 (HealthDay News) -- Streaks of color swirl through a pulsing, black-and-white image of a patient's heart. They represent blood, and they're color-coded based on speed: turquoise and green for the fastest flow, yellow and red for the slowest.

This real-time video, which can be rotated and viewed from any angle, allows doctors to spot problems like a leaky heart valve or a failing surgical repair with unprecedented speed. And artificial intelligence (AI) imaging technology made it possible.

"It's quite simple, it's like a video game," said Dr. Albert Hsiao, an associate professor of radiology at the University of California, San Diego, who developed the technology while a medical resident at Stanford University.

There's a lot going on behind the scenes to support this simplicity. Each 10-minute scan produces 2 to 10 gigabytes of data. To handle such huge, complicated data sets, Hsiao and his colleagues at Arterys, the company he helped found in 2012 to develop the technology, decided to build the infrastructure in the cloud, where other researchers can access it.

And now, investigators around the world are using this cloud-based infrastructure to share and test medical AI imaging models in the Arterys Marketplace. "We've made it almost as easy to get medical AI online as to upload a YouTube video," said Arterys product strategy manager Christian Ulstrup.

Arterys decided to open up its $50 million platform to all comers -- a move that raised eyebrows in the competitive world of health care and medicine -- because the company concluded that the technology's full potential to transform medicine couldn't be realized without collaboration from others, Ulstrup explained.

"There are all these brilliant researchers, startup founders and individual developers who are working with machine learning models with the data they find online," Ulstrup explained. "The thing that's really heartbreaking is most of these models that could be used to meet unmet clinical needs end up dying on hard drives. We're just trying to connect these people who don't really have a communication channel."


Artificial intelligence -- basically, computer programs or machines that can learn -- has the potential to open up access to health care, improve health care quality and even reduce costs, but it also carries real risks. AI tools have to be "trained" with huge quantities of high-quality data, and to be useful they have to be robust enough to work in any setting. And using AI that is trained on biased data could harm patients.

AI as 'double-edged sword'

"It's very important that we start looking at the unconscious biases in the data to make sure that we don't hardwire discriminatory recommendations," said Dr. Kevin Johnson, chair of biomedical informatics at Vanderbilt University Medical Center in Nashville. He prefers the term "augmented intelligence" to "artificial intelligence," since AI aims to extend the abilities of clinicians, not to steal their jobs.

One key application of AI in health care will be to identify patients who are at risk of poor outcomes, but such predictions are worse than useless if doctors don't know how to prevent those outcomes, or the resources aren't available to help patients, Johnson added. "We don't have the work force who plays the role of the catcher's mitt" and can step in and help these at-risk patients, Johnson noted, especially in a health care system now stretched to the limit by the coronavirus pandemic.

"I think we have to think creatively about how we restructure the system to support some of the outcomes that are of interest to us," he said.

Dr. Ravi Parikh is an instructor in medical ethics and health policy at the Perelman School of Medicine at the University of Pennsylvania in Philadelphia. He pointed out that "AI and machine learning are sort of a double-edged sword, particularly in my field of oncology."

AI has proven its potential for interpreting images, for example diagnosing lung cancer from a CT scan, but when it comes to using AI for supporting clinical decisions, like whether this patient should have chemo or that patient should go to the hospital, there's a risk that it won't help patients or could even be harmful, Parikh noted.

"Even though you might have an AI that's accurate on the whole, if it's mischaracterizing an outcome for a specific group of patients you really have to question whether it's worth it," he said.
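Parikh's point can be made concrete with a little arithmetic: a model's overall accuracy is a weighted average across patient groups, so a large, well-served group can mask a badly-served one. The sketch below uses entirely made-up numbers (the groups, counts, and error rates are hypothetical, not from any real system) to show how an 85% "accurate" model can be wrong more often than right for one subgroup.

```python
# Hypothetical illustration: aggregate accuracy can hide subgroup failure.
# All numbers below are invented for demonstration purposes.

def accuracy(pairs):
    """Fraction of (prediction, truth) pairs where the model was correct."""
    return sum(pred == truth for pred, truth in pairs) / len(pairs)

# (prediction, truth) pairs for two patient groups
group_a = [(1, 1)] * 90 + [(0, 1)] * 10   # 90 correct, 10 wrong
group_b = [(1, 1)] * 4 + [(0, 1)] * 6     # 4 correct, 6 wrong

overall = accuracy(group_a + group_b)
print(f"overall: {overall:.2f}")              # 0.85 -- looks acceptable
print(f"group A: {accuracy(group_a):.2f}")    # 0.90
print(f"group B: {accuracy(group_b):.2f}")    # 0.40 -- hidden failure
```

Reporting only the overall figure would pass a naive quality check while the model mischaracterizes most patients in group B, which is why Parikh argues for evaluating AI tools group by group, not just in aggregate.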

What's been missing in the development of health care AI, Parikh added, have been rigorous prospective studies to determine whether the technology is actually useful for patients.

Just as the U.S. Food and Drug Administration requires that drug companies run clinical trials to confirm that their product is safe and effective, Parikh said, the FDA should start requiring makers of AI tools to test their safety and effectiveness in humans.

And just as the agency tracks the safety of drugs once they reach the market, the FDA should set up frameworks to study whether the AI algorithms it approves are enforcing existing biases, he noted.

"We really need to start focusing on that as these things are making their way to the clinic," Parikh said.