Erbil family denies reports their son died after using AI to recommend medication

The New Region

Jun. 09, 2025 • 2 min read
Photo: The family of Zakaria Salman (pictured) said that their son's untimely death was due to his blood pressure issues, adding they are "twice as anguished" by inaccurate media reports implicating AI as a factor. Graphic: The New Region

The death of Zakaria Salman, a young man with chronic blood pressure issues, led to media reports that he had consulted an artificial intelligence interface and received faulty medication recommendations, a claim his family dismissed as "misleading and baseless."

ERBIL, Kurdistan Region of Iraq - A family in Erbil dismissed media reports that their son had died after consulting artificial intelligence (AI) for medication recommendations when he fell ill.


Zakaria Salman was a second-year college student who suffered from chronic low blood pressure. His father, Salman Khidir, told The New Region that media reports suggesting Zakaria had died from taking the wrong medications recommended by an AI interface were false.


“We are very saddened by many reports by media outlets. We are in a state of anguish. When we see all of these misleading and baseless news stories published about my son, we are twice as anguished,” the distraught father said.


The father attributed his son's passing to Zakaria's chronically low blood pressure.


"My son had been suffering from low blood pressure since childhood. His teachers at school used to inform me all the time that my son’s blood pressure had plummeted and that he needed treatment,” the father said.


He went on to recall the day his son finally succumbed to his illness. “This time, like many other times in the past, his blood pressure dropped, and he died. All the rumors that he died after consulting artificial intelligence are untrue.”


Saman Abdulla of Salahaddin University’s media office, for his part, also denied claims that AI had anything to do with Zakaria’s death.


“We would like to confirm that the student has not died due to taking medicines based on advice from AI,” Abdulla said.


Despite rapid advances in AI in recent years, the technology remains prone to a phenomenon known as hallucination, in which it presents inaccurate information as fact. Most AI firms warn against using their products for medical advice, diagnosis, or treatment.

