A new study suggests that ChatGPT can diagnose patients as well as a trained doctor can.

ChatGPT could help diagnose patients in future emergency rooms, according to a new pilot study that looked at how the large language model can be used to support doctors.

The research, published in the Annals of Emergency Medicine, concluded that the artificial intelligence (AI) chatbot diagnosed patients as well as trained doctors. The research will be presented at the European Congress of Emergency Medicine, which begins this weekend.

Researchers at Jeroen Bosch Hospital in the Netherlands fed doctors' notes and anonymized information about 30 patients, including exams, symptoms and laboratory results, into two versions of ChatGPT.

They found an overlap of around 60% between the emergency doctors' shortlists of possible diagnoses and the chatbot's.

"We found that ChatGPT performed well in generating a list of likely diagnoses and suggesting the most likely option," said study author Dr. Hidde ten Berg, from the department of emergency medicine at Jeroen Bosch Hospital, in a statement.

"We also found a lot of overlap with doctors' lists of likely diagnoses. In simple terms, this indicates that ChatGPT was able to suggest medical diagnoses just as a human doctor would."

Emergency physicians had the correct diagnosis in their top five lists 87% of the time, while ChatGPT version 3.5 had the correct diagnosis in its shortlist 97% of the time, compared with 87% for ChatGPT version 4.0.


This tool is not a medical device

The research was a proof of concept: it was not designed to affect patient care, but rather to test the potential and feasibility of using generative AI for diagnosis.

But the tool is not yet available for clinical use.

"One of the problems, at least in Europe, is the fact that legislation is very strict," study author Steef Kurstjens, from the department of clinical chemistry and hematology at Jeroen Bosch Hospital, told Euronews Next.

"These types of tools are not medical devices. If we use them to affect patient care, we are using an instrument that is not a medical device as a medical device, which is not allowed. So I think new legislation is needed if we want to use this type of tool," he added.