CheXzero diagnoses disease without training on annotated X-rays
San Francisco, Sept. 20, 2022.
A new AI diagnostic tool can detect disease on chest X-rays using natural language descriptions from clinical reports.
The CheXzero model is self-supervised and did not require X-rays annotated by human radiologists, a process that can take hundreds of hours. In tests, it performed better than other self-supervised AI models and was about as accurate as human radiologists.
The model was developed by researchers at Harvard Medical School and Stanford University, who describe it in a recent report published in Nature Biomedical Engineering.
CheXzero was trained on chest X-rays paired with their associated radiology reports. During training, it learns to match each image with its text description, essentially recognizing how the unstructured text relates to the visual patterns in the image.
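The matching idea can be illustrated with a minimal sketch of zero-shot classification by image–text similarity. This is not CheXzero's actual code; the embedding vectors below are toy stand-ins for what the model's image and text encoders would produce, and the prompt texts are hypothetical examples.

```python
import numpy as np

def normalize(v):
    """Scale each vector to unit length so dot products act as cosine similarity."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Toy vectors standing in for encoder outputs (hypothetical values).
image_embedding = normalize(np.array([0.9, 0.1, 0.2]))
prompt_embeddings = normalize(np.array([
    [0.88, 0.12, 0.18],   # e.g. "a chest X-ray showing pneumonia"
    [0.10, 0.95, 0.05],   # e.g. "a normal chest X-ray"
]))

# Cosine similarity between the image and each text prompt; the
# highest-scoring prompt is taken as the zero-shot prediction.
scores = prompt_embeddings @ image_embedding
prediction = int(np.argmax(scores))
print(prediction)  # index of the best-matching prompt: 0
```

Because disease labels are expressed as plain-language prompts rather than fixed annotation categories, a model trained this way can be queried about conditions it was never explicitly labeled for.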
By contrast, other AI models trained on medical images typically require datasets of thousands of images in which diseases have been explicitly annotated by human clinicians. The researchers have made the model's code publicly available.