Computer screen showing side-by-side comparison of real and AI-generated medical X-ray images

Reporter Matches Radiologists at Spotting AI-Made X-Rays

🤯 Mind Blown

A journalist with zero medical training scored just as well as expert radiologists at detecting deepfake X-rays in a new study. The surprising results show both how far AI imaging has come and how humans can still spot the fakes.


A STAT News reporter just showed you don't need a medical degree to spot AI-generated X-rays as well as the experts do.

Katie Palmer, a health journalist, took the same test given to 17 radiologists in a recent study published in the journal Radiology. The challenge? Figure out which X-rays were real and which ones were created by artificial intelligence. Palmer scored about 75%, on par with the trained medical professionals.

The study tested whether radiologists could tell the difference between genuine patient X-rays and deepfakes made by AI. These aren't obvious fakes. The images look incredibly realistic, which is exactly what makes this research important for patient safety.

Palmer had reported on the study just a week earlier but had never read X-rays professionally. Her success at matching expert performance surprised everyone, including herself.

STAT's multimedia producer Alex Hogan decided to test himself too. He took the same quiz to see if he could beat both the radiologists and Palmer at spotting the AI-generated images.


The results reveal something fascinating about how AI medical imaging has evolved. These deepfake X-rays are good enough to fool trained eyes about a quarter of the time. But they're not perfect, and humans are getting better at recognizing the telltale signs.

Why This Inspires

This story shows human adaptability at its best. As AI technology advances, people are learning to spot the differences between real and artificial. Radiologists, journalists, and curious producers are all developing new skills to meet new challenges.

The 75% success rate isn't just a fun trivia fact. It means that with attention and practice, people can learn to identify AI-generated medical images. That's crucial as these tools become more common in healthcare settings.

Understanding how to spot deepfakes protects patients and ensures medical decisions get made with accurate information. The fact that non-experts can learn this skill shows we're not helpless against advancing technology.

Human judgment still matters, and we're proving we can evolve alongside the tools we create.


Based on reporting by STAT News

This story was written by BrightWire based on verified news reports.
