I'm not sure why everyone picks on radiology as the 'obvious' field that will get automated. So far it has been the opposite:
> In 2025, American diagnostic radiology residency programs offered a record 1,208 positions across all radiology specialties, a four percent increase from 2024, and the field’s vacancy rates are at all-time highs. In 2025, radiology was the second-highest-paid medical specialty in the country, with an average income of $520,000, over 48 percent higher than the average salary in 2015.
Simply put, radiologists do a lot more than merely read scans:
> Radiologists are useful for more than reading scans; a study that followed staff radiologists in three different hospitals in 2012 found that only 36 percent of their time was dedicated to direct image interpretation.
Source: https://worksinprogress.co/issue/the-algorithm-will-see-you-...
In a similar vein, studies have found that software engineers spend only ~30% of their time coding on average. Yet we're similarly meant to believe that AI is going to replace human engineers? Ironically this belief is often held by software engineers themselves, as if they don't realize that they do so much more than generate characters. Bizarre.
With this attitude you'll never get investments...
A radiologist read my brother-in-law's MRI and contacted him and told him to go directly to the ER and actually wrote up a referral to have him immediately admitted. This happened in the middle of the night since the radiologist was reading them on the night shift. He was admitted and operated on two days later (as he had to be prepared for surgery and have additional pre-op testing done).
No AI replacement is going to be doing that anytime soon.
I'm as skeptical of AI as the next guy, but in this particular case, wouldn't the AI have read your brother-in-law's MRI right after it was taken, and sent him to the ER before he even left the MRI testing room?
I'm not arguing with the GP's point that radiologists do many other things that the AI maybe can't do, but it feels like your example is the opposite of that.
And not only that, your example demonstrates the limits of a human's time to get all their work done.
Maybe the radiologist made that 'one' out of many possible correlations and deductions that AI couldn't possibly have made for OP's BiL.
> everyone picks on radiology as the 'obvious' field that will get automated.
Nah, pre-LLM, I think the obvious fields to pick on were lawyers, middle-management, and "email jobs" generally. That was a big miscalculation, since most people (especially engineers) do not understand the politics of power. Those jobs tend to jealously protect the power that they have and systematically dismantle what accountability they might be subjected to. Engineers in general are much more likely to democratize (and thus threaten) power, by creating things like accountability via metrics, at the same time as they mostly refuse to unionize. Radiologists have some unions, medicine generally enjoys a moat of credentials and certification. Things that SWEs in particular rejected while they said "come on over, anyone can code". I doubt radiologists ever suggested themselves that they should be measured on throughput, but SWEs actually did push ideas of 10x engineers and metrics like lines-of-code for years to argue they are productive enough to deserve raises.
The article addresses this. It explains that if a large portion of your job can be automated (and hence done at little marginal cost), then the remaining parts of your job become more valuable.
Think of it like 'commoditize your complements'.
Radiology seems like a good field for AI because it's easier to see what automation would look like in practice, since an MRI or CT scan directly produces data that can be fed to an AI. How well this actually works, I have no idea.