I don’t understand why offline interviewing is needed to catch AI use, take-homes aside.
Surely just asking the candidate to lean back a bit during the video interview and then having a regular conversation, without them reaching for the keyboard, is enough? I suppose someone could act as an intermediary, listening in and feeding tips, but even then it would be obvious the candidate is reading from a sheet.
That type of cat-and-mouse game is ultimately pointless. It's fairly easy to build an ambient AI assistant that listens in on the conversation and automatically displays answers to interview questions without the candidate ever touching the keyboard. If the interviewer wants any reliable signal, they'll have to ask questions that an AI can't answer effectively.
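To make the point concrete, the core loop of such an assistant is trivial to sketch. Everything below is a stub: `transcribe_chunk` and `ask_llm` stand in for a real speech-to-text stream and a real hosted model call, neither of which the thread specifies, and the question detector is a crude heuristic.

```python
def transcribe_chunk(audio_chunk: bytes) -> str:
    """Stub: a real tool would stream mic/call audio through speech-to-text."""
    return audio_chunk.decode("utf-8")  # pretend the audio is already text

def looks_like_question(utterance: str) -> bool:
    """Crude heuristic: interrogative openers or a trailing question mark."""
    openers = ("how", "what", "why", "can you", "explain", "describe")
    u = utterance.strip().lower()
    return u.endswith("?") or u.startswith(openers)

def ask_llm(question: str) -> str:
    """Stub: a real tool would call a hosted model here."""
    return f"[model answer to: {question}]"

def ambient_loop(audio_chunks):
    """Listen, flag interviewer questions, surface answers hands-free."""
    for chunk in audio_chunks:
        utterance = transcribe_chunk(chunk)
        if looks_like_question(utterance):
            yield ask_llm(utterance)  # real tool would render this in an overlay

answers = list(ambient_loop([
    b"Nice weather today.",
    b"Explain how a hash map handles collisions.",
]))
```

The hard parts (low-latency transcription, an overlay hidden from screen capture) are off-the-shelf components, which is exactly why keystroke-watching buys you nothing.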
There are interview-cheating tools that listen in on the call and render answers in an overlay that doesn’t appear in screen shares.
So you’d only be going off how they speak, which could filter out people who are just a bit awkward.