From what I've seen, it works.
There is definitely a war between cheaters and the people catching them. But a lot of people can't be bothered, and if learning the material can be made easier than cheating, then it will work.
You can imagine the proctoring halls of the future being Faraday cages with a camera watching people take their test.
Local LLMs are almost here, no Internet needed!
Almost?
I've been running a programming LLM locally with a 200k context length by using system RAM.
It's also an abliterated model, so I get none of the moralizing or forced ethics either. I ask, and it answers.
I even have it hooked up to my HomeAssistant, and can trigger complex actions from there.
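For anyone curious what a setup like this looks like in practice: local servers such as llama.cpp's server or Ollama expose an OpenAI-compatible HTTP API, and a Home Assistant automation (or any script) just posts a chat request to it. A minimal sketch of the request side, where the endpoint URL, model name, and parameters are illustrative assumptions rather than the commenter's actual configuration:

```python
import json

# Hypothetical local endpoint; llama.cpp's server defaults to port 8080,
# Ollama to 11434. Adjust to whatever your local server actually exposes.
LOCAL_ENDPOINT = "http://127.0.0.1:8080/v1/chat/completions"

def build_request(prompt: str, n_ctx: int = 200_000) -> dict:
    """Build the JSON body for a chat completion against a local model.

    Large contexts like 200k tokens are normally set when the server is
    launched (e.g. llama.cpp's -c flag, spilling into system RAM); the
    value is included here only to document that assumption.
    """
    return {
        "model": "local-coder",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
    }

# Everything stays on the machine: no internet needed once the model
# weights are downloaded. Sending this body with any HTTP client
# (requests, aiohttp, Home Assistant's rest_command) completes the loop.
body = build_request("Write a Python function that reverses a string.")
print(json.dumps(body, indent=2))
```

From Home Assistant's side, a `rest_command` pointing at the same URL lets an automation forward a prompt and act on the reply, which is roughly how "trigger complex actions" works in setups like this.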