sien 6 days ago

From what I've seen it works.

There is definitely a war between cheaters and the people catching them. But a lot of people can't be bothered, and if learning the material can be made easier than cheating, then it will work.

You can imagine the proctoring halls of the future being Faraday cages, with cameras watching people take their tests.

exhilaration 6 days ago

Local LLMs are almost here, no Internet needed!

mystraline 6 days ago

Almost?

I've been running a programming LLM locally, with a 200k context window, using system RAM.

It's also an abliterated model, so I get none of the moralizing or forced ethics either. I ask, and it answers.

I even have it hooked up to my HomeAssistant, and can trigger complex actions from there.
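The commenter doesn't share their setup, but the local-LLM-to-HomeAssistant wiring can be sketched roughly like this: the model emits a structured action, which is translated into a call against Home Assistant's REST API (`POST /api/services/<domain>/<service>`). This is a minimal illustration under assumptions not in the comment; the host, entity names, and the model's output format are all hypothetical.

```python
import json

# Hypothetical Home Assistant host; a real setup would also need a
# long-lived access token sent as a Bearer header.
HA_URL = "http://homeassistant.local:8123"

def ha_service_call(action: dict) -> tuple[str, dict]:
    """Translate an LLM-emitted action object into a Home Assistant
    REST call: POST /api/services/<domain>/<service> with a JSON body."""
    domain, service = action["service"].split(".", 1)
    url = f"{HA_URL}/api/services/{domain}/{service}"
    payload = {"entity_id": action["entity_id"], **action.get("data", {})}
    return url, payload

# Suppose the local model is prompted to reply with structured JSON
# (an illustrative convention, not something the comment specifies):
llm_reply = '{"service": "light.turn_on", "entity_id": "light.office", "data": {"brightness": 180}}'
url, payload = ha_service_call(json.loads(llm_reply))
print(url)      # → http://homeassistant.local:8123/api/services/light/turn_on
print(payload)  # → {'entity_id': 'light.office', 'brightness': 180}
```

From here, posting `payload` to `url` (e.g. with `requests.post`, plus the auth header) would fire the action, and chaining several such objects is what makes "complex actions" possible.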

robotnikman 5 days ago

What model are you using and what kind of hardware are you running it on?