I like:
Unscramble the following letters to form an English word: “M O O N S T A R E R”
The non-thinking models can sometimes struggle and go off on huge tangents.
Current LLMs are based on multi-character tokens, which means they don’t know how to spell well. As a result, they are horrible at spelling games like this or, say, Hangman.
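You can see the multi-character chunking directly by inspecting a tokenizer. A minimal sketch, assuming the tiktoken library is installed (the exact splits vary by model and tokenizer):

import tiktoken

# cl100k_base is one of OpenAI's BPE encodings; others behave similarly.
enc = tiktoken.get_encoding("cl100k_base")
for word in ["MOONSTARER", "ASTRONOMER"]:
    pieces = [enc.decode_single_token_bytes(t).decode() for t in enc.encode(word)]
    print(word, "->", pieces)
# Each word comes back as a handful of multi-letter chunks, so the model
# never "sees" the individual letters it is being asked to rearrange.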
Llama 3.3 worked but (as you said) struggled before arriving at the correct answer. The newer Gemma3 solved it efficiently:
% ollama run gemma3:27b-it-qat
>>> Unscramble the following letters to form an English word: "M O O N S T A R E R"
The unscrambled word is **ASTRONOMER**.
GPT-4o got that one, but it's listed on lots of anagram sites, so it's in the training data ;-)
But it failed badly when I tried the Norwegian word T U R V E I G L E N (utlevering, "handing over"/"extradition"), suggesting "uglelivert", which is not a word.
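Wrong guesses like that can at least be caught mechanically by comparing letter counts. A minimal Python sketch:

from collections import Counter

def is_anagram(candidate, letters):
    # Compare letter multisets, ignoring case and spaces.
    return Counter(candidate.lower()) == Counter(letters.lower().replace(" ", ""))

print(is_anagram("utlevering", "T U R V E I G L E N"))  # True
print(is_anagram("uglelivert", "T U R V E I G L E N"))  # False: the letters don't even match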