> Tbf, there's a phase of learning to code where everything is pretty much an incantation you learn because someone told you "just trust me."
There really shouldn't be. You don't need to know all the turtles by name, but "trust me" doesn't cut it most of the time. You need a minimal understanding to progress smoothly. Knowledge debt is a b*tch.
I remember when I first learned Java, having to just accept "public static void main(String[] args)" before I understood what any of it was. All I knew was that it went on top, wrapping the block, and I wrote my code inside it.
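For reference, here's roughly how that incantation breaks down, with each part annotated (the class name `Hello` is just a placeholder, not anything from the original curriculum):

```java
// The smallest runnable Java program, with the "trust me" parts explained.
public class Hello {
    // public  - the JVM must be able to call this method from outside the class
    // static  - it runs before any object exists, so it can't be an instance method
    // void    - nothing is returned to the JVM when it finishes
    // String[] args - the command-line arguments, e.g. `java Hello foo bar`
    public static void main(String[] args) {
        System.out.println("Hello, World!");
    }
}
```

None of which is explainable to someone who doesn't yet know what a class, a method, or a return type is, which is exactly the problem.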
Should people really understand all of that syntax before learning simpler things like printing, ifs, and loops? It would be a nicer learning experience, yes, but I'm not sure it's actually the best idea.
If you need to learn "public static void main(String[] args)" just to print to the screen or use a loop, you're using the wrong language.
When it's time to learn Java you're supposed to be past the basics. Old-school intros to programming start with flowcharts for a reason.
You can learn either way, of course, but with one of them, people get tied to a particular language-specific model and then have all kinds of trouble when it's time to switch.
In most programming books, the first chapter where they teach you Hello, World is mostly about learning how to install the tooling. Then it goes back to explain variables, conditionals, ... They rarely throw you into code if you're a beginner.
I mean, I didn't need to learn those things; they were just there in whatever web GUI I originally learned on. All I knew was that I could ignore them for now, à la the topic. Should the UI have masked that from me until I was ready? I suppose so, but even then I was doing things in an IDE without really knowing what those things were for until much later.
> There really shouldn't be.
I don't see how, barring some kind of transcendental change in the human condition. Simple lies [0] and "ignore this until later" are basically human nature when it comes to learning; you see it in every field and topic.
The real problem is not whether, but when certain kinds of "incantations" should be introduced or dismantled, and in what order.
Please, reread the statement I'm arguing with. I posit that you can mostly avoid "everything is an incantation for a while" if you're on a correctly constructed track to knowledge.
Consider how it's traditionally been done for imperative programming: you explain the notion of programming (encoding algorithms with a specific set of commands), explain basic control flow, explain flowcharts, introduce variables and a simplified computation model. Then you drop the student into a simplified environment where they can test the basics in practice, without needing any "incantations".
By the time you need to introduce `#include <stdio.h>`, they already know about types, functions, compilation, etc. At that point you're ready to cover C idioms (or any other language's) and explain why they are necessary.