submeta 5 days ago

Chomsky’s notion is: LLMs can only imitate, not understand language. But what exactly is understanding? What if our “understanding” is just unlocking another level in a model? Unlocking a new form of generation?

roughly 5 days ago

> But what exactly is understanding?

He alludes to quite a bit here - impossible languages, intrinsic rules that don’t actually surface in the language, etc. - which leads me to believe there’s a pretty specific sense in which he means “understanding,” and I’d expect there’s a decent literature in linguistics covering what he’s referring to. If it’s a topic of interest to you, chasing down some of those leads might be a good start.

(I’ll note, as several others here have, that most of his language seems to use specific linguistics terms of art - “language” for “human language” is a big tell, as is the focus on the mechanisms by which humans understand and generate languages. I’m not sure the critique here is specifically about LLMs so much as about their ability to teach us anything about how humans understand language.)

npteljes 5 days ago

I have trouble with the notion of "understanding". I get the usefulness of the word, but I don't think that we are capable of actually understanding. I also think that we are not even able to test for understanding - a good imitation is as good as understanding. Also, understanding has limits. In school, they often say in class that you should forget whatever you have been taught so far, because of the new layer of knowledge they are about to teach you. Was the previous knowledge not "understanding" then? Is the new one "understanding"?

If we define "understanding" the way we define "useful" - not as an innate attribute, but as something relative to a goal - then again, a good imitation, or a rudimentary model, can get very far. ChatGPT "understood" a lot of things I have thrown at it, be that algorithms, nutrition, basic calculations, transformation between text formats, where I'm stuck in my personal development journey, or how to politely address people in the email I'm about to write.

> What if our “understanding” is just unlocking another level in a model?

I believe that it is - that understanding is basically an illusion. Impressions are made up from perceptions and thinking, and extrapolated over the unknown. And just look how far that got us!

foldr 4 days ago

Actually no. Chomsky has never really given a stuff about Chinese Room style arguments about whether computers can “really” understand language. His problem with LLMs (if they are presented as a contribution to linguistic science) is primarily that they don’t advance our understanding of the human capacity for language. The main reasons for this are that (i) they are able to learn languages that are very much unlike human languages and (ii) they require vastly more linguistic data than human children have access to.

dinfinity 5 days ago

> But what exactly is understanding?

I would say that it is to what extent your mental model of a certain system is able to make accurate predictions of that system's behavior.

smokel 5 days ago

Understanding is probably not much more than breaking abstractions down into simpler terms until you are left with something you can relate to by intuition or social consensus.

HPsquared 5 days ago

Transforming, in other words.

egberts1 5 days ago

Just because it can transform doesn't mean the logic remains correct.

I found this out when repeatedly attempting to transform wiki pages into blog-specific speak.