hackinthebochs 4 days ago

LLMs are modelling the world, not just "predicting the next token". They are certainly not akin to parrots. Some examples here[1][2][3]. Anyone claiming otherwise at this point is not arguing in good faith.

[1] https://arxiv.org/abs/2405.15943

[2] https://x.com/OwainEvans_UK/status/1894436637054214509

[3] https://www.anthropic.com/research/tracing-thoughts-language...

wat10000 4 days ago

Maybe it takes some world modeling to do it as well as they do, but ultimately they are just predicting the next token. These things are not mutually exclusive.
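For concreteness, here is a minimal sketch of what "predicting the next token" means at the interface level, using the Hugging Face transformers library with "gpt2" as a stand-in model (my choice for illustration, not anything specific to the papers above). Whatever representations form inside the network, the outer loop is literally just picking a next token and appending it:

    # Minimal greedy decoding loop: the model only ever emits a distribution
    # over the next token, one step at a time.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    input_ids = tokenizer("The capital of France is", return_tensors="pt").input_ids

    with torch.no_grad():
        for _ in range(5):
            logits = model(input_ids).logits            # scores over the whole vocabulary
            next_id = logits[:, -1, :].argmax(dim=-1)   # greedy: take the most likely next token
            input_ids = torch.cat([input_ids, next_id.unsqueeze(-1)], dim=-1)

    print(tokenizer.decode(input_ids[0]))

The point being: this loop is the same whether the internals are shallow pattern matching or a rich world model, so the training objective by itself settles nothing.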

hackinthebochs 4 days ago

The issue is whether they are "just" predicting the next token. When people call them stochastic parrots, they are denying that they have any of these further capabilities. Modelling is a facet of understanding, so discovering that LLMs model the world should strongly raise your credence that they do understand.