My point is that computers, when used properly, can absolutely do math. And LLMs, when used properly, can absolutely explain the reasoning behind why a pound of bricks and a pound of feathers weigh the same.
Can they reason? Maybe, depending on your definition of reasoning.
An example: which weighs more, a pound of bricks or 453.59 grams of feathers? Explain your reasoning.
LLM: The pound of bricks weighs slightly more.
*Reasoning:*
* *1 pound* is officially defined as *0.45359237 kilograms*, which is *453.59237 grams*.
* You have *453.59 grams* of feathers.
So, the pound of bricks (453.59237 grams) weighs a tiny fraction more than the 453.59 grams of feathers. For most practical purposes, they'd be considered the same, but technically, the bricks are heavier by 0.00237 grams. /llm
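For what it's worth, the arithmetic is easy to check. Here's a minimal Python sketch (mine, not the model's output) that reproduces the comparison:

```python
# Sanity-check the LLM's arithmetic (plain Python, no libraries needed).
GRAMS_PER_POUND = 453.59237  # by definition: 1 lb = 0.45359237 kg

bricks_g = 1 * GRAMS_PER_POUND   # one pound of bricks, in grams
feathers_g = 453.59              # the feathers, as stated in the prompt

difference = bricks_g - feathers_g
print(f"bricks: {bricks_g} g, feathers: {feathers_g} g, difference: {difference:.5f} g")
# difference comes out to 0.00237 g, matching the model's answer
```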
The answer is correct and the reasoning is sound. Do I understand that the machine is a pattern-following machine? Yes! Is there an argument to be made that humans are also that? Probably. Chomsky himself argued in favor of a universal grammar, after all.
I’m steelmanning this a bit, but the point is that LLMs are capable of doing some things that are indistinguishable from human reasoning in terms of results. Does the process matter in all cases?
> Does the process matter in all cases?
So there are 2 dimensions being conflated here:
"Does how the reasoning work matter in all cases" Pretty Obviously no, but it may matter in some of them. We also don't really understand which ones yet.
"Does the reasoning work as intended in all cases?" Pretty Obviously no, but it doesn't work for at least some of them. We also don't really understand which ones yet.
"We also don't really understand which ones yet" Is the critical point of caution.