dghlsakjg 5 days ago

I recently had a computer tell me that 0.1 + 0.2 != 0.3. It must not be a math-capable machine.

Perhaps it is more important to know the limitations of tools rather than dismiss their utility entirely due to the existence of limitations.
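The comparison above is easy to check, and so is the workaround once you know the limitation. A quick Python sketch:

```python
from fractions import Fraction

# Binary floating point cannot represent 0.1, 0.2, or 0.3 exactly,
# so the sum picks up a rounding error.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Knowing the limitation, reach for a tool without it: exact rationals.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))   # True
```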

blamestross 5 days ago

A computer isn't a math-capable machine.

> Perhaps it is more important to know the limitations of tools rather than dismiss their utility entirely due to the existence of limitations.

Well, yes. And "reasoning" is only something LLMs do incidentally to their function as sequence-continuation engines. Like performing accurate math on rational numbers, it can happen if you put in a lot of work and accept a LOT of expensive computation. Even then, there exist computations that just are not reasonable or feasible.
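A sketch of the cost being alluded to: exact rational arithmetic sidesteps rounding entirely, but the representation itself grows fast (Python's `fractions` used here for illustration):

```python
from fractions import Fraction

# Exact rational arithmetic avoids rounding error, but it is not free:
# the reduced denominator of 1/1 + 1/2 + ... + 1/30 already has 13
# digits (it equals lcm(1..30) = 2329089562800), and it keeps growing.
total = sum(Fraction(1, k) for k in range(1, 31))
print(total.denominator)
```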

Reminding folks to dismiss the massive propaganda engine pushing this bubble isn't "dismissing their utility entirely".

These are not reasoning machines. Treating them like they are will get you hurt eventually.

dghlsakjg 5 days ago

My point is that computers, when used properly, can absolutely do math. And LLMs, when used properly, can absolutely explain the reasoning behind why a pound of bricks and a pound of feathers weigh the same.

Can they reason? Maybe, depending on your definition of reasoning.

An example: which weighs more, a pound of bricks or 453.59 grams of feathers? Explain your reasoning.

LLM: The pound of bricks weighs slightly more.

*Reasoning:*

* *1 pound* is officially defined as *0.45359237 kilograms*, which is *453.59237 grams*.
* You have *453.59 grams* of feathers.

So, the pound of bricks (453.59237 grams) weighs a tiny fraction more than the 453.59 grams of feathers. For most practical purposes, they'd be considered the same, but technically, the bricks are heavier by 0.00237 grams. /llm
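The arithmetic in that answer is easy to verify, and (fittingly, given the float complaint upthread) exact decimal arithmetic is the right tool for it. A quick check with Python's `decimal`:

```python
from decimal import Decimal

# 1 lb is defined as exactly 0.45359237 kg, i.e. 453.59237 g.
pound_in_grams = Decimal("453.59237")
feathers_in_grams = Decimal("453.59")

# Decimal subtraction is exact here, so the 0.00237 g gap is not a
# rounding artifact.
print(pound_in_grams - feathers_in_grams)   # 0.00237
```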

The answer is correct and the reasoning is sound. Do I understand that the machine is a pattern-following machine? Yes! Is there an argument to be made that humans are also that? Probably. Chomsky himself argued in favor of a universal grammar, after all.

I’m steel-manning this a bit, but the point is that LLMs are capable of doing some things that are indistinguishable from human reasoning in terms of results. Does the process matter in all cases?

blamestross 4 days ago

> Does the process matter in all cases?

So there are two dimensions being conflated here:

"Does how the reasoning works matter in all cases?" Pretty obviously no, but it may matter in some of them. We also don't really understand which ones yet.

"Does the reasoning work as intended in all cases?" Pretty obviously no; it fails for at least some of them. We also don't really understand which ones yet.

"We also don't really understand which ones yet" is the critical point of caution.