Well, two responses I have seen to the claim that LLMs are not reasoning are:
- we are all just stochastic parrots lmao
- maybe intelligence is an emergent ability that will show up eventually (never mind that this is unfalsifiable, or that our working definition of “emergent” is categorical nonsense).
So I think this research is useful as a response to both of these, although “fuck off, promptfondler” is pretty good too.
yikers