My point is that the only definition that really makes sense is a functional one. A recent study shows GPT-4.5 passing the Turing Test in pretty convincing fashion (https://arxiv.org/abs/2503.23674), so applying the label of "thinking" does not seem unreasonable to me.
If you have another definition in mind, I'd be curious to know what it is.
Nice piece. Searle's Chinese Room argument is very weak, and I have not found that his rejection of computationalism fits into any coherent framework.
Matthew, thanks for this post - I found it insightful. The lack of shared definitions is a major bottleneck right now.
Since you are in the same headspace, I was wondering if you could comment on a related piece on definitions I wrote this week:
https://open.substack.com/pub/kthuot/p/stop-calling-everything-agi?r=2rx3m&utm_medium=ios
Cheers,
Kevin
It's really pointless to accept or reject anything before defining it.