• Wow, do you have any proof of this wild assertion? Has this ever been done before or is this simply conjecture?

    A Turing machine can compute any computable function. For a thing to exist in the real world it has to be computable; otherwise you break cause and effect itself, as the Church-Turing Thesis doesn’t really rely on anything but there being implication.

    So, no, not proof. More an assertion of the type “Assuming the Universe is not dreamt up by a Boltzmann brain and causality continues to apply, …”.

    A thermostat is an unthinking device.

    That’s a fair assessment but beside the point: a thermostat has an internal state it can affect (the valve), and that state is under its own control, not that of silly humans (that is, not directly). In other words, an internal world.
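    That “internal state under the device’s own control” can be sketched as a trivial feedback loop (my own illustration, not anything from the thread; the class and names are hypothetical):

    ```python
    # Minimal sketch of a thermostat as a system with internal state:
    # the valve position is state the device itself reads and updates.

    class Thermostat:
        def __init__(self, setpoint: float):
            self.setpoint = setpoint
            self.valve_open = False  # internal state, controlled by the device

        def step(self, measured_temp: float) -> bool:
            # The device alone decides the valve position from its reading;
            # humans only influence it indirectly, via the setpoint.
            self.valve_open = measured_temp < self.setpoint
            return self.valve_open

    t = Thermostat(setpoint=20.0)
    print(t.step(18.5))  # True: too cold, open the valve
    print(t.step(21.0))  # False: warm enough, close it
    ```

    The point being made: even this trivially simple loop already has state it affects, which is the sense of “internal world” used here.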

    There is as much chance of your thermostat gaining sentience if we give it more computing power as an LLM.

    Also correct. But that’s because it’s a T1 system, not because the human mind can’t be expressed as an algorithm. Rocks are T0 systems, and I think you’ll agree they’re dumber than thermostats; most of what runs on our computers is a T1 system; ChatGPT and every AI we have is T2; the human mind is T3: our genes don’t merely come with instructions on how to learn (that’s ChatGPT’s training algorithm), but with instructions on learning how to learn. We’re as much more sophisticated than ChatGPT, for an appropriate notion of “sophisticated”, as thermostats are more sophisticated than rocks.
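    One illustrative reading of that hierarchy (my own sketch, not the paper’s definitions, which aren’t given here): T1 runs a fixed program, T2 learns its parameters under a fixed learning rule, and T3 additionally adapts the learning rule itself.

    ```python
    # Hypothetical sketch of T1/T2/T3 as used in the comment above.

    def t1_thermostat(temp, setpoint=20.0):
        # T1: fixed program, nothing is ever updated.
        return temp < setpoint

    class T2Learner:
        # T2: learns a weight w from data, but the learning rule is fixed.
        def __init__(self):
            self.w = 0.0

        def update(self, x, y, lr=0.1):
            self.w += lr * (y - self.w * x) * x

    class T3Learner(T2Learner):
        # T3: also adjusts *how* it learns (here, its own learning rate).
        def __init__(self):
            super().__init__()
            self.lr = 0.1

        def update(self, x, y):
            err = y - self.w * x
            # "Learning how to learn": the update rule itself changes.
            self.lr *= 1.05 if abs(err) > 1.0 else 0.95
            super().update(x, y, lr=self.lr)
    ```

    The T3 class is a toy, of course; the claim is only that there is a qualitative step between systems with a fixed learning rule and systems that revise the rule itself.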

    •  Veraticus   ( @Veraticus@lib.lgbt ) OP

      That’s a fair assessment but beside the point: a thermostat has an internal state it can affect (the valve), and that state is under its own control, not that of silly humans (that is, not directly). In other words, an internal world.

      I apologize if I was unclear when I spoke of an internal world. I meant interior thoughts and feelings. I think most people would agree sentience is predicated on the idea that the sentient object has some combination of its own emotions, motivations, desires, and ability to experience the world.

      LLMs have as much of that as a thermostat does; that is, zero. It is a word completion algorithm and nothing more.

      Your paper doesn’t bother to define what these T-systems are, so I can’t speak to your categorization. But I think rating the mental abilities of thermostats versus computers versus ChatGPT versus human minds is totally absurd. They aren’t on the same scale; they’re different kinds of things. Human minds have actual sentience. Everything else in that list is a device, created by humans, to do a specific task and nothing more. None of them are anything more than that.