We may never be able to tell if AI becomes conscious
5 points | 5 hours ago | 1 comment | techxplore.com
richardatlarge | 5 hours ago
Because consciousness doesn't matter, that's why

Consciousness is neither necessary nor sufficient for AI to achieve general intelligence, defined as exponential emergent learning. Thinking that it is may be an important step toward an AI outbreak. Consciousness as proposed here is Cartesian dualism. It's pure religion, and it has all the power associated with religion.

nis0s | 2 hours ago
> It's pure religion and has all the power associated with religion

Not exactly. Consciousness has useful properties that support different kinds of behavior in living things, e.g., improving information synthesis through learning (self-reflection can also help you discover what you don't know, so you find out what else to learn) and providing cognitive flexibility for adaptability.

That said, you're right that AI does not need consciousness to be a useful tool. But does AI need consciousness to attain AGI? I think so; there's no other model we're aware of in which properties associated with conscious beings appear in beings that aren't considered conscious.

Is AGI necessary for a tool to have beyond-human information synthesis? I don't think so, as long as there's some process that can help it realize where it's wrong and how to correct itself.
