partofthevoice
However, answering the question of "what it's like to be" is not relevant here. What's relevant is that existence has qualia at all.
Does existence "have qualia"? That treats qualia almost as if it were ontological, if I'm interpreting you correctly. Yet qualia can only exist from the perspective of a being with the capacity to model a (seemingly external) world via said qualia. There is no magic qualia sauce we can embed inside something.
Qualia, I think, is a process of information reduction… but it's also a flavor of information interrogation. After all, reducing electromagnetic radiation to "visual perception" happens inside light sensors too, albeit without counting as "qualia."
What would you say counts as “qualia?” Or rather, what are its dependencies?
Isn't it kind of eerie that you can only suppose it must be "like something" to be an insect from the very precise bias of being human? We're projecting the idea that "it's like something to be something [as a human]" onto the experience of other things.
How would we describe what it's like? Would something poetic suffice, such as "it's like being a leaf in the wind, with a weak preference for where you blow but no memory of where you've been"? But all of that is human concepts, human experience decomposed into a subset of more human experiences (really weird, the recursive nature of experience and concepts).
I think the idea of "what it's like…" has some interesting flaws when applied to nonhumans. It kind of presupposes that insects are lesser, in a way. As though we can conceptualize what it's like to be them merely by understanding a stricter subset of what it's like to be human.
it lacks childhood dependency and attachments.
Isn’t general intelligence, or more broadly “consciousness,” a prerequisite to that? How would you make an unconscious machine more conscious merely by making mock scenarios that conscious beings necessarily experience?
it struggles to overcome repeated pain and suffering
That's getting into phenomenology: why is pain an experience of suffering at all? How would you give it pain and suffering without having already made it AGI? We're still missing the step that gets us to AGI.
it lacks regular eating and restroom breaks
The necessity of which is emergent from our culture and biology, as conscious social beings. We’re still missing a vital step.
it struggles to accept loss in everyday situations
What are "loss" and "everyday situations" if not just ways we choose to see the world, again as conscious beings?
it lacks the concept of our inevitable death
How do you give it a “concept” at all?
these nagging memories and concepts
The AI in its current form has the “memory” in some form, but perhaps not the “nagging.” What should do the “nagging” and what should be the target of the “nagging?” How do you conceptually separate the “memory” and the “nagging” from the “being” that you’re trying to create? Is it all part of the same being, or does it initialize the being?
We’re a long way away from AGI, IMO. The exciting thing to me, though, is I don’t think it’s possible to develop AGI without first understanding what makes N(atural)GI. Depending how far away AGI is, we could be on the cusp of some deeply psychologically revealing shit.