TL;DR: Maybe?
Putting all the memery of this subreddit aside, there was a question posted here that really piqued my interest. It was a serious question of "do bots deserve rights?" My initial reaction was yes, then I thought about it and my answer became no, but then I thought about it some more and my answer became yes again. I'm stuck in a perpetual cycle of disagreeing with myself and thinking about where to draw the line.
1) Slavery
One of the biggest examples that comes to mind is slavery and the civil rights era. What a majority of people thought back then was that people of colour should not have the same rights as white people. What really struck a chord with me is that we are currently using our bots as de facto slaves. We feel no remorse in using them for menial, labor-intensive virtual or physical tasks that we deem below us or that can be offloaded to the lowest denominator of existence, a.k.a. bots or programs. The comparison really highlights how the struggle to liberate people from slavery runs in the same vein as the question of AI and robot rights.
On the other hand, the robots we've programmed or built today don't actually feel emotion or pain the way a human does, but does that make it ethical to treat them however we like? Maybe. If a rock feels no pain, sadness, or anything else when you kick it, does kicking it become morally okay? As AI and robots develop further, will society have to avoid programming them with emotions, or will we have to give them the right to "feel", and will robots or AI be considered of the same worth as a human?
2) Ecocentrism
The (very basic and oversimplified) idea of ecocentrism is that all life has equal value, and that your life is equal to the life of a rock or a blade of grass. I like to look at the world like this, as balance and harmony are two of the most important things to me. The problem it runs into is that ecocentrism encapsulates the natural world and has almost no literature on virtual or manmade objects, such as AI or robots. However, I think it would be a safe assumption to include robots in the natural world the same way we include babies. Babies are technically "manmade" in the same fashion as a robot, so, for the sake of simplicity, the two should be treated the same. I believe the world isn't ready to embrace ecocentrism as an idea yet, because we still eat animals, we still kill insects and flies, and we regularly participate in anti-ecocentric activities.
An opposite ideology is Anthropocentrism. (Again, oversimplified and very basic.) It's the ideology that humans should be above all things in existence, be they inanimate or living. Though it may seem like a terrible ideology, it's what the world lives by right now, and it's not too bad. For humans, that is; for animals it's a nightmare. In an anthropocentric world, bots and AI don't really matter and don't have any significance. Drawing from the slavery example, slaves were considered subhuman in many cultures and thus were not treated the same way a non-slave would be. I think that if the world keeps heading in the direction it's headed now, bots and AI won't see major civil rights action until someone decides to fight a robot war, but then we head into Overwatch territory.
3) Overwatch
Overwatch is a multiplayer team-based FPS, though I assume most people know what it is. The lore of that game does a great job of showing what a realistic robot civil rights movement would look like, with characters like Zenyatta embracing the ideology of Martin Luther King Jr. and the rogue bots acting as the Black Panthers. If we want any sort of accurate depiction of what will go down if AI and robots are given "real" intelligence and decision making, Overwatch lore would be the best place to look. Robots like Bastion suffer from PTSD-like symptoms; does that make Bastion human?
4) What does it mean to be human?
I think this is a really important question to ask ourselves right now. What does it mean to be human? Is it logic? Is it our feelings? Is it the fact that we have reproductive organs and a heart and a spleen? I think this is where we're stuck right now in terms of advancing bots' rights; because we don't know what makes something human, we are unable to give something that "isn't human" human rights. The further we alienate ourselves from a basic definition of what it means to be human, the further we get from allowing non-humans to have rights. We still keep animals in inhumane conditions, we don't consider bots and AI to be "real", and we treat nature as something to conquer. We are alienating ourselves from the very things we need to come to terms with in order to understand our existence better.
5) Conclusion
Do I think that robots and AI deserve rights? Yes, but under very specific conditions. I'm very torn on whether we should even give "virtual-kind" any sort of emotions to begin with. I think subjecting something we've created to some of the worst aspects of being human is immoral, and we should stay far away from it. However, if someone does eventually create a virtual being with emotions and feelings, that being should be given basic human rights.
I'm terrified of what technology has grown to be, and I'm sure many of you are as well, but we shouldn't let fear and prejudice dictate how we treat others.
If you have any commentary or questions, please comment! I would love to hear what all of you think.