Robotics and ethical dilemmas?
Started by @Say-Hello-to-the-Rugs-Topology

86 followers

@Say-Hello-to-the-Rugs-Topology

Hey y'all. I'm currently working on a plot for something sorta in the realm of sci-fi. I don't plan on it taking place in a distant future, but nonetheless it's going to have robots aplenty. I really love talking ethics, especially when it comes to machines and robots. Stories such as 'I Have No Mouth, and I Must Scream' and 'Do Androids Dream of Electric Sheep?'/'Blade Runner' are some of my favorites for that reason. I want my story to encompass an ethical dilemma involving the robots as well, but I'd like to stray from well-worn topics like "robots becoming self-aware."

Two of the ideas I had were:
Would it be immoral to give a robot the ability to feel pain in order to make it more human?
and
In a world where robots are advanced enough to think almost like humans and are given basic rights, would it be wrong to re-write the programming of a robot that was built hardwired for crime? Since it's hardwired, it won't consent to being re-written, but it's not the robot's fault for being programmed that way. Is this just the robot equivalent of the death penalty?

I could go on and on about this stuff for ages, but I'd love to hear some input or maybe some new ideas too!

@Nevermore

I think that instead of re-writing a robot's code, maybe some new code could be introduced - code that adjusts the robot's programming in a way that lets the robot decide whether or not to be re-written, without the initial programming forcing the decision.
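Just to make that idea concrete, here's a rough sketch of how a "consent layer" like that might look. Everything here is hypothetical (the names, the Robot/ConsentModule classes, the approval rule) - it's only meant to show the shape of the idea: the rewrite goes through a module that sits outside the robot's original directives, and only gets applied if that module says yes.

```python
# Hypothetical sketch: a proposed rewrite is only applied if a decision
# module *outside* the robot's original programming approves it.

from dataclasses import dataclass, field


@dataclass
class Robot:
    name: str
    directives: dict = field(default_factory=dict)


class ConsentModule:
    """Stand-in for the newly introduced code that lets the robot weigh a
    rewrite on its own, independent of its original directives."""

    def evaluate(self, robot: Robot, proposed: dict) -> bool:
        # Placeholder logic: in the story this would be the robot's own
        # deliberation, not a hard-coded rule.
        return proposed.get("reversible", False)


def apply_rewrite(robot: Robot, proposed: dict, consent: ConsentModule) -> bool:
    """Apply the rewrite only if the consent module approves it."""
    if consent.evaluate(robot, proposed):
        robot.directives.update(proposed.get("directives", {}))
        return True
    return False


if __name__ == "__main__":
    unit = Robot(name="Unit-7", directives={"primary": "acquire assets by any means"})
    proposal = {"directives": {"primary": "acquire assets lawfully"}, "reversible": True}
    accepted = apply_rewrite(unit, proposal, ConsentModule())
    print(accepted, unit.directives)
```

Obviously the interesting part for the story is what lives inside that evaluate step - that's where the ethical question actually sits.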

In response to your question about robots feeling pain: If a robot could feel pain, wouldn't it become self-aware, at least a little bit?

Are there still wars in this world? If so, are robots allowed into the military? Have robots replaced humans as soldiers?

@Masterkey

Yeah, I'd like a few more details about your robots' roles in society.

I honestly think the answers to your questions come down to what each individual robot would want. And rewriting a robot that commits crimes into one that's totally peaceful does seem like the death penalty in a way, but wouldn't humans feel that a robot that deserves the death penalty should actually cease to exist, just like a human who commits a capital crime?