@Say-Hello-to-the-Rugs-Topology
Hey y'all. I'm currently working on a plot for something sorta in the realm of sci-fi. I don't plan on it taking place in a distant future, but nonetheless it's going to have robots-a-plenty. I really love talking ethics, especially when it comes to machines and robots. Stories like 'I Have No Mouth, and I Must Scream' and 'Do Androids Dream of Electric Sheep?'/'Blade Runner' are some of my favorites for that reason. I kinda want my story to center on an ethical dilemma involving the robots as well, but I'd like to steer away from worn-out topics like "robots becoming self-aware."
Two of the ideas I had were:
Would it be immoral to give a robot the ability to feel pain in order to make it more human?
and
In a world where robots are advanced enough to think almost like humans and are given basic rights, would it be wrong to rewrite the programming of a robot that was hardwired for crime? Since it's hardwired, it will never consent to being rewritten, but it's not the robot's fault for being programmed that way. Is this just the robot equivalent of the death penalty?
I could go on and on about this stuff for ages, but I'd love to hear some input, or maybe some new ideas too!