I feel society is making a big mistake by accusing modern technology of being evil while not considering the people behind it.

I suppose there can be 'good robots' and 'evil robots' depending on their programmer; 'good robots' can be corrupted, while 'evil robots' can be saved and redeemed.
I'm concerned about the robots that are "just following orders". For example, what will an autonomous car do when its brakes fail and it needs to decide in real time whether to run over a pedestrian or throw itself and its passengers into a ditch? Will it assign a numerical value to the lives of its passengers vs the life of the pedestrian and choose to sacrifice the party of lesser value? How would it do that?

It's a real-life trolley problem and it's coming your way soon.
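To make the question concrete, here is a minimal, entirely hypothetical sketch of the "assign a number to each outcome and sacrifice the lesser value" logic the question imagines. No real autonomous-vehicle system is claimed to work this way; the function name, option names, and harm scores are all invented for illustration:

```python
# Hypothetical utilitarian choice: score each outcome by estimated harm
# and pick the one with the lowest score. Purely illustrative.

def choose_action(outcomes):
    """Return the action whose estimated harm score is lowest.

    outcomes: dict mapping action name -> numeric harm score.
    """
    return min(outcomes, key=outcomes.get)

# Example: the brake-failure scenario, with made-up harm scores.
options = {
    "swerve_into_ditch": 2.0,   # invented estimate of harm to passengers
    "continue_straight": 5.0,   # invented estimate of harm to pedestrian
}

print(choose_action(options))  # -> swerve_into_ditch
```

The unsettling part, of course, is not the `min()` call but who decides the numbers that go into it.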
 
Oh dear heavens. There is a third option in the pedestrian-vs-passenger dilemma: simply find an alternate route so that no one gets killed. Seriously, robots don't have the intention to kill people unless they're being controlled by a corrupt human being. And no, it's not "just following orders"; that scenario is just a take on sadism. Only a sadist would pose the "who am I going to kill?" dilemma. On their own, robots cannot be sadistic. They cannot be good either. By themselves they're JUST robots. The "someone needs to die" mentality is an emotional decision, and robots are not capable of emotion.

Trust me, calm down and trust God. He knows what He’s doing.
 