The 3 Laws of Pentagon Robotics

The three laws of robotics, according to science fiction author Isaac Asimov, are:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

I would gladly have accepted a $20 million Pentagon contract for the job of pointing out these three laws. OK, maybe $25 million. Sadly, …