<p>You may have noticed we have a soft spot for sci-fi author Isaac Asimov. His fiction, especially as it pertains to robotics, cemented him in the sci-fi canon and advanced the thinking on what practical robotics would look like in the future.</p><p>He also used his fiction to entertain a foreboding question: Should a robot be able to kill a human?</p><p>Asimov decided not, and drew up three "laws of robotics" that governed how robots behaved in his fictional universes.</p><p>They go like this:</p><ol><li>A robot may not injure a human being or, through inaction, allow a human being to come to harm.</li><li>A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.</li><li>A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.</li></ol><p><a href="http://www.businessinsider.com/asimov-3-laws-of-robotics-2014-1">Keep reading...</a></p>