Will the robot become my boss?
Law enforcement agencies are increasingly using robots for threat elimination, surveillance, and hostage situations, which makes me uneasy. Maybe I've just watched RoboCop too many times, but the idea of machines making life-and-death decisions unsettles me, especially considering how often actual officers abuse their power. Do I have any moral obligation to obey a police robot?
Hollywood has never been particularly optimistic about robots in positions of power. RoboCop is just one example from a broader sci-fi canon in which critical tasks are handed over to inflexible machines, plunging humans into tragic consequences. These robots' prime directives are blunt and exaggerated: they can kill a person or blow one up, yet are bewildered by a flight of stairs. The message of these films is clear: rigid automata cannot manage the improvisation and moral nuance that crises so often demand.
It is this stereotype, perhaps, that prompted Boston Dynamics (some of whose robots have been adopted by police departments) to release a video last December of its models dancing to the Contours' 1962 hit "Do You Love Me." Maybe you saw it? The robots include Atlas (a humanoid that resembles a deconstructed storm trooper) and Spot (the dog-like quadruped that inspired the killer robo-canine in the Black Mirror episode "Metalhead"). Neither machine seems designed to calm fears of a robot takeover. So what better way to endear them to the public than to show off their agility? And what better test of agility than dancing, supposedly so uniquely human a skill that we invented a dance move (the Robot) to mimic the stiffness an automaton cannot escape? Watching the machines shimmy, shake, and spin, it is hard not to see them as vital, embodied creatures, as limber and sensitive as ourselves.
Never mind that Spot's joints can sever a finger, or that police robots have already been used to deploy deadly force. One way to answer your question, setting aside any moral or philosophical appeal, is in terms of pragmatic consequences. If, like most of us, you have a plan to stay alive, then yes, you should certainly obey a police robot.
But I take it your question is not merely practical. I agree that it is important to consider the trade-offs involved in transferring police duties to machines. The Boston Dynamics video, incidentally, was released at the end of 2020 to "celebrate the beginning of what we hope will be a happier year." A week later, insurrectionists stormed the Capitol, and images flooded in of police offering little resistance to the mob. On social media, those photos were pointedly juxtaposed with the harsh police response to the Black Lives Matter protests of the previous summer.
At a time when many police departments face a crisis of authority over racial violence, the most persuasive argument for robotic policing is that machines have no inherent capacity for bias. To a robot, a person is a person, regardless of skin color, gender, or cause. As the White House noted in its 2016 report on algorithms and civil rights, new technologies have the potential to "help law enforcement make decisions based on factors and variables that empirically correlate with risk, rather than on flawed human instincts and prejudices."
Of course, if existing police technologies are any indication, things are not so simple. Predictive policing algorithms, used to identify high-risk people and neighborhoods, are notoriously prone to bias, which the roboticist Ayanna Howard has called "the original sin of AI." Because these systems rely on historical data (past court cases, previous arrests), they end up singling out communities that were unjustly targeted in the first place, reinforcing structural racism. Automated predictions can become self-fulfilling, locking certain neighborhoods into patterns of over-policing. (Officers who arrive at a location flagged as ripe for crime are primed to find criminals.) In other words, these tools do not eliminate bias; they merely formalize it, unknowingly and mechanically imprinting existing social inequities into systems that perpetuate them. As Kevin Macnish, a professor of digital ethics, has pointed out, the values of an algorithm's makers "are frozen into the code, effectively institutionalising those values."