The Perfect Soldier


The U.S. government is looking for "An army of one," and for soldiers who will "Be all that they can be." Unfortunately for the Pentagon, all soldiers have the same defect: they are human. Humans have emotions, fears, morals, and, worst of all, families who will ask questions when they die.

The Pentagon has a terrible solution for this human problem – and it's straight out of a science fiction movie. By 2010, it will have invested $4 billion in research toward a fully autonomous robot soldier. According to scientists working on the project, the goal is to program these machines to "not violate the Geneva Conventions" and to "perform more ethically than human soldiers."

The truth is certainly more nefarious. There are few limits to the lengths that the government will go to in enforcing U.S. policies. The state engages in torture, illegal wars, and "black ops." These policies reflect the wishes of our rulers, who are only restricted by the capacity of humans to follow orders and keep quiet. It should come as no surprise that the state would seek to remove the human element.

The Pentagon tells us that these robots will make decisions based on hard rules – not emotion, fear, or vengeance. They say these robots will perform more ethically than humans. To believe that this is the Pentagon's true motivation, we'd have to ignore history and common sense. Both tell us that we'd be naïve to take the government at face value.

Let's assume for a moment that the Pentagon's statements are truthful. It is foolish to believe that these robots would be defect-free. Remember – these defects would exist in a system designed to take human life. This plan carries many risks that should be considered.

Creating a robot soldier would be an extremely complex software engineering problem, fraught with opportunities for defects and logic errors. For instance, consider the behavior this machine would need to replicate. Humans take for granted most of the actions they perform on a daily basis – yet those actions require complex mathematical calculations and algorithms to replicate on a machine.

Think about all the steps necessary to use a simple handgun. Assuming the weapon is holstered and already loaded, you would draw the gun from the holster, raise your arm to point the weapon at the target, then aim and fire.

For a robot to replicate these actions would require thousands of instructions. Consider all of the calculations, not described above, that humans perform effortlessly – identifying and classifying the target, and the calculations necessary for aiming with some accuracy (which must take into account weather conditions, terrain, wind speed, the target's speed of movement, distance to the target, etc.). Replicating this in software is not a trivial exercise.
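To get a feel for how much machinery hides behind "aim and fire," here is a minimal, purely illustrative sketch in Python. Every function, constant, and threshold in it is a made-up simplification – nothing here comes from any real weapons program – but even this toy version shows how many decisions must be encoded, each one a place for a defect to hide.

    GRAVITY = 9.81           # m/s^2
    MUZZLE_VELOCITY = 360.0  # m/s, a rough handgun figure (illustrative only)

    def classify_target(signature):
        """Stand-in for the hardest step: deciding what is actually in view.
        A real system would need sensor fusion and image recognition here."""
        return signature.get("confidence_hostile", 0.0) > 0.95

    def aiming_solution(distance_m, target_speed_ms, crosswind_ms):
        """Toy external ballistics: how far ahead, how high, and how far into
        the wind to aim. Real ballistics also depends on drag, air density,
        terrain, and more."""
        time_of_flight = distance_m / MUZZLE_VELOCITY
        lead = target_speed_ms * time_of_flight        # where the target will be
        drop = 0.5 * GRAVITY * time_of_flight ** 2     # bullet drop over the flight
        drift = crosswind_ms * time_of_flight          # crude crosswind correction
        return lead, drop, drift

    def engage(signature, distance_m, target_speed_ms, crosswind_ms):
        """Every branch below is a place where a software defect could cost a life."""
        if not classify_target(signature):
            return "hold fire"
        lead, drop, drift = aiming_solution(distance_m, target_speed_ms, crosswind_ms)
        return (f"aim {lead:.2f} m ahead, {drop:.2f} m high, "
                f"{drift:.2f} m into the wind, then fire")

    # A target 150 m away, walking at 1.5 m/s, in a 4 m/s crosswind.
    print(engage({"confidence_hostile": 0.97}, 150.0, 1.5, 4.0))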

It would be foolish to think the software controlling this robot soldier could be defect-free. Writing code is a complex task – and humans are not perfect. No amount of testing would expose all of these defects prior to the first field use of a given system. If you've used any software product – whether Windows, Office, or an ATM – you know this.

We can find examples of defects being exposed in the field by looking at existing military projects – such as the Patriot Missile system. The Patriot system was designed to "provide a coordinated, secure, integrated, mobile air defense system." Of course, this is a complicated way of saying that the goal is to shoot down enemy missiles.

However, the Patriot system was far from bug free. These problems were hidden during military operations – state propaganda lauded the success of the Patriot during the Gulf War. Only when military operations ended did the true story become public.

Initially, President Bush claimed that the missile was responsible for the downing of 41 out of 42 Scud missiles. Experts later refuted this highly publicized success rate. In response to testimony and other evidence presented, the staff of the House Government Operations Subcommittee on Legislation and National Security reported, “The Patriot missile system was not the spectacular success in the Persian Gulf War that the American public was led to believe. There is little evidence to prove that the Patriot hit more than a few Scud missiles launched by Iraq during the Gulf War, and there are some doubts about even these engagements. The public and Congress were misled by statements of success issued by administration and Raytheon representatives during and after the war."

The Patriot failures in the Gulf War include a tragic loss of life on February 25, 1991. A Patriot battery was supposed to intercept a Scud fired into Saudi Arabia. Due to a mathematical error in the software, the system failed to track the incoming missile and no interception took place. The Scud did not miss its target, resulting in the deaths of 28 soldiers at the U.S. Army barracks in Dhahran. The error was a miscalculation of the time that had elapsed since the system was booted. As one analysis of the failure explains:

"Specifically, the time in tenths of seconds as measured by the system’s internal clock was multiplied by 1/10 to produce the time in seconds. This calculation was performed using a 24 bit fixed point register. In particular, the value 1/10, which has a non-terminating binary expansion, was chopped at 24 bits after the radix point. The small chopping error, when multiplied by the large number giving the time in tenths of a second, lead to a significant error. Indeed, the Patriot battery had been up around 100 hours, and an easy calculation shows that the resulting time error due to the magnified chopping error was about 0.34 seconds. (The number 1/10 equals 1/24+1/25+1/28+1/29+1/212+1/213+…. In other words, the binary expansion of 1/10 is 0.0001100110011001100110011001100…. Now the 24 bit register in the Patriot stored instead 0.00011001100110011001100 introducing an error of 0.0000000000000000000000011001100… binary, or about 0.000000095 decimal. Multiplying by the number of tenths of a second in 100 hours gives 0.000000095100606010=0.34.) A Scud travels at about 1,676 meters per second, and so travels more than half a kilometer in this time. This was far enough that the incoming Scud was outside the “range gate” that the Patriot tracked. Ironically, the fact that the bad time calculation had been improved in some parts of the code, but not all, contributed to the problem, since it meant that the inaccuracies did not cancel."

This was far from the only problem documented with the system.

Remember that these errors occurred with a "defensive" system. The Pentagon's robot soldier will not be for defensive purposes only. This machine will actively decide whether or not to end a human life. It will involve more complex operations than the Patriot – operations which will require massive research and development of an artificial intelligence. We have to assume that problems with a robot soldier would dwarf those encountered with the Patriot missile system.

The Pentagon claims these robots will be designed to conform to the Geneva Conventions. Still, who will pay the price if a logical software error causes a robot to commit a war crime – such as misidentifying a village full of children as a target that must be eliminated? Will the government behave like Microsoft and issue a "hot fix" for this bug?

The problems with this plan are not just technical. We should not trust the government's motives for pursuing this project – there are many obvious benefits it has not mentioned.

One immediate benefit of this robot is that it will remove a restraint on the government's ability to conduct war at a large scale – that is, the public criticism surrounding the loss of American lives. This benefit will certainly cause the project to gain a great deal of favor among Pentagon officials.

Initially, we'd likely see a limited rollout of these robotic troops for "high risk" situations. Eventually the government would systematically replace all front-line troops; only human support staff would be required.

Consider the occupation of Iraq – dangerous zones could be patrolled by these robots. They would certainly be more durable than humans, requiring no food, shelter or sleep. Only a power source to recharge their batteries and access to the central system would truly be necessary.

The public would have a completely different view of the war with the risk to American soldiers largely removed. The public and media generally ignore the effects of war on those we occupy. The potential for American military destruction abroad will not only increase; it will do so at the hands of soldiers that will never confess the details later on.

There are other benefits as well. Today, military leadership must concern itself with the size of the force, and with the negative PR associated with operations and with certain programs (e.g., "stop-loss"). These concerns will diminish or disappear when humans are taken out of the equation.

Once the Pentagon is able to mass produce these metallic beasts, there will be few restraints on their actions. In the long run, it will become easier to engage in war.

The logical next question is: "Why would the government only deploy this technology to the military?"

I suspect it wouldn't be long before these troops patrolled the streets of America as well. The public may be told that dangerous inner-city neighborhoods can be cleaned up without risk to the police force. Robots could replace SWAT teams, and even regular police officers.

This technology would be able to take advantage of the state's invasion of our privacy. Law enforcement robots would have immediate access to all of your records, recent transactions, and other private data through existing government and civic systems. You don't have to be paranoid to see the potential for further government intervention in your life. These tracking mechanisms are already being put in place – think of RFID tags and Real ID. It may sound like science fiction today, but a robot police force could arrest or fine a citizen based on a set of weighted criteria, using data gathered from these systems.
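To make the phrase "weighted criteria" concrete, here is a deliberately crude, entirely hypothetical sketch of what such a rule could look like. Every field name, weight, and threshold below is invented for illustration; the point is how little logic might stand between a database feed and an automated arrest.

    # Hypothetical illustration of a "weighted criteria" rule. None of these
    # fields, weights, or thresholds come from any real system.
    WEIGHTS = {
        "outstanding_fines":   0.4,   # normalized 0.0-1.0, from court records
        "flagged_purchases":   0.3,   # from transaction monitoring
        "rfid_location_flags": 0.2,   # from RFID / Real ID style tracking
        "watchlist_matches":   0.1,   # from interagency watchlists
    }
    DETAIN_THRESHOLD = 0.75           # arbitrary cutoff

    def detain_score(record):
        """Weighted sum of whatever the linked databases report about a citizen."""
        return sum(weight * record.get(field, 0.0)
                   for field, weight in WEIGHTS.items())

    citizen = {"outstanding_fines": 1.0, "flagged_purchases": 0.6,
               "rfid_location_flags": 1.0}
    score = detain_score(citizen)
    print(f"score = {score:.2f} ->",
          "detain" if score >= DETAIN_THRESHOLD else "no action")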

The full scenario described here may not play out in the short term. There is still a great deal of research to be done. However, the Pentagon has committed a huge sum of money toward the goal of an autonomous soldier, and it has already deployed remotely controlled robots (e.g., the Predator drone). It is closer than you think.

This advancement will be celebrated at home for the U.S. troops it will save. Technological advancements have enriched our lives in many ways. However, the potential for abuse at the hands of our government is too great to ignore. Instead of focusing on the "Terminator" scenario (robots turning against the human race), we should focus on the real threats posed by this technology. Are we willing to accept a machine which determines whether or not to end a human life?

December 22, 2008