

Even if the Brimstone doesn’t quite cross the line to an autonomous weapon, it takes one more half step toward it, to the point where all that is needed is a light shove to cross the line. An MMW-only Brimstone could be converted into a fully autonomous weapon simply by upgrading the missile’s engine so that it could loiter for longer. Or the MMW-only mode algorithms and seeker could be placed on a drone. Notably, the MMW-only mode is enabled in the missile by a software change. As autonomous technology continues to advance, more missiles around the globe will step right up to—or cross—that line.

Would the United Kingdom be willing to cross that line? The debate surrounding another British program, the Taranis drone, shows the difficulty in ascertaining how far the British might be willing to push the technology.

On the authority of mission command, Taranis would carry out a simulated firing and then return to base via the programmed flight path.

At all times, Taranis will be under the control of a highly-trained ground crew. The Mission Commander will both verify targets and authorise simulated weapons release.

This protocol keeps the human in the loop to approve each target, which is consistent with other statements by BAE leadership. In a 2016 panel at the World Economic Forum in Davos, BAE Chairman Sir Roger Carr described autonomous weapons as “very dangerous” and “fundamentally wrong.”

Carr made clear that BAE only envisioned developing weapons that kept a connection to a human who could authorize and remain responsible for lethal decision-making.

In a 2016 interview, Taranis program manager Clive Marrison made a similar statement that “decisions to release a lethal mechanism will always require a human element given the Rules of Engagement used by the UK in the past.” Marrison then hedged, saying, “but the Rules of Engagement could change.”

The British government reacted swiftly. Following multiple media articles alleging BAE was building in the option for Taranis to “attack targets of its own accord,” the UK government released a statement the next day stating:

The UK does not possess fully autonomous weapon systems and has no intention of developing or acquiring them. The operation of our weapons will always be under human control as an absolute guarantee of human oversight, authority and accountability for their use.

The British government’s full-throated denial of autonomous weapons would appear to be as clear a policy statement as there could be, but an important asterisk is needed regarding how the United Kingdom defines an “autonomous weapon system.” In its official policy, expressed in UK Joint Doctrine Note 2/11, “The UK Approach to Unmanned Aircraft Systems,” the British military describes an autonomous system as one that “must be capable of achieving the same level of situational understanding as a human.” Short of that, a system is defined as “automated.” This definition of autonomy, which hinges on the complexity of the system rather than its function, uses the term “autonomy” differently than many others in discussions on autonomous weapons, including the U.S. government. The United Kingdom’s stance is not a product of sloppy language; it’s a deliberate choice. The UK doctrine note continues:

As computing and sensor capability increases, it is likely that many systems, using very complex sets of control rules, will appear and be described as autonomous systems, but as long as it can be shown that the system logically follows a set of rules or instructions and is not capable of human levels of situational understanding, then they should only be considered to be automated.

This definition shifts the lexicon on autonomous weapons dramatically.

When the UK government uses the term “autonomous system,” it is describing systems with human-level intelligence that are more analogous to the “general AI” described by U.S. Deputy Defense Secretary Work. The effect of this definition is to shift the debate on autonomous weapons to far-off future systems and away from potential near-term weapon systems that may search for, select, and engage targets on their own—what others might call “autonomous weapons.” Indeed, in its 2016 statement to the United Nations meetings on autonomous weapons, the United Kingdom stated:

“The UK believes that [lethal autonomous weapon systems] do not, and may never, exist.” That is to say, Britain may develop weapons that would search for, select, and engage targets on their own; it simply would call them “automated weapons,” not “autonomous weapons.” In fact, the UK doctrine note refers to systems such as the Phalanx gun (a supervised autonomous weapon) as “fully automated weapon systems.” The doctrine note leaves open the possibility of their development, provided they pass a legal weapons review showing they can be used in a manner compliant with the laws of war.

In practice, the British government’s stance on autonomous weapons is not dissimilar from that expressed by U.S. defense officials. Humans will remain involved in lethal decision-making . . . at some level. That might mean a human operator launching an autonomous/automated weapon into an area and delegating to it the authority to search for and engage targets on its own. Whether the public would react differently to such a weapon if it were rebranded an “automated weapon” is unclear.

Even if the United Kingdom’s stance retains some flexibility, there is still a tremendous amount of transparency into how the U.S. and UK governments are approaching the question of autonomous weapons.

Weapons developers like BAE, MBDA, and Lockheed Martin have detailed descriptions of their weapon systems on their websites, which is not uncommon for defense companies in democratic nations. DARPA describes its research programs publicly and in detail. Defense officials in both countries openly engage in a dialogue about the boundaries of autonomy and the appropriate role of humans and machines in lethal force. This transparency stands in stark contrast to authoritarian regimes.