The X-47B Unmanned Combat Air System (UCAS), developed by Northrop Grumman in cooperation with DARPA, is popularly referred to as ‘semiautonomous.’ A new variant is envisioned as an unmanned aerial tanker for carrier aircraft (Northrop Grumman photo)
Some see new opportunities and potential benefits in using autonomous drones, while others consider the development and use of such technology inherently immoral. Influential figures like Stephen Hawking, Elon Musk and Steve Wozniak have already urged a ban on warfare using autonomous weapons or artificial intelligence. So, where do we stand, and what are the main legal and ethical issues?
Towards autonomous drones
As yet, there is no agreed or legal definition of the term “autonomous drones”. Industry uses the “autonomy” label extensively, as it conveys an impression of very modern and advanced technology. Several nations, however, apply a more stringent definition; the United Kingdom, for example, describes autonomous systems as “…capable of understanding higher level intent and direction” (UK MoD, The UK Approach to Unmanned Aircraft Systems, 2011). Most military and aviation authorities instead call unmanned aerial vehicles “Remotely Piloted Aircraft” (RPAs) to stress that they fly under the direct control of human operators.
Most people would probably understand the concept of “autonomous drones” as something sophisticated, for instance, drones that can act on their own choice of options (what military terminology commonly defines as “system initiative” and “full autonomy”). Such drones are programmed with a large number of alternative responses to the different challenges they may meet in performing their mission. This is not science fiction – the technology is largely developed, though, to our knowledge, no approved autonomous drone systems are yet operational. The limiting factor is not the technology but rather the political will to develop, or admit to having, such politically sensitive technology, which would allow lethal machines to operate without direct human control.
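To make that model concrete, here is a minimal sketch in Python of a rule-based course-of-action selector; the event names, rules and responses are entirely hypothetical and purely illustrative. The point is the pattern: the machine only chooses among human-defined options, it does not invent new ones.

```python
# Toy illustration of "pre-programmed alternative responses".
# All events, rules and responses below are hypothetical.

# Human-defined mapping from mission events to permitted responses,
# listed in order of preference.
COURSE_OF_ACTION_TABLE = {
    "lost_datalink":      ["climb_and_reacquire", "return_to_base"],
    "low_fuel":           ["return_to_base"],
    "unidentified_radar": ["evade", "return_to_base"],
}

def select_course_of_action(event: str, feasible: set) -> str:
    """Pick the first pre-approved response that is currently feasible."""
    for response in COURSE_OF_ACTION_TABLE.get(event, []):
        if response in feasible:
            return response
    # No programmed response applies: fail safe rather than improvise.
    return "loiter_and_alert_operator"

print(select_course_of_action("lost_datalink", {"return_to_base"}))
# -> return_to_base
```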
One of the greatest challenges for the development and approval of aircraft with such technology is that it is extremely difficult to build satisfactory validation systems – systems that would ensure the technology is safe and acts as humans would. In practice, such sophisticated drones would involve programming for an enormous number of combinations of alternative courses of action, making it impossible to verify and test them to the level we are used to for manned aircraft. There are also those who take autonomy to mean “artificial intelligence” – systems that learn and even develop their own courses of action in response to new challenges. We are not aware of any imminent breakthrough in such technology, but many fear that one may be close.
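The scale of that verification problem is easy to illustrate with a rough, back-of-the-envelope calculation; the variables and counts below are invented purely for illustration, but the multiplicative growth is the point.

```python
from math import prod

# Illustrative numbers only: distinct values of a few independent
# mission variables that an exhaustive test campaign must cover.
variables = {
    "mission_phase": 6,
    "threat_type": 10,
    "weather_state": 5,
    "sensor_status": 8,
    "rules_of_engagement": 4,
    "programmed_responses": 50,
}

# Exhaustive testing must exercise every combination of values.
combinations = prod(variables.values())
print(f"{combinations:,} distinct situations to test")
# -> 480,000 distinct situations to test
```

Even with these modest, invented numbers the test space runs into the hundreds of thousands of cases, and every additional variable multiplies the total – which is why certifying such systems to the standards applied to manned aircraft is so hard.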
Autonomous drones – meaning advanced drones programmed with algorithms for countless human-defined courses of action to meet emerging challenges – are already being tested by a number of civilian universities and military research institutions. We see testing of “swarms of drones” (drones which follow and take tasks from other drones) that, of course, are entirely dependent on autonomous processing.
We also see testing of autonomous drones that operate alongside manned aircraft, ranging from what the US Air Force calls the (unmanned) "Loyal Wingman" to the already well-tested Broad Area Maritime Surveillance (BAMS) concept pairing P-8 Poseidon maritime patrol aircraft with unmanned MQ-4C Triton aircraft.
We also see further development of unmanned systems to be dispatched from manned aircraft, working independently or as an extension of the “mother aircraft” – for instance, the recently tested Perdix nano-drones, roughly 100 of which were released from F/A-18 aircraft. Such drones would necessarily operate with a high degree of autonomy.
These many developments and aspirations are well described in, for example, the US planning document USAF RPA Vector – Vision and Enabling Concepts 2013-2038, published in 2014; other documentation and even videos of such research are widely available. The prospects of autonomous technology, be it flying drones, underwater vehicles or other lethal weapon systems, clearly bring new opportunities for military forces.
In the case of flying aircraft, we have learned that there are long lead times in educating pilots and operators. One of the greatest changes the development of autonomous drones will bring is that military forces could, in the (near) future, build up great fighting power in much shorter timeframes than previously possible. It is important to note – as many have – that creating the infrastructure and educating ground crews to operate drones is no cheaper or easier than educating aircrew. Once in place, however, the drone crews and operations centres would be able to operate large numbers of drones.
Similarly, legacy manned aircraft would sit at the centre of a local combat or intelligence system extended with drones serving, for example, in supportive roles for jamming, as weapons-delivery platforms or as a system of multi-sensor platforms. Moving beyond the past limitation of one pilot flying one aircraft, or one crew flying one drone, to a situation where one crew controls large numbers of drones would quite simply be groundbreaking.
These prospects for new types of high-tech weapon systems – and the fears they raise – are the background for the research we conducted on autonomous drones and weapon systems. It is almost impossible to assess when these technologies will become widespread; this will depend on the situation and the needs of states. However, the technologies are becoming available and maturing, and we would argue that the difficult discussions of the legal and ethical challenges should be dealt with sooner rather than later.
The legal perspectives
General rules apply but it is not that simple
Autonomous drones, if and when they are used during armed conflict, would be subject to the general principles and rules of the Law of Armed Conflict. In this respect, autonomous drones are not to be distinguished from any other weapons, weapon systems or weapon platforms. As with any “means of warfare”, autonomous drones must only be directed at lawful targets (military objectives and combatants) and attacks must not be expected to cause excessive collateral damage. (end of excerpt)
Click here for the full story on the NATO website.
-ends-