
ISSN: 1470-2436 (Print) 1743-9698 (Online) Journal homepage: https://www.tandfonline.com/loi/fdef20

AI, autonomy, and airpower: the end of pilots?

Arash Heydarian Pashakhanlou

To cite this article: Arash Heydarian Pashakhanlou (2019) AI, autonomy, and airpower: the end of pilots?, Defence Studies, 19:4, 337-352, DOI: 10.1080/14702436.2019.1676156

To link to this article: https://doi.org/10.1080/14702436.2019.1676156

© 2019 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.

Published online: 12 Oct 2019.



ARTICLE

AI, autonomy, and airpower: the end of pilots?

Arash Heydarian Pashakhanlou

Swedish Defence University, Stockholm, Sweden

ABSTRACT

Military pilots have long been central to airpower projection in both combat and non-combat operations. While the historical and contemporary roles of military aviators have been examined extensively in previous scholarship, the present study distinguishes itself by evaluating the future prospects of military aviators. By so doing, it argues that technological advances in autonomy and artificial intelligence (AI) will most likely lead to the development of pilotless aerial vehicles (PAVs), if current technological and social trends persist. In this new order, the military pilot will become a thing of the past.

ARTICLE HISTORY

Received 11 July 2019; Accepted 1 October 2019

KEYWORDS

Artificial intelligence; autonomous systems; airpower; pilots; pilotless aerial vehicles (PAVs)

Introduction

Pilots1 are instrumental in the armed forces of numerous nations. It is therefore hardly surprising that a vast body of literature on military aviators exists that assesses pilot selection (Bor et al. 2017, pp. 21–78), personalities (Chang et al. 2018), physical fitness (Rintala et al. 2015), job satisfaction (Ahmadi and Alireza 2007) and the impact of flying aces2, etc. In The Problem with Pilots (2018), Timothy P. Schultz evaluates the evolution of the pilot–aircraft relationship from 1903 to 2017, and posits that machines have increasingly assumed the tasks previously performed by pilots. In contrast with this retrospective view, the present investigation is primarily concerned with the time ahead, in particular with the implications of technological and social developments for the future of military pilots. It is the first study to analyze this issue in-depth.

The specific research question that this article seeks to address is as follows: What is the likely future of military pilots, in light of current technological and social developments? An analysis of a wide variety of sources, ranging from interdisciplinary scholarly work to military documents, indicates that autonomous technology and artificial intelligence (AI) will probably render the military pilot obsolete in the future.3 Indeed, the paper argues that manned aircraft with onboard pilots and unmanned aerial vehicles (UAVs)4 will likely give way to aircraft I refer to as pilotless aerial vehicles (PAVs). As a result, pilots will no longer be needed, but other humans may still be in the loop – developing, designing, testing and occasionally even making decisions on behalf of PAVs.

At this point, critics may contend that this research is speculative and that human agency makes it impossible to predict the pilot's future role. Objections of this kind fail to recognize that while human behavior can be extremely unpredictable, it can also be highly regular. Whilst black swan events (e.g. the impact of the Web) are virtually impossible to predict prospectively, stable historic trends are far easier to envisage (Hofman et al. 2017, p. 487). Even in these "easier" cases, it is still important to proceed with caution. Social events are usually complex, with multiple factors influencing outcomes, and their effects are difficult to predict with great precision. Yet, it is vital to pursue this line of inquiry, as falsifiable predictive studies are needed for the progression of the social sciences (Kaplan 1940, Hofman et al. 2017). The current study therefore aims to contribute to these efforts by examining the likely effects of cutting-edge technological innovations on the future of pilots.

CONTACT Arash Heydarian Pashakhanlou arash.h.pashakhanlou@gmail.com Drottning Kristinas väg 37, Stockholm, 27805, Sweden

This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.

This issue is covered in more depth in the five remaining sections of this article. The first part looks at the challenges associated with finding and retaining pilots, as well as their physical and psychological limitations. The next section demonstrates how technological solutions have already mitigated or overcome military aviators' deficiencies and in so doing marginalized the pilot's role. The third section argues that if the technological, military, political and institutional trends continue, pilots will most likely become redundant and be replaced by autonomous systems with AI. The fourth section continues in the same vein but addresses this issue from an economic, legal and moral perspective instead. The fifth and final section briefly summarizes the overarching argument and maintains that the end of human pilots is unlikely to occur anytime soon.

Pilot challenges

During the infancy of aviation, the standards for pilot selection were lacking in scientific rigor. This changed as aviation matured and armed forces sought to select candidates with the "right stuff", often favoring attributes such as high intelligence and above-average levels of extraversion and conscientiousness, combined with low levels of neuroticism.5 In this view, high intelligence enables pilots to complete their training successfully. Extraversion facilitates the pilot's interpersonal relationships with crew and staff, as well as accurate and decisive communication during combat missions. Conscientiousness is important since pilots control extremely costly weapons systems and perform demanding tasks. Finally, pilots need to remain calm and steady when encountering stressful situations during missions, hence the requirement of low levels of neuroticism (Wood et al. 2015). In addition to these traits, pilots are also expected to meet specific academic and physiological requirements concerning age, height, weight, fitness, and eyesight ("Royal Air Force," 2019; "U.S. Air Force," 2019a). Finding candidates that fulfil all these criteria is no easy task.

The challenge is not merely to find capable pilots but also to retain them once they have joined. Issues such as limited flying time, an increased emphasis on additional duties, and difficulties with maintaining work–life balance have prompted some pilots to reconsider their aviation career and leave for higher-paying professions with better prospects. The draw of commercial airlines hiring military pilots seems to have exacerbated the problem. One study finds that US Air Force pilot shortages from 1950 onwards typically coincide with aggressive hiring by airlines (Axe 2018). According to Lt. Gen. Gina M. Grosso, the airlines hired 4,100 pilots in 2016 alone and offer salaries that are on average 17 percent higher than those in the military (Parrish 2017).

The career of military pilots can also end prematurely due to injuries, mental health issues, and death, among other causes. During the Six-Day War in 1967, about 100 of Egypt's 350 qualified air-combat pilots were, for instance, killed within three hours by Israel's pre-emptive strikes (Henriksen 2018, p. 85). Even in the absence of such extreme outcomes, the full-length careers of military pilots tend to be relatively short. In the US Air Force, they are eligible to retire after 20 years, irrespective of their age ("U.S. Air Force," 2019b). In 2018, the world's oldest active fighter pilot, Phillip Frawley of the Royal Australian Air Force, retired at the age of 66 (Yeung 2018).

Consequently, pilot shortage is a major concern for countries around the world. In the US, the Air Force, the Navy, and the Marine Corps reported that the gap between the actual numbers of fighter pilots and authorizations (i.e. funded positions) in fiscal year 2017 was 27 percent, 24 percent and 26 percent, respectively (United States Government Accountability Office 2018). In November 2017, Air Force Secretary Heather Wilson disclosed that the US Air Force had a shortage of some 2,000 pilots (Daniels 2017). The Heritage Foundation's 2018 Index of U.S. Military Strength rated the Air Force as weak, partly due to its current and forecast deficit of fighter pilots (Venable 2017a). A 2018 National Audit Office report stated that the British Armed Forces were 800 pilots short, or 23 percent below requirement (Morse 2018, p. 19). Even in the second most populous country in the world, India, a 2015 parliamentary report revealed a fighter pilot shortage, with a cockpit-pilot ratio of 1:1.08 rather than the mandated 1:1.25 (Kainikara 2018).

The training of pilots poses a further difficulty, as it requires significant investments in time and money. According to Air Force officials, "it costs between $3-$11 million and takes approximately 5 years to develop an individual fighter pilot to lead combat missions" (United States Government Accountability Office 2018). In order to keep the costs manageable, pilot flying time may be lowered. In the US, most pilots reportedly have less than 150 hours of flying time per year, which often takes place in very benign conditions. This is insufficient to acquire and develop the skills needed to attain victory in a high-threat combat environment (Venable 2017b, pp. 5, 10). During the Second Chechen War in 1999, the average Russian pilot flying time was around 23–25 hours, whereas the average flying time was about 150 hours for Soviet pilots during the Cold War (de Haas 2003).6 The lack of training contributed to the relatively poor performance of the Russian Air Force during the Second Chechen War (Sutyagin 2018, pp. 319–320). Moreover, the training itself may be of poor standard, leading to deficits in pilot skills. Despite the implementation of a number of improvements, Chinese pilot training allegedly remains insufficient. According to a RAND report, Chinese pilots fall short in flight-lead, tactical and coordination skills and have difficulties operating autonomously as a result (Morris and Heginbotham 2016, pp. 26–27).

Even with ideal training, pilots with the "right stuff" suffer from all sorts of human cognitive and physiological limitations. All pilots are eventually overcome by hunger and fatigue, making it impossible for them to operate around the clock. Human senses are also inadequate for maintaining spatial orientation when flying in clouds, fog, or darkness. Pilots therefore become disoriented without instruments, irrespective of their skills and experience. Altitude poses another problem for the human body. At approximately 20,000 feet (6,096 meters), people begin to suffer from a severely deficient supply of oxygen to the body, while above 30,000 feet (9,144 meters) decompression sickness emerges. Flying duties can only be performed efficiently for a few seconds at 50,000 feet (15,240 meters), and at about 63,000 feet (19,202 meters) blood starts to boil at normal body temperature (Schultz 2018, p. 1). There are also limitations in terms of the G-forces that the human body can endure. Excessive G-force may lead to the loss of consciousness and even death (Venosa 2016). Finally, humans are susceptible to error that may in the worst case result in fatal outcomes. According to the 2015 Nall report, pilot-related errors accounted for roughly 74 percent of all total and fatal accidents in non-commercial fixed-wing aircraft (Geske 2015, p. 12). As a result of the identified issues, the pilot's role has become increasingly marginalized and replaced by more efficient technological solutions.

The decline of pilots

Initially, aircraft were technically simple by today's standards, with the pilot as their master, controlling them with his or her cognitive and physical faculties. With time, aircraft became more technologically advanced and increasingly took over tasks previously conducted by pilots. In 1912, the first autopilots emerged, enabling aircraft to maintain altitude and direction without any intervention from pilots. During the 1920s, gyroscopically based instruments were introduced that were far superior to the instincts of pilots in maintaining spatial orientation under low-visibility conditions (Schultz 2018, p. 6). With the advent of the Norden Bombsight during the Second World War, the computerized autopilot could autonomously fly the plane to the prime location, based on automatically measured conditions such as wind speed, and release bombs over the target. The pilot only needed to activate the autopilot to perform these tasks (Allen and Chan 2017, p. 13).

In modern aircraft, the technology has become even more refined and has further marginalized the pilot's role. Contemporary autopilots can take off, climb, cruise, descend, and land the aircraft without pilot involvement. Some aircraft can autonomously fly over rugged terrain at high speed in complete darkness. There are even automation systems that can take over the control of the aircraft from errant pilots to avoid collision (Schultz 2018, p. 6). The Automatic Ground Collision Avoidance System (AGCAS) takes control of the aircraft when the pilot is incapacitated in order to avoid a crash; AGCAS allegedly saved a U.S. F-16 in Syria from meeting this fate (Scharre 2018a). Obviously, automation systems are prone to error as well. A 2013 study of civil aviation by the Federal Aviation Administration contended that unexpected or unexplained behavior of automated systems was found in 46 percent of the accident reports (Nakamura et al. 2013). That said, the spread of automation is one of the main reasons the civil aviation accident rate has fallen from approximately four accidents per million flights in 1977 to less than 0.4 in 2018 (Wise 2019). Furthermore, a direct comparison reveals that in the early days of flight, approximately 80 percent of commercial airline accidents were caused by mechanical errors and the remaining 20 percent by human error. By 2003, these numbers had reversed (Rankin 2007, p. 16). In short, although machines are not immune to errors, they have become far more reliable than humans.

Generally, modern autopilots fly with greater precision than the best pilots. Moreover, modern electronic equipment has the capacity to detect enemy aircraft long before the pilot's naked eye, and acts as his or her visual cue. In aerial warfare, modern software and satellites can ideally guide bombs to within inches of their target. The world's most expensive fighter, the F-35, reportedly has an active electronically scanned array radar that automatically assesses targets at all ranges without pilot input. Its cameras, sensors, and radar provide data that is often highly processed and prioritized before reaching the pilot's senses. Under some conditions, the F-35 (like most other modern combat aircraft) can take action before the pilot even manages to react (Schultz 2018, pp. 1, 169, 176). In fact, the fighter's software is said to prevent the pilot from inadvertently putting the plane into unrecoverable spins and other aerodynamically unstable conditions (Scharre 2018a).

With the advent of UAVs, or drones, pilots have even been relegated from the cockpit. In the US, the demand for UAV pilots reportedly rose by 76 percent in the period 2013–2018, going from 1,366 to 2,404 (Vandiver 2019). Similar developments are to be expected in other nations, as nine countries have already employed armed UAVs in combat, and at least 20 countries are currently developing lethal drone programs (Zegart 2018, pp. 1–2). There are numerous advantages to removing pilots from the aircraft. Firstly, human limitations in coping with excessive G-force and altitude are no longer a factor. Secondly, and relatedly, without onboard pilots, drones can usually stay airborne longer than a manned aircraft. For instance, the manned surveillance airplane U-2 can remain airborne for up to 12 hours and presents a formidable challenge to the pilot, who reportedly feels "completely wiped" after the flight (Fisher 2010), while its unmanned counterpart, the RQ-4 Global Hawk Block 40, has successfully completed a 34.3-hour flight ("RQ-4 Global Hawk," 2014).

Thirdly, pilots are less vulnerable with UAVs, as drones can be operated from a remote location. For instance, between 2002 and mid-2015, the United States conducted approximately 568 drone strikes without a single pilot casualty (Zegart 2018, p. 14). During the same period, at least five American onboard pilots reportedly lost their lives in Operation Enduring Freedom alone ("Army Chief Warrant Officer 3 William T. Flanigan," 2006; "Enduring Freedom Casualties – Special Reports," 2008; "Michael Slebodnik, Chief Warrant Officer, United States Army," 2008; Faraj 2005; O'Brien 2013). These losses are not only costly but also have a negative impact on maintaining public support for the war. Indeed, studies demonstrate a strong and direct correlation between rising casualties and declining support in US public opinion (Gartner 2008).

Fourthly, without onboard pilots, equipment such as the cockpit, armor, ejection seat and flight controls is no longer required. As a result, UAVs can be constructed lighter and smaller. The space taken up by pilots in manned aircraft can also be utilized for the installation of technical equipment that can in turn perform more autonomous tasks, further marginalizing the pilot's function.7 This is evident in the Air Force Global Hawk and Army Gray Eagle drones, which fly autonomously once pilots have set their designated location. In fact, the Army does not even refer to these human controllers as pilots but as "operators". Yet, these human operators are still essential, as these UAVs are only capable of acting autonomously in the simplest of missions (Scharre 2018a).

The unmanned carrier-launched airborne surveillance and strike aircraft, the X-47B, has demonstrated a more extensive autonomous capability. In July 2013, the X-47B performed one of the most difficult tasks pilots can be confronted with: landing on an aircraft carrier. The X-47B can do so day or night and touch down successfully every single time (Schultz 2018, p. 155). Pilots, on the other hand, require extensive training before they are able to land on a moving carrier. They must touch down with precision and timing so that the tail hook catches the arresting wire and brings them to a stop before the short runway runs out. Pilots have died when attempting to complete this difficult landing (Skaine 1999, pp. 39–42). In addition to its landing ability, the X-47B successfully conducted the first fully autonomous aerial refueling in 2015. This capability spares the pilot the dangers associated with getting close to another aircraft and performing tricky maneuvers to refuel (Piesing 2016).

The military development and testing of swarming drones has also begun, whereby numerous small, inexpensive, cooperative and unmanned aircraft coordinate their actions and fight as a coherent unit to overwhelm the enemy. Swarms can perform offensive, defensive and supportive functions such as intelligence, surveillance, and reconnaissance missions, and enable the military to field forces that are cheaper, larger, faster and better coordinated. In October 2016, the US Defense Department launched a swarm of 103 Perdix drones flying in formation and demonstrating autonomous collective decision-making. Scholars speculate on the possibility of far larger swarms in the future. For the purposes of this article, it is important to note that, in contrast to individually operated UAVs, a single pilot can control an entire swarm, since the drones maintain their formation automatically (Lachow 2017, Scharre 2018b). If this is so, the number of pilots needed will decrease.
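The staffing arithmetic behind this claim can be sketched directly. Apart from the 103-drone Perdix demonstration, the figures below are hypothetical; the assumption, for illustration, is one controller per individually flown UAV versus one per autonomous swarm:

```python
import math

def controllers_needed(n_drones: int, drones_per_controller: int) -> int:
    """Controllers required if each can manage drones_per_controller aircraft."""
    return math.ceil(n_drones / drones_per_controller)

PERDIX_SWARM = 103  # size of the October 2016 Perdix demonstration

print(controllers_needed(PERDIX_SWARM, 1))             # individually flown: 103
print(controllers_needed(PERDIX_SWARM, PERDIX_SWARM))  # as one swarm: 1
```

Even a partial gain, say one controller per 40 aircraft, would cut the demand for human controllers by more than an order of magnitude relative to one-pilot-per-UAV operation.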

As this investigation illustrates, pilots have gone from being the masters of their aircraft to becoming their system managers. The tasks that machines have taken control of in planes have only increased over time, due to their superior performance. As a result, contemporary military pilots spend considerable time controlling the automatic flight control systems, and act as safety observers overseeing sensors and other mechanical components rather than exerting direct control (Schultz 2018, p. 172). More specifically, they have to manage internal and external subsystems to optimize the aircraft's functions; "remain alert to environmental factors such as weather, obstacles, and enemy threats; and communicate effectively with other aircrew members and supporting agents, including air traffic control, ground-based commanders, and command and control aircraft" (Schultz 2018, p. 165). Pilots remain, after all, more versatile than machines. Currently, both pilots and machines are therefore required to compensate for each other's weaknesses and optimize system performance as a whole. However, since technological development is far faster than human evolution, it is highly probable that machines will eventually outperform pilots to such an extent that this occupation will become redundant.

The end of pilots? Technological, military, political and institutional issues

To replace pilots, it is evident that PAVs must possess sufficient autonomy and intelligence. The real technological challenge in developing PAVs lies in making them adequately intelligent. Lethal autonomous weapons systems (LAWS) such as the Tomahawk Anti-Ship Missile (TASM) were already available in the 1980s. Yet, the fully operational TASM was never launched, since it was incapable of accurately distinguishing between enemy ships and merchant vessels. In other words, TASM possessed sufficient autonomy but insufficient AI to be used in practice (Scharre 2018a). Sceptics may therefore point out that AI technology is still in its infancy and the mathematical possibilities too complex for AI to understand context, choose between conflicting goals, handle new situations and interpret meaning (Ayoub and Payne 2016, p. 816, Payne 2018, p. 10).

These objections fail to recognize the advantages and potential of AI and technology. […] "supervision, with limited data for training, and to cope with ambiguous and asymmetric information" (Payne 2018, p. 8). With hybrid AI, systems can learn in a humanlike way by combining two rival AI approaches: neural pattern recognition and symbolism. In doing so, the key limitations of each approach are overcome, as neural pattern recognition allows the system to "see" and symbolism enables it to "reason" (Mao et al. 2019). Through networked computer agents, the AI system could also learn by studying its own activities or those of other agents in its network, and make inferences on the basis of far greater (and more confusing) data (Arkin 2010, p. 333, Levine et al. 2016, Payne 2018, p. 9). Moreover, AI coupled with Natural Language Processing can already interpret some types of meaning in enormous quantities of text far faster and with greater accuracy than a human being (Kruger 2019). With advances in robotics, sensors and increasingly powerful hardware, enhanced AI performance in all of these areas should therefore be possible, with operations conducted at staggering speeds that surpass those of the pilot.

Remarkably, an AI system dubbed ALPHA has repeatedly defeated a human pilot, namely the retired US Air Force Colonel Gene Lee, in multiple flight simulator trials. ALPHA managed to shoot down Lee in each protracted simulated engagement; Lee did not manage to score a single kill. ALPHA's main advantage over human pilots lies in its ability to collect and process the enormous amount of data from the aircraft's sensors and make extremely rapid decisions on how best to respond in the given situation. Humans, with an average visual reaction time of between 0.15 and 0.30 seconds, who need even longer to think of optimal plans, simply cannot emulate ALPHA's performance. Once trained, ALPHA can beat human pilots even while running on cheap computers or smartphones (Ernest et al. 2016).
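The scale of this speed gap can be made concrete with a simple calculation. Only the 0.15–0.30 second human reaction window comes from the text; the one-millisecond machine decision latency below is an illustrative assumption, not a figure reported in this article:

```python
# How many machine decision cycles fit inside a single human visual reaction?

def cycles_per_reaction(human_reaction_s: float, machine_cycle_s: float) -> int:
    """Complete machine decision cycles within one human reaction window."""
    return round(human_reaction_s / machine_cycle_s)

MACHINE_CYCLE_S = 0.001  # assumed 1 ms per machine decision (illustrative)

for human_s in (0.15, 0.30):  # visual reaction range cited in the text
    n = cycles_per_reaction(human_s, MACHINE_CYCLE_S)
    print(f"{human_s:.2f} s human reaction -> {n} machine decision cycles")
```

On this assumption, a machine could re-evaluate the tactical situation 150 to 300 times before the human has reacted even once, which is consistent with the article's description of ALPHA's rapid decision-making.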

Considering that technological development is far faster than human evolution, it is highly probable that PAVs will eventually outperform human-piloted aircraft in actual combat. In airpower theorist John Boyd's OODA loop, the goal is to complete the four steps, (1) observe, (2) orient, (3) decide and (4) act, faster than the adversary, in order to create confusion and disorder in his or her mind and attain victory (Osinga 2007). If success is achieved by completing the OODA loop faster than the antagonist, automation and AI will most likely eventually defeat any human pilot in combat. Through iterative learning, autonomous AI could develop skills and capacities that exceed what is humanly possible. It could learn to fly high-G maneuvers human pilots can only dream of, with much shorter reaction times and superior decision-making capacities (Altmann and Sauer 2017, p. 123). With advances in computer speed and AI, the gap between the speed and accuracy with which machines can complete the OODA loop, compared to pilots in non-complex environments, will likely only increase.
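Boyd's tempo argument can be rendered as a toy model: two adversaries cycle through observe-orient-decide-act, and the one with the shorter cycle completes more loops, and therefore more decisions, within the same engagement. The step times and engagement window below are hypothetical, chosen purely for illustration:

```python
# Toy model of competing OODA loops: the faster cycler decides more often.

OODA_STEPS = ("observe", "orient", "decide", "act")

def loops_completed(step_time_s: float, window_s: float) -> int:
    """Complete OODA loops finished within an engagement window."""
    cycle_s = step_time_s * len(OODA_STEPS)
    return int(window_s // cycle_s)

WINDOW_S = 60.0  # hypothetical one-minute engagement

human_loops = loops_completed(step_time_s=1.5, window_s=WINDOW_S)     # 6 s per loop
machine_loops = loops_completed(step_time_s=0.25, window_s=WINDOW_S)  # 1 s per loop

print(f"human: {human_loops} loops, machine: {machine_loops} loops")
```

Under these assumed cycle times, the machine completes six full loops for every human one; on Boyd's logic, that tempo differential alone would be decisive in non-complex environments.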

Militarily, there is little point in retaining pilots if PAVs are superior. Those who continue to rely on pilots may, after all, risk facing defeat due to their relatively inferior fighting capacity, an outcome countries seek to avoid. When the Prussian Armed Forces realized how to exploit new technologies such as railroads, rifles, and the telegraph to project power rapidly in the mid-19th century, other nations followed suit (Showalter 1975, Herrera and Mahnken 2003). There is no evidence to suggest that the situation would be different with autonomy and AI. The Russian President Vladimir Putin has already indicated that this will be the case when he publicly stated that the leader in AI […] security challenges will find such a solution appealing, as it would enable them to project airpower without pilot dependency. The United States Department of Defense (DoD) has even stated that contemporary "automation is too dependent on human beings" and that it "must continue to pursue technologies and policies that introduce a higher degree of autonomy to reduce the manpower burden" ("Unmanned systems integrated roadmap FY2011-2036," 2011, p. vi). The DoD also explicitly acknowledges that research and development are heading towards systems with greater autonomy, capable of making decisions and reacting without human input ("Unmanned systems integrated roadmap FY2013-2038," 2013, p. 68).

At this point, it could be argued that since pilots typically have a high standing in the armed forces, they would be in a position to halt the development of PAVs, which undermine and threaten their occupation. According to Horowitz (2018, p. 48), the US Armed Forces, for instance, failed to fund the autonomous X-47B drone mentioned earlier in this paper due to bureaucratic resistance. Whilst it is true that pilots are powerful actors, their role has diminished over the years without them being able to stop this process. Despite some projects, such as the X-47B drone, being abandoned due to such pressures, PAVs will most likely be developed as long as the overriding political and institutional objective of commanding the air as effectively as possible, whilst minimizing costs (politically, financially and in terms of soldiers' lives lost), remains paramount. There is little pilots can do to hinder this historical trend.

The testimony of the US Air Force chief scientist, Greg Zacharias, to Congress bears witness to the bleak future pilots will likely face. Zacharias (2015) noted that the goal is to integrate intelligent machines with humans and thereby maximize "mission performance in complex and contested environments." While Zacharias envisions a central role for humans in this scenario, he interestingly never mentions pilots specifically. Similarly, when media outlets reported that the RAF's new AI drone possessed the capacity to attack targets autonomously, a spokesperson from the Ministry of Defence simply remarked that the employment of weapons would remain under human control, with no reference to pilots (Allison 2016).

This comes as no surprise, given that existing aircraft require vast numbers of costly pilots in need of extensive training. This is perhaps partly why the Israelis developed the operational Harpy kamikaze drone in the 1990s. The Harpy carries explosives in its nose and attacks radar systems by self-destructing into them; it has been purchased by Chile, China, India, South Korea, and Turkey (Gertz 2002). Humans are only required to launch the Harpy, as it can search for, detect and engage targets fully autonomously, without pilot involvement.

The end of pilots? Economic, legal and ethical issues

In terms of financial costs, Cummings (2017, p. 10) maintains that only a small proportion of defense research and development money is spent on autonomous systems, whereas large sums are spent on traditional systems. The world has also experienced two major "AI winters", in the 1970s and 1980s, when funding and interest in this technology declined (Dickson 2018).8 On the basis of these facts, it could therefore be argued that the willingness to fund the expensive development of PAVs is lacking. Although it is true that military spending on autonomy and AI is relatively small, such an interpretation would be wrongheaded, as these technologies are growing in importance. In 2018, autonomy and robotics were the Pentagon's top priorities in the national defense strategy. The following year, the DoD requested a 28 percent increase for the development of these technologies, with planned spending of almost $7 billion in 2019 on unmanned aircraft, which are a vital stepping stone towards the development of PAVs (Harper 2018). According to Siemens, global military spending on robotics that can replace and replicate human actions increased from $5.1 billion in 2010 to $7.5 billion in 2015 (Getting to grips with military robotics 2018). Furthermore, the world is arguably experiencing an AI arms race, with major players such as China and the United States increasing spending on military AI (Allen 2019). It is possible that another AI winter may come about; however, that will almost certainly not be enough to stop the advancement. As researchers have rightly pointed out, major progress occurred even during AI winters (Kurzweil 2005, pp. 263–264).

Critics may also call attention to the point that PAVs are not covered by existing international law and that this may impede their development. This objection fails to recognize that only an extremely limited number of weapons, such as chemical weapons, are governed by specific international treaty rules. Other weapons, even if not yet developed, are subject to the Law of Armed Conflict (LOAC) (Anderson and Waxman 2017). Provided that PAVs satisfy the four principles of LOAC, namely military necessity, distinction, proportionality and unnecessary suffering, they are thus essentially legal.9 There is no inherent reason why PAVs would fail to satisfy these demands. If programmed correctly to comply with international law, PAVs could act more lawfully than piloted aircraft.10 This development may, however, take some time, since algorithms are "only optimal for well-understood or modeled situations" (Gilli and Gilli 2016, p. 79).

Another legal issue associated with PAVs is the concern regarding accountability and responsibility. Who is to be held accountable if something goes wrong? Although there is no consensus on this issue among legal experts, the Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS), chaired by India in 2017, established that the "responsibility for the deployment of any weapons system in armed conflict remains with states" and that they "must ensure accountability for lethal action" (Kane 2018, p. 6). As illustrated, numerous states have been willing to invest in autonomous military technology, and legal responsibility is thus highly unlikely to impede any efforts in the development of PAVs.

As a counter-argument, one could insist that the campaign to stop LAWS or "killer robots" may lead to a legal ban on PAVs. Global polls do, after all, indicate that the majority of the population is against the employment of LAWS, along with 28 countries and 100 non-governmental organizations ("Killer Robots," 2019). The movement against killer robots has had some success as well. When it was revealed that Google was involved in the US Department of Defense-funded Project Maven, which sought to autonomously process video footage shot by surveillance drones, over 4,000 Google employees protested. As a result, the company announced its decision to stop developing AI for weapons use ("Rise of the tech workers," 2019). Yet this is only a minor victory, as Russia, Israel, South Korea, and the United States have all indicated that they do not support negotiations for a new treaty banning killer robots (Human Rights Watch 2019).

Even if the development or employment of PAVs were made illegal, it is unclear whether such a ban would be respected in a world where there is no efficient global enforcement mechanism with the power to compel actors to abide by this law. This is especially true when breaking the law may prove extremely advantageous. For example, an international treaty from 1899 banned the use of weaponized aircraft for fear of aerial bombing (Allen and Chan 2017, p. 3). As history has shown, this voluntary treaty was disregarded in both world wars.

The final barrier against the development of PAVs is ethical. It could be argued that pilots are needed as human moral agents, since the internal logic of AI is not well understood and PAVs must be capable of contextual decision-making in a dynamic and non-deterministic manner (Gilli and Gilli 2016, pp. 77–79, Payne 2018, p. 11), to make informed decisions and minimize collateral damage. It is currently impossible to verify whether PAVs will be able to match or surpass piloted aircraft in satisfying the ethical imperative of minimizing collateral damage, as this is an empirical question for the future. There are, however, reasons to believe that they will likely be able to match, or even exceed, piloted aircraft in some situations.

Presently, AI can identify objects such as aircraft with on-board multi-spectral imaging (Hammes 2016). AI has already demonstrated its ability to recognize and categorize some images better than human beings in various tests. However, weaknesses were also exposed when more context, backstory, or proportional relationships were necessary (Tanz 2017). This barrier may be overcome, as researchers have, for instance, developed a new AI computer vision system that mimics humans' visualization and identification of objects (Chen et al. 2019). In the future, PAVs equipped with a superior sensory apparatus and the Global Information Grid, and devoid of psychological dispositions for revenge, may potentially make better battlefield observations, process extensive information faster and more accurately than pilots, and avoid immoral actions driven by psychological influences (see Arkin 2010, pp. 333–334).

Subsequently, these abilities could help PAVs distinguish between civilians and combatants more effectively, and keep collateral damage lower than the average pilot. Accomplishing this task is of course easier if the target is mechanized and combat is confined to an environment with few civilians rather than an urban setting. There are target identification systems that are already capable of reliably distinguishing between military objects (such as tanks and mechanized artillery, which often have distinctive silhouettes, radar, and infrared signatures) and civilian objects (such as regular cars, trucks and merchant ships) (Sparrow 2016, pp. 102–103). According to the UK Ministry of Defence (MoD), UAVs will be able to "independently locate and attack mobile targets, with appropriate proportionality and discrimination" by 2030 (Doward 2018).

Within crowded, complex environments, on the other hand, PAVs will have a much more difficult time differentiating between legitimate and illegitimate targets. For instance, it would be extremely difficult to determine whether a person carrying arms in a visually cluttered environment is an enemy or an ally. As the UK MoD (2018, p. 54) stated in a publication: "the last roles likely to be automated will be where personnel conduct activities that demand contextual assessment and agile versatility in complex, cluttered and congested operating areas". Perhaps nothing short of Artificial General Intelligence (AGI), with the capacity to understand or learn any intellectual task that a human being can, would be suitable for such demanding tasks. Yet progress in developing AGI has been slow, and some scholars question whether it will ever materialize. The argument put forward in this article does not rely upon the development of AGI, as it merely suggests that AI and autonomy will likely make military pilots redundant in the future. If PAVs are unable to make ethical targeting decisions under demanding conditions, they could therefore be designed to await human instruction when they encounter such difficulties. This is similar to when a human pilot receives an order on what course of action to pursue through communication from the ground. Under such conditions, the PAV will thus act semi-autonomously and receive human advice on how to target ethically. Hence, humans may remain in the loop when PAVs perform missions, but they will not be pilots. Elaborate human-machine teaming is expected by the UK MoD (2018) in the future.

Conclusion

This article has analyzed the evolving role of pilots in the armed forces. In doing so, it has highlighted a number of issues in recruiting pilots with the "right stuff", the costly and lengthy nature of their training, and the difficulties of retaining them once they have joined the services. Such factors have contributed to pilot shortages in numerous countries. In addition, pilots have physiological and psychological limitations, are susceptible to error, and are eventually overcome by fatigue and hunger. Tasks previously performed by pilots have therefore increasingly been assumed by machines with a proven superior capacity in these areas. As a result, pilots have been reduced to the system managers of aircraft.

Developments in AI, autonomy and other technology indicate that this process will only intensify over time. Indeed, an AI system named ALPHA has already defeated an experienced military pilot in flight simulator trials, and the X-47B drone has demonstrated an ability to complete aircraft carrier landings and aerial refueling autonomously, tasks that most pilots find challenging. Since technological progress occurs at a far quicker rate than human evolution, PAVs that render the pilot obsolete will likely be developed in the future if the overriding objective to command the air as effectively as possible whilst minimizing costs persists. The development of PAVs is expected to be fraught with technological challenges and bureaucratic resistance, and to be met with legal and ethical objections. Yet none of these issues pose an insurmountable barrier, as this study has indicated. In light of current trends, only black swan events may reasonably halt the emergence of PAVs. The central challenge for PAVs will probably be operations in complex and crowded environments, where human, but not pilot, involvement may be required in the absence of AGI. It will also likely take considerable time before even semi-autonomous PAVs dominate the skies. When that day comes, it will mark the end of pilots.

Notes

1. Pilot refers to those who control the flight of an aircraft, either from the cockpit or a remote location. They must have undergone training and received the appropriate pilot certifications to qualify as such. In this study, the main focus is on military pilots and especially fighter pilots.


2. A flying ace is a military pilot who has shot down several opponents, usually five or more, in aerial combat (Robertson 2003).

3. In this context, autonomy refers to the degree to which aircraft can search for, locate and engage targets on their own. When an aircraft can complete a task without human assistance, it is 'fully autonomous' in this respect. Should it require human aid, it is 'semi-autonomous'. AI refers to the intelligence the aircraft possesses. Intelligence "reflects a broader and deeper capability for comprehending … surroundings – 'catching on,' 'making sense' of things, or 'figuring out' what to do" (Gottfredson 1997, p. 13).

4. In this article, the terms UAVs and drones are used interchangeably and refer to aerial vehicles that do not have an onboard human pilot. As such, this definition includes so-called suicide or kamikaze drones with built-in warheads that attack targets by self-destructing into them.

5. Extraversion is typically manifested in outgoing, assertive and energetic behavior. Conscientiousness is associated with being hard-working, reliable, efficient, organized, self-disciplined and striving for achievement. Individuals with low levels of neuroticism are more emotionally stable, calm and even-tempered (Barrick and Mount 1991).

6. It should be noted that pilot flying time in Russia is measured differently than in the United States. The Russians only count the time that the aircraft is airborne, whereas the US Air Force starts calculating from the moment the aircraft moves on the ground under its own power to the moment it comes to a complete halt upon landing. Moreover, the average flight hours for Russian pilots had reportedly increased to about 120–125 hours in 2016 (Sutyagin 2018, pp. 320–321).

7. There are nonetheless a number of weaknesses with contemporary drones. Current UAVs such as the MQ-1 Predator and MQ-9 Reaper lack defensive capabilities, have limited maneuverability and fly slowly. They are deemed useless in contested environments and are susceptible to jamming and hacking (Kaag and Kreps 2014, Zegart 2018, p. 5).

8. There are different accounts as to how many AI winters there have been, when they took place and why they occurred. The causes of AI winters have been attributed to the overhyping of AI research that did not live up to promises and expectations, the lack of practical applications for AI research, as well as institutional and economic factors, etc. (Boobier 2018, p. 42).

9. Military necessity prohibits acts that are not essential from a military point of view. Distinction suggests that military operations may only be directed against combatants and specific military objectives. Proportionality specifies that attacks harming civilians or civilian objects may not exceed the concrete military advantage anticipated by such attacks. Unnecessary suffering seeks to reduce and alleviate human suffering in war (Bourbonnière 2004).

10. AI has already been used in judicial settings. Some states in the US employ AI systems that recommend criminal sentences. The Estonian Ministry of Justice is currently planning to launch an AI system to settle small claims disputes of less than €7,000 (about $8,000) (Niiler 2019).

Acknowledgments

I thank Stefan Borg, Rickard Lindborg, Mike Palmer, Dan Öberg and the anonymous reviewers for their helpful suggestions.

Disclosure statement


Notes on contributor

Arash Heydarian Pashakhanlou is an Assistant Professor in War Studies at the Swedish Defence University. His work has appeared in the journals International Relations, International Politics, Journal of International Political Theory and The Washington Quarterly, among others. Palgrave published his latest monograph Realism and Fear in International Relations: Morgenthau, Waltz and Mearsheimer Reconsidered.

References

Ahmadi, K. and Alireza, K., 2007. Stress and job satisfaction among air force military pilots. Journal of social sciences, 3, 159–163. doi:10.3844/jssp.2007.159.163

Allen, G., 2019. Understanding China's AI strategy: clues to Chinese strategic thinking on artificial intelligence and national security. Washington, DC: Center for a New American Security.
Allen, G. and Chan, T., 2017. Artificial intelligence and national security. Cambridge, MA: Belfer Center for Science and International Affairs.

Allison, G., 2016. British drones "will always be under human control". UK Defence Journal. URL https://ukdefencejournal.org.uk/british-drones-will-always-human-control/ [Accessed 27 March 2019].

Altmann, J. and Sauer, F., 2017. Autonomous weapon systems and strategic stability. Survival, 59, 117–142. doi:10.1080/00396338.2017.1375263
Anderson, K. and Waxman, M.C., 2017. Debating autonomous weapon systems, their ethics, and their regulation under international law. In: R. Brownsword, E. Scotford, and K. Yeung, eds. The Oxford handbook of law, regulation and technology. Oxford: Oxford University Press, 1097–1117.
Arkin, R.C., 2010. The case for ethical autonomy in unmanned systems. Journal of military ethics, 9, 332–341. doi:10.1080/15027570.2010.536402

Army Chief Warrant Officer 3 William T. Flanigan [WWW Document], 2006. Military Times. URL https://thefallen.militarytimes.com/army-chief-warrant-officer-3-william-t-flanigan/1941909 [Accessed 13 June 2019].

Axe, D., 2018. What's driving the U.S. Air Force pilot shortage? [WWW Document]. Foreign Policy. URL https://foreignpolicy.com/2018/05/04/whats-driving-the-u-s-air-force-pilot-shortage/ [Accessed 26 February 2019].

Ayoub, K. and Payne, K., 2016. Strategy in the age of artificial intelligence. Journal of strategic studies, 39, 793–819. doi:10.1080/01402390.2015.1088838
Barrick, M.R. and Mount, M.K., 1991. The big five personality dimensions and job performance: a meta-analysis. Personnel psychology, 44, 1–26. doi:10.1111/peps.1991.44.issue-1
Boobier, T., 2018. Advanced analytics and AI: impact, implementation, and the future of work. Sussex: John Wiley & Sons.
Bor, R., et al., eds., 2017. Pilot mental health assessment and support: a practitioner's guide. New York: Routledge.
Bourbonnière, M., 2004. Law of armed conflict (LOAC) and the neutralisation of satellites or ius in bello satellitis. Journal of conflict and security law, 9, 43–69. doi:10.1093/jcsl/9.1.43
Chang, M.-C., Lee, T.-H., and Lung, F.-W., 2018. Personality characteristics of fighter pilots and ground personnel. Military psychology, 30, 70–78. doi:10.1080/08995605.2017.1420977
Chen, L., et al., 2019. Brain-inspired automated visual object discovery and detection. PNAS, 116, 96–105. doi:10.1073/pnas.1802103115

Cummings, M., 2017. Artificial intelligence and the future of warfare. London: Chatham House for the Royal Institute of International Affairs.
Daniels, J., 2017. Air Force secretary warns pilot shortage worsens, people "burning out."
de Haas, M., 2003. The use of Russian air power in the second Chechen war. The royal air force air power review.
Dickson, B., 2018. What is the AI winter? [WWW Document]. TechTalks. URL https://bdtechtalks.com/2018/11/12/artificial-intelligence-winter-history/ [Accessed 24 April 2019].

Doward, J., 2018. Britain funds research into drones that decide who they kill, says report. The Guardian.
Enduring Freedom Casualties - Special Reports [WWW Document], 2008. CNN. URL http://edition.cnn.com/SPECIALS/2004/oef.casualties/page4.html [Accessed 13 June 2019].
Ernest, N., et al., 2016. Genetic fuzzy based artificial intelligence for unmanned combat aerial vehicle control in simulated air combat missions. Journal of defense management, 6, 1–7.
Faraj, C., 2005. Pilot of U.S. spy plane killed [WWW Document]. CNN. URL http://edition.cnn.com/2005/US/06/22/spy.plane.crash/index.html [Accessed 13 June 2019].
Fisher, F., 2010. U-2s challenge pilots' endurance in the air [WWW Document]. Stars and Stripes. URL https://www.stripes.com/news/u-2s-challenge-pilots-endurance-in-the-air-1.97772 [Accessed 17 September 2019].

Gartner, S.S., 2008. The multiple effects of casualties on public support for war: an experimental approach. American political science review, 102, 95–106. doi:10.1017/S0003055408080027
Gertz, B., 2002. China deploys drones from Israel. Washington Times.
Geske, R.C., 2015. 27th Joseph T. Nall report: general aviation accidents in 2015 (Text). Frederick, Maryland: AOPA Air Safety Institute.
Getting to grips with military robotics, 2018. The Economist.

Gigova, R., 2017. Who Putin thinks will rule the world [WWW Document]. CNN. URL https://www.cnn.com/2017/09/01/world/putin-artificial-intelligence-will-rule-world/index.html [Accessed 4 April 2019].
Gilli, A. and Gilli, M., 2016. The diffusion of drone warfare? Industrial, organizational, and infrastructural constraints. Security studies, 25, 50–84. doi:10.1080/09636412.2016.1134189

Gottfredson, L.S., 1997. Mainstream science on intelligence: an editorial with 52 signatories, history, and bibliography. Intelligence, 24, 13–23. doi:10.1016/S0160-2896(97)90011-8

Greg, 2015. Advancing the science and acceptance of autonomy for future defense systems. Washington: U.S. Government Publishing Office.

Hammes, T.X., 2016. Technologies converge and power diffuses: the evolution of small, smart, and cheap weapons. Washington, D.C: Cato Institute.

Harper, J., 2018. Spending on unmanned systems set to grow. National Defense Magazine.
Henriksen, D., 2018. Control of the air. In: J.A. Olsen, ed. Routledge handbook of air power. London: Routledge, 82–94.

Herrera, G. and Mahnken, T., 2003. Military diffusion in nineteenth-century Europe: the Napoleonic and Prussian military systems. In: E.O. Goldman and L.C. Eliason, eds. The diffusion of military technology and ideas. Stanford: Stanford University Press, 205–242.

Hofman, J.M., Sharma, A., and Watts, D.J., 2017. Prediction and explanation in social systems. Science, 355, 486–488. doi:10.1126/science.aal3856
Horowitz, M., 2018. Artificial intelligence, international competition, and the balance of power. Texas national security review, 1, 36–57.
Human Rights Watch, 2019. Germany: support a ban on 'Killer Robots' [WWW Document]. Human Rights Watch. URL https://www.hrw.org/news/2019/03/14/germany-support-ban-killer-robots [Accessed 28 March 2019].

Kaag, J. and Kreps, S., 2014. Drone warfare. Cambridge: Polity Press.
Kainikara, S., 2018. Indian air power. In: J.A. Olsen, ed. Routledge handbook of air power. London: Routledge, 327–338.
Kane, A., 2018. Lethal autonomous weapons systems: can the international community agree on an approach? Presented at the Uehiro-Carnegie-Oxford Ethics Conference, New York.
Kaplan, O., 1940. Prediction in the social sciences. Philosophy of science, 7, 492–498. doi:10.1086/286658
Killer Robots [WWW Document], 2019. Campaign to stop killer robots. URL https://www.stopkillerrobots.org/ [Accessed 28 March 2019].

Kruger, E., 2019. AI in humanities: teaching machines to read and interpret classical texts [WWW Document]. SuperPosition. URL https://sci-techmaven.io/superposition/tech/ai-in-humanities-teaching-machines-to-read-and-interpret-classical-texts-sJtNArKpAk221MnmG1hqig/ [Accessed 4 April 2019].
Kurzweil, R., 2005. The singularity is near: when humans transcend biology. New York: Viking.
Lachow, I., 2017. The upside and downside of swarming drones. Bulletin of the atomic scientists, 73, 96–101. doi:10.1080/00963402.2017.1290879

Levine, S., Lillicrap, T., and Kalakrishnan, M., 2016. How robots can acquire new skills from their shared experience. Google AI Blog. URL http://ai.googleblog.com/2016/10/how-robots-can-acquire-new-skills-from.html

Mao, J., et al., 2019. The neuro-symbolic concept learner: interpreting scenes, words, and sentences from natural supervision. Presented at the International Conference on Learning Representations.
Melnyk, A., 1996. Searle's abstract argument against strong AI. Synthese, 108, 391–419. doi:10.1007/BF00413696
Michael, S., 2008. Chief warrant officer, United States Army [WWW Document]. Arlington National Cemetery. URL http://arlingtoncemetery.net/michael-slebodnik.htm [Accessed 13 June 2019].

Morris, L.J. and Heginbotham, E., 2016. From theory to practice: people's liberation army air force aviation training at the operational unit. Santa Monica: RAND Corporation.
Morse, A., 2018. Ensuring sufficient skilled military personnel. London: National Audit Office.
Nakamura, D., McKenney, D., and Railsback, P., 2013. Operational use of flight path management systems. Washington, D.C: Federal Aviation Administration.
Niiler, E., 2019. Can AI be a fair judge in court? Estonia thinks so. Wired.

O'Brien, M., 2013. U.S. pilot killed in jet crash in Afghanistan [WWW Document]. IDGA. URL https://www.idga.org/special-operations/news/u-s-pilot-killed-in-jet-crash-in-afghanistan [Accessed 13 June 2019].

Osinga, F.P.B., 2007. Science, strategy and war: the strategic theory of John Boyd. London: Routledge.
Parrish, K., 2017. Air force official details 'National Aircrew Crisis' [WWW Document]. U.S. Department of Defense. URL https://dod.defense.gov/News/Article/Article/1134560/air-force-official-details-national-aircrew-crisis/ [Accessed 27 February 2019].
Payne, K., 2018. Artificial intelligence: a revolution in strategic affairs? Survival, 60, 7–32. doi:10.1080/00396338.2018.1518374

Piesing, M., 2016. The aerial tankers that helped shrink the globe [WWW Document]. BBC. URL http://www.bbc.com/future/story/20161220-the-aerial-tankers-that-helped-shrink-the-globe [Accessed 19 March 2019].
Rankin, W., 2007. MEDA investigation process. Aero quarterly, 2, 15–21.

Rintala, H., et al., 2015. Relationships between physical fitness, demands of flight duty, and musculoskeletal symptoms among military pilots. Military medicine, 180, 1233–1238. doi:10.7205/MILMED-D-14-00467

Rise of the tech workers [WWW Document], 2019. Campaign to stop killer robots. URL https://www.stopkillerrobots.org/2019/01/rise-of-the-tech-workers/ [Accessed 28 March 2019].
Robertson, L.R., 2003. The dream of civilized warfare: world war I flying aces and the American imagination. Minneapolis: University of Minnesota Press.
Royal Air Force [WWW Document], 2019. URL https://www.raf.mod.uk/recruitment/roles/roles-finder/aircrew/pilot [Accessed 19 February 2019].

RQ-4 Global Hawk [WWW Document], 2014. U.S. Air Force. URL https://www.af.mil/About-Us/Fact-Sheets/Display/Article/104516/rq-4-global-hawk/ [Accessed 17 September 2019].
Scharre, P., 2018a. Army of none: autonomous weapons and the future of war. New York: W. W. Norton & Company.
Scharre, P., 2018b. How swarming will change warfare. Bulletin of the atomic scientists, 74, 385–389. doi:10.1080/00963402.2018.1533209

Schultz, T.P., 2018. The problem with pilots: how physicians, engineers, and airpower enthusiasts redefined flight. Baltimore: JHU Press.
Showalter, D.E., 1975. Railroads and rifles: soldiers, technology and the unification of Germany. Hamden, CT: Archon Books.


Skaine, R., 1999. Women at war: gender issues of Americans in combat. Jefferson, North Carolina: McFarland.
Sparrow, R., 2016. Robots and respect: assessing the case against autonomous weapon systems. Ethics & international affairs, 30, 93–116. doi:10.1017/S0892679415000647
Sutyagin, I., 2018. Russian air power. In: J.A. Olsen, ed. Routledge handbook of air power. London: Routledge, 313–326.

Tanz, O., 2017. Can artificial intelligence identify pictures better than humans? [WWW Document]. Entrepreneur. URL https://www.entrepreneur.com/article/283990 [Accessed 21 March 2019].

U.S. Air Force [WWW Document], 2019a. URL https://www.airforce.com/frequently-asked-questions/medical/how-bad-can-my-vision-be-to-qualify-for-the-air-force [Accessed 19 February 2019].
U.S. Air Force [WWW Document], 2019b. URL https://www.airforce.com/ [Accessed 27 September 2019].

UK Ministry of Defence, 2018. Joint Concept Note 1/18, Human-Machine Teaming.
United States Government Accountability Office, 2018. Military personnel: DOD needs to reevaluate fighter pilot workforce requirements. Washington, D.C: United States Government Accountability Office.
Unmanned systems integrated roadmap FY2011-2036, 2011.
Unmanned systems integrated roadmap FY2013-2038, 2013.
Vandiver, J., 2019. Promotion rates improving for Air Force drone pilots, GAO says [WWW Document]. Stars and Stripes. URL https://www.stripes.com/news/promotion-rates-improving-for-air-force-drone-pilots-gao-says-1.567839 [Accessed 14 September 2019].

Venable, J., 2017a. A plan for keeping the U.S. Air Force's best pilots in service [WWW Document]. The Heritage Foundation. URL https://www.heritage.org/defense/commentary/plan-keeping-the-us-air-forces-best-pilots-service [Accessed 20 February 2019].
Venable, J., 2017b. Independent capability assessment of U.S. Air Force reveals readiness level below Carter administration hollow force. Washington, D.C: The Heritage Foundation.
Venosa, A., 2016. Breaking point: how many G-forces can humans tolerate? [WWW Document]. Medical Daily. URL https://www.medicaldaily.com/breaking-point-whats-strongest-g-force-humans-can-tolerate-369246 [Accessed 28 February 2019].

Wise, J., 2019. The Boeing 737 Max and the problems autopilot can't solve. The New York Times.

Wood, J., Shurlow, C., and Haynes, J., 2015. Objective versus subjective military pilot selection methods in the United States of America. OH: USAF School of Aerospace Medicine, Wright-Patterson AFB.

Yeung, J., 2018. The world's oldest fighter pilot retires at 66. CNN.

Zegart, A., 2018. Cheap fights, credible threats: the future of armed drones and coercion. Journal of strategic studies, 1–41. doi:10.1080/01402390.2018.1439747
