
11 October 2020

Treat AI As Intelligence — Not Technology

By BRYAN CLARK and DAN PATT

The US military is rolling out AI-enabled projects like the Air Force’s Advanced Battle Management System and the Army’s Project Convergence. But the novelty of these demonstrations and the effort required to pull them off suggest that, unlike Silicon Valley, DoD is struggling to incorporate AI into its combat systems, aircraft, ships, and other equipment.

DoD promulgated an Artificial Intelligence Strategy, established the Joint Artificial Intelligence Center (JAIC), and the services all stood up their own AI offices, so we know they’re trying hard. The problem is that these initiatives treat AI as a tool rather than a method for using a tool. For example, drawing on past efforts to field nuclear weapons and nuclear propulsion, the Defense Innovation Board recommended that the JAIC be made the central manager for all DoD AI efforts, and the vice chair of the National Security Commission on Artificial Intelligence proposed creating a Naval Reactors-like organization to accelerate the introduction of AI into the US military.

Recently the JAIC changed course and announced a series of moves that break with the flawed “AI is a thing” paradigm. By transforming itself into an enabler of AI adoption across the US military, the JAIC will treat AI as a technique bought as a service, complete with new contracting mechanisms for the continuous data management, model refinement, software development, and testing that defense organizations and vendors need to incorporate AI into their products and processes.

The model of AI as technique suggests a new way to think about its use by the military. Warfighters should treat AI as just another form of intelligence. Officers don’t need to be experts in biology to lead a division or squad; they need to understand their subordinates’ knowledge, motivations, strengths, and limitations. The same is arguably true of AI. Operators don’t need to know or write the algorithms employed in the AI-enabled processes they oversee, but they do need to understand the data those processes use, the algorithm’s goals or objective functions, and how the AI-enabled system will pursue those goals.
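To make that concrete, here is a minimal, hypothetical sketch of what “understanding the objective function” can look like; every name, number, and scoring rule below is invented for illustration. The point is that an operator’s questions live in what the score rewards and penalizes, not in how the planner searches.

```python
# Hypothetical sketch: all sectors, weights, and scoring logic are invented.
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Sector:
    name: str
    priority: float      # value of covering this sector
    threat_level: float  # risk incurred by entering it
    distance_km: float   # transit distance to reach it

def route_objective(route: list) -> float:
    """Higher is better: reward coverage, penalize fuel use and risk."""
    coverage = sum(s.priority for s in route)
    fuel_cost = 0.1 * sum(s.distance_km for s in route)
    risk = 5.0 * sum(s.threat_level for s in route)
    return coverage - fuel_cost - risk

sectors = [
    Sector("alpha", priority=8.0, threat_level=0.2, distance_km=40),
    Sector("bravo", priority=5.0, threat_level=1.5, distance_km=25),
    Sector("charlie", priority=3.0, threat_level=0.1, distance_km=10),
]

# A deliberately trivial "planner": score every subset of sectors and pick the best.
# How a real planner searches is the engineer's problem; what the score rewards
# and penalizes is the operator's problem.
candidates = [list(c) for r in range(len(sectors) + 1) for c in combinations(sectors, r)]
best = max(candidates, key=route_objective)
print([s.name for s in best], round(route_objective(best), 1))
```

If the weights on risk or fuel are set badly, the planner will pursue the wrong behavior no matter how clever its search; spotting that is an oversight task, not an engineering one.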

The recent dogfighting victory of an AI agent over a Fighter Weapons School graduate and the use of AI to defeat salvos of incoming missiles suggest AI will profoundly affect military operations. Some observers conclude from these demonstrations that, with enough data and computational power, AI may meet or exceed human intelligence. This is the wrong way to view AI’s military contributions. Instead of replacing humans, AI is causing an explosion in the forms of intelligence available to leaders and operators.

At its core, AI is an advanced method of information processing. There are many types of AI, including machine learning, natural language processing, planning systems, and expert systems. Each has distinct strengths and weaknesses that reflect different forms of intelligence. Expert systems quickly diagnose situations and propose solutions within defined bounds, like a good junior technician. Machine learning programs appear more intuitive, diagnosing by comparison to previous experience and using trial and error to formulate solutions. And some machine learning approaches, like deep neural nets, exhibit more creativity, linking what appear to be unrelated experiences and concepts to formulate answers or courses of action.
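As a rough illustration of those two styles of intelligence, the sketch below (hypothetical data and thresholds, and assuming scikit-learn is available) answers the same diagnostic question in two ways: an expert system whose rules a human wrote down explicitly, and a machine-learning model that infers a rule from labeled past cases.

```python
# Hypothetical illustration: all thresholds, cases, and labels are invented.
from sklearn.tree import DecisionTreeClassifier

# Expert-system style: a human encodes the diagnostic rules explicitly.
def expert_system_diagnosis(engine_temp_c: float, vibration_mm_s: float) -> str:
    if engine_temp_c > 110 and vibration_mm_s > 7.0:
        return "bearing failure likely"
    if engine_temp_c > 110:
        return "coolant fault likely"
    return "nominal"

# Machine-learning style: the rule is learned from past (made-up) cases.
past_cases = [          # [engine_temp_c, vibration_mm_s]
    [95, 2.0], [120, 8.5], [115, 3.0], [90, 1.5], [125, 9.0], [118, 2.5],
]
past_outcomes = ["nominal", "bearing failure likely", "coolant fault likely",
                 "nominal", "bearing failure likely", "coolant fault likely"]
model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(past_cases, past_outcomes)

# Both answer the same question; they differ in where the "intelligence" comes
# from: explicit human rules versus patterns found in data.
print(expert_system_diagnosis(121.0, 8.8))   # rule-based answer
print(model.predict([[121.0, 8.8]])[0])      # learned answer
```

The expert system is transparent but bounded by the rules it was given; the learned model generalizes from experience, but only as well as the cases it has seen.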

Training, Leading, and Recruiting

A hallmark of the industrial era was Taylorism, which assumed high performance in combat would result from establishing repeatability in cleanly defined tasks. This logic still guides much military training. But business and military leaders often find this approach flawed: different humans have different aptitudes and abilities. The training techniques that work best vary from person to person; they also vary from machine to machine. For example, algorithms that perform image recognition differ from those that perform language recognition and require a fundamentally different approach to training.

Today’s knowledge economy drives business leaders toward managing a diversity of intelligence, experience, and learning styles to get the most from their workforces. This mandate is growing with the introduction of AI-enabled systems. For example, given a common objective, what can each agent contribute with his, her, or its natural or artificial intelligence? A key role for future military leaders will be training, managing, and overseeing teams of human and AI-enabled machine subordinates.

Teaching leaders to manage rather than operate AI-enabled systems is very different from the approaches advanced by some technical experts and the National Security Commission on Artificial Intelligence, which propose a corps of specialists to operate and maintain AI-enabled systems or extensive training for the warfighters who will work with AI. Not only are these approaches expensive and likely to create skills that quickly atrophy, they also fail to address the fundamental issue: how military leaders can exploit the strengths of AI while mitigating its weaknesses and vulnerabilities. These are issues of behavior rather than technology.

Viewing AI as another form of intelligence also has implications for recruiting. The US military recruits almost all new servicemembers into a specialty for which they have some measured aptitude, such as electronics, nuclear engineering, or aviation. Program executives should think of AI in a similar way, investing to incorporate the most appropriate form of AI into existing or planned systems to complement human operators.

DoD must not continue treating AI as a tool or a product to be managed by a select few technical experts and organizations. Like software, AI-enabled algorithms pervade every commercial electronic product we touch today. The US military will need that same level of proliferation to affordably sustain the force and win in future conflicts. And as with commercial products, warfighters don’t need an engineering degree to get to know their new machine teammates.
