
11 March 2023

The US Air Force Is Moving Fast on AI-Piloted Fighter Jets


ON THE MORNING of December 1, 2022, a modified F-16 fighter jet codenamed VISTA X-62A took off from Edwards Air Force Base, roughly 60 miles north of Los Angeles. Over the course of a short test flight, the VISTA engaged in advanced fighter maneuver drills, including simulated aerial dogfights, before landing successfully back at base. While this may sound like business as usual for the US’s premier pilot training school—or like scenes lifted straight from Top Gun: Maverick—it was not a fighter pilot at the controls but, for the first time on a tactical aircraft, a sophisticated AI.

Overseen by the US Department of Defense, VISTA X-62A undertook 12 AI-led test flights between December 1 and 16, totaling more than 17 hours of autonomous flight time. The breakthrough comes as part of Skyborg, a United States Air Force Vanguard program to develop unmanned combat aerial vehicles. Initiated in 2019, the program will continue testing through 2023, with hopes of producing a working prototype by the end of the year.

The VISTA program is a crucial first step toward these goals, explains M. Christopher Cotting, director of research at the USAF Test Pilot School. “This approach, combined with focused testing on new vehicle systems as they are produced, will rapidly mature autonomy for uncrewed platforms and allow us to deliver tactically relevant capability to our warfighter,” he says.

With Ukraine’s use of semiautonomous drones, the US military’s first autonomous flight of a Black Hawk helicopter last November, and the successful testing of AI algorithms in US U-2 spy planes in 2020, it’s clear that autonomous combat represents the next front in modern warfare. But just how completely will AI take over our skies, and what does it mean for the human pilots left on the ground?

The VISTA X-62A (short for Variable In-flight Simulation Test Aircraft) has always been ahead of its time. Built in the 1980s and based on an F-16D Block 30 Peace Marble II, the plane previously held the designation NF-16D and became the US Air Force Test Pilot School’s go-to simulation machine in the early 1990s. A versatile and adaptable training tool boasting open systems architecture, the VISTA can be fitted with software that allows it to mimic the performance characteristics of multiple aircraft, from heavy bombers to ultra-light fighter jets.
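
The technique behind this, broadly known as model following, has the jet’s flight control system continuously steer the real aircraft’s response to track a software model of whatever aircraft is being imitated. The Python sketch below is a minimal, hypothetical illustration of that loop; the first-order dynamics, gains, and names are assumptions made for clarity, not anything from the actual X-62A flight software.

```python
# Hypothetical sketch of model following: command the host jet so that its
# response tracks a reference model of the aircraft being mimicked.
# All dynamics, gains, and names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ReferenceModel:
    """First-order pitch-rate model of the aircraft being simulated."""
    time_constant: float  # how sluggish or crisp the mimicked jet feels (s)
    gain: float           # steady-state pitch rate per unit stick deflection

    def step(self, q_model: float, stick: float, dt: float) -> float:
        # Simple lag toward the commanded pitch rate: q_dot = (gain*stick - q) / tau
        return q_model + dt * (self.gain * stick - q_model) / self.time_constant

def model_following_command(q_model: float, q_actual: float, k_p: float) -> float:
    """Proportional correction that drives the host jet toward the model's response."""
    return k_p * (q_model - q_actual)

# Two very different aircraft expressed purely as software parameters.
bomber = ReferenceModel(time_constant=2.0, gain=2.0)    # slow, heavy response
fighter = ReferenceModel(time_constant=0.3, gain=8.0)   # crisp, agile response

q_model, q_actual, dt = 0.0, 0.0, 0.02
for _ in range(100):                        # two seconds of simulated flight
    stick = 1.0                             # constant pilot (or AI) input
    q_model = fighter.step(q_model, stick, dt)
    elevator_cmd = model_following_command(q_model, q_actual, k_p=0.5)
    q_actual += dt * elevator_cmd * 10.0    # toy plant standing in for the host F-16
print(f"model pitch rate: {q_model:.2f}, host pitch rate: {q_actual:.2f}")
```

Swap in the bomber model instead of the fighter and the same airframe suddenly “feels” like a different jet, which is what makes VISTA such a flexible training and test platform.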

Prior to last year’s autonomous flight tests, the VISTA received a much-needed update in the form of a “model following algorithm” (MFA) and a “system for autonomous control of the simulation” (SACS) from Lockheed Martin’s Skunk Works. Combined with the VISTA Simulation System from defense and aerospace company Calspan Corporation, these updates facilitated an emphasis on autonomy and AI integration.

Utilizing General Dynamics’s Enterprise-wide Open Systems Architecture (E-OSA) to power the Enterprise Mission Computer version 2 (EMC2, or Einstein Box), the SACS system also integrates advanced sensors, a set of Getac tablet displays in both cockpits, and multilevel security features. These additions enhance VISTA’s capabilities, most notably its rapid-prototyping advantage: software updates can be rolled out quickly enough to keep pace with the accelerating development of AI.
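
That rapid-prototyping advantage comes largely from the open architecture: the autonomy software is decoupled from the flight-critical systems behind a common interface, so new agents can be dropped in without rewriting the rest of the stack. The sketch below is a hypothetical, simplified illustration of such a plug-in contract in Python; the interface, registry, and placeholder behavior are assumptions for illustration, not General Dynamics’s or Lockheed Martin’s actual design.

```python
# Hypothetical sketch of an open-systems "plug-in" contract for autonomy
# software: any agent implementing a common interface can be loaded onto the
# mission computer without touching flight-critical code. The interface,
# registry, and placeholder behavior are assumptions, not the real design.

from abc import ABC, abstractmethod
from typing import Dict, Type

class AutonomyAgent(ABC):
    """Contract every swappable autonomy module must satisfy."""

    @abstractmethod
    def decide(self, sensor_frame: dict) -> dict:
        """Map one frame of fused sensor data to control commands."""

REGISTRY: Dict[str, Type[AutonomyAgent]] = {}

def register(name: str):
    """Decorator so new agents can be dropped in and selected by name."""
    def wrap(cls: Type[AutonomyAgent]) -> Type[AutonomyAgent]:
        REGISTRY[name] = cls
        return cls
    return wrap

@register("straight_and_level")
class StraightAndLevel(AutonomyAgent):
    def decide(self, sensor_frame: dict) -> dict:
        # Trivial placeholder behavior: hold wings level, no pitch input.
        return {"roll_cmd": 0.0, "pitch_cmd": 0.0}

def load_agent(name: str) -> AutonomyAgent:
    # Swapping algorithms becomes a one-line configuration change.
    return REGISTRY[name]()

agent = load_agent("straight_and_level")
print(agent.decide({"altitude_ft": 20000, "heading_deg": 270}))
```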

During testing in December, a pair of AI programs were fed into the system: the Air Force Research Laboratory’s Autonomous Air Combat Operations (AACO) and the Defense Advanced Research Projects Agency’s (DARPA) Air Combat Evolution (ACE). AACO’s AI agents focused on combat with a single adversary beyond visual range (BVR), while ACE focused on dogfight-style maneuvers with a closer, “visible” simulated enemy.
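
To make the division of labor concrete, the sketch below shows, in hypothetical and highly simplified form, how an engagement might be routed to a long-range agent or a close-in dogfighting agent depending on the distance to the adversary. The threshold, inputs, and maneuver outputs are invented for illustration and do not reflect the real AACO or ACE logic.

```python
# Illustrative division of labor between a beyond-visual-range (BVR) agent and
# a within-visual-range dogfighting agent. Threshold, inputs, and maneuver
# strings are invented for this sketch, not details of AACO or ACE.

BVR_THRESHOLD_NM = 20.0  # assumed hand-off range between the two agents

def bvr_agent(closing_speed_kt: float) -> str:
    # Long-range logic: manage intercept geometry, avoid unnecessary hard turns.
    return "crank away to preserve energy" if closing_speed_kt > 600 else "maintain intercept heading"

def wvr_agent(angle_off_deg: float) -> str:
    # Dogfight logic: react to where the adversary sits relative to the nose.
    return "break turn into adversary" if angle_off_deg < 60 else "extend and reposition"

def select_maneuver(range_nm: float, closing_speed_kt: float, angle_off_deg: float) -> str:
    """Route the engagement to the appropriate agent based on range."""
    if range_nm > BVR_THRESHOLD_NM:
        return bvr_agent(closing_speed_kt)
    return wvr_agent(angle_off_deg)

print(select_maneuver(range_nm=35.0, closing_speed_kt=750, angle_off_deg=10))  # BVR engagement
print(select_maneuver(range_nm=2.0, closing_speed_kt=400, angle_off_deg=30))   # close-in dogfight
```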

While VISTA requires a certified pilot in the rear cockpit as backup, during the test flights the front cockpit was manned by an engineer trained in the AI systems, on hand to deal with any technical issues that arose. In the end, these issues were minor. Though unable to elaborate on the intricacies, DARPA program manager Lt. Col. Ryan Hefron explains that any hiccups were “to be expected when transitioning from virtual to live.” All in all, it was a significant step toward realizing Skyborg’s aim of getting autonomous aircraft off the ground as soon as possible.

The Department of Defense stresses that AACO and ACE are designed to supplement human pilots, not replace them. In some instances, AI copilot systems could act as a support mechanism for pilots in active combat. With AACO and ACE capable of parsing millions of data inputs per second and able to take control of the plane at critical junctures, this could be vital in life-or-death situations. For more routine missions that do not require human input, flights could be entirely autonomous, with the aircraft’s nose section swapped out when a cockpit is not needed for a human pilot.
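
A rough, hypothetical sketch of what “taking control at a critical juncture” could look like in code is below: a monitor watches for a situation the pilot is no longer handling, here an imminent descent into the ground with no recent stick input, and hands authority to the AI just long enough to recover. The thresholds and signals are invented for illustration, not the actual AACO or ACE behavior.

```python
# Hypothetical "critical juncture" monitor: intervene only when impact is
# imminent and the pilot appears unresponsive. Thresholds and signals are
# assumptions made up for this sketch.

def ai_should_intervene(altitude_ft: float, descent_rate_fps: float,
                        seconds_since_pilot_input: float) -> bool:
    """Return True when impact is imminent and the pilot appears unresponsive."""
    if descent_rate_fps <= 0:
        return False                                    # not descending
    time_to_impact_s = altitude_ft / descent_rate_fps   # crude straight-line estimate
    return time_to_impact_s < 10.0 and seconds_since_pilot_input > 3.0

def control_authority(altitude_ft: float, descent_rate_fps: float,
                      seconds_since_pilot_input: float) -> str:
    if ai_should_intervene(altitude_ft, descent_rate_fps, seconds_since_pilot_input):
        return "AI: recover to wings level, pull to safe climb"
    return "pilot retains control"

print(control_authority(1500, 300, 5.0))   # imminent impact, no input -> AI recovers
print(control_authority(15000, 50, 1.0))   # normal flight -> pilot keeps control
```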

“We’re not trying to replace pilots, we’re trying to augment them, give them an extra tool,” Cotting says. He draws the analogy of soldiers of bygone campaigns riding into battle on horses. “The horse and the human had to work together,” he says. “The horse can run the trail really well, so the rider doesn’t have to worry about going from point A to B. His brain can be freed up to think bigger thoughts.” For example, Cotting says, a first lieutenant with 100 hours of experience in the cockpit could artificially gain the same edge as a much higher-ranking officer with 1,000 hours of flight experience, thanks to AI augmentation.

For Bill Gray, chief test pilot at the USAF Test Pilot School, incorporating AI is a natural extension of the work he does with human students. “Whenever we [pilots] talk to engineers and scientists about the difficulties of training and qualifying AI agents, they typically treat this as a new problem,” he says. “This bothers me, because I have been training and qualifying highly non-linear and unpredictable natural intelligence agents—students—for decades. For me, the question isn’t, ‘Can we train and qualify AI agents?’ It’s, ‘Why can we train and qualify humans, and what can this teach us about doing the same for AI agents?’”

Gray believes AI is “not a wonder tool that can solve all of the problems,” but rather something that must be developed through a balanced approach, with built-in safety measures to prevent costly mishaps. An overreliance on AI, a misplaced “trust in autonomy,” can be dangerous, he warns, pointing to failures of Tesla’s Autopilot system despite the company’s insistence that a driver remain at the wheel as a backup. Cotting agrees, calling the ability to test AI programs in the VISTA a “risk-reduction plan.” By training AI on a conventional, proven platform such as the VISTA X-62 rather than on an entirely new aircraft, automatic limits and, if necessary, safety pilot intervention can keep the AI from endangering the aircraft as it learns.
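
In code, that layered safeguard might look something like the hypothetical sketch below: the learning agent’s commands pass through hard envelope limits, and the safety pilot can take the AI out of the loop entirely at any time. The limit values and structure are assumptions for illustration, not the X-62A’s actual safety system.

```python
# Hypothetical layered safeguard: clamp the AI's commands to a safe envelope,
# and let the safety pilot remove the AI from the loop at any moment.
# Limits and names are assumptions for this sketch only.

G_LIMIT = 7.0            # assumed load-factor limit while the agent is learning
BANK_LIMIT_DEG = 80.0    # assumed bank-angle limit while the agent is learning

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def guarded_command(ai_cmd: dict, safety_pilot_override: bool) -> dict:
    """Apply automatic limits to AI output; defer entirely if the pilot takes over."""
    if safety_pilot_override:
        return {"source": "safety pilot", "g": 1.0, "bank_deg": 0.0}
    return {
        "source": "AI (limited)",
        "g": clamp(ai_cmd["g"], -2.0, G_LIMIT),
        "bank_deg": clamp(ai_cmd["bank_deg"], -BANK_LIMIT_DEG, BANK_LIMIT_DEG),
    }

# The agent asks for an over-aggressive 9g, 120-degree-bank maneuver:
print(guarded_command({"g": 9.0, "bank_deg": 120.0}, safety_pilot_override=False))
# The safety pilot intervenes and the AI is taken out of the loop entirely:
print(guarded_command({"g": 9.0, "bank_deg": 120.0}, safety_pilot_override=True))
```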

The USAF’s technology is advancing rapidly. This past December, trial flights for ACE and AACO were often completed within hours of each other, with engineers switching autonomy algorithms onboard the VISTA in minutes, without safety or performance issues, according to Cotting. In one instance, Cotting describes uploading new AI at 7:30 am and the plane being ready to test by 10 am.

“Once you get through the process of connecting an AI to a supersonic fighter, the resulting maneuvering is endlessly fascinating,” says Gray. “We have seen things that make sense, and completely surprising things that make no sense at all. Thanks to our safety systems, programmers are changing their models overnight, and we’re engaging them the next morning. This is unheard of in flight control system development, much less experimentation with unpredictable AI agents.”

Despite these successes, it will take some time before the curriculum at the USAF Test Pilot School undergoes an AI overhaul. Cotting explains that the newness of the AACO and ACE platforms means students will require a greater level of understanding before trying them out in the cockpit of the VISTA. “We’re basically building the bridge as we’re driving over,” Cotting says.

In the meantime, students will undergo a broader test this fall in which they’re exposed to a set of AI agents and have to figure out how to test them, then execute that test.

As for wider military applications, Cotting says that while he has no visibility into these areas, AI is already ubiquitous in image recognition technology used across the military. While AI-driven tanks may not be on the horizon just yet, the skies, it seems, are set to be home to a new kind of intelligence.
