2 February 2023

The Army’s Distributed Command Posts of the Future Will Need More than Videochats

LAUREN C. WILLIAMS

A recent Army exercise out of Joint Base Lewis-McChord sought largely to test ways to distribute command and control—to, say, replace big command posts with small cloud-connected teams scattered around the Pacific region. But what the I Corps’ IT team discovered was just how much of the service’s vision of future warfare will depend on turning a morass of data into well-structured bundles.

The experiment was set up to use unstructured data, the kind that accounts for much of the information the Army moves around: PDFs, PowerPoint slides, emails, calendar invites, etc. It takes a lot of human brainpower to assemble this information into forms that can help commanders make decisions.

That’s not good enough for the future battlefield, says Col. Elizabeth Casely, who runs I Corps’ communications, networks, and services.

“We're now beginning to understand how much we were using, I would say, human-in-the-loop cognitive processing to achieve a result that could be easily achievable if we had exposed data that was structured in some way, [if] we had access to a data environment, or a tool if you will, to put it in,” Casely told Defense One recently. “And then the big lift that has to occur inside the Corps is this data-engineering lift: this move from unstructured to structured. Because you can't begin to imagine what questions you might ask of the data until you begin to understand what sorts of things you have access to.”
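
To make that data-engineering lift concrete, here is a minimal sketch in Python of the kind of unstructured-to-structured conversion Casely describes. The report format, field names, and values are invented for illustration; real Army documents and schemas would differ.

```python
# Minimal sketch: turning an unstructured text report into a structured
# record that downstream tools can query. The format and field names
# here are hypothetical, not an actual Army report standard.
import re
from dataclasses import dataclass

@dataclass
class SitrepRecord:
    unit: str
    location: str
    report_time: str

RAW_REPORT = """
FROM: 2-3 Field Artillery
LOCATION: Yakima Training Center
TIME: 2023-02-02T14:30Z
Routine status update follows...
"""

def extract(field: str, text: str) -> str:
    """Pull a labeled value out of free text; empty string if absent."""
    match = re.search(rf"^{field}:\s*(.+)$", text, re.MULTILINE)
    return match.group(1).strip() if match else ""

record = SitrepRecord(
    unit=extract("FROM", RAW_REPORT),
    location=extract("LOCATION", RAW_REPORT),
    report_time=extract("TIME", RAW_REPORT),
)
print(record)  # SitrepRecord(unit='2-3 Field Artillery', ...)
```

Once reports are records rather than prose, they can be filtered, joined, and fed to analytic tools without a person reading each document, which is the point of the lift Casely describes.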

Toward “distributed mission command”

Headquartered at JBLM in Washington state, I Corps supports operations in the vast U.S. Indo-Pacific Command, whose area of responsibility stretches over more than half the Earth’s surface. Like much of the U.S. military, the Corps has been rethinking its methods as a potential fight with China looms larger. Key to these changes is a new concept called “distributed mission command,” which is intended to allow small teams in various locations to perform all the functions of today’s big command posts.

This requires better data networks, better cloud storage, and a lot more, said Casely, who is I Corps’ G6.

“We're responsible for making sure that we have the transport in place…make sure that transport is widely accessible, highly available, simple and intuitive to connect to and move data all over the place and in a way that the warfighter intends to use the network,” she said. “The idea is to be able to have a tactically-enabled cloud environment, connect, and then have a predetermined architecture in mind about where we would need to have on prem, or edge computing devices.”

Just moving the bits around is easier said than done in INDOPACOM’s area of responsibility, which encompasses some 100 million square miles, mostly water. The Army’s existing network gear was designed to send information more regionally, not over great distances.

“Bandwidth is a challenge. Latency is a challenge,” Casely said.
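
A rough back-of-the-envelope calculation illustrates why. The distances and link parameters below are illustrative assumptions, not Army measurements, but they show how physics alone imposes delays that a regional network never encounters.

```python
# Back-of-the-envelope propagation delay over Pacific-scale distances.
# All figures are illustrative assumptions, not measured Army values.
SPEED_OF_LIGHT_KM_S = 299_792    # in vacuum
FIBER_FACTOR = 0.67              # light in optical fiber is ~1/3 slower
GEO_ALTITUDE_KM = 35_786         # geostationary satellite altitude

def fiber_rtt_ms(distance_km: float) -> float:
    """Round-trip time over fiber, ignoring routing and queuing delays."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

def geo_satellite_rtt_ms() -> float:
    """Round trip via one geostationary satellite hop each way."""
    one_hop_s = (2 * GEO_ALTITUDE_KM) / SPEED_OF_LIGHT_KM_S
    return 2 * one_hop_s * 1000

# ~10,000 km is roughly the scale of a Washington-state-to-Guam link.
print(f"~10,000 km over fiber: {fiber_rtt_ms(10_000):.0f} ms round trip")
print(f"One GEO satellite hop: {geo_satellite_rtt_ms():.0f} ms round trip")
```

That works out to roughly 100 milliseconds over fiber and nearly half a second by geostationary satellite, before any routing, queuing, or encryption overhead is added.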

Then there’s the need to make sure the data can be understood as it passes between systems and organizations. That means developing standards for data, first within a given function, like intelligence or fires, and then across them. Not only does this help tie the systems together, it also turns the data into useful input for machine-learning or artificially intelligent tools.
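
One way to picture such a standard is a common message envelope that every warfighting function fills in the same way, with only the payload differing. The sketch below is hypothetical; the schema and field names are invented and are not an Army or DOD specification.

```python
# Sketch of a shared data standard: a common envelope that any
# warfighting function (intel, fires, logistics) can emit and consume.
# The envelope fields are invented for illustration.
import json
from datetime import datetime, timezone

COMMON_ENVELOPE_FIELDS = {"producer", "function", "timestamp", "payload"}

def make_message(producer: str, function: str, payload: dict) -> str:
    """Wrap a function-specific payload in the cross-function envelope."""
    return json.dumps({
        "producer": producer,
        "function": function,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload": payload,
    })

def validate(raw: str) -> dict:
    """Reject messages missing any agreed-upon envelope field."""
    message = json.loads(raw)
    missing = COMMON_ENVELOPE_FIELDS - message.keys()
    if missing:
        raise ValueError(f"non-conforming message, missing: {missing}")
    return message

# An intel system and a fires system can now exchange data without
# bespoke translators; only the payload schema differs per function.
intel_msg = make_message("intel-node-1", "intelligence",
                         {"observation": "vehicle column", "grid": "XX1234"})
print(validate(intel_msg)["function"])  # intelligence
```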

A data-centric journey

I Corps’ recent exercise, conducted mostly at JBLM, in Yakima, Wash., and in Oregon, aimed to replicate a distributed architecture.

“We tried to organize some of our services, understand where the warfighting functions would require or rely on network or server architecture, and then try to understand how we would move forward with that, as we worked on mission command information systems modernization,” Casely said.

One lesson was that the Corps needs a capability to convert unstructured data into structured information.

Now, the Corps is putting the pieces together to do that: pooling data from across the Corps so it’s accessible, pushing it into the right environment, and bringing in the right talent to make the most of it.

“Data exists in varying forms all over the Corps. The question is, how do we start to pool all of that together, get it into an environment and then apply the appropriate talent to it. Then, ultimately, do what we're all trying to do—answer a question.”

Casely said all three of those steps are linked: “You can't do one without the other.”
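
A minimal sketch of that pool-query workflow, using Python’s built-in SQLite as a stand-in for a shared data environment; the sources, records, and question are all invented for illustration.

```python
# Sketch of "pool the data, put it in an environment, ask a question."
# Data, sources, and schema are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a shared data environment
conn.execute("CREATE TABLE reports (source TEXT, location TEXT, ready INT)")

# Records pooled from different corners of the Corps (invented values).
pooled = [
    ("brigade_email", "Guam", 1),
    ("staff_slide", "Thailand", 0),
    ("calendar_feed", "Korea", 1),
]
conn.executemany("INSERT INTO reports VALUES (?, ?, ?)", pooled)

# Once the data is structured and co-located, "answering a question"
# becomes a query instead of a human reading PDFs and slide decks.
ready = conn.execute(
    "SELECT COUNT(*) FROM reports WHERE ready = 1").fetchone()[0]
print(f"{ready} of {len(pooled)} reporting locations ready")
```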

Other challenges observed during the exercise included problems with authentication, latency, and the resulting excess of network chatter.

As part of modernizing its mission command information systems, the Corps wants to employ infrastructure as code.

“We have applications that are very tightly coupled to its associated data and its physical hardware. That tight coupling forces us to operationalize or conduct operations in a certain way,” Casely said.

And that setup causes problems with authentication and latency when the Corps splits into multiple nodes.

That latency creates network chatter as communications aren’t confirmed as delivered: “Did you see me? Yes, I'm here. Did you see me? Yes, I'm here. Did you send the message? I didn't get it, send it again, send it again, send it again.”
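
That retransmission spiral is straightforward to model. The sketch below uses invented timeout and round-trip figures to show how a timeout tuned for a low-latency garrison network turns one message into several duplicates over a long-haul Pacific link.

```python
# How a timeout tuned for low-latency links generates chatter on a
# high-latency one. All timing figures are illustrative assumptions.
def sends_before_ack(rtt_ms: float, timeout_ms: float,
                     max_tries: int = 5) -> int:
    """Count how many copies of one message go out before its ack arrives."""
    tries, elapsed = 0, 0.0
    while elapsed < rtt_ms and tries < max_tries:
        tries += 1             # send (or resend) the message
        elapsed += timeout_ms  # give up waiting and try again
    return tries

# A 200 ms timeout is fine on a ~5 ms garrison LAN...
print(sends_before_ack(rtt_ms=5, timeout_ms=200))    # 1: no chatter
# ...but over a ~500 ms satellite hop, the sender resends twice before
# the first acknowledgment can possibly return.
print(sends_before_ack(rtt_ms=500, timeout_ms=200))  # 3: chatter
```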

Casely said the Corps is working to figure out how much of that chatter is related to the architecture of its mission command information systems, and whether a cloud of Pentagon and commercial services could help.

It’s a challenge that has come up in the various places where the Corps has recently set up shop: Guam, Thailand, and Korea. Next month, they’re headed to Japan to continue testing distributed command and control with personnel nodes there and at Joint Base Lewis-McChord.

Casely expects latency-related issues to “calm down” as the Corps deploys cloud-native mission command information systems. Modernizing those systems will require many changes, including new approaches to software development and the adoption of microservices.

To do this, the Corps will have to significantly increase its software-development investments. But as a first step, she said, “we would like to use cloud-native industry best practice to deploy and configure workloads as code,” also called infrastructure as code.

“This will allow us to rapidly, securely and consistently deploy mission command capabilities in these automated DevOps pipelines, which consists of a series of multiple stages and tasks. So you install it, you connect to a database, you provision the accounts, you conduct the security scanning,” Casely said. And if something breaks along the way, developers can go back and pinpoint the failure.
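
A minimal sketch of that pipeline-as-code idea: the stages Casely lists become named steps in a script, and any failure is reported with the stage that caused it. The stage bodies below are placeholders, not real deployment logic.

```python
# Sketch of a deployment pipeline expressed as code. Stage names follow
# Casely's description; the implementations are placeholder prints.
from typing import Callable

def install() -> None: print("installing application...")
def connect_database() -> None: print("connecting to database...")
def provision_accounts() -> None: print("provisioning accounts...")
def security_scan() -> None: print("running security scans...")

PIPELINE: list[tuple[str, Callable[[], None]]] = [
    ("install", install),
    ("connect_database", connect_database),
    ("provision_accounts", provision_accounts),
    ("security_scan", security_scan),
]

def run_pipeline() -> None:
    for name, stage in PIPELINE:
        try:
            stage()
        except Exception as exc:
            # The failed stage is named, so developers can go straight
            # to it instead of re-debugging the whole deployment.
            print(f"pipeline failed at stage '{name}': {exc}")
            raise
    print("deployment complete")

run_pipeline()
```

Because the pipeline is just code, it can be version-controlled and rerun identically at every node, which is what makes deployments rapid, secure, and consistent in Casely’s framing.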

The Corps also plans to create an unclassified information-sharing system for its mission partners.

“We have a lot of bilateral agreements in the Pacific. And so the challenge there is how do you create an environment where you can use common information collaboration services, amongst multiple mission partners to conduct planning for one mission,” Casely said.

Lt. Gen. Xavier Brunson, the commanding general of the Army’s I Corps, has pushed the distributed C2 concept to make the organization more flexible and survivable. That mission partner environment will be key for training as the Corps increases its presence in the Indo-Pacific region from eight months out of the year to year-round.

“Our partners demand that of us, but we've got to be able to communicate as we exercise,” he told reporters during the Army’s annual conference last month.

The Corps plans to test that mission partner information environment during the Cobra Gold exercise scheduled for February. The goal is to demonstrate an initial capability over the next year.
