28 June 2019

Control the Information Environment Narrative…or the Threat Will

Mario Hoffmann

“The advent of the internet, the expansion of information technology, the widespread availability of wireless communication, and the far-reaching impact of social media have dramatically impacted operations and changed the character of modern warfare.”1

--James Mattis, Former Secretary of Defense

Strategic competitors like Russia and China are using old technologies in new ways while also employing advanced new technology to fight their adversaries in all domains (space, cyber, air, sea, and land). This has driven the U.S. Army to evolve and adapt the way it intends to fight, publishing “Multi-Domain Operations (MDO) 2028” as the cornerstone for the Joint force to militarily compete, penetrate, dis-integrate, and exploit future adversaries.2 While the air, land, and sea domains have been prevalent since World War II, the relative newcomers, cyber and space, are still establishing their doctrinal foundations in modern warfare.

U.S. adversaries have demonstrated that they will use offensive cyber and electronic warfare (EW) capabilities within cyberspace and the electromagnetic spectrum (EMS) to complicate a commander’s decisions and degrade his or her ability to employ the full range of warfighting capabilities to gain an advantage. Our adversaries can, or will soon be able to:


Intercept and disrupt advanced voice and data communications;

Degrade air defense and target acquisition radars;

Deny or deceive Global Positioning System (GPS) signals to corrupt navigation and timing data;

Deny or degrade friendly intelligence, surveillance, and reconnaissance (ISR).

While all of these multi-domain challenges are real and relevant, an already contentious battle exists in the “information environment” (IE). The Chairman of the Joint Chiefs of Staff approved “Information” as the seventh Joint Function, consistent with the 2016 DoD Strategy for Operations in the IE (SOIE), underscoring its significance at the strategic, operational, and tactical levels of military operations.1 Adversaries operating within the IE will attempt to obscure from our strategic leaders and senior commanders the realities of their Operational Environment (OE). This misdirection is intended to sway decisions, forestall desired outcomes, and promote false public perceptions that undermine our goals and public support for our troops. It is a relentless engagement within the “competition” phase of MDO, meant to achieve adversary interests, or at least to gain a position of relative advantage in shaping potential future “armed conflicts.”

To describe the IE, the U.S. Army tends to favor terms like cyber, space, electronic warfare, and information operations, which imply categorized approaches that must be synchronized, whereas our opponents see these as mere ways and means to a desired end state under the single banner of “Information Warfare” (IW). Adversaries conduct IW to deny or manipulate, without users’ awareness, the information those users trust, leading them to make decisions that serve the adversary’s interests rather than their own.3

Opposing Force (OPFOR) doctrine emphasizes the importance of IW from tactical through strategic engagements. As a tactical combat multiplier, IW enhances leadership decisions and magnifies maneuver, firepower, and protection at decisive points. The U.S. Army’s OPFOR Training Circular 7-100 describes seven elements of IW (EW, Computer Warfare, Deception, Physical Destruction, Protection/Security, Perception Management, and Information Attack), which neither exist in isolation nor are mutually exclusive.4

Much of today’s IE encompasses social media applications on the internet, though social media by itself does not make up or define the IE. Internet-based social media tools provide a plethora of information often used by global state and non-state intelligence communities. Facebook alone adds approximately 250 million photos per day, Twitter adds some 200 million tweets, and YouTube reports 4 billion video views per day.5 These postings give intelligence communities, from tactical to strategic, near real-time situational awareness of indicators and events as they unfold, as demonstrated by ‘bursts’ of tweets that pre-empt conventional reporting. Sites like Facebook also provide insights into group behaviors and activities.5 Social media offers users an ideal platform for voicing non-attributional comments, but also the ability to publish false information that can shape global perceptions. According to an MIT study, false information spreads on average six times faster than real information.6
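
To make that dynamic concrete, consider the toy expected-value cascade model below, a sketch only: the branching factors are invented assumptions, not figures from the MIT study. Content that is reshared more readily reaches a large audience in far fewer propagation hops.

    # Toy cascade model in Python. Each sharer exposes `branching` new people
    # on average; false content is assumed to be reshared more readily.
    # All figures are illustrative assumptions, not data from the MIT study.
    def hops_to_reach(branching, audience_goal=1500):
        reached, sharers, hops = 1.0, 1.0, 0
        while reached < audience_goal:
            sharers *= branching      # each hop multiplies active sharers
            reached += sharers        # newly exposed audience accumulates
            hops += 1
        return hops

    print("true story hops: ", hops_to_reach(branching=1.5))  # 16 hops
    print("false story hops:", hops_to_reach(branching=3.0))  # 7 hops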

Exacerbating the spread of false information are social bots (computer-based impersonators), which in one example accounted for one-fifth of a conversation encompassing roughly 2.8 million tweets. That total included bot content retweeted by real humans, whose accounts represent friends, family, and co-workers and thus lend disinformation added credibility. Creating bot profiles for social forums and chat rooms, including Twitter and Facebook, is cheap, quick, and easy, and the bots become extremely effective when backed with advertising dollars, as demonstrated by the Kremlin-linked Internet Research Agency (IRA), which invested $100,000 to reach more than 126 million users during the 2016 presidential election.7
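
A back-of-the-envelope sketch shows how a comparatively small bot fleet posting at machine tempo can produce a one-fifth share like the one described above. All figures are hypothetical, chosen only to illustrate the arithmetic.

    # Hypothetical figures: a small, high-tempo bot fleet versus a much
    # larger pool of casual human participants in the same conversation.
    BOTS, BOT_POSTS = 400, 200          # 400 bots x 200 posts each
    HUMANS, HUMAN_POSTS = 80_000, 4     # 80,000 humans x 4 posts each

    bot_tweets = BOTS * BOT_POSTS            # 80,000 tweets
    human_tweets = HUMANS * HUMAN_POSTS      # 320,000 tweets
    share = bot_tweets / (bot_tweets + human_tweets)
    print(f"bot share of conversation: {share:.0%}")  # prints 20%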

To provoke emotional responses to disinformation, Russia is also known to sponsor internet trolls who are well versed in multiple languages and customary behaviors. Their intent is to start quarrels by posting inflammatory and aggressive comments that goad readers into emotional reactions. These trolls often spoof their locations to appear local and affiliated with local groups, though they were far more likely to be operating from Ukraine, Russia, or other Eastern European countries.8

Looking ahead, artificial intelligence (AI)-enabled technologies are becoming weaponized disinformation tools that include:9
Deepfake Productions – Videos constructed to make a person appear to say or do something that they never said or did. AI has improved this capability so greatly that it is extremely difficult for the naked eye and ear to discern deepfakes from real video or imagery (e.g., imagery altered to add or destroy bridges).

Generative Adversarial Networks (GANs) – AI-driven technologies that pit a generator network against a discriminator network to create entirely original but fake faces and bodies, often for commercial applications (e.g., video games), but which can also have a profound impact by providing visual reassurance for troll and bot armies (see the first sketch after this list).

Text Generation Tools – AI-assisted composition of original text in realistic prose, able to mass-produce convincing headlines, posts, articles, and comments entirely free from human input (see the second sketch after this list). These tools have already demonstrated the ability to create false pretenses for war (“roads to war”) and highlighted the dangers of creating ‘Black Mirror’ scenarios.10
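
For readers who want the mechanics behind GANs, the following is a minimal sketch. It assumes the open-source PyTorch library and uses a toy 1-D distribution as a stand-in for face imagery; it illustrates the adversarial principle, not a production fake-face system.

    # Minimal GAN sketch (assumes PyTorch). A generator learns to mimic a
    # simple 1-D "real" distribution while a discriminator learns to tell
    # real samples from generated ones. Toy illustration only.
    import torch
    import torch.nn as nn

    NOISE_DIM, BATCH = 8, 64
    gen = nn.Sequential(nn.Linear(NOISE_DIM, 16), nn.ReLU(), nn.Linear(16, 1))
    disc = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
    g_opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
    d_opt = torch.optim.Adam(disc.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    for step in range(2000):
        real = torch.normal(4.0, 1.25, size=(BATCH, 1))   # "real" data
        fake = gen(torch.randn(BATCH, NOISE_DIM))          # generated data

        # Discriminator: push real toward label 1, fake toward label 0.
        d_loss = (bce(disc(real), torch.ones(BATCH, 1))
                  + bce(disc(fake.detach()), torch.zeros(BATCH, 1)))
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()

        # Generator: try to make the discriminator call fakes real.
        g_loss = bce(disc(fake), torch.ones(BATCH, 1))
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()

    # After training, generated samples should cluster near the real mean (4.0).
    print(gen(torch.randn(1000, NOISE_DIM)).mean().item())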
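
Likewise, the toy text generator below shows the principle behind automated text generation: once transition statistics are learned from source text, new text can be produced with no human author in the loop. It uses only the Python standard library and an invented mini-corpus; real disinformation tools use far more capable neural language models.

    # Toy Markov-chain text generator (standard library only). Illustrative
    # of authorless text production, not of any actual adversary tool.
    import random
    from collections import defaultdict

    corpus = ("officials deny the reports while gangs are said to be "
              "terrorizing the capital and officials are said to be "
              "fighting to ban the language of the people").split()

    chain = defaultdict(list)                 # word -> possible next words
    for a, b in zip(corpus, corpus[1:]):
        chain[a].append(b)

    def generate(start, length=12, seed=None):
        rng = random.Random(seed)
        words = [start]
        while len(words) < length and chain[words[-1]]:
            words.append(rng.choice(chain[words[-1]]))
        return " ".join(words)

    print(generate("officials", seed=3))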

Russia uses these IE tools in every phase of its operations, including covert disinformation during peacetime, which U.S. military policy does not allow.11 For example, military officers in the Baltics assessed that Russia began setting operational conditions for its invasion of Ukraine 12 years before the annexation of Crimea, asserting that eastern Ukraine historically belonged to Russia.12 To influence tactical aspects of the IE, Russia broadcast false news of gangs and fascists terrorizing Kiev and fighting to ban the Russian language, convincing ethnic Russians to flee Ukraine for fear of persecution and creating localized discord.13

Leaders must be savvy in shaping proactive narratives within their IE to overcome the old cliché that “perception is nine-tenths of reality.” This is critical because Army units are the only element of the armed forces offering direct and continuous interaction with populations and opposition forces. Social media influences these interactions and shapes perceptions of the operational variables (PMESII-PT), which military planners must address during mission analysis to inform the mission variables (METT-TC).14 Integrating a competitive IE into brigade-and-above collective training exercises not only creates more realistic and relevant MDO training conditions but also gives commanders and staffs the situational awareness and experience of operating within this new warfighting function. To help our commanders, TRADOC provides two key resources:

Network Engagement Team (NET): Working with the Center for Strategic Leadership and the Army War College, the NET is developing courseware that will train senior leaders in cognitive maneuver, an effort directed at more effectively crafting and shaping narratives and delivering them within the IE. This understanding is essential for integrating information as a warfighting function and maximizing the effectiveness of operations in the IE. For more information or to request assistance, please send an email to usarmy.jble.tradoc.list.eustis-tboc-operations1@mail.mil

Information Operations Network (ION): Replicates the social media aspects of the internet; it is Decisive Action Training Environment (DATE) compliant and can be adjusted for specific exercise needs. This government-developed and government-operated tool is globally accessible via the NIPRNet but can also be hosted on local networks, such as a Multinational Partner Environment (MPE). It provides replicated feeds and information from government websites, international news agencies, Twitter, Facebook, YouTube, etc. (https://oedata.army.mil/ion-browser/). TRADOC G2 provides units exclusive access to dedicated ION partitions, within which the units themselves extend controlled access to users to update, modify, and post new information relevant to their scenario, including for the OPFOR.
