17 March 2024

Silicon Valley wants its cut of US military spending

NICK CLEVELAND-STOUT
It’s official — the Pentagon is becoming a bank. Well, sort of. At a March 8th event on dual-use technology at SXSW in Austin, Texas, Jason Rathje, director of the Office of Strategic Capital, announced that his team has officially received the internal authority to issue loans and loan guarantees, a first within the Pentagon.

The Office of Strategic Capital, or OSC, was created in response to growing concern over China’s investment in next-generation technology. According to its investment strategy, released Friday, March 8th, the OSC will invest in firms researching and developing 14 “critical technologies,” including hypersonics, quantum computing, microelectronics, autonomous systems, and artificial intelligence.

After surviving a rocky first year — punctuated by allegations of conflicts of interest from Sen. Elizabeth Warren (D-Mass.) and hard questions over its funding — the OSC is now close to licensing its first funds as part of a joint lending program with the Small Business Administration. OSC loans require matching private funding, giving smaller defense tech companies with aggressive investment strategies a pathway to enter the mix.

Venture capitalists have poured money into many of the items now on the “critical technologies” list, making them well positioned to benefit from OSC loans. By one New York Times estimate, venture capital firms went from spending around $6.7 billion on military tech in 2016 to $34 billion in 2022.

However, they have generated relatively few government contracts so far, leading some tech entrepreneurs to accuse the Pentagon of paying lip service to innovation without actually funding innovative ventures. According to Palantir, a “unicorn” of the defense tech world founded by Peter Thiel, the top 100 venture-funded military start-ups have only generated somewhere between $2 billion and $5 billion in government contracts. Part of this is because of Silicon Valley’s “move fast and break things” approach, which sees the Pentagon’s bureaucracy as little more than a straitjacket.

Marc Andreessen, the co-founder of Andreessen Horowitz and an investor in many defense tech firms through his American Dynamism initiative, embodies this psyche, defined by an infatuation with new technology and a repudiation of the precautionary principle, which urges prudence in the face of uncertainty. In an essay Andreessen authored entitled “The Techno-Optimist Manifesto,” he writes, “We believe the techno-capital machine is not anti-human – in fact, it may be the most pro-human thing there is. It serves us. The techno-capital machine works for us. All the machines work for us.”

This is where the message of defense tech venture capitalists differs from that of the prime contractors like RTX (previously known as Raytheon) and Lockheed Martin; instead of waxing lyrical about security, tech stalwarts evangelize about wielding artificial intelligence to overcome the frailties of human nature itself. Buoyed by their “yes, and…” theater-kid ethos, their beguiling promise is to usher in a near-utopia at the hands of the “Techno-Capital Machine.”

That is, if the government steps aside. “Silicon Valley is a builder culture, and Washington is never going to be a builder culture,” argued Katherine Boyle, the co-founder of Andreessen’s American Dynamism initiative. “I think people just have to come to terms with that.”

So what does this “material philosophy” look like in practice? Shield AI, a company Andreessen has invested in through American Dynamism, offers AI-powered autonomous swarms that claim to own “the kill chain from start to end” like a “scene from Top Gun 2.” Palantir has demonstrated a language model that analyzes battlefields and generates courses of action for a human operator. As defense analyst Van Jackson puts it, the OSC has “created various regulatory exemptions and federally guaranteed loans to incentivize VCs to go big on death-tech.”

Even if their promises are more grandiose, the business model of capitalizing on instability remains familiar. On a panel about public/private partnerships at SXSW, former Olympian turned venture capitalist Larsen Jensen said that Russia’s invasion of Ukraine is a “tremendous catalyst” for changing the national security investing environment.

“There have been many other catalysts that have occurred, if you think back prior to that, 9/11 was a catalyst,” Jensen said. “Many companies that otherwise would not exist in the defense industry, such as General Atomics, probably owe a big portion of their success due to a geopolitical catalyst that was, you know, unfortunate for the United States obviously, but the Predator probably wouldn’t be as prolific as it is now, and the early innings of autonomy wouldn’t be as important as it is now, were it not for that tragedy.”

It doesn’t take a Luddite to realize that the Pentagon should exercise caution when partnering with VC firms on exploring technologies such as AI-powered language models and autonomous weapons. As Craig Martell, the head of the Chief Digital and Artificial Intelligence Office at the Pentagon, warns, AI chatbots “speak authoritatively, so we just believe them,” despite the fact that these devices often spit out misleading or outright false answers. In a new report from Public Citizen, Robert Weissman and Savannah Wooten argue that autonomous weapons can lead to dehumanization or even loss of human control. “AI-driven swarms involve autonomous agents that would interact with and coordinate with each other, likely in ways not foreseen by humans and also likely indecipherable to humans in real-time,” Weissman and Wooten write.

The Pentagon has some guardrails in place that urge caution with technology like artificial intelligence and autonomous weapons. A Pentagon directive, issued just a month after the creation of the OSC in January 2023, requires autonomous weapons to be designed to allow human operators to exercise “appropriate levels of human judgment over the use of force,” establishes testing and evaluation standards for autonomous weapons, and mandates a chain of review for approval, among other requirements.

But a number of critics outside of the department question whether this approach goes far enough. A Human Rights Watch/Harvard Law School International Human Rights clinic review of the policy noted that the directive allows for significant loopholes, among them allowing the senior review of autonomous weapons to be waived “in cases of urgent military need.” Weissman and Wooten argue that the “biggest shortcoming of the directive, however, is that it permits the development and deployment of lethal autonomous weapons at all.”

Venture capital firms are looking for more buy-in on the back end, an issue the OSC can’t quite solve. As Steve Blank, an adjunct professor at Stanford University, explains, “There’s a demand problem, not a funding problem.” For the venture capitalists, this means convincing the U.S. government to sideline concerns it may have about emerging technologies and buy into the techno-utopian vision they are selling.

To bring the government more in line with the brash futurism of Silicon Valley, venture capital-backed defense tech firms are ramping up their lobbying operations. In 2023, Palantir spent over $5 million on its formal lobbying operations, lobbying Congress against “the regulation of AI.” Shield AI, which spent over $1 million on lobbying in 2023, lobbied the Department of Defense directly on “issues around autonomy and artificial intelligence.” Anduril, another defense technology company backed by Andreessen, spent over $1.5 million lobbying Congress on issues related to “unmanned and autonomous systems,” including autonomous sentry towers on the U.S.-Mexico border.

OpenAI, the creator of ChatGPT, has also signaled it may want in on Pentagon dollars. As the Intercept reported, earlier this year OpenAI quietly removed language that prohibited the military from using its technology. This week, former Sen. Norm Coleman registered as a lobbyist for OpenAI.
