
22 September 2023

ChatGPT Isn't Coming for Your Coding Job

ZEB LARSON

SOFTWARE ENGINEERS HAVE joined the ranks of copy editors, translators, and others who fear that they’re about to be replaced by generative AI. But it might be surprising to learn that coders have been under threat before. New technologies have long promised to “disrupt” engineering, and these innovations have always failed to get rid of the need for human software developers. If anything, they often made these workers that much more indispensable.

To understand where handwringing about the end of programmers comes from—and why it’s overblown—we need to look back at the evolution of coding and computing. Software was an afterthought for many early computing pioneers, who considered hardware and systems architecture the true intellectual pursuits within the field. To the computer scientist John Backus, for instance, calling coders “programmers” or “engineers” was akin to relabeling janitors “custodians,” an attempt at pretending that their menial work was more important than it was. What’s more, many early programmers were women, and sexist colleagues often saw their work as secretarial. But while programmers might have held a lowly position in the eyes of somebody like Backus, they were also indispensable—they saved people like him from having to bother with the routine business of programming, debugging, and testing.

Even though they performed a vital—if underappreciated—role, software engineers often fit poorly into company hierarchies. In the early days of computers, they were frequently self-taught and worked on programs that they alone had devised, which meant that they didn’t have a clear place within preexisting departments and that managing them could be complicated. As a result, many modern features of software development were developed to simplify, and even eliminate, interactions with coders. FORTRAN was supposed to allow scientists and others to write programs without any support from a programmer. COBOL’s English syntax was intended to be so simple that managers could bypass developers entirely. Waterfall-based development was invented to standardize and make routine the development of new software. Object-oriented programming was supposed to be so simple that eventually all computer users could do their own software engineering.


Zeb Larson is a writer and software engineer based in Columbus, Ohio. Prior to becoming a developer, he earned a PhD in history from Ohio State University. He’s currently working on a history of computers for The Experiment.

In some cases, programmers were resistant to these changes, fearing that programs like compilers might drive them out of work. Ultimately, though, their concerns were unfounded. FORTRAN and COBOL, for instance, both proved to be durable, long-lived languages, but they didn’t replace computer programmers. If anything, these innovations introduced new complexity into the world of computing that created even greater demand for coders. Other changes like Waterfall made things worse, creating more complicated bureaucratic processes that made it difficult to deliver large features. At a conference sponsored by NATO in 1968, organizers declared that there was a “crisis” in software engineering. There were too few people to do the work, and large projects kept grinding to a halt or experiencing delays.

Bearing this history in mind, claims that ChatGPT will replace all software engineers seem almost assuredly misplaced. Firing engineers and throwing AI at blocked feature development would probably result in disaster, followed by the rehiring of those engineers in short order. A more reasonable suggestion is that large language models (LLMs) can take over some of the duller work of engineering: they can offer autocomplete suggestions or methods to sort data, if they’re prompted correctly. As an engineer, I can imagine using an LLM to “rubber duck” a problem, giving it prompts for potential solutions that I can review. It wouldn’t replace conferring with another engineer, because LLMs still don’t understand the actual requirements of a feature or the interconnections within a code base, but it would speed up those conversations by getting rid of the busy work.
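To make that concrete, here is a minimal sketch of what such a rubber-ducking session might look like in code, using the OpenAI Python client. Everything specific here is an illustrative assumption rather than anything prescribed by the article or the tools: the model name, the prompt wording, and the dedupe function under review are all placeholders.

# A minimal sketch of "rubber-ducking" a function with an LLM.
# Assumes the openai package (v1 or later) and an OPENAI_API_KEY set
# in the environment; the model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

snippet = """
def dedupe(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": "You are a code reviewer. Be brief."},
        {
            "role": "user",
            "content": f"Suggest edge cases and possible bugs in this function:\n{snippet}",
        },
    ],
)

# The reply is a starting point for review, not a verdict; the engineer
# still has to judge it against the feature's real requirements.
print(response.choices[0].message.content)

The point of the exchange is the same as in the paragraph above: the model surfaces candidate issues quickly, and the engineer, who actually knows the requirements and the code base, decides which of them matter.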

ChatGPT could still upend the tech labor market through expectations of greater productivity. If it eliminates some of the more routine tasks of development (and puts Stack Overflow out of business), managers may be able to make more demands of the engineers who work for them. But computing history has already demonstrated that attempts to reduce the presence of developers or streamline their role only end up adding complexity to the work and making those workers even more necessary. If anything, ChatGPT stands to eliminate the duller work of coding much the same way that compilers ended the drudgery of having to work in binary, which would make it easier for developers to focus more on building out the actual architecture of their creations.

The computer scientist Edsger Dijkstra once observed, “As long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem.” We’ve introduced more and more complexity to computers in the hopes of making them so simple that they don’t need to be programmed at all. Unsurprisingly, throwing complexity at complexity has only made it worse, and we’re no closer to letting managers cut out the software engineers. If LLMs can live up to their creators’ promises, we may very well cause that complexity to accelerate even further.

