31 March 2024

Twilight of the Wonks

WALTER RUSSELL MEAD

Impostor syndrome isn’t always a voice of unwarranted self-doubt that you should stifle. Sometimes, it is the voice of God telling you to stand down. If, for example, you are an academic with a track record of citation lapses, you might not be the right person to lead a famous university through a critical time. If you are a moral jellyfish whose life is founded on the “go along to get along” principle and who recognizes only the power of the almighty donor, you might not be the right person to serve on the board of an embattled college when the future of civilization is on the line. And if you are someone who believes that “misgendering” is a serious offense that demands heavy punishment while calls for the murder of Jews fall into a gray zone, you will likely lead a happier and more useful life if you avoid the public sphere.

The spectacle of the presidents of three important American universities reduced to helpless gibbering in a 2023 congressional hearing may have passed from the news cycle, but it will resonate in American politics and culture for a long time. Admittedly, examination by a grandstanding member of Congress seeking to score political points at your expense is not the most favorable forum for self-expression. Even so, discussing the core mission of their institutions before a national audience is an event that ought to have brought out whatever mental clarity, moral earnestness, and rhetorical skill the three leaders of major American institutions had. My fear is that it did exactly that.

The mix of ideas and perceptions swirling through the contemporary American academy is not, intellectually, an impressive product. A peculiar blend of optimistic enlightened positivism (History is with us!) and anti-capitalist, anti-rationalist rage (History is the story of racist, genocidal injustice!) has somehow brought “Death to the Gays” Islamism, “Death to the TERFS” radical identitarianism, and “Jews are Nazis” antisemitism into a partnership on the addled American campus. This set of perceptions—too incoherent to qualify as an ideology—cannot withstand rational scrutiny, provide the basis for serious intellectual endeavor, or prepare the next generation of American leaders for the tasks ahead. It has, however, produced a toxic stew in which we have chosen to marinate the minds of our nation’s future leaders during their formative years.

American universities remain places where magnificent things are happening. Medical breakthroughs, foundational scientific discoveries, and tech innovations that roar out of the laboratories to transform the world continue to pour from the groves of academe. Yet many campuses simultaneously seem overrun not only with the usual petty hatreds and dreary fads, but also, at least in some quarters, with a horrifying collapse in respect for the necessary foundations of American democracy and civic peace.

Sitting atop these troubled institutions, we have too many “leaders” of extraordinary mediocrity and conventional thinking, like the three hapless presidents blinking and stammering in the glare of the television lights. Assaulted by the angry, noisy proponents of an absurdist worldview, and under pressure from misguided diktats emanating from a woke, activist-staffed Washington bureaucracy, administrators and trustees have generally preferred the path of appeasement. Those who best flourish in administrations of this kind are careerist mediocrities who specialize in uttering the approved platitudes of the moment and checking the appropriate identity boxes on job questionnaires. Leaders recruited from these ranks will rarely shine when crisis strikes.

The aftermath of the hearings was exactly what we would expect. UPenn, which needs donors’ money, folded like a cheap suit in the face of a donor strike. Harvard, resting on its vast endowment, arrogantly dismissed its president’s critics until the board came to the horrifying realization that it was out of step with the emerging consensus of the social circles in which its members move. There was nothing thoughtful, brave, or principled about any of this, and the boards of these institutions are demonstrably no wiser or better than those they thoughtlessly place in positions of great responsibility and trust.

It would be easy to simply dismiss, or even take pleasure in, the public humiliation of some of America’s most elite institutions—but we can’t. Universities still matter, and as Americans struggle to reform our institutions in a turbulent era, getting universities right is a national priority. The question is not whether our higher educational system (and indeed our education system as a whole) needs reform. From the colonial era to the present, America’s system of higher ed has been in a constant state of change and reform, and the mix of opportunities and challenges presented by the Information Revolution can only be met by accelerating the pace and deepening the reach of that continuing historical process.

Universities are not and never have been castles of philosophic introspection floating high in the clouds. They are functional institutions serving clear and vital purposes in national life. That is why taxpayers and private donors pay for them to exist, students attend them, and society cares what happens on campus. As society changes, the roles that universities are called on to play change, and universities modify their purpose, structure, and culture to adapt to the new demands and opportunities around them. At a time of accelerating social change that centers on the revolutionary impact of the Information Revolution on the knowledge professions, it is not surprising that universities—the places where knowledge professionals like doctors, lawyers, business managers, civil servants, and teachers receive their formal education—face a set of challenges that are urgent and profound.

America’s education system developed in response to the transformation of human civilization over the last 150 years. The Industrial Revolution resulted in a society of unprecedented scale and complexity, and this society required a large class of highly trained specialists and administrators. Architects had to master the techniques to build hundred-story skyscrapers. Engineers had to build bridges and highways that could carry unprecedented masses of traffic. Doctors had to master immense amounts of knowledge as the field of medicine changed beyond recognition. Bureaucrats had to coordinate the efforts of government agencies taking on tasks and managing resources beyond anything past generations had considered possible. Corporate managers had to integrate the sourcing of raw materials, the design and upkeep of factories based on cutting-edge technology, the management of a labor force as large as some armies of past centuries, and financing and marketing operations on an ever-expanding scale. Financiers had to calculate risks and extend long-term credit for corporations and governments whose need for capital eclipsed anything ever previously seen. Military leaders had to coordinate the largest armed forces in the history of the world, operating at unprecedented levels of technology from the tundra to the jungle, and from outer space to deep under sea.

At the same time, scientific research—which for past generations had been something of a hobby for intelligent gentry amateurs—became a vital engine of economic growth and a key aspect of national security. From the gentleman amateurs of the Royal Society showing their intriguing results to Charles II to the scientists of the Manhattan Project building superweapons for Franklin Roosevelt, there is a long and laborious journey. As the natural sciences became more complex, as experiments required ever more expensive and technologically sophisticated equipment, and as their importance to industry and government grew, the sciences required an ever-growing cadre of trained and skilled researchers. And with the increased economic and military importance of science, business, civilian government, and the national security sector required more professionals who could follow the increasingly arcane yet massively consequential developments in the scientific world.

Before the advent of modern computing, much less AI, all this work had to be carried out by the unaided human mind. Modern industrial societies had to stuff vast quantities of highly specialized and intellectually complex knowledge into the heads of a large sector of their population. They had to develop methods of managing and governing this large, technical intelligentsia, the overwhelming majority of whom would be cogs in the machine rather than exercising leadership. And they had to develop and maintain the institutions that could carry out these unprecedented responsibilities.

The consequences for universities were revolutionary. Higher education had to serve a much higher number of students and prepare them with a much higher standard of technical knowledge—and provide technical and specialist education in a much larger number of subjects—than ever before.

From the post-Civil War generation through the present day, the immense task of shaping the disciplines and filling the ranks of the learned professions and civil service shaped the evolution of the modern educational system. What we call colleges and universities today have, functionally speaking, little in common with the institutions that bore those names in pre-modern times. The University of California resembles the medieval University of Paris less closely than my hometown of Florence, South Carolina, resembles the Florence of Tuscany.

The pre-modern university was a small, loosely managed association, and its officials needed to pay the bills, discipline the students, arbitrate the petty jealousies of the faculty, and keep the university as a whole on the right side of the political and ecclesiastical powers of the day. A modern university, even of the second or third tier, will often be large enough to play a significant role as a local or even regional engine of economic development. It may well be the largest employer in the city or town in which it is sited. It will often manage operations ranging from top-of-the-line hospitals to world-class athletic facilities to academic printing presses and day care centers. Larger universities operate dining halls that feed thousands or even tens of thousands of people every day and carry out projects as diverse as cattle breeding and subatomic research.

American universities succeeded in these tasks better than their peers anywhere in the world, and America’s success in the 20th century was not unrelated to the speed and efficiency with which its higher education system adapted to the new realities. But precisely because they succeeded so brilliantly in the past and adapted so effectively to the conditions of late-stage industrial democracy after World War II, American universities face severe difficulties as the Information Revolution upends many of the institutions, practices, and ways of life that characterized the earlier era.

As universities and their student bodies became larger, with the percentage of college graduates in the American population growing from an estimated 1% in 1900 to 6% in 1950 to roughly 25% in 2000, the role of the university-educated in American life also changed. Access to higher education was significantly widened, but the gulf between those with bachelor’s and post-baccalaureate degrees and the rest of the population also widened. There was a day when most American lawyers had never studied in law school, and when many, like Abraham Lincoln, lacked even a high school diploma. Today, entrance to the profession is much more tightly controlled, and those without the requisite degrees face nearly insuperable barriers.

At the same time, the relationship between higher education and social leadership has largely broken down. In pre-modern times, university graduates were almost entirely recruited from the upper classes, and their university study was consciously intended to equip them for the exercise of real power and leadership. The pre-modern university was dedicated to the artisanal production of new generations of elite leaders in a handful of roles closely related to the survival of the state. The modern university produces scientists, bureaucrats, managers, and assorted functionaries on an industrial scale to provide governments and the private sector with a range of skilled professionals and knowledge workers, most of whom will spend their lives following orders rather than giving them.

One measure of the change is to contrast the credentials of past generations with what is routinely expected of professionals today. Benjamin Franklin’s formal education ended when he was 10 years old. There were no economics departments or doctorates anywhere in the world when Alexander Hamilton, who was unable to complete his undergraduate studies at what was then King’s College in New York (now Columbia), designed the first central bank of the United States. None of the Founding Fathers were as well credentialed or thoroughly vetted as utterly mediocre, run-of-the-mill lawyers and political scientists are today. Armed only with his genius and his scanty formal educational credentials, a young John Marshall could not land an interview, much less a job, with a major American law firm today. Neither Ulysses S. Grant nor Robert E. Lee held a doctorate or had any formal professional training after graduating from West Point. Their lack of credentials would ensure that neither, today, would be considered for senior command in any branch of the armed forces. Through most of the 19th century, American colleges and even elite universities did not require doctoral degrees of their faculty. Today, however, a person with George Washington’s educational credentials could not get a job teaching the third grade in any public school in the United States.

The endlessly rising demand for more experts with higher degrees of expertise had profound social consequences, leading inexorably to a kind of society in which “merit” rather than ancestry or wealth was increasingly the key to advancement. If you are a fumble-fingered incompetent with a limited attention span, it doesn’t matter how rich or well-connected your parents are—you still can’t be a neurosurgeon or an air traffic controller.

Advocates of the rising professional meritocracy pointed to Thomas Jefferson’s idea of a “natural aristocracy” of talent to underline its grounding in both democratic theory and natural law. But merit in the emerging technocratic society of the Industrial Revolution was a highly specialized thing. This was not merit as traditionally conceived in Western civilization. It was not about achieving a holistic ideal of human merit in which wisdom, judgment, virtù, and intellectual excellence are all appropriately considered.

Merit, for the 20th century, was increasingly dissociated from the older ideals. It was more and more conflated with the kind of personality and talent set that defines what we call a “wonk.” Wonks do well on standardized tests. They pass bar examinations with relative ease, master the knowledge demanded of medical students, and ace tests like the Law School Admission Test (LSAT) and the Graduate Record Examination (GRE). Wonks are not rebels or original thinkers. Wonks follow rules. What makes someone a successful wonk is the possession of at least moderate intelligence plus copious quantities of what the Germans call Sitzfleisch (literally, sit-flesh: the ability to sit patiently at a desk and study for long periods of time).

Wonk privilege is a rarely examined form of social advantage, but over the last century we have witnessed a steady increase in the power, prestige, and wealth that flow to people endowed with a sufficiency of Sitzfleisch. The wonkiest among us are deemed the “best,” and the goal of meritocracy has been to streamline the promotion of wonks to places of power and prestige while sidelining the fakers: those who use good looks, family descent, wealth, or charisma to get ahead.

The 20th century was the golden age of the wonk, as one profession after another demanded people who had more and more of this ability. People who could do the work to get into and succeed in medical school, law school, accountancy school, engineering school, and other abstruse and difficult pre-professional programs earned high incomes and enjoyed great social prestige. But even as professionals developed greater degrees of specialized knowledge, the nature of their work changed. Early in the 20th century, most professionals operated essentially independent of outside supervision or control. Doctors, lawyers, and many others were often in practice for themselves, and their relationships with their clients were long term, confidential, and generally not subject to review by outside bodies.

In the 19th century, when my great-grandfather finished medical school, medical education was not particularly demanding. This made sense; while important 19th-century discoveries like the use of anesthesia and the germ theory of disease were making themselves felt, there was still not all that much specialized professional knowledge that doctors needed to acquire. On the other hand, medicine was very much a “people” business. What distinguished successful doctors from less successful colleagues was more bedside manner than medical knowledge. Successful doctors were people people. With scientific knowledge about health relatively small, what counted most was wisdom. A physician’s ability to see the whole person, think carefully and wisely about their condition, and then mobilize the extremely limited resources of the medical profession to ease their suffering made for the most successful outcomes.

In his son’s day, much of that was beginning to change. Cornell Medical School, which my grandfather Dr. Walter Mead attended in the early days of seriously scientific medical education, prided itself on academic rigor, and the second Dr. Mead was familiar with theories, techniques, and treatment methods that his father would not have known. It took more academic ability and personal discipline to train for the profession in the 1920s than it had a generation earlier. But people skills still mattered. My grandfather, though some of his patients paid him in hams and hickory nuts left on the back porch, was one of the most affluent and respected citizens of his town. He helped bring Alcoholics Anonymous to South Carolina, served as a warden in his church, was called in to examine Franklin Roosevelt as the president’s health deteriorated in the spring of 1944, and played a significant role in the quiet discussions that led to the peaceful desegregation of the town in the Civil Rights era. While he brought the best scientific knowledge of his time to the treatment of his patients, his practice remained people-based. He knew his patients’ life stories and families, and this knowledge played a not-inconsiderable role in his diagnoses and treatments.

Eventually, the explosion of medical knowledge in the 20th century turned medicine into a highly regulated and disciplined profession that demanded a lifelong commitment to study. It is not just that medical school requires a much more rigorous program of intensive study before a physician is ready to go into the field. As a recent National Bureau of Economic Research paper put it, “Medical knowledge is growing so rapidly that only 6 percent of what the average new physician is taught at medical school today will be relevant in ten years.” Not many people have the capacity to absorb and retain information at this pace through decades of active professional life.
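
Taken at face value, that statistic implies a striking pace of churn. On the back-of-envelope assumption (mine, not the paper’s) that obsolescence compounds at a steady annual rate, roughly a quarter of what a new physician learned in school goes stale every year:

```python
# Back-of-envelope reading of the statistic quoted above, assuming medical
# knowledge becomes obsolete at a steady, compounding annual rate
# (an assumption made here for illustration, not a claim of the paper).
fraction_relevant_after_decade = 0.06
years = 10

annual_retention = fraction_relevant_after_decade ** (1 / years)
print(f"implied annual retention:    {annual_retention:.1%}")      # ~75.5%
print(f"implied annual obsolescence: {1 - annual_retention:.1%}")  # ~24.5%
```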

As academic standards tightened, the social status and the relative income of doctors increased. This made sense. Scientifically trained doctors could do more for their patients, and as it became harder to qualify as a doctor, fewer people could master the academic work, and those who did could demand better pay. The prestige was also good. Some may remember the old joke about the mother of the first Jewish president. As her son took the oath of office, she turned to the person next to her and said, “My other son’s a doctor!”

But as the medical profession became more scientifically rigorous and therapeutically successful, it began to change in other ways. Medicine became more capital intensive, and my brother and nephews, like most of their peers who also became doctors, did not go into private practice. They work in hospitals, and much of what they do is governed by rigorous and detailed protocols. The contemporary doctors in the Mead clan are very sympathetic and intuitive people, but medical practice today is increasingly dominated by cost accounting, government and insurance payment protocols, and the need to process as many patients as possible in the shortest amount of time.

Today’s doctors are almost infinitely more scientifically educated than their 19th- and mid-20th-century predecessors, but they enjoy much less autonomy. My grandfather’s medical decisions were not subject to reviews by hospital administrators, state examiners, or tort lawyers. Decisions today that are subject to government or institutional oversight were taken privately and quietly 100 years ago. Doctors made decisions about matters like end-of-life care and abortion without necessarily doing everything “by the book.”

Doctors were among the most admired of midcentury American professionals, but members of all the so-called “learned professions” had a good run. Well-paid, well-respected, and in charge, lawyers, doctors, clergy, professors, architects, and business executives deployed the knowledge they had gained through intensive education and years of practice. And many of the professions further benefited because their professional associations—modeled on the pre-modern guilds that protected the interests of master craftsmen—restricted the number of new entrants and continuously raised the levels of training and education required to practice in each field.

But in profession after profession, we see a similar loss of autonomy even as the degree of required technical skill increases. Between government regulators, tort lawyers, and the economic pressures that lead a growing number of professionals to work as employees in large firms rather than managing private practices, outside forces continue to narrow the scope for private judgment and independent action in the American upper middle class. The wonkish qualities of Sitzfleisch and conformity grew in importance while the old gentry virtues associated with social leadership and independent action faded into the background.

Even as the traditional learned professions were vastly enlarged and profoundly changed by the demands of industrial society, both business and government saw the rapid development of wonk-favorable conditions. The rise of large, stable corporations was one of the most distinctive trends in 20th-century America as the era of Gilded Age entrepreneurs and robber barons gradually gave way to the oligopolies and monopolies of the emerging Blue Model economy. With this shift came the emergence of business management as both an academic discipline and a profession. Nineteenth-century factory managers often emerged from the ranks of blue-collar workers based on their intimate knowledge of the machinery and the workforce. In the 20th century, managers were increasingly recruited from graduates of business colleges and MBA programs.

It was government, however, that witnessed the greatest triumph of the wonk. Progressive reformers believed that the road to progress involved kicking out old-style political machines and substituting the judgment of dispassionate, scientifically trained administrators who would make policy in what progressives understood as “the public interest.” Getting the politics out of government became a widely supported political ideal, especially in the refined atmosphere of upper middle-class reform.

The well-trained and well-credentialed professional became a central figure in the political imagination of the 20th century. Woodrow Wilson was the most educated president in American history; his adviser Colonel House wrote a novel, Philip Dru: Administrator, about a West Point graduate who takes over the United States from crooked politicians and saves the country by applying scientific administrative principles. Robert Moses achieved almost as much in real life, shoving elected officials aside to remake the transportation systems in and around New York City. Bright-eyed, bushy-tailed reformers proposed measures like substituting professional “city managers” for backroom politicians. Schools of public policy rose up across the land to induct new generations of managers and administrators into the principles of management in the public interest.

The members of the new, scientifically trained professional upper middle class saw themselves as the true guardians and guides of the American Way. Above them were the plutocrats and entrepreneurs whose greed and shortsightedness, if left unchecked, would ultimately drive workers into socialism and revolution. Below them were the masses whose chaotic impulses and unformed minds needed guidance and control. Rational, public-spirited, upper middle-class reformers took it upon themselves to repress the excesses of the plutocrats through economic regulation and progressive taxation. They repressed the excesses of the lower orders through measures like Prohibition and the inculcation of appropriate values in the compulsory public schools of the day.

Civil service reform at the municipal, state, and federal levels led to professionalized government bureaucracies. Under the old spoils system, the friends and allies of victorious politicians got government jobs and contracts regardless of academic attainments. In the new system, jobs were increasingly offered to the “best” candidates as measured by scores on standardized tests and similarly “objective” criteria. Ward heelers to the back of the bus, wonks to the front.

In Blue Model America, the wonk was king. Unelected government officials who could not be fired by politicians produced and administered the dense regulations of the post-World War II economy. The large, stable bureaucracies of the monopolies and oligopolies that controlled the commanding heights of the American economy were staffed with well-trained, eminently qualified managers, secure in their jobs and their status. Certified teachers in K-12 and Ph.D. professors in higher education ensured that new generations were educated according to the correct principles. Government functions were carried out by professional civil servants, insulated by lifetime tenure from the whims of politicians.

We had passed, keen observers like the economist John Kenneth Galbraith argued, the age of ideology. Midcentury America had reached a new level of human development. The principles that made for prosperity and social harmony were known. Qualified professionals could develop and implement the good government policies that satisfied these principles. We were in an age of consensus now, with political debates limited to minor issues. We had taken politics out of government and replaced it with administration. Wonkocracy, the rule of qualified professionals selected and promoted by merit as measured by “objective” indicators like test scores and professional degrees from prestigious universities, had arrived.

But then came the 21st century, and everything turned upside down once again.

If the 20th century was the golden age of wonkocracy, the 21st is its decline and fall. For a sense of just how disruptive the Information Revolution has already been, look at what is happening to the cab drivers of London. London’s intricate street pattern, dating back in some cases to the Roman era, was shaped by Anglo-Saxon cow paths, medieval fairgrounds, abbeys, and palaces that no longer exist; virtually all of it was laid down centuries before the advent of the automobile. It is a dense, almost unnavigable thicket of one-way streets, dead ends, abrupt turns, and the occasional modern thoroughfare.

Until recently, to learn your way around one of the most tortuous mazes ever constructed by the hand of man required years of study. To get a coveted taxi license, one had to pass a complex and comprehensive examination on what was called “The Knowledge”: detailed information about the London labyrinth that enabled you to identify the fastest route from point A to point B anywhere in the metropolis. Then came GPS, and virtually overnight The Knowledge didn’t matter anymore. That was bad news for the licensed cabbies, who now faced competition from upstarts like Uber. It was even worse for the people who taught The Knowledge in a process that often took four years and cost something like $12,000 in tuition and fees.
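
The task The Knowledge certified, holding a routable map of London in one’s head, is, computationally, an ordinary shortest-path search. A minimal sketch along the lines below, with street names and travel times invented purely for illustration, is roughly what a GPS unit does in milliseconds over a real digital street map, which is why years of memorization stopped commanding a premium:

```python
import heapq

def shortest_route(streets, start, goal):
    """Dijkstra's shortest-path search over a weighted street graph."""
    # Each queue entry is (minutes_so_far, node, path_taken).
    queue = [(0, start, [start])]
    best = {start: 0}
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        for neighbor, cost in streets.get(node, []):
            new_minutes = minutes + cost
            if new_minutes < best.get(neighbor, float("inf")):
                best[neighbor] = new_minutes
                heapq.heappush(queue, (new_minutes, neighbor, path + [neighbor]))
    return None

# A toy fragment of a one-way street network; names and times are invented.
toy_london = {
    "Paddington": [("Marble Arch", 7), ("Baker Street", 5)],
    "Baker Street": [("Marble Arch", 4), ("Holborn", 9)],
    "Marble Arch": [("Holborn", 8)],
    "Holborn": [],
}

print(shortest_route(toy_london, "Paddington", "Holborn"))
# -> (14, ['Paddington', 'Baker Street', 'Holborn'])
```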

What the London cabbies experienced was a threat to their incomes, their place in society, and their identities as skilled workers. What the teachers of those cabbies experienced was something like an extinction event. Nobody is going to pay thousands of dollars and invest years of their lives in learning The Knowledge if a machine renders that knowledge irrelevant. Enormous effort and investment likewise went into the creation of educational systems that could produce the expanding number of increasingly well-educated and credentialed individuals modern society required. The central and privileged role that knowledge workers played in 20th-century economic life reshaped social hierarchies, political ideologies, cultural values, and the nature of power. The decline of the wonks will be as consequential as their rise.

Bureaucrats, to take just one example, have long been at the heart of a mature industrial society. As social and economic life became more complex during the Industrial Revolution, effective and predictable systems of management and regulation became increasingly necessary. If the residents of the burgeoning cities of the industrial era were to live in safe houses, drink potable water, and eat healthy food, governments would have to administer far-reaching and detailed regulations. To do so reliably, regulations had to be written and institutions developed through which those regulations could be uniformly imposed. A society in which millions of people drove millions of cars at high speeds across road networks built on a continental scale needed to develop and enforce safety standards for carmakers, engineering standards for road builders, and training requirements for drivers; to provide insurance to millions of drivers; and to resolve a multitude of disputes over liability resulting from the inevitable accidents.

In the pre-infotech era, all this could only be accomplished by establishing large, rule-based bureaucratic and legal structures. The millions of humans employed by these bureaucracies needed to be educated for and socialized into these roles. These employees were not educated to be leaders. The point of their training was not to create independent thinkers and nonconformists. Bureaucrats are trained to be functionaries—meme-processors who apply a set of rules to a set of facts. One of the principal functions of a modern university is to provide such bureaucracies with masses of human material capable of exercising the responsibilities while accepting the limitations of a bureaucratic career.

But a bureaucracy is, from an information point of view, a primitive, costly, and slow method of applying algorithms (rules and regulations) to large masses of data. It seems likely that drastic reductions in the size of both public and private sector bureaucracies are coming, along with major changes in the functions of the workers who survive the cuts. That must also lead to massive changes in the educational systems that prepare young people for careers.
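
To make the analogy concrete, here is a minimal, purely illustrative sketch of rule-following casework rendered as software: a handful of invented eligibility rules applied mechanically to a case record. Nothing below corresponds to any real regulation; the point is only that work of this shape, applying fixed rules to structured facts, is exactly the kind of work that information systems absorb most easily:

```python
from dataclasses import dataclass

@dataclass
class Case:
    """A stripped-down case record; the fields are invented for illustration."""
    applicant_age: int
    annual_income: int
    documents_complete: bool

# Each "regulation" is a name, a predicate over the record, and a determination.
RULES = [
    ("incomplete filing", lambda c: not c.documents_complete, "return to applicant"),
    ("over income limit", lambda c: c.annual_income > 40_000, "deny"),
    ("senior fast track", lambda c: c.applicant_age >= 65, "approve, expedited"),
]

def adjudicate(case: Case) -> str:
    """Apply the rules in order; the first rule that fires decides the case."""
    for name, applies, outcome in RULES:
        if applies(case):
            return f"{outcome} ({name})"
    return "approve (standard review)"

print(adjudicate(Case(applicant_age=70, annual_income=18_000, documents_complete=True)))
# -> approve, expedited (senior fast track)
```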

Similarly, among other learned professions, the relationship of the human practitioner to the knowledge necessary for the profession will change as radically as the relationship of London cab drivers to The Knowledge. Even before the arrival of AI, we were seeing computers that could read MRI scans as accurately as, or even more accurately than, human radiologists who’d spent the best years of their lives in medical school stuffing their heads with facts. More recently, researchers trained an AI tool to diagnose autism in young children from retinal scans with 100% accuracy. Increasingly, the bulk of the knowledge required by the learned professions will be held in computer systems. Physicians will not need (and will not be able) to “know” as much about human physiology and medical research as the AIs of the not-too-distant future. Ordinary people, with far less specialized knowledge in their heads but with full access to the wealth of machine knowledge, will be able to carry out activities that, in today’s world, only highly trained specialists can manage.

The personal discipline and powers of memory and intelligence that make for successful careers in the age of wonkocracy will not lose their usefulness in the emerging world, and there may well be fields or specialized functions in which wonks remain essential. Still, after 150 years in which technological advance was inexorably increasing the importance of Sitzfleisch to human societies, we are now in a period in which technological progress is liberating humanity from the necessity of stuffing the heads of young people with an ever-increasing mass of specialized, functional knowledge aimed at creating a race of highly skilled rule-followers.

There is another force that is already undermining the ability of the learned and the credentialed to defend their places and privilege. It is one of the driving forces in modern history as a whole: the desire of ordinary people to rule themselves in their own way without the interference, well-intentioned or otherwise, of aristocrats, bishops, bureaucrats—or wonks.

Tocqueville used the term democracy to describe this force, but democracy for him was not the well-behaved, values-driven liberal constitutional ideal that our contemporary defenders of democracy have in mind. Tocqueville was talking about a wild, at times dangerous, desire for autonomy that cared nothing for the moral and political restraints embedded in the vanishing aristocratic order of his times. This force drove the American and French revolutions. It ignited the nationalist revolts that splintered the great European and Middle Eastern empires of the 19th century into nation states. It drove the socialist and anarchist movements that overturned kings and set up revolutionary regimes across much of the world. It inspired the anti-colonial and anti-imperialist movements that drove the European colonizers out of Asia and Africa. It now appears in the left and right populist movements that fight established political parties and upper middle-class professional governance on both sides of the Atlantic.

“[T]he mass of mankind,” wrote Thomas Jefferson in 1826, “has not been born with saddles on their backs, nor a favored few booted and spurred, ready to ride them legitimately …” Not even high SAT scores and Ivy League degrees confer that legitimacy, large numbers of Americans believe, and those people see opportunities for liberation in the disruptive consequences of the Information Revolution. Social media breaks the monopoly of the well-credentialed on the flow of news. Search engines and AI chatbots allow patients to question the judgment of their doctors. Investors can bypass brokers. Crypto opens doors for those who reject, or want to reject, the world of central bank-administered fiat money. Advances in information technology, including the proliferation of AI, will offer many more opportunities for “the mass of mankind” to dispense with the services of the “booted and spurred” who wish to guide and ride them.

Dissatisfaction with wonkocratic leadership may be the most potent force in American politics today. Recent opinion polling shows faith in “meritocratic” institutions at all-time lows. But the assault on wonkocracy isn’t just a matter of economics. The deeply skeptical public reaction to the public health establishment’s vaccine and lockdown prescriptions during the COVID pandemic revealed just how potent a force populist resistance to technocratic governance had become. Similarly, the propensity to embrace sometimes outlandish conspiracy theories testifies to the strength and depth of public skepticism about the “true” motives of ostensibly objective and technocratic officials and civil servants. This kind of populism, found among both left- and right-wing movements, is becoming a stronger political force. Wonk badges of authority like elite university credentials and peer-reviewed publications in scholarly journals are decreasingly able to legitimate claims to authority in the public debate.

A hatred of wonk privilege pervades both the right and the left in American politics today. On the right, Trumpian populists seethe with skepticism about the consensus of educated upper middle-class opinion behind much contemporary American policy and mores. “Woke corporations,” “woke generals,” snooty academics and self-regarding journalist elites arouse tremendous and growing antagonism around the country.

But the identitarian left hates wonkocracy too. As we’ve seen, wonkery comes paired with a very naïve concept of meritocracy. It’s not much of a caricature to say that the chief characteristic of a wonk is the achievement of very high scores on standardized tests like the SAT and the LSAT, and the foundational ideological belief of the modern wonkocracy is the conviction that high test scores and high achievement in elite educational institutions constitute the “merit” that meritocracy is supposed to enshrine. Those who test well and flourish in elite educational settings, on this view, are the best and the brightest our society has, and it is in everyone’s interest that these people be placed in positions of authority and power.

This self-regarding definition of merit often makes wonks feel entitled. They regard their social power and privilege as legitimate because it is established by external criteria. I deserve to go to Harvard and then Yale Law and then McKinsey because I am, objectively, smart. Furthermore, I am disciplined. I do homework and hand in my assignments on time. I have merit and so it is just that I rule. I was born “booted and spurred” thanks to my ability to score well on standardized tests, and the bulk of mankind needs to shut up and let me ride.

The left-wing, identitarian revolt against conventional wonk meritocracy challenges both the objectivity of the tests that establish wonk merit and, more radically, the meritoriousness of wonkishness itself. As to the tests, critics argue, not always unpersuasively, that the privileged scion of a two-earner, upper middle-class couple in a suburb with good public schools has advantages in taking the SAT that poor inner-city kids do not. And encoded racism in our society makes things worse. For some sectors of the left, people from disadvantaged backgrounds or marginalized social groups are deemed more worthy than people with higher SAT scores (for example) who have had greater advantages in life. And the “merit” of representing a marginalized group can and should offset the “merit” of having high test scores.

The more radical assault identifies the values of wonkishness (Sitzfleisch plus bureaucratic values like punctuality, attention to accuracy in details, and the propensity to conform to the norms of large institutions) as the products of European culture whose privileged place in our society reflects the consequences of white supremacy and perpetuates white privilege. Liberation does not mean giving historically disadvantaged populations equal access to the credentialing factories (aka universities) that admit people into the ranks of the upper middle class. It does not even mean offering a designated percentage of upper middle-class slots to people from these historically disadvantaged groups. It means deconstructing the concept of wonk meritocracy.

This has all gotten more bitter in recent decades because positions in the wonkocracy have become, with the sad exceptions of journalism and the academy, much more lucrative. As our economy evolves and society becomes more complex, the services provided by the learned professions are in greater demand. The upper middle class is more affluent, larger, and more visible than it used to be—and naturally enough it is more widely and more bitterly resented.

In detail, both the populist diagnoses and prescriptions coming from the left and right are often mistaken. But the populist resentment of the sleek, self-interested reign of rule-following meme processors for whom blue chip academic credentials are the modern equivalent of patents of nobility, conferring a legitimate right to rule over the unwashed masses, is too deeply grounded in human nature to fade away. The peasants have added smartphones to their traditional weapons of pitchforks and torches, and they are in no mood to peacefully disperse to their hovels.

With the inexorable pace of technological advance, the gravitational pull of economic advantage, the stark requirements of national security, and the molten magma of populist rage all working against the wonkish status quo, the accelerating disintegration of that status quo seems inevitable. The American university, so deeply committed to wonkocratic ideals and so closely tied to the economic, intellectual, and political agenda of the wonkish upper middle class, cannot avoid the maelstrom ahead. It will take more clarity of vision and eloquence of expression than the three college presidents showed in their congressional hearing to steer the American educational system through the coming upheaval, and the nostrums of woke ideology won’t be much help to faculties and students caught up in the gathering storm.
