
In 1985, the Dow Jones average jumped 27.66 percent. Making money in stocks, as a journalist put it, “was easy.” With lower interest rates, low inflation, and “takeover fever,” investors could throw a dart at a list of stocks and profit.2 The next year was also very good. The average gain of a Big Board stock in 1986 was 14 percent, with equity market indexes closing at a record high.3
For the top performers, the amounts of money involved were staggering. In 1987, Michael Milken awarded himself $550 million in compensation. In New York City, spending by bankers—a million dollars for curtains for a Fifth Avenue apartment, a thousand dollars for a vase of precious roses for a party—was obscene. A major financier announced in the Hamptons one night that “if you have less than seven hundred fifty million, you have no hedge against inflation.” In Paris, a jeweler “dazzled his society guests when topless models displayed the merchandise between courses.” In west Los Angeles, the average price of a house in Bel Air rose to $4.6 million. There was so much money it was nicknamed “green smog.”4
Ambitious men now wanted to change the world through finance. Bruce Wasserstein had been a Nader’s Raider and had helped write the original FTC study in 1969. He now worked at First Boston as one of the most successful mergers and acquisitions bankers of the 1980s. Michael Lewis wrote his best-seller Liar’s Poker as a warning of what unfettered greed in finance meant, but instead of learning the lesson, students deluged him with letters asking if he “had any other secrets to share about Wall Street.” To them, the book was a “how-to manual.”5
Finance was the center, but its power reached outward everywhere. The stock market was minting millionaires in a collection of formerly sleepy towns in California. Sunnyvale, Mountain View, Los Altos, Cupertino, Santa Clara, and San Jose in the 1960s had been covered with “apricot, cherry and plum orchards,” and young people there often took summer jobs at local canneries.6 Immediately after Reagan’s election, in December of 1980, Apple Computer went public, instantly creating three hundred millionaires, and raising more money in the stock market than any company since Ford Motor had in its initial public offering of shares in 1956. A young Steve Jobs was instantly worth $217 million.7
In upper midwestern farming country, in the Corn Belt and the High Plains, the power of finance had very different impacts. The winter of 1985 had been bitter and harsh, with farmers often enduring days when the windchill hit 50 degrees below zero. Worse than the weather was the dreaded monthly payment to the bank. The mid-1970s had been good, with high commodity prices and a land boom, so farmers borrowed money to buy land, planted as much as they could, and watched the cash roll in. But farm debts had more than doubled, and as interest rates increased, the problems began to build up fast in the late 1970s.8
Like the old populists of the 1890s, farmers in the late 1970s began getting together and talking about the cost of equipment and how much of their crop revenue was going to the processors. A wave of farm strikes began, with protest signs known as “John Deere Letters,” reading “Crime Doesn’t Pay… Neither Does Farming.” Farmers wanted supply management, which would guarantee them payments based on the cost of production, and which for much of the twentieth century had been the policy of the U.S. government. In February 1979, thousands drove to Washington in a “tractorcade,” parking their tractors on the National Mall and tying it up. President Carter then made things much worse by imposing an embargo on grain shipments to the Soviet Union, in response to the Soviet invasion of Afghanistan. Many farmers responded by ditching the Democratic Party in 1980 and voting for Reagan, hoping for something, anything, to get better. That was a mistake.
Reagan paid the farmers back by breaking the back of farm country. In the early 1980s, wheat production boomed, but prices for wheat collapsed. At the same time, interest rates went even higher. Higher costs and less revenue meant many farmers couldn’t pay their mortgage. Rural high schools closed, churches lost membership, as families and young people left the farms. The news constantly covered sad stories about the end of the family farm. There were still faint memories of the Great Depression, with old-time farmers talking about “the thirties,” when the cattle starved and corn wasn’t worth selling, the price was so low. In Worthington, Minnesota, 250 farmers gathered to “hear an activist tell them that they ‘have no moral obligation to repay an unjust debt’ and that they would be right to use a gun to defend their farms from foreclosure.”
The Reagan administration responded by cutting payments to farmers. But who could you shoot? A young Farmers Home Administration supervisor from New York relocated to Union County, South Dakota, a county named for the cause of the Civil War. The government had moved him around the state in the hope that he would get tough with local farmers behind on their mortgages. His wife had been fired from two separate jobs, and his daughter wrote poems expressing sadness that she had to keep leaving new friends behind. Then one day he killed his wife, daughter, and dog, and went to the office to shoot himself. “The job has got pressure on my mind, pain on my left side,” he wrote in his suicide note.9 The news media focused on the plight of the farmers. The government sent in officials to make it worse.
The family farmer had lots of people who said they were friends at election time—even the glamorous music industry put on a giant “Farm Aid” concert in 1985 to raise money for bankrupt growers. But there was no Wright Patman in the Democratic Party anymore. On the contrary, “new” Democrats like Dale Bumpers and Bill Clinton of Arkansas worked to rid their state of the usury caps meant to protect the “plain people” from the banker and financier.10 And the main contender for the Democratic nomination in 1988, the handsome Gary Hart, with his flowing—and carefully blow-dried—chestnut brown hair, spoke a lot about “sunrise” industries like semiconductors and high-tech, but his vision had little room for the family farm.11
It wasn’t just the family farmer who suffered. On the South Side of Chicago, U.S. Steel, having started mass layoffs in 1979, continued into the next decade, laying off more than 6,000 workers in that community alone. Youngstown, Johnstown, Gary—all the old industrial cities were going, in the words of the writer Studs Terkel, from “Steel Town” to “Ghost Town.” And the headlines kept on coming. John Deere idled 1,500 workers, GE’s turbine division cut 1,500 jobs, AT&T laid off 2,900 at its Shreveport plant, and Eastern Air Lines fired 1,010 flight attendants and docked pay by 20 percent. “You keep saying it can’t get worse, but it does,” said a member of the United Auto Workers.12
And all the time, whether in farm country or steel country, the closed independent shop and the collapsed bank were as much monuments to the new political order as the sprouting Walmarts and the blizzard of credit card junk mail from Citibank. As Terkel put it, “In the thirties, an Administration recognized a need and lent a hand. Today, an Administration recognizes an image and lends a smile.”13
Americans were experiencing, once again, what it felt like to live under Mellonism. Regional inequality widened as airlines cut routes to rural, small, and even medium-sized cities. So did income inequality, as farm towns emptied and manufacturing hollowed out, executives searching for any way to be in any business but one that made things in America. It wasn’t just the smog and the poverty, the consumerism, the debt and the shop-till-you-drop ethos. It was the profound hopelessness.
Within academic and political institutions, Americans were taught to believe their longing for freedom was immoral. Power was recentralizing on Wall Street, in corporate monopolies, in shopping malls, in the way they paid for the new consumer goods made abroad, in where they worked and shopped. Yet policymakers, reading from the scripts prepared by Chicago School “experts,” spoke of these changes as natural, “scientific,” a result of consumer preferences, not the concentration of power.
And the law and economics world celebrated. In 1988, Reagan accepted an award from the American Enterprise Institute, the think tank that had financed Bork’s legislative history, built by the man who had elevated the Chicago School’s intellectuals during the Goldwater campaign. “We have come a long way together,” the president said. “From the intellectual wilderness of the 1960s, through the heated intellectual battles of the 1970s, to the intellectual fruition of the 1980s. The American Enterprise Institute stands at the center of a revolution in ideas of which I, too, have been a part.”14
THE CORRUPTION OF THE DEMOCRATS
And what of the party of the people, the Democrats? The scandals of the 1980s should have enabled the party to hit back against Reagan and the GOP. Throughout American history, the triumph of plutocrats in one decade had provoked a backlash, with the opposing party winning a series of elections and reorienting the political economy. But the Chicago School had dismantled this political fail-safe. By the 1980s, the Democrats as a party had lost the ability even to think about the problem of concentrated economic power, so they did not understand what was happening and could not oppose it even if they had wanted to.
The psychological shock of Reagan’s victory in 1980 caused soul-searching among party leaders; in some ways the defeat was worse than McGovern’s landslide loss to Nixon, since Republicans captured the Senate as well. Democrats in the 1970s had largely abandoned their New Deal alliances of small businesses, family farmers, and unionized workers, and had lost the ideological core of the party. Now they had to build something new.
Into this vacuum stepped a new generation of leaders. In 1982, Randall Rothenberg wrote an Esquire cover story titled “The Neoliberal Club: Bleeding Hearts Need Not Apply.” The article accurately identified the party’s major new leaders: Paul Tsongas, Bill Bradley, Gary Hart, and Tim Wirth. Rothenberg’s follow-up writing, which included a book titled The Neoliberals, discussed the politicians Bill Clinton, Bruce Babbitt, Al Gore, and Dick Gephardt, and the writers and intellectuals Charlie Peters, Robert Reich, Lester Thurow, and James Fallows. These were the men who would dominate the next thirty years of Democratic politics, grooming themselves for the post-Reagan presidency and calling themselves “New Democrats.” The common denominator of the group was that they were “pragmatic,” which meant they believed in the “end of the New Deal.”15
Paired with these political leaders were economists like Alice Rivlin, as well as consumer-oriented advocates like Ira Magaziner, who styled themselves as technocrats able to float above dirty old politics. These operators adhered to the Boston Consulting Group’s framework that older industries such as steel and automobiles were low-value “sunset” industries, and that it was smart to allow Wall Street to milk these older industries for cash to be invested in “sunrise” industries such as computer chips and video games.16
As Rothenberg noted, neoliberal thinking, though sounding fresh, was not actually new. It was what Teddy Roosevelt argued in 1912 when he ran as a Bull Moose progressive and sought to abolish antitrust laws and organize business and government into a cooperative whole under the slogan “concentration, cooperation, and control.” New Democrats saw cooperation between business and government as a compelling alternative to Reaganism, and as a means of addressing international financial problems. Thurow had drawn from Galbraith, who in turn had drawn directly from such Bull Moose thinkers as Walter Lippmann.
Like Teddy Roosevelt and Galbraith, Thurow preached the abolition of the antitrust laws. All used the same excuse for doing so. “In markets where international trade exists or could exist, national antitrust laws no longer make sense,” wrote Thurow. This was a direct echo of TR’s statement, in accepting the Bull Moose nomination in 1912, that if we “do not allow cooperation, we shall be defeated in the world’s markets.”17 As Rothenberg pointed out, Bill Bradley, Gary Hart, and Paul Tsongas made the same argument, all proposing to relax antitrust and banking laws.18
There was one big difference, however. Whereas Teddy Roosevelt believed big government should rule concentrated capital, in the era of Reagan, New Democrats preached the idea that government should serve concentrated business institutions, under the guise that the job of political leaders was to cooperate with big business and forge consensus. New Democrats thought of the government’s assertion of public power against big business as illegitimate, as picking “winners” and “losers,” as unfair and unproductive redistributionism, and as a problem of “entrenched bureaucracies and narrow interests in Washington.” They used concepts, many from the Chicago School and repackaged by Thurow, to shield corporate executives, bankers, and financiers from democratic oversight.19 As shopping malls and mergers spread, these New Democrats found it repugnant to consider bringing back the New Deal model of attacking corporate concentrations of power.
A young operative named Al From organized the political operation of the New Democrats. From had worked in the Carter White House. After the Carter debacle, an old Louisiana politician, Gillis Long, recruited From to run the House Democratic Caucus, and they put together something called the “Committee on Party Effectiveness” to bring fresh ideas into the party. This forum included many of the key future leaders of the Democratic Party: Tim Wirth, Dick Gephardt, Al Gore, Geraldine Ferraro, Martin Frost, Les Aspin, Tony Coelho, Barney Frank, and many others. Advised by Thurow, Rivlin, and Charles Schultze, the group produced reports designed to infuse the Democratic Party with this new vision of political economy.20
The Committee on Party Effectiveness adopted new language for the party, a language of flabby, difficult-to-follow technobabble. Smart leaders of tomorrow should speak of “infrastructure,” and “human capital,” and “public-private partnerships,” and “high-technology entrepreneurship.” The concepts of “competition” and “the market” were reconceived to mean financial speculation and the free flow of capital, not social structures designed to support the independence and well-being of ordinary Americans. And big business was now “good.” As Thurow put it, “ ‘Small is beautiful’ sounds beautiful, but it does not exist because it does not jibe with human nature. Man is an acquisitive animal whose wants cannot be satisfied.”21
The language of the New Democrats was like Jell-O, impossible to nail down, vague, though always opposed to anything that sounded like populism or New Dealism. But the intent to insiders was clear. “Make no mistake about it,” wrote From in a memo about his strategy, “what we hope to accomplish… is a bloodless revolution in our party. It is not unlike what the conservatives accomplished in the Republican Party during the 1960s and 1970s.”22
In 1984, Walter Mondale, the vice president under Jimmy Carter, ran a campaign incorporating some of these new themes. Mondale, like the neoliberals, argued strongly for reducing the budget deficit. He also focused on cooperation between industry and government, proposing a technocratic-sounding “Economic Cooperation Council.” The new council would of course not be “picking winners and losers,” but would among other things help, as the 1984 Democratic platform put it bloodlessly in the midst of ugly layoffs, “smooth the transition of workers and firms to new opportunities.” Voters didn’t like the new technobabble and gave Reagan forty-nine states, instead of the forty-four they had given him in 1980.23
In the wake of this loss, financiers like Michael Steinhardt and Robert Rubin recruited From to take his organizing work outside Congress and establish the Democratic Leadership Council (DLC), an independent group designed to put neoliberal philosophy at the core of Democratic policymaking. The DLC now became the center of anti-populist political thinking for rising Democratic stars, its technobabble wielded to compete with Republicans for finance-friendly yuppies. As the DLC’s chairman, Virginia politician Chuck Robb, said in 1986, “the New Deal consensus which dominated American politics for 50 years has run its course.”24
The DLC was an elite-driven organization, without a grassroots core. Populist senator Howard Metzenbaum opposed the DLC, and high-profile activist Jesse Jackson derided the DLC as the “Democrats for the Leisure Class.” But the Democratic betrayal of farmers, small business, and labor meant there was no longer institutional working-class support for the Democratic Party, except a fast-shrinking core group in labor. The result was that the DLC proved to be spectacularly successful. Groups of DLC politicians dubbed “the cavalry” traveled around the country to talk to reporters, activists, and operatives with a message of “change and hope.” Babbitt explained it by saying, “We’re revolutionaries. We believe the Democratic Party in the last several decades has been complacent.… We’re out to refresh, revitalize, regenerate, carry on the revolutionary tradition.” Media elites loved it; the Washington Post’s David Broder headlined his column: “A Welcome Attack of Sanity Has Hit Washington.”25
Along with From came a new architecture for political campaigns, the systematization of legal business donations to Democrats through the political action committee, or PAC. Democratic congressman Tony Coelho had begun coordinating business PACs in 1981, directing them to Democratic candidates who fit the New Democrat mold. A close ally of Michael Milken, Coelho was transactional, creating a patronage machine. With large annual donations to the “Speaker’s Club,” donors could become “trusted, informal advisors” to top Democrats and, though it was never said explicitly, gain influence over policy. Coelho trained future Virginia governor Terry McAuliffe (finance director of the Democratic Congressional Campaign Committee in the mid-1980s) and a host of young operatives in what increasingly was viewed as a pay-to-play system.26 For the next generation, the Democratic Party’s main strategy was to attempt to outbid the equally craven Republican Party for the smiles—and money—of the new plutocrats.
When the junk bond market crashed at the end of the 1980s, the Democrats could have turned the collapse of this economy-wide Ponzi scheme into a political cudgel to use against Republicans. But not only did they have no ideological framework for doing so, many top Democrats were now implicated. Shortly before Coelho’s own resignation in the face of investigations into financial improprieties, his close ally, Speaker of the House Jim Wright, also resigned in a cloud of scandal, having taken gifts from corrupt savings and loan bankers and bullied regulators on their behalf (in one instance accusing a regulator of being part of a corrupt “ring of homosexual lawyers in Texas”).27 Thomas “Lud” Ashley, the old nemesis of Patman, was by this time a bank lobbyist, and he worked to protect one of George H. W. Bush’s sons, Neil Bush, from fraud charges for his involvement in Silverado Banking, Savings and Loan Association. Wirth and Senators Bob Graham of Florida and John Kerry of Massachusetts were caught having flown on the corporate jet of Miami’s CenTrust bank, which stood at the center of a multibillion-dollar savings and loan disaster.28
In 1988, the Democratic presidential nominee, Michael Dukakis, was advised by Rubin, and few in the party had any interest in talking about the corruption. Jesse Jackson, the only thorn in the side of the DLC, did not organize his campaign around opposition to corporate power; his son took a summer internship with Drexel in 1989. “Scarcely a word about the smoldering S&L issue [was] said by either candidate during the 1988 election campaign,” according to The Washington Post. Even in the recession of the early 1990s, with a very slow recovery because of the overhang of junk debt and the S&L failures, Democrats had little to say.
It took a Republican, Representative Jim Leach of Iowa, to point out what the Democrats had missed. “The irony is that the biggest domestic public policy mistake of the century was effectively a non-issue in the 1988 election and appears likely to be a non-issue in the ’92 election.”29
THE REVOLUTION OF 1992
As the new decade began, Reagan’s successor, George H. W. Bush, presided over what seemed to be a remarkable series of foreign policy successes. The Berlin Wall fell in 1989, and communism collapsed across Eastern Europe. By the end of 1991, the Soviet Union itself had dissolved; the Cold War was over. At the same time, Bush exorcized the ghosts of Vietnam. During the First Gulf War, America put over half a million troops in the Middle East and pushed the Iraqis out of Kuwait using technological wonders like smart bombs and Patriot missiles. America lost fewer than 250 troops. Americans were euphoric, able to use military might at will, and no longer needing to worry about the great communist opponent sowing chaos in the rest of the world. By March of 1991, Bush had an approval rating of 89 percent, the highest Gallup had ever measured.
But the euphoria was short-lived. The end of the junk-bond-fueled real estate boom of the 1980s brought forth a new kind of recession. After the economy started growing again, jobs didn’t come back, in what became known as a “jobless recovery.” America had military might, but its economic power seemed to be ebbing. Bush had been a figure of potency on the world stage in 1991. Then, a week into the new year, with the New Hampshire primary approaching, he collapsed at a state dinner in Japan, vomiting into the lap of the Japanese prime minister. By July of 1992 his approval rating had dropped to 29 percent, a fall of sixty points in a little over a year.30
By the time of the 1992 election, there was a sullen mood among the voters, similar to that of 1974. “People are outraged at what is going on in Washington. Part of it had to do with pay raises, part of it has to do with banks and S&Ls and other things that are affecting my life as a voter,” said a pollster.31 That year, billionaire businessman Ross Perot ran the strongest third-party challenge since Theodore Roosevelt’s Bull Moose campaign in 1912, capitalizing on anger among white working-class voters, the Democrats who had switched over to Reagan in the 1980s. He did so by pledging straightforward protectionism for U.S. industry and attacking both the proposed North American Free Trade Agreement (NAFTA) and political corruption. Despite a bizarre campaign in which he withdrew and then reentered the race, Perot did so well he shattered the Republican coalition, throwing the election to the Democrats. There would be one last opportunity for the Democrats to rebuild their New Deal coalition of working-class voters.32
The winner of the election, Bill Clinton, looked like he might do so. He had run a populist campaign using the slogan “Putting People First.” He attacked the failed economic theory of Reagan, criticized tax cuts for the rich and factory closings, and pledged to protect Americans from foreign and domestic threats. “For too long, those who play by the rules and keep the faith have gotten the shaft,” Clinton said. “And those who cut corners and cut deals have been rewarded.” His campaign’s internal slogan was “It’s the economy, stupid,” and the 1992 Democratic platform used the word “revolution” fourteen times.33
Clinton’s 1992 platform called for a “Revolution of 1992,” capturing the anger of the moment. But the platform was written by Al From, and for the first time since 1880 it made no mention of antitrust or corporate power, despite a decade of the worst financial manipulation America had seen since the 1920s. This revolution would be against government, in government, around government.
When Clinton took office, the Democrats finally had a majority in the House, a majority in the Senate, and the presidency. Clinton not only entrenched Reagan’s antitrust principles at the DOJ by making them bipartisan, but expanded the Reagan revolution more broadly. With the end of the Cold War, Clinton took neoliberalism global. Through the North American Free Trade Agreement, the restructuring of relationships with China, and the creation of the World Trade Organization, the Clinton administration sought to do its part in building a New Economy, a borderless world in which capital would flow freely.
Bill Clinton’s politics were those of Al Smith, not FDR. As governor of New York, Roosevelt had fought financial interests. In Arkansas, Clinton coddled them, solicited them, lavished them with attention. His wife, Hillary, was on the board of Walmart, and he had even appointed Sam Walton an honorary brigadier general in the Arkansas National Guard.34 And now, as president, he led the Democrats in repudiating their traditional populist distrust of concentrated capital in politics. In 1993, a book on lobbying in Washington quoted Wayne Thevenot, a Clinton donor and former campaign manager for Gillis Long, laying out the new theme of the modern Democratic Party: “I gave up the idea of changing the world. I set out to get rich.”35
Like Reagan, Clinton went after restrictions on banking. Reagan had sought to loosen restrictions on finance by allowing banks and nonbanks to enter new lines of business. Clinton continued this policy, but over the course of his eight years attacked restrictions on banks themselves. In 1994, the Clinton administration and a Democratic Congress passed the Riegle-Neal Interstate Banking and Branching Efficiency Act, which allowed banks to open branches across state lines. Clinton appointed Robert Rubin as his treasury secretary and super-lawyer Eugene Ludwig to run the Office of the Comptroller of the Currency, and reappointed Alan Greenspan as chairman of the Federal Reserve.
All three men worked hard through regulatory rulemaking to allow unfettered trading in derivatives, to break down the New Deal restrictions prohibiting commercial banks from entering the trading business, and to let banks take more risks with less of a cushion.36 Citicorp, now led by Walter Wriston’s successor, John Reed, finally got an insurance arm, merging with the financial conglomerate Travelers Group to form Citigroup; Greenspan approved the combination, granting authority for the acquisition under the Bank Holding Company Act.37 In 1999, Clinton and a now-Republican Congress passed the Gramm-Leach-Bliley Act, which repealed the core of the Glass-Steagall Act that had shattered the House of Morgan and the House of Mellon. Among the last bills Clinton signed was the Commodity Futures Modernization Act of 2000, which removed public rules limiting the use of exotic gambling instruments known as derivatives by now-enormous banks.
Clinton signed the Telecommunications Act of 1996, which he touted as “truly revolutionary legislation,” and which began the process of reconsolidating the old AT&T network. At the signing ceremony, actress Lily Tomlin reprised her role as a Ma Bell operator. Huge pieces of the AT&T network came back together as the seven Baby Bells merged down to three, and Clear Channel grew from forty radio stations to 1,240. Part of the same law was the Communications Decency Act, whose Section 230 protected certain internet businesses from liability for wrongdoing that occurred on their platforms. While not well understood at the time, Section 230 was one policy lever that would enable a powerful set of internet monopolies to emerge in the next decade.
Clinton also sped up the corporate takeover of rural America by allowing a merger wave in farm country. Food companies had always had some power in America, but before the Reagan era, big agribusinesses were confined to one or two stages of the food system. In the 1990s, the agricultural sector consolidated under a small number of sprawling conglomerates that organized the entire supply chain. Cargill, an agricultural conglomerate that was the largest privately owned company in America, embarked on a series of mergers and joint ventures, buying the grain-trading operations of its rival Continental Grain Inc., as well as Akzo Salt, thus becoming one of the largest salt production and marketing operations in the world.
Monsanto consolidated the specialty chemicals and seed markets, buying up DeKalb Genetics and cotton-seed maker Delta & Pine Land. ConAgra, marketing itself as selling at every link of the supply chain from “farm gate to dinner plate,” bought International Home Foods (the producer of Chef Boyardee pasta and Gulden’s mustard), Knott’s Berry Farm Foods, Gilroy Foods, Hester Industries, and Signature Foods. As William Heffernan, a rural sociologist at the University of Missouri, put it in 1999, a host of formal and informal alliances such as joint ventures, partnerships, contracts, agreements, and side agreements ended up concentrating power even further into “clusters of firms.” He identified three such clusters—Cargill/Monsanto, ConAgra, and Novartis/ADM—as controlling the global food supply.38
The increase in power of these trading corporations meant that profit would increasingly flow to middlemen, not farmers themselves. Montana senator Conrad Burns complained that his state’s farmers were “getting less for our products on the farm now than we did during the Great Depression.” The Montana state legislature passed a resolution demanding vigorous antitrust investigations into the meatpacking, grain-handling, and food retail industries, and the state farmers union asked for a special unit at the Department of Justice to review proposed agricultural mergers. The Clinton antitrust division had so little interest that when Burns held a Senate Commerce Committee hearing on concentration in the agricultural sector, the assistant attorney general for antitrust, Joel Klein, didn’t bother to show up. “Their failure to be here to explain their policies to rural America,” said Burns, “speaks volumes about what their real agenda is.”39
In the Reagan era, Walmart had already become the most important chain store in America, surpassing even A&P at the height of its power. But it was during the Clinton administration that the company became a trading giant. First, the corporation jumped in size, replacing the auto giant GM as the top private employer in America, growing to 825,000 employees in 1998 while planting a store in every state. The end of antitrust enforcement in the retail space meant that Walmart could wield its buying power to restructure swaths of industries and companies, from pickle producers to Procter & Gamble.40 Clinton also allowed Walmart to reorder world trade itself. Even in the mid-1990s, only a small percentage of its products were made abroad. But the passage of NAFTA—which eliminated tariffs on Mexican imports—as well as Clinton’s embrace of Chinese imports allowed Walmart to force its suppliers to produce where labor and environmental costs were lowest. From 1992 to 2000, America’s trade deficit with China jumped from $18 billion to $84 billion, while a small trade surplus with Mexico became a $25 billion deficit. And Walmart led the way. By 2003, the consulting firm Retail Forward estimated, more than half of Walmart merchandise was made abroad.41
Clinton administration officials were proud of Walmart, and this new generation of American trading monopolies, dubbing them part of a wondrous “New Economy” underpinned by information technology. “And if you think about what this new economy means,” said Clinton deputy treasury secretary Larry Summers in 1998 at a conference for investment bankers focusing on high-tech, “whether it is AIG in insurance, McDonald’s in fast-food, Walmart in retailing, Microsoft in software, Harvard University in education, CNN in television news—the leading enterprises are American.”42
The Clinton administration also went deep into the heart of the American military establishment, undoing the work of Clifford Durr, the New Dealer who had fought to decentralize corporate power in the defense industry in order to ramp up against the Nazi threat in the late 1930s. With the end of tensions with the communist regimes and the perceived ascendance of worldwide liberal democracy, the U.S. industrial base would have to undergo a radical shift. The direction of that shift would be left to the policy choices of the new administration. In 1993, Clinton’s deputy secretary of defense, William Perry, gathered the CEOs of top defense contractors and told them that they would have to merge into larger entities because of reduced post-Cold War spending. “Consolidate or evaporate,” he said at what became known in military-industrial lore as “The Last Supper.” Former secretary of the navy John Lehman noted that “industry leaders took the warning to heart.” Defense contractors hollowed out and concentrated. The industry cut two million jobs, and the number of U.S.-based prime contractors went from sixteen to six. These prime contractors then demanded that their subcontractors merge—subcontractor mergers quadrupled from 1990 to 1998.43
The defense industrial base turned from engineering wonders like cruise missiles and B-2 stealth bombers to balance-sheet engineering. Private equity as a business model had been popularized by Milken, and in 1993 this financial force moved into concentrating the defense industrial base. Lehman, for instance, who had presided over Reagan’s massive defense buildup as secretary of the navy in the 1980s, now ran a private equity firm that raised money from investors to rearrange the corporate assets of defense contractors.
It was also under Clinton that the last bastion of the New Deal coalition—a congressional majority held by the Democrats since the late 1940s—fell apart as the last few holdout southern Democrats were finally driven from office or switched to the Republican Party. And it was under Clinton that the language of politics shifted from that of equity, justice, and potholes to the finance-speak of redistribution, growth and investment, and infrastructure decay.
The Democratic Party embraced not just the tactics, but the ideology of the Chicago School. As one memo from Clinton’s Council of Economic Advisors put it, “Large size is not the same as monopoly power. For example, an ice cream vendor at the beach on a hot day probably has more market power than many multibillion-dollar companies in competitive industries.”44
During the twelve years of the Reagan and Bush administrations, there were 85,064 mergers valued at $3.5 trillion. In just seven years under Clinton, there were 166,310 deals valued at $9.8 trillion.45 This merger wave was larger than that of the Reagan era, and larger even than any since the turn of the twentieth century, when the original trusts were created.46 Hotels, hospitals, banks, investment banks, defense contractors, technology, oil, everything was merging.
The Clinton administration organized this new concentrated American economy through regulatory appointments and through nonenforcement of antitrust laws. Sometimes it even seemed they had put antitrust enforcement itself up for sale. In 1996, Thomson Corporation bought West Publishing, creating a monopoly in digital access to court opinions and legal publishing; the owner of West had given half a million dollars to the Democratic Party and personally lobbied Clinton to allow the deal.47 Antitrust enforcers even approved the $81 billion merger of Exxon and Mobil, restoring a chunk of the Rockefeller empire.
Clinton also appointed pro-monopoly judges. For the Supreme Court, he picked Ruth Bader Ginsburg and Stephen Breyer. Both sailed through the Senate, not because of a tradition of bipartisanship, but because neither worried powerful business interests. Both were adherents of the same basic monopoly-friendly philosophy promoted by the Chicago School. And once on the bench, in 2004, both signed one of the most pro-monopoly opinions in the history of the court, one authored by Antonin Scalia. “The mere possession of monopoly power, and the concomitant charging of monopoly prices, is not only not unlawful,” said the court, “it is an important element of the free-market system. The opportunity to charge monopoly prices—at least for a short period—is what attracts ‘business acumen’ in the first place; it induces risk taking that produces innovation and economic growth.”48
Early in the first term, Clinton advisor James Carville noted what was happening. “I used to think if there was reincarnation, I wanted to come back as the president or the pope or a .400 baseball hitter,” he said. “But now I want to come back as the bond market. You can intimidate everybody.” Toward the end of Clinton’s second term, with a transcendent stock market, bars in the United States began switching their television sets from sports scores to CNBC, to watch the trading in real time. In the 1990s, it wouldn’t be Herbert Hoover overseeing a bubble; it would be a Democrat.
THE RISE OF THE TECH GIANTS
Like the rest of the Clinton administration, the DOJ Antitrust Division in the 1990s talked populist, but governed with a deference to monopoly. Clinton’s first appointment to run the division was a Washington lawyer named Anne Bingaman. Antitrust was not particularly important to the administration, and it seemed to some that Bingaman got the job as a political favor to her husband, New Mexico senator Jeff Bingaman. “Hmph,” Attorney General Janet Reno said to The Wall Street Journal, “there’s the White House trying to push a Senator’s wife on me.”49
Nevertheless, when she took office, Bingaman was ready to entirely remake the dormant division. She “fired up” the staff, and opened up new investigations. “Anne Bingaman has a blunt message for corporate America: The antitrust cops are back on the beat,” said the Journal.50 One of Bingaman’s first goals was to open up the most important new area of the economy, the one where Reagan had allowed nascent robber barons to not only seize power over industry but over the future of technology. She would take on the big bad monopolist of the computer industry, Microsoft, which was frightening Silicon Valley, and increasingly, much of corporate America.
In the 1960s, Silicon Valley was a middle-class area populated by farmers and engineers. Up until the early 1980s, the personal computing industry was largely a world of hobbyists, composed of tinkerers who played with what most businessmen thought were toys. Hobbyist culture was pervasive and utopian, a combination of the San Francisco counterculture scene and the Cold War–era New Deal high-tech can-do spirit. One of the early forums for the personal computer, for instance, the Homebrew Computer Club, inspired the design of the Apple I. Tinkerers passed software around to each other for free, updating and improving it collectively.
New Deal enforcers had enabled this freewheeling culture. AT&T and IBM were both under constant threat from antitrust authorities; IBM was sued on the last business day of the Lyndon Johnson administration, and both companies were in litigation throughout the 1970s. Both developed software standards and languages, like COBOL and UNIX, that were widely available at little or no cost.51 Xerox, like AT&T and IBM, had been the target of aggressive antitrust actions. The corporation, through its Xerox PARC lab, developed core aspects of personal computing like the mouse and the graphical user interface, and allowed its technologies to be commercialized by others (including some of its employees who left to start their own companies). In the early 1980s, IBM, meanwhile, stepped gingerly into personal computing, afraid of new antitrust actions. The company worked to transfer enormous programming, manufacturing, and technical skills to the nascent personal computer supply chain of independent companies, without its usual vicious disciplinary tactics. It even indirectly financed the production of PC “clones,” competitors to its own PCs.
Information technology, like the railroad, the telephone, or the telegraph, is based on networks. Operating systems, software, memory chips, disk drives, videotapes—these are not just products but systems organized around common standards. The value of a piece of software lies not just in what you can do with it, but in whether it is compatible with other software and with various hardware platforms. When the market for personal computers exploded in the 1980s, it opened the way for a host of new software and hardware products, everything from memory to microchips to spreadsheets and word processing. The technical dynamics were similar to those of the digital computer market of the 1950s and 1960s, but with a much bigger market and more possibilities for entrepreneurs.
The key political economy question was whether industry standard setting would be public and open, or proprietary and monopolistic. This was not a new problem; there was a reason John D. Rockefeller named his company Standard Oil. New Dealers had forced standards to be relatively open. The fax standard, for instance, and that of TV broadcasting, were not proprietary.
The legal context of the Reagan era, however, returned business to Rockefeller’s world. In 1980, Congress passed a law applying copyright restrictions to software, and in 1982, the Reagan administration dropped the IBM antitrust suit that had pressured the company to retain its open architecture model for computing. The administration also broke up AT&T, creating a burst of competition in communications, but it did so for the purpose of “deregulating” the telecommunications field.
The rest of the industry took notice. As with Rockefeller leveraging the network of railroads to monopolize the oil industry, entrepreneurs used the exploding personal computer market to seize monopoly power around key bottlenecks. Spreadsheets, word processors, and operating systems became costly software monopolies. There were rivals in these markets—Lotus 1-2-3 and Borland in spreadsheets, for instance—but competition took place through lawsuits as they battled over who would control standards, not over whose product was better.52
These nascent monopolies were lucrative and did extremely well under the finance-friendly Reagan political economy. Software and computer companies sold shares on the frothy stock market; by the mid-1980s there were so many millionaires in Silicon Valley that there were shortages of high-end housing.53
The industry consolidated quickly. The key alliance dominating the technology industry, like that between the Pennsylvania Railroad and John D. Rockefeller in the 1880s, was the one between Bill Gates’s software maker Microsoft and the chipmaker Intel. In 1980, IBM, wary of being accused of controlling the personal computer business, signed a deal with Microsoft to produce an operating system—known as DOS—for its personal computer, and standardized its PC chips on Intel. IBM then transferred enormous programming and technical know-how to both companies, and even protected Intel throughout the 1980s from Japanese competition.54 Gates was the more powerful of the two. He had gotten his start commercializing software in the late 1970s, fighting against the sharing culture of the early personal computer hobbyists.
The operating system (OS) is the basic controlling software for a computer, setting the specifications by which other software operates. IBM allowed Gates to sell his OS to other producers of personal computers. By 1983, Microsoft controlled the industry-standard on-ramp to the personal computer. Gates soon realized how powerful this intermediary position was, and he moved quickly to entrench his monopoly power by forcing computer makers into “per processor” license fees. Under this arrangement, computer makers paid Microsoft for every computer shipped, regardless of whether it had a Microsoft operating system. The per processor contract excluded competitors from the operating system market.
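To see why this licensing structure shut rivals out, here is a minimal sketch of the arithmetic in Python, using hypothetical per-unit fees (the actual contract terms are not given in the text): because the Microsoft fee was owed on every machine shipped, a rival operating system could only ever add to a computer maker’s costs, never substitute for them.

```python
# A minimal sketch (not from the text) of the per-processor license arithmetic.
# The dollar amounts below are hypothetical, chosen only to illustrate the
# mechanism described above, not Microsoft's actual contract terms.

def os_cost_per_machine(ms_fee: float, rival_fee: float, ships_rival_os: bool) -> float:
    """Effective OS licensing cost for one shipped PC under a per-processor deal."""
    cost = ms_fee  # owed to Microsoft on every unit, whatever OS it carries
    if ships_rival_os:
        cost += rival_fee  # a rival's license is paid on top, never instead
    return cost

if __name__ == "__main__":
    ms_fee, rival_fee = 30.0, 25.0  # hypothetical per-unit fees
    print(os_cost_per_machine(ms_fee, rival_fee, ships_rival_os=False))  # 30.0
    print(os_cost_per_machine(ms_fee, rival_fee, ships_rival_os=True))   # 55.0
    # Even a cheaper rival OS raises the maker's total cost, because the
    # Microsoft fee is unavoidable on every unit shipped. That is the exclusion.
```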
By 1987, Gates, not IBM, controlled what customers sought in a personal computer, which was not the computer itself or the IBM brand or even the operating system, but the ability to do different things with their machine, like write documents using a word processor or play games with video game software.55 All software producers would essentially have to write software applications for Microsoft’s DOS operating system.
Gates then began to leverage his monopoly position. Over the course of the 1980s, Microsoft launched software applications that competed with the most popular business applications, like Lotus 123 spreadsheets, or WordPerfect word processing. It gave its own internal teams secret information about upcoming changes to its operating system product, Windows, leveraging its monopoly in operating systems into another monopoly for business software. Programmers at Microsoft used to say, “DOS ain’t done till Lotus don’t run.”56 Gates was aiming for two key monopolies—operating systems and business applications—in the most important and fastest-growing product market in history, the personal computer.
In 1991, spurred by Microsoft’s rivals in Silicon Valley, the Federal Trade Commission started investigating the corporation’s practices. Gates was openly contemptuous of the FTC, reportedly calling one commissioner a “Communist” and telling BusinessWeek, “The worst that could come of this is that I could fall down on the steps of the FTC, hit my head, and kill myself.”57 The commission could not reach an agreement about whether to move forward with a case. In 1993, with support from Republican and Democratic senators, Bingaman took over the case from the FTC.
This was the moment when the Clinton administration could have turned back the Reagan-era monopolization free-for-all. Bingaman could have sent a powerful signal to corporate America. But despite her pledge to stiffen the division’s work on antitrust, there were signs that Bingaman, like Clinton, was no populist. For one thing, she had studied antitrust under William Baxter. More than his student, she was his admirer, praising his brilliance and his “monumental” legacy. She did not veer from Baxter’s merger guidelines, which had helped unleash the merger boom, asserting that they “appeared economically sound.” Her only real disagreement was that Reagan-era antitrust had been insufficiently supportive of chain stores and consumerism; she pledged to prosecute fair trade agreements in which merchants maintained a minimum price for their products.58
Bingaman also had little intellectual support for taking on the largest and most powerful company in the personal computing industry. While many of Microsoft’s Silicon Valley competitors backed the suit, intellectuals and commentators did not. Frank Fisher, an important economist at MIT, argued that “you don’t want to confuse Microsoft’s success with monopoly.” Rob Shapiro, an operative who worked in Al From’s orbit at a New Democrat think tank, warned DOJ that it would damage America’s software industry with an ill-advised suit, noting that Microsoft’s high market share in operating systems was a sign not of dysfunction, but that the “market is working well.” And liberal legend Alfred Kahn said the costs of inaction were worth it. Should Microsoft turn slothful, he said, “We’ll just have to deal with the problem if and when it comes.”59
In July of 1994, the DOJ settled with the company, allowing Microsoft to retain its market position in operating systems and its ability to leverage that position into new application markets. Bingaman earned a modest concession: Microsoft agreed to stop its per processor licensing fee structure. But by 1994, this contractual arrangement was irrelevant; Microsoft’s operating system had already become the industry standard. Bingaman claimed victory and lawyers at DOJ cracked open champagne to celebrate, but it was a hollow and embarrassing announcement, undercut a few days later when Bill Gates mocked her. No one would change anything they were doing at Microsoft, he said, though he would have one official deign to read the agreement. Microsoft soon dominated the market for key business tools, including databases, presentation software, and word processing, with massive monopoly profits to match. Its stock jumped from $48 to $62 in the four months after the settlement.60
In the months to come, Bingaman revealed herself as an ardent Chicago School adherent, seeing little wrong with Microsoft’s increasing market power. When the 1994 settlement came before a court for approval, she ran into Judge Stanley Sporkin, a former SEC enforcer and adamant opponent of white-collar misdeeds. Sporkin refused to rubber-stamp the deal, and even allowed corporate opponents of Microsoft, led by lawyer Gary Reback, to argue against the settlement in his court. Bingaman angrily demanded that the judge approve it, but he would not; only on appeal did the decree go through. Meanwhile, in 1995, Microsoft tried to buy Intuit, the leading personal finance software maker, as part of its drive to dominate the then-nascent internet. The company, as one executive said in 1997, was trying to get a “vig,” a mobster’s term for a share, of every transaction made on the internet. The DOJ filed suit to block the acquisition, and Gates abandoned the merger attempt. Bingaman soon left the DOJ, succeeded by a new chief, Joel Klein.
But Microsoft wasn’t chastened. Gates had decided that his company would dominate the internet. A small start-up called Netscape had created something called a browser, a piece of software letting a user look at websites easily. This would be the on-ramp to the internet, much as the operating system was the on-ramp to the personal computer. Gates decided to create a competing product, Internet Explorer, and leverage his power to destroy Netscape. The company updated its operating system, Windows 95, to bundle its browser, seeking, as one rival executive testified he had heard from a Microsoft executive, to “cut off Netscape’s air supply.” It used the same exclusionary tactics it had used to defeat other creators of applications, bullying a host of PC makers and internet service providers. Netscape soon hired Reback to see what he could do to get enforcers to pursue a case against Microsoft.
Microsoft then began its next strategic move to leverage its monopoly power, this time into swaths of the nondigital economy. Gates understood that Americans would one day do their shopping, banking, news consumption, and social interactions online, and he wanted to rule it all. Microsoft launched a travel company called Expedia, and began investing in media, including a company called Sidewalk, which was intended to dominate the lucrative classified advertising market. It brought together over a hundred venture capitalists and implied strongly they should refrain from investing in areas Microsoft intended to dominate.61 The extent of Gates’s ambition and power finally scared old-line corporate America as well as state-level officials.
The Texas attorney general became interested in Microsoft’s monopolization of the browser market; some key PC makers were located in his state. More state attorneys general followed his lead. In 1998, the Department of Justice, spurred by Reback, by interested state attorneys general, and by angry venture capitalists, launched an antitrust suit against the company. Klein presided over the largest antitrust trial since the case against AT&T. In 2000, the court ruled for the government and put forward a plan to split Microsoft into two companies, one holding the operating systems and the other holding the Microsoft software businesses that ran on top of the operating system, similar to how Congress had once split railroads from other businesses. It was a cautious decision; Microsoft would still retain its monopolies, even if they were now in separate companies. But Gates appealed the decision anyway, and a federal appeals court overturned the breakup order. In 2001, the George W. Bush administration essentially dropped the remainder of the case.
The Microsoft suit had two critical impacts on the development of the American political economy. Microsoft never dominated the internet the way it had the personal computer, because it was never able to leverage its hold over the browser market to control how users interacted with third-party websites. Like IBM, which under fear of antitrust had allowed an open computer industry, Microsoft allowed an open internet. It did not block a new company dedicated to selling books, called Amazon, or a new company with an innovative search engine, called Google, from reaching customers through Internet Explorer.
But the suit also signaled an end to antimonopoly prosecutions. The Clinton administration clearly had little interest in prosecuting monopoly, and had to be embarrassed and cajoled into doing something about an obvious monopoly in a key sector of the economy. The Bush administration was even less inclined to do anything about monopoly power. While Microsoft’s internal culture was reoriented away from predatory action, after this case the Department of Justice would cease bringing forward monopolization cases entirely.
In 2003, Larry Ellison, the CEO of large software maker Oracle, said he had no choice but to copy Microsoft’s tactics. “We have to roll up our industry,” he said. Like every industry, the business software market was going to have just one key company. “We will be that dominant player.”62
THE ROARING 2000S
To most Democrats, and to a majority of Americans, Bill Clinton’s years in power seemed the most successful of any presidency since the 1960s. During the eight years that Clinton was president, the country added twenty-three million jobs in the longest economic expansion in American history. Unemployment fell to a thirty-year low, with unemployment for blacks and Hispanics the lowest ever measured to that point. There was the largest expansion of college opportunity since the GI Bill, and crime rates dropped to a twenty-six-year low. Industrial productivity jumped.
The median family income rose by $6,338 over eight years, adjusted for inflation. All income brackets had double-digit growth. The poverty rate fell to its lowest in twenty years. The budget went into surplus, and the number of families owning stock jumped by 40 percent.63 Goldman Sachs called this the “best economy ever,” and BusinessWeek lauded a “New Age economy of technological innovation and rising productivity.” When George W. Bush came into office after Clinton, the satirical website The Onion’s headline read, “Bush: ‘Our Long National Nightmare of Peace and Prosperity Is Over.’ ”64
During his second term, Clinton took his “third way” politics global, and leaders all over the world copied his model of success. A popular television show, The West Wing, written with heavy influence from Clinton insiders, inspired a younger generation to embrace the ideals of the Watergate Baby generation: the closeness to business, the unwillingness to assert public power. In the show, corporate lawyers and lobbyists were cool, heroic, sexy.
But below the surface, something was off. Toward the end of the Clinton administration, a Harvard Law professor named Elizabeth Warren, specializing in bankruptcy law, noticed something odd about this wonderful new 1990s economy. She was conducting the first mass-scale research project asking the question, Why are Americans going bankrupt? In the midst of plenty, she saw that many Americans, despite a record high stock market and a record low unemployment rate, were falling behind. The “tricks and traps” of Wall Street were preying on their finances.
Meanwhile, in 1999, consumer groups began hearing complaints of predatory lending, specifically on mortgages, and particularly in a new segment called subprime lending. Protesters and dissidents were still on the margins, as they had been in the late 1920s. But there were signs of an awakening: demonstrators shut down a World Trade Organization meeting in Seattle. These hints, however, did not add up to much more than a quiet dissent from the prevailing order. Even the shock of Bush winning the election in 2000, in the midst of an economy of plenty, did not disturb the prevailing neoliberal ideology of the Democrats.
By the time the George W. Bush administration took power in 2001, little remained of populism, or even memories of populism. Lawyers occasionally looked back at New Deal policies, baffled by how strange they seemed. Democrat Willard Tom said an FTC case forcing Xerox to divest its patent portfolio was like finding a “previously undiscovered ancient culture.” It was unsettling, he argued, because apparently the FTC’s remedy “seems to have done a world of good.”65
The roll-up of power in the political economy accelerated with more high-level tax cuts, deregulation by Bush, and monopolization. In 2005, the Bush administration finally repealed the New Deal–era Public Utility Holding Company Act, clearing the way for the rise of massive new multistate electricity corporations with ever more ability to resist local regulators. The Bush White House also pushed through bankruptcy legislation that made it easier for banks to use derivatives but harder for normal people to get out of credit card debt. Financiers and regulators thoroughly corrupted the mortgage industry, inflating a bubble that masked the underlying deterioration of American industry.
And digital technology, once channeled into the public domain by the government, was now layered on top of the Reagan revolution of political economy that Clinton had completed. New giants, as powerful as, or perhaps even more powerful than, those originally built by John D. Rockefeller, J. P. Morgan, and Andrew Mellon, emerged in the post-Microsoft era. Though Gates didn’t get a “vig” on every piece of commerce online, the successor monopolies to Microsoft did. Amazon, Google, and Facebook followed the business model of Microsoft, leveraging their essential platforms to take power throughout swaths of the economy.
All of this seemed to represent a tremendous success, the equivalent of the endless prosperity of the Roaring Twenties. In 2004, Ben Bernanke, the conservative economist who in 2006 would be named chair of the Federal Reserve, announced that American policymakers had conquered the business cycle, and, it seemed, the world. Humans in 2004 were living in the “Great Moderation.”
On April 5, 2006, Barack Obama—then a young senator hoping to be president—gave a speech at the Brookings Institution’s new wing, the Hamilton Project. The Hamilton Project was financed and organized by Clinton’s economic policy team, with Robert Rubin as the key leader. Obama’s ideological framework was straight out of the 1970s, using rhetoric from Reich, Thurow, and From.
“The forces of globalization have changed the rules of the game,” said Obama, including “how we compete with the rest of the world.” The national competitiveness frame was there, as was the framing of globalization as inevitable. “For those on the left,” he continued, “and I include myself in that category, too many of us have been interested in defending programs the way they were written in 1938.” He called for straightforward Mellon-style policies—eliminating the budget deficit and keeping public debt low. He did not mention monopoly. The thoughtful bankers and policymakers in the audience were, he said, a “breath of fresh air.”66 It was those same people who, in the 1990s, “[took] on entrenched interests” and ushered in the prosperity of that decade. Hopefully, in two years, they could reenter the White House under a Democratic administration and do it again.
Then, in 2007, the loosening of financial rules and the merger of financial power induced a global financial crisis, centered in some of the world’s most powerful financial institutions. Concentration of power in the private sector, it turned out, had its downside.
TOO BIG TO FAIL
In retrospect, the fissures in the system had been obvious for years. Financial crises had been getting worse, from Mexico to East Asia, as “hot money” flowed across borders. The stock bubble collapsed almost as soon as Clinton turned over the White House to Bush. Starting with the recession in 2001 and continuing through the recovery, the total share of income in the entire economy going to workers—a measure that had been stable for fifty years—began declining.67 Part of the problem was that economists had lost the ability to measure economic activity accurately. A significant amount of the “computer-driven” productivity increases measured in the American economy during the 1990s came from just one corporation: Walmart. The store’s influence, as one journalist put it, had “reached levels not seen by a single company since the 19th century.”68
Then the real crack-up happened. On September 15, 2008, the old-line investment bank Lehman Brothers declared bankruptcy. Lehman’s fall was the largest collapse in American corporate and banking history, a super-sized Penn Central, setting off a bank run that threatened every single commercial institution in the world. The art of modern politics had become so disconnected from any understanding of commerce that the president, George W. Bush, simply didn’t understand what was happening. Speaking at the UN less than a week later, he downplayed financial turmoil as irrelevant.
Foreign leaders weren’t fooled. Gloria Macapagal Arroyo, the president of the Philippines, discussed the terrifying global implications of the crisis. She was followed by the presidents of Argentina and France, both of whom pointed the finger at the intellectual consensus of unregulated financial capitalism then dominant in America (though also, as European leaders were loath to admit, dominant in Europe as well).69
Wall Street was too panicked to notice foreign disapproval. American International Group (AIG), the largest insurance company in the world, hovered on the brink of collapse, and if AIG fell, then so would every major bank. The unregulated markets where corporations borrowed froze up. Activity in the nonfinancial “real” economy—factories, shipping yards, housing—took a sickening slide, as millions of Americans began liquidating their savings in an unsuccessful battle to ward off foreclosures. Under pleading from Bush, Congress passed a $700 billion bailout for the treasury secretary to do with largely as he pleased.
A little less than two months after Lehman collapsed, Barack Obama won the presidency. With the firepower of the bailout, and the hope of the world, he had remarkable latitude to restructure the global political economy. The parallels with the Hoover-to-FDR handover of power were eerie. The shadow banking system, first unleashed in the early 1960s, had frozen up, much as the banks had collapsed in the early 1930s. George W. Bush was asking for cooperation from Obama, just as Hoover had asked it of Roosevelt.
What path would the new president choose? For there were multiple possibilities. This crisis was centered in the most basic institution FDR had saved, the American home. Promoting homeownership had been a bedrock policy framework since at least the 1930s, a key link in the social contract between Americans and their political and financial elites, but also a mechanism to ensure social stability. Policymakers were now caught in a bind. There was roughly $5–7 trillion of mortgages Americans could not repay, and if these mortgages were not repaid, large banks would become insolvent.70
In 2008, political leaders discussed a new Reconstruction Finance Corporation and mortgage write-downs. Meanwhile, academics, policymakers, and business leaders were open to a new intellectual framework. Richard Posner, the inheritor of Robert Bork’s mantle as the main organizer of the law and economics movement, published A Failure of Capitalism, in which he asserted that capitalism was not a self-correcting system. Alan Greenspan told a congressional oversight committee that he was in a “state of shocked disbelief” that the “self-interest of lending institutions” had not protected the integrity of finance.71 Former General Electric CEO Jack Welch told the Financial Times that the shareholder value movement was the “dumbest idea in the world.”72
But by the time Barack Obama took the oath of office, the ideas that took hold in the 1970s had been political orthodoxy for two generations. Obama had started his career in the mid-1980s as a community organizer in Chicago, trying to work with poor families being ruined by Reagan-era corporate looting and deindustrialization. Obama had been unable to do much to ameliorate the suffering, and turned to law. By the time he ran for the Senate, he had discovered the New Democratic movement. In his book The Audacity of Hope, he praised Bill Clinton’s third-way framework as “pragmatic” and “non-ideological.” Obama mimicked the arguments of the Democratic Leadership Council, attacking Democrats as bereft of ideas, ridiculing defenders of the New Deal and Great Society—a group that had become all but extinct—as adherents of a hokey “old-time religion.”73
When Obama took office, most of his advisors barely knew an earlier economic tradition had existed, and those who did—like Rubin—had built their careers rejecting it. His advisors, and Democrats writ large, accepted the solidity and importance of large corporations and banks; the idea of using public power to structure markets was not only off the table, it was outrageous. He would work with a Democratic Congress led by Watergate Babies. Both his major financial reform bill and his major push for health care went through committees chaired by members of the class of 1974 (Chris Dodd, Henry Waxman, Max Baucus, and George Miller). Barney Frank, elected a few years after the Watergate Baby class, had sat on the Committee on Party Effectiveness in the 1980s, and chaired the Financial Services Committee in the House during the crisis.
Not surprisingly, Obama’s policy choices ended up pushing wealth and power upward. Under Obama’s leadership, the Democratic Party continued Bush’s bank bailout, but without a populist restructuring of the monetary, debt, or industrial systems that had led to the crisis. In 2008, to secure the votes Bush needed for his bailout, Obama asked congressional allies to support the $700 billion Troubled Asset Relief Program. One newly elected member, Donna Edwards, demanded that he also attach to the bailout protections for homeowners, by allowing people to write off their mortgage debt in bankruptcy, which the law did not then permit. This would give borrowers negotiating leverage with the banks to restructure their mortgages to terms they could pay. Obama demurred but promised her that he would get it done later. She voted for the bailout. Unbeknownst to her, Obama’s transition team, those who had been in the audience two years earlier, had already ruled out such changes to bankruptcy laws. The result was upward of nine million foreclosures. America’s middle class lost between $5 and $7 trillion in wealth.74
The lack of a populist wing in the Democratic Party meant that Obama received little criticism over these choices. But when asked why he didn’t address the foreclosure crisis, it became clear that Obama had chosen a remarkable ideological path for any Democratic president. He drew a comparison between his own immediate task and that of the last president to take office during a financial crash. Rather than come up with some sophisticated excuse for not following FDR’s model, Obama instead embraced long-disproven Republican libels of the New Deal, criticizing FDR for refusing to work with Herbert Hoover and thereby causing a banking crisis. We “didn’t do what Franklin Delano Roosevelt did, which was basically wait for six months until the thing had gotten so bad that it became an easier sell politically.”75
Obama’s treasury secretary, Tim Geithner, joined the attack on FDR, whom he charged with refusing to “lift a finger to help the outgoing administration relieve the suffering of the Depression.”76 Instead, he lauded the architect of concentrated financial power in America, Alexander Hamilton, calling him the “original Mr. Bailout.” Bill Clinton, who mentored many of the figures in the Obama administration, got together with Geithner and mocked the “bloodlust” of Americans who wanted justice for top-tier bankers.77
Like Bill Clinton, Obama ended up prioritizing the stability of a concentrated financial system over risking an attempt to end the foreclosure wave threatening the American housing market, or engaging in white-collar criminal prosecution, antitrust enforcement, or any sort of crackdown on concentrated financial power. The Democratic Party had become the party of Hamilton and Mellon.78 To the extent there was a reformist element in the Obama administration, it was limited to the creation of a new Consumer Financial Protection Bureau, which was oriented around the consumerist framework that the Watergate Babies understood.
At the height of the Great Recession, Obama’s antitrust officials, drawing on their Clinton-era experience and their training in the Chicago and Harvard schools of law and economics, helped to engineer another merger boom, in telecoms, pharmaceuticals, airlines, event ticketing, media, and technology. Big Tech did especially well. During Obama’s years in office, Google, Facebook, and Amazon acquired hundreds of companies, growing to dominance over advertising markets, retail, and information technology, and in the process pushing forward another round of defensive consolidation in the rest of corporate America.
Notionally “progressive” corporations like Google became key pillars of a cosmopolitan liberal culture. This was the world of the Watergate Babies and the corporatist thinkers who shaped their intellectual understanding of it. Toward the end of the administration, Obama even helped a popular artist, Lin-Manuel Miranda, popularize a smash-hit theater production about Alexander Hamilton, portraying the founding monopolist—and onetime slave owner—as a sexy and daring warrior for racial justice.79
Unchecked, the great private monopolies made the social dysfunction of the Reagan era even worse. Americans became more obese, pumped full of sugary industrially processed foods by a small number of corporate giants. Independent black businesses collapsed; black-owned banks were a tenth as likely to get bailout money as other banks. A gruesome heroin epidemic spread in rural areas, spurred by hopelessness and corruption among pharmaceutical monopolies. Toward the end of Obama’s second term, the life span of white men and women without a college education began dropping as suicide, alcoholism, and drug addiction caused a die-off, what policymakers began calling “deaths of despair.” This epidemic approached, and then exceeded, the height of the death toll of the AIDS crisis.80
Meanwhile, the gilded elites who controlled the Democratic Party mused about theories projecting the natural inevitable unspooling of history, like the idea that “superstar cities” just innately attracted talented engineers, versus rural backwaters that would naturally lose out due to ignorant bigotry. The administration and bipartisan leaders in Congress continued to support the by-now-old framework, including working on behalf of corporatist trade deals and progressive tech monopolists.
In an era in which the Democratic Party was perhaps the strongest support pillar for the new Mellonism, the American people had nowhere to turn but the street. Republicans were the first to rebel, as self-described Tea Partiers—backed by billionaires—attacked the Bush-Obama bailout of the banks. Then came social movements on the left—Occupy Wall Street in 2011, and Black Lives Matter in 2013. At the ballot box, Americans voted for change in 2006, 2008, 2010, 2014, and 2016, veering from party to party in a desperate search for someone to address their fears and anxieties. This rebellion, in the streets and at the ballot box, was inchoate, and did not point the finger at the real ideological culprit: the hold monopolies had over American business, politics, culture, even the family.
When Donald Trump ran for office, his platform of “America First” and his slogan, “Make America Great Again,” with its undertones of soft authoritarianism, seemed at first a joke to the gilded elites. But the Democratic Party had been gradually weakened during the Obama administration, with more than a thousand elected offices at every level below the presidency falling to conservative Republicans. Then in June 2016, Great Britain voted to leave the great postwar project of the European Union, revealing deep populist anger at the status quo worldwide. And in January 2017, it was Trump, not the heavily favored Hillary Clinton, who took the oath of office.
The Democratic Party, and American democracy, would have to be built anew.

Wright Patman was an optimist, but the rise of soft authoritarianism globally would not have surprised him. Dictatorship in politics is consistent with how the commercial sphere has developed since the 1970s. Americans are at the mercy of distant forces, our livelihoods dependent on the arbitrary whims of power. Patman once attacked chain stores as un-American, saying, “We, the American people, want no part of monopolistic dictatorship in… American business.”2 Having yielded to monopolies in business, we must now face the threat to democracy Patman warned they would sow.
The plutocratic winds blowing across the global landscape were formed from Michael Milken–staked takeover barons and their descendants, as well as Sam Walton’s Walmart, a deregulated transportation and telecommunications world, and an endless merger wave. We have concentrated markets in everything from airlines to coffins to candy to hospitals.3
But we also face a challenge even more significant than the consequences of a four-decade-long Reagan revolution, because layered on top of the political revolution wrought by the Chicago law and economics school and their left-wing allies is a technological revolution that has enabled a far more dangerous concentration of power. While Bill Gates never got his “vig” over every commercial transaction, the next generation of monopolists is closer than ever to doing so. The pace-setting political institutions in our culture are tech platforms, in particular Amazon, Google, and Facebook. These companies are information monopolies, manipulating the free flow of information, and our ability as citizens to think and come together to do politics.
Tech platforms are rewiring our culture. As Senator Richard Burr lectured tech executives in 2018, “The information your platforms disseminate changes minds, hardens opinions, helps people make sense of the world.” Or, as a reporter tweeted, “My friend’s toddler babbled ‘don’t forget to subscribe’ as he was put to bed. The kid watches so much YouTube he thought it means ‘good bye.’ ”4 This is not just a domestic problem; earlier in 2018 Sri Lanka banned Facebook because the company was unable to prevent the use of its platform to foment ethnic hate crimes.5 Journalism and politics are centralizing all over the world, on top of centralized intermediaries of our information, commerce, and advertising revenue.
Google and Facebook, in 2018, took roughly 60 percent of all online ad revenue in America, and online ad revenue is the largest and fastest-growing source of advertising money. Google has about 90 percent of the search ad market, can track users across 80 percent of websites, and its ad subsidiary AdMob has 83 percent of the market for Android apps and 78 percent of iOS apps. Facebook has 77 percent of mobile social networking traffic, and roughly two thirds of Americans get news on social media.6
As Wired magazine editors Nick Thompson and Fred Vogelstein put it, “Every publisher knows that, at best, they are sharecroppers on Facebook’s massive industrial farm.… And journalists know that the man who owns the farm has the leverage. If Facebook wanted to, it could quietly turn any number of dials that would harm a publisher—by manipulating its traffic, its ad network, or its readers.”7
Roughly 1,800 local newspapers in America have disappeared since 2004, and over 2,000 of the country’s 3,143 counties now have no daily newspaper.8 Pittsburgh has become the first midsized regional city without a daily newspaper. Specialty newspapers are dying as well; from 1999 to 2009 the number of black newspapers was cut in half.9 From 2005 to 2015, roughly 26 percent of newspaper journalists—including those at digital outlets—were laid off. There have also been massive declines in the workforces of related industries, like radio, book publishing, magazines, and music.
In other words, America is increasingly a news desert. This may not be obvious from the surfeit of seeming outlets for information, the endless number of websites, cable channels, and the stream of information coming from social media. But the reality is that the increasing number of seeming options for information masks a smaller and smaller amount of original reported news. As journalist and media researcher Tom Rosenstiel put it in 2009, “A good deal of what is carried on radio, television, cable, wire services begins in newspaper newsrooms.” Today, most of what we read on Twitter or Facebook originates there as well.
Meanwhile Amazon captures nearly one of every two dollars Americans spend online, and it is the leading seller of books, toys, apparel, and consumer electronics in the nation. Its cloud computing subsidiary has over one million enterprise customers, it is a major movie producer and defense contractor, and it has 100 million U.S. customers that are members of its Prime bundling service. It is the number one threat to independent retailers.10
Book publishing and distribution, media financed by advertising, and social media are how we communicate ideas with one another, and all three channels for information are increasingly in the hands of a monopolist.
THE RISE OF AMAZON
Amazon, like Google and Facebook, exists because of the legal shift enabling and promoting bigness in structure and monopoly in business strategy. But while new, these companies are network industries, very much like their antecedents the railroads and Standard Oil.
In 1994, Jeff Bezos, working at a hedge fund, conceived of the idea of building an “Everything Store” that would serve as the monopoly intermediary for commerce. He started by selling books, but his plan was to expand into every possible item. As Bezos’s hedge fund boss, who helped conceptualize the notion, put it in 1999, “The idea was always that someone would be allowed to make a profit as an intermediary. The key question is: Who will get to be that middleman?”11 Amazon was born to be a monopolist. Bezos, like John D. Rockefeller, sought every competitive advantage. He noticed a 1992 Supreme Court decision allowing mail order companies to avoid sales tax, and used it to create a pricing advantage against his competitors.12
Bezos drew inspiration from Sam Walton, hiring hordes of Walmart executives to build out his retailing and logistics infrastructure. Venture capitalists also pushed Bezos toward monopoly. After the first major investment by Kleiner Perkins, an important investment firm, the company’s internal slogan became “Get Big Fast.” The end of Robinson-Patman enforcement meant Amazon could use bulk discounts to monopolize product markets. Bezos also used predatory pricing either to drive his competitors out of business or to acquire them, in one case putting Amazon on track to lose $100 million in three months by selling diapers below cost, to force a company called Diapers.com to sell to him in an acquisition approved by the toothless Federal Trade Commission.
In the late 1990s, Bezos began repurposing Amazon’s logistics capacity to sell DVDs, music, and toys, and opened its storefront to third-party merchants, turning the retail business into a “platform” where multiple buyers and sellers come together to interact. Amazon also competes on this platform, so the corporation has unparalleled surveillance into its competitors’ business. Railroads in 1908 had been prohibited from this kind of vertical integration; so had A&P in 1949. But Amazon exploded in a legal environment crafted by Bork, where vertical integration was a signal not of monopolization but of efficiency.
Today, Amazon is an infrastructure and data conglomerate that is well on its way to becoming the intermediary for all commerce. It is a gatekeeper to online buying, with roughly half of all online retail coming through Amazon. This means that Amazon can impose conditions on any merchant that seeks to sell online, and it does so with abandon. To sell in a prominent place on the site known as the “Buy Box,” merchants and manufacturers are encouraged to use Amazon’s fulfillment and logistics services and its advertising services. Amazon also creates private-label versions of the hot-selling products on its platform, using its surveillance monopoly power.13
The corporation sells hundreds of millions of products, both directly and by managing its “marketplace.” It owns a delivery and logistics network to serve its own retailing operation and those of its third-party merchants, including a growing shipping fleet. It has retail stores and owns the Whole Foods supermarket chain, and is the largest book retailer in the world and one of the largest publishers in America.14
The corporation’s business strategy is to replicate its takeover of the online retail business, where it first established a powerful market position using aggressive pricing and then vertically integrated into new business lines by opening up its system to third-party merchants. Bezos has built or bought a series of similar platforms that Amazon both controls and on which Amazon competes.
Amazon manufactures Kindle e-readers and Fire tablets, and controls the Alexa voice-assistant platform. It runs the dominant cloud computing business, which hosts the websites and services of millions of businesses (as well as the CIA and much of the U.S. government), a sort of operating system for the web. It is also one of the largest producers of television and film, electronics, fashion, and advertising services, as well as a credit lender to small businesses. Its advertising and cloud computing businesses are each worth hundreds of billions of dollars on their own.15
There is no perfect analogy to Amazon, but it is a mixture of Standard Oil, the A&P chain store, the Microsoft software monopoly, and the Mellon system of interlinked businesses. Bezos came from the hedge fund world, learned from Walmart and Microsoft, and took advantage of the legal framework structured by Director, Bork, and the law and economics movement.
A GOD’S-EYE VIEW: THE RISE OF GOOGLE AND FACEBOOK
Bezos was also one of the original investors in the dominant information intermediary formed in 1998, Google. As Google cofounder Larry Page put it, “Jeff was very helpful in some of those early meetings.”16
The company originated as a computer science project by Page and his academic partner, Sergey Brin, to map the internet more effectively than had been done until that point. Brin and Page realized that existing search engines were riven by a structural conflict of interest, because they were advertising-supported. Such engines would thus manipulate their search results in service to their advertising business. They decided to start a company using their academic research.
Google was, like the “Everything Store,” monopolistic from inception, with a goal of “organizing the world’s information.” Its business model emerged as a result of this libertarian legal environment, fused with progressive cultural tolerance and a key antitrust suit. While the corporation was started by the two computer scientists, it was Eric Schmidt, an executive at Microsoft-opponent Novell, who built the monopolistic business model. He recruited to Google Sheryl Sandberg, a Clinton Treasury official who would later help organize Facebook. Schmidt, Page, and Brin soon turned Google toward an advertising-supported model, despite earlier misgivings about the basic conflict of interest such a model would entail.
The regulatory framework for what would become Google’s business, online advertising, emerged in the mid-1990s, after the Bork revolution was complete. In the 1990s, the FTC explored how to update its mission in the context of the rise of the internet. The Federal Trade Commission’s 1996 report on consumer protection policy in the “new, high-tech global marketplace” framed the job of the commission as fostering self-regulation of industry. While the FTC staff report noted the possibility of “online entry barriers through search engines designed to push competitors out of the way,” it also quoted an expert hailing interactive media as “the first intelligent media on the consumer side.”17 The recommendations largely confined the FTC to encouraging self-regulation and policing consumer fraud.
At the same time, the antitrust suit against Microsoft by the state attorneys general and the Department of Justice set the stage for the innovative environment in which the search giant would emerge. After the suit, Microsoft executives began sharing software plans proactively with competitors, and decided against an early plan to use its control over the now-dominant Internet Explorer browser to crush new entrants to the internet economy, including Google.18 In the late 1990s, entrepreneurs rushed into the online advertising and internet space, propelled by the internet boom and no longer afraid of being squashed by Microsoft.
Over the course of the 2000s, Google grew into a significant advertising-fueled search engine, and eventually a data conglomerate. It bought companies to build out its advertising business in 2002 and 2003, the software companies that would become Google Maps in 2004, and the internet video monopoly YouTube in 2006. Google had the ability to advertise effectively because it knew what its users were thinking. As users did searches, Google accumulated a vast store of knowledge about the thoughts and questions of hundreds of millions of people. It became, as influential writer John Battelle put it, a “database of intentions.” In 2007, encouraged by industry lobby groups, the FTC offered principles for self-regulation in the behavioral advertising market, largely oriented around the industry regulating itself voluntarily.19
The legal environment was favorable to monopoly and did not have significant public rules structuring advertising markets, which meant that there was a scramble for dominance. What followed, much like the creation of U.S. Steel from independent iron and steel companies in 1901, was a merger wave. AOL, Yahoo!, Microsoft, Verizon, WPP, and Oracle all became major buyers of behavioral targeting analytics companies, ad exchanges, publishers, and ad networks. But Google led the pack. From 2004 to 2014, Google spent at least $23 billion buying 145 companies.20
The most important acquisition came in 2007, when the FTC permitted Google to purchase DoubleClick, its rival in the online advertising space.21 DoubleClick was becoming a key railroad-like marketplace for internet advertising, a facilitator for newspapers and publishers to sell their advertising online and for ad buyers to buy advertising online. Google, in buying DoubleClick, combined its “database of intentions” with a vast trove of knowledge on how advertising campaigns and user behavior worked across third-party websites. The merger allowed Google to leverage its dominance in search advertising into the third-party display advertising in which DoubleClick specialized.
It was a controversial decision. As FTC Commissioner Pamela Jones Harbour wrote, “I dissent because I make alternate predictions about where this market is heading, and the transformative role the combined Google/DoubleClick will play if the proposed acquisition is consummated.”22
While multiple players were vying for online advertising supremacy, it was largely Google, and Facebook, that won. In 2004, Facebook emerged as a nascent competitor to Google in a new set of online markets: social media, or the sharing of social information. Facebook piggybacked on this libertarian legal framework and purchased over seventy companies to attain monopoly power over social media advertising. Its key acquisitions were Instagram in 2012 and WhatsApp in 2014. While each company has its own complex dynamics, Facebook and Google are both essential infrastructures for the digital economy, with little public accountability.
Today, Google has eight products with more than a billion users apiece.23 It knows what you think through its much larger “database of intentions.” It knows where you go in the physical world through its two billion constantly roaming Android phones and its mobile ad app and Google Maps subsidiaries. It knows where you go online through its tracking businesses, and it has information about business ad campaigns through its advertising technology subsidiaries. This provides Google with a God’s-eye view of behavior. Married to this surveillance power is the ability to organize the distribution of information through YouTube, Google Maps, Google search, email, and its own popular browser. As Schmidt put it, “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”
Google, Amazon, and Facebook are conglomerates that monopolize ad markets, and have done so through a range of tactics and mergers that were until very recently illegal. And in doing so, they have become governing powers. Mark Zuckerberg, the CEO of Facebook, put it this way. “In a lot of ways,” he said, “Facebook is more like a government than a traditional company. We have this large community of people, and more than other technology companies we’re really setting policies.”24
While there are analogies, this concentration of power over the sinews of information is new. In 1831, Alexis de Tocqueville visited America and was astonished at the number and diversity of newspapers, later writing, “There is scarcely a hamlet which has not its own newspaper.” He contrasted that to the centralized newspaper systems in Europe, which were designed to insulate kings from public criticism. In America, there were so many newspapers that wealthy and famous people could not use the power of dominating information channels to, as he put it, “excite the passions of the multitude to their own advantage.”25 Americans rebelled against the British, in part, because of taxes the British levied on newsprint, which was the mechanism for organizing speech.
Today, with Google, Amazon, and Facebook, we find ourselves, in America and globally, with perhaps the most radical centralization of communications power that has ever existed. One company controls roughly 90 percent of what we search for. It also knows what we think, because we tell it, through our searches. Another company controls our book market, and a third controls how we interact with our social worlds. Meanwhile, the free press is dying.
TOWARD A NEW DEMOCRACY
In 1912, Americans understood that they were in the midst of a crisis, similar to those they had faced prior to the Civil War and during the revolutionary era. Theirs was a crisis of concentration of power, and they faced the decision of moving toward autocracy or democracy, a free society or a slave society. Today we face a similar moment of reckoning. There are different technical, social, and physical characteristics, but we can recapture the ability that our forebears had to demand democracy in the political and the commercial sectors.
The Watergate Babies rejected the lessons of Patman’s generation for many reasons, conflating economic populism with an antiquated vision of the economy. But there was wisdom in Patman’s lessons. In the 1930s, he said that restricting chain stores would prevent “Hitler’s methods of government and business in Europe” from coming to the United States. For decades after World War II, preventing economic concentration was understood as a bulwark against tyranny. From the 1970s until the financial crisis, this rhetoric seemed ridiculous. No longer. Financial crises occur regularly now, and prices for essential goods and services reflect monopoly power rather than free citizens buying and selling to each other. People worldwide, sullen and unmoored from community structures, are turning to rage, apathy, protest, and angry tribalism.
The reason for this dissatisfaction, the anxiety, is clear. The institutions that touch our lives are unreachable. We organize our social networks through Facebook, our information through Google, our health care from complex bureaucracies, our seeds and chemicals through seed monopolists. We sell our grain through Cargill and watch movies, buy groceries, books, and clothing through Amazon. Open markets are gone, replaced by a handful of corporate giants. We are increasingly addicted to opioids, sugary processed foods, and alcohol. We have no faith in what was once the most democratically responsive part of government, Congress. We cannot begin to address perhaps the most important existential challenge humanity has ever faced, climate change. Steeped in centralized power and mistrust, people all over the world face demagogues selling blame and hatred.
But even more profound than the anxiety is the confusion. Our policymakers, until recently, saw giants like Google, Amazon, and Goldman Sachs as exemplars of the American spirit, instead of the dangerous re-creation of trading corporations seeking to control and enslave us—like the East India Company—against which we rebelled. They confused charity for justice, meekly asking our munificent plutocrats to raise our wages or donate their ill-gotten gains to charitable foundations plastered with their names. We have asked for new laws to offer welfare to the poor, instead of seeing poverty itself as a lack of freedom, as a denial of the basic rights of citizenship. This is not democracy. This is servitude.
And yet, paired with this profound discouragement and confusion is a moment of tremendous opportunity. While today our monopolists use our scientific genius to encourage us to click on ads, the technology of our age is unimaginably powerful. Each of us carries a supercomputer disguised as a phone in our pocket, connected through satellites and wireless towers to the most extensive information network in human history. This grid can become a tool of liberation, or it can become the most sophisticated set of leashes ever invented. Across our commerce, our industries can be remade, and remade in remarkably innovative ways, if we would but move aside the entrenched status quo of monopolists and unleash the talents and genius of a free people upon them.
For forty-seven years in Congress, Patman got up early every morning to make sure that the “plain people,” as he put it, could live a secure middle-class life, not just in material comforts, but as an independent self-governing people. He drew upon a founding tradition, which was that the basis of a democratic social order was not just the right to vote, but the wide dispersal of private property among citizens.
Under this Enlightenment idea, property did not just mean the ability to dominate or use a parcel of land or a piece of capital, but also the responsibility to take care of it. The ability to use property and one’s labor to produce wealth created political independence. A citizen who could grow food and make money with a farm or store was not dependent on any social better or aristocrat for her livelihood. Carefully and publicly regulated markets that protected the rights of buyers and sellers were the forums by which citizens formed communities, freely trading goods, services, crops, information, their own labor, and ideas. Citizens as individuals had the ability to say no, and because they had that power, they had the ability to come together as a society.
The coming together to protect our property and liberty—from monopolists, financiers, or foreign powers—by a government we controlled was the American experiment. This Enlightenment ideal has been continually updated to take advantage of modern technology and new political and social contexts. After World War II, we exported this model globally, so that it became the basis for seventy-five years of prosperity and peace in Europe, and so that we would not have another world war. The American experiment, born in the Enlightenment, was, and is still, based on a radical philosophy of self-government. It is still who we are. We are coded for peaceful rebellion. We are born to fight against power. As a people, we expect our government to uphold justice. And we are angry, and sullen, when it does not.
We have challenges Patman did not face. There was no worldwide threat of climate change. And Patman did not have to face a society so alienated from itself as we are today. It was not outlandish in the 1930s to imagine restructuring corporate America, as corporate America was relatively new. Family farms existed, and corporations, while enormous, had not stretched across the globe in dangerous fragile webs of production, as they have today.
Still, we have one remarkable cultural advantage. Patman grew up in an era when citizenship meant being a straight white male. Being gay, or black, or female, or falling in love with someone from another race, or refusing to conform to any number of social norms, meant being excluded from social rights and becoming a political target. Patman was not a racist. He faced down the Ku Klux Klan in the 1920s, he ensured broad economic rights regardless of race, and he helped put the first black man on the Federal Reserve board in the 1960s. But he did not make racial equality a core mission of his life, and he voted for segregation in the 1950s and 1960s because he would have lost his seat in Congress had he not, even as he faced candidates claiming he was insufficiently committed to white supremacy.26
Patman had to make a cynical, ugly choice because most white Americans of his era would not tolerate racial or gender equality. Today, because of sacrifices going back generations, this is a choice we no longer have to make. We can form a multiracial democracy based on equality. We have not done so, but it is within our ability.
There are other reasons for hope today. The financial crisis induced new ways of thinking. In 2008, Elizabeth Warren was appointed by Senate majority leader Harry Reid to chair an oversight panel on the bank bailouts. She was from Oklahoma, one of the key centers of populism, and she grew up on the teetering edge of the middle class. In 2009, as chair of the Congressional Oversight Panel, she helped organize the first broad attack on financial power in decades, exposing bad behavior of banks and systemic criminal practices during the foreclosure crisis. Vermont senator Bernie Sanders then ran a campaign in 2016 based on opposition to Too Big to Fail banks, a phrase that came to resonate culturally and channeled popular American frustration with corporate gigantism. The antimonopoly movement is rising everywhere. The Democratic Party platform had an antimonopoly plank in 2016 for the first time since 1988, and most major Democratic Party figures are expressing concern over the power large technology companies have. So are important Republicans, such as Senator Josh Hawley of Missouri and Congressman Doug Collins of Georgia. The Department of Justice and the Federal Trade Commission are arising from their decades-long slumber, and Representative Emanuel Celler’s old stomping grounds, the Judiciary Committee, announced an investigation of big tech in June of 2019. In Europe, Australia, Japan, Israel, and India, antitrust enforcers are bringing suits and investigations. All of this is being spurred by a rising popular frustration with corporate concentration. The people are waking up.
The vision put forward by Warren and Sanders of a financial system under control by democratic institutions is fundamentally organized by the New Deal framework and its populist legacy. Even Donald Trump is a throwback, looking like a faint echo of the Mussolini that New Dealers feared. The rise of corporatism, and the backlash, has sparked a search for ideas and a new generation of leaders. In the 2018 midterm elections, a wave of young post-financial-crisis politicians took office. They may not know exactly what they want, but they are eager to find a new tradition, and merge what made sense about the Watergate Baby frame with a new populist understanding of power.
There is also now a new intellectual community. In the latter part of the Clinton administration, a business journalist named Barry Lynn began studying the problem of monopolization in the supply chains that moved goods from East Asia to the United States. In 2010, Lynn published Cornered, a book that inspired a new antimonopoly movement that draws its inspiration from what has come before.
Since then, a group of antimonopoly historians, economists, law professors, business leaders, politicians, policymakers, and writers has emerged, rediscovering our traditions and updating them for the age in which we live today. The book you are reading is a product of that community. It is intended both as a history and as an invitation to join us.
In 2017, ironically at the Stigler Center at the University of Chicago, the intellectual core of this movement came together for its first meeting to examine the question, “Is there a concentration problem in America?” This new group of antimonopolists debated, for three days, with the Watergate Baby generation experts, who by now had become the old guard. And for the first time, the technocratic children of Bork, Areeda, and Turner had to debate the wisdom of Brandeis.
Join us. Make this your tradition, wherever you live and whatever you do. Because it is your tradition, it is your birthright, as an American, or just as someone who believes that free individuals can come together and govern themselves. If you are dissatisfied, if you seek a better world, you are not alone.
I’m not going to lie. We are in a bad spot. It isn’t just that we have to contend with plutocrats. It’s that we have divorced property ownership from caretaking itself. Our industrial supply chains are fragile, concentrated, and full of dangerous, hidden risks. The Chinese Communist Party, with its highly sophisticated and vast industrial power, has integrated surveillance into a nascent totalitarian model, with not only concentration camps full of ethnic minorities but immense leverage over Western corporations. Democracies are falling globally, in part because people understand that the formal mechanisms of voting do not matter when decisions about political economy are reserved to a coddled elite. But we can fight back.
What that means is not just protest, or elections. It means learning. Whoever you are, whatever interest you have, you have the ability to do that. There are tens of thousands of markets, local, national, international, and a democracy requires people to think and learn about the policies underpinning each one of them.
Use your identity. Every subgroup—whether white, black, gay, straight, immigrant, male, female, genderqueer—interacts with market structures that can discriminate, integrate, liberate, or not. If you are experienced in an area, you see with experienced eyes, and can guide policymakers into building a more just way to do commerce. If you are building a business, think about the right way to do commerce, and make that your politics. If you are young, take your interest, whatever it is—law, politics, sports, history, business, engineering, farming, art—and learn not just about the technical specifics of the field, but about the moral choices you can make in that field. If you want to enter politics, think about the glorious life you could have if you emulate Patman, a man who loved his constituents, and fought for the plain people and against monopolists, or any of the staffers, regulators, business leaders, or lawyers who were his allies. Our monopoly problem is a massive one, but it can be solved by breaking it down into person-sized chunks.
Fighting back also means not falling into the trap of elitism. The weakness of the technocrat is “imposter syndrome,” a feeling that he or she doesn’t belong in a position of power, that his or her expertise is a pretense. This insecurity is what Aaron Director identified as his key to power. “Beautiful smugness” is how John Kenneth Galbraith and Richard Hofstadter persuaded a generation to give up their liberties. These men, Director in particular, understood that powerful liberal lawyers, well-respected and arrogant, were susceptible to mockery by their colleagues. Director made fun of lawyers who subscribed to common sense, creating a social context where an embrace of complex-seeming models by credentialed experts who were working for oligarchs overrode respect for justice, democracy, or basic human decency. In doing this, Director reoriented the elitist liberal brain, rewiring it for plutocracy without liberals even knowing.
The only immunization against this is a democratic form of populism. I do not mean the toxic fake version of populism, demagogues and frauds blaming ethnic groups. I mean old-school populism: the belief that citizens, educated and responsible, know what is best for themselves, and that, united, they can come together in a democracy and use the law to protect and develop themselves.
This populism does not disdain expertise, but embraces it. But expertise must serve the people; it must not be oriented to confuse them, to erect a new aristocracy. And that means you must never be afraid to say “I don’t know,” and you must do your best to remember that men in suits with impressive credentials can and often do lie, cheat, and steal. It’s true that humanity has a remarkable amount of accumulated knowledge, but as individuals, we’re all making it up as we go along. To do the work of being a citizen, each of us has to work hard, learn, and build up a working body of knowledge.
We have created and re-created our republic many times in our history. We did it in 1776 when we declared independence, not just from a king but from the idea of aristocracy itself. We did it in freeing ourselves from the Slave Power, and again in 1912 and during the New Deal and World War II, when we liberated ourselves from industrial barons and fascists. Today, we must choose whether we have the courage, wisdom, and discipline to govern ourselves, as individuals, as communities, and as a nation. That is our choice, as a people.
Nothing about monopolization is inevitable. Our increasingly dystopian and corrupt corporate apparatus was brought to us by people selling a fantasy of inevitability. Some of them sold us a right-wing fantasy of corporate monopolies and bigness as a sign of progress. Some of them sold us a left-wing fantasy of corporate monopolies as an unstoppable feature of capitalism. But these fantasies are, in the end, the same. They both are designed to sell you on the idea that you have no power, that you are nothing but a consumer.
And that is not true. It has never been true. America has always been a nation of tradespeople. We embed social justice in our banks, our corporations, our markets, in how we exchange goods, services, crops, ideas, and labor with one another. Each of us is a worker, a businessperson, a consumer, and a citizen. The real question is not whether commerce is good or bad. It is how we are to do commerce, to serve concentrated power or to free ourselves from concentrated power.
This is the choice that has always confronted the American people, liberty for all or a small aristocracy governing our commerce and ourselves. To choose wisely, we must unlearn much of the history we have been taught. Many of us learned a version of our history as one of inevitable progress, goodness, and triumph. Many of us learned the inverted version, that our history is one of inevitable sin, racism, conquest, greed. Neither of these is true, because both versions airbrush out our own free will. The truth is, America is a battle, a struggle for justice. And we choose, every generation, who wins.
