  The second assumption was that employers would provide workers with critical social benefits. Employees would get health care and pensions, and eventually even educational, child care, housing, transportation, and other benefits, from their employers, not from the government. Employer-based benefits had been around since the late nineteenth century, but they took off during the 1940s. Between 1940 and 1945, health insurance coverage expanded from 1.3 million American workers to 32 million workers.3 After the war, the country continued along the employer-based benefits path instead of creating national programs, as many European countries did. In America, government programs existed largely as a safety net for those who fell through the cracks of the employer-based system. The extremely poor received health care through Medicaid. The elderly and retirees got their health care through Medicare and a minimal pension through Social Security’s old age provisions. The disabled, orphans, and widows received a basic income through Social Security. But these were exceptions. As a general approach to public policy, ordinary Americans would not get their benefits from the government.

  The third assumption underlying the Treaty of Detroit was that employers would provide a “family wage.” The family was imagined as a male breadwinner, a stay-at-home wife, and children.4 When Reuther and Wilson negotiated the Treaty of Detroit, the idea of wages rising with the cost of living and with productivity improvements was partly a function of the need for workers to provide for their families. Indeed, between 1948 and 1978, the median man’s income kept pace pretty well with productivity. Productivity rose 108 percent during this period; hourly compensation increased 96 percent.5

  Starting in the 1970s, however, companies began withdrawing from the Treaty of Detroit. Oil shocks, stagflation, increasing competition from abroad, and an ideological shift that emphasized shareholder profits instead of stakeholder well-being all contributed to corporations shedding the view that they were “in it together” with their workers. Employers instead increasingly shifted risk onto their workers.6 One of the best examples of what Jacob Hacker calls the “great risk shift” was the move from providing defined-benefit pension plans to offering defined-contribution pension plans. Defined-benefit plans give retirees a fixed amount in their yearly pension, which means the risk of market fluctuations is borne by the employer sponsoring the pension. Defined-contribution plans, in contrast, involve employees putting money into pension funds and investing that money; the retiree, not the company, bears the risks of market fluctuations. In 1980, there were 30 million defined-benefit plan participants, compared to only 19 million defined-contribution participants. By 2006, there were only 20 million defined-benefit participants, compared to 66 million defined-contribution participants.7 Today companies rarely, if ever, offer new defined-benefit pension plans to their employees. Workers bear the risks of the market.

  The trouble with the Treaty of Detroit model is that the assumptions undergirding the system of employer-based benefits are no longer reasonable, if they ever were. Americans don’t stay in the same job or career for decades. According to a 2016 Gallup study, 21 percent of millennials reported changing jobs within the past year.8 This isn’t unique to millennials. Even past generations of workers went through multiple job transitions throughout their lifetimes.9

  When people do work for companies, they increasingly are not actually employees of the company. The most prominent example is Uber and the “gig economy.” Uber claims that drivers are simply using its mobile platform and are therefore not employees of the tech company. On this understanding of the gig economy, because workers are not employees, the company doesn’t have to provide benefits—no retirement package, no health insurance, no unemployment insurance.

  The gig economy might get the most attention, but technology companies aren’t the only—or even the most common—cases of this phenomenon. Consider a housekeeper working at the San Francisco Marriott. Even though its name is prominently displayed outside, Marriott doesn’t own the property—a company called Host Hotels and Resorts does. Nor does Marriott employ the housekeeper or manage her hours or payroll—that’s the job of Crestline Hotels and Resorts, a hotel management company. Marriott does establish guidelines for how the housekeeping staff is supposed to maintain the hotel rooms. But this is a far cry from the old days, in which Marriott would have owned the property, managed the hotel, and employed the people working there. This example, taken from David Weil’s The Fissured Workplace, shows just how much the employer-employee relationship has changed in the last generation.10 Today, Weil argues, workplaces across a variety of sectors have been fractured, leading to complex employment relationships. Employees are now often working for contractors or subcontractors. Sometimes they are independent contractors who have no direct relationship with a company at all. Whether you call it the “gig economy,” the “1099 economy,” or the “patchwork economy,” the common theme is that workers are increasingly cobbling together different forms of employment—part-time, gig, independent contracting, and self-employment.11 They aren’t working for a single employer.

  The decline of the lifetime employer idea and the fissuring of American workplaces strike at the very core of the Treaty of Detroit’s employer-based benefits model. It makes little sense for employers to pay expansive pensions to a worker whose tenure is only a few years. And for workers who switch jobs frequently, it is a hassle to change retirement plans and health care providers every time they move to a new company. When companies use independent contractors or gig workers, the situation is even more problematic. These individuals don’t get any benefits. The legacy of the Treaty of Detroit leaves them out in the cold.

  The basic idea behind the family wage hasn’t fared well, either. Almost every assumption behind the imagined ideal of a 1950s-style family, where Dad comes home from work to a stay-at-home mom and a couple of kids, is out of date. While wages and productivity grew together in the Treaty of Detroit era, the ordinary male worker’s wages have been stagnant since the 1970s.12 True, household incomes continued to rise through the 1970s and 1980s until about 2000.13 But this was partly because women entered the workforce in increasing numbers. In May 1950, when the Treaty of Detroit was signed, 33 percent of women were in the workforce. Fifty years later, in May 2000, that number was 60 percent.14 Women’s income buoyed households that were already struggling under the model of breadwinner liberalism.

  Whether or not the Treaty of Detroit paradigm suited the age of managerial corporations, it is certainly out of sync with employment and families today. Adhering to the approach of employer-based benefits makes little sense given contemporary conditions. While some people might be nostalgic for the America of the 1950s, nostalgia is rarely a good guide to public policy.

  The Perils of Neoliberalism

  As companies withdrew from the Treaty of Detroit, a second paradigm for public policy took hold. Neoliberalism was defined by its desire to use markets as the means to achieve social goals and by its view that markets were an end in themselves.15 Ideologically, neoliberalism grew out of a conservative intellectual movement that was powered by anti-government Austrian economic theorists who fled Europe in the mid-twentieth century, and it was buoyed by the Cold War and the fear of communism.16 It gained political dominance with the elections of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom and became virtually universal when liberals turned to neoliberal policies in the 1990s and 2000s.17

  When it came to public policy, neoliberals thought that markets—rooted in strong private property and contract rights—were the best way to achieve the well-being of society and to preserve and advance freedom for all. On their theory, government would do little more than enforce contract and property rights. If government was going to act to advance broader social goals (like health or education), it would do so through private markets. Privatization involved outsourcing government functions to profit-seeking companies. Deregulation became central to neoliberalism because it freed markets to act without constraints. Vouchers, state-provided coupons to purchase private services, became the dominant approach for how government could advance social goals by market means.18

  When we look back from beyond the 2008 crash, neoliberalism seems woefully naive. But Reagan-era policy makers were facing their own age of anxiety and were searching for an alternative to old institutions that no longer seemed viable. The economic chaos of the late 1970s, with high inflation and stagnant growth, called for desperate measures. The old Treaty of Detroit model was already moribund. Modern economics had resurrected Adam Smith and the promise that an unregulated, laissez-faire market could do a far better job than any command-and-control government intervention.

  And we cannot forget the role of the Cold War. Neoliberalism equated markets with freedom and big government with Soviet-style planned economies. Unfettered capitalism, the Reaganites believed, could pull the United States out of the economic doldrums of the late 1970s and provide the fuel needed to win a decisive economic, political, and ideological victory against the Soviets.

  But in retrospect, we know that neoliberalism had a number of failings, including an almost religious faith in markets even in the face of empirical evidence of their failings and flaws. Unquestioning acceptance of market efficiency often led proponents to ignore fundamental problems with the neoliberal approach: that profit-seeking actors would try to use fraud, force, and deception to cheat people for their own gain; that markets can lead to the concentration of economic power and, as a result, undermine the very functioning of competitive markets; and that profit-seeking actors would seek to use government to benefit themselves, at the expense of taxpayers, consumers, and a competitive market. All of these neoliberal problems came home to roost at one time or another, and sometimes all at the same time.19

  Consider the neoliberal turn in higher education: for-profit colleges. Instead of the government funding public universities and community colleges directly and allowing people to attend private nonprofit colleges if they desired, the neoliberal approach urged two things. First, private, profit-seeking actors should run institutions of higher education; second, the government should give students a voucher to go to whatever school they wanted. The vouchers in this case are Pell Grants, federal grants given to students directly for use at any college or university, and federal student loans, some of which the government subsidizes by not requiring students to pay interest while in school. On the neoliberal model, students can use Pell Grants and federal student loans at any school in the market—public, nonprofit, or for-profit. The hope was that the market would create new and better options for students.

  So how did the experiment turn out? The neoliberal higher education model turned into a way for shareholders and CEOs to make boatloads of money off taxpayers by deceiving prospective students, saddling them with unconscionable levels of debt, and leaving them without a decent education or any reasonable job prospects.

  According to a report from the U.S. Senate Committee on Health, Education, Labor, and Pensions, in 2009, 86 percent of the revenues for the fifteen publicly traded for-profit colleges came from taxpayer dollars.20 Think about what that means. For every $100 of revenue that for-profits earned, $86 came from taxpayers and only $14 came from other sources. And the absolute dollars at stake are large. In the 2009–2010 school year alone, the U.S. government invested $32 billion in for-profits, or a whopping 25 percent of the Department of Education’s budget for student aid programs. Pell Grants alone amounted to $7.5 billion (up from only $1.1 billion in 2000–2001).21

  Neoliberalism’s faults are sometimes more subtle. One example is basic banking services. Most of the time we don’t notice that dollar bills are marked “legal tender.” What that means is that the government has mandated that bills can be used to pay for goods and services. For a variety of reasons, including security and technology, most employers don’t pay their employees in legal tender—in cash. Instead, they pay either with a check or with direct deposit. For many people, that isn’t a big deal; it’s convenient. You just go to the bank and deposit the check, or if you have direct deposit, the money automatically appears in your account. But 7 percent of American households—15.6 million adults and 7.6 million children—don’t have a bank account.22 As a result, just to get access to the money that they are owed for their work, they have to use check-cashing establishments that charge exorbitant fees.

  Neoliberalism’s faith in markets also too often ignores the consequences of concentrated economic power. In this case, a lesson from history went ignored. In the late nineteenth and early twentieth centuries, industrialization led to massive consolidation of industry. During the “great merger movement” of 1895 to 1904, more than 1,800 firms disappeared. By 1904, 40 percent of American industry was under the control of only 300 corporations.23 Concentrated economic power had a variety of detrimental consequences. It meant that employers had more power over their workers, it meant that monopolies could raise prices on customers, and it meant that corporations and their wealthy leadership had more resources to influence public policy. In response, Progressive Era reformers passed antitrust laws and public utilities regulations in order to break up or regulate concentrated economic power. They did so not just for economic reasons of efficiency but also for constitutional reasons. They understood that America could not remain a republic—it could not have political freedom—if power was concentrated in a small number of corporations that would try to rig the political system to serve their interests.24

  Three generations later, during the ascendancy of neoliberalism, Robert Bork, then a professor and later a judge, penned a famous tract, The Antitrust Paradox, arguing that antitrust policy should not take into account political factors, the market ecosystem, or anything except economic efficiency. As Bork’s ideas took hold, antitrust prosecutions languished. Today we are in the midst of what the Economist calls the “great merger wave.”25 Looking at 900 sectors of the economy, the magazine found that two-thirds had become more concentrated between 1997 and 2012.26 Concentrated power once again means less competition, higher prices, and the increasing political power of big monopolies that control virtually every aspect of Americans’ lives.

  Of course, the 2008 economic crash was one of the consequences of the long neoliberal moment. Deregulation meant deceptive and fraudulent practices in a variety of sectors, including the housing and financial markets. The neoliberal preference to facilitate social goals through government-supported vouchers, subsidies, and incentives only deepened this pathology. Federally supported housing finance allowed financial institutions to capture the benefits of their bets without bearing the risks when those bets went bad. And consolidation and concentration in the financial sector produced Wall Street banks that were “too big to fail,” leading to government bailouts.

  Despite its many problems, fondness for (if not necessarily robust faith in) neoliberalism remains strong. Many people are nostalgic for this easy-to-understand approach to public policy. But the financial debacle of 2008–2009 and its continuing aftershocks should leave Americans in no doubt. Nostalgia for neoliberalism is perilous. We need better options.

  Beyond Privatization: The Possibilities of Public Options

  The public option challenges the privatization agenda by turning it on its head. Privatizers begin with the assumptions, often undefended, that government programs are always and everywhere ineffective and corrupt and that private markets work perfectly. It’s no surprise, then, that they propose to turn over public functions—ranging from prisons to roads to welfare programs—to private firms that are motivated by profits.

  The case for public options, by contrast, doesn’t rest on black-and-white assumptions about either government or the private sector. We don’t assume that government always works well, or that government should muscle out private provision. Instead, we think that public options can add social value when the power of government is needed to guarantee universal access to the basics of modern life. So public options aren’t socialist-style big government—they don’t aim to replace markets. Instead, public options exist alongside market options. Citizens can rely on the public option but also can turn to the marketplace for additional choices, combining public and private options in ways that work best for them.

  2

  Why Public Options?

  The social contract evolves as society changes. In its time, the Treaty of Detroit supported an idea of the social contract that had wide appeal: a lifetime job at a wage that enabled workers to support their families, buy necessities, and live the middle-class dream of a house, a car, an annual vacation, and a secure retirement. So the values behind that social contract were something like, “If you are willing to work hard, you can have a good life.” In that era, with or without a high school diploma (let alone a college degree), Americans could expect to participate in the good life as long as they were willing to work (or to marry a worker, in the case of many women).

  The Treaty of Detroit was a market system, since workers paid for all of this out of their wages. But it wasn’t laissez-faire by any stretch of the imagination: government rules supported labor unions and subsidized employer-provided benefits. This system, as Chapter 1 described, fell apart as economic and social conditions changed. Without economic growth and a robust demand for unskilled labor, the Treaty could no longer guarantee profits to firms along with plentiful jobs for workers of all backgrounds.

  The neoliberal market subsidies that gained prominence between the 1980s and the early 2000s, by contrast, expressed a very different social contract. Its values were something like, “Everyone has to look out for themselves in the marketplace, but the government will provide a few extra dollars to buy important goods for people who have extremely bad luck in the market.” That market ideal answered the access and accountability questions in a characteristic way: citizens should buy whatever education they needed to succeed in the marketplace. As savvy consumers, they should vet providers (like colleges and employers) for themselves. Market subsidies would help poorer students take out loans, but students had to figure out for themselves what jobs would pay the most and how much student debt was too much. So those who picked the wrong majors or were bilked by for-profit schools—well, they had only themselves to blame.