Between 2000 and 2020, America drifted left on some cultural issues but continued to drift right economically: unions weakened, workers labored longer hours for less overtime pay, Wall Street banks grew larger relative to the rest of the economy, and a new precariat of “plug-and-play” independent contractors, 55 million strong by 2020, lacked benefits in the gig economy. With millions struggling to make ends meet in retail, service, and fulfillment-center jobs while manufacturing jobs were lost to outsourcing and automation, class lines hardened to the point that America had less upward mobility than European countries. By 2020, nearly all American families receiving SNAP food benefits (formerly known as food stamps) were working rather than unemployed, with companies shifting their workers’ cost of living, including healthcare, onto other taxpayers. Multinational firms transcended borders after the Cold War, cashing in on capitalism’s global victory and the Internet, and U.S. technology produced the five biggest public companies in history: Apple, Microsoft, Amazon, Alphabet (Google), and Meta (Facebook/Instagram), which used an ocean of smaller startups as free R&D departments. Corporations also won the legal right in Citizens United v. FEC (2010) and American Tradition Partnership, Inc. v. Bullock (2012) to unlimited “dark money” Super PAC campaign donations with which to buy politicians, judges, and state attorneys general because, as “citizens,” corporations were exercising their “free speech.” The Reagan Revolution kept liberals on the defensive as taxes and faith in government stayed low, with only a tenth as much spent on infrastructure as eighty years earlier (0.3% of GDP vs. 3% at the peak of the New Deal, before the 2021 IIJA raised it back to 1.25%). Ronald Reagan emboldened conservatives in the same way that FDR’s New Deal emboldened liberals a half-century earlier.
And, just as FDR would’ve found some of LBJ’s Great Society too liberal (and LBJ found today’s progressive Democrats too liberal), so too Reagan wouldn’t be conservative enough today to run as a Republican.
By the mid-2010s, global politics was shifting away from the traditional left-right economic spectrum toward a dichotomy between those who embraced globalization, immigration, and the postwar Western alliance (NATO, EU, etc.) and nationalists led by Donald Trump in the U.S. and Eurosceptics like Viktor Orbán in Hungary, Theresa May in Britain, Geert Wilders in the Netherlands, Marine Le Pen in France, Lech and Jaroslaw Kaczynski in Poland, Beppe Grillo in Italy, and Jörg Meuthen in Germany — some elected, others not. Jair Bolsonaro was Brazil’s “Tropical Trump” (and tried something similar to January 6th on January 8th, 2023 after losing re-election), and billionaire prime minister Andrej Babiš styled himself the “Czech Donald Trump.” Trump strategist Steve Bannon called Switzerland’s Christoph Blocher, who’d kept Switzerland out of the European Union long before Britain’s Brexit, “Trump before Trump.” United in their opposition to unchecked immigration and free trade, these nationalists transcended the left-right economic spectrum, instead appealing to fears that globalization weakens group identities. Scottish nationalist and First Minister Nicola Sturgeon was a democratic socialist, and Trump oscillated between political parties from 1987 to 2011. Often, their nationalism tended toward religious or ethnic nationalism, including Orbán and Narendra Modi in India, and some borrowed from fascist playbooks by undermining faith in elections and focusing their attacks on media, minorities, immigrants, and truth, with patriotism expressed in xenophobic terms. All promised to restore former greatness, a common campaign pitch throughout history, despite candidates’ vagueness about what exactly they want to restore. The 2007-09 Financial Crisis and Middle Eastern immigration exacerbated the new globalist-nationalist divide, but it also stemmed from traditional middle-class economic concerns.
Fortunes made in the digital economy didn’t make the poor poorer, but neither did the poor benefit from them. In America, gains from increased productivity flowed almost exclusively to the wealthy, frustrating the middle classes and making it hard for the right to argue the merits of trickle-down economics convincingly to a broad GOP coalition, weakening the Republican establishment and opening that party to alternatives like Trump. Despite increased economic productivity, wages remained largely flat between the 1970s and 2010s (relative to inflation) except for the wealthy. An education system that favored the rich was a net impediment to upward mobility, but it still provided the surest path to middle-class prosperity for those who took advantage of public schools. Forty percent of community college students nationwide experienced some form of housing instability in any given semester. These 2012 numbers from Harvard and Duke’s business schools, published in the left-leaning Mother Jones, explain the widening wealth gap:
With the world’s top 1% now worth more than the bottom 99%, it’s likely that liberal parties will pursue increasingly socialist means to redistribute wealth within market economies (e.g., “Jobs-for-All” with a $15 minimum wage, right) while conservative parties steer the conversation away from economics toward culture wars, or at least discourage “class warfare.” FOX personality Sean Hannity was appealing to this working-class demographic when he called them fellow “smelly Walmart shoppers.” In America, right-wing populism offers workers curtailed immigration, continued dependence on fossil fuels, and tariff protectionism. But tariffs have an incidental, mixed impact on workers because they disrupt exports and cause inflation; halting immigration can undercut the economy, as we saw in the 1920s; and ramping up carbon emissions will make the climate crazier, with its biggest victims no doubt being the poor and working classes. Remember that William McKinley, despite some tariffs, brought blue-collar workers and farmers under the Republican tent in 1896 and 1900 through more exporting, not less. Protecting global trade is why both parties feared communist expansion during the Cold War. And when the right-wing populist Tea Party won congressional power after 2012, they were hostile to unions, which isn’t really populist by even the most expansive definitions. Likewise, the GOP in the 2023 House of Representatives floated the idea of converting federal tax revenue from income to sales, which would’ve caused a big un-populist transfer of wealth from the working classes, who spend a disproportionate percentage of their incomes on essentials, to the rich. And the GOP’s opposition to cracking down on wealthy tax cheaters wasn’t populist in substance, even if it was populist in style.
We still have debates about inequality similar to those of forty years ago, even though the ultra-rich have far more money than they did then. For example, during COVID-19, Amazon’s Jeff Bezos netted $35 billion per month; our reaction would’ve been much the same if he’d netted $35 million per month, still a lot but a thousand times less. But the idea that the federal minimum wage has declined relative to inflation is wrong. Like working- and middle-class wages in general post-Reagan, it has stagnated rather than declined relative to inflation. It started low in 1938 in order to win the votes of Southern Democrats who feared over-paying minorities.
In American domestic politics after the Reagan Revolution, the GOP moved to the right and the Democrats followed by moving partway to the right on economics but not culture, and the gap between the two parties grew because of factors we covered in the last half of the previous chapter: media fragmentation, enhanced gerrymandering, and uncompromising, rights-based party strategies, along with the Monica Lewinsky scandal and contested 2000 Election. Other factors magnifying partisanship were the end of the Cold War (removing a unifying cause) and greater (not less) transparency on Capitol Hill, leading to fewer backroom compromises outside the view of partisan lobbyists and voters. Congressional approval ratings dropped to all-time lows amidst normal rates of lobbying and abnormally high partisan dysfunction. Social media poured accelerant on the partisan fire, creating an “outrage-industrial complex” also covered in the previous chapter. The 9/11 attacks in 2001 also sowed fear, mistrust, and division, even if they had a brief unifying effect initially.
All these factors amplified the partisanship that’s been a mainstay of American democracy, creating near-dysfunctional gridlock in Congress worsened by parliamentary filibustering (delay tactics) that required 60% super-majorities to overcome for most Senate bills, though not budget reconciliation bills. A bare majority could override the rule, but each party hung on to the filibuster when in power out of fear that, if the other side won back power in the next election, they’d need it as a check. The logic behind the filibuster was to create opportunities for the minority party to help shape legislation in the spirit of compromise but, since voters wouldn’t tolerate their respective parties compromising, the filibuster just contributed to gridlock. Hyperpartisanship also precluded any chance for amendments, the Constitution’s clever mechanism for correction, evolution, and improvement, since amendments require ¾ of the states to approve. Occasionally, minor bipartisan legislation passed, but these bills generated so little interest that this was nicknamed the “secret congress.” The Problem Solvers Caucus, made up of moderate “big tent” Democratic and Republican members of Congress (MOCs), was unpopular within each party and had no traction on social media. Substantive bipartisan bills passed with the First Step Act (2018), Infrastructure Investment & Jobs Act (IIJA, 2021), and Bipartisan Safer Communities Act (2022), but Republicans who voted for the IIJA and BSCA were subjected to death threats in an attempt to purge them from the party, and liberals didn’t learn much about First Step because their favored media saw no profit in Trump agreeing to some of the prison reform that the left and minorities had been hoping for. That the gun bill included “Bipartisan” in its name was a sign of the times, suggesting that bipartisanship itself had become a novel feature.
A bipartisan bill tackling Big Tech monopolies was plausible, but Senate majority leader Chuck Schumer (D-NY) never brought it to a floor vote (in a potential conflict of interest, his daughter worked for Amazon and Meta). National Review editor Philip Klein argued that any Republicans guilty of bipartisanship “should be primaried and defeated” while those retiring should be “shamed for the rest of their lives.” Yet, simultaneously, fellow National Review writer Kevin Williamson disingenuously lambasted Joe Biden for being unable to swing any GOP votes over to his bills. Because Senator John Cornyn (R-TX) supported the 2022 gun bill, he was booed at the GOP’s state convention.
But 2022 also saw a rare burst of bipartisan legislation: a new Electoral Count Reform Act, clarifying that states must certify electors for the candidates who actually won there, to counter potential future versions of Trump’s 2020 Eastman Plan; the PACT Act, awarding veterans benefits and treatment for toxic exposure (a longtime cause of comedian Jon Stewart); and the CHIPS and Science Act, strengthening domestic semiconductor manufacturing to better compete with China by seeding mini-Silicon Valleys across the country. Democrats also passed the misnamed Inflation Reduction Act (IRA) to encourage transitions to EVs and renewable energy as a stripped-down Green New Deal, allow Medicare to negotiate cheaper drug prices with pharmaceutical companies (which Trump had also favored), and start paying down the debt by closing tax loopholes and boosting the IRS (cracking down on cheaters making over $400k and improving customer service for everyone else). But the IRA was not bipartisan, passing 50-50 plus VP Kamala Harris’ tiebreaker, and its odd name resulted from its status as a budget reconciliation bill passed to get around the filibuster, similar to the GOP and Trump’s 2017 tax cut.
But to get around partisan gridlock prior to 2022, presidents increasingly bypassed Congress with executive orders. Republican state attorneys general sued Democratic presidents, especially Obama, to block national legislation in their respective states (i.e., a modern-day version of nullification theory). Democratic states did likewise to Republican presidents, especially Trump, and employed their own nullification by legalizing marijuana at the state level and creating [immigrant] sanctuary cities, both in defiance of federal law, the former after expiration of the Cole Memorandum and failure of the bipartisan STATES Act. Successive presidents just reversed each other’s executive orders, meaning that policy waffled back and forth within federal agencies like the Environmental Protection Agency (EPA), Department of Energy (DOE), etc. in 4- or 8-year segments.
Some gridlock is a healthy and natural result of the Constitution’s system of checks and balances, though New Yorkers invented the actual term in the early 1970s to describe traffic. And just as not all partisanship is bad, not all bipartisanship is good, as comedian George Carlin observed when he warned that Democrats and Republicans working together “usually means some larger-than-usual deception is being carried out.” But too much gridlock disrupts the compromises that keep the political system functioning. For instance, the bipartisan Simpson-Bowles plan to balance the budget long-term with small compromises on both sides, commissioned by President Barack Obama in 2010, never even made it out of committee and likely wouldn’t have passed if it had. It likely would’ve cost Obama re-election in 2012 since it slightly reduced Social Security and Medicare benefits, and the GOP opposed it because it slightly raised taxes. But it would’ve secured long-term solvency for the federal government. A similar fate befell the 2007 Comprehensive Immigration Reform Act, which would have tightened border security, granted paths to citizenship for 12 million undocumented immigrants, and raised quotas for skilled workers. Agreeing to such a compromise, which might’ve taken immigration off the table as a divisive topic for the near future, would’ve threatened the re-election prospects of senators on both sides of the aisle. Put another way, the lack of bipartisan compromise on issues like these is our own fault, collectively, as voters. The budget and immigration bills failed not because they were too liberal or conservative but because they were too centrist in a divided country of uncompromising liberal and conservative voters. The bipartisan Coons-Flake bill to tax carbon and redistribute fines for exceeding emissions directly to citizens to offset higher energy bills also went nowhere. Substantive action on climate change (e.g., an emissions cap) failed partly because of right-wing opposition and political gridlock and partly because, collectively, we believed that climate change was real and that we should address it, but not in a way that involved short-term sacrifice of even something as trivial as an extra $10 a month on power bills, according to this AP-NORC/EPIC poll (Cato Numbers). We were at least okay with distant goals and deadlines that our children and grandchildren would have to meet. But Congress has devoted funds toward further researching renewables and storage, upgrading America’s aging power grid, and enticing consumers to switch to renewable energy and electric vehicles. Currently, our progress on renewable energy is running ahead of our capacity to add it to the grid, partly because of time-consuming red tape.
Meanwhile, rifts opened in the combustible 1960s evolved and hardened into the “culture wars” of the next half-century. If Americans in 2020 weren’t really more divided than usual, they were at least better sorted by those who stood to gain by magnifying their disagreements. Cable TV, for instance, essentially manufactured an otherwise mostly non-existent “War on Christmas.” And citizens sorted themselves better than ever, often into conservative rural areas and liberal cities, reminiscent of the rural-urban divides of the 1920s. As we saw in the previous chapter’s section on gerrymandering, this geographic segregation resulted in partisan districts of red conservatives and blue liberals, with interspersed purple that defied categorization, though most Americans lived in purple suburbs. The fragmented and partisan media encouraged and profited from animosity between citizens, selling more advertising and “clickbait” than they would have if politicians had cooperated and citizens respectfully disagreed over meaningful issues (around 20% of Americans were online as of 1997, but almost everyone was by 2000, especially on AOL). A 1960 poll showed that fewer than 5% of Republicans or Democrats cared whether their children married someone from the other party; a 2010 Cass Sunstein study found those numbers had reached 49% among Republicans and 33% among Democrats, indicating a cultural gridlock to match the political one. This trend might not continue, as polls showed that Millennial brides and grooms were less rigid ideologically than their parents. In 2014, Pew Research showed that 68% of Republican or Republican-leaning young adults identified their political orientation as liberal or mixed, and similar polls show some young Democrats identifying as conservative (it’s also possible that many young people don’t know what liberal or conservative mean).
This 2021 poll (right) shows more common ground between younger liberals and conservatives than their parents, while the blue poll indicates that the progressive left doesn’t speak for as many Americans as it claims to. But more than ever, politicians struggled to please voters who disliked each other and, like children toward their parents, were both defiant toward and dependent on government. While it’s true that the U.S. narrowly averted economic and political disasters in 2008 and ’21, respectively, voters also thought other things were declining even when they weren’t. Declinism, aside from being a nearly universal cognitive bias, also sells better, resulting in a situation where most people thought the country was going downhill while their local area was improving, just as they thought public schools were “broken” even though their own children’s public schools were good. Americans couldn’t agree on much, but many felt aggrieved and had “had enough” even if they weren’t well-informed enough to know what exactly they’d had enough of, or were too tongue-tied to explain it. And those who did know didn’t agree with each other. Amidst this hullabaloo, Tweeting, and indignation — with nearly every imaginable demographic perceiving itself as under siege — historians hear echoes of earlier periods in American history. Large, diverse, free-speech democracies are noisy and contentious countries to live in, as you’ve already seen from having read Chapters 1-20. In the early 21st century, hyperpartisanship and biased media complicated and clouded debates over globalization/trade, healthcare insurance, and the Financial Crisis/Great Recession that would’ve been complicated enough to begin with. These are the three primary areas we’ll cover below, with some brief economic background to start.
From 2000-2020, the American economy continued on a path toward increased globalization and automation that began long ago, with American labor competing directly against overseas workers and industrial robots. Traditional manufacturing jobs were increasingly outsourced to cheaper labor markets or displaced by automation, compromising middle-class prosperity. This “barbelled economy” had growing lower and upper classes with a shrinking middle. But fears about automation aren’t new. Nineteenth-century English textile workers called Luddites lashed out against automation, vandalizing power looms around the same time that country was debating the pros and cons of what they called the March of Intellect (machines/technology, progress, specialized education, experts, etc.):
Karl Marx likewise feared that steam would spell doom for human workers and John Maynard Keynes feared the same about fuel engines and electricity. A group of scientists lobbied Lyndon Johnson to curtail the development of computers, just as Elon Musk recently suggested hitting the pause button on AI. Automation began with the Industrial Revolution and has been steadily replacing workers ever since, deskilling parts of the workforce while creating other new jobs. Strip (surface) mining, for instance, conducted by excavators and earthmovers, started eroding coal mining jobs decades ago. Most longshoremen have been displaced by the intermodal container system, as we saw in Chapter 15. We even have digital fashion models now. Some economists argue that, despite all the talk of robotics, the actual rate of automation hasn’t increased (see optional Krugman article below). Yet, studies showed that more jobs were lost to automation (~85%) than outsourcing (~15%) in the first decade of the 21st century, even though the U.S. lost over 2 million jobs to Chinese manufacturing.
Information technology assumed a dominant role in most Americans’ jobs between 2000 and 2020. The verdict isn’t in but, proportionally, the digital age hasn’t yet translated into the domestic job growth that accompanied the steam engine, railroad, electricity, or internal combustion engine, and Wall Street’s expansion hasn’t been accompanied by growth in the “real economy.” Apple, Microsoft, Google, and Amazon employed only around a million workers between them as of 2019, even as they impacted other parts of the economy. Online retail increasingly replaced brick-and-mortar as malls closed and For Lease signs popped up in strip malls. Automation and digitization made businesses more efficient than ever, and American manufacturing was stronger than naysayers realized, but it provided fewer unskilled jobs.
Efficiency is a two-edged sword: sometimes technology destroys jobs faster than it creates others. If automated trucks displace our country’s drivers over the next 10-20 years, it’s unlikely we’ll find another 1.7 million jobs for them overnight. One line of thinking is that, if automation continues unabated, we’ll have to transition to universal basic incomes (UBI) of the sort envisioned by Andrew Yang, Musk, and Mark Zuckerberg, and briefly implemented by Stockton, California mayor Michael Tubbs. But remember that almost no jobs exist today that were around a century ago, and vice-versa. Creative destruction always cuts in both directions. The loss of coal mining jobs has been more than offset by the increase in renewable energy and natural gas jobs. If we weren’t creating new jobs at roughly the same rate we’re losing them, then unemployment rates would be higher. But also realize that it’s a tough labor market for people without at least some training past high school in college, a trade school, or the military, and automation has been displacing white-collar and blue-collar workers alike. In fact, so far, middle-class jobs have been hit the hardest by automation, with bots/robots still unable to perform many low-paying jobs like domestic work. Yet many jobs remain unfilled, and high schools focusing on Career & Technical Education (CTE) are gaining traction to fill gaps. Apprenticeships are also making a comeback, sometimes re-branded as job shadowing. They’ve existed all along in many trades (carpentry, plumbing, electrical, bricklaying, sheet metal, etc.) and overlap with office internships, but are currently being pioneered as wholesale career paths for white-collar work. From employers’ standpoint, or that of the free market, affordable robots are more efficient than humans: they never complain, show up late, get sick, join unions, file discrimination suits, or demand pensions or health insurance.
So far, these fears of being taken over haven’t been realized, but automation has gained momentum since Marx and Keynes and is well on its way to posing a significant economic challenge. It’s not always for the better: automated customer-service virtual assistants are inferior to humans, just cheaper. Still, for those with training, America’s job market remains healthy, with unemployment under 5% as of 2019 and ~3.5% in 2022. It’s difficult to tell how COVID-19 has impacted employment, as the U.S. is currently experiencing big labor shortages in many sectors even with unemployment above zero. We could use some self-driving trucks right about now or, better yet, just improved pay and working conditions for human truckers. Most likely, the dynamic American economy will adjust as it always has. Humans are unlikely to go the way of the horse, partly because societies have more power over robots than horses had over engines. Many workers are adapting well to working with cobots or finding jobs designing and maintaining automated work stations. But the verdict isn’t in yet on whether the scientists who warned LBJ about computers were right. Hopefully, physicist Stephen Hawking, sci-fi writers, and singularity theorists are wrong about artificial intelligence (AI) taking control of humanity. Or, maybe, if it does, the robots will take pity on us. Those interested in a lighter take on these foreboding themes should view the visionary WALL-E (2008), one of the best movies of the young century.
Globalization: Pro & Con
Like automation, globalization didn’t start in the 20th century. Global maritime trade dates to the 15th century, with overland routes like the Eurasian Silk Road dating to the 2nd century B.C. Trade has been a controversial and important part of history since the Bronze Age. Greeks, Romans, Persians, and Mongols traded beyond their original spheres and, as you can see from this map, Scandinavian Vikings weren’t shy about embracing globalization:
Even in ancient and medieval times, trade routes spread diseases like the Plague more readily, just as today’s hyper-connectivity hastened COVID-19. Yet trade also spread ideas: it’s unlikely that early Christianity would’ve taken root without the Roman Empire’s vast road network, and languages and diets expanded because of trade. Some prognosticators predict that the post-pandemic world will rely more on local networks, and Russia’s 2022 invasion of Ukraine reminded Europeans, especially, of the dangers of relying on others for energy. People have always wanted the advantages of free trade without the disadvantages, leaving politicians to sort it out. British trade controversies predate Brexit, tracing to late medieval arguments over imported wool and violence toward foreign traders and, more recently, to the formation of the European Union in 1992 to expedite free trade and movement of workers. Britain withdrew from the EU with Brexit, approved by referendum in 2016 and finalized in 2020.
Stateside, free-trading colonial smugglers who spearheaded the American Revolution resented Britain’s restrictive, mercantilist, zero-sum trade policies; then protectionist Treasury Secretary Alexander Hamilton aimed to incubate America’s early industrial revolution with tariffs (duties) that restricted free trade; then trade disputes and embargoes drove the U.S. into the War of 1812. Both Hamilton and Abraham Lincoln saw protectionism as the key to economic strength. Tariffs were a divisive enough issue between North and South in the 19th century to be a meaningful, if secondary, cause of the Civil War behind slavery vs. wage labor (see Morrill Tariff of 1861). Southerners wanted freer trade to send cotton to mills in Britain and France, whereas the North wanted to protect American industry from foreign competition. Since the North won the war, the U.S. employed protectionism as the Industrial Revolution kicked into gear after the Civil War (Chapter 1), with the GOP (e.g., McKinley) overriding free-trade Bourbon Democrats by implementing tariffs while also promoting export trade as best they could. Republican presidents in the 1920s (Harding, Coolidge, Hoover) kept tariffs high. But analysts interpreted tariffs as worsening the Great Depression (Chapter 8), and they were disparaged in British history because the Corn Laws (1815-1846) artificially raised food prices even as the poor went hungry, enriching landowners at the expense of the working classes and manufacturers. This cartoon was from the 19th-century British Liberal Party:
Tariffs fell out of favor after World War II as the U.S. and Britain reshaped the new global economy along the lines of free trade, which they saw as the most effective way to combat communism. The more interwoven the capitalist countries were, the stronger their combined, integrated economy would be versus communist alternatives. You’ve seen in the first twenty chapters that issues cross back and forth historically between the political parties. Trade policy cuts across partisan lines over time but also at any one point in time, including now. There is rarely consensus within parties, in other words. Both parties want the advantages of trade without the disadvantages, which is impossible, so their candidates employ simple ideas and phrases in hopes that voters won’t grasp the whole picture.
When the U.S. and Britain set out to remake the world economy in their image after World War II and avoid more depressions (Chapter 13), they encouraged as much free trade as possible between America and Europe. In reality, though, there were virtually no countries that favored pure, unadulterated free trade. In America, most major economic sectors had lobbies contributing to politicians — some free trade, some protectionist, and some both. Most democratic countries, including those that signed the General Agreement on Tariffs and Trade (GATT) in 1947, had voting workers back home demanding favoritism for their profession. France, for instance, qualified its GATT inclusion with “cultural exceptions” enabling its cinema to compete with Hollywood imports, and it maintained high agricultural tariffs. Translation: the upside of trade was great, but other countries couldn’t undermine farm-to-market La France profonde, or “deep France,” with cheap wine, bread, and cheese. In sum, the free-trade movement has lowered but not eliminated tariffs: the U.S. averages ~2.5%, Japan 2%, and China and European countries ~3%. These are averages, though, and vary widely among products.
By the early 1990s, a near consensus of economists favored free trade but globalization threatened some American workers while strengthening others. Democrats had supported unions since the New Deal of the 1930s and they generally supported a Buy American protectionist platform to help workers, including taxes or duties (tariffs) on imports. They were the American version of the French farmers, in other words. But globally, economic inequality that had risen steadily since the dawn of the industrial revolution reached its peak in the 1970s, precisely when inequality within the United States was at its lowest. With freer trade policies, inequality has since grown in the U.S. while trending down worldwide. With more automation and competition from outsourcing via free trade, unions have steadily declined in strength (right) even as their popularity in a 2021 Gallup Poll rose to a 60-year high (The Hill). Big companies have to honor the legal right of employees to unionize, but they can also legally hire union-busting consulting firms to make propaganda films and posters explaining to their workers that unions will lower their wages (because of dues) and perhaps even cause them to lose their homes or marriages. Managers are hired to be on the alert if they hear workers utter phrases like “living wage” so as to snuff out the fire before it starts, and workers suspected of union-organizing can be reassigned to unpleasant tasks, all within the law. Amazon mandated anti-union propaganda meetings for its employees. The combination of propaganda and intimidation has helped weaken labor in most sectors, but automation and outsourcing are the underlying culprits. So far, independent contractors haven’t been able to unionize except for a few stirrings in blue states like California and New York. Fulfillment centers also have a mixed but moderate record. 
Labor history teaches us that gig economy unionizing would be most effective if combined across sectors (ride-hailing, food delivery, home repair, etc.). The more workers consolidate, the more they could cash in on their huge and growing numbers, and they could communicate with each other more easily than ever. Most likely, the app companies’ resistance, red-state push-back, and the contractors’ own proclivity for independence will preclude unionization any time soon. As for the Democratic Party, they’ve long since lost their monopoly on “lunch pail” (blue-collar) workers for a variety of reasons we’ve unpacked elsewhere, but partly because they no longer use union support to galvanize their base. In the 2016 presidential campaign, Senator Chuck Schumer (D-NY) advised that the Democrats not squander a lot of time and money campaigning among them, reasoning that for every blue-collar voter they lose, they win over two moderate Republican suburban women.
That year, Trump famously outpolled Hillary Clinton in crucial rust-belt swing states in the upper Midwest because he appealed to blue-collar workers directly, even if his trade and immigration arguments were over-simplified and he provided no substantive gain for them during his presidency (steelworkers were one exception, aided by tariffs; coal miners just got fewer safety regulations, benefitting their employers). Buy American sounds good, but protectionism has been an awkward topic for mainstream Republicans because they fancied themselves the more patriotic of the two parties but had mostly supported free trade since WWII to boost corporate profits and/or out of a genuine belief that, overall, it helps workers. George W. Bush’s VP Dick Cheney said, accurately enough, that globalization has “visible victims and invisible beneficiaries.”
As we saw in the previous chapter, Democrat Bill Clinton’s embrace of free trade created a window of opportunity for Independent Ross Perot to garner significant third-party support in 1992, and Hillary Clinton’s ongoing support of globalization along with mainstream Republicans partially explains Donald Trump’s appeal in 2016. In 1992-93, Bill Clinton wanted open trade borders with the United States’ neighbors to the north and south, Canada and Mexico, and, in 2000, he normalized trade relations with China. With the two major candidates, Bill Clinton (D) and George H.W. Bush (R), both supporting free trade in 1992, the door was left open for a third-party candidate to focus on the outsourcing of labor. In his high-pitched Texan accent, Ross Perot quipped, “Do you hear that giant sucking sound? That’s your jobs leaving for Mexico.” He focused on Mexico because the issue was whether the U.S., Canada, and Mexico would trade freely via NAFTA, the trilateral North American Free Trade Agreement (logo, left). Ronald Reagan promoted the idea in the 1980s, George H.W. Bush had agreed to it, and Bill Clinton promised to push it through the Senate and sign it. Perot was wrong about huge numbers of jobs leaving for Mexico, but a lot of American manufacturing and customer-service jobs left for China, India, Vietnam, and other places where companies could pay low wages and pollute the environment without concern for American regulations. There was a giant sucking sound, all right; it just went mostly toward Asia instead of Mexico. Prior to COVID, an astounding 90% of retail goods worldwide crossed the oceans. Meanwhile, workers came north from Mexico and Central America for jobs that existing Americans either weren’t interested in or demanded higher pay for.
Historian Thomas Frank cites Bill Clinton’s pro-NAFTA speeches — in which he described how the government would fund re-training for displaced workers but conceded honestly that some would lose their jobs — as weakening whatever residual grip Democrats retained on blue-collar Southerners.
The pro-globalization argument is that free trade and outsourcing improve profit margins for American companies, boost American exports, and lower prices for consumers while providing higher wages and economic growth in developing countries. For free traders, tariffs are just hidden taxes that attempt to arbitrarily pick winners and losers in the economy and create general inefficiency; just let the free market dictate where goods, services, and jobs flow on their own. The godfather of free trade, and critic of mercantilism, was Enlightenment-era Scottish economist Adam Smith. NOTE: for pure free traders, all political borders are an impediment to economic growth and efficiency. Globalizing service jobs like lawyers and architects, as opposed to just goods, would boost the economy even more. Free trade also offers consumers a wider range of products, ranging from Mercedes-Benz and Samsung electronics to Harry Potter novels. Globalization is how the San Antonio Spurs won NBA championships with players from Argentina, France, Italy, and the Virgin Islands and why retail and food chains are nearly indistinguishable across Europe and North America (Scandinavians are nuts for 7-11 hot dogs). You could drive a German BMW built in South Carolina or an American Ford or John Deere tractor built in Brazil. South Korean LG builds solar panels in Alabama. Japanese Yamaha builds Jet Skis and ATVs in Georgia, the same state where Chinese Sany builds excavators. Like to “stop and smell the roses”? They’re likely from Ecuador or Colombia. The smartphone in your pocket might come from South Korea or California but contain silicon in its central processor from China, cobalt in its rechargeable battery from the Congo, tantalum in its capacitors from Australia, copper in its circuitry from Chile or Mongolia, plastic in its frame from Saudi Arabian or North Dakotan petroleum, and software in its operating system from India or America.
Another pro-globalization argument is that it creates more jobs than it destroys, as foreign companies that otherwise wouldn’t operate in the U.S. open plants and hire American workers. Honda, from Japan, builds almost all the cars and trucks it sells to Americans in the United States. In 2014, Silicon Valley-based Apple started making Mac Pros at Singapore-based Flextronics in northwest Austin, creating 1.5k jobs. In 2021, South Korean Samsung announced it would build a semiconductor plant in Taylor, Texas that will employ 2k — many, no doubt, from Austin Community College.
Opponents of globalization point out that American manufacturers are undersold, costing jobs and lowering wages at home as companies exploit and underpay foreign workers. Barack Obama’s 2009 stimulus package included a Buy American provision for that reason. Are such provisions beneficial to the American economy? For at least some workers, yes. When jobs go overseas, they lose theirs and labor unions lose their hard-earned bargaining power. But tariffs also keep prices artificially high on parts and products, costing other workers and consumers. For instance, steel tariffs help American steel manufacturers but cost American builders and consumers buying steel, who would otherwise buy it cheaper on an open international market. Free trade lowers prices. The U.S. could put a tariff on clothing and save 135k textile jobs. But it would also raise the price of clothing, a key staple item, for 45 million Americans under the poverty line. Globalization, in short, is why your smartphone didn’t cost $2k but also why you can no longer make good union wages at the local factory with only a GED or high school diploma. New workers at General Electric’s plant in Louisville earn only half of what their predecessors did in the 1980s adjusted for inflation. Globalization levels the labor playing field.
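The clothing-tariff trade-off above lends itself to a quick back-of-the-envelope calculation. Below is a minimal sketch in Python; the 135k jobs figure comes from the text, but the household spending, price increase, and household count are invented placeholders for illustration only:

```python
# Back-of-the-envelope: consumer cost per textile job "saved" by a tariff.
# The 135k jobs figure is from the text; all other numbers are hypothetical.

def cost_per_job_saved(spend_per_household, price_increase_pct,
                       households, jobs_saved):
    """Total extra annual consumer spending divided by jobs protected."""
    extra_cost = spend_per_household * price_increase_pct * households
    return extra_cost / jobs_saved

# Assume $1,800/yr clothing spend per household, a 10% tariff-driven price
# rise, and 120 million affected households (all assumed values).
per_job = cost_per_job_saved(1_800, 0.10, 120_000_000, 135_000)
print(f"${per_job:,.0f} per job saved per year")  # $160,000 per job saved per year
```

The point of the exercise: the cost is spread thinly across millions of consumers while the benefit concentrates on a small group — Cheney’s “visible victims and invisible beneficiaries” in reverse.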
We take for granted the lower prices that importing and outsourcing make possible, and might not notice the increased productivity that imported devices allow for on the job, but we do notice that Americans aren’t employed assembling electronics. In Walmart’s case, we might notice a “Made In China” tag on items but not realize that their lower prices curbed retail inflation in the U.S. for 35 years. Walmart also saved money by selling bulk items, using bar-codes, and coordinating logistics with suppliers — all now customary in retail. Free trade and outsourcing also help stock returns, because large American corporations not only can make things cheaper, they also do half of their business overseas. The stock market helps not only the rich but also workers with company pensions and 401(k)s that rely on growing a nest egg for retirement.
Most importantly, the U.S. exports too, and when it puts up protective tariffs other countries retaliate by taxing American goods. That happened famously when the Smoot-Hawley Tariff of 1930 worsened the Depression, stifling world trade. The shipping company pictured below, UPS, is based in Atlanta, and it boosts America’s economy to have them doing business overseas. No globalization; no UPS gondolas in Venice piloted by a gondolier checking his cheap smartphone. Globalization, then, is a complex issue with many pros and cons, some more visible than others. For a bare-bones look at the downside of globalization, view the documentary Detropia (2012), which traces the decline of unionized labor and manufacturing in one Rust Belt city, or just look at the dilapidated shell of any abandoned mill across America. There aren’t any comparable documentaries concerning the upside of globalization since that’s harder to nail down. When it comes to what psychologist Daniel Kahneman called fast and slow thinking (System 1 and System 2), we can see the downside of globalization in five seconds but might need five minutes to really think through the upside. The British magazine The Economist, which has promoted free trade for over 175 years, explained the ongoing appeal of protective tariffs among voters more eloquently than Dick Cheney, if less succinctly: “The concentrated displeasure of producers exposed to foreign competition is more powerful than the diffuse gratitude of the mass of consumers, and so tariffs get reimposed.” Likewise, immigration, which overlaps with the issue of globalization, is favored by most economists as good for the overall economy, but it victimizes certain demographics/sectors who lose jobs to immigrants willing to work for less. It’s always been so in American history, as you may remember from reading about earlier immigration controversies in Chapters 2 and 7.
The 1992 campaign drew attention to globalization, as did the protests and riots at the 1999 World Trade Organization conference in Seattle. The WTO is the successor to GATT, part of the economic framework the West created after World War II, along with the World Bank and International Monetary Fund, to stimulate global capitalism. The rioters were protesting against the WTO’s free trade policy and the tendency of rich countries to lend money to emerging markets with strings attached, often mandating weak environmental regulations and outlawing unions. Working conditions often seem harsh and exploitative from a western perspective even if the job represents a relatively good opportunity from the employee’s perspective. Apple’s contract manufacturers in China lay off or add 100k workers at a time — mobilization on a scale the U.S. hasn’t seen since World War II and wouldn’t tolerate. At the WTO riots, protesters threw bricks through the windows of chains like Starbucks that they saw as symbolizing globalization.
In the 2010s, the outsourcing trend reversed somewhat, as more manufacturing jobs returned to the U.S. Some companies, like General Electric, realized that they could monitor and improve assembly-line efficiency better close to home, while other factors included the rising costs of shipping and rising wages in countries like China and India, which started to approach those of non-unionized American labor in right-to-work states. Yet insourcing also included foreign workers. Under H-1B non-immigrant visas, companies could hire temporary immigrants to do jobs for which there were no “qualified Americans.” Due to lax oversight, some companies started to define qualified as will do the same job for less money. In 2016, Walt Disney (250 in Data Systems) and Southern California Edison (400 in IT) fired hundreds of American employees and even required them to train their replacements from India before picking up their final paycheck. Corporate America is currently lobbying Congress to loosen H-1B restrictions, while Trump vowed to get rid of the H-1B visa program in his 2016 campaign.
Globalization continues to be a controversial topic in American politics, but tariffs/protectionism versus free trade isn’t the only issue. There’s also the question of how fair trade agreements are once countries agree to trade. Keep in mind that pure, free trade rarely exists except in economics classrooms. And trade agreements aren’t one-page contracts that declare: “No rules whatsoever. It’s a free-for-all” in large font above the picture of a handshake emblazoned over a Maersk container ship or a picture of John Hancock smuggling bootleg rum into 18th-century Boston. They’re more like legal documents hundreds of pages long that make it difficult for the average citizen to parse out what they really include, and they almost always include tariffs on some things and not on others. Agreements are most controversial today in Sino-American trade (the Latin-derived Sino means Chinese), as we’ll see in the remainder of this section. Chimerica, as it’s sometimes called, is the biggest partnership in the global economy.
The 2016 election saw two populist candidates, Trump (R) and Bernie Sanders (D), opposed to free trade, and they even pressured Hillary Clinton (D) into taking an ambiguous stand against President Obama’s Trans-Pacific Partnership (TPP), which loosened trade restrictions and lowered tariffs among twelve Pacific Rim countries (including NAFTA members the U.S., Canada, and Mexico, but not China) that together constituted ~40% of the world economy. George W. Bush also supported the TPP. If not opposed to trade outright, Sanders and Trump at least wanted to rework the terms of the agreement, though it’s difficult in multilateral (multi-country) agreements to have each country go back to the drawing board because then the others might want to renegotiate and the whole process gets drawn out or falls apart. Part of the reason for the TPP’s unpopularity among Americans was that the negotiations weren’t transparent, so it seemed that something was being done behind their backs without their input, even though that something might’ve been good.
Trump and his advisor Steve Bannon took advantage of TPP skepticism, blasting through the door cracked open by Perot a quarter-century earlier, winning big in rural areas and the Rust Belt hit hard by globalization. Politicians, by and large, are aware that globalization is probably a net gain for Americans, but there are economic pockets that lose and are motivated to vote against free trade. It’s tempting to tap into those voters and they deserve to be heard as much as anyone else. Steelworkers can argue that if farmers get subsidies and bankers get bailouts, why shouldn’t they get to “level the playing field” with protective tariffs? You can sympathize with their need for a steady paycheck if not their use of the term level since tariffs artificially unlevel the playing field from a global perspective. Trump tapped the energy of voters disadvantaged by free trade while sagely keeping his bark bigger than his bite. Once in office, he tinkered with existing trade deals (other than TPP) without dismantling them, which would’ve cratered the U.S. economy.
But a key point of TPP was to check China’s growing hegemony in Asia by striking agreements with its neighbors but not with China itself, pressuring Beijing by surrounding it with American trading partners and potential allies. With the TPP, North America was signaling China that it could take its business elsewhere in Asia. The TPP’s proponents still intended to do a lot of business with China, just on better terms. The TPP provided the option of converting to A-B-C (anyone but China) supply chains if necessary for military purposes. When Trump pulled out of the TPP, he lost leverage vis-à-vis China, weakening America’s negotiating position. Moreover, according to the American Farm Bureau Federation, leaving TPP cost farmers ~$5-8 billion in annual exports.
China joined the WTO in 2001 but refused to play by the rules, choosing clear winners and losers in its planned economy with subsidies and devaluing its yuan currency to boost exports and discourage imports. China subsidizes state-owned companies (aka SOEs or national champions), making it difficult for truly private companies to compete. Big technology firms like Alibaba and Tencent are financially independent, but China’s government regulates Web traffic and uses its Sesame Credit software to encourage loyalty among the population. When the dust settled after a half-century of Cold War, America ironically found itself dependent on a hybrid communist economy for cheap labor and supply chains. China, in a way equally unforeseen by Mao Zedong, morphed into a profit-driven economy with big state-owned firms, similar to the way the U.S. government sanctioned a regulated monopoly in the biggest phone company (Bell) between World War I and 1982. China didn’t embrace its own version of capitalism to join the West, but rather to become more independent.
China’s relationship with the U.S. wasn’t horrible but degenerated in the first decades of the 21st century with China’s increased military strength (especially naval) and its rampant cyber-theft of American technology, including military technology. The 2007-09 financial meltdown leveled the playing field, narrowing the gap as measured by America’s GDP advantage; China is now ~ 60% as big economically. The International Monetary Fund chart on the right shows how countries ranked by annual GDP in 2018, including how California and Texas would’ve ranked had they been countries (the x-axis measures U.S. dollars in trillions).
American companies also complained to the WTO or the Obama administration about having to sign technology-sharing agreements to do business in China (violating WTO rule #7), but only on the condition that the administration or WTO not complain too loudly or tell China in much detail who had complained. Their top priority, in other words, was to continue doing business in China — by then their biggest or at least fastest-growing market — even if it meant being blackmailed into sharing technology. There were also some products, not all, on which Chinese tariffs against incoming American imports were higher than the tariffs the U.S. charged on Chinese goods. The same phenomenon continued during Trump’s administration. During his trade stand-off, and while China’s government was repressing Muslim Uyghurs in its Xinjiang region, crushing democracy in Hong Kong, enforcing digital censorship, and dabbling in cyber-theft, American companies like Starbucks and Apple were describing China to their shareholders as “remarkable” and “phenomenal.” Pro basketball was silent on Xinjiang and Hong Kong as well, even as NBA players wore jerseys emblazoned with equality and justice. When Houston Rockets GM Daryl Morey tweeted support for protestors in Hong Kong, LeBron James told him to shut his trap, and Morey came under pressure that preceded his departure from Houston. Social justice doesn’t extend into potentially lucrative overseas markets. American companies’ willingness to compromise weakened American politicians’ leverage.
Obama inked an agreement with Xi Jinping in 2015 for both countries to curtail cyber-theft. The Trump administration’s National Counter-Intelligence and Security Center conceded that this agreement had been mostly effective, at least on non-military items. Also, the U.S. Chamber of Commerce ranks China fairly low among countries that violate trade terms. From 2015-2020, China cooperated fairly well with the WTO, but the Trump administration undercut the agency by blocking the appointment of the judges that rule on disputes. We should remind ourselves that other countries, including the U.S., have passed tariffs, favored certain industries with subsidies, and stolen each other’s technology plenty over the years. Early Americans stole mechanical engineering and steam technology from Britain and prided themselves on industrial/military espionage during the early Industrial Revolution and the Cold War. Still, there was widespread, justified bipartisan concern after China stole plans for Lockheed Martin’s F-35 Lightning II, a stealth fighter that cost the U.S. years of research and billions of dollars.
Consistent with his campaign promises, Trump started a tit-for-tat tariff war with China, hoping that America could absorb the short-term damage long enough to outlast China and force it to back down. He slapped a 10-25% import duty on nearly half of the Chinese products shipped to the U.S. (including dishwashers and solar panels), and China retaliated with tariffs on American agricultural exports. China also cut off imports of recycling, for all but cleaned plastics, in 2017. The ensuing trade war was the biggest in history.
There’s no good reason for China and the U.S. to start a military war — and that would destroy each country’s economy anyway — but many Americans aren’t comfortable with potentially being surpassed as the world’s #1 economic and military power, even though China’s population is 4x larger than America’s. Some analysts see a new Sino-American cold war brewing and are hoping to avoid a Thucydides Trap scenario whereby the two countries stumble into an unnecessary war just because China is growing at a faster rate than the U.S., as was ancient Athens in relation to Sparta, described by Thucydides in History of the Peloponnesian War. We learned at the end of the Vietnam War chapter that the goal of Obama’s proposed Asia Pivot was to gradually shift more American troops and bases from the Middle East to Asia to protect Japan, South Korea, Taiwan, the Philippines, and Vietnam from potential Chinese aggression. This is one area where Trump supported multilateralism, keeping the U.S. engaged with the Quadrilateral Security Dialogue (aka the Quad, QSD or Asian NATO), through which the U.S., Australia, India, and Japan check Chinese expansion in the South China Sea, where China builds artificial islands as military bases. There was a seamless transition from Trump to Joe Biden on support for the Quad but, realistically, the U.S. supplies most of the military power in this alliance. In the short-term, it’s less important than trade alliances like the TPP or lack thereof.
The aforementioned “China Model” of state-owned enterprises (SOEs) mixed with private entrepreneurship is also one Americans would prefer to see fail, though America’s South Korean allies have had success with state-favored, but not state-owned, chaebols like Samsung, LG, and Hyundai-Kia. Neither China nor South Korea sees free competition as essential to economic success. Led by Deng Xiaoping in the ’80s and Xi Jinping (2012- ), China’s communist party planned the economy years in advance and retained control of natural resources and energy while privatizing most other businesses. Americans convinced themselves that China’s modernizing economy would naturally open its political system to more democracy, but that never happened and the trend is now in the opposite direction. American companies, meanwhile, didn’t really care about China’s political system so much as the billion+ potential customers and workers who lived there. New York Times columnist Thomas Friedman wrote that “Beijing placated us by buying more Boeings, beef and soybeans.”
In Chapter 13 we saw how the U.S. and Britain remade the world economy in their image after World War II with the World Bank, the International Monetary Fund, and pro-free-trade organizations like GATT and its successor, the World Trade Organization (WTO). In Bad Samaritans, Ha-Joon Chang argued that they “kicked the ladder out from under” emerging markets with “asymmetrical demands”: wealthy countries like the U.S. and Britain grew strong through tariffs in the 19th century and Keynesian stimulus spending in the 20th, both of which are forbidden for loan recipients. There may be some truth to that, but China shouldn’t have joined the WTO in that case. By the 2010s, China wasn’t just an emerging economy in the old sweatshop/light-manufacturing sense of the word. It operated in robotics, AI, 3-D printing, microchips, facial recognition software, electric and autonomous cars, etc. It had companies like Huawei that challenged western telecom with its 5G wireless platform. Controversy swirled around the firm in 2019 as the U.S. encouraged other western countries to cut Huawei off and the founder’s daughter and CFO, Meng Wanzhou, was detained in Canada for allegedly violating sanctions against Iran.
Avoiding conflict with China or another cold war was not on the agenda of Trump or his chief strategist Steve Bannon when they came into office. They welcomed an “economic war” with China as their top priority. But Trump’s first staff/cabinet was divided into globalist and nationalist camps. On the globalist side were Treasury Secretary Steve Mnuchin and chief economic advisor Gary Cohn, both of Goldman Sachs, who argued for ironing out disagreements while maintaining a mutually beneficial trade relationship. On the nationalist side of the debate were Trump, Peter Navarro (lifelong Democrat and author of Death By China), and Bannon. Bannon, an economic nationalist, saw globalization as a transfer of wealth from working-class Americans to elites and foreigners, and the Sino-American relationship as purely dualistic: the two couldn’t co-exist and both thrive as trading partners. Bannon said, “There’s no middle ground. One side will win; one side will lose.” Bannon’s take on China was similar to that of his hero Ronald Reagan toward the USSR. As a naval officer, Bannon was upset with Jimmy Carter during the Iranian Hostage Crisis and admired Reagan’s strategy of winning the Cold War outright instead of co-existing through détente. Bannon had an interesting post-military résumé, with stints as a Wall Street investment banker, Hollywood producer, documentarian, and co-founder of alt-right Breitbart (2007- ) and Cambridge Analytica (2013-18). Trade isn’t war, though. Historians warned that by severing ties with China in hopes of security, the U.S. could inadvertently strengthen China, making it more self-sufficient in the same way that the War of 1812 strengthened the U.S. vis-à-vis Britain. Likewise, cutting the U.S. off from promising engineering students would only cause a “brain-drain” to other countries, even if it prevented a few spies from slipping through. Increasingly, not all technology transfer will be from America to China.
Gary Cohn tried to arrange a meeting between Trump and American end-users of steel and aluminum — those that would be driven out of business or lose money with steel tariffs — but Trump refused and raised tariffs, leading to Cohn’s resignation. The nationalists won the debate. Bannon told PBS Frontline that Trump’s view of Chinese trade as purely bad for America was his only fully-formed worldview when he came into office and that he’d learned it almost exclusively by watching journalist Lou Dobbs for years on CNN then FOX. In the 1980s, as a celebrity real estate mogul, Trump directed most of his ire toward Japan, but he’d transferred his focus to China by 2000. Bannon and Trump saw American manufacturing as having grown weak, while economists saw it as strong but increasingly automated, providing fewer jobs.
But pretending trade is simple doesn’t make it so. American farmers who exported meat, grain, wine, and dairy products to food importers like Vietnam and Japan didn’t fully think things through when they supported protectionism and opposed TPP in the 2016 election, just as they didn’t realize how much corn they exported to Mexico when denouncing NAFTA or how many soybeans they sent to China. When corn prices plummeted after Trump threatened to dismantle NAFTA, he agreed to pull back and renegotiate instead. In Spring 2018, Trump threatened tariffs on steel and aluminum from Mexico and Canada, and in Fall 2018 started renegotiating a slightly re-branded version of NAFTA called USMCA (United States-Mexico-Canada Agreement) that the respective countries signed in 2019-20. The deal opens the Canadian dairy market to U.S. producers and requires that automobiles get 75% of their parts from within North America to qualify for tariff-free trade among the three countries. While more or less just a re-branded NAFTA, USMCA’s passage in Congress at least involved a rare bipartisan agreement between Trump and House Speaker Nancy Pelosi (D-CA).
With China, specifically, there was a widespread consensus — shared by Trump and Democratic politicians like Pelosi and Senator Chuck Schumer (D-NY) — that China was violating the terms of its newfound WTO membership with unfair tariffs, forced technology transfers, currency manipulation (keeping theirs artificially low to boost exports), and intellectual property theft. Both political parties wanted to resume trade, though, once new terms could be hammered out. Trump and Bannon weren’t really protectionists; they were just using tariffs as a temporary tool to renegotiate with China.
Looking at America’s negative balance of trade with China (importing more than we export; see chart on right), Lou Dobbs and Trump saw deficits as a harbinger of declining economic might — what Trump was referencing in the Tweet above by saying “when we are down a hundred billion…” In this zero-sum view, money is simply funneling from one country to the next. But, in contrast to Dobbs, Trump, Bannon, and Navarro, most economists aren’t as focused on trade deficits, pointing out that wealthier countries tend to import more than they export because they have more money to spend. Importing products, after all, is voluntary. If the U.S. wanted to level the trade deficit it could just buy fewer things. Also, trade deficit/surplus stats can be misleading because, if a finished product like phones goes from China to the U.S., it counts as being in China’s favor even though various countries “added value” to the product along the way, including the U.S.; it was just assembled in China. The chart below shows how many countries contributed to an iPhone, even though China got all the credit in the balance of trade because that’s where the final product shipped from.
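The gross-versus-value-added accounting point can be made concrete with a toy calculation. This is a minimal sketch with invented per-country value-added figures (the real iPhone breakdown in the chart differs); it shows how the full export price gets credited to China even when China’s own contribution is small:

```python
# Toy example of gross vs. value-added trade accounting.
# Dollar figures per phone are invented for illustration only.

phone_value_added = {
    "South Korea (display, memory)": 110,
    "United States (design, software)": 90,
    "Japan (camera, sensors)": 40,
    "Taiwan (chip fabrication)": 50,
    "China (final assembly)": 10,
}

# Gross accounting: the whole export price counts as a Chinese export
# because the finished phone ships from China.
gross_export_price = sum(phone_value_added.values())
print(f"Credited to China under gross accounting: ${gross_export_price}")

# Value-added accounting: China's true share is only its assembly work.
china_share = phone_value_added["China (final assembly)"] / gross_export_price
print(f"China's actual share of the value: {china_share:.1%}")
```

Under gross accounting the full $300 shows up as a Chinese export to the U.S.; under value-added accounting, China’s slice is a few percent, which is why economists treat bilateral deficit figures with caution.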
Trump preferred to see less of the supply chain overseas, though, for products deemed essential to national security like steel and aluminum. The wisdom of simplifying or diversifying supply chains could be seen in the 2020 COVID-19 outbreak, when the U.S. was caught short on medical supplies. Trump wanted out of broad multilateral agreements and preferred to renegotiate one-on-one bilateral agreements of the sort the U.S. had forged with South Korea (KORUS) under Bush 43 and Obama, except that Trump’s “beautiful deals” would improve the terms of existing agreements. Trump was less ideological than transactional — hoping to trade, but with improved trade terms because he saw the old terms as disadvantageous to the U.S. As Americans followed these debates, they rarely knew what, exactly, those trade terms were in the first place, and it wasn’t 100% clear that Trump did either or that he’d ever actually read the TPP. He just told voters that other countries “were raping us” and that our previous presidents were “stupid.”
When one country overplays its hand, other countries ice them out and sign separate agreements with each other (above) — thus the advantage of multilateral pacts. Countries are more willing to lower their own tariffs if it gives them access to multiple countries’ products, not just one. A new TPP-11 (led by Japan but excluding the U.S. and, still, China) formed immediately after Trump’s withdrawal announcement. By mid-2017, New Zealand, Australia, Canada, and the EU started negotiating lower tariffs with Asian food importers, hoping to undersell American farmers. With America and post-Brexit (post-EU) Britain on the sidelines, the European Union renegotiated with Vietnam, Malaysia, and Mexico, and Japan offered the EU the same deal on agricultural imports that it took the U.S. two years to hammer out during the TPP negotiations. An alliance of Latin American countries including Mexico, Peru, Chile, and Colombia formed their own Pacific Alliance to negotiate with Asian countries while China, sensing blood in the water, formed a 15-country regional partnership in Asia to rival TPP-11.
The cumulative effect of the 2007-09 Financial Crisis, the protracted Sino-American trade war, and COVID-19 led to what The Economist called slowbalization. You can surmise the advantages and disadvantages of that trend, if it continues into the 2020s, based on what you’ve read: essentially an inversion of the good and bad we saw in the 1990s and ’00s. The trend is already beginning in Japan, Britain, India, and the United States as governments encourage self-sufficiency. Expect the trend of poorer countries gaining ground to level off and prices to rise in wealthier nations (inflation), while a more fractured world will make international cooperation on issues like climate, diplomacy, and health more challenging. If anything, COVID-19 should motivate us to re-invigorate the World Health Organization (WHO), not close it down because it didn’t perform well enough.
In January 2020, the U.S. and China had just signed Phase One of a new trade deal that made some headway on technology-sharing and intellectual property abuses and on reducing agricultural and manufacturing tariffs when talks were interrupted by the pandemic. With China Phase One, South Korea’s KORUS, and NAFTA/USMCA, Trump incrementally improved on and re-branded existing deals. They were a little bit more beautiful, though TPP was a missed opportunity. But Chinese relations soured in Spring 2020 when Trump tried to score election points by spinning Biden as soft on China the way Republicans had done to Harry Truman in 1949 (Chapter 15), undercutting negotiations. Meanwhile, China’s communist party downplayed the novel coronavirus early on, even admonishing Wuhan doctor Li Wenliang for raising awareness of an emerging SARS-like pneumonia that could be passed among humans. As critical weeks passed in December 2019 and early January 2020, China’s government didn’t share COVID-19’s genetic sequence with the WHO in Geneva, which could’ve relayed that information to other countries for testing/tracing and to start developing a vaccine.
Just after the 2020 U.S. elections, perhaps hoping to fend off the potential of Biden renewing TPP talks, China inked its own agreement with neighboring Asian countries: the Regional Comprehensive Economic Partnership (right). Reversing the advantage the U.S. would’ve had with the TPP, China can now dictate the terms of America’s trade with the entire region. In the meantime, Biden inherited Trump’s smoldering trade war and, other than suspending the tariff on solar panels, sided with Trump by signaling a combative stance toward China and making the U.S. less vulnerable to overseas supply chains, which was understandable post-COVID. But, having evidently either not read or not agreed with Adam Smith’s Wealth of Nations (1776), Trump and Biden both understood trade from a mercantilist perspective, as a zero-sum game wherein the goal is to maximize exports and minimize imports.
After Russia’s invasion of Ukraine in 2022, China had to weigh the advantages of supporting Russia versus the potential that Russian aid and trade could undermine its more lucrative trade with Europe and the United States. Over half of young Americans use Chinese-owned TikTok, so security experts worry about its propaganda potential — unprecedented in diplomatic history — though so far it seems pretty innocuous. It may be too public to sneak in subliminal messages, but it’s collecting data on an influential part of the American population.
Like globalization, healthcare insurance played an increasingly large role in politics starting with the 1992 campaign. The main problem was escalating provider costs that outran inflation in the rest of the economy from 1988-2013. While America socialized some healthcare coverage for the elderly in 1965 with Medicare, it has an unusual and spotty arrangement for everyone under 65 whereby employers, rather than the government, are expected to provide workers insurance subsidies, split anywhere from full coverage to a 50/50 match. It stems from WWII, when price controls meant to check inflation barred companies from giving raises, so in a low-unemployment economy they attracted workers with benefits instead, including health insurance. It’s difficult to measure how many Americans die annually for lack of insurance because it’s impossible to control for multiple factors across a wide population and many people go on and off insurance. The uninsured rarely have preventative checkups that might save them years later. Studies prior to 2009 estimated anywhere from 18k to 45k deaths annually, according to factcheck.org. If we use a low estimate of 15k, then over a million Americans have died prematurely from lack of coverage since WWII. A more recent study by West Health suggests higher numbers, claiming that 34% of Americans know someone who’s died in the last five years because they lacked health insurance.
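The cumulative figure above is simple arithmetic worth checking for yourself. A minimal sketch, assuming a constant 15,000 deaths per year (the deliberately low estimate) applied uniformly from 1945 to 2020:

```python
# Back-of-the-envelope check of the "over a million since WWII" claim.
# Assumes a constant 15,000 premature deaths per year -- below the
# 18k-45k range of pre-2009 studies -- from the end of WWII to 2020.
ANNUAL_LOW_ESTIMATE = 15_000
years = 2020 - 1945                      # 75 years since WWII
cumulative = ANNUAL_LOW_ESTIMATE * years
print(f"{cumulative:,}")                 # prints 1,125,000
```

Even at a rate well below what most studies found, the total clears a million; at the 45k high estimate it would approach three and a half million.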
Nonetheless, the employer-subsidized insurance system works well for many people, especially at big companies. But it leaves the unemployed with no coverage and presents complications for small businesses especially, because wider pools lower costs — the reason many countries spread risk among the whole population with cheaper government-run systems. That makes Americans more conservative about sticking with big companies and less likely to start small businesses, hampering entrepreneurship. In America prior to 2013, it was expensive to buy individual coverage if you fell through the cracks and prohibitively expensive if you’d already been sick or had pre-existing conditions. In 2000, the World Health Organization (WHO) ranked the United States 37th in the overall efficiency of its healthcare system. Most developed countries have cheaper healthcare with higher overall customer satisfaction, lower infant mortality, and longer life expectancies but, in some cases, longer waiting periods for non-emergency procedures and less choice of doctors.
Teddy Roosevelt (R) advocated universal (socialized) healthcare insurance as part of his Bull Moose campaign in 1912 and Harry Truman (D) did likewise as part of the Fair Deal in 1948, but both initiatives failed. With FDR’s wage caps in place to avoid inflation during the rationing and low unemployment of World War II, companies started offering to match healthcare costs as a benefit. Unions, in turn, liked the idea because they thought demanding benefits of companies strengthened union membership. The American Medical Association (AMA) has, by and large, supported this patchwork system of privatized healthcare insurance, including, as you’d expect, its lack of caps on provider costs.
It’s important not to equate the quality of health insurance with healthcare itself, and other factors come into play like diet, exercise, and environment. In most countries with public health insurance, the health providers (doctors, hospitals, pharmaceuticals, etc.) remain in private hands. Japan and England are exceptions, along with communist countries. But most countries, and all developed nations outside of the U.S. and Switzerland, at least partly socialize insurance for those of any age. That way everyone pays in and everyone’s covered. The overall cost is lower per taxpayer than what American employee/employer combinations pay because, unlike profit-motivated private insurance companies, governments operate the system at cost. The young pay for the old but grow old themselves; men pay for women’s procedures but don’t complain because they have wives, sisters, and daughters, or simply care about fellow human beings. Everyone understands that “money doesn’t grow on trees,” but when other countries figure the cost of public healthcare insurance, they subtract the cost of the private insurance they’d otherwise pay. If you’re not doing that already as a citizen, you can rest assured that politicians paid by insurance lobbies won’t volunteer to correct your error. They’ll just encourage you to consider the potential extra taxes of universal coverage on top of what you’re already paying privately, even though the latter would mostly disappear. One 2020 study on the U.S. figured that public healthcare insurance would save money for those making under ~ $150k/yr., whereas those in higher income brackets would pay more overall than now because of higher taxes.
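The “subtract what you’d otherwise pay privately” logic can be made concrete with a toy calculation. The premium and tax-rate figures below are hypothetical placeholders, not numbers from the 2020 study; the sketch only shows the structure of the comparison:

```python
# Toy comparison of a household's cost under a tax-funded public plan
# vs. its current private premiums. All figures are illustrative only.
def net_change(income, private_premium, public_tax_rate):
    """Positive = household pays more under the public plan;
    negative = household saves, since the premium mostly disappears."""
    return income * public_tax_rate - private_premium

# With a hypothetical $12k/yr combined premium and an 8% tax, the
# break-even income is 12,000 / 0.08 = $150k, echoing the study's
# rough cutoff. Below it a household saves; above it, it pays more.
print(net_change(60_000, 12_000, 0.08))   # negative: saves money
print(net_change(300_000, 12_000, 0.08))  # positive: pays more
```

The honest comparison is always the net of the two, not the new tax alone — which is exactly the subtraction the paragraph above says lobbies would rather voters skip.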
Also, contrary to a common assumption among liberals, most other countries don’t have a purely single-payer public system. Most supplement basic coverage with optional for-profit insurance companies for those that want to stay in nicer hospitals or butt ahead in line for procedures. What other countries do have are stricter government-mandated price controls on what healthcare providers charge. While often seen as the bogeymen, American insurance companies lack bargaining power with providers and can be victimized by high costs, inconsistency, fraud, or unnecessary procedures — costs that they pass on to patients (us). One American can pay $1k for a colonoscopy while another pays $5k for the same procedure. As of 2012, according to an International Federation of Health Plans survey, MRIs averaged $1080 in the U.S. and $280 in France. C-section births averaged $3676 in the U.S. and $606 in Canada. In 2020, a bottle of Nexium® for acid reflux cost $202 in parts of the U.S. and $32 in Great Britain. Improving technology and over-charging contributed to inflation rates in medicine that outran the rest of the economy in the early 21st century.
In the American healthcare debate, many analysts focus on these high provider costs while consumers/patients and the political left, especially, focus on glitches in insurance. Profit margins of private insurance companies exceed the tax burden of socialized health insurance elsewhere, though that tax burden can be distributed unevenly at a net loss to the wealthy. In other words, while socialized healthcare insurance raises taxes in other countries, that’s more than offset in the U.S. by higher costs for private insurance. And, prior to 2013, “job lock” problems arose in the employee benefit system when workers with pre-existing conditions (past illnesses) tried to switch jobs because, understandably, no new employer wanted to take on the increased risk of future medical costs. Formerly sick employees (~ 18% of all workers, pre-COVID) in 45 states lacked portability, trapping them at the mercy of one employer or leaving them on the outside looking in if they lost their job. Also, those that were covered risked having their coverage rescinded after paying premiums for years if the insurance company could find an error on the original application sheet, which insurers flagged but didn’t mention to the policyholder until he or she got ill. These rescissions, aka “frivolous cancellations” or “catastrophic cancellations,” filled up America’s bankruptcy courts with families whose finances (i.e., lives) were being ruined by runaway medical bills. Prior to 2013, the majority of personal bankruptcy cases in the U.S. involved medical hardship. According to Harper’s Index, prior to 2012, 42% of cancer patients in America were broke within two years of their diagnosis, having burned through, on average, $92k in savings. Rescissions were outlawed in some states, and now everywhere by Obamacare, but as recently as 2009 Blue Cross employees testified before Congress that their company paid bonuses to representatives who could cancel coverage for paying policyholders once they got sick.
Meanwhile, some bigger companies suffer because they’re burdened with paying long-term healthcare for pensioned retirees. For instance, with the good contracts the United Auto Workers union won in the mid-20th century, Ford, Chrysler, and General Motors were on the hook for all their retirees’ healthcare. Those obligations grew increasingly burdensome as life expectancies rose. If you bought an American vehicle before the 2007 collective bargaining agreement and GM’s 2009 bankruptcy restructuring, a surprising share of your money went not to the materials or the people who designed, built, shipped, and sold it, but to pensions and retiree healthcare, to the tune of nearly $5 billion annually, industry-wide. The UAW Retiree Medical Benefits Trust now administers a much leaner independent fund with contributions from the Big Three automakers, some in the form of stock. Other companies like Walmart don’t have unions to worry about. They can shove much of their employees’ healthcare costs off onto the public dole (Medicaid) for the rest of us to pay. Medicaid is mostly state-managed, jointly-funded (state/national) healthcare insurance for the poor, passed in the same 1965 legislation as Medicare and later enhanced by Obamacare (except in Texas). It’s easy to see why the American workforce has been trending away from defined benefit pensions and toward defined contribution 401(k)s, where companies aren’t on the hook for retirees’ healthcare. With a 401(k) or 403(b), employers might match a monthly contribution, but the employee bears his or her own market risk. Additionally, once employees qualify for Social Security, they receive some healthcare subsidies from the government in the form of Medicare.
Starting in 1986, the government allowed laid-off employees to temporarily continue purchasing their former employer’s coverage (at higher rates) through COBRA and backed emergency care for anyone who came to a hospital through EMTALA. While the uninsured poor don’t have access to long-term care for diseases like cancer, heart disease, or diabetes, all Americans contribute to their emergency room care via higher hospital rates for those with insurance. It’s another problem that other countries don’t have to deal with.
Bill and Hillary Clinton made the biggest push since Harry Truman and Richard Nixon to reform the system, though they didn’t end up pushing for universal coverage because they understood that private insurers had enough pull among politicians to block any legislation that would’ve cost them their business. Also, the American public is more libertarian and individualistic than citizens of most countries, who define patriotism as doing what’s best and smartest for the group. When the Clintons crafted a patchwork bill in 1993 to guarantee all Americans access even to private insurance, with employers being required to provide it, insurers filled the airwaves with warnings about impending bureaucracy and baloney about how people would no longer get to choose their doctors (the infamous “Harry & Louise” ads). The bill lost in 1994, but a watered-down version passed in 1996 requiring insurers to cover formerly sick workers with pre-existing conditions. The hitch was that insurance companies retained the right to charge more for workers with pre-existing conditions. It was a classic case of corporations paying politicians to water down legislation. Insurance companies are among the biggest donors in Washington.
In response to the long-term threat of universal coverage, conservative think tanks like the American Enterprise Institute and Heritage Foundation hatched the mandate system. Mandates are compromises that require employers or individuals to purchase insurance from private companies but force insurance companies to cover costlier patients, including the elderly, the sick, or those with pre-existing conditions, at affordable rates. To offset the cost of covering those that need it, the young and healthy have to buy coverage, which is arguably smart anyway since catastrophic injuries or illnesses can impact people of any age. Moreover, when the young dodge such misfortune, that only ensures that they grow old themselves (think ahead). Stuart Butler of the Heritage Foundation started promoting this market-based solution in 1989, though the idea goes back further, at least to the Nixon era, and included backing from Mark Pauly and Newt Gingrich (R-GA). Butler’s idea required people to buy only catastrophic rather than comprehensive coverage. Free-market economist Milton Friedman published an op-ed in the Wall Street Journal in 1991 promoting individual mandates. Along with his Secretary of Health & Human Services, Dr. Louis Sullivan, President George H.W. Bush proposed the mandate idea in 1992 (minus the small employer requirement), but it died quickly in Congress, defeated by Democrats who either wanted an extension of Medicare to the whole population or just didn’t want to cooperate with Bush for partisan reasons with an election upcoming. No Republicans at the time mentioned anything about such a proposal being unconstitutional because the bipartisan Congressional Budget Office said it was, in effect, a tax. Like Obama’s future plan, Bush and Sullivan’s emphasized preventative care in order to keep down the costs of treating patients after they developed serious illnesses.
Richard Nixon’s earlier idea of an employer mandate suffered a similar fate to Bush’s in 1972, defeated by Ted Kennedy and other Democrats hoping for a simpler, more thorough-going single-payer system whereby Medicare would be extended to those under 65. Twice, Democrats shot down a Republican plan that Obama later passed, which was then opposed by Republicans for being socialist.
In 1993, Senators John Chafee (R-RI) and Bob Dole (R-KS) introduced a privatized mandate plan called HEART, the Health Equity & Access Reform Today Act, to counter Hillary Clinton’s Health Security Act, which they called “Hillarycare.” For many conservatives (and, later, Obama), an individual mandate for each household was preferable to an employer mandate and discouraged “free riders” that, for instance, took advantage of emergency rooms without buying any insurance. More libertarian conservatives at the CATO Institute opposed the idea from the outset. Ironically, Barack Obama, the man destined to become famously associated with the idea, opposed mandates during his 2008 campaign. Ted Kennedy (D-MA) later regretted his opposition to Nixon’s 1972 plan, but his home state of Massachusetts pioneered a mandate plan under Republican Governor Mitt Romney in 2006 that became the basis for the national government’s Patient Protection & Affordable Care Act in 2010, aka the Affordable Care Act (ACA) or “Obamacare.” Under the ACA, companies with over 50 employees have to share coverage costs with employees (as most already did), and individuals that aren’t covered and can afford it have to buy insurance or pay a fine. Also under the ACA, insurance companies can’t refuse customers with pre-existing conditions or cut off customers when they get sick. The Romney plan was a good example of how, under the federal system, states can experiment with ideas later adopted nationally.
While the most unpopular feature was the individual mandate for young people to buy coverage, polls show that around 95% of those under 35 wisely want coverage anyway. Under any wider pool insurance system, the healthy pay for the unhealthy and men and women pay for each other, just as homeowners who don’t suffer from fires, floods, or tornadoes pay for those who do, with some adjustments for risk factors. That’s the very nature of insurance. You don’t cancel your home insurance if your house hasn’t burned down yet. To make coverage as affordable as possible for small businesses and those that need to buy individual coverage, each state under the mandate system either sets up its own online exchange for comparison shopping or feeds into a similar national exchange.
The Affordable Care Act version mandates free preventive care to lower costs, caps insurance company profit margins at 15% (20% for smaller companies) to bring costs more in line with other countries, prevents insurance companies from capping annual payouts to patients, and, through 2016, taxed those in the wealthiest bracket an extra 3.8% on investments (capital gains) and 0.9% on income to pay for expanded Medicaid coverage. The profit margin cap meant, in effect, that insurance companies had to pay out 80-85% of what they collected as premiums. Premiums for the elderly can’t be more than 3x higher than those for the young. There was also a 40% excise tax on premium “Cadillac” insurance plans for the wealthy and a tax on medical devices. Around half of those who gained insurance coverage through the ACA did so through these subsidized Medicaid expansions. But, mostly to snub Obama, Texas refused Medicaid expansion and even sued the federal government, despite Washington offering to subsidize 90% of the cost. That left Texas with the highest uninsured rate in the country (maybe the developed world) at 18% even before COVID-19, when more people lost employer coverage at just the wrong time by losing their jobs. Houston had an astounding 20-year gap in life expectancy between its poorest and richest neighborhoods. Environmental factors, namely minorities being housed in and around industry, undoubtedly contributed to the 13-year life expectancy gap between Dallas’ upscale Highland Park (84 years) and Joppa (71) districts as of 2022. Other red states’ passing on the ACA’s generous Medicaid expansion is now leading to the closure of rural hospitals that can’t afford too much uncompensated treatment, compromising the well-being of aging, white Republicans in the name of rejecting socialism.
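The profit-margin cap described above works as a minimum payout ratio (the “medical loss ratio”): insurers must spend at least 80-85% of premium revenue on care or owe the difference back to policyholders. A simplified sketch of that rule — real calculations also count quality-improvement spending and use multi-year averages:

```python
# Simplified ACA medical-loss-ratio rule: large-group insurers must
# pay out at least 85% of premiums on care (80% for small/individual
# plans); any shortfall is owed back to policyholders as a rebate.
def mlr_rebate(premiums, claims_paid, large_group=True):
    floor = 0.85 if large_group else 0.80
    shortfall = premiums * floor - claims_paid
    return max(0.0, shortfall)

# A large-group insurer collects $100M but pays out only $82M in
# care -- short of the $85M floor, so roughly $3M is owed back:
rebate = mlr_rebate(100_000_000, 82_000_000)
```

An insurer that paid out $90M of the same $100M would owe nothing, since it already clears the 85% floor — which is why expanded coverage can still mean more overall profit for insurers under the cap.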
Obamacare/ACA was often mistaken for socialism by conservative critics but, at their core, mandate plans preserve health insurance for the free market by forcing individuals and companies to buy insurance rather than the government providing it for them. Staving off socialism (public healthcare insurance) is the whole point of the mandate system. It is partly socialist because of the taxes and Medicaid expansion, and customers below the poverty line are subsidized on the exchanges. But conservative think tanks pioneered the mandate idea to stave off a full-blown socialist alternative whereby taxes provide insurance for everyone the way they do for those over 65 with Medicare Parts A and D, for some veterans with Veterans Affairs (VA), or, most ironically, for all of the politicians in Washington opposing public insurance. It’s a system that Switzerland, the Netherlands, and Sweden have all experimented with or considered. Switzerland used the mandate system effectively enough to provide universal private healthcare insurance.
There is a part of American healthcare that is flat-out, straight-up socialism in every sense of the word: the widely popular Medicare, through which taxpayers fund a government-administered plan that no one wants to get rid of, regardless of political orientation. Much as “death panel” rumors later dogged the ACA, opponents of Medicare in the 1960s filled radio waves with unfounded warnings about government-run medicine. Medicare led to no such horrors and it’s worked fairly well, all things considered, but it’s also expensive and takes up a growing portion of the federal budget. Also, the pharmaceutical lobby (aka “Big Pharma”) bought a law preventing Medicare from negotiating down prices prior to 2022, when Congress fixed that glitch. Big Pharma is a brazen example of the power of lobbies: 88% of Americans across party lines favored Medicare being able to negotiate drug prices, and three presidents (Obama, Trump, Biden) supported it, too, but the industry bought off enough members of Congress to stave it off until 2022.
What is also socialist in the American system — but not in the way the term is usually applied — is that employers who subsidize healthcare can deduct that expense from their taxes, meaning that all taxpaying Americans, including ones that aren’t covered, help subsidize those who are covered at larger companies. As a matter of simple arithmetic, every tax benefit shifts the burden onto every other taxpayer not receiving it.
As we saw in Chapter 9, there is a price to be paid for the fact that life expectancies are rising, especially when the elderly spend disproportionately on healthcare. Still, most of us would argue that people living longer is a good problem to have. Either way, there is no free quality healthcare. The money is coming out of your paycheck if it’s from your employer (all of it, ultimately, not just the half they “match”), out of your own pocket through “out-of-pocket” bills, or out of your paycheck through taxes. The question is what setup provides quality healthcare in the most equitable and affordable way possible. The hybrid Affordable Care Act is a complicated, sprawling attempt to manipulate the free market through government intervention. It attempts to smooth over the worst glitches in the old system, but lobbyists for insurers, drug companies, and hospitals all had a hand in crafting the legislation. Insurance lobbies convinced Obama and Congress to drop the idea of a public option being included on the new national insurance exchange, HealthCare.gov, to compete with private insurers, though individual states retained the option to add their own non-profit public option (none did). Amish, Mennonites, Indigenous Americans, Health Care Sharing Ministries (HCSMs), and prisoners aren’t mandated to buy insurance.
For patients that don’t purchase insurance on the state or national exchanges — still most Americans, who continue to get their insurance from employers — the legislation doesn’t include any price controls on hospitals or drug companies, as those lobbies bribed Congress to leave that important feature out. That’s critical, as the question of whether a government caps provider costs is arguably as important as the question of whether insurance will be private or public, maybe even more so. Cost controls are a big reason healthcare is cheaper in other countries. Obamacare capped insurance profits but not most provider profits.
As with the ACA, polls showed that Americans favored much of what was in Clinton’s 1993-94 legislation when asked about items in the bill but opposed it when Hillary’s name was mentioned in conjunction with those same items. Both are telling examples of how spin can trump substance in politics, and how the way questions are spun dictates how respondents “frame” the question (see Rear Defogger #17). Partisanship is now such an overriding factor in politics that when a Democratic president (Obama) pushed a conservative idea in Congress, zero Republicans voted in favor, and many confused voters thought a socialist revolution was at hand, while others feared Nazism (best-selling books by FOX personalities Ann Coulter and Glenn Beck argued that Obamacare would lead to concentration camps). Republican strategist Frank Luntz and Senate Republican Leader Mitch McConnell (R-KY) instructed colleagues to block any real reform and to deny Obama bipartisan cover. Nixon and Bush 41 suffered similar, if less inflammatory, responses from Democrats when they pushed mandate plans in 1972 and 1992. Much of the public misread the mandate idea as socialist in 2009 because they were spring-loaded to suspect President Obama of being a leftist and were unaware of its right-wing origins and purpose. Said Michael Anne Kyle of the Harvard Business School, “It was the ultimate troll, for Obama to pass Republican health reform,” accomplishing a liberal end (better coverage) through conservative means (the market).
But some companies keep so-called “29ers” just under 30 hours a week to avoid having to buy insurance for them, while other small companies stay just under the fifty-employee threshold to avoid having to provide insurance. In an effort to control costs, people covered under policies purchased on HealthCare.gov were offered “narrow networks” of providers who’ve agreed to keep costs down. That annoyed some, but both narrow networks and sketchy customer service are issues many workers already experience on their employer-affiliated networks (HMOs, PPOs). As of 2014, over 70% of customers using the federal exchange were happy with the cost and quality of their coverage. Some insurance companies have grown less hostile to the ACA as they’ve realized that expanded coverage means more overall profit, despite the 15-20% cap on profit margin. By 2015, the ACA had cleared two hurdles in the Supreme Court because, as argued by the GOP and CBO in the 1990s, the mandate is officially considered a tax.
Supposedly, Donald Trump’s 2016 election meant that the ACA would be dismantled or reformed, as he promised during his campaign that he had a better plan that would cover everyone more cheaply without reducing Medicaid. But, unless congressional Republicans really could replace and improve upon the ACA, they risked depriving millions of Americans, including many Trump voters, of their insurance. The CBO predicted that a simple repeal with no replacement would throw 32 million Americans off insurance within a decade and double premiums — numbers that discouraged all but the most libertarian conservatives like Rand Paul (R-KY) and Freedom Caucus Chair Mark Meadows (R-NC). Senator Lindsey Graham (R-SC) said the GOP was like the “dog that caught the car” we saw in the previous chapter regarding abortion, with no agreed-upon replacement strategy as Trump was bluffing about his secret plan. COVID-19 caused many Americans to lose their health insurance through unemployment, as did the growth of the gig economy, which provides fewer employer benefits (a key reason why it exists in the first place, along with the convenience of apps). Independent contracting with companies like Uber and TaskRabbit has its advantages in terms of freedom and flexibility, but such work doesn’t come with benefits like health insurance and retirement funds. Obamacare mitigates the downside of unemployment or working as an independent contractor.
A month into his first term, President Trump conceded that revamping the ACA would be “unbelievably complex…. nobody knew that healthcare could be so complicated.” He said that people were starting to love Obamacare, but “there’s nothing to love. It’s a disaster, folks.” Trump cajoled House Republicans to support a repeal bill and called it “tremendous,” but then tweeted that it was “mean” and that Australia had a better healthcare system than the U.S. (Australia has socialized coverage for everyone supplemented with private insurance). The repeal effort ultimately failed in the Senate, with John McCain (R-AZ) casting the deciding vote against it in the year before his death. Trump touted association health plans for small businesses that could bundle together to get more negotiating power, but courts ruled that they were an end-around to avoid the protections the ACA offered against denying patients with pre-existing conditions, etc.
The most positive thing to come from Trump’s first term on healthcare was a proposal for increased transparency for hospital bills (2019- ) and, hopefully, more transparency on negotiations between providers and insurance companies. It will be interesting to track which of those two industries pushes back harder against transparency. Trump also supported bipartisan legislation banning surprise medical bills that Joe Biden’s administration enacted in 2021 (AP).
Polls showed that only ~ 15-20% of Americans supported straight repeal, leading former Arkansas Governor Mike Huckabee to argue for repeal of the Seventeenth Amendment granting citizens the right to vote for Senators (prior to 1913, state legislators voted for U.S. Senators). Huckabee was the father of Trump’s White House Press Secretary Sarah Huckabee Sanders.
In the words of journalist Jonathan Chait, Obamacare, however imperfect, “squared the minimal humanitarian needs of the public with the demands of the medical industry.” As for former President Obama, he fully endorsed replacing the ACA as long as the new plan provided better coverage for less money. In December 2017, Congress passed a tax reform bill that removed the national mandate penalty as of 2019, though states can still require one. The mandate, as the name suggests, is the linchpin of the mandate system. Switzerland attained universal coverage with private insurance by putting more teeth into its mandate, not less, keeping the fine high enough to compel young people to buy insurance. Congress also repealed the Cadillac tax on premium insurance and the tax on medical devices. To make the ACA less effective, Republicans shortened the new enrollment period, slashed the advertising budget to limit patients’ access, and shut the website down on Sundays for “maintenance.” Trump’s administration also rolled back Obama-era rules (Michelle Obama’s signature initiative) mandating healthier school lunches. They reintroduced burgers and pizza as à la carte items but the key, pleasing to the potato lobby, was re-classifying French fries as vegetables. As the chart below indicates, the GOP has had some success in undermining Obama’s legacy by removing the mandate:
Stay tuned; the story of the Affordable Care Act is far from over, though the GOP had given up on repealing it as of 2022 because too many of its features were popular among voters of both parties (see optional article below by McDonough). For now, the insurance exchanges, the consumer-friendly insurance company reforms, preventative care coverage (considered wise by nearly everyone), and subsidies for poor subscribers remain even without the mandate. But the insurance reforms (capping profit margins at 15-20%, covering those with pre-existing conditions, not cancelling on people who’ve paid their premiums when they get sick, etc.) might need to be offset by a stronger mandate with higher fines for non-compliance. In August 2020, Trump considered issuing an executive order requiring that insurance companies cover pre-existing conditions, either not realizing that it was already in the ACA that he was trying to overturn or hoping to convince voters that he’d added it himself since that was one of its most popular features.
The Supreme Court, with conservatives now in control 6-3, might consider overturning the ACA in the early 2020s but the GOP still hasn’t countered with an alternative plan. SCOTUS shooting down the ACA would be a grave strategic error for the GOP and lead many families, including Republicans, into bankruptcy. A district court in north Texas presided over by Judge Reed O’Connor has ruled that Obamacare is unconstitutional in its coverage of preventive medicine like mammograms, colonoscopies, pre-natal care, STD screening, and cholesterol-lowering drugs but, so far, SCOTUS has overturned his rulings. If SCOTUS or Congress overturns the ACA altogether, that could backfire on conservatives by making it more likely in the long run that the U.S. would enact universal healthcare coverage (expand Medicare).
Whatever happens next, the key for voters will be whether some solution can continue to bend the curve on medical inflation (premiums and provider costs) while maintaining quality care and shielding families from bankruptcies and premature deaths. Long-term, the GOP will have to confront the fact that most Americans want a system — private, public or somewhere in between — that allows coverage for pre-existing conditions, covers preventative care, and outlaws rescissions when people get sick. Other than the lobbying power of providers and health insurance companies, there’s no reason a developed country should trace most of its personal bankruptcies to illness. Democrats, for their part, will have to do a better job explaining any Medicare-for-All proposal than simply saying that we’ll tax the rich, or else go with a more moderate public option on the exchanges. They’ll need to convince Americans that the tax increase would be offset by companies and employees no longer needing to pay for private insurance. If Republicans revive any genuine interest in health insurance policy, their challenge will be to reclaim and modify some variation of a private/mandate system that they can distinguish from Obamacare in voters’ minds while promoting something similar, the way Trump re-branded NAFTA. That will be a difficult knot to untie. Both parties will have to reckon with whether we can cap provider costs without hindering the innovation that’s made the best of American medicine so good. Houston, for instance, the same city with the big gap in coverage, has some of the best hospitals in the world.
The Financial Crisis of 2007-09 & Great Recession: How Big Banks Got Too Big To Fail & (Maybe) Stayed That Way
Bill Clinton fared far better with the rest of the economy in the 1990s than he did with healthcare insurance. The economy was booming by the end of his first term, and incumbents rarely lose re-election in that scenario. People “vote with their pocketbooks,” as the saying goes. By the mid-’90s, the emerging Internet fueled explosive growth in the technology sector and better-than-anticipated petroleum discoveries drove oil down to one-fifth the price of the Carter years, adjusted for inflation. A tax hike on top earners (the top marginal rate rose to 39.6%) didn’t inhibit things either. The government ran annual budget surpluses for the first time in decades. It’s easy to see, then, why Clinton would’ve gone along with libertarian Federal Reserve Chair Alan Greenspan, his two Secretaries of the Treasury (Robert Rubin and Lawrence Summers), bank lobbyists, and Republicans in loosening Wall Street regulations even further than Reagan already had. Greenspan kept interest rates low despite the absence of a recession, fueling a speculative bubble in real estate. Low interest rates not only encourage borrowing for homes; they also fuel the stock market because comparison shoppers prefer stocks to low-yielding bonds.
Between 2001 and ’05 the Fed pumped cash into the economy to keep it healthy after the 9/11 attacks and the collapse of the dot-com bubble. As an apostle of Ayn Rand, Greenspan believed that traders would naturally self-regulate as they pursued their selfish interests. But with the Federal Reserve’s role, this wasn’t a purely free-market economy. Greenspan’s system privatized profit while socializing risk because markets either went up or the Fed lowered interest rates to bump them up, threatening inflation or speculative bubbles in real estate or stocks. Greenspan’s successor Ben Bernanke followed the same policies after 2006. While the Fed was set up originally in 1913 to smooth out fluctuations in the economy, Greenspan’s high growth but bubble-prone policy ultimately made markets more erratic and he later testified before Congress that his strategy of deregulation combined with easy money had been a mistake.
Commentators often speak of the Law of Unintended Consequences to describe how either passing or eliminating laws often has unforeseen consequences (e.g., defensive treaties leading to World War I). In this case, three deregulations (eliminations of laws) contributed to a financial meltdown a decade later. First was the Gramm-Leach-Bliley Act of 1999 (GLB) repealing the 1933 Glass-Steagall Act from FDR’s New Deal that had set up a firewall between riskier bank investments and regular bank customer savings. For the half-century after Glass-Steagall, there hadn’t been many bank failures in America — the reason reformers like Texas Senator Phil Gramm argued that the law was outdated. In retrospect, though, Glass-Steagall might have been partly why the U.S. didn’t have bank failures. But Gramm was coming from the Reagan Revolution mindset that regulations only slow the economy. The GLB Act didn’t affect the major investment banks involved in the 2007-09 meltdown (other than allowing some mergers), but it affected commercial banks like Bank of America and Citibank on the periphery of the crisis. Additionally, anonymous surveys of chief financial officers show that many were increasingly asked to “cook the books” after the banking/accounting deregulations of the Reagan era. When such “creative accounting” led to scandals at Enron, WorldCom, and Tyco in 2001, the Sarbanes-Oxley Act restricted such practices but, predictably, the financial industry just lobbied for exemptions. The never-ending back and forth of regulating and deregulating was complicated by a revolving door of career paths between finance, lobbying, and politics. Robert Rubin, for instance, went from Goldman Sachs to serving as Clinton’s second Treasury Secretary, and then to Citigroup. The deregulatory policies he promoted in public office benefited both firms. Moreover, the $26 million he earned at Citigroup included bailout money from the government after the system he helped set up failed.
In what’s known as regulatory capture, many of the regulators in agencies like the SEC (Securities & Exchange Commission, 1934- ) are familiar socially and professionally with the financiers they regulate.
A second deregulation impacted the coming crisis more than Glass-Steagall’s repeal. A big cause of the 2007-09 Financial Crisis, and the danger it posed to the rest of the economy, was a three-fold loosening of leverage ratios by the SEC in 2004. Investment banks could now gamble their clients’ money at a 30:1 ratio rather than 10:1. This was yet another example of history rhyming but not repeating, as it was a variation on the loose on-margin investing rules that exacerbated the 1929 Crash (Chapter 8). Again, in a common refrain in politics (Big Pharma included), the amount Wall Street firms paid politicians to change the law was minimal relative to the resulting profits. The financial industry spent around $600 million lobbying politicians to deregulate in the decade prior to the meltdown while raking in trillions from the boost to short-term performance. Bankers got big bonuses if their bets paid off, and shareholders or taxpayers got the bill if they lost, in the form of plummeting stock or bailouts. Heads I win; tails you lose. The men who “incompetently” ran the big banks into the ground walked away with hundreds of millions of dollars in bonuses and salaries they’d already made based on short-term returns. Those were the Christmas bonuses of 2005, ’06 & ’07. It seems, rather, that the real incompetence lay with the politicians and voters who bought into deregulation too wholeheartedly.
Banks leveraged more in real estate than in other parts of their portfolios. For every $1 that Americans spent on housing, Wall Street bet at least another $30 that housing prices would rise in perpetuity. With such leveraged bets, even a small 3-4% dip in housing prices would wipe out the banks…that is, unless the government (i.e., taxpayers) came to their rescue, because allowing them to collapse would’ve cratered the entire American economy, if not the world’s.
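The arithmetic behind that 3-4% figure is easy to sketch. A toy calculation follows (illustrative only; the function name and numbers are invented for this example, not drawn from any actual bank’s balance sheet):

```python
# Toy model of leverage: a bank holding N dollars of assets for every
# 1 dollar of its own equity is wiped out when assets fall by 1/N.

def loss_that_wipes_equity(leverage_ratio: float) -> float:
    """Percent decline in asset values that erases all of a bank's equity."""
    return 100 / leverage_ratio

for leverage in (10, 30):
    print(f"{leverage}:1 leverage -> a {loss_that_wipes_equity(leverage):.1f}% "
          f"asset decline erases the equity")
```

At the earlier 10:1 cap, asset values had to fall 10% before equity vanished; at 30:1, a dip of just over 3% does it, which is why even a modest housing correction threatened insolvency.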
A third deregulation was the repeal of obscure laws, originating after the Panic of 1907 and the 1929 Crash, outlawing bucket shops. Bucket shops were gambling parlors, essentially, where people without actual share ownership just bet on the stock market the way one would bet on horses or football games; no official transaction occurs on any institutional exchange. Congress quietly repealed portions of those laws and another from the New Deal in the Commodity Futures Modernization Act, on the last vote of the last day of the 2000 session — the kind of dull scene on C-SPAN cable that viewers flipped past with the remote control. That changed how big financial firms bought and sold complicated financial products called derivatives. Here’s where things get very tedious if they haven’t gotten tedious enough already, so just strap in and do your best to comprehend. Don’t feel bad if it seems complicated, because the subject’s complexity and boringness are deliberate, similar to fine print discouraging you from reading it. For a video explanation, I suggest Charles Ferguson’s documentary Inside Job (2010).
Two derivatives threatened the economy in the early 21st century: real estate-based mortgage-backed securities and credit default swaps to insure against the failure of those mortgage-backed securities. A good starting point for understanding mortgage-backed securities is realizing that your mortgage — the loan you took out on your house, store/business, studio, farm, ranch, or condominium — can be bought by those who want to assume the risk of you not paying it off in exchange for the gain of you paying interest on the loan. Your job is to pay it off, not to choose whom you pay. Mortgage-backed securities (MBSs) are bundles of real estate mortgages that are sold as investments to other people, who then own parts of your loan. The seeming upside of MBSs was the traditional stability of the American real estate market. When mortgages were securitized — packaged and sold as financial products like stocks or bonds — the bank no longer lost money if the homeowner defaulted on the loan because it had sold the loan to someone else. By the time of any default, the mortgage had long since been “sliced and diced” like kitchen ingredients and recycled back into the financial food chain. Thus, banks no longer had as much incentive to avoid lending to borrowers they suspected might not be able to pay them back. MBSs spread risk, which was good, but they lowered standards because no one had a stake in making sure the mortgages were sound loans.
Invented by Salomon Brothers’ Lew Ranieri in 1977, mortgage-backed securities were bunched into portfolios called collateralized debt obligations (CDOs) that few people, including investors at other banks or rating agencies like Standard & Poor’s, studied in enough detail to catch all the high-risk mortgages they included. At this point, your home loan would’ve been difficult to trace except that someone contacts you to continue paying it off. In The Big Short (2015), based on Michael Lewis’ 2010 namesake book, the narrator tells viewers that Ranieri had a bigger impact on their lives than Michael Jordan, iPods® and YouTube® combined, even though no one had heard of him. Lewis adds that the opaque, complicated boringness of high finance is deliberate, as it shields Americans from the reality of Wall Street corruption — in this case, that bankers were getting rich from short-term bonuses by hiding bad, high-risk loans in bundles of seemingly stable real estate investments that were sold to other banks, investors, pension funds, etc. CDOs included the lowest-rated tranches of sub-prime mortgages that banks couldn’t hide in normal MBSs — ones with high variable rates scheduled to go up in 2007. If the reader will pardon one personal word of advice: pay attention to variable versus fixed rates on home mortgages, as the former is subject to the whims of the Federal Reserve.
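To see how a pool of mixed-quality loans could still yield some highly rated paper, here is a deliberately simplified sketch of a securitization “waterfall” (the tranche sizes and dollar amounts are hypothetical; real CDO structures were far more complex):

```python
# Simplified tranche waterfall: cash from the mortgage pool pays the
# senior tranche first, then mezzanine, then equity. Losses therefore
# hit the most junior tranche first, which is how senior slices of a
# risky pool could still be marketed as safe.

def waterfall(cash_collected, tranche_claims):
    """Distribute pool cash to tranches in order of seniority."""
    payouts = []
    for claim in tranche_claims:
        paid = min(cash_collected, claim)
        payouts.append(paid)
        cash_collected -= paid
    return payouts

# Pool owed $100M but only $70M comes in after heavy defaults:
# senior ($60M) is paid in full, mezzanine ($25M) gets $10M,
# and the equity tranche ($15M) gets nothing.
print(waterfall(70, [60, 25, 15]))  # [60, 10, 0]
```

The danger came when defaults ran high enough to eat through the junior tranches and reach the “safe” senior paper, an outcome the ratings implied was nearly impossible.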
Adding fuel to the fire, bankers took advantage of the deregulations regarding leverage ratios and bucket shops to place side bets on the CDOs called synthetic CDOs. The amount of money riding on these unregulated derivatives was about 5x the value of the CDOs themselves. Billionaire investor Warren Buffett called these complicated derivatives “financial weapons of mass destruction” because they were unregulated and only served to encourage reckless investment. It’s safe to say that when 19th-century president Andrew Jackson complained of unscrupulous financiers profiting off the hard-earned money of farmers and craftsmen by simply re-shuffling paper, he scarcely could’ve imagined anything as esoteric as 21st-century Wall Street. President Bush (43) later said that bank CEOs couldn’t explain even their own products to him.
Credit default swaps (CDS) are the second form of financial derivative that got Wall Street in trouble. CDSs insured banks against the failure of their own investments. In another case of unintended consequences, a group of young JPMorgan bankers first conceived them in 1994 as a way to stabilize the system. The House of Morgan recognized the need to insure against loans it made and bonds it underwrote to corporations after the Exxon Valdez tanker accident in Alaska in 1989. But CDSs came of age when banks began to insure against failure of their own mortgage-backed securities as the housing bubble expanded in the first decade of the 21st century. Credit default swaps emboldened banks to continue making money in risky real estate investments even as they knew a bubble was forming because they figured they could insure themselves against the inevitable collapse.
To gauge the quality of loans and bonds (their likelihood of being paid back), investors rely on rating agencies — similar to those that rate our personal credit. As of the early 21st century, these agencies had a solid reputation. However, like accountants and financial officers, agencies like Standard & Poor’s (S&P) and Moody’s had a direct conflict of interest even when they did understand derivatives, because they were paid based on the quantity of loans they certified, not the quality or accuracy of their ratings. Securities with the highest AAA rating doubled in 2006 alone. If one ratings agency didn’t score low-quality debt AAA, banks would simply go to its competitor. Compounding that dynamic were mortgages with mistakes and forged information. Internal memos show one employee at Moody’s Investors Service joking that he’d award a high rating to a security “structured by cows.” Moody’s CEO Raymond McDaniel told his board that the quality of their ratings was the least important factor driving company profits. Rating agencies were earning record-breaking profits by betraying the trust they had built up over generations. The entire rating industry is only as good as the firms’ commitment to detached analysis; otherwise, the economy is worse off than if they didn’t exist to begin with. This was crucial in the early 21st century because supposedly stable pension funds, overseas banks, and even municipalities were loading up on AAA-certified debt, thinking the high rating actually signified safety (low risk) the way it had traditionally.
Some Wall Street banks like Lehman Brothers didn’t get the memo that the ratings had become meaningless, so they larded up on mortgage-backed securities. The subjects of Michael Lewis’ Big Short are small investors who, by actually researching the data and contents of the MBSs and CDOs instead of remaining willfully ignorant, figured out what was going on earlier than the big banks and bet against them — shorted them, in regular stock terminology — with credit default swaps that promised 10:1 returns. They lost money paying insurance premiums as they waited for their predicted downturn in housing. Some of the banks, including Goldman Sachs, also figured out the “jig was up” and bought credit default swaps because they knew a real estate bubble was forming and some of the MBSs held too many “toxic mortgages.” Yet, they continued to push mortgage-backed securities to their customers even as they bet against them by buying default swaps.
The rating agencies were underestimating risk when some of the mortgages bundled into the funds, especially CDOs, were bound to default. Always be wary of precision in forecasts: though their risk calculations were figured down to the third decimal point, the agencies’ forecasts were off by a factor of two hundred. Precision can masquerade as accuracy. Worse, they didn’t factor in that mortgage failures would correlate if the national economy cratered and homeowners defaulted in a cascading effect. S&P shared its rating software with the banks, purportedly for “transparency,” but that only showed the banks exactly how many risky loans they could stuff into the lower-quality, higher-yielding tranches of the CDOs while retaining the high rating. Later, the rating agencies testified before Congress that they never saw the housing downturn coming but, between 2000 and 2007, there were analysts discussing the bubble in the Economist, New York Times, and elsewhere. Google searches for “housing bubble” among the public increased ten-fold between 2004 and ’05.
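The correlation point is the crux, and a toy Monte Carlo simulation can make it concrete (all numbers here are invented for illustration; the agencies’ actual models were far more elaborate):

```python
import random

# Toy simulation: if mortgage defaults are independent, a large pool
# almost never loses more than 20% of its loans. Add a small chance of
# a nationwide "housing crash" that raises every loan's default odds
# at once, and large pool-wide losses suddenly become plausible.

def pool_loss_tail(n_loans=1000, p_default=0.05, crash_prob=0.0,
                   crash_extra=0.30, trials=1000, threshold=0.20, seed=42):
    """Fraction of simulated years in which over `threshold` of the pool defaults."""
    rng = random.Random(seed)
    bad_years = 0
    for _ in range(trials):
        # In a "crash" year, every loan's default probability jumps at once.
        p = p_default + (crash_extra if rng.random() < crash_prob else 0.0)
        defaults = sum(rng.random() < p for _ in range(n_loans))
        if defaults / n_loans > threshold:
            bad_years += 1
    return bad_years / trials

print("independent defaults:", pool_loss_tail(crash_prob=0.0))   # essentially zero
print("correlated defaults: ", pool_loss_tail(crash_prob=0.05))  # roughly 5%
```

The independent-default model is, in spirit, the mistake the raters made: it treats a nationwide downturn as impossible, so the senior tranches look bulletproof.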
No one wanted to discuss the proverbial “elephant in the room” because too many people were profiting from the housing bubble. With values rising, homeowners could finance skiing vacations, cars or renovations by using their mushrooming home equity (value minus mortgage due) as collateral, and investors could flip houses by buying, fixing, and reselling — benefitting real estate agents, contractors, home improvement stores, and reality shows about house flippers. Greenspan and Bernanke’s easy money (low interest) policies weren’t causing inflation in the customary sense of the word, with higher prices on goods, but rather “asset inflation” where people who already owned real estate or stocks were enjoying a boom. Meanwhile, the financial sector could rake in million-dollar short-term bonuses as they ran proud historical firms into the ground while politicians deregulated the industry that funded their campaigns. Banks were borrowing from the Fed at 1% while lending mortgages at 5-6% and repackaging them as ticking time bomb securities sold around the world.
Closer to “Main Street” as opposed to “Wall Street,” banks offered subprime or predatory loans at high interest rates to people without jobs, ensuring short-term profits and long-term defaults. Late-night commercials posed the question: “problems with your credit?” The catch with those car or home loans is that their interest rates are higher even as the borrowers, being poorer, are less likely to pay them off. CDOs owned subprime loans owed by people mortgage lenders jokingly referred to as “NINJAs,” for no income, no job (or assets). Unregulated lenders got commissions for signing people up and passed on ownership of the mortgages. Some lenders testified that their bosses didn’t allow them to fill out the applicants’ boxes regarding income or credit rating.
An even bigger problem was middle- and upper-middle class Americans borrowing too much against their home equity as interest rates dropped. Meanwhile, the MBSs and CDOs stuffed with all these loans — subprime or prime — were being traded around among people with no direct stake in the mortgages and with no regulatory oversight in a shadow market. They weren’t bought and sold on common exchanges like regular stocks, bonds or commodities, and no one knew how much overall money banks were betting, even within their own companies (one argument against large size). Fed Chair Greenspan, Clinton’s third Treasury Secretary Larry Summers, and bank lobbyists defeated an attempt by commodities/futures regulator Brooksley Born to make the derivatives market more transparent, saying that the banks knew what they were doing and could regulate themselves.
They did not know what they were doing or, if they did, they didn’t care, because their bonus pay structure provided no motive for the long-term solvency of their banks, let alone the American economy. Many likely suspected ahead of time that the government (taxpayers) would have no choice but to bail them out when the system crashed. Others were just clueless. If you find some of the finance in these paragraphs hard to wrap your mind around, don’t feel bad. The new financial products were too complicated for many bankers, regulators, analysts, and professional investors to grasp, either. Complexity wasn’t the core problem, though; investment bankers gambled too much borrowed money on the housing market, creating systemic risk that threatened the whole economy. Systemic risk, and the relationship between Wall Street and the rest of the economy, is an important concept for citizens and voters because it shapes which financial policies you will favor going forward. For skeptics of systemic risk, the solution was simple: let the banks fail if they screwed up; it was their problem. For believers in systemic risk, it was more complicated. The biggest banks in the 2007-09 Financial Crisis were “too big to fail” not because they were influential and curried favor with politicians (though both were true), but rather because their failure would crater the whole economy. You see, we are the systemic part of systemic risk.
Too much downside risk pooled at the insurers and banks issuing the credit default swaps, like AIG (American International Group). They gambled that the real estate bubble wouldn’t burst but didn’t have anywhere near enough money to pay off the swap holders when housing lost momentum in 2007. An analogy would be the fate of regular insurance companies if everyone’s homes burned or flooded at the same time, when their actuarial models are based on only a few being destroyed at a time. Likewise, the FDIC insures account holders up to $250k if their bank fails, but could never pay everyone if all the banks failed simultaneously. The FDIC can cover only sporadic failures.
Despite all the unregulated trading of credit default swaps, they couldn’t swap risk out of the financial system. Somebody always stood to lose, and the system couldn’t sustain a downturn in the real estate market given the high leverage ratios. The Big Short likens the MBS-CDOs with their tranches of variously rated mortgages to wooden blocks in the game of Jenga. As securities with the most toxic mortgages started to be pulled out, the tower would eventually collapse. Such a correction was nearly inevitable in retrospect. As the graph below shows, it was the price of land more than the structures themselves that skyrocketed in the ’90s and ’00s. The bubble couldn’t burst, argued realtors and speculators, because “God wasn’t making new land.” Soon-to-be Federal Reserve Chair Ben Bernanke said in 2005, “We’ve never had a decline in house prices on a nationwide basis.” Surely, as a student of the Great Depression, Bernanke must have known that real estate dropped 25% in the early 1930s. But what about housing when the economy isn’t in a recession? The reason real estate hadn’t collapsed on its own without a major recession was simply that there had never been a national housing bubble. For reasons that are obvious if you think about it, home prices historically had been tied to inflation and wages except in periods of housing shortages such as after World War II. In the century from 1896 to 1996, the inflation-adjusted value of real estate in America rose only 6% total — roughly the stock market’s average annual inflation-adjusted gain since 1925. Yale economist Robert Shiller was foremost among those who argued that real estate prices couldn’t sustain being untethered from wages and housing supply-and-demand.
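The gulf between those two 6% figures, one total and one annual, is worth pausing on. A quick compounding check, using only the round numbers quoted above:

```python
# Compounding makes "6% total over a century" and "6% per year" wildly
# different claims. Per the figures in the text, real home prices rose
# about 6% in total from 1896 to 1996; a 6% *annual* real return over
# the same span compounds to tens of thousands of percent.

def total_growth_pct(annual_rate, years):
    """Total percent growth from compounding `annual_rate` for `years` years."""
    return ((1 + annual_rate) ** years - 1) * 100

print("real estate, 1896-1996: ~6% total")
print(f"6%/yr compounded for 100 years: {total_growth_pct(0.06, 100):,.0f}% total")
```

That gap is why Shiller and others saw home prices decoupling from wages as unsustainable: housing had never behaved like a compounding growth asset.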
Because of systemic risk, if the big banks had failed due to a sudden market correction in real estate, paychecks would’ve bounced and cash machines would’ve frozen up in an instant, causing a cascading effect of panic, cash shortages, and breakdown of confidence. Undoubtedly, there would’ve been rioting, looting, and violence. As mentioned, the FDIC would not have been able to save those who lost their money in failing banks. These would’ve been the short-term effects, along with a near-total stock market crash. Allowing the banks to go into Chapter 11 bankruptcy would’ve wiped out their shareholders and crashed the stock market more suddenly than in 1929-32. Unlike then, ~66% of shares are held by ordinary Americans, not the wealthy. Defined-contribution plans like 401(k)s, as opposed to defined-benefit pensions, would’ve evaporated, and pensions have their money in the market anyway. No one will ever know for sure what would’ve happened because the government came to the rescue and saved the banks. In the worst-case scenario, things could’ve gotten uglier quicker than in 1929-32. Unlike the early years of the Great Depression, the government would’ve spiraled into deeper debt right away since, unlike ’29-’33, it would’ve been on the hook for unemployment insurance, welfare, etc. As president of the Federal Reserve’s New York branch, Tim Geithner testified that this was a “foam on the runway” type of emergency, alluding to airport protocol in case of a crash. Given the overall percentage of American money tied up in the biggest handful of banks and the importance of the stock market to retirement savings, 5x greater than in 1929, make no mistake: we were peering into the abyss.
This wouldn’t have been the case if the nation’s money had been more dispersed across smaller banks. But as a report from the Federal Reserve’s Dallas branch showed, the percentage of the financial industry controlled by the five biggest banks grew from 17% to 52% in the forty years between 1970 and 2010. Under Chairman Greenspan, the share of the economy controlled by these five banks had already grown to 45% by 2005, and former industrial stalwarts like General Electric and General Motors turned increasingly into financial institutions (financing loans on cars, appliances, etc. is how many manufacturers turned a profit). Banks even nearly doubled their representation in the S&P 500 (U.S. large-cap stock market), from 9% to 17%, in the four years after the crisis they were at the center of. Not only had a lot of the country’s wealth concentrated in a small number of banks, but sheer size makes banks more difficult to manage. Unregulated shadow banks maintained books so complicated that the Fed couldn’t save them through traditional “lender of last resort” methods. The New Bankruptcy Law of 2005 made shadow banks more appealing to large banks because it put them first in line to collect if they went under. Hundreds of smaller mortgage lenders also went out of business. Among the big banks, Bank of America and JPMorgan bought Merrill Lynch and Bear Stearns, respectively, when they started to fail; competitors bought Washington Mutual and Wachovia; and the government bailed out AIG. However, Lehman Brothers didn’t have enough collateral to interest any prospective buyers, including the government.
After all the decades of controversy and debate over government intervention since the 1930s, free-market Republicans quickly (and wisely for believers in systemic risk) abandoned their commitment to laissez-faire when the pressure was really on. This was a key moment for Republicans to employ their longstanding free-market ideology — the same one that had led to the deregulations of the 1980s-’00s — but instead, as the market collapsed, Bush 43 said, “If this is Hoover or Roosevelt, I’m damn sure going to be Roosevelt!” In September 2008, Lehman Brothers’ bankruptcy sent the stock market into its biggest downturn since 1929 and collapsed the commercial paper market (short-term loans), threatening the ability of companies nationwide to meet payroll. The government realized they had to plug the dike by bailing out the others as markets nosedived. At first, the House of Representatives voted down a bailout, but that precipitated an even bigger downturn in the stock market, eventually bottoming out at a 54% decline by March 2009. After Lehman Brothers went under, Ben Bernanke testified before Congress that the entire banking system would collapse within 72 hours, followed shortly thereafter by the collapse of the global financial system. Bush 43’s last Treasury Secretary, Henry Paulson (former Goldman Sachs CEO), barely slept for weeks as he contemplated the “repugnant measures” needed to rectify the free market (i.e., government intervention). Just as Bush didn’t want to be Hoover, Hank Paulson said “I don’t want to be Mellon,” referring to Herbert Hoover’s Treasury Secretary Andrew Mellon, who’d advocated a hands-off approach to the Great Depression. Where was Ayn Rand when they needed her?
Bush 43 signed the Emergency Economic Stabilization Act that became known as “the bailout.” However, the Federal Reserve’s solution wasn’t a mere bailout. They didn’t just give banks and endangered corporations money, but rather invested in and loaned to them through the unpopular but successful TARP (Troubled Asset Relief Program), which also scooped the most toxic assets out of those institutions. In this way, Bush 43 should’ve given more credit to Hoover, since TARP was loosely based on Hoover’s Reconstruction Finance Corporation (RFC) that Roosevelt made more famous. Billionaire Warren Buffett also deserves some credit, as he helped convince Washington to avoid a pure bailout, which the public was understandably upset about. TARP also aided Detroit’s ailing auto industry, namely General Motors. Meanwhile, the U.S. lent money to foreign banks who’d over-invested in American mortgage-backed securities, though the meltdown had a ripple effect in Europe that outlasted the worst danger at home (i.e., the Eurozone Crisis). The crisis also boosted China in the overall balance of economic power, as mentioned above in the trade section.
Luckily, the U.S. merely sank into the Great Recession rather than the aforesaid abyss. Moreover, the rescue came at little cost to the American taxpayer, because the government got the TARP money back and more as the financial crisis passed. Adjusted for inflation, the government at least broke even with TARP. Yet, five trillion dollars — almost a third of the country’s annual GDP — disappeared from the economy, and many Americans lost their jobs (8 million) or homes (6 million) or took pay cuts. The economy slowed as businesses and households began to “deleverage” or “unwind” (pay down) excessive debt. At the household and small-business level, America remained in a balance sheet recession, with overhanging debt stifling economic growth. The Obama administration faced conservative and populist (Tea Party) opposition to any meaningful stimulus package or mortgage debt relief, and its first Treasury Secretary, Tim Geithner, shared the conservative view of stimuli as overrated “candy.” If Greenspan’s Reagan Revolution deregulation failed in the Financial Crisis, its wake nonetheless saw no return to the Keynesian economics of the New Deal (Chapter 9), or at least not on as big a scale. Geithner was a primary architect of TARP in his role at the Fed’s New York branch. He gives conflicting testimony in his otherwise sound account, Stress Test: Reflections on Financial Crises (2014), arguing that he wanted a bigger stimulus package. In any event, that didn’t happen, with the $830 billion stimulus (around 63% spending and 37% tax cuts/rebates) equaling ~15% of the New Deal’s size as measured by percent of GDP adjusted for inflation (6% vs. 40%).
The 2009 Stimulus Package (the American Recovery and Reinvestment Act) helped stave off a worse slowdown but gave way shortly thereafter to Tea Party-inspired budget austerity (belt-tightening). One could say that the government itself began to deleverage, for better or for worse — better because long-term budget forecasts improved; worse because most (or at least liberal Keynesian) economists see recessions as exactly the wrong time to initiate such fiscal conservatism. In their view, budget cuts during a recession cause a net loss because the lack of stimulus spending swells unemployment rolls and lowers tax revenues even as it spares the government, in the short run, from going deeper into debt.
After the 2007-09 meltdown, consumers, companies, and government all struggled to pay down debt during a slow, gradual recovery. Many small businesses laid off workers as they could no longer borrow from suddenly conservative, risk-averse banks. Mortgages that were too easy to qualify for a few years prior were now overly difficult, making it harder for first-time buyers to come up with a down payment, which often required 20% down instead of 10%. Many existing homeowners found themselves “underwater,” meaning that their mortgages were more than their homes were now worth. According to the Federal Reserve, average middle-class net worth plummeted from $126k to $77k between 2005 and 2010. According to Pew Research Center, the Great Recession nearly halved the wealth of lower-income white and middle-income black and Hispanic families. Real estate markets cratered in over-built areas like Las Vegas, Phoenix, and Florida. Budgets shrank at public and private institutions alike. Indebted consumers without access to credit spent less, creating the familiar downward recessionary cycle. Many retirees lost their savings; others nearing retirement delayed it and worked longer. As of 2015, workers’ wages were $4k less (inflation-adjusted) than before the crisis. Like the Okies of the Dust Bowl and Depression, the 2010s saw van dwellers: retirees living in RVs who roamed the country in search of low-paying seasonal work, as chronicled in Jessica Bruder’s Nomadland (2017; film 2020), which featured Amazon’s seasonal CamperForce workers.
Like the Great Depression, the Great Recession was global, with European banks having larded up on low-interest loans and real estate. In his sequel to The Big Short, Boomerang: Travels in the New Third World (2011), Michael Lewis chronicled the whipsaw effect of the Wall Street meltdown in Iceland, Ireland, Greece, and Germany. Central banks in the U.S. (the Fed) and across the world madly tried to stem the tide by infusing cash into the system and encouraging lending with low rates. By buying up long-term Treasuries, mortgage-backed securities, and other bonds at a staggering rate under Bernanke’s Quantitative Easing (QE) program, the Fed stuffed $85 billion a month into the American banking system for several years (totaling $3.5 trillion of “funny money” debt). That’s why the banks got even bigger after the crisis than before. Yet because lending got more conservative, they weren’t circulating all that cash into the economy. Instead they hoarded it and, like pigs at the trough, even used it to pay million-dollar bonuses to themselves.
By keeping interest rates so low, the Fed enticed wealthy Americans and corporations to borrow cheaply and invest more in stocks and real estate, re-inflating those asset classes and increasing wealth inequality. The Fed’s stimulus had no real trickle-down effect, and when the Fed threatened to gradually inch rates back up, the stock market threw a “taper tantrum” and started falling because, like a child with candy, it had grown addicted to the Federal Reserve for its sugar high. Companies didn’t use the money to pay their workers more or invest in their businesses, except in automation to replace workers. When the Fed keeps rates low, corporations borrow money by issuing bonds at low rates and use the cash to buy back stock, which drives their stock price up and makes them more money than the borrowing cost. When the Fed eventually added corporate bonds to the government bonds and mortgage-backed securities it was already buying in exchange for all the cash it was funneling into the banking system, it reinforced the feedback loop of mutual dependence between the stock market and the Federal Reserve. Big stimulus spending during COVID-19 had the same effect, though it garnered less media attention because, by then, the market-Fed feedback loop was a well-oiled machine.
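The debt-financed buyback mechanism described here can be illustrated with a toy calculation. All figures below are invented for illustration (no real company or rate is modeled); the point is that earnings per share rise even though the underlying business earns nothing new:

```python
# Toy illustration of a debt-financed stock buyback, using invented numbers.
earnings = 1_000_000_000   # $1B annual earnings (unchanged by the buyback)
shares = 100_000_000       # shares outstanding before the buyback
price = 50.0               # share price
borrowed = 500_000_000     # $500M raised by issuing bonds at a low rate
interest_rate = 0.02       # 2% annual borrowing cost

shares_repurchased = borrowed / price        # 10 million shares retired
interest_cost = borrowed * interest_rate     # $10M/year in new interest expense

eps_before = earnings / shares
eps_after = (earnings - interest_cost) / (shares - shares_repurchased)

print(f"EPS before buyback: ${eps_before:.2f}")  # $10.00
print(f"EPS after buyback:  ${eps_after:.2f}")   # $11.00
```

Earnings per share jump 10% despite a small hit to total earnings, because the share count shrinks faster than the interest cost grows. As long as borrowing stays cheap, the arithmetic favors the buyback.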
While TARP, QE, and COVID stimulus succeeded as emergency-room operations, saving the patient’s life by staving off a collapse of the financial system, they didn’t heal the underlying illness or stimulate any dramatic recovery. The economy stepped back from the abyss but remained stagnant for quite a few years in comparison to pre-2008 levels. The government, with both political parties taking money from Wall Street donors, still didn’t break up the big banks to preclude systemic risk, limit executive bonuses or compensation, regulate lobbyists, reform the credit agencies, or even attach many strings to the bailout. In fact, the big banks used some of the bailout money to buy up distressed smaller banks. And the ringleaders who broke the law by misleading investors didn’t go to prison, though the principal banks were fined $190 billion collectively, money that came out of shareholders’ pockets rather than executives’ own salaries. After JPMorgan Chase CEO Jamie Dimon arranged a $13 billion out-of-court settlement with the Department of Justice (DOJ), his ecstatic and relieved board gave him a 74% raise, hiking his salary to $20 million. They hadn’t expected such a small penalty. If TARP was emergency heart surgery and the banks were the patient, the patient was fine and feeling good within a couple of years, gorging on cheeseburgers and smoking a pack a day.
Part of the problem is that certain bankers excel at violating the spirit of the law without technically breaking it. Moreover, Obama’s Attorney General Eric Holder had issued the Holder Doctrine as Deputy Attorney General in 1999, basically a legal variant of the Too-Big-To-Fail doctrine, arguing that, like bankruptcy, too much prosecution of bankers posed systemic risk. Holder negotiated the slap-on-the-wrist with JPMorgan Chase. After he resigned, Holder returned to Covington & Burling, a law firm whose clients include Bank of America, Citigroup, and Wells Fargo. They even kept his office for him while he was away. Democrats didn’t raise much of a fuss because Holder was Obama’s AG, while Republicans either favored Wall Street deregulation anyway or were so busy bloviating about Obama’s alleged left-wing socialism that they failed to notice the real-life Obama’s right-wing “crony capitalist” connections. The unwillingness of either party to push back on Wall Street inspired Occupy Wall Street on the left and the new Tea Party on the right, which merged with, and essentially took over, the Republican Party. This tirade by Rick Santelli on CNBC is commonly cited as launching the Tea Party — named for the infamous 1773 protest against British tea in Boston Harbor — though the movement’s spirit traces more generally to Sarah Palin’s candidacy in 2008, and it was eventually co-opted by Trump in 2016:
Oddly, given the Tea Party’s supposed populism, Santelli’s rant came on the heels of a relatively small bailout for some mortgage-holders rather than the big bank bailout (some people have argued that the whole bailout would’ve been cheaper had the government just paid off all the bad mortgages instead of rescuing Wall Street). Either way, once the Tea Party/Freedom Caucus hijacked the GOP establishment and took over the House of Representatives, they took no action whatsoever against Wall Street criminals, which in their case would’ve meant legislation rather than prosecution. Their only interest was tapping into the rage of the voters, not the boring finance behind it. Consequently, this whole grotesque miscarriage of justice receded under the public radar as happy investors watched the stock market climb steadily during Obama’s two administrations. The SEC focused instead on corrupt fund managers like Bernie Madoff who, while no doubt felonious, were small fish to fry in comparison with the bank CEOs who threatened and temporarily crippled the economy (Madoff’s crime, though, was a Ponzi scheme rather than toxic CDOs). The banks crashed the stock market, after all, not vice-versa. Credit Suisse executive Kareem Serageldin spent 30 months in prison, but none of the primary culprits did, despite the fact that several committed felonies. Either way, the main factors that caused the meltdown weren’t illegal, since bankers had long since bribed politicians to deregulate.
The 2010 Dodd-Frank legislation that followed from Congress subjected the derivatives market to the same transparency as the regular stock market and forced stricter capital requirements on the banks, with twice as much of their cash safe on reserve (20% not invested, as opposed to 10%) and leverage ratios dropped back down to 16.66:1 as opposed to 30:1 (though still 66% higher than in 2004). Dodd-Frank also created the Consumer Financial Protection Bureau (CFPB) to help protect citizens against fraud. In its first five years, the CFPB returned $11 billion total from banks and financial companies to consumers. However, in a classic case of the fox guarding the henhouse, the banks themselves were put in charge of overseeing the derivatives markets — a $600 trillion untaxed market that contributes virtually nothing constructive to society. To put that in perspective, $600 trillion was ~33x bigger than the entire rest of the “real” American economy outside the derivatives markets. The increased capital cushion requirement hurt small businesses because big banks were less willing to take risks lending to small entrepreneurs. The result was that the big banks were bigger than ever, and they still reaped the profits while taxpayers and investors assumed the risk.
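To see why those leverage ratios matter, consider how small an asset-price decline it takes to wipe out a bank’s equity at each ratio. This is a simplified balance-sheet sketch, not a model of any actual bank’s books:

```python
# Simplified sketch: if a bank holds assets worth L times its equity
# (leverage of L:1), then a fractional asset decline of 1/L erases
# the equity entirely, since losses fall on equity first.

def wipeout_decline(leverage: float) -> float:
    """Fractional fall in asset value that zeroes out the equity cushion."""
    return 1.0 / leverage

for leverage in (10.0, 16.66, 30.0):
    print(f"At {leverage}:1 leverage, a {wipeout_decline(leverage):.1%} "
          f"asset decline wipes out all equity")
```

At 30:1, a mere ~3.3% drop in asset values leaves a bank insolvent, which is why 2008’s housing declines were so lethal; at 16.66:1 it takes roughly a 6% drop, a thicker but hardly generous cushion.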
With the Federal Reserve lending at virtually no interest, the government had shoveled banks around $3 trillion by 2009. Americans naturally resented the fact that the perpetrators of the crisis actually profited from it while everyone else suffered. On the other hand, TARP paid for itself within a few years and really had staved off a worse systemic crisis. Moreover, big banks had dramatically shrunk their involvement in the mortgage-related derivatives trade by 2016. Still, CDOs quietly crept back onto the scene in 2015, re-branded as “bespoke tranche opportunities” (BTOs). The system that led to the crisis more or less remained in place.
Senators John McCain (R-AZ) and Elizabeth Warren (D-MA) introduced legislation to cap bank size in 2013, effectively trying to break up the big banks. Bernie Sanders (I-VT) backed the idea in his 2016 campaign. According to this line of thinking, until such a law is enacted, too big to fail remains part of the economic landscape. Others argue that “bigness” was never really the problem to begin with and that, even with such a law, the overall amount of capital controlled by big banks wouldn’t change; there would just be more of them. Dodd-Frank supporters pointed out that it included procedures for the Federal Reserve to “wind down” problematic banks that failed stringent stress tests and that some banks were breaking themselves up anyway to avoid the legislation.
Dodd-Frank also required that corporations reveal the ratio of pay between their CEO and average employees, though it didn’t require that they limit the ratio. Japanese CEO pay, by contrast, has historically run closer to 15x that of ordinary workers. In the U.S., the ratio of CEO-to-average-employee pay went from 20:1 to 300:1 between 1983 and 2013, and there’s resistance even to the idea that companies should have to reveal these ratios to their boards and shareholders.
As was the case with globalization and healthcare insurance, the public struggled to understand the situation amidst news skewed by political partisanship. It was exceedingly complicated even without that further complication. The Great Recession also caused a class conflict within the GOP between business-oriented “Wall Street” Republicans and “Main Street” Tea Partiers/Freedom Caucus members who resented banks and corporate power despite their social conservatism and dislike of financial (or any government) regulation. During the 2016 campaign, Democrat Hillary Clinton supported reinforcing Dodd-Frank, but WikiLeaks-released transcripts of her paid speeches to Goldman Sachs revealed her suggesting to bankers that Wall Street had been simplistically scapegoated for the crisis as a matter of political necessity, though she stuck with her commitment to Dodd-Frank. Unlike Tea Partiers, democratic socialist Bernie Sanders (I-VT) wanted to regulate and reform Wall Street, but he shared their opposition to the “bailout” and had voted against TARP as a Vermont senator. So, too, congressman and future VP Mike Pence (R-IN) voted against TARP. Sanders and Pence evidently saw systemic risk as a bluff.
Many of the more economically populist Republicans supported Donald Trump in his candidacy as he argued that his own participation in high finance gave him insight into corruption. Trump promised to “drain the swamp” of lobbyists, especially those from Goldman Sachs that he said corrupted Republican rival Ted Cruz and Democratic opponent Clinton, and advisor Steve Bannon credited the financial crisis with giving rise to Trump. However, once he won the presidency, Trump followed in the footsteps of preceding administrations by filling his administration with Goldman Sachs alumni, including White House Chief Strategist Steve Bannon, Treasury Secretary Steven Mnuchin, and National Economic Council Director Gary Cohn. The populist right, in other words, tapped into the public’s rage without offering real solutions other than more of the very deregulation and cronyism that enabled the crisis. Trump hoped to dismantle Dodd-Frank as much as possible. In short, while the financial meltdown gave rise to a populist surge, that populism forgot about complicated Wall Street altogether and focused on other issues like globalization, immigration, and general hatred and suspicion of “experts” and “the government.” Whatever Trump meant by “draining the swamp,” it didn’t include any pushback against Wall Street or its Washington lobbyists.
It remains to be seen whether Dodd-Frank is mere “lipstick on a pig” or can help stave off the next crisis, and whether Republicans will weaken it or Democrats strengthen it. The good news is that TARP worked and didn’t even cost taxpayers money; the bad news is that Americans (and, by extension, the world) still have a system that privatizes gain and socializes catastrophic losses because the biggest banks are concentrated enough to pose systemic risk. Next time, though, there will be less slack in the rope, both politically and financially, as the crisis plays out among a public largely unable to wrap its head around the idea of systemic risk. I’ll now reward my patient, discouraged, and bleary-eyed reader by bringing this subject to a close.
The debates on globalization/trade and finance have been confusing and messy, but not necessarily more partisan than debates of the past. In fact, there was bipartisan cooperation for many years on free trade and promoting home ownership. In each case, a bigger problem is complexity that makes it difficult even for engaged voters to wrap their heads around the issue, even if those same heads weren’t being turned on a swivel by partisan media. Healthcare insurance, though, has been consistently warped by partisanship, with Democrats twice thwarting mandate-based reform (i.e., plans resembling the later Obamacare) in the early ’70s and ’90s and Republicans sabotaging their own creation when it was passed by a Democratic Congress and president. “Democracy,” in the words of former British PM Winston Churchill, “is the worst form of government except all the others that have been tried.” One could plausibly say the same about capitalism as it pertains to free trade and healthcare providers, at least, if not high finance and health insurance.
Instead of ignoring public life or letting yourself degenerate into a state of cynical fatalism, step back and ask yourself whether the American system of democratic capitalism has really failed you in the big scheme of things. We don’t live in a genuine democracy directly responsive to voters but, then again, no one before you or elsewhere ever has either — not in ancient Greece or Rome, modern Europe, or the early United States. Moreover, we don’t know if such an idealized republic would necessarily be better than a political system responsive (first and foremost) to informed lobbyists — as long as the lobbyists are diverse and their quasi-bribery is somewhat transparent. Unlike you, lobbies don’t vote. Besides reading up on issues, educate yourself before voting as to which lobbies fund which candidates on non-partisan sites like those listed below, especially OpenSecrets.org. In the words of Watergate’s famous Deep Throat, “follow the money,” at least as best you can. Who are that candidate’s clients? Also, keep in mind that the U.S. is nearly alone in the world (Bolivia being another exception) in electing judges: 39 of 50 states elect judges rather than appointing them. Here again, follow the money; their campaigns are often financed by corporate lobbies.
Human societies are full of conflict, and the purpose of non-utopian politics isn’t so much to eliminate that conflict as to channel it as constructively as possible. You don’t need to live in a world surrounded by people you agree with about everything to thrive and be happy, at least not in a truly liberal democracy. Historian Henry Adams (John Quincy Adams’ grandson) hit the nail on the head when he said, “politics is the systematic organization of hatreds.” Since America’s big, diverse populace is genuinely divided and wants different things, maybe its system of gridlock interspersed with occasional compromise is the best we can do. It beats civil war. Do your part to override politicians like Congressman Steve King (R-IA), who warned that his side (red states) had more guns — an ignorant, uncivilized, unimaginative, and panicky line of reasoning that lacked faith in America and rejected the best in the Western tradition. That approach passes for politics in some parts of the world, but not here, not among real patriots.
Novelist Gustave Flaubert wrote, “Our ignorance of history causes us to slander our own times.” Hopefully, you’ve learned enough history at this point to understand that yours is not the first generation to confront challenges, nor will it be the last.
Optional Listening, Viewing & Reading:
Frontline: America’s Great Divide: From Obama to Trump, Parts I & II (PBS)
How Will History Books Remember the 2010s? Politico: History Department, 12.27.19
Benjamin Studebaker, “The Ungoverned Globe,” Aeon, 6.20
Tim Alberta, “The Senator Who Decided To Tell The Truth,” Atlantic, 6.30.21
Nick Corasaniti & Reid J. Epstein, “What Georgia’s Voting Law Really Does,” New York Times, 6.25.21
Francis Fukuyama, “Liberalism & Its Discontents.” American Purpose, 10.5.20
Aaron Cavin, “Trade Wars: Collapse of America’s Free Trade Consensus,” Origins, 1.17
Paul Krugman, “Don’t Blame Robots For Low Wages,” New York Times, 3.14.19
Avik Roy, “The Tortuous History of Conservatives & the Individual Mandate,” Fortune, 2.7.12
Michael Hirsh, “Why Trump & Sanders Were Inevitable,” Politico Magazine, 2.28.16
Jonathan Rauch, “How American Politics Went Insane,” Atlantic, 7-8.16
Alex Blumberg & Adam Davidson, The Giant Pool of Money (Podcast: This American Life, NPR), 5.08
David Greenberg, “How Roger Ailes Created Modern Conservatism & How Donald Trump Upended It,” Politico, 7.20.16
Jedediah Purdy, “A Billionaire’s Republic,” Nation, 7.11.17
Johann Neem, “The War On Christmas is a Civil War,” USA Today, 12.22.17
Bill Dupor, “The Recovery Act of 2009 vs. FDR’s New Deal: Which Was Bigger?” (St. Louis Branch, Federal Reserve), 2017
Clive Thompson, “What the Founding Fathers’ Money Problems Can Teach Us About Bitcoin,” Smithsonian, 4.18
Saul Cornell, “What the ‘Right to Bear Arms’ Really Means,” Salon, 1.15.11
Laura Sullivan, “As China Hacked, U.S. Businesses Turned A Blind Eye,” NPR, 4.2.19
Peter Beinart, “U.S. Trade Hawks Exaggerate China’s Threat,” Atlantic, 4.21.19
John McDonough, “Republicans Have Stopped Trying to Kill Obamacare. Here’s What They’re Planning Instead,” Politico, 4.26.22
Carmen & Vincent Reinhart, “The Crisis Next Time: What We Should Have Learned From 2008,” Foreign Affairs, 12.20
Matthew Pressman, “America’s Biggest Newspaper 70 Years Ago Sounded A Lot Like Trump Today,” Atlantic: Ideas, 5.10.19
Jeff Greenfield, “How Orwell Diagnosed Democrats’ Culture War Problem Decades Ago,” Politico, 4.19.22
Yoni Appelbaum, “How America Ends,” Atlantic, 12.19
Ian Buruma, “The Great Wall of Steel: Xi Jinping Remakes Chinese Nationalism,” Harper’s, 2.22
Derek Thompson, “Why the Age of American Progress Ended,” Atlantic, 12.9.22
Non-Partisan Political Information
GovTrack.us (Civic Impulse)
PolitiFact (Non-Partisan B.S. Meter — Winner of Pulitzer Prize)
FactCheck.org (Non-Partisan Annenberg Policy Center)
Media Bias/Fact Check (MBFC News)
Project Vote Smart (Just the Facts)
OpenSecrets.org (Center for Responsive Politics)