Gridlock & Globalization

Republican Elephant & Democratic Donkey

Rotating Globe Flag, 2012, Meclee-WikiCommons

In the late 20th and early 21st centuries, America continued to drift in a conservative direction economically as unions weakened, workers worked longer hours for less overtime pay, Wall Street banks grew larger relative to the rest of the “real” economy, and class lines hardened to the point that America had less upward mobility than European countries. Multinational firms transcended borders after the Cold War, cashing in on capitalism’s global victory. Religiously, Christian Fundamentalism spread from the Bible Belt across the country and, politically, the Reagan Revolution kept liberals on the defensive. Taxes and faith in government stayed low, with only a tenth as much spent on infrastructure as fifty years earlier (0.3% of GDP vs. 3%), and congressional approval ratings dropped to all-time lows as corporate lobbies brazenly bought off legislators. Ronald Reagan emboldened conservatives in the same way that FDR’s New Deal emboldened liberals a half-century earlier. And, just as FDR would’ve found some of LBJ’s Great Society too liberal, Reagan wouldn’t be conservative enough today to run as a Republican. By the mid-2010s, Western countries were shifting away from the traditional left-right economic spectrum that had defined politics for a century toward a dichotomy between those who embraced globalization, diversity, and the postwar Western alliance (NATO, EU) and those with a more nationalist viewpoint represented by Donald Trump in the U.S., Theresa May in Britain, Geert Wilders in the Netherlands, Viktor Orbán in Hungary, Marine Le Pen in France, and Jarosław Kaczyński in Poland. We’ll say more about the potential dissolution of NATO and the European Union in the next chapter. The economic issues that defined the old spectrum were still more important than ever, though, as gains from increased productivity flowed almost exclusively to the wealthy, frustrating the middle classes and making it hard for the right to argue the merits of trickle-down economics convincingly to a broad GOP coalition.

Still, in domestic politics, the GOP moved to the right and the Democrats followed, moving partway right on economics but not on culture, and the gap between the two parties grew due to the end of the Cold War (removing a unifying cause), media fragmentation, enhanced gerrymandering, and greater (not less) transparency on Capitol Hill. All this amplified the partisanship that’s been a mainstay of American democracy, creating near-dysfunctional political gridlock in Congress, worsened by increased parliamentary filibustering that can require 60-vote supermajorities to pass Senate bills. Some gridlock is a healthy and natural result of the Constitution’s system of checks and balances, though the actual term was invented in the early 1970s to describe New York City traffic. However, too much gridlock disrupts the compromises that keep the political system functioning. For instance, the bipartisan Simpson-Bowles plan to balance the budget long-term with small compromises on both sides never even made it out of committee and likely wouldn’t have passed if it had. Meanwhile, among voters, rifts that opened in the combustible 1960s evolved and hardened into the “culture wars” of the last fifty years.

Legal Pot Dispensary, Colorado, KCTV

Pink AR-15 Semi-Automatic

Complicating and dovetailing with these culture wars was a libertarian pushback against regulations on guns, drugs, and sexual orientation. Taking advantage of an omission/loophole in the 1934 National Firearms Act — which requires FBI background checks, national database registration, fingerprinting, a photo, and fees for fully automatic machine guns, silencers (until recently), hand grenades, missiles, bombs, poison gas, short-barreled rifles, and sawed-off shotguns — gun lobbies staked out a place among civilians for semi-automatic assault rifles like the AR-15, with some marketing even aimed at children (technically their parents). An ATF-approved kit could “bump-fire” them, effectively converting them into fully automatic machine guns. Despite no evidence that the U.S. was poised to invade itself, the National Rifle Association (NRA) promoted assault rifles as a way for citizens to raise arms “against a tyrannical government run amok.” When protesters combined this Second Amendment right in open-carry states with the First Amendment right to free speech, law enforcement was “outgunned” at Nevada’s Cliven Bundy Standoff (April 2014) and Charlottesville’s Unite the Right Rally (August 2017), though in neither case did protesters open fire. Current interpretations of the First and Second Amendments, in other words, were on a collision course in open-carry states like Texas with no special-event restrictions. Meanwhile, while overall crime rates fell in the U.S. in the early 21st century, killers used assault rifles in mass shootings in Aurora (2012), Newtown (2012), San Bernardino (2015), and Orlando (2016), and in an attack on twelve police officers in Dallas (2016). While the country embraced military weapons for civilians, it simultaneously went in a more liberal direction on many social issues, including legalization of marijuana in some states and same-sex marriage everywhere (Chapter 17), and a generally more inclusive outlook among the young.

LGBT Flag In Castro District, San Francisco, Photo By Benson Kua-WikiCommons

If Americans today aren’t more divided than usual, they are at least better sorted by those who stand to gain by magnifying their disagreements (e.g. cable TV manufactured a previously non-existent “War on Christmas”). And they’ve sorted themselves better than ever, often into conservative rural areas and liberal cities, reminiscent of the rural-urban divides of the 1920s. As we saw in the previous chapter’s section on gerrymandering, this geographic segregation results in partisan districts of red conservatives and blue liberals, with some interspersed purple districts that defy categorization. The fragmented and partisan media encourages and profits from animosity between them, selling more advertising than it would if politicians cooperated and citizens respectfully disagreed over meaningful issues. A 1960 poll showed that fewer than 5% of Republicans or Democrats cared whether their children married someone from the other party; a 2010 Cass Sunstein study found those numbers had reached 49% among Republicans and 33% among Democrats. This trend might not continue, as polls show that the actual brides and grooms (Millennials) are less rigid ideologically than their parents. In 2014, Pew research showed that 68% of Republican or Republican-leaning young adults identified their political orientation as liberal or mixed, and similar polls show some young Democrats identifying as conservative (it’s also possible that many young people don’t know what liberal or conservative mean). But more than ever, politicians struggled to please voters who disliked each other and, like children, were both defiant toward and dependent on government. Americans couldn’t agree on much, but many felt aggrieved and had “had enough” even if they weren’t well-informed enough to know what exactly they’d had enough of. And those who knew what they’d had enough of didn’t agree with each other.

Amidst this hullabaloo, Tweeting®, and indignation, historians hear echoes of earlier periods in American history. Large, diverse, free-speech democracies are noisy and contentious countries to live in, as you’ve already seen from having read Chapters 1-20. Partisan media is a return to the 18th and 19th centuries, while today’s cultural rifts seem mild compared to more severe clashes in the Civil War era, 1920s, and 1960s-’70s. In the early 21st century, partisanship complicated and clouded debates over globalization, healthcare insurance, and high finance that would’ve been complicated enough to begin with. These are the three primary areas we’ll cover below, with connective tissue in between.

Pew Poll Showing Increasing Partisanship

The American economy continued on a path toward increased globalization and automation that began long ago, with American labor competing directly against overseas workers and robots. Information technology assumed a dominant role in most Americans’ jobs and lives, as traditional manufacturing jobs were increasingly outsourced to cheaper labor markets or displaced by automation, compromising middle-class prosperity. Studies showed that more jobs were lost to automation (~85%) than to outsourcing (~15%), even though the U.S. lost over 2 million jobs to Chinese manufacturing in the first decade of the 21st century. The verdict isn’t in but, proportionally, the digital age hasn’t yet translated into the domestic job growth that accompanied the steam engine, railroad, electricity, or internal combustion engine, and Wall Street’s expansion hasn’t been accompanied by growth in the “real economy.” In the information technology sector, Apple, Microsoft, Google, and Amazon employ only 150k people between them — less than the total number of Americans entering the workforce each month. Unlike at Sears in the 20th century, when you place an order with Amazon, the workers scurrying around the warehouse floor to fill it aren’t clerks on roller skates; they’re robots. Automation and digitization have made businesses more efficient than ever, and American manufacturing is stronger than naysayers realize — still the best in the world — but it provides fewer unskilled jobs. Efficiency is a double-edged sword; sometimes technology destroys jobs faster than it creates others. If automated trucks displace our country’s drivers over the next 10-20 years, it’s unlikely we’ll find another 1.7 million jobs for them overnight. It’s a tough labor market for people without at least some training past high school in college, a trade school, or the military, and robots are displacing white-collar and blue-collar workers alike. Yet many jobs remain unfilled, and high schools focusing on Career & Technical Education (CTE) are gaining traction to fill gaps. Despite increased economic productivity, wages have remained largely flat since the 1970s (relative to inflation) except for the wealthy. As they struggled to “keep up with the Joneses” or just pay bills, Americans borrowed against home equity, and the average ratio of household debt to disposable income doubled between 1980 and 2015, despite still being relatively low by international standards.

America From Space, NASA

Most likely, the dynamic American economy will adjust as it always has before. Karl Marx feared that steam would spell doom for human workers and John Maynard Keynes feared the same about fuel engines and electricity. A group of scientists lobbied Lyndon Johnson to curtail the development of computers. From the standpoint of employers or the free market, robots are more efficient than humans: they never complain, show up late, get sick, join unions, file discrimination suits, or demand pensions or health insurance. So far, at least, these fears of being taken over by robots haven’t been realized on a massive scale, but automation has gained momentum since Marx and Keynes and is well on its way to posing a significant economic challenge. Still, for those with training, America’s job market remains healthy, with unemployment under 5% as of 2016. Humans are unlikely to go the way of the horse, partly because democratic societies have more power over robots than horses had over engines. Hopefully, physicist Stephen Hawking and sci-fi writers are wrong about that!

Globalization: Pro & Con
Globalization didn’t start in the 20th century. The trend dates to the 15th century in terms of maritime trade and even earlier with overland routes, and trade has always been a controversial and important part of American history. Free-trading colonial smugglers resented Britain’s restrictive, mercantilist zero-sum trade policies; protectionist Alexander Hamilton aimed to incubate America’s early industrial revolution with tariffs; and trade disputes and embargoes drove the U.S. into the War of 1812. Tariffs were a divisive enough issue between North and South in the 19th century to be a meaningful if secondary cause of the Civil War behind slavery. The U.S. then had high tariffs as the Industrial Revolution kicked into high gear after the Civil War (Chapter 1), but protectionism was widely interpreted as worsening the Great Depression after the Smoot-Hawley Tariffs (Chapter 8). When the U.S. and Britain set out to remake the world economy in their image after World War II and avoid more depressions (Chapter 13), they strove to encourage as much global trade as possible, though, in reality, virtually no countries favor pure, unadulterated free trade. All democratic countries, including those that signed on to the General Agreement on Tariffs and Trade (GATT) in 1947, have voting workers back home demanding favoritism, and each country looks to strike the best deals possible. France, for instance, qualified its inclusion in GATT with “cultural exceptions” to help its cinema compete with Hollywood imports, and it maintains high agricultural tariffs. Translation: the upside of trade is great, but other countries can’t undermine farm-to-market La France profonde, or “deep France,” with cheap wine, bread, and cheese.

The Return to Amsterdam of the Second Expedition to the East Indies on 19 July 1599, by Andries van Eertvelt, ca. 1610-20

By the early 1990s, a near consensus of economists favored free trade, and globalization threatened America’s working classes more than ever. Competition, outsourcing, and automation had weakened manufacturing labor relative to the rest of the economy, undercutting America’s postwar source of middle-class prosperity and upward mobility for blue-collar workers. Democrats had supported unions since the New Deal of the 1930s and generally supported a Buy American protectionist platform to help workers, including trade restrictions and taxes (tariffs) on imports. They were the American version of the French farmers, in other words. Tariffs are the primary tool of protectionism, discouraging free trade and favoring workers from one’s own country. This is more complicated than it might seem, though, because Buy American helps some workers and not others, especially those who work in export industries susceptible to retaliatory tariffs from other countries (e.g. southern cotton exporters in the 19th century). Moreover, tariffs artificially raise prices for everyone. Buy American was also an awkward topic for mainstream Republicans because they fancied themselves the more patriotic of the two parties but had mostly supported free trade over the years to boost corporate profits. However, tariffs and protectionism aren’t the only stories; there’s also the issue of how fair trade agreements are once countries agree to trade. They aren’t one-page contracts that declare “No rules whatsoever. It’s a free-for-all” in large font above a picture of a handshake emblazoned over a Maersk container ship. They’re more like legal documents hundreds of pages long that make it difficult for the average citizen to parse what they really include.

NAFTA Logo

As we saw in the previous chapter, Bill Clinton’s embrace of free trade created a window of opportunity for Ross Perot to garner significant third-party support in 1992, and Hillary Clinton’s ongoing support of globalization along with mainstream Republicans partially explains Donald Trump’s appeal in 2016. In 1992, Bill Clinton wanted open trade borders with the United States’ neighbors to the north and south, Canada and Mexico, and, in 2000, he normalized trade relations with China. With the two major candidates, Clinton and George H.W. Bush, supporting free trade in 1992, that left the door open for a third-party candidate to focus on the outsourcing of labor. In his high-pitched Texan accent, Ross Perot quipped, “Do you hear that giant sucking sound? That’s your jobs leaving for Mexico.” He focused on Mexico because the issue at hand was whether the U.S., Canada, and Mexico would open their borders to each other for freer trade through NAFTA, the North American Free Trade Agreement (logo, left). George H.W. Bush had agreed to NAFTA and Clinton, too, promised to push it through the Senate. Perot was wrong about huge numbers of jobs leaving for Mexico, but a lot of American manufacturing and customer-service jobs left for China, India, Vietnam, and other places where companies could pay low wages and pollute the environment without concern for American regulations. There was a giant sucking sound, all right; it just went toward Asia instead of Mexico. Meanwhile, workers came north for jobs from Mexico and Central America.

MSNBC Cartoon, 9.1.2003

The pro-globalization argument is that free trade and outsourcing improve profit margins for American companies, boost American exports, and lower prices for consumers while providing higher wages and economic growth in developing countries. Free trade also offers consumers a wider range of products, ranging from BMWs and Samsung electronics to Harry Potter novels. You could drive a BMW built in South Carolina, or a Ford or John Deere tractor built in Brazil. The smartphone in your pocket — or maybe you’re even reading this chapter on it — might come from South Korea but contain silicon in its central processor from China, cobalt in its rechargeable battery from the Congo, tantalum in its capacitors from Australia, copper in its circuitry from Chile or Mongolia, plastic in its frame from Saudi Arabian or North Dakotan petroleum, and software in its operating system from India or America. Another pro-globalization argument is that it creates more jobs than it destroys, as foreign companies who otherwise wouldn’t operate in the U.S. open plants and hire American workers. Japan’s Honda builds in America almost all of the cars and trucks it sells here. In 2014, Silicon Valley-based Apple started making Mac Pros® at Singapore-based Flextronics in northwest Austin, creating 1,500 jobs.

McDonald’s, Osaka City, Japan, 2005

Opponents of globalization point out that American manufacturers are undersold, costing jobs and lowering wages as companies exploit and underpay foreign workers. As recently as 2009, Barack Obama’s stimulus package included a Buy American provision. Are such provisions beneficial to the American economy? For at least some workers, yes. When jobs go overseas, they lose theirs and labor unions lose their hard-earned bargaining power. But tariffs keep prices artificially high on products, costing other workers. With free trade, other workers make more money because their companies make more and, in theory, that “trickles down” to all of us “invisible beneficiaries.” More directly, it lowers prices. The U.S. could put a tariff on clothing, for instance, and that could save 135k textile jobs. But it would also raise the price of clothing, a key staple item, for 45 million Americans under the poverty line. Globalization, in sum, is why your smartphone didn’t cost $2k but also why you can no longer make good union wages at the local plant with only a GED or high school diploma. New workers at General Electric’s plant in Louisville earn only half of what their predecessors did in the 1980s.

Port Elizabeth, New Jersey

Return to phones and electronics as an example of globalization’s pros and cons. We take for granted the lower prices that importing and outsourcing make possible, and might not notice the increased productivity our devices allow for on the job, but we do notice that Americans aren’t employed assembling them. Not only are workers overseas paid less than they would be in America, Apple lays off or adds 100k workers at a time at its Chinese facilities — mobilization on a scale the U.S. hasn’t seen since World War II. But cheap prices are a huge benefit. In Walmart’s case, its lower prices have curbed inflation in the U.S. over the last 35 years. (Walmart also saved money by selling bulk items, using bar codes and wireless scanning, and coordinating logistics with suppliers — all now customary in retail.) Free trade and outsourcing also help stock returns because large American corporations not only can make things cheaper, they also do half of their business overseas. The stock market not only helps the rich but also workers with company pensions and 401(k)’s that rely on growing a nest egg for retirement.

Walmart in Quanzhou, China

Most importantly, the U.S. exports too, and when it puts up protective tariffs other countries retaliate by taxing American goods. That happened most famously when the aforementioned Smoot-Hawley Tariff of 1930 worsened the Depression, stifling world trade. The shipping company pictured below, UPS, is based in Atlanta, and it boosts America’s economy to have it doing business in Italy. No globalization; no UPS gondolas piloted by a gondolier checking his cheap smartphone. Globalization, then, is a complex issue with many pros and cons, some more visible than others. For a bare-bones look at the downside of globalization, view the documentary Detropia (2012), which traces the decline of unionized labor and manufacturing in one Rust Belt city, or just look at the dilapidated shell of any abandoned mill across America. There aren’t any comparable documentaries concerning the upside of globalization since that’s harder to nail down. When it comes to what psychologist Daniel Kahneman called “fast and slow thinking,” we can see the downside of globalization in five seconds but might need five minutes to really think through the upside.

UPS Boat in Venice, Italy

The 1992 campaign drew attention to globalization, as did the protests and riots at the 1999 World Trade Organization conference in Seattle. The WTO is the successor to GATT, part of the economic framework the West created after World War II, along with the World Bank and International Monetary Fund, to stimulate global capitalism. The rioters were protesting the WTO’s free trade policy and the tendency of rich countries to lend money to emerging markets with strings attached, sometimes mandating weak environmental regulations and outlawing unions. Working conditions often seem harsh and exploitative from a Western perspective, even if the job represents a relatively good opportunity from the employee’s perspective. At the WTO riots, protesters threw bricks through the windows of chains like Starbucks that they saw as symbols of globalization.

WTO Protests in Seattle, 1999

Today the outsourcing trend is reversing somewhat, as more manufacturing jobs return to the U.S. Some companies, like General Electric, realize that they can monitor and improve assembly-line efficiency better close to home, while other factors include the rising cost of shipping and rising wages in countries like China and India, which are finally starting to approach those of non-unionized American labor. Yet insourcing can also include foreign workers. Under H-1B non-immigrant visas, companies can hire temporary immigrants to do jobs for which there are no “qualified Americans.” Sometimes it’s true there aren’t qualified Americans, but other times that’s a farce. Due to lax oversight, some companies started to define qualified as will do the same job for less money. In 2016, Walt Disney (250 in Data Systems) and Southern California Edison (400 in IT) fired hundreds of American employees and even required them to train their replacements from India before picking up their final paychecks. Corporate America is currently lobbying Congress to loosen H-1B restrictions. In the 2016 campaign, Trump vowed to get rid of the H-1B visa program.

International GDP Growth & Population

Globalization continues to be a controversial topic in American politics and will be for the foreseeable future. The 2016 election saw two populist candidates, Trump (R) and Bernie Sanders (D), opposed to free trade, and they even pressured Hillary Clinton (D) into taking an ambiguous stand against President Obama’s Trans-Pacific Partnership, which loosened trade restrictions among twelve Pacific Rim countries — including the NAFTA partners (U.S., Canada, Mexico) — that together constitute 40% of the world economy. If not opposed to trade outright, Sanders and Trump at least wanted to rework the terms of the agreement, though it’s difficult in multilateral (multi-country) agreements to have each country go back to the drawing board because then the others might want to renegotiate and the whole process gets drawn out. When Trump became president and pulled the U.S. out of the TPP, other nations started negotiating their own trade agreements. One point of TPP was to check China’s growing hegemony in Asia by striking agreements with its neighbors but not with China itself. With America abandoning its place at the negotiating table, China has assumed a leadership role in brokering these negotiations from the outside, leveraging its influence as a low-cost manufacturer.

Source: Gallup Poll

American farmers who export meat, grain, wine, and dairy products to food importers like Vietnam and Japan didn’t fully think things through when they supported protectionism and opposed TPP in the 2016 election, just as they didn’t realize how much corn they export to Mexico when denouncing NAFTA. When corn prices plummeted after Trump threatened to dismantle NAFTA, he agreed to pull back and renegotiate instead, for the time being. It was paradoxical to have all the candidates in 2016 more or less support protectionism at a time when Gallup polls showed that 58% of Americans (above) and nearly all economists favored free trade. Such polls are confusing because many voters might not fully understand the give-and-take and simply favor all the advantages of protectionism combined with all the advantages of free trade, or some might just want trade terms renegotiated for a “better deal” even if they’ve never read the existing terms (TPP, after all, got rid of tariffs that were hurting American exporters; without TPP, they’ll have to pay them). Polls likewise show that voters simultaneously want better benefits/services and lower taxes; the fact that they can’t get both is part of what drives their resentment toward government. With trade, Trump would like to get out of broad multilateral agreements and renegotiate bilateral, one-on-one “beautiful deals” with each country that favor America.

Source: Asian Development Bank / POLITICO Research

However, experts point out that if one country overplays its hand in that regard, other countries will ice it out and sign separate agreements with each other (above) — thus the advantage of multilateral pacts. Countries are more willing to lower their own tariffs if doing so gives them access to multiple countries, not just one. A new TPP-11 (led by Japan but excluding the U.S. and, still, China) formed immediately after Trump’s withdrawal announcement on January 23, 2017. By mid-2017, New Zealand, Australia, Canada, and the EU had started negotiating lower tariffs with Asian food importers, hoping to undersell American farmers. With the U.S. and post-Brexit Britain on the sidelines, Europe renegotiated with Vietnam, Malaysia, and Mexico, and Japan offered the EU the same deal on agricultural imports that it took the U.S. two years to hammer out during the TPP negotiations. An alliance of Latin American countries including Mexico, Peru, Chile, and Colombia formed their own Pacific Alliance to negotiate with Asian countries while China, sensing blood in the water, formed a 15-country regional partnership in Asia to rival TPP-11.

Politics Get Personal
The 1992 campaign also kicked off an era in which the public and media have been consumed with candidates’ personal biographies more than ever before. Previous politicians were from the WWII era and came of age when journalists, for the most part, didn’t pry into politicians’ sex lives or religions. In JFK’s case, his Catholicism was controversial, but not his rampant adultery. By 1992, though, Baby Boomers like Clinton were running for office and people wanted to know more about their backgrounds. Clinton was an inveterate philanderer and women came forward claiming to have had affairs with him. Some were no doubt gold diggers, but there was too much smoke for no fire. And what was Clinton doing in the 1960s? Was he in Vietnam? Was he a protester? Clinton was at Oxford on a Rhodes Scholarship and had smoked pot, but “not inhaled.” He avoided service toward the end of the Vietnam War through deferments, including one to study another year overseas. Republicans jumped on Clinton for being a draft dodger, but it came back to haunt them eight years later when George W. Bush (Bush 43) ran for president and people discovered he’d avoided Vietnam by going into the National Guard. Donald Trump got college deferments and a medical disqualification (1-Y/4-F) for short-term heel spurs. Unlike today, the Guard didn’t fight overseas in the 1960s. The GOP had a photo of Clinton from the early ’70s with a beard — enough to spin him as anti-establishment.

Bill Clinton Meeting John Kennedy, 1963

For his part, Clinton embraced more universally popular visions of the 1960s, including that of John Kennedy and the Civil Rights movement. The argument over which sixties cut both ways, though, because many voters had grown up in that era and not all of them necessarily held it against Clinton that he’d had a beard, smoked pot, or hadn’t fought in Vietnam. In the three-way race, Clinton won the Electoral College and a plurality of the popular vote (43%), with independent Ross Perot garnering nearly 20% despite having temporarily dropped out of the race mid-campaign. While many commentators assume that Perot’s presence hurt Bush, there’s no solid evidence that he swung the election toward Clinton. Perot drew voters from both parties, and he attacked Bush more relentlessly than Clinton during the campaign.

Rush Limbaugh

A crusade against Clinton commenced with his election, more passionate and well-funded than those normally launched by political opponents. The same would occur with his successors but never really had on this scale with his predecessors. Such venomous and one-sided dialogue became a hallmark of American politics from the 1990s forward, thriving in an era of cable TV and the Internet. It resulted not just from a changing media environment but also from an increasingly partisan atmosphere in Washington. While most people ultimately want what’s best for the country, committed partisans (at least subconsciously) want things to go poorly for the country for a while when someone from the opposing party is in the White House. Right-wing pundits such as Rush Limbaugh tried to derail Clinton as he ran for office and once he got to Washington. Clinton’s team was young and inexperienced, as well, and the Republicans were able to “throw him off message” once he got to the White House by distracting him from the issues that made him popular among voters, such as his centrist business policies. They forced the issue of gays in the military since they knew it was a no-win issue among the public. Clinton endorsed equal rights for gays in the military during his campaign, but then hedged in office and came up with the muddled “don’t ask, don’t tell” policy.

After stumbling out of the gate on that, Clinton made another strategic error: granting too much responsibility to his wife Hillary in crafting healthcare insurance legislation. He knew Hillary was smart and capable, but some opponents were leery of her as an educated career woman. The couple had even talked of a “two-for-one deal” on the campaign trail, and Hillary was the first First Lady to have an office in the West Wing alongside other senior staff. Other than Eleanor Roosevelt and Edith Wilson (after Woodrow’s 1919 stroke), First Ladies had stayed in the East Wing and avoided a meaningful role in policy. While their role is ambiguous, First Ladies are not elected officials. It’s generally understood that the president’s wife should limit herself to innocuous things like promoting literacy or physical fitness or discouraging drug use among kids, with maybe some ribbon-cutting here and there. At the same time, American healthcare insurance was a system in need of repair.

Healthcare Insurance
While America socialized some healthcare coverage for the elderly in 1965 with Medicare, it has an unusual and spotty arrangement for everyone under 65 whereby employers, rather than the government, are expected to provide workers insurance subsidies, split anywhere from full coverage to a 50/50 match. It stems from WWII, when price controls (to prevent inflation) prevented companies from giving raises, so in a low-unemployment economy they attracted workers with benefits instead, including health insurance. It’s difficult to measure how many Americans die annually because they lack insurance, since it’s impossible to control for multiple factors across a wide population and many people go on and off insurance. The uninsured rarely have preventative checkups that might save them years later. Studies prior to 2009 ranged from 18k to 45k deaths annually, according to factcheck.org. Even using the low estimate of 18k, over a million Americans have died from lack of coverage since WWII.

Nonetheless, the employer-subsidized insurance system works well for many people, especially at big companies. But it leaves others with no coverage and presents complications especially for small businesses, because wider pools lower costs — the reason many countries spread risk among the whole population with cheaper government-run systems. That makes Americans more conservative about sticking with big companies and less likely to start small businesses, hampering entrepreneurship. In America prior to 2013, it was expensive to buy individual coverage if you fell through the cracks and prohibitively expensive if you’d already been sick. In 2000, the World Health Organization (WHO) ranked the United States 37th in the overall efficiency of its healthcare system. Most developed countries have cheaper healthcare with higher overall customer satisfaction, lower infant mortality, and longer life expectancies, but longer waiting periods for non-emergency procedures and less choice of doctors. Teddy Roosevelt advocated universal (socialized) healthcare insurance as part of his Bull Moose campaign in 1912 and Harry Truman did likewise as part of the Fair Deal in 1948, but both initiatives were defeated.

It’s important to distinguish between healthcare insurance and the healthcare itself, which also remains in private hands in America’s system and that of most other countries with socialized coverage (Japan and England are exceptions, along with communist countries). Most countries, and all developed nations outside the U.S., at least partly socialize insurance for those of any age. That way everyone pays in and everyone’s covered. The overall cost is lower per taxpayer than what American employee/employer combinations pay because, unlike profit-motivated private insurance companies, governments operate the system at cost. The young pay for the poor, but grow old themselves; men pay for women’s procedures but don’t complain because they have wives, sisters, and daughters. Yet, contrary to a common assumption among liberals, most other countries don’t have a purely single-payer public system; most supplement basic coverage with optional for-profit insurance companies. What other countries do have are stricter government-mandated price controls on what healthcare providers charge. While often seen as the bogeymen, American insurance companies lack bargaining power with providers (pharmaceuticals, hospitals, doctors, etc.) and can be victimized by high costs, inconsistency, fraud, or unnecessary procedures — costs that they pass on to patients/employers. One American can pay $1k for a colonoscopy while another pays $5k for the same procedure. As of 2012, according to an International Federation of Health Plans survey, MRIs averaged $1,080 in the U.S. and $280 in France. C-section births averaged $3,676 in the U.S. and $606 in Canada. A bottle of Nexium® for acid reflux costs $202 in the U.S. and $32 in Great Britain. Improving technology and over-charging contribute to inflation rates in medicine that always outrun the rest of the economy.

Healthcare & Inflation

In the American healthcare debate, many analysts focus on these high provider costs, while consumers/patients and the political left, especially, focus on glitches in insurance. As mentioned, profit margins of private insurance companies exceed the tax burden of socialized health insurance elsewhere. While socialized healthcare insurance raises taxes in other countries, that’s more than offset in the U.S. by higher costs for private insurance. And, prior to 2013, “job lock” problems arose in the employee benefit system when workers with pre-existing conditions (past illnesses) tried to switch jobs because, understandably, no new employer wanted to take on the increased risk of future medical costs. Formerly sick employees (~18% of all workers) in 45 states lacked portability, in other words, trapping them at the mercy of one employer, or leaving them on the outside looking in if they lost their jobs. Also, those who were covered risked having their coverage rescinded after paying premiums for years if the insurance company could find an error on the original application sheet — an error it flagged but didn’t notify the holder about until he or she got ill. These rescissions, aka “frivolous cancellations” or “catastrophic cancellations,” filled up America’s bankruptcy courts with families whose finances (i.e. lives) were being ruined by runaway medical bills. Prior to 2013, the majority of personal bankruptcy cases in the U.S. involved medical hardship. Rescissions were outlawed in some states, and now everywhere by Obamacare, but as recently as 2009 Blue Cross employees testified before Congress that their company paid bonuses to representatives who could cancel coverage for paying policyholders once they got sick.

Meanwhile, some bigger companies suffer because they’re burdened with paying the long-term healthcare of pensioned retirees. For instance, with the good contracts the United Auto Workers union won in the mid-20th century, Ford, Chrysler, and General Motors were on the hook for all their retirees’ healthcare. Those obligations grew increasingly burdensome as life expectancies rose. If you bought an American vehicle before the 2007 collective bargaining agreement and GM’s 2009 bankruptcy restructuring, a sizable chunk of your money went not to the materials or the people who designed, built, shipped, and sold it, but to pensions, to the tune of nearly $5 billion annually industry-wide. The UAW Retiree Medical Benefits Trust now administers a much leaner independent fund with contributions from the Big Three automakers, some in the form of stock. Other companies like Walmart don’t have unions to worry about. They can shove much of their employees’ healthcare costs off onto the public dole (Medicaid) for the rest of us to pay. Medicaid is mostly state-managed, jointly funded (state/national) healthcare insurance for the poor, passed in the same 1965 legislation as Medicare. It’s easy to see why the prevailing trend in the American workforce has been away from defined-benefit pensions and toward defined-contribution 401(k)’s, where companies aren’t on the hook for retirees’ healthcare. With a 401(k), employers might match a monthly contribution, but the employee is subject to his or her own market risks. Additionally, once employees qualify for Social Security, they’ll receive some healthcare subsidies from the government in the form of Medicare.

Bill and Hillary Clinton made the biggest push since Harry Truman (or maybe Richard Nixon) to reform the system, though they didn’t end up pushing for universal coverage because they understood that private insurers had enough pull among politicians to block any legislation that would’ve cost them their business. Even as it was, when the Clintons crafted a patchwork bill in 1993 to address the most serious of the aforementioned problems, insurers filled the airwaves with baloney about how people would no longer get to choose their doctors. The bill lost in 1994, but a watered-down version passed in 1996, forcing insurers to cover formerly sick workers when they changed jobs. The hitch was that insurance companies retained the right to charge more. It was a classic case of corporations paying politicians to water down legislation. Insurance companies are among the biggest donors in Washington. Starting in 1986, the government allowed laid-off employees to continue purchasing healthcare through employer coverage temporarily through COBRA (at higher rates) and backed emergency care for anyone who came to a hospital through EMTALA. While the uninsured poor don’t have access to long-term care for diseases like cancer, heart disease, or diabetes, all Americans contribute to their emergency-room care.

In response to the long-term threat of universal coverage, conservatives at think tanks like the American Enterprise Institute and Heritage Foundation formulated the mandate system. Mandates are compromises that require employers or individuals to purchase insurance from private companies but force insurance companies to cover costlier patients, including the sick or those with pre-existing conditions. To offset the cost of covering those who need it, the young and healthy have to buy coverage, which is arguably smart anyway since catastrophic injuries or illnesses can impact people of any age. The Heritage Foundation’s Stuart Butler started promoting this market-based solution in 1989, though the idea goes back further, at least to the Nixon era, and drew backing from Mark Pauly and Newt Gingrich. Butler’s idea required people to buy catastrophic coverage rather than comprehensive coverage. Famous free-market economist Milton Friedman published an op-ed in the Wall Street Journal in 1991 promoting individual mandates. Along with his Secretary of Health & Human Services, Dr. Louis Sullivan, George H.W. Bush proposed the mandate idea in 1992 (minus the small-employer requirement), but it died quickly in Congress, defeated by Democrats who either wanted an extension of Medicare to the whole population or just didn’t want to cooperate with Bush for partisan reasons with an election upcoming. No Republicans at the time suggested such a proposal was unconstitutional, since the Congressional Budget Office classified the mandate as, in effect, a tax. Like Obama’s future plan, Bush and Sullivan’s put an emphasis on preventative care in order to keep down the costs of treating patients after they develop serious illnesses. Richard Nixon’s idea of an employer mandate suffered a similar fate to Bush’s in 1972, defeated by Ted Kennedy and other Democrats hoping for a simpler, more thoroughgoing single-payer system.

In 1993, Republican Senators John Chafee (RI) and Bob Dole (KS) introduced a privatized mandate plan called HEART, the Health Equity & Access Reform Today Act, to counter Hillary Clinton’s Health Security Act, which they called “Hillarycare.” For many conservatives, an individual mandate for each household was preferable to an employer mandate and discouraged “free riders” who, for instance, took advantage of emergency rooms without buying any insurance. More libertarian conservatives at the Cato Institute opposed the idea from the outset. Ironically, Barack Obama, the man destined to become famously associated with the idea, opposed mandates during his 2008 campaign. Ted Kennedy later regretted his opposition to Nixon’s 1972 plan, but his home state of Massachusetts pioneered a mandate plan under Republican Governor Mitt Romney in 2006 that became the basis for the national government’s Patient Protection & Affordable Care Act of 2010, aka the Affordable Care Act (ACA) or “Obamacare.”

While the most unpopular feature is the mandate for young people to buy coverage, polls show that around 95% wisely want coverage anyway. Moreover, under any wider-pool insurance system, the healthy pay for the unhealthy and men and women pay for each other’s maladies, just as homeowners who don’t suffer from fires, floods, or tornadoes pay for those who do (albeit with some adjustments for risk factors). That’s the very nature of insurance. You don’t cancel your home insurance just because your house hasn’t burned down yet. To make coverage as affordable as possible for small businesses and those who need to buy individual coverage, each state under the mandate system either sets up its own online exchange for comparison shopping or feeds into a similar national exchange.

The Affordable Care Act version mandates free preventive care to lower costs, caps insurance company profit margins at 15% (20% for smaller companies) to bring costs more in line with other countries, prevents insurance companies from capping annual payouts to patients, and, at least through 2016, taxes those in the wealthiest bracket an extra 3.8% on investments (capital gains) and 0.9% on income to pay for expanded Medicaid coverage. There’s also a 40% excise tax on premium “Cadillac” insurance plans for the wealthy. Around half of those who gained insurance coverage through the ACA did so through these subsidized Medicaid expansions. Insurance companies can’t cut off sick patients/customers, but the young and healthy have to buy insurance to help balance that out. Premiums for the elderly can’t be more than 3x those for the young. Also, companies with over fifty full-time employees have to provide insurance. The ACA also taxes insurance companies and medical-equipment makers.
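To make the arithmetic of that margin cap concrete, here is a minimal sketch modeled on the ACA's "80/20" medical-loss-ratio idea that the 15%/20% figures above describe: an insurer that spends too little of its premium revenue on actual care owes the shortfall back to policyholders as a rebate. The function name and dollar figures are illustrative assumptions, not statutory text.

```python
# Minimal sketch of the medical-loss-ratio idea behind the ACA's margin caps.
# Assumption: insurers must spend at least min_care_share of premium revenue
# on patient care (80% here, i.e. roughly a 20% ceiling on overhead plus
# profit; the large-group figure is 85%/15%), or rebate the shortfall.

def mlr_rebate(premiums: float, care_spending: float,
               min_care_share: float = 0.80) -> float:
    """Return the rebate owed when care spending falls below the required share."""
    required = min_care_share * premiums
    return max(0.0, required - care_spending)

# An insurer collecting $100M in premiums but spending only $75M on care
# falls $5M short of the 80% standard and owes that back:
print(mlr_rebate(100e6, 75e6))  # 5000000.0
```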

The plan is often mistaken for socialism by conservative critics, but at their core mandates preserve health insurance for the free market by forcing individuals and companies to buy insurance rather than the government providing it for them. That’s the whole point. It is partly socialist because of the taxes and Medicaid expansion, and customers below the poverty line are subsidized on the exchanges. However, conservative think tanks pioneered the idea to stave off a full-blown socialist alternative whereby taxes provide insurance for everyone the way they do for those over 65 with Medicare Parts A and D. Switzerland, the Netherlands, and Sweden have been moving toward similar mandate-based systems. What is also socialist in the current and pre-Obamacare system is that employers who subsidize healthcare can deduct that from their taxes, meaning that all taxpaying Americans, including ones who aren’t covered, help subsidize those who are covered at larger companies. The widely popular Medicare is also socialist insofar as taxpayers fund a government-administered plan. As was the case with the ACA, opponents of Medicare in the 1960s filled the radio waves with unfounded rumors of “government-run death panels.” Medicare led to no such death panels and has worked fairly well, all things considered, but it’s also expensive and takes up a growing portion of the federal budget.

Again, there is a certain price to be paid for the fact that life expectancies are rising, especially when the elderly spend a lot on healthcare. Still, the fact that people are living longer is something most of us would argue is a good problem. There is no free quality healthcare; the money comes out of your paycheck if it’s from your employer (all of it, ultimately, not just the half they “match”), out of your own pocket through “out-of-pocket” bills, or out of your paycheck through taxes. The question is what setup provides quality healthcare in the most equitable and affordable way possible. The hybrid Affordable Care Act is a complicated, sprawling attempt to manipulate the free market through government intervention. It attempts to smooth over the worst glitches in the old system, but lobbyists for insurers, drug companies, and hospitals all had a hand in crafting the legislation. Insurance lobbies convinced Obama and Congress to drop the idea of including a “public option” on the new national insurance exchange, HealthCare.gov, though individual states retained the option to add their own non-profit public option (none did). The Amish, Mennonites, American Indians, and prisoners aren’t mandated to buy insurance. For patients who don’t purchase insurance on the state or national exchanges — still most Americans, who continue to get their insurance from employers — the legislation doesn’t include any price controls on hospitals or drug companies, as those lobbies paid pennies on the dollar to avoid them.

As with the ACA, polls showed that Americans favored much of what was in Clinton’s 1993-94 legislation when asked about individual items in the bill, yet opposed it when Hillary’s name was mentioned in conjunction with those same items. Both are telling examples of how spin can trump substance in politics, and how the way questions are spun dictates how respondents “frame” them. Partisanship is now such an overriding factor in politics that when a Democratic president pushed a conservative idea in Congress, zero Republicans voted in favor, and many confused voters thought a socialist revolution was at hand. Republican strategist Frank Luntz and Senate Leader Mitch McConnell (R-KY) instructed colleagues to block any real reform and to deny Obama bipartisan cover. Nixon and Bush suffered similar, if less inflammatory, responses from Democrats when they pushed mandate plans in 1972 and 1992. Much of the public misread the mandate idea as socialist in 2009 because they were spring-loaded to suspect President Obama of being a leftist and were unaware of the idea’s right-wing origins and purpose. Said Michael Anne Kyle of the Harvard Business School, “It was the ultimate troll, for Obama to pass Republican health reform,” accomplishing a liberal end (better coverage) through conservative means (the market).

Source: Gallup Poll

The story of the Affordable Care Act is far from over. The government’s private contractor fumbled its rollout of HealthCare.gov. Some people showed up at hospitals and drugstores only to find their new insurance carrier hadn’t processed their paperwork yet, and many have to renew their plans annually. Some companies keep so-called “29’ers” just under 30 hours a week to avoid having to buy insurance for them. In an effort to control costs, people covered under policies purchased on HealthCare.gov are offered “narrow networks” of providers who’ve agreed to keep costs down. That will no doubt annoy some, but both narrow networks and sketchy customer service are issues many workers already experience on their regular employer-provided plans. As of 2014, over 70% of customers using the federal exchange were happy with the cost and quality of their coverage. Some insurance companies have grown less hostile to the ACA as they’ve realized that expanded coverage means more overall profit, despite the 15% cap on profit margins. By 2015, the ACA had cleared two hurdles in the Supreme Court.

The number of uninsured Americans has fallen, but the cost of most people’s insurance rose the first year. Would it have risen as much without the ACA? It’s impossible to tell. With the fine for not buying insurance relatively low — in 2016, it rose to 2.5% of total household adjusted gross income, or $695 per adult and $347.50 per child, to a maximum of $2,085 — not as many young Americans bought coverage as hoped, raising concern that insurance companies still won’t have enough to cover everyone else, or that the government will have to make up the difference. Consult this official site for updated facts from the Congressional Budget Office, Census Bureau, Centers for Disease Control, and RAND Corporation (think tank).
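For a concrete sense of how that fine was calculated, here is a minimal sketch using the 2016 figures quoted above. It assumes the "greater of" rule (the higher of the percentage method or the flat per-person amounts) and omits real-world refinements such as subtracting the tax-filing threshold from income and capping the percentage method at the average bronze-plan premium.

```python
# Hypothetical sketch of the 2016 ACA individual penalty, using the figures
# in the text: 2.5% of household income, or $695 per adult and $347.50 per
# child (flat amounts capped at $2,085), whichever is greater. Actual IRS
# rules add refinements omitted here.

def aca_penalty_2016(household_income: float, adults: int, children: int) -> float:
    flat = min(695.0 * adults + 347.50 * children, 2085.0)  # capped flat amount
    percentage = 0.025 * household_income                   # percentage method
    return max(flat, percentage)

# Two adults and two children earning $60,000:
# flat = 695*2 + 347.50*2 = $2,085 (right at the cap); percentage = $1,500
print(aca_penalty_2016(60_000, adults=2, children=2))  # 2085.0
```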

By the end of Barack Obama’s administration, the ACA was trending toward the death spiral caused by the low penalty for healthy young people not joining, causing insurance companies to raise rates for everyone else using the markets (not those already covered by their existing employer-sponsored plans). Insurance companies are uncertain whether the cost-sharing reductions and the mandate will continue moving forward. “Death spiral” is a bit hyperbolic, though, because while Obamacare has failed to cause the hoped-for “bending of the curve” on medical inflation, the rate of increase on premiums isn’t higher than it was before 2009 (FiveThirtyEight). Also, even with these rising premiums, insurance on the ACA exchange is still cheaper than buying stand-alone insurance or COBRA coverage by a considerable margin. Less than 5% of Americans are on the Obamacare public exchanges, with ~50% on employer-subsidized plans, ~35% on government-subsidized plans (public employees, military, politicians, Medicare/Medicaid, etc.), and ~10% uninsured (Kaiser Health, 2015).

Supposedly, Donald Trump’s 2016 election meant that the ACA would be dismantled or reformed, as he promised during his campaign that he had a better plan that would cover everyone more cheaply without reducing Medicaid. But unless congressional Republicans really replace and improve upon the ACA rather than just repeal it, they will be throwing millions of Americans, including many Trump voters, off healthcare insurance. The Congressional Budget Office predicts that a simple repeal with no replacement would throw 32 million Americans off insurance within a decade and double premiums — numbers that discourage all but the most libertarian conservatives like Rand Paul (KY-R) and Freedom Caucus Chair Mark Meadows (NC-R). Now, in the words of Senator Lindsey Graham (SC-R), the GOP is like the “dog that caught the car,” with no agreed-upon replacement strategy, as Trump was bluffing about his secret plan. A month into his first term, the new president conceded that revamping the ACA would be “unbelievably complex… nobody knew that healthcare could be so complicated.” Pressed by Trump to come up with a plan quickly or leave Obamacare in place, House Speaker Paul Ryan (WI-R) introduced the American Health Care Act in March 2017, based on making health insurance tax deductible, but the overall effect would’ve been to save money for the wealthy by throwing millions off insurance. The bill divided the GOP as the House Freedom Caucus (Tea Party) complained there was still too much coverage for those who couldn’t afford it. In a last-ditch effort to salvage the bill, they removed maternity care and mammograms from coverage, but they still didn’t have the votes and pulled it from consideration. President Trump said that people were now starting to love Obamacare, but “There’s nothing to love. It’s a disaster, folks.”

In May 2017, the House passed a revised bill to repeal Obamacare and drop the mandate and the tax on the wealthy, with less money for Medicaid, states having the option to drop protections for those with pre-existing conditions, and higher premiums for the elderly (allowed to be 5x those of the young, as opposed to Obamacare’s 3x). Trump cajoled House Republicans to support the bill and called it “tremendous,” but then Tweeted® that it was “mean” and that Australia had a better healthcare system than the U.S. (Australia has basic Medicare for everyone combined with supplemental private insurance).

The revised American Health Care Act went to the Senate, where Republicans were split over a similar plan that some saw as too harsh and others as too generous. In July 2017, Senator Ted Cruz (TX-R) promoted a free market for insurers with no mandate, even if that meant that those with pre-existing conditions and the poor were priced out. Vice President Mike Pence endorsed Cruz’s plan on Limbaugh’s show, saying: “Rush, that’s what freedom looks like, isn’t it?” That same month, a Republican “skinny” repeal bill (dropping just the mandate and funding for Planned Parenthood) lost in the Senate by a single vote (49-51), with three dissenting Republicans, including John McCain, who dramatically cast the deciding vote after returning to Washington from Arizona, where he’d been diagnosed with brain cancer. Had it passed, Senate Majority Leader Mitch McConnell (KY-R) would’ve successfully passed legislation that steered ~a trillion dollars of poor and middle-class health benefits to tax cuts for the wealthy. Polls showed that only ~15-20% of Americans supported straight repeal, leading former Arkansas Governor Mike Huckabee to argue for repeal of the Seventeenth Amendment granting citizens the right to vote for senators (prior to 1913, state legislatures chose U.S. Senators). Huckabee is the father of White House Press Secretary Sarah Huckabee Sanders. Stay tuned; for now, Obamacare remains the law of the land. Entitlements, once passed, are hard to take away from American voters.

In the words of journalist Jonathan Chait, Obamacare, however imperfect, “squared the minimal humanitarian needs of the public with the demands of the medical industry.” As for former President Obama, he fully endorses replacing the ACA as long as the new plan provides better coverage for less money. If Trump wants the ACA to implode, he might be able to sabotage it by encouraging the IRS not to enforce the mandate penalty, passive-aggressively worsening the cost spiral for older Americans in the hopes that Democrats will be blamed; he periodically threatens to do just that, interspersed with beseeching Congress to repeal or repeal-and-replace. In the meantime, a group of ~ 40 bipartisan “problem-solvers” led by Josh Gottheimer (NJ-D) and Tom Reed (NY-R) is caucusing regularly over beer and tacos to come up with a bipartisan solution for healthcare reform. In the Senate, Patty Murray (WA-D) and Lamar Alexander (TN-R) plan hearings to help stabilize the insurance exchanges. Whatever happens, the key for voters will be whether some plan can at least bend the curve on medical inflation (premiums and provider costs) while maintaining quality care and shielding families from bankruptcies and premature deaths.

Newt Gingrich on TIME Magazine Cover

1994 Mid-Terms & 1996 Election
Let’s return to the Clinton era. The Speaker of the House of Representatives, Newt Gingrich of Georgia, seized on Clinton’s first-term problems and spearheaded the Republican Revolution of 1994, whereby the GOP took over both houses of Congress in the mid-term elections. The Republicans hadn’t controlled the House since the early 1950s. Its leaders were predominantly southern, including the Georgian Gingrich and Texans Tom DeLay and Dick Armey. Presaging future threats against Barack Obama, North Carolina Senator Jesse Helms warned Clinton not to visit his state “without a bodyguard.” Just as the Republican Nixon had presided during the era of Lyndon Johnson’s Great Society, the Democrat Clinton was now presiding over a Republican Congress during the ongoing Reagan Revolution. Partisan gridlock began to kick in as the growing debt issue inherited from previous administrations reared its head.

The U.S. had been “in the red” since 1969 and, as Senator and ex-Democrat Richard Shelby (AL-R) pointed out, President Reagan had run the country further into debt. Working “across the aisle” with Republicans like John Kasich of Ohio, Bill Clinton became the first president since 1969 to balance annual budgets (receipts = outlays), but the overall, long-term debt didn’t go away. Deficits are annual shortfalls, whereas the debt is the running total. Republicans wanted a balanced budget amendment but weren’t specific as to where they’d make cuts. Democrats laid out more specific ideas for cuts but wouldn’t chain themselves to an amendment, citing circumstances like wars or the Louisiana Purchase where governments need to run deficits. The U.S. can’t really get its long-term budget under control without moderate tax hikes or cuts to entitlements, but a wide swath of Americans in both parties like Medicare and want Social Security to kick in before they’re too elderly. Meanwhile, conservative activists like Grover Norquist pressure politicians to sign pledges against raising taxes. In a democracy, blame for reckless budgets ultimately falls on the divided citizenry that, collectively, wants more for less in an atmosphere that discourages compromise.
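
To make that distinction concrete, here’s a minimal sketch in Python (all figures invented for illustration, not actual U.S. budget numbers) of how annual deficits accumulate into the overall debt:

```python
# A minimal sketch of the distinction drawn above: a deficit is one
# year's shortfall, while the debt is the running total. All figures
# are invented for illustration, not actual U.S. budget numbers.

deficits = [300, 250, -50, -100, 400]   # annual deficits in $billions
                                        # (negative = a surplus year)
debt = 1000                             # starting debt, in $billions

for year, deficit in enumerate(deficits, start=1):
    debt += deficit
    print(f"Year {year}: deficit {deficit:+}B -> total debt {debt}B")

# Even the two surplus years only nibble at the total: balancing the
# annual budget, as Clinton did, stops the debt from growing but
# doesn't erase what's already accumulated.
```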

Gingrich promised a Contract with America that would pass a balanced budget amendment, reform (reduce) welfare, make the day-to-day workings of Congress more efficient, roll back Social Security, and cut funding for environmental initiatives like the Superfund and Safe Drinking Water Act. Though Gingrich favored cutting benefits for the working poor and taxes for the rich, he blamed increasing wealth disparity on “radical seculars.” In cutting back on the size of congressional committees and limiting the power of senior committee chairs, Gingrich’s reforms seemingly made Congress leaner and more transparent. However, in a classic case of how reform can have unforeseen consequences, by the early 21st century, it became harder for party leaders (or anyone, for that matter) to assert leadership in Congress.

The 1980s, in retrospect, were the last decade (so far) in American history when politicians of opposing parties socialized together. Newt Gingrich’s popularity in the mid-1990s signaled a new type of confrontational politics that demonized opponents and made bipartisanship difficult. Gone were the days of Democratic-Republican whiskey, poker, and golf, as new rules sent Congressmen home for the weekends. As a history Ph.D. and former professor, Gingrich understood how politicians can hammer home people’s worldviews through repetition. In 1990, he and his GOPAC action committee issued a list of negative terminology — corrupt, betray, bizarre, cheat, devour, disgrace, greed, steal, sick, traitors, shallow, radical — advising Republicans to never speak of Democrats without associating them with those terms. This “Newtspeak” mandated calling the opposition the “Democrat Party” or “Dems” because Gingrich feared the adjective Democratic had positive connotations. The comic strip Doonesbury called Gingrich’s memo the Magna Carta of attack politics. Gingrich’s negativity dovetailed well with the proliferation of cable TV as Roger Ailes aligned FOX News with the GOP. Such partisanship, regardless of which side of the aisle it originates on, is usually accompanied by disingenuous complaints that it’s the opposition that refuses to cooperate.

Bill Clinton Giving State-of-the-Union Address, 1997

What Gingrich’s Contract didn’t promise was to crack down on corporate lobbying, as many people riding the “reform” wave into Washington were there to cash in themselves. The new House Majority Whip in 1994 was Tom DeLay of suburban Houston, who went into office promising to reform Washington and left as one of the most corrupt politicians in the modern era. Gingrich overstretched a bit with his Contract, not taking into account that only 38% of Americans had voted in the 1994 midterm elections. Clinton cherry-picked the popular portions of the Contract (welfare reform and the balanced budget — not as an amendment, but at least as a reality for a few years) and held firm against the rest.

Clinton forced Gingrich to back down and rode the momentum to victory in the 1996 election. He had good economic tailwinds at his back, including improving information technology, the post-Cold War “peace dividend” of reduced military spending, heavy worker immigration, and Baby Boomers passing through peak years of productivity. And Clinton played to the centrist popularity that helped get him elected in 1992 by beefing up police forces and reforming the worst abuses of the welfare system. Welfare recipients now faced benefit limits, had to look harder for a job, and couldn’t collect increased benefits for having more kids while on welfare. Clinton defeated Bob Dole, a WWII vet who ran a clean campaign. At the GOP’s summer convention, Dole played on his seniority and credibility, promising a “bridge to the past.” The Democrats held their convention two weeks later and promised a “bridge to the 21st century.” When it comes to conventions, it sometimes helps to go second.

ANNUAL Budget Deficits, 1971-2000 (not the overall DEBT)

How Big Banks Got Too Big To Fail & (Maybe) Stayed That Way
The economy was booming by the end of Clinton’s first term, and incumbents rarely lose re-elections in that scenario. As the saying goes, people “vote with their pocketbooks.” By the mid-’90s, the emerging Internet fueled explosive growth in the technology sector and better-than-anticipated petroleum discoveries drove oil down to one-fifth the price of the Carter years, adjusted for inflation. A tax hike on the rich, raising the top rate from 31% to 39.6%, didn’t inhibit things either. The government ran annual budget surpluses for the first time in decades. It’s easy to see, then, why Clinton would’ve gone along with Federal Reserve Chair Alan Greenspan, two of his Treasury Secretaries (Robert Rubin and Lawrence Summers), bank lobbyists and Republicans in loosening up Wall Street regulations even further than they’d already been loosened by Reagan. Greenspan kept interest rates low, fueling speculative bubbles first with the Internet, then in real estate. Low interest rates not only encourage borrowing, they also fuel the stock market because comparison shoppers prefer investing in stocks to low-yielding bonds.

President George W. Bush Presents the Presidential Medal of Freedom to Federal Reserve Chairman Alan Greenspan @ White House, 2005, Photo by Shealah Craighead

Between 2001 and ’05 the Fed pumped cash into the economy to keep it healthy after the 9/11 attacks and the collapse of the dot-com bubble. As a libertarian apostle of Ayn Rand, Greenspan believed that traders would naturally self-regulate as they pursued self-interest. But with the Federal Reserve’s role, this wasn’t a purely free market economy. Greenspan’s system privatized profit while socializing risk; markets either went up or the Fed lowered interest rates to bump them up. His successor Ben Bernanke followed the same policies after 2006. While the Fed was set up originally in 1913 to smooth out fluctuations in the economy, Greenspan’s high-growth but bubble-prone policy ultimately made markets more erratic, and he later testified before Congress that his strategy of deregulation had been a mistake.

Commentators often speak of the Law of Unintended Consequences to describe how either passing or eliminating laws often has unforeseen consequences (e.g. defensive treaties leading to World War I). In this case, three deregulations (eliminations of laws) contributed to a financial meltdown a decade later. The first was the Gramm-Leach-Bliley Act of 1999 (GLB), repealing the 1933 Glass-Steagall Act from FDR’s New Deal that had set up a firewall between riskier bank investments and regular bank customer savings. For the half-century after Glass-Steagall, there hadn’t been many bank failures in America — which is why reformers like Texas Senator Phil Gramm argued that the law was outdated. In retrospect, though, Glass-Steagall might have been partly why the U.S. didn’t have bank failures. But Gramm was coming from the mindset that regulations only slow the economy. The GLB Act didn’t affect the major investment banks involved in the 2007-09 meltdown (other than allowing some mergers), but it affected commercial banks like Bank of America and Citibank on the periphery of the crisis.

Additionally, anonymous surveys of chief financial officers show that many were increasingly asked to “cook the books” after the banking/accounting deregulations of the Reagan era. When such “creative accounting” led to scandals at Enron, WorldCom, and Tyco in 2001, the Sarbanes-Oxley Act restricted such practices, but predictably the financial industry just lobbied for exemptions. The never-ending back-and-forth of regulating and deregulating is complicated by a revolving door of career paths between finance, lobbying, and politics. Robert Rubin, for instance, went from Goldman Sachs to serving as Clinton’s second Treasury Secretary, then on to Citigroup, and the deregulatory policies he promoted in public office benefited both firms. Moreover, the $26 million he earned at Citigroup included bailout money from the government after the system he helped set up failed. In what’s known as regulatory capture, many of the regulators in agencies like the SEC (Securities & Exchange Commission, 1934- ) are familiar socially and professionally with the investors they regulate.

"The Bosses of the Senate," Joseph Keppler, 1899, Puck Magazine

“The Bosses of the Senate,” Joseph Keppler, 1899, Puck Magazine

A second deregulation impacted the coming crisis more than Glass-Steagall’s repeal. A big cause of the 2007-09 meltdown, and the danger it posed to the rest of the economy, was a three-fold loosening of leverage ratios by the SEC in 2004. Investment banks could now gamble their clients’ money at a 30:1 ratio, rather than 10:1. The amount they paid politicians to change the law was “pennies-on-the-dollar” (minimal in relation to profits): the financial industry spent around $600 million lobbying politicians for deregulation in the decade prior to the meltdown while raking in trillions because of the boost to short-term performance. Bankers got big bonuses if their bets paid off, and shareholders or taxpayers got the bill if they lost, in the form of plummeting stock or bailouts. Heads I win; tails you lose. The men who “incompetently” ran the big banks into the ground walked away with hundreds of millions of dollars in bonuses and salaries they’d already made based on short-term returns. It seems, rather, that the real incompetence lay with the politicians and voters who drank too much deregulatory Kool-Aid. While banks were limited to a 30:1 ratio overall, they leveraged even more in real estate than in other parts of their portfolios. For every $1 Americans spent on housing, Wall Street bet at least another $30 that the housing bubble would increase in perpetuity. With such leveraged bets, even a small 3-4% dip in housing prices would wipe out the banks… that is, unless government came to their rescue.
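
To see why that 2004 change mattered so much, here’s a back-of-envelope sketch (a hypothetical balance sheet, not any actual bank’s) of how leverage turns a small price dip into insolvency:

```python
# Hypothetical balance sheet showing why leverage magnifies small
# price moves: at 30:1, equity is only 1/30th of assets, so a drop
# of just over 3% in asset prices wipes it out entirely.

def equity_after_drop(assets: float, leverage: float, drop: float) -> float:
    """Remaining equity after asset prices fall by `drop` (0.035 = 3.5%)."""
    equity = assets / leverage   # the capital the bank actually owns
    loss = assets * drop         # but losses hit the whole asset base
    return equity - loss

for ratio in (10, 30):
    left = equity_after_drop(assets=100.0, leverage=ratio, drop=0.035)
    print(f"{ratio}:1 leverage, 3.5% drop: equity {100/ratio:.1f} -> {left:.1f}")

# 10:1 leverage survives the dip (6.5 of 10.0 remains); 30:1 is
# insolvent (-0.2), which is why a 3-4% housing decline was fatal.
```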

A third important deregulation was the repeal of obscure laws, originating after the Panic of 1907 and the 1929 Crash, outlawing bucket shops. Bucket shops were gambling parlors, essentially, where people without actual share ownership just bet on the stock market the way one would bet on horses or football games; no official transaction occurs on any institutional exchange. Congress quietly repealed portions of those laws and another from the New Deal in the Commodity Futures Modernization Act, on the last vote of the last day of the 2000 session — the kind of dull scene on C-SPAN cable that viewers flip past with the remote control. That repeal changed how big financial firms bought and sold credit default swaps (CDS). Here’s where things get very tedious if they haven’t gotten tedious enough already, so just strap in and do your best to comprehend.

Credit default swaps are a form of financial derivative with which companies can insure against the failure of their own investments. Billionaire investor Warren Buffett called these complicated, unregulated types of derivatives “weapons of mass destruction.” Yet, in another case of unintended consequences, a group of young JPMorgan bankers first conceived them in 1994 as a way to stabilize the system: the House of Morgan saw the need to insure against loans it made and bonds it underwrote to corporations after the Exxon Valdez tanker accident in Alaska in 1989. Banks began to insure against failure of their own mortgage-backed securities (shares in bundles of real estate mortgages) as the housing bubble expanded in the first decade of the 21st century, and they sold these securities to other banks and investors. The key thing about mortgage-backed securities is that once mortgages were securitized (packaged and sold as financial products like stocks or bonds), the originating bank no longer lost its own money if the homeowner defaulted, because it had sold the loan to someone else. Thus, banks had less incentive to avoid lending to borrowers they suspected couldn’t pay them back.
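
As a rough sketch of the mechanics (invented numbers; real swap contracts are far more elaborate), a credit default swap works like an insurance policy: the buyer pays periodic premiums and collects only if the insured debt goes bad:

```python
# Toy credit-default-swap cash flows (invented numbers, simplified terms).

def cds_profit(notional, annual_rate, years_paid, defaulted, recovery=0.4):
    """Buyer's net: premiums paid out vs. a payout of notional minus recovery."""
    premiums = notional * annual_rate * years_paid
    payout = notional * (1 - recovery) if defaulted else 0.0
    return payout - premiums

# Insuring $10M of mortgage bonds at 2% a year for three years:
print(cds_profit(10e6, 0.02, 3, defaulted=False))  # -600000.0: premiums lost
print(cds_profit(10e6, 0.02, 3, defaulted=True))   # 5400000.0: a ~10:1 payout
                                                   # on the premiums paid
```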

Invented by Salomon Brothers’ Lew Ranieri in 1977, mortgage-backed securities and bonds were bunched into portfolios called collateralized debt obligations (CDOs) that few people, including investors at other banks or ratings companies like Standard & Poor’s, studied in enough detail to examine all the high-risk mortgages they included. In The Big Short (2015), based on Michael Lewis’ 2010 namesake book, the narrator tells viewers that Ranieri had a bigger impact on their lives than Michael Jordan, iPods® and YouTube® combined, even though no one has heard of him. He adds that the opaque, complicated boringness of high finance is deliberate, as it shields Americans from the reality of Wall Street corruption. CDO’s included the lowest-rated tranches of sub-prime mortgages that banks couldn’t hide in normal mortgage-backed securities (MBS’s) — ones with high variable rates scheduled to go up in 2007. Adding fuel to the fire, bankers took advantage of the aforementioned deregulations regarding leverage ratios and bucket shops to place side bets on the CDO’s called synthetic CDO’s. The amount of money riding on these unregulated derivatives was about 5x more than the CDO’s themselves.
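
A toy model of that tranche structure (hypothetical pool and tranche sizes) shows why the senior slices looked safe and why stuffing the lower tranches with risky loans mattered; losses climb the tower from the bottom:

```python
# Toy CDO loss "waterfall" (hypothetical tranche sizes): defaults in the
# mortgage pool hit the junior tranches first, the AAA slice last.

def allocate_losses(tranches, pool_loss):
    """Return the fraction of each tranche wiped out by a total pool loss."""
    hit_fractions = {}
    remaining = pool_loss
    for name, size in reversed(tranches):  # most junior absorbs losses first
        hit = min(size, remaining)
        hit_fractions[name] = hit / size
        remaining -= hit
    return hit_fractions

# A 100-unit pool: the 80-unit senior slice is rated AAA on the theory
# that losses never eat through the 20 units beneath it.
tranches = [("AAA senior", 80), ("BBB mezzanine", 15), ("equity", 5)]

for loss in (4, 12, 30):
    print(f"pool loss {loss}: {allocate_losses(tranches, loss)}")
# At 4 units only equity suffers; at 30 units even "safe" AAA loses 12.5%.
```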

Like accountants and financial officers, ratings agencies like S&P and Moody’s had a direct conflict of interest even when they did understand these derivatives, because they were paid based on the quantity of securities they certified, not the quality or accuracy of their ratings. If one didn’t provide AAA ratings to low-quality debt, banks would simply go to a competitor. Compounding that dynamic were mortgages with mistakes and forged information. Internal memos show one employee at Moody’s Investors Service joking that he’d award a high rating to a security “structured by cows.” Moody’s CEO Raymond McDaniel told his board that the quality of their ratings was the least important thing driving company profits. Ratings firms were raking in record-breaking profits by betraying the trust their firms had built up over generations. Obviously, the entire ratings industry is only as good as the firms’ commitment to detached analysis; otherwise, the economy is worse off than if they didn’t exist to begin with. This was crucial in the early 21st century because supposedly stable pension funds, overseas banks, and even municipalities were loading up on AAA-certified debt, thinking the high rating actually signified safety (low risk) the way it had traditionally. Some Wall Street banks like Lehman Brothers didn’t get the memo that the ratings had become meaningless, so they larded up on mortgage-backed securities. The subjects of Michael Lewis’ Big Short: Inside the Doomsday Machine are small investors who, by actually researching the data and contents of the MBS’s and CDO’s instead of being willfully ignorant, figured out what was going on earlier than the big banks and bet against them — shorted them, in regular stock terminology — with credit default swaps that promised 10:1 returns. They lost money paying insurance premiums as they waited for their predicted downturn in housing. Some of the banks, including Goldman Sachs, also figured out the “jig was up” and bought credit default swaps because they knew a real estate bubble was forming and some of the MBS’s held too many “toxic mortgages.” Yet they continued to push mortgage-backed securities to their customers even as they bet against them by buying default swaps.

The ratings agencies were grossly underestimating risk when some of the mortgages bundled into the funds, especially CDO’s, were bound to default. Always be wary of precision in forecasts: though their risk calculations were figured down to the third decimal point, the agencies’ forecasts were off by a factor of two hundred. Worse, they didn’t factor in that mortgage failures would correlate if the national economy cratered and homeowners defaulted in a cascading effect. S&P shared its ratings software with the banks, purportedly for “transparency,” but that only showed the banks exactly how many risky loans they could stuff into the lower-quality, higher-yielding tranches of the CDO’s while retaining the high rating. Later, the ratings agencies testified before Congress that they never saw the housing downturn coming but, between 2000 and 2007, analysts were discussing the bubble in the Economist, the New York Times, and elsewhere. Google searches for “housing bubble” among the public increased ten-fold between 2004 and ’05.
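
That correlation error is easy to demonstrate with a toy simulation (all parameters invented). If defaults are independent, even the worst years look mild; a shared national-housing shock fattens the tail dramatically:

```python
# Toy Monte-Carlo illustration (invented parameters) of why ignoring
# correlated defaults understated the risk in AAA-rated pools.
import random

def pool_loss(n_loans=1000, p_default=0.05, correlated=False):
    """Simulate one year's default rate in a mortgage pool."""
    p = p_default
    if correlated and random.random() < 0.2:
        p *= 3          # a national downturn triples default odds
    return sum(random.random() < p for _ in range(n_loans)) / n_loans

random.seed(1)
for corr in (False, True):
    losses = sorted(pool_loss(correlated=corr) for _ in range(2000))
    worst = losses[int(0.99 * len(losses))]   # 99th-percentile loss year
    print(f"correlated={corr}: worst 1% of years lose {worst:.1%} of the pool")

# Independent defaults cluster tightly near 5%; the shared shock pushes
# bad years past 15% -- the scenario the ratings models left out.
```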

There Goes the Neighborhood (courtesy of resourcesforhistoryteachers.wikispaces.com)

No one wanted to discuss the proverbial “elephant in the room” because too many people were profiting from the housing bubble. With values rising, homeowners could finance skiing vacations, cars or renovations by using their mushrooming home equity (value minus mortgage due) as collateral, and investors could “flip” houses by buying, fixing and reselling — benefiting real estate agents, contractors, home improvement stores, and reality TV shows about house-flipping. Greenspan and Bernanke’s easy money (low interest) policies weren’t causing inflation in the customary sense of the word, with higher prices on goods, but rather “asset inflation,” whereby people who already owned real estate or stocks enjoyed a boom. Meanwhile, the financial sector could rake in million-dollar short-term bonuses as it ran proud historical firms into the ground while politicians deregulated the industry that funded their campaigns. Banks were borrowing from the Fed at 1% while lending mortgages at 5-6% and repackaging them as ticking-time-bomb securities sold around the world.

Brooksley Born

Closer to “Main Street,” as opposed to “Wall Street,” banks offered predatory loans at high interest rates to people without jobs, ensuring short-term profits and long-term defaults. Ever seen late-night commercials that start with the question, “Problems with your credit?” The catch with those car or home loans is that their interest rates are higher and the borrowers are less likely to pay off the loans because they’re poorer. CDO’s owned subprime loans owed by people mortgage lenders jokingly referred to as “Ninjas”: no income, no job (or assets). Unregulated lenders got commissions for signing people up and passed ownership of the mortgages off to unsuspecting bankers, investors, and pension funds far down the chain. Some loan officers testified that their bosses didn’t allow them to fill out the applicants’ boxes regarding income or credit rating.

An even bigger problem was middle and upper-middle-class Americans borrowing too much against their home equity as savings rates dropped. Meanwhile, the MBS’s and CDO’s stuffed with all these loans — subprime or prime — were being traded around like baseball cards among people with no direct stake in the mortgages and with no regulatory oversight in a shadow market. They weren’t bought and sold on common exchanges like regular stocks, bonds or commodities, and no one knew how much overall money banks were betting, even within their own companies (one argument against large size). Fed Chair Greenspan, Clinton’s third Treasury Secretary Larry Summers, and bank lobbyists defeated an attempt by commodities/futures regulator Brooksley Born to make the derivatives market more transparent, saying that the banks knew what they were doing and could regulate themselves.

They did not know what they were doing; or, if they did, they didn’t care, because their bonus pay structure provided no motive for the long-term solvency of their banks, let alone the American economy. Many likely suspected ahead of time that the government (taxpayers) would have no choice but to bail them out when the system crashed. Others were just clueless. If you find some of the finance in these paragraphs hard to wrap your mind around, don’t feel bad: the new financial products were too complicated for many bankers, regulators, analysts, and professional investors to grasp, either. In any event, complexity wasn’t really the core problem; greedy investors gambled too much borrowed money on the housing market. Given the aforementioned increased leverage ratios (now 30:1), big firms had too much money riding on the fate of these CDOs, spelling doom for those on the wrong side of the downturn in real estate and creating systemic risk that threatened the whole economy. Systemic risk, meaning the relationship between Wall Street and the rest of the economy, is an important concept for citizens and voters because it shapes which financial policies you’ll favor going forward. For skeptics of systemic risk, the solution is simple: let the banks fail if they screw up; it’s their problem. For believers in systemic risk, it’s more complicated. The biggest banks are “too big to fail” not because they’re influential and curry favor with politicians, but rather because their failure could crater the whole economy. We’re the systemic part of systemic risk. In this case, too much of the downside risk pooled at the insurers and banks issuing the credit default swaps, like AIG (American International Group). They gambled that the real estate bubble wouldn’t burst but didn’t have anywhere near enough money to pay off the swap holders when housing lost momentum in 2007. An analogy would be the fate of regular insurance companies if everyone’s homes burned or flooded at the same time, when their actuarial models are based on only a few being destroyed at a time. Likewise, the FDIC insures each account holder up to $250k if their bank fails, but could never pay everyone if all the banks failed simultaneously.

Case-Shiller Home Price Index from 1890 to 2012

Despite all the swapping around of the credit default swaps, they couldn’t swap risk out of the financial system. Somebody always stood to lose, and the system couldn’t sustain a collapse of the real estate market given the high leverage ratios. The Big Short compares the MBS’s and CDO’s, with their tranches of variously rated mortgages, to wooden blocks in the game of Jenga: as the securities with the most toxic mortgages started to be pulled out, the tower would eventually collapse. Such a correction was nearly inevitable in retrospect. As the graph below shows, it was the price of land more than the structures themselves that skyrocketed in the ’90s and ’00s. The bubble couldn’t burst, argued realtors and speculators, because “God wasn’t making new land.” Soon-to-be Federal Reserve Chair Ben Bernanke said in 2005, “We’ve never had a decline in house prices on a nationwide basis.” Surely, as a student of the Great Depression, Bernanke must have known that real estate dropped 25% in the early 1930s. But what about housing when the economy isn’t in a recession? The reason real estate hadn’t collapsed on its own (without a major recession) was simply that there had never been a national housing bubble. For reasons that are obvious if you think about it, home prices historically had been tied to inflation and wages except in periods of housing shortages such as after World War II. Between 1896 and 1996, the inflation-adjusted value of real estate in America rose only 6% total — roughly what the stock market returns in an average single year. Yale economist Robert Shiller was foremost among those who argued that real estate prices couldn’t sustain being untethered from wages and housing supply-and-demand.
Shiller Housing Index Data

Because of systemic risk, had the big banks failed in a sudden market correction in real estate, paychecks would’ve bounced and cash machines would’ve frozen up in an instant, causing a cascading effect of panic, cash shortages and breakdown of confidence. Undoubtedly, there would’ve been at least some rioting, looting, and violence. As mentioned, the FDIC would not have been able to save those who lost their money in failing banks. These would’ve been just the short-term effects: allowing the banks to go into Chapter 11 bankruptcy would’ve wiped out their shareholders and caused a near-total stock market crash. No one will ever know for sure what would’ve happened, because the government came to the rescue and saved the banks. In the worst-case scenario, things could’ve gotten uglier quicker than in 1929-32. And, unlike the early years of the Great Depression, the government would’ve spiraled into deeper debt right away since it would’ve been on the hook for unemployment insurance, welfare, etc. As president of the Federal Reserve’s New York branch, Tim Geithner testified that this was a “foam on the runway” type of emergency (referring to airport protocol in case of a crash). Given the overall percentage of American money tied up in the biggest handful of banks and the centrality of the stock market to retirement savings, make no mistake: we were peering into the abyss.

Timothy Geithner

This wouldn’t have been the case if the nation’s money had been more dispersed across smaller banks. But as a report from the Federal Reserve’s Dallas branch showed, the percentage of the financial industry controlled by the five biggest banks grew from 17% to 52% in the 40 years between 1970 and 2010. Under Chairman Greenspan, the share of the economy controlled by these five banks had already grown to 45% by 2005, and former industrial stalwarts like General Electric and General Motors turned increasingly into financial institutions (financing loans on cars, appliances, etc. is how many manufacturers turn a profit). Banks even doubled their representation in the S&P 500 (the U.S. large-cap stock market) from 9% to 17% in the four years after the crisis they were at the center of. Not only had a lot of the country’s wealth concentrated in a small number of banks, but largeness itself makes banks more difficult to manage. Unregulated shadow banks maintained books so complicated that the Fed couldn’t save them through traditional “lender of last resort” methods. The New Bankruptcy Law of 2005 made shadow banks more appealing to large banks because it put them first in line to collect if a counterparty went under. Hundreds of smaller mortgage lenders also went out of business. The long and short of it is that when the real estate bubble started to burst, the big banks started to crater along with it. Bank of America and JPMorgan bought Merrill Lynch and Bear Stearns, respectively, when they started to fail; competitors bought Washington Mutual and Wachovia; and the government bailed out AIG. However, Lehman Brothers didn’t have enough collateral to interest any prospective buyers, including the government.

After Lehman Brothers’ collapse in September 2008 sent markets into their biggest downturn since 1929, the government realized it had to plug the dike by bailing out the others. Markets nosedived as Congress dithered, blabbering naive 1980s rhetoric about free markets correcting themselves. At first the House of Representatives voted down a bailout, but that precipitated an even bigger downturn in the stock market (eventually bottoming out at a 54% decline by March 2009). The Federal Reserve knew better, and Bush 43’s last Treasury Secretary, Henry Paulson, soon came to the same realization. Bush signed the Emergency Economic Stabilization Act that became known as “the bailout.” However, the Fed’s solution wasn’t a mere bailout. They didn’t just give banks and endangered corporations money, but rather invested in them through the unpopular but successful TARP (Troubled Asset Relief Program), which also scooped the most toxic assets out of those institutions. TARP also aided Detroit’s ailing auto industry, namely General Motors. Meanwhile, the U.S. lent money to foreign banks that had over-invested in American mortgage-backed securities, though the meltdown has had a ripple effect abroad that’s outlasted the worst danger at home (e.g., the Eurozone Crisis).

Luckily, the U.S. merely sank into the Great Recession rather than the aforementioned abyss. Moreover, the rescue came at little cost to the American taxpayer, because the government got the TARP money back and more as the financial crisis passed. Yet five trillion dollars — almost a third of the country’s annual GDP — disappeared from the economy, and many Americans lost their jobs (8 million) or homes (6 million) or took pay cuts. The economy slowed as businesses and households began to “deleverage” or “unwind” (pay down) excessive debt. At the household and small-business level, America remained in a balance sheet recession, with overhanging debt stifling economic growth. The Obama administration faced conservative and populist (Tea Party) opposition to any meaningful stimulus package or mortgage debt relief, and its first Treasury Secretary, Tim Geithner, shared the conservative view of stimuli as overrated “candy.” Though Greenspan’s Reagan Revolution deregulation failed in the Financial Crisis, its wake saw no return to the Keynesian economics of the New Deal (Chapter 9). Geithner was a primary architect of TARP in his role at the Fed’s New York branch. He gives conflicting testimony in his otherwise sound account, Stress Test: Reflections on Financial Crises, arguing that he wanted a bigger stimulus package. In any event, that didn’t happen, with the $830 billion stimulus (around 63% spending and 37% tax cuts/rebates) equaling ~ 16% of the New Deal’s size as measured by percent of GDP adjusted for inflation.

American Private Debt-to-GDP Ratio by Sector

The 2009 Stimulus Package helped stave off a worse slowdown, but amounted to only around 1.6% of GDP, compared to 10% during the Great Depression of the 1930s and 45% federal spending during WWII — and gave way shortly thereafter to Tea Party-inspired budget austerity (belt-tightening). One could say that the government itself began to deleverage, for better or for worse — better because long-term budget forecasts improved; worse because most (or at least liberal Keynesian) economists see recessions as precisely the wrong time to initiate such fiscal conservatism. In their view, budget cuts during a recession cause a net loss because the lack of stimulus spending increases unemployment rolls and lowers tax revenues even as it spares the government in the short run from going deeper into debt.

Downward Spiral

After the 2007-09 meltdown, consumers, companies, and government all struggled to pay down debt during a long, slow, gradual recovery. Many small businesses laid off workers, as they could no longer borrow from suddenly conservative, risk-averse banks. Mortgages that were too easy to qualify for a few years prior were now overly difficult, making it more challenging for first-time buyers to come up with a down payment. Many existing homeowners found themselves “underwater,” meaning that their mortgages were more than their homes were now worth. According to the Federal Reserve, median family net worth plummeted from $126k to $77k between 2007 and 2010. Real estate markets cratered in over-built areas like Las Vegas, Phoenix and Florida. Budgets shrank at public and private institutions alike. Indebted consumers without access to credit spent less, creating the familiar downward recessionary cycle. Many retirees lost their savings, and others nearing retirement delayed it and worked longer. As of 2015, workers’ wages were $4k less (adjusted for inflation) than they were before the crisis. Echoing the Okies of the Dust Bowl and Depression, the 2010s saw “van dwellers”: retirees living in RVs who roamed the country in search of low-paying seasonal work.

Central banks in the U.S. (the Fed) and across the world madly tried to stem the tide by infusing cash into the system and encouraging lending with low rates. By buying up long-term Treasuries, mortgages and other bonds at a staggering rate through Bernanke’s Quantitative Easing program, the Fed stuffed $85 billion a month into the banking system for several years; that’s why the banks got even bigger after the crisis than before. Yet because lending got more conservative, they weren’t circulating all that cash into the economy.

Jamie Dimon, CEO of JPMorgan Chase & Co.

While TARP was successful as an emergency-room operation in saving the life of the patient (staving off a collapse of the financial system), it didn’t heal the underlying illness or stimulate any dramatic recovery. The economy stepped back from the abyss but remained stagnant in comparison to pre-2008 levels. The government — with both political parties taking money from Wall Street donors — still didn’t break up the big banks, limit executive bonuses, or even attach many strings to the bailout. In fact, the big banks used some of the bailout money to buy up distressed smaller banks. And the ringleaders who broke the law by misleading investors didn’t go to prison, though the principal banks were fined $190 billion collectively (paid out of shareholders’ pockets rather than executives’ salaries). After JPMorgan Chase CEO Jamie Dimon arranged a $13 billion out-of-court settlement with the Department of “Justice,” his ecstatic and relieved board gave him a 74% raise, hiking his salary to $20 million.

Eric Holder

Part of the problem is that certain bankers excel at violating the spirit of the law without technically breaking it. Moreover, Eric Holder had, as Deputy Attorney General in 1999, issued the “Holder Doctrine,” basically a legal variant of the too-big-to-fail doctrine, arguing that (like bankruptcy) too much prosecution of bankers posed systemic risk. As Attorney General, Holder negotiated the slap-on-the-wrist with JPMorgan Chase. After he resigned as attorney general, Holder returned to Covington & Burling, a law firm whose clients include Bank of America, Citigroup and Wells Fargo (they even kept his office for him while he was away). Democrats didn’t raise much of a fuss because Holder was Obama’s AG, while Republicans either favored Wall Street deregulation anyway or were so busy fantasizing about Obama’s alleged left-wing socialism that they failed to notice the real-life Obama’s right-wing “crony capitalist” connections.

Consequently, this whole grotesque miscarriage of justice flew pretty much under the public radar. The SEC focused instead on corrupt money managers like Bernie Madoff who, while no doubt felonious, were small fish in comparison with the bank CEO’s who threatened and temporarily crippled the economy (Madoff also took advantage of CDOs). The banks crashed the stock market, not vice-versa. Credit Suisse’s Kareem Serageldin spent 30 months in jail, but he was the exception: none of the other primary culprits went to jail despite the fact that several committed felonies. Either way, the main factors that caused the meltdown weren’t illegal, since bankers had long since bribed politicians to deregulate.

The Dodd-Frank legislation that followed subjected the derivatives market to the same transparency as the regular stock market and forced stricter capital requirements on the banks, which now had to keep twice as much of their cash safe in reserve (20% not invested, as opposed to 10%), while leverage ratios dropped from 30:1 back down to 16.66:1 (a 6% capital cushion). Dodd-Frank also created the Consumer Financial Protection Bureau (CFPB) to help protect citizens against fraud; in its first five years, the CFPB returned $11 billion total from banks and financial companies to consumers. However, in a classic case of the fox guarding the hen house, the banks themselves were put in charge of overseeing the derivatives markets — a $600 trillion untaxed market that contributes virtually nothing constructive to society. Just to put some perspective on that, $600 trillion is ~ 33x bigger than the entire rest of the “real” American economy outside the derivatives markets. The increased capital cushion requirement hurt small businesses because big banks were less willing to take risks lending to small entrepreneurs. The result was that the big banks were bigger than ever and still reaped the profits while taxpayers and investors assumed the risk. With the Federal Reserve lending at virtually no interest, the government had shoveled banks around $3 trillion by 2009. Americans naturally resented the fact that the perpetrators of the crisis actually profited from it while everyone else suffered. On the other hand, TARP paid for itself within a few years and really had staved off a worse systemic crisis. In fact, the government (and, by extension, taxpayers) made $15.3 billion in profit by the time TARP closed operations at the end of 2014. Moreover, big banks had dramatically shrunk their involvement in the (self-regulated) mortgage-related derivatives trade by 2016. Still, CDO’s quietly crept back onto the scene in 2015, rebranded as “bespoke tranche opportunities” (BTO’s).
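
For the arithmetic behind those ratios, note that a leverage ratio is just the reciprocal of the capital cushion, as this quick illustration shows:

```python
# The leverage ratios quoted above are reciprocals of the
# capital (cushion) requirement.
for capital in (0.033, 0.06, 0.10):
    print(f"{capital:.1%} capital -> {1/capital:.2f}:1 leverage")
# 3.3% -> ~30:1 (pre-crisis), 6% -> 16.67:1 (post-Dodd-Frank), 10% -> 10:1
```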

Bank Asset Concentration in U.S., 1997-2012

Senators John McCain (R-Arizona) and Elizabeth Warren (D-Massachusetts) introduced legislation to cap bank size in 2013, effectively trying to break up the big banks. Bernie Sanders (D-VT) backed the idea in his 2016 campaign. According to this line of thinking, until such a law is enacted, too big to fail remains part of the economic landscape. Others argue that “bigness” was never really the problem to begin with and, even with such a law, the overall amount of capital controlled by big banks wouldn’t change; there would just be more of them. Supporters of Dodd-Frank point out that it includes procedures for the Federal Reserve to “wind down” problematic banks that fail stringent stress tests and that some banks are breaking themselves up anyway to avoid the legislation.

Still, the banks’ campaign donations paid off well; patriots like McCain and Warren were the exception among politicians. After a Citibank lobbyist wrote legislation to water down the Dodd-Frank rules on default swaps, Congressman Jim Himes (D-Connecticut) said of the relationship between Wall Street and politicians: “It’s appalling. It’s disgusting. It’s wasteful and it opens the possibility of conflicts of interest and corruption.” Himes, who worked for Goldman Sachs before masquerading as a public servant, co-sponsored the Citibank bill. Banks also decided they would no longer use actual paper (it’s inefficient) and lost or ditched many homeowners’ actual mortgage statements as they swapped them around, then forged new ones when they went to foreclose in the years after the collapse. Just before Christmas 2014, Citigroup and Republican Congressman Jeb Hensarling from Texas’ 5th District (near Dallas) inserted legislation into the overall federal budget deal that weakened Dodd-Frank, loosening up regulations on derivatives and ensuring that taxpayers ultimately cover the increased risk via bailouts and the FDIC. The legislation passed because of lukewarm Democratic opposition and the threat of a government shutdown if the budget didn’t go through.

Dodd-Frank also required that corporations reveal the ratio of pay between their CEO and average employees, though it didn’t require that they limit the ratio. Japan requires that CEOs not make more than 15x a company’s lowest-paid employee. In the U.S., the ratio of CEO-to-average-employee pay went from 20:1 to 300:1 between 1983 and 2013, and there’s resistance even to the idea that companies should have to reveal these ratios to their boards and shareholders. Wall Street hedge fund managers are the most galling because, backed by Republicans and Democratic Senator Chuck Schumer (NY), they bought a loophole on “carried interest” whereby much of their income is taxed at lower capital-gains rates rather than as ordinary income. This farce withstood numerous assaults by the Obama administration until criticism finally went mainstream with Donald Trump’s campaign in 2016. At that point, GOP candidate Jeb Bush joined the chorus of those promising to get rid of it. Don’t hold your breath.

More generally, the Great Recession caused a class conflict within the GOP between business-oriented “Wall Street” Republicans and “Main Street” Tea Partiers who resented banks and corporate power despite their social conservatism and dislike for financial (or any government) regulation. During the 2016 campaign, Democrat Hillary Clinton supported reinforcing Dodd-Frank, but WikiLeaks’ release of her paid Goldman Sachs speeches revealed her suggesting to bankers that Wall Street had been simplistically scapegoated for the crisis as a matter of political necessity (though she stuck with her commitment to Dodd-Frank). Unlike Tea Partiers, democratic socialist Bernie Sanders wanted to regulate and reform Wall Street, but he shared their opposition to the bailout and had voted against TARP as a Vermont senator. Many of the more economically populist Republicans supported Donald Trump in his candidacy as he argued that his own participation in high finance gave him insight into corruption. Trump promised to “drain the swamp” of lobbyists, especially those from Goldman Sachs that he said corrupted Republican rival Ted Cruz and Democratic opponent Clinton. However, once he won the presidency, Trump followed in the footsteps of preceding administrations by filling his administration with Goldman Sachs alumni — including his chief strategist, Treasury Secretary, head of the National Economic Council and, yes, even the chair of the Securities & Exchange Commission charged with enforcing Dodd-Frank.

Not all the economy’s problems were in real estate (Lehman Brothers and Bear Stearns made other bad investments as well), but real estate was at the heart of the crisis. The bubble included both commercial and residential properties. On the residential side, the government had encouraged accessible home loans since the New Deal in the 1930s, including through the Democrats’ Community Reinvestment Act of 1977, which mortgage companies took advantage of through predatory lending. The government shares blame for the Great Recession with Wall Street bankers, duped voters, corrupt ratings agencies, and citizens anxious to live in houses they couldn’t afford. Its bipartisan contribution to the meltdown was an unfortunate combination of deregulation, low interest rates, and intervention in housing on behalf of more ownership. Historian Thomas Sugrue wrote that federal lawmakers promoted the idea of homeownership as “simultaneously the fount of citizenship and virtue, and an asset to be grown and milked.”

Barney Frank

Government-backed, semi-public corporations Fannie Mae and Freddie Mac bought up subprime mortgages under the Department of Housing and Urban Development (HUD), pushed by Congressman Barney Frank (D-Massachusetts) and affordable-housing advocates like ACORN. Frank testified before Congress that “you’re not going to see a [housing] collapse like you would a bubble,” and Fed Chair Ben Bernanke agreed. The government’s HUD-mandated quota of loans classified as affordable housing rose from 30% in 1977 to 50% under Clinton in 2000, to 55% under Bush 43 in 2007. When Bush 43 signed the American Dream Downpayment Assistance Act he said, “We want every American to own a home.” The result of this decades-long push was that over half of the mortgages in the system by 2007 were of the subprime variety, and over 70% of those loans were held by government agencies like Fannie and Freddie, the Veterans Administration or the Federal Housing Authority. Yet according to a study by the National Bureau of Economic Research, most of the bad loans were actually regular prime-rate loans held by middle-class homeowners who’d over-borrowed based on confidence in a never-ending real estate bubble. When that bubble started to burst in 2006-07, Congress responded by trying to re-inflate it in February 2008, expanding the lending capacities of Fannie and Freddie in hopes of spurring the market. Like Alan Greenspan with banking deregulation, Barney Frank realized his mistake in pushing affordable housing by 2007, but by then it was too late. At least Frank tried to clean up his mess; he is the Frank in the aforementioned Dodd-Frank legislation of 2010 aimed at regulating the banks that mishandled the mortgages he helped generate.

One ironic outcome of the real estate crisis was that it helped Fannie and Freddie. The part of the economy that triggered the crisis — housing — remains the least reformed, banking having undergone regulation through Dodd-Frank. At first, there was a near bipartisan consensus that Fannie and Freddie’s role in mortgages should be reduced, especially since they’d contributed to the boom and bust. However, the ensuing crisis wiped out so many mortgage companies, and the big banks lost so much enthusiasm for riskier loans, that Fannie and Freddie’s share of the mortgage market jumped from 40% before the crisis to 80% after. The Veterans Administration and Federal Housing Administration also control a portion of the market via Ginnie Mae. The role of these agencies isn’t to originate the loans or end up owning them, but rather to guarantee them as underwriters, meaning that American taxpayers are on the hook for — as of 2016 — $6.4 trillion worth of mortgages if all homeowners were to default. While they won’t all default at once, it could get expensive if, say, 5% do, triggering another bailout on the scale of TARP or the 1980s Savings & Loan scandal. By 2012, these pseudo-corporations combined for a $100 billion profit — more than Apple, Exxon and Walmart put together — which went directly to the Treasury Department, helping to reduce the budget crisis worsened by the meltdown. It remains to be seen whether Dodd-Frank is mere “lipstick on a pig” or can help stave off a second crisis, and whether Republicans under the Trump administration will weaken it. The good news is that TARP worked and didn’t even cost taxpayers money; the bad news is that Americans (and, by extension, the world) still have a system that privatizes gain and socializes catastrophic losses because the biggest banks are concentrated enough to pose systemic risk. I’ll now reward my patient, discouraged and bleary-eyed reader by transitioning to a spicier (if less important) topic.

The 4th Exam for In-Class Lecture Students and 5th Exam for Distance Learning Currently Covers Information From This Chapter on Crime, Globalization, Healthcare Insurance and the Great Recession, But Nothing Below.  The Rest of The Chapter is Optional.

Monica Lewinsky, ca. 1998


Lewinsky Scandal
In the late 1990s, though, not a soul in sight was focused on credit default swaps or an impending financial crisis; that was still ten years off. The economy was in full swing, aside from a speculative bubble building in tech stocks, and Republicans were riveted on Bill Clinton’s sex life. An Independent Counsel, a special prosecutorial office created in the 1970s after Nixon corrupted the regular Justice Department during Watergate, investigated Bill Clinton. It focused on Whitewater, a real estate investment the Clintons had made years earlier in the Arkansas Ozarks with a shady developer, then later the associated suicide of their friend Vince Foster and, finally, Clinton’s unsettled sexual harassment case with Paula Jones. Anxious to put the issues behind him, Clinton himself authorized the creation of the Independent Counsel investigation. The counsel’s leader, Ken Starr, suspected all along that the original Whitewater charges against Clinton were bogus. However, outside sources like Richard Mellon Scaife, hoping to get rid of Clinton before his term expired, bankrolled the investigation anyway through the Arkansas Project. Having run up against a dead end on Whitewater and Foster, the counsel’s hope was that if they investigated Clinton’s extramarital affairs (e.g. Paula Jones), they might find that he had revealed, or would reveal, something they didn’t yet know about Whitewater during “pillow talk.” If a waste of taxpayer money, it at least spared Americans the boredom and effort of confronting al-Qaeda, climate change, and Wall Street instability.

For his part, Clinton couldn’t keep his drawers zipped up, even though he knew his opponents were pining to catch him in an affair. He struck up a relationship with a young intern named Monica Lewinsky. She told a co-worker about it, who told Starr’s counsel. They subpoenaed Clinton and, lo and behold, had to videotape the testimony because one juror was mysteriously missing, then slipped the tape to FOX News by accident. Under oath, Clinton claimed he “did not have sexual relations” with Lewinsky, but a semen stain on one of her dresses, obtained under warrant, suggested otherwise. Thus, Clinton was guilty of perjury, depending on how exactly one defined sex. Did a semen stain prove he’d had sex? Clinton admitted only to an “improper physical relationship.” He was notorious for this kind of ambiguity, as evidenced by the non-inhaling aspect of his marijuana smoking. The perjury and obstruction of justice charges hinged on whether the actual allegations of oral sex constituted sex in the way that intercourse did. The House of Representatives determined that they did and impeached Clinton, sending the trial to the Senate.

Here’s where things started going awry for the Republicans. They’d been trying to get rid of Clinton for two terms, which was virtually unprecedented in American history. Normally the opposing party just counters policy through the accepted checks-and-balances system and tries to win the next election; the Constitution doesn’t authorize impeaching presidents over policy disagreements or unpopularity. The GOP would get their wish, though, if perjury could be defined as one of the “high crimes and misdemeanors” that constitute the constitutional bar for impeachment. But the public was put off by the lurid details of the Starr Report, not understanding why Clinton’s investigators had taken such a detailed interest in his sex life.

Then journalists discovered that the Republicans’ ringleader, Newt Gingrich (R-GA), was having an affair with a young congressional aide as well, and that he’d asked his wife for a divorce as she was recovering from cancer. Ironically, Gingrich became the first victim of the Lewinsky scandal — yet another politician impaled by his own sword. After he stepped down, replacement Bob Livingston (R-LA) revealed that he, too, had been unfaithful, and resigned in a remorseful press conference. Years later, it came out that Livingston’s replacement, Dennis Hastert (R-IL), had paid $1.7 million in blackmail to a wrestler he’d sexually abused as a high school coach; Hastert went to prison in 2016 for financial crimes related to the hush money.

By now, the public was also starting to realize that the entire Starr investigation had been a waste of time, because the prosecutors knew all along there was nothing to Whitewater, and the Clinton trial was looking like a witch-hunt. Did the U.S. really want to set the precedent whereby private citizens like Scaife could finance impeachment campaigns, launched before anyone had any knowledge of criminal wrongdoing? After all, Clinton’s perjury didn’t cause the original investigation; it resulted from it. Why was there an investigation to start with? No good reason, as it turned out, and at most, nothing that had anything substantive to do with his inveterate womanizing. The Senate sensed which way public sentiment was leaning and voted in Clinton’s favor, 55-45, well short of the two-thirds needed to convict, as his approval ratings shot up to the highest of his presidency (73%). All the Democrats voted to acquit, and ten Republicans broke ranks and joined them. But Clinton’s goose was cooked, politically. He’d let down many Democratic voters, his VP Al Gore (whom he lied to about Lewinsky) and his wife, Hillary. He likely spent the remainder of his presidency sleeping on the couch.

If the Lewinsky Scandal had any more substantive political fallout, it was that it prevented Clinton from working with the GOP on reforming Social Security; some commentators think the two sides were working on a deal to partially privatize the program prior to Clinton’s impeachment.

2000 Election
When Al Gore, Jr. ran for president in 2000, in fact, he distanced himself from Clinton because of the Lewinsky Scandal. His opponents were George W. Bush (R) and Ralph Nader, the old leader of Nader’s Raiders from the 1960s who brought you seat belts (Chapter 16). Nader wanted to clean up Wall Street corruption, reduce CO2 emissions and kick lobbyists out of Washington. Gore had to pander to farmers and coal miners to win the Democratic nomination, so the future subject of An Inconvenient Truth (2006) was ambiguous about reducing carbon emissions. Nader ended up stealing some votes from Gore in the critical state of Florida, though only a tenth as many as the Florida Democrats who crossed over and voted for Bush.

But, what made this election famous was the controversy on election night over who won and when. The media are in a tough spot calling states (naming the winner) after only 2-3% of the votes are in, because each network wants to be first but none wants to be wrong.

2000 Presidential Election Results by County, with Gore Counties in Blue and Bush Counties in Red

Florida is a tough-to-call “swing state,” with a sizable 25 all-or-nothing electoral votes at stake in 2000. Like Ohio, it has a northern and a southern contingent, culturally speaking, except that in Florida’s case its “South” is up north in the panhandle and its “North” is down south among Yankee retirees. On election night, the networks first called Florida for Gore, retracted that call, and then called the state, and with it the whole close election, for Bush too early; Gore even called Bush to congratulate him on his win. Gore never held a press conference to concede the race, though, and he called Bush back to rescind his earlier concession when he started to catch up in Florida. With the networks using blue to denote states Gore won on their maps and red for Bush states, the terms red and blue states entered the political vocabulary to represent Republican and Democratic, respectively. As you can see from the county map, the GOP dominated sparsely-populated rural areas, while the Democrats did well on the coasts, in cities, and in areas with predominantly Hispanic and black voters.

By midnight, Florida had narrowed and was too close to call. Bush wisely spun it as if he’d already won and was being challenged by Gore. Having a cousin (John Ellis) running FOX News’ election-night decision desk, his brother Jeb as governor of Florida, and a Republican secretary of state (Katherine Harris) overseeing the state’s election helped him spin the result as a challenged victory rather than what it really was: a virtual tie. Gore wanted to recount certain Florida counties, but not all of them, while Bush wanted no recount at all. When officials did recount certain counties, they found that many Democrats had accidentally voted for Reform Party candidate Pat Buchanan because of Palm Beach County’s confusingly designed “butterfly” ballot, while many voters of both parties had partially punched the hole next to their candidate but left the “hanging chad” — the little punched-out piece of paper that dangles from the ballot without falling off. In the meantime, minority voters claimed that white police officers had intimidated them away from the polls (similar charges arose in Ohio and, regarding Native Americans, in South Dakota in later campaigns).

Bush’s camp flew in Republican operatives to stage noisy protests (the so-called “Brooks Brothers riot”) so that mainstream voters would want the recounts stopped. Finally, the Supreme Court ruled 5-4 in Bush v. Gore to stop the recounts ordered by Florida’s Supreme Court because inconsistent recount standards across counties would violate 14th Amendment rights to equal protection, and because the result “might not reflect well on Bush’s victory.” The ruling fell strictly along partisan lines, five conservative justices to four liberals. It was tortured logic, and hypocritical given the GOP’s usual emphasis on states’ rights, but it may have incidentally served justice according to later independent recounts. When the Miami Herald and USA Today recounted the disputed counties, they found that Bush had won Florida by a small margin. When the National Opinion Research Center, commissioned by a consortium of media outlets, counted all the counties — what should have been done in the first place, though neither candidate favored it — they found that naming the winner depended on how the ballots themselves were counted (the hanging chads, etc.). Gore would’ve won the most restrictive but consistent standard by 127 votes, and Bush would’ve won the most inclusive but consistent standard by 110 votes. These studies didn’t take voter fraud or intimidation into account; if there were even a shred of truth to those charges, it would’ve swung the election by more than these razor-thin margins. It was a tough loss for Gore, who defeated Bush nationally in the popular vote. He is one of five candidates in history to win the popular vote and lose the presidency (Hillary Clinton in 2016 became the fifth).

The heated controversy surrounding Bill Clinton and the contested 2000 election dovetailed with an increasingly confrontational media landscape and increased partisanship. Bipartisanship — the idea of the competing political parties working together “across the aisle” (of the Senate or House floors) on compromise — has generally been seen as a positive thing in American history, even though conflict and disagreement are natural and healthy parts of the system, too. In 2014, the Arizona Republican Party censured longtime senator, 2008 presidential nominee, and Vietnam veteran John McCain for having cast too many bipartisan votes in his career; he was officially declared guilty, in other words, of having cooperated across the aisle and voted his mind. Tea Party perceptions that House leaders John Boehner and Eric Cantor were negotiating budget deals with President Obama helped end their careers, as well.

Clearly, this is not a golden era of bipartisan cooperation. Partisans have always done their best to smear opponents (see Chapter 12 in the HIST 1301 column for the 1800 election), but today’s spin-doctors are better organized and funded. In fact, both parties hire politically neutral professional character assassins — it’s a thriving trade. A huge portion of time in office is spent raising money rather than governing, and the money goes toward negative campaign ads that everyone claims to hate but that are actually more effective than positive ones. Whoever can afford to smear the opponent more usually wins over a largely uninterested, ignorant, or misinformed public. Richard Kimball, who ran for the Senate from Arizona, explained to voters how the system works in the middle of a debate in 1986:

Understand what we do to you. We spend all of our time raising money, often from strangers we do not even know. Then we spend it in three specific ways: First we measure you, what it is you want to purchase in the political marketplace – just like Campbell’s soup or Kellogg’s cereal. Next, we hire some consultants who know how to tailor our image to fit what we sell. Lastly, we bombard you with the meaningless, issueless, emotional nonsense that is always the result. And whichever one of us does that best will win.

Kimball went on to found Project Vote Smart, a non-partisan, non-profit site for serious voters interested in cutting through the fluff and learning where candidates actually stand on issues and who they take money from. Vote Smart takes no money from corporations, unions, or any source generally affiliated with liberals or conservatives. Another neutral source on where candidates stand (minus the financial backing) is the League of Women Voters, whose voter guides ask the same questions of every candidate and simply print the responses. Other non-partisan sites are listed at the bottom of the chapter for citizens who want to rise above the noise and hysteria and get informed.

Why else has partisanship increased so much in the recent past? The Cold War bound the parties together unusually well in the postwar period, as did a near consensus on the broad goals of New Deal programs like Social Security (Barry Goldwater was an exception). But the bipartisan era started to fray with Lyndon Johnson’s Great Society, Richard Nixon’s corruption, and the partisan Watergate hearings. The primary system institutionalized in the 1970s, in which each party holds its own election to nominate candidates, tends to produce more extreme nominees, who then usually move toward the center in the general election. Most of the dirt tossed around in general elections was already stirred up during the primaries within each party. And while 40-45% of American adults identify as independents, most can’t vote in most primaries because primaries are usually “closed” to non-party members (the rationale being that, in open primaries, voters from one party could cross over and vote for the weakest candidate in their opponent’s primary).

Another factor intensifying partisanship is the increasingly fragmented media landscape, littered with hyper-partisan outlets and even fake news. For more on that, see the supplement at the end of the chapter entitled “The New Yellow Journalism.” Most people are generally aware of the problems in today’s media, which lacks any agreed-upon center of gravity. But there are also structural trends in America’s checks-and-balances system that have poured fuel on the partisan fire.

Checks & Balances: Beats the Alternative?
Conflict isn’t new, as anyone who has read the previous chapters can tell. We have multiple layers of checks and balances built into the system by design. States and the national government have a naturally adversarial relationship, and the three branches of the national government (judicial, executive, and legislative) constantly check each other. That multi-layered, three-branch federal system can lead to gridlock and, indeed, it’s set up to do just that, to a certain extent. The party out of power (the minority) can block the agenda of the party in power, especially with the Senate filibuster — talking or debating so long that a bill never comes to a vote unless three-fifths of the full Senate (60 of 100 senators) votes to invoke cloture and end debate (the threshold was two-thirds prior to 1975). At first, filibusters were used only in rare circumstances but, in the mid-20th century, Southerners like Strom Thurmond (D-SC) used them as a tactic to block civil rights legislation that would’ve passed with simple majorities. By the early 21st century, filibusters were commonplace. In fact, the cloture rule inadvertently made filibusters more common, because senators can filibuster knowing that (in theory) the filibuster will be overcome while still telling constituents back home that they did their best to block a bill unpopular in their state. Now, unless one party controls the White House, the House of Representatives, at least 60 Senate seats, and a cooperative Supreme Court, nothing dramatic is likely to get done.

As of 2012, overall congressional job approval sat at 14%, far below that of the most unpopular presidents in history. Most things that do get done happen via the president circumventing Congress altogether with executive orders, or through courts taking matters into their own hands. Most voters hate “judicial activism” and presidential use of executive powers when the opposing party wields them, and defend both practices when their own party does. On foreign policy, presidents have assumed near-total control of the military from Congress, a trend set in motion under Franklin Roosevelt during WWII. When George W. Bush and Barack Obama signed long-term security agreements with the governments of Iraq (2008) and Afghanistan (2012), they didn’t even bother consulting Congress and, more tellingly, no one in Congress seemed to care.

The legislative branch, originally conceived as the primary voice of the people and the driving force among the three branches, doesn’t function very well. Or does it? For many people, complaints about gridlock really boil down to the fact that they share a country with people they disagree with. That’s not the system’s fault and isn’t necessarily even a crisis, other than a self-imposed, contrived one. A smoother-running system could just produce more laws that people don’t want. Everyone bemoans gridlock, but not as much as they’d hate the opposition party running roughshod over them. Britain’s parliamentary system, unlike the U.S. system, hands the victorious party control of government for as long as it can stay in office. But before you jump to the conclusion that their system is preferable, consider what would happen when the politicians you disagree with most took control of all the branches. Politicians in early 21st-century America are responding to voters who genuinely disagree on key issues like the budget and climate change.

Congress probably isn’t as utterly dysfunctional as people think, anyway. Prior to the 2010 midterms, it passed some significant legislation, including Obamacare and the Dodd-Frank bill. Whatever those laws’ strengths and weaknesses — all bills are compromised in ways that leave everyone unhappy — the final product in both cases was arguably similar to what would’ve emerged from a Congress that drank and played golf and poker together, the way Democrats and Republicans used to when they could stomach each other.

Political Polarization

But there’s a growing sense that the system has fundamental flaws that interfere with the democratic process and discourage compromise. Three of these problems we’ve already covered: media fragmentation and enhanced gerrymandering, introduced in the previous chapter, and the increased use of the filibuster mentioned above. Four other developments — three inadvertently caused by seemingly positive changes — contribute to that uncompromising partisanship and dysfunction. These factors, if you’ll pardon the redundancy, are yet more cases of the Law of Unintended Consequences.

First, Congress is now more transparent and less corrupt than it used to be. Bear with me if that sounds backward. Congressmen used to make more backroom deals, and members with seniority ran powerful committees that hammered out bargains behind the public’s back. The Rules Committee even added amendments making it impossible for the other house to alter legislation — a huge source of contention even when done legally and openly. In 1974, newly elected liberal Democrats revamped the committee system in an effort to dislodge the conservative Southern Democrats who’d long served as chairmen and, in 1994, Newt Gingrich’s reforms further weakened the role of senior committee chairs. By making the legislative process more transparent — today most hearings are open to the public and some air on C-SPAN — they made politicians more accountable to their constituents. How is that bad? you might ask. Because constituents don’t want their representatives compromising with the opposition by making deals, so accountability gums up the system.

Second, lobbying also gums up the gears more than it used to. While all lobbying in the form of campaign donations is arguably corrupt, the practice has become more transparent and, like the more transparent legislative process, that too has had unforeseen negative consequences. Now lobbyists can check up on their competition through public records, helping them decide where and when to peddle their influence most effectively. That influence is also more widely dispersed across rank-and-file legislators, because power is no longer concentrated in the hands of key committee chairs. And since Congress doesn’t do as much in secret anymore, lobbyists can hold the politicians they fund accountable for standing up for them in debates. Whereas a politician could once tell lobbies, “I’m sorry; I tried to plead your case but failed” — really meaning that he’d swapped their cause for an opponent’s vote on an issue he deemed more important — today’s lobbyist attends the hearing personally or watches on C-SPAN. The result, again, is less of the compromise that greases the system. To wit: the undemocratic, corrupt backrooms of yore, filled with cigar smoke and “horse trading,” worked better in some ways than real democracy. Government doesn’t “get as much done” when it has to work out in the open through televised hearings, accessible guest books, and town hall meetings, amidst public factions, voters, and lobbies poised to launch recalls, run negative campaign ads, and slow things down further with court injunctions.

Lobbyists also deal more directly with individual candidates during elections than they used to. The bipartisan McCain-Feingold bill of 2002 limited the amount of “soft money” going to political parties, but that ultimately redirected more money toward the candidates themselves in the form of Super PACs, 501(c)(4)s, and 527s, especially after the Supreme Court ruled in Citizens United v. FEC (2010) that corporations (non-profit or for-profit) and labor unions can’t be restricted in their independent political spending without abridging their First Amendment right to free speech. The ruling has had the overall effect of weakening the political parties, as seen most dramatically in the Republicans’ inability to control (or block the nomination of) their own candidate, Donald Trump, in 2016. In fact, despite promises to “drain the swamp” of political corruption, Trump chose the head of Citizens United, David Bossie, as his deputy campaign manager.

Third, the outlawing of pork barrel legislation has given congressmen less incentive to compromise and swap favors with each other. “Pork” is extra spending “thrown into the barrel” (in 19th-century slang) on top of key legislation to help swing votes. Pork is not very transparent or democratic: the public rarely knows what goes on as provisions are swapped behind its back and tacked onto other bills amidst the backroom cigar smoke and horse-trading. For good reason, the public didn’t appreciate having key decisions made behind its back, and we saw some of the downside of this system in the financial meltdown above. In 2011, a coalition of Tea Party Republicans and some progressive Democrats banned pork barrel earmarks (targeted spending provisions) in the House of Representatives. But here’s the problem: pork is how things got done. Pork is why opposing parties worked together and hashed out compromises. Pork is why the road in front of your house is paved and the local school got built. Now, angry reform-minded citizens vote in candidates who promise “change,” who’ve “had enough” and want to “shake up the establishment,” but that means candidates who don’t know how to compromise, and nothing gets done. The Constitution set up a system that only works if, ultimately, congressmen and women are forced to compromise. Folks, we’re learning that cigar smoke and secrecy aren’t all bad in politics.

A fourth issue intensifying partisanship is that the budget is of more concern than ever. The total national debt, as opposed to the annual budget deficit, keeps growing, as shown by the charts below. Things don’t look quite as dire when you compare the debt to the size of the overall economy, but a looming budget crisis nonetheless sharpens partisan swords, especially with the Right by and large opposed to compromise but slightly outnumbered among the population.

Charts: Federal Debt in Total Dollars, Historically by Year; Annual GDP Growth in Relation to Public Debt

Parting Thoughts
“Democracy,” in the words of former British Prime Minister Winston Churchill, “is the worst form of government except all the others that have been tried.” One could plausibly say the same about capitalism. Instead of ignoring public life or letting yourself degenerate into cynical fatalism, step back and ask yourself whether the American system of democratic capitalism has really failed you in the big scheme of things. We don’t live in a genuine democracy directly responsive to voters but, then again, no one before you or elsewhere ever has either. Moreover, we don’t know that such an idealized republic would necessarily be better than a political system responsive (first and foremost) to informed lobbyists — as long as the lobbyists are diverse and their influence-buying is somewhat transparent. Unlike you, lobbies don’t vote. So, besides reading up on issues, educate yourself before voting about which lobbies fund which candidates on non-partisan sites like those listed below. In the words of Watergate’s famous Deep Throat, “follow the money.” Also keep in mind that, nearly alone among the world’s countries (Bolivia being the other exception), the U.S. elects many of its judges: 39 of 50 states elect at least some judges rather than appointing them. Here again, follow the money; those campaigns are financed mainly by corporate lobbies.

Human societies are full of conflict, and the purpose of non-Utopian politics isn’t so much to eliminate that conflict as to channel it as constructively as possible. Historian Henry Adams (John Quincy Adams’ grandson) hit the nail on the head when he said, “politics is the systematic organization of hatreds.” Since America’s big, diverse populace is genuinely divided and wants different things, maybe its system of gridlock interspersed with occasional compromise is the best we can do. It beats civil war.

In common with earlier eras, Americans of the early 21st century share an obsession with national decline — fueled by real problems like climate change and terrorism, but also predicated on a lack of appreciation for how complicated and challenging those earlier eras were and for how much some things have improved (e.g., crime, pollution, life expectancy, race relations, agricultural and industrial productivity, space exploration). Novelist Gustave Flaubert wrote, “Our ignorance of history causes us to slander our own times.” It’s always been thus. Hopefully, you’ve learned enough history by this point to understand that yours is not the first generation to confront challenges, nor is it likely to be the last.

Optional Listening, Viewing & Reading:
Supplement: The New Yellow Journalism
Backstory, “Stuck: A History of Gridlock” (Virginia Foundation for the Humanities)
John Green, “Why Are American Healthcare Costs So High?”
Peter Wehner, “The Party of Reagan Is No More,” TIME, 3.10.16
Avik Roy, “The Tortuous History of Conservatives & the Individual Mandate,” Fortune, 2.7.12
“History of U.S. Bailouts & Their Results, 1970-Present,” ProPublica
Michael Hirsh, “Why Trump & Sanders Were Inevitable,” Politico Magazine, 2.28.16
Jonathan Rauch, “How American Politics Went Insane,” Atlantic, 7-8.16
David Greenberg, “How Roger Ailes Created Modern Conservatism & How Donald Trump Upended It,” Politico, 7.20.16
Jedediah Purdy, “A Billionaire’s Republic,” Nation, 7.11.17

Non-Partisan Political Information
THOMAS (Library of Congress Legislative Information)
OpenCongress (Search For & Email Your Representatives)
PolitiFact (Non-Partisan B.S. Meter — Winner of Pulitzer Prize)
FactCheck.org (Non-Partisan Annenberg Policy Center)
Follow the Money (National Institute on Money in State Politics)
Project Vote Smart (Just the Facts)
OpenSecrets.org (Center for Responsive Politics)