“Reagan doesn’t have that presidential look” — United Artists Executive Rejecting Reagan for Lead Role in The Best Man (1964)
The liberal wave started during the Progressive Era, gained momentum with FDR’s New Deal in the 1930s, crested with LBJ’s Great Society in the 1960s, and ran out of steam by the mid-1970s. Conservative Ronald Reagan won the 1980 presidential election by arguing that “In this present crisis, government is not the solution to our problem; government is the problem.” The Departments of Energy (1977-) and Education (1979-) expanded the federal bureaucracy somewhat in the ’70s, but the public mood was shifting back toward smaller government at all levels. Ralph Nader’s idea of creating a bureaucracy for consumer protection went nowhere. The Civil Aeronautics Board went away in 1985 and the Interstate Commerce Commission (railroads and trucking) in 1995, their safety-enforcement duties transferred to other agencies. America, of course, never swings all the way in one direction or the other — right or left — but the momentum and rhetoric shifted right during the conservative resurgence of the 1970s and after. To repurpose for liberalism what Winston Churchill said about the meaning of WWII’s Battle of El Alamein: “this was not the end. It was not even the beginning of the end. But it was, perhaps, the end of the beginning.”
Does America really have a “big government”? Relative to the size of its economy, America’s government as of 2015 was fairly small by international standards, spending around 14.5% of GDP compared to Australia (18%), Germany (19%), Russia (19%), the United Kingdom (19%), and Canada (21%). These World Bank figures are lower than the Office of Management & Budget figures (right), which usually hover around 20%. The U.S. leads the world in total expenditures, though, spending ~$6.2 trillion in 2023 while collecting $4.5 trillion in revenue (USAFacts). America spends more on its military than its next eight competitors combined and twice that of China and Russia combined (SIPRI), but doesn’t provide health insurance for those under 65, except for Medicaid, which is administered by the states. Of course, spending and military power aren’t the only measures of a government’s size or reach; there’s also its legal/regulatory system and the power of law enforcement.
Look at the chart above and think of how steadily the drumbeat about the federal government getting bigger has grown over the last few decades. Separate the signal from the noise: the noise is ongoing explosive growth; the signal is that we spent some extra money during WWII and COVID-19 and saved some on the military when the Cold War ended, with a financial hiccup around 2008-09. Domestically, there’s a lot of noise about “tyranny” but, collectively, U.S. laws don’t stand out as being overly oppressive in comparison to other nations. Some of that noise about the growing government originates among those profiting from beating the drum, while some originates from quarters unfamiliar with real tyranny or suspicious of a deep state within intelligence agencies operating outside the purview of the public or even the regular government. Conspiracy theories are more entertaining and profitable than real knowledge or perspective. Still, many laws originate in agencies run by non-elected bureaucrats and housed under the executive branch such as the IRS, FDA, OSHA, FCC, EPA, etc., and it’s not surprising that citizens resist or resent these laws when they’re onerous or administered with an inflexible or heavy hand. Governments grow because voting citizens across the political spectrum demand things, because agencies grow to police other agencies, because big companies lobby (i.e., bribe) politicians to pass regulations that small companies can’t afford to comply with, and because bureaucracies have a natural tendency to grow like fungi regardless of the first three reasons.
By the 1970s and ’80s, the Great Society era launched by Lyndon Johnson was waning, and the public was ready to send the pendulum back in the other direction, despite still wanting the services and protection government provided. You could trace one turning point in liberalism’s demise to New York City’s near-bankruptcy in 1975, when the city ran out of money to pay public employees and had to look overseas to sell municipal bonds. While New York was the first city to tighten its budgetary belt, California was the first state, though in its case it cut taxes more than spending. California passed Proposition 13 in 1978, capping property taxes at 1% of assessed value, limiting annual increases in assessments, and requiring a two-thirds legislative majority to raise state taxes. The brewing conservative resurgence wasn’t just about cutting taxes, but also about reducing government intervention in the economy and reinjecting religion into politics. With the election of Ronald Reagan in 1980, the conservative revolution launched under Barry Goldwater in 1964 led to a fundamental changing of the guard in Washington and in many states. Conservatives didn’t take over entirely, but they seized the momentum and have more or less maintained it ever since.
Stagflation & Energy Crisis
Yet, the first president we’ll discuss as part of this conservative resurgence was a Democrat. By the 1976 election, the public wanted the most anti-Nixon, anti-Vietnam, anti-Watergate candidate they could find. They found him in Jimmy Carter (D), a born-again, peanut-farming governor from Georgia untarnished by Washington politics. Watergate began a trend toward outsider candidates, resulting in state governors Carter, Reagan, Clinton, and Bush the Younger all winning the presidency. After defeating Gerald Ford in a tight 1976 election, Carter came to Washington with what some congressmen perceived to be a holier-than-thou attitude and didn’t work well with what he perceived to be corrupt Washington insiders. Like Richard Nixon, he had a fortress mentality in the White House, not cultivating relationships with congressmen on Capitol Hill. He alienated conservatives by creating the Department of Energy to try to wean the country off Arab oil (the GOP didn’t want more bureaucracy, and oil companies feared breakthroughs in alternative energy), and he alienated Great Society Democrats by overseeing economic deregulation (more later) and insisting on a balanced budget. In that way, Carter was more of an independent and fiscal (budgetary) conservative than a party-line Democrat. As a fiscal conservative, Carter alienated Democrats by refusing to go further into debt and Republicans by refusing to cut taxes. But his main, seemingly intractable problem was stagflation.
Normally prices don’t rise during a recession but, by the mid-1970s, the U.S. was mired in stagflation: the unlikely combination of inflation and high unemployment. LBJ’s Great Society and the lengthy Vietnam War raised the federal deficit, contributing to inflation. So, too, did President Nixon’s decoupling of American currency from the Gold Standard in 1971, after too many trade-surplus nations swapped greenbacks for a dwindling gold supply (American dollars had been convertible to gold dating back to the Bretton Woods Conference in 1944). Going off the Gold Standard created a confidence-based, or fiat, currency instead. Greenbacks from then on were worth whatever people thought they were worth, based on their confidence in America’s solvency and survival. As for coins, they cost more to mint than the copper or nickel in them is worth. Caught between its two mandates — curbing inflation and supporting job growth — the Federal Reserve initially didn’t raise interest rates to stem inflation because it feared that would further weaken the job market. Inflation has a mixed impact on the government itself. It helps because, in real dollars, it reduces the money the government owes; however, the higher interest rates needed to curb inflation raise the cost of issuing bonds to fund future debt.
As we saw in the previous chapter, OPEC (Organization of Petroleum Exporting Countries) embargoed oil in 1973 to show the West how dependent it had become while punishing it for supporting Israel. Then OPEC raised prices and, within a few short years, oil climbed from $3 to $12/barrel. The Iranian Revolution that we’ll cover later in this chapter caused another big spike in oil prices in 1978-79. This coincided with increased dependence on foreign oil. Oil hit $1 per gallon (roughly $40 per 42-gallon barrel) for the first time in American history, far higher even adjusted for inflation than the ~5¢ per gallon it had cost in the 1950s.
To offset high prices and Peak Oil, the U.S. built the Trans-Alaska Pipeline between 1975 and ’77, running from the North Slope, which was iced in much of the year, to the Port of Valdez, where tankers could ferry oil to the Lower 48. That still wasn’t enough to offset the price hike. Many Americans overreacted by trying to hoard gasoline, not realizing that global price and local supply aren’t always linked. There was often no actual shortage in the pipeline. Either way, prices remained high, and oil is so important to industrialized societies that it can drive up inflation even during a recession, thus the stagflation. European countries tax oil enough to deliberately make it expensive and then use the taxes to pay for mass transit, encouraging people to conserve. The United States has historically devoted more to its military budget instead, partly to ensure the flow of cheap oil.
Detroit wasn’t well prepared to manufacture fuel-efficient cars in the 1970s. For years leading up to then, bigger was better. The oil crisis coincided with the overall rise of European and Japanese industry from the ashes of WWII, and those countries were better at making smaller cars that got good mileage. The U.S. had rebuilt them as industrial powerhouses after the war and succeeded beyond its expectations. They were now fully rebuilt, which was good for the world economy, but it also meant that the U.S. wasn’t the only kid on the block. In Detroit itself, frustration took the form of a white autoworker and his stepson beating to death Chinese-American draftsman Vincent Chin with a baseball bat in 1982. A plea-bargaining judge gave the killers three years’ probation and a $3,000 fine for manslaughter because they had histories of solid employment (see optional video below).
Among other things, foreign competition meant that American unions would steadily weaken from that point forward, and most middle- and working-class families would need both parents working to keep up.
American manufacturing was entering a long, slow period of decline, as one factory after another shut down in the Rust Belt of the industrial Northeast and Midwest. That trend continued into the early 21st century if you look at manufacturing jobs. But if you measure by output, American manufacturing has done well in recent decades. In fact, the U.S. is producing more than ever; it’s just producing more with automation and lower-paid staff rather than union workers. There is even an unmet need for semi-skilled workers in American plants that can only be met by immigration or more young people seeking work in manufacturing. As factory unions weakened, public unions suffered setbacks as well. Major cities like New York struggled with a combination of low taxes and well-pensioned public unions (policemen, firemen, sanitation workers, teachers, etc.).
President Carter described the need to transition to alternative energy as a moral imperative, or the “moral equivalent of war.” But his pleas for Americans to conserve energy out of a sense of patriotism mainly fell on deaf ears. Carter learned that, while most Americans are patriotic when it comes to wars, they’re less enthusiastic about turning down the AC in the summer or the heat in the winter. The moral argument ran headlong into strands of Sunbelt Christianity associated with the oil industry — what historian Darren Dochuk called “wildcat Christianity” or “oil patch evangelicalism.” Then a near meltdown at the Three Mile Island nuclear plant outside Harrisburg, Pennsylvania in 1979 dampened the public’s enthusiasm for atomic energy as a viable alternative to fossil fuels. Carter had trained as a nuclear engineer himself, serving in the Navy’s early nuclear submarine program, and visited the plant personally during the height of the crisis. Engineers staved off a full meltdown of the reactor’s inner core, but the near-miss spooked the public and few new reactors went into construction afterward. Had they not been able to cool the reactor, the core would’ve melted down into the earth, irradiating the soil and water around it and potentially exploding when it contacted groundwater. At least the crisis propelled robotics, as engineers had to build a robot to clean up the wreckage and water because the site was too contaminated for humans. It finished its work in 1993, fourteen years later. In 2024, Microsoft inked a deal with Constellation Energy to re-open a Three Mile Island reactor to power a data center.
A popular movie called The China Syndrome — so-named because one fanciful scenario held that a melting reactor core might burn through the Earth “all the way to China,” which it wouldn’t — premiered just prior to the actual emergency, seemingly foreshadowing it. Problems with waste disposal and much worse crises in the USSR in 1986 (Chernobyl) and Japan in 2011 (Fukushima Daiichi) dampened the industry’s prospects. The Soviets also covered up an accident at Kyshtym in 1957, and the CIA didn’t mention it either, even though they knew, so as not to dampen Americans’ enthusiasm for nuclear energy. President Eisenhower’s dream of always having one of the giant reactor chimneys within sight as one drove across the country didn’t materialize, and neither did mini-reactors in each home or smaller devices to propel vacuum cleaners and other appliances. We didn’t use nuclear weapons to widen the Panama Canal or clear cuts through western mountains for interstates as some hoped. Nuclear-powered coils that could melt driveway and sidewalk snow didn’t happen in the 1950s, and they weren’t about to after Three Mile Island. Still, today, nuclear reactors supply ~20% of America’s electrical power, and that power is carbon-free.
When Carter talked about the economic malaise the country was in, he came across as ineffectual in fixing the situation or as telling Americans something they didn’t want to hear. His administration was struggling to keep the basic cost of living and energy from rising faster than wages. For many Americans, especially white-collar workers or union workers with automatic cost-of-living adjustments (COLAs), the price-wage spiral allowed them to keep up. People are skeptical about this concept, but sustained inflation requires wages to spiral upward too; otherwise no one would be able to afford the higher prices. But wage hikes of the 1970s were uneven across occupations, as is always the case during inflation. Some blue-collar workers lost ground, and retirees on fixed incomes saw their “nest egg” savings shrink at the rate of inflation, probably the cruelest effect of high inflation. Investors weren’t happy either. While stock prices rose in nominal dollars, there was no real increase (adjusted for inflation) from 1965 to 1982. Still, stocks were a better investment than a low-yielding savings account which, in effect, lost money as it grew at a lower rate than inflation.
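To put rough numbers on that last point (an illustrative sketch — the rates here are hypothetical, not drawn from any particular year): the real, inflation-adjusted return on savings is approximately the nominal interest rate minus the inflation rate.

\[ r_{\text{real}} \approx r_{\text{nominal}} - \pi, \qquad \text{e.g.} \quad 5\% - 10\% = -5\% \]

A savings account paying 5% during 10% inflation loses about 5% of its purchasing power per year even as its dollar balance grows, and the same arithmetic explains how stocks could rise in nominal dollars from 1965 to 1982 without any real gain.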
Just as the capacity of wages to rise along with inflation varies among occupations, so too, inflation is uneven from sector to sector. For instance, in the 21st century, despite low inflation in retail and deflation in electronics, alcohol, and air travel, prices in housing, healthcare, and education rose relative to income. In an inflationary environment, borrowing makes sense because the amount you’ll owe later is, in effect, less, so people who hang on to their jobs keep borrowing and spending, which drives inflation even more. In a desperate move to stop double-digit inflation in 1980, which stood at a staggering 14%, Federal Reserve Chair Paul Volcker raised interest rates dramatically, slowing the economy because fewer people could borrow, but at least reversing inflation’s rise. As the short-term borrowing rate between banks soared to 20%, the nation dipped into its worst recession since the 1930s, this one deliberately caused.
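The same back-of-the-envelope formula shows what the Volcker hike did to the incentive to borrow, using the two rates just cited (a simplification, since the interbank rate isn’t what consumers paid):

\[ r_{\text{real}} \approx 20\% - 14\% = +6\% \]

Through much of the mid-’70s, by contrast, nominal rates sat near or below inflation, so real rates were roughly zero or negative and borrowers could repay loans in cheaper dollars, feeding the borrowing-and-spending spiral that Volcker’s positive real rates finally broke.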
Some economists argue that Volcker’s drastic actions were unnecessary because prices would’ve subsided on their own, without government action. There were many causes of inflation in the 1970s besides low interest rates, and those would’ve naturally taken care of themselves, according to this argument.
Slamming the brakes on the economy by raising rates also raised unemployment, which Volcker supporters like Milton Friedman claimed hovered naturally around 5% anyway. Unemployment shot up to nearly 11% (see chart above). In truth, the country was in a bind that didn’t offer any easy solutions, and Carter sided with the conservative approach of Volcker and Friedman. America took its Volcker chemotherapy, killing inflation cells along with growth and employment cells and deliberately triggering a telegraphed recession. Britain tried another approach, temporarily switching to a three-day workweek to conserve energy during its mid-’70s coal shortage, but that just deepened shortages and spiked prices even more. Things got worse in America before they got better, and the economy didn’t bottom out until 1982. Farmers with variable-rate land mortgages also felt the pinch of higher rates, especially those who lost their wheat and corn export trade to the USSR with the end of détente (more below), while high production lowered commodity prices and a strong dollar weakened overseas exports. Over 900 farmers committed suicide in the Midwest, while thousands more families lost farms to banks through foreclosure, leading musicians Willie Nelson, Neil Young, and John Mellencamp to organize Farm Aid benefit concerts.
Inflation rose briefly after COVID-19 but has mostly subsided, from 8+% in 2022 to 3+% by December 2023. As we saw in the Chapter 5 section on the Fed, recent inflation resulted from a perfect storm of COVID-related disruptions in supply chains (especially global), tight oil supplies, Trump-Biden China tariffs, and worker shortages — all combining to cause demand to exceed supply. Some people blamed too much COVID-19 relief under Trump and Biden, but $1200 stimulus checks to Americans didn’t cause global inflation, and they helped stave off a recession domestically. Oil, food, and fertilizer prices were up because of Russia’s invasion of Ukraine. The Federal Reserve can’t fix problems outside its control, including gas and food prices, the part of inflation that’s most troubling to citizens (and voters). By 2023, supply-chain snags were clearing up and gas prices lowering, but a tight labor market kept wages, and thus prices, on the rise, especially in service industries, and high corporate margins (profits) kept prices high, too. Keep in mind, though, that falling inflation rates don’t mean prices are going down (that would be deflation), but rather that the rate at which they’re rising is decreasing. It’s good news for young people entering the workforce that wages will climb in certain sectors as employers compete for help in an economy with low unemployment.
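A quick worked example of that distinction, treating the two rates above as back-to-back annual rates for simplicity: start from a price level of 100, then apply 8% inflation one year and 3% the next:

\[ 100 \times 1.08 \times 1.03 \approx 111 \]

Prices end about 11% higher than where they began even though the inflation rate more than halved; only a negative rate (deflation) would pull the level back down.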
It’s also possible that the recent spike in gas prices will either accelerate the transition to EVs and hybrids, which would be good for the environment in the long run, or lead to a political backlash against renewables. Part of the problem is that energy companies aren’t making long-term capital investments in fossil fuels (e.g., refineries) because they anticipate the switch to EVs, contributing to near-term shortages. Every demographic in America except white Democrats supports expanding oil production in the near term, at least, to keep prices down, and Biden released oil from the Strategic Petroleum Reserve after Russia’s invasion of Ukraine. In the case of ExxonMobil, its potential loss in gas revenue with the advent of EVs will be partially offset by the need for more plastic, a petroleum byproduct, as we’ll need lighter-weight vehicles for greater EV range and less tire pollution.
Deregulation
Jimmy Carter made some changes that helped the economy long-term besides initiating the painful process of slowing inflation. To deregulate is to lessen the government’s role in a sector and/or take laws off the books. One theme you’ll see in this chapter is that, much like its inverse, regulation, deregulation can have unpredictable and unforeseen consequences, good and bad. If it were as simple as the-more-laws-the-better or the-fewer-laws-the-better, we would’ve figured that out already. Spurred by Republicans and his intraparty rival, Senator Ted Kennedy (D-MA), Carter deregulated some industries that had been under the government’s control, including transportation (airlines, trucking, rail) and natural gas lines. In 1978, regional startups such as Southwest began to undersell big national airlines on an open market, challenging the original five-headed government-sanctioned oligopoly of United, Eastern, Braniff, American, and Delta. The impact of airline deregulation has been to dramatically cheapen the cost of flying, adjusted for inflation, so that by the late 20th century it was common for middle-class Americans to fly. However, in comparison to the mid-20th century, airlines now put less emphasis on customer service because they distinguish themselves by selling cheaper coach tickets rather than serving customers who are all paying similar, high rates. Trucking deregulation has been a net loss for short- and long-haul truckers, leading to lower pay (with fewer unionized as Teamsters), more dangerous working conditions (crashes also kill 5k motorists annually), and corrupt lease-purchasing agreements that rarely result in truckers owning their rig.
Communications followed the same deregulatory trend, triggered by a 1974 antitrust lawsuit that, by 1984, broke up Ma Bell (AT&T) into regional “Baby Bells” (forerunners of today’s AT&T, Verizon, CenturyLink, etc.), a process one AT&T engineer described as “trying to take apart a spider web without breaking it.” The government had sanctioned the Bell System’s monopoly in 1921, and its Bell Labs research and development dominated patents, stifling competition, while the company rolled its profit into buying up over half the U.S. bonds on the market. Taking direction from the Federal Communications Commission (FCC), the quasi-public utility’s slogan was: “one policy, one system, universal service.” That service was decent, widely accessible, and affordable for local calls, but expensive for long distance, and public pay phones were hard to maintain. Pre-cable TV piggybacked on AT&T’s transmission lines, and Richard Nixon hated the big three networks that could afford to pay AT&T’s ransom. So Nixon’s Office of Telecommunications Policy and DOJ overcame AT&T’s lobbyists and filed the suit that, ten years later (1984), opened up telecom for competitive pricing just before the advent of mobile/cell phones, starting with car phones and “bricks.” They also opened up competition in satellites just as satellites replaced expensive long-distance communication lines, and freed cable TV to compete with networks in 1972. The 1996 Telecommunications Act then allowed any company to compete in any market. The biggest company now is none other than AT&T, but it’s unlikely that smartphones would’ve come about in the same way under just Ma Bell.
Other industries were deregulated in the 1970s, too. Credit card companies — to the detriment of the working poor and unfrugal middle class, but in keeping with free-market principles — won the right to charge unlimited interest rates. Jimmy Carter also signed off on legislation allowing for the creation of Business Development Companies (BDCs) that gave small investors a tax-friendly way to invest in private businesses and riskier startups than those otherwise allowed in the SEC-regulated public stock markets. All this likely helped lay the foundation for economic recovery in the 1980s but wasn’t enough to help Carter at the time. Finally, on a lighter note, Carter signed legislation legalizing home-brewing in 1978 that advanced the rise of craft beers as independent, artisanal brewers initially honed their craft in garages.
Arms Race
Meanwhile, Jimmy Carter had plenty of problems overseas to deal with. Building on Richard Nixon’s foundation, the U.S. officially normalized relations with China in 1979, but Nixon’s détente with the Soviets came unraveled under Carter. The Soviets gained influence in Ethiopia (East Africa) as a Marxist state killed hundreds of thousands in a Red Terror and various relocation schemes. Carter’s emphasis on human rights threatened the Soviets, and the arms race spiked dramatically because of better ICBMs (intercontinental ballistic missiles) and multi-warhead MIRVs (multiple independently targetable reentry vehicles) that, according to rumor at least, could be dosed with biological weapons. This LGM-118 “Peacekeeper” MIRV, tested by the U.S. over the Kwajalein Atoll, divided into eight 300-kiloton warheads, each ~20x more powerful than the Hiroshima bomb if detonated.
In the 1970s, each side was testing submarine-launched ballistic missiles (SLBMs) and air-launched cruise missiles that could be loaded onto traditional bombers like America’s B-52s. The U.S. placed medium-range cruise missiles similar in design to the Nazis’ old V-1 “flying bombs” in southern England, loaded onto mobile launch pads in the payloads of trucks parked in underground bunkers. These relatively cheap warheads, each costing only ~¼ as much as a jet fighter, had a 2k-mile range and could incinerate a small city, killing everyone in a ten-mile radius. In general, the Soviets focused more on size whereas the U.S. focused on accuracy. Each side additionally worked on neutron bombs that could wipe out life without destroying property, though the U.S. shelved plans to arm NATO with the new weapons due to public pressure. Near the end of his presidency, Carter announced that both sides had more than 5x as many warheads as they had in the early 1970s. A round of SALT (Strategic Arms Limitation Talks) between Carter and the Soviets slowed the madness some, limiting each side to 2,400 strategic delivery vehicles (missiles and bombers) and convincing the Soviets to halt production on new MIRVs that carried up to 38 separate warheads.
Middle East
Six months after the SALT II talks in Vienna, in December 1979, the Soviets invaded Afghanistan to support communist forces in a civil war there against the jihadist Mujahideen. In response, Carter embargoed agricultural trade, boycotted the 1980 Moscow Olympics, and issued a doctrine stating America’s intention to protect its oil interests in the Middle East. However, nothing could compel the Soviets to relent in their misguided quest to conquer Afghanistan and arms reduction talks stalled.
West of Afghanistan, resentment had been building in Iran against the U.S. ever since 1953, when the CIA helped overthrow Iran’s parliamentary democracy and replace it with a dictatorship (the Shah) that sold the West cheap oil. America’s role in that overthrow grew in the Iranian imagination as the years passed. Unfortunately for Carter, he reaped what President Eisenhower and his successors had sown. The only accommodation the Shah had made to free speech was within mosques, so anti-Western sentiment fused with fundamentalist Islam there over the decades. When Islamic revolutionaries took over the country in 1978-79, the Shah escaped to Egypt and Mexico, then sought cancer treatment in the U.S. after Henry Kissinger insisted on it in exchange for endorsing Carter’s SALT II treaty. Granting the Shah entry was the straw that broke the camel’s back for the new leaders, who seized the American embassy in Tehran, capturing diplomats and Marines in the process. Before the Carter administration granted the Shah asylum at a New York hospital, it should’ve evacuated the embassy in Tehran. According to international law, embassies are enclaves protected from the societies around them, but U.S. embassies were breached in Vietnam (1968), Iran (1979), Lebanon (1983), Tanzania (1998), Kenya (1998), and Libya (2012), as was Mexico’s in Ecuador (2024).
The Iranian Hostage Crisis gave rise to a warped, symbiotic relationship between western media and terrorists. The American TV-watching public was riveted by the event, but the kidnappers granted the media access inside the embassy only because it provided them a forum to broadcast their anti-American message. At least the English-speaking kidnappers and hostages bonded over pro football, as western media gave them cassette recordings of the 1980 Steelers-Rams Super Bowl to listen to together. Meanwhile, Iranian TV televised executions of Iranians live, nightly, as an assortment of factions within the country fought for control of the revolution. At first, Carter hoped the crisis could divert Americans’ attention away from the domestic economy, but that backfired as the crisis wore on and ABC’s Nightline covered angry Iranians burning Uncle Sam in effigy. CNN launched in 1980 to take advantage of viewer interest in the crisis. For Americans still in a post-Vietnam funk, it was like getting salt poured in their wound. Finally, Carter ordered a military rescue, but poor planning compromised the operation and one helicopter crashed into the C-130 tanker aircraft there to re-fuel the helicopters, killing eight Americans, whose bodies were left behind. The CIA foolishly chose a desert staging area with a highway running through it (the team blew up a fuel truck after the driver and other witnesses drove off in a pickup, and temporarily kidnapped 44 bus passengers). The fiasco further torpedoed Carter’s presidency and raised the prestige of revolutionary leader Ayatollah Khomeini (below), whom the Shah had exiled in 1964.
The one big feather in Carter’s foreign policy cap was negotiating peace between Israel and its most formidable adversary, Egypt. Carter built on the Shuttle Diplomacy initiated by Henry Kissinger in the early 1970s, whereby the U.S. no longer supported Israel unconditionally but rather tried to broker peace between Israel and its neighbors. Building on an idea raised by CBS News’ Walter Cronkite in a split-screen interview with the leaders of Israel and Egypt, Menachem Begin and Anwar el-Sadat, Carter invited both to Camp David, Maryland, for a retreat.
At first, Carter had to walk from one end of the compound to the other to relay messages, but he eventually got both men in the same room to talk through interpreters. In the Camp David Accords, Israel agreed to return the Sinai Peninsula in exchange for Egypt’s recognition of Israel’s right to exist. The two have been at peace ever since, though the democratic revolution in Egypt in 2011 threatened the relationship because, potentially, a populist Muslim Brotherhood in Egypt might favor war with Israel. As for Sadat, his own soldiers assassinated him during a parade in 1981 for negotiating with Israel, underscoring the resistance to peaceful compromise that Middle Eastern leaders face among their own populations. A similar fate awaited Israeli leader Yitzhak Rabin after he laid out a framework for peace with Palestinians in the 1993 Oslo Accords. He was killed in 1995 by a right-wing Israeli opposed to peace.
For Carter, his success with Israel and Egypt wasn’t enough to offset setbacks in Iran and Afghanistan. In retrospect, Afghanistan was causing the Soviets more harm than the Soviets were causing the U.S., but Iran plagued Carter as he approached re-election in 1980. After months of negotiations to get their assets unfrozen in American banks, Iranians released the American hostages within minutes of Carter leaving office. As new president Ronald Reagan took the inaugural oath in January 1981, freed hostages hit the airport tarmac. This, too, played out on a split-screen.
Morning In America
It’s hard to say whether the Iranian Crisis cost Carter re-election or not. By 1980, the time was right for the Reagan Revolution, as Americans were ready for a conservative change of pace. The Misery Index, as economists came to call stagflation’s combined inflation and unemployment rates, set the stage for Republican victory by Californian Ronald Reagan over Carter in 1980. The actor and former liberal had tried to lure Dwight Eisenhower into running as a Democrat in 1952 and supported Harry Truman’s plan for universal health insurance until his conservative (second) father-in-law talked him out of it. As president of the Screen Actors Guild, he veered right during the Red Scare, serving as an FBI informant. With no foreknowledge of classic-movie revivals or TCM, he negotiated for actors’ future royalties, or residuals (he’d seen the popularity of 1939’s The Wizard of Oz on TV starting in 1956). Reagan became a spokesman for General Electric in the 1950s and advocated deregulation of all industries. He became governor of California in 1966 after campaigning for Barry Goldwater’s presidential run in 1964, saying of his defection that “I didn’t leave the Democratic Party; it left me.” As California governor in the ’60s, he deftly used race riots and the counterculture as foils. He ran on a law-and-order platform, just as Nixon did in the presidential election, saying that inner cities were “jungle paths after dark.” He ordered tear gas dropped from National Guard helicopters on protesting “beatniks and malcontents” at Cal-Berkeley and upbraided professors for their leftist leanings, sometimes hauling news crews in with him to their office hours. By 1980, he led movement conservatism. If Goldwater was the godfather of the conservative counter-revolution, Reagan was its messiah.
At the 1976 GOP convention — the last contested convention to date — Reagan nearly stole the nomination from Gerald Ford. With his telegenic charisma and jocular charm, the “Gipper” gave a rousing speech even as the party crowned the more moderate Ford as its candidate. Ford was an Eisenhower Republican whereas Reagan was a more conservative Goldwater Republican. His acting talent served him well in politics and provided built-in familiarity. Most famously, Reagan had played dying Notre Dame football player George Gipp in Knute Rockne: All-American (1940). 1980 was a watershed election, on par with 1932 in terms of swinging the American political pendulum back toward the right, just as ’32 had swung it to the left. Fusing optimism and patriotism, Republicans, for the first time since the New Deal, pried away a significant chunk of blue-collar workers. Many of these Reagan Democrats wanted to restore military pride or, in the case of some Christians, opposed the Democrats’ pro-choice abortion platform. The Supreme Court had legalized abortion in Roe v. Wade (1973), fully during the first trimester. Reagan tapped into public skepticism about government agencies and programs launched under Johnson’s Great Society in the 1960s like welfare, public housing, and food stamps, whereas Goldwater had taken on a Social Security program that was popular among working-class Whites. As we read in Chapter 16, one commentator said that “Goldwater lost against the New Deal, but Reagan won against the Great Society.” Reagan’s motto in 1980 was “Let’s Make America Great Again,” but there weren’t red LMAGA hats.
One of Reagan’s first speeches after his nomination was at the Neshoba County Fair outside Philadelphia, Mississippi, where three civil rights workers were killed in 1964. Reagan and his advisers weren’t just throwing darts at a map and picking random towns instead of larger cities. While not endorsing segregation or violence, he used the occasion to remind the townspeople of how much he’d always appreciated their commitment to states’ rights in the context of a talk on the innocuous subject of education. Like many who went before him, he’d learned to couch racism in an anti-federal context, especially appealing to suburban racists who, like Reagan, obsessed over not being labeled as racist. This time-honored political trick is known as a dog-whistle, named for whistles that dogs can hear but humans can’t: coded messages that racists hear but mainstream voters don’t. The obvious advantage is plausible deniability. The rebel flags in the crowd were, of course, merely about “heritage,” and the fact that everyone knew states’ rights in the South was associated with Jim Crow was coincidental. When his opponent Jimmy Carter accused him of racially-coded language, Reagan was insulted and demanded an apology.
In the 1960s, Reagan opposed the national government’s role in the 1964 Civil Rights Act and ’65 Voting Rights Act, which he denounced as “humiliating” the South and later tried to weaken as president. When confronted about race, his stock response was that he supported Jackie Robinson’s integration of baseball in 1947. Other times Reagan was more plain-spoken: campaigning for California’s governorship, he appealed to Whites in areas like Orange County by denouncing the state’s Rumford Fair Housing Act, arguing that “If an individual wants to discriminate against Negroes or others in selling or renting his house, it is his right to do so.” In a 1971 phone call to Richard Nixon that he didn’t realize was being recorded, he referred to African leaders as “monkeys…still uncomfortable wearing shoes,” in keeping with his characterization of American cities as “jungles” and frequent jokes about African cannibalism. (Likewise, when a supporter who hadn’t donated enough to his campaign asked John Kennedy for an ambassadorship, he purportedly said, “F*** him. I’m going to send him to one of those boogie republics in Central Africa.”) The key for politicians, though, isn’t their own racism but rather manipulating that of voters. Nixon admired how Reagan played on the “emotional distress of those who fear or resent the Negro, and who expect Reagan somehow to keep him ‘in his place.’” It makes historians queasy to see this racism paired with Reagan’s promise of trickle-down economics — the idea that making the rich richer will ultimately create working-class jobs — because rich planters, the “one-percenters” of the 19th century if you will, paired those two in the antebellum South, except that poor Whites there never got anything out of the bargain except their pride in being above enslaved workers and the right to serve on slave patrols. Playing on the “emotional distress of those who fear or resent the Negro” has been such a major theme in our history that it helps explain the entire evolution of America’s two-party system from the founding to the present.
But Reagan was craftier on the record by the late 1970s, exploiting the public’s resentment toward people who were taking unfair advantage of the welfare system. He asked working-class white audiences if they were tired of working hard for their paycheck, then going to the grocery store and seeing a “strapping young buck” ahead of them in line with food stamps. Prior to the Civil War, “bucks” or “studs” referred to enslaved males that masters encouraged to reproduce with “wenches.” During his 1976 campaign, Reagan popularized the term welfare queen for women who took advantage of the government’s well-intentioned (if misguided) idea of paying single unemployed moms more than married couples. The term derived from Linda Taylor, an African American who, in 1974, was caught defrauding the government with multiple identities and later sentenced to prison. Reagan’s audience understood who the strapping young buck and welfare queen were. He wasn’t going to beat people over the head with explicit racism (he wasn’t stupid), but neither was he going to leave the old southern Democratic voters and George Wallace supporters on the table. In an infamous 1981 interview (YouTube), South Carolina GOP strategist Lee Atwater explained how his party won over racists without sounding overtly racist.
Reagan was continuing a variation on the GOP’s Southern Strategy, begun under Nixon to siphon off the old Democratic “negrophobes.” The concern over welfare abuse transcended race, though. In Hillbilly Elegy: A Memoir of a Family & Culture in Crisis (2016), future politician and vice-presidential nominee J.D. Vance recalled how working-class Whites in the Appalachia of his youth resented other, lazier Whites who ate (and drank) better than they did by staying on the public dole permanently, spurring the workers to abandon the Democratic Party.
Republicans gained control of the South, fulfilling LBJ’s prophecy about the Civil Rights movement, partly through various Southern Strategies on race, partly through their general limited-government philosophy (including cracking down on welfare abuse among all races), and partly through their new alliance with Christian Fundamentalists. Fundamentalism had been growing since the 1970s, and abortion, legal since Roe v. Wade (1973), gave Republicans a wedge issue to galvanize their alliance, along with Christian nationalism. Consequently, many Reagan Democrats, North and South, Protestant and Catholic, crossed the aisle and voted GOP for the first time, regardless of their economic class and despite Republican hostility toward unions, which would lack the right to collective bargaining in a free market. Economically, they were either willing to sacrifice their interests on behalf of outlawing abortion and strengthening the military, or were convinced that wealth really would trickle down if the rich got richer.
You need to be careful what you wish for in politics, as both parties have learned with abortion. Roe was a seeming victory for liberals but, like the Civil Rights movement of the previous decade, it further splintered the Democrats’ New Deal coalition. As of 1973, when SCOTUS ruled on Roe v. Wade, each party was split about evenly on the issue of abortion. It hadn’t yet become a polarizing issue either in terms of public awareness or partisanship. In fact, moderate Republican justices wrote Roe, just as they had Brown v. Board (1954). But Republicans siphoned off far more working-class Christians into the GOP than Democrats did pro-choice Republicans. Today, you can’t function as a Republican if you’re not pro-life or as a Democrat if you’re not pro-choice, and many voters in each party’s base stake out absolutist positions. It is the most defining litmus test in each party’s platform — more important than either’s take on economics, diplomacy, the environment or, more recently, democracy itself. But fifty years later, modern Republican voters are not uniformly pro-life. So, in another case of being careful what you wish for, Republican politicians are like the proverbial “dog that caught the car” after they overturned Roe in Dobbs v. Jackson (2022), especially after putting draconian trigger laws on the books in red states based on the assumption that it wouldn’t happen. The GOP is currently trying to put out the fire from swing-state Arizona re-enacting an 1864 law that banned abortion outright, with Donald Trump opposing the law even though the justices he appointed provided the margin in Dobbs v. Jackson. Setting aside religion or morality, the ideal political position for the GOP was just to oppose Roe, not actually overturn it.
In the 1970s, evangelicals didn’t agree on whether this political alliance was a wise deal, given the amorality of politics. Jerry Falwell bridged fundamentalism and the GOP with his Moral Majority, but the most famous evangelist, Billy Graham, aka the “Protestant Pope,” who counseled every president from Harry Truman to Barack Obama, urged Christians to avoid partisanship (right), despite Graham’s earlier ministry having paved the way for the infusion of religion into politics. According to evangelical journalist David French, this alliance opened the door for the 21st-century Christian Right to support politicians who undermined its moral values, even without being aware of it. Conversely, influential conservative and 1964 GOP candidate Barry Goldwater didn’t want to dilute conservatism with too many “preachers,” arguing in 1981 that they could be uncompromising. Despite these misgivings, religion took on a revived role in American politics during the Carter-Reagan era and after. Carter was born-again and wanted to carry Christ’s message of peace into the real world. Reagan, too, was interested in the New Testament, especially its final book, Revelation, which he suspected might foreshadow an apocalyptic showdown between America and the USSR. He secured a lasting alliance between Christian fundamentalists and the Republican Party. Today no candidate could run for president without fully explaining his or her faith. Mormons (e.g., Mitt Romney 2012) and Jews are more or less welcome to join the sweepstakes along with Christians, while this 2024 Gallup poll indicates that other groups’ prospects are at least improving from earlier polls. Increasingly, such polls would vary if disaggregated by Republicans and Democrats, but this poll can’t fill Joe Biden with confidence in the run-up to the 2024 election.
Reaganomics
Economics, though, was where the right-wing Reagan set himself apart the most from Democrats. Just as FDR had tried to jump-start the economy through government spending, Reagan’s supply-side economics reversed the concept of Keynesian demand-side stimulus, focusing not on government spending but on tax cuts, especially for the wealthy and corporations. Several major corporations were essentially on welfare throughout Reagan’s presidency because their tax rebates exceeded their tax bills. It’s difficult to tell how much the wealthy were actually paying in income taxes prior to 1980, but the top rate went from 70% when Reagan came into office down to 28% by 1988, so they were the biggest and most obvious beneficiaries of his election. Just as the New Deal philosophy wasn’t that new, drawing on ideas that Populists and Progressives had advocated for decades, so too, Reaganomics drew on ideas that conservative economists at the University of Chicago, like Milton Friedman, had promoted since WWII: the Chicago School of Economics.
Critics saw all this as a cover for power and privilege, increasing the gap between rich and poor. But was it “Reagan-Hood,” as Reagan’s critics charged? In other words, did Reaganomics really steal from the poor and give to the rich, the opposite of the legendary Robin Hood? Yes and no. He helped the rich plenty, but his record was mixed on the poor. He reduced food stamps, school lunch subsidies, and most forms of student aid (e.g., Pell Grants), and cut painkillers from disability coverage, leading to a black market in drugs like oxycodone. His reductions in publicly-funded mental health facilities increased homelessness. It’s hard to untangle these cuts from coded racist rhetoric. To the extent that race played a part in the pullback on public spending, which is impossible to prove or measure, it victimized poor Whites as collateral damage. However, spending continued through Reagan’s presidency on most of the core New Deal programs and even much of the welfare from the Great Society. Some tax burdens were shifted to the states but still came out of paychecks just the same. Reaganomics kicked off an era when Americans continued to spend on core entitlements (Social Security and Medicare) while voting themselves tax cuts.
Reagan’s budget director, David Stockman (right), a follower of Austrian free-market economist Friedrich Hayek (Chapter 9), quickly learned the limits of what a full-blown Reagan economic revolution would entail. Stockman, the “father of Reaganomics,” realized that since cutting most non-military spending would decimate “Social Security recipients, veterans, farmers, educators, state and local officials, [and] the housing industry…democracy had defeated the [free market] doctrine.” In other words, it would be political suicide to cut off all those demographics since they can vote. Senate Majority Leader Mitch McConnell reiterated this point in 2018 when Republicans had control of both chambers and the White House, meaning that, unless they’d been bluffing all these years, they could finally cut Social Security and Medicare all they wanted to balance the budget and offset tax cuts. That would’ve required legal changes rather than just lowering the amounts (appropriations) because those entitlements come under mandatory spending rather than discretionary, but they could’ve changed the law. The problem is that most GOP voters who favor lower spending and smaller government don’t actually want cuts to include Social Security or Medicare, just Medicaid for the poor. In an unprecedented burst of honesty, McConnell said the pain that comes with meaningful cuts makes it “difficult if not impossible to achieve when you have unified government” (translation: we need the Democrats to win back at least one chamber so that they can take the blame among a voting public that, collectively, wants higher spending for lower taxes). The truth is that democracies and budgets don’t dovetail well. The $6.2 trillion in spending for 2023 was ~38% higher than the $4.5 trillion collected in revenue. Even while consistently running huge deficits and burdening its grandchildren with the tab, the public complains about insufficient services and high taxes. In 2020, Donald Trump asked a private audience at Mar-a-Lago, “Who the hell cares about the budget?” The current answer is Republicans when a Democrat is in office and nobody when Republicans are in office. Yet today’s House GOP wants to cut funding for the IRS, the agency that collects revenue. If there’s a silver lining in this cloud, it’s that ~22% of this debt is held by the U.S. government itself, mainly in government trust funds such as Social Security’s.
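The deficit arithmetic there is simple division, using the 2023 figures above:

\[ \frac{\$6.2\text{T} - \$4.5\text{T}}{\$4.5\text{T}} = \frac{\$1.7\text{T}}{\$4.5\text{T}} \approx 0.38 \]

In other words, Washington spent about 38% more than it collected and borrowed the $1.7 trillion difference.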
The result of Reagan’s concession to core New Deal programs, when combined with increased military spending and tax cuts, was ballooning debt. Rather than always trying to sell the multiplier effect (the claim that tax cuts would pay for themselves by increasing revenue), Reagan often candidly told American voters that he was willing to plunge the country into debt to win the Cold War if that’s what it took. And Reagan likely knew, wisely if cynically, that overspending helps a sitting president and punishes his successors — a lesson his successors took to heart. George W. Bush’s VP Dick Cheney said Reagan proved that, in politics, “deficits don’t matter.” Adjusted for inflation, Abraham Lincoln and Franklin Roosevelt are the runaway leaders in growing the size and cost of the federal government because of the Civil War and World War II. But aside from them, which presidents oversaw the most growth in the size of the national government? Surprisingly, George W. Bush (87%) and Reagan (82%) in non-inflation-adjusted numbers (Source: USGovernmentSpending.com). This is as good a time as any to remind ourselves that, while presidents submit budget proposals, the Constitution puts Congress in charge of the nation’s purse strings as far as setting budgets, though presidents sign off. Given our current partisanship, it’s heartwarming that Reagan got along well with House Speaker Tip O’Neill (D-MA), but really their bipartisan budget compromises just meant mounting debt for future Americans.
Reagan’s supporters often claim that the debt-to-GDP ratio actually shrank under Reagan, meaning that the economy grew more than the debt, and that the ratio of federal spending to the overall economy fell, but that’s not the case. The debt-to-GDP ratio grew in the 1980s (at the top of the chapter we looked at annual spending vs. GDP rather than debt). According to the multiplier theory of supply-side economics, lower tax rates were supposed to stimulate growth enough to increase overall tax revenues even though rates dropped, but that didn’t happen. Nonetheless, the economy took off on a long bull run lasting through the late 1990s. By that measure, and a booming stock market, Reaganomics was a success.
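To see how the ratio can climb even while the economy booms, here are rough, rounded figures for illustration (gross federal debt and GDP, both only in the right ballpark for 1980 and 1989):

\[ \frac{\text{debt}}{\text{GDP}} \approx \frac{\$0.9\text{T}}{\$2.8\text{T}} \approx 32\% \ (1980) \qquad \longrightarrow \qquad \frac{\$2.9\text{T}}{\$5.5\text{T}} \approx 53\% \ (1989) \]

GDP nearly doubled over the decade, but the debt more than tripled, so the ratio rose: a growing denominator can’t keep pace with a faster-growing numerator.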
But did increased wealth “trickle down” to workers as supply-side advocates promised? Again, yes and no. Trickle-down economics isn’t simply true or false, as money is constantly flowing in both directions, including down in the form of salaries. Also, understand that some of the inequality charts you might see are skewed because they’re based on earnings before the taxes that hurt the rich and the government aid that helps the poor (ours on the left is after taxes). As explained in Rogé Karma’s optional article below, these stats are also complicated by our lack of reliable statistics on tax evasion (conservative scholars assume that wealthier Americans cheat less because they’re already rich). But we have some basic stats, and the question is whether, as the political slogan implied, making the rich richer created a net gain for workers in relation to the rest of the economy. There’s no doubt the economy grew over the next twenty years, and the booming stock market of the 1980s and ’90s helped all workers tied to defined-benefit pensions and defined-contribution 401(k) or IRA retirement funds, along with stimulating overall growth. As our graph shows, working classes didn’t suffer significant wage reductions on average between 1980 and 2007, just before the Great Recession. Wealthy people don’t necessarily earn money at the expense of the poor in a zero-sum game. The overall earning power of most workers stagnated, though, except for people in the upper 20%. Also, most workers didn’t stay at one job long enough to take full advantage of either type of retirement fund, and most weren’t investment-savvy enough to manage their own defined-contribution funds to the fullest advantage.
Many economists claim that it’s wrong to look at the economy like a zero-sum pie and ask who is getting the biggest piece because the pie itself is growing. That’s true to a certain extent and, yes, it’s true that America’s poor suffer more from diseases of abundance (diabetes, obesity, etc.) than hunger. But regardless of the size of the pie, it is finite at any given moment, and the gap between rich and poor widened between 1980 and 2019 (COVID), with most of the money trickling up. Really, it was more like a tidal wave than a trickle. While the rich have gained more proportionally than the working and middle classes, the ultra-rich (top 1%) have gained far more than the poor, the middle, or the regular rich. Since their taxes on capital gains/investments (15% after the Bush-era tax cuts) are lower than most workers’ taxes on income, their overall effective rates fell beneath those of workers by 2018.
The top 1% lost ground in the Great Recession of 2008-09, but when the slow recovery kicked in around 2010, they increased their lead over the bottom 99%. Much of that wealth is in the hands of entrepreneurs who’ve created jobs and products for the rest, but much of it has gone to investment bankers and hedge-fund managers who stash their earnings in offshore accounts to avoid paying taxes. Tax shelters, combined with the fact that taxes are lower on investments than earned income, mean that most of the wealthy now pay a lower effective tax rate than the middle and upper-middle classes and, if you don’t like it, you’re guilty of fomenting “class warfare” and favoring “totalitarian government.” By 2012, the richest 400 people in the U.S. had more money than the bottom 50% of the population, and wages stood at an all-time low as a percentage of GDP. However, since COVID-19, working-class wages have risen at a faster rate than upper-class wages. Still, 40+ years after the Reagan Revolution, the public is now skeptical enough about trickle-down economics that 21st-century conservatives have dropped the term from their campaign platforms while still favoring supply-side economics.
Another key to the Reagan Revolution was deregulation. Reagan was adamant in his aforementioned philosophy that government is not the solution to our problems; government is the problem. The deregulatory trend started under Carter in the 1970s but gained momentum under Reagan. He rolled back environmental and workplace safety (OSHA) regulations, loosening rules on pesticides and chemical waste, and got rid of many rules governing banking and accounting. The financial changes, especially the evolution of Special Purpose Entities (SPEs), contributed to problems like the Savings & Loan Crisis, Michael Milken’s junk bond-related fraud, and Enron’s accounting fraud in the late 1990s. The taxpayer bailout of corrupt Savings & Loans cost Americans 3-4% of GDP between 1986 and ’96. On the other hand, allowing accountants to “cook their books” may have stimulated the economy, and Milken used some of the money he stole to help fund medical research. But, ideally, one purpose of accounting, other than running a business responsibly for your own sake, is so that other people (employees, investors, the IRS) can get a feel for what’s going on. Reagan also cut funding for the Small Business Administration (1953-), the only government agency aimed at helping small entrepreneurs, but one that could be spun as yet more bureaucracy costing taxpayers because of some failed loans.
Mergers were another hallmark of Reaganomics, as courts were hesitant to prosecute monopoly cases. Remember all the hullabaloo about trust-busting in the Progressive Era? An upstart football league called the USFL, led by, among others, Donald Trump, sued the NFL in an antitrust case in 1986. A lower court determined that the USFL was right — the NFL did have a monopoly on pro football — and awarded the USFL a grand total of $1. While that particular case wasn’t influential, it symbolized the era. The Reagan Revolution led to a wave of mergers in the 1980s and ’90s that continues to this day. Unions didn’t fare much better than trust-busters in the 1980s. Shortly into Reagan’s presidency, air traffic controllers went on strike. He had the FAA order them back to work and fired the more than 11,000 who refused. The graph below is striking, but remember that correlation doesn’t necessarily imply causation (see Rear Defogger #6).
Media Fragmentation
Deregulation also impacted media. This lengthy section is important since free speech is a double-edged sword within democracies and is currently challenging our own. Partly to reward Rupert Murdoch for helping him win New York state in 1980 via Murdoch’s New York Post, Reagan got rid of the law preventing simultaneous ownership of newspapers and television stations, allowing Murdoch to acquire Twentieth Century Fox and spin off FOX News in 1996. In 1987, the Federal Communications Commission (FCC) revoked its 1949 Fairness Doctrine requiring TV and radio broadcasts to be fair and “tell both sides of a story” when touching on controversial topics. There was bipartisan support in Congress to codify the doctrine into law, but President Reagan vetoed the bill. The Fairness Doctrine was arguably a violation of the First Amendment, but its repeal fragmented news into what it is today, where most conservatives and liberals just listen to their own tribe’s spins, with little center of gravity in the middle to rely on for “straight news.” Like the Newspaper War of the 1790s, when Federalist and Democratic-Republican papers just reinforced their own partisans’ views, and the bitter regionalized media of antebellum America that led to the Civil War, we’ve returned to the narrowcasting journalism model. In the run-up to the Dominion Voting Systems defamation trial in 2021-23, for instance, FOX essentially argued for a First Amendment right to disinformation (they settled out of court), claiming that, technically, they were just reporting other peoples’ misinformation about the 2020 election results. Anonymous influencers on X — their posts amplified by Trump and Elon Musk — are currently arguing that undocumented migrants are being registered to vote by the millions in the 2024 election, even though it’s virtually impossible to register without an SSN. SCOTUS is currently weighing whether disinformation, as opposed to misinformation, is protected under the First Amendment, but it’s a fine line because people can always argue later that they were just wrong. In the meantime, disinformation-for-hire is a booming industry for those looking to profit off our republic’s decline.
A year after the Fairness Doctrine’s repeal, Rush Limbaugh of Sacramento launched his national radio show in 1988, lambasting what he later called the “Four Corners of Deceit” — science, government, media, and academia. Today, many Americans think a cabal of those four groups is conspiring against them. Limbaugh, whose bombastic Manichaeism echoed Father Charles Coughlin (Chap 9), Carl McIntire from the ’50s, and Joe Pyne in the ’60s (Chap 16), argued that feminism was launched to give unattractive women entry to society, called gays perverts and, once, at the height of the AIDS epidemic, played “I’ll Never Love This Way Again” in the background while mirthfully reading off the names of perished victims. He pushed the envelope racially, too, telling one black caller to “take that bone out of your nose and call me back,” saying everyone on composite wanted posters looked like black politician Jesse Jackson, suggesting the National Basketball Association (NBA) be renamed the Thug Basketball Association, questioning Obama’s citizenship, and calling him “Barack the magic negro.” But his millions of devoted “Dittoheads” interpreted his engaging banter as refreshing, courageous integrity in an era of irritating and stifling political correctness. In the 21st century, he and his offshoots feasted on the perceived excesses of woke politics like sharks on chum.
Limbaugh was irreverent, especially in his younger years, influentially playing rock and country songs as bumper music in and out of commercials and connecting to listeners in a way that journalists hadn’t prior to deregulation — especially conservatives, who didn’t have a voice in mainstream near-left media. His shows gave succor to conservatives in liberal enclaves like Seattle, San Francisco, or Austin. Limbaugh shaped contemporary conservative politics more than Republican presidents did and reshaped media by blurring journalism and entertainment into so-called infotainment. As chronicled in the optional article below by Brian Rosenwald, “Limbaugh’s style would come to be reflected in everything from late night comedy shows to cable news channels to podcasts of every ideological flavor and style — the very things that define our political media in 2021. Whether one likes Rachel Maddow, Stephen Colbert, Joe Rogan, or Sean Hannity, he or she is engaging the media world created by Limbaugh.” Limbaugh saved AM radio when most music had migrated to FM and ushered in a wave of conservative talk show hosts, including future VP Mike Pence. Much of the public, human nature being what it is, prefers infotainment over drier analysis if given the choice, while comedians like Colbert, Jon Stewart, John Oliver, and Greg Gutfeld wove substance into comedy to make it palatable. Polls showed that people who voted for both Barack Obama and Donald Trump saw each as more entertaining television than other candidates. Trump also benefitted insofar as many viewers got to know him as host of NBC’s Celebrity Apprentice from 2008 to 2015, just as many Ukrainians got to know Volodymyr Zelenskyy playing a president on TV. When Trump left office in 2021, ratings sank for all cable news, right and left.
Adversarial journalism profited from another deregulatory law, the forenamed 1996 Telecommunications Act signed by Bill Clinton, which allowed for greater consolidation and vertical integration among communications companies. Crucially, Section 230 of the 1996 law exempts Internet providers and hosts from liability regarding most content on their platforms, and the often anonymous contributors to those platforms are rarely held liable in practice either. Currently, neither social media nor toxic anonymous message boards with Holocaust humor (e.g., 4chan > 8chan > 8kun) are legally accountable for events that result from their content, up to and including mass shootings, political violence, and disinformation. Section 230’s broad immunity raises thorny issues involving the First Amendment. The optional article below by Jan Werner Müller cautions against blaming social media, in particular, for today’s problems. After all, we don’t blame telegraphs, phones, radios, TVs, and newspapers for past troubles like World War II. Yet the Web has more capacity to manufacture crises and manipulate perceptions than older media. By comparison, courts have been fairly lenient toward newspapers over the last fifty years, generally erring on the side of protecting them from libel suits if they correct mistakes (e.g., Sarah Palin’s suit against the New York Times). But there’s at least a line there with disinformation that journalists can’t cross with impunity. Online versions of legacy media (newspapers, magazines, etc.) still can’t libel, but libel standards for other platforms are, at best, murky. And, even on TV, both MSNBC’s Rachel Maddow and FOX’s Tucker Carlson fended off defamation suits because judges ruled that their audiences understood their op-ed commentaries weren’t to be taken literally.
The Sinclair Broadcast Group brought a right-leaning if subtler approach to local news at the hundreds of network affiliates it bought in the South and Midwest. Much news now is what people used to refer to as op-ed (for opinions and editorials) within each tribe’s echo chamber or filter bubble, and the respective chambers don’t report on the same stories. Blindspot is a good reminder of this selection bias (Rear Defogger #21). Another advantage that op-eds have for news outlets is that they’re cheaper to fund than investigative journalism. Op-eds are beneficial to democracies and fine in their own right, as is bias when recognized. But, after the 1987 repeal, unwary viewers weren’t always distinguishing between fact and opinion or recognizing bias, except in others, and media companies were cashing in. Rolling Stone journalist Matt Taibbi described modern media firms as working backward, first asking “How does our target demographic want to understand what’s just unfolded?” and then picking both the words and the facts they want to emphasize. FOX News’ belated disinformation about the 2020 election, after first reporting the results accurately, was a dramatic example, but the pattern holds across the political spectrum. Since it’s human nature to suffer from anchoring and confirmation bias and easier to confirm preconceptions, most people just choose their “truths” from a virtual buffet table of options, and some politicians exploit the “post-truth” era. If deregulation opened up television and radio, the advent of the World Wide Web obliterated any hope of an agreed-upon reality. That can be true even when opposing sides consult the same source, as is the case with the Washington Post’s Fatal Force database of fatal police shootings.
Online, the new model often doesn’t go far beyond maximizing “eyeballs” or “hits,” with no profit motive to teach reality or encourage intelligent debate because many people find that boring. There’s no money in pointing out, for instance, that majorities of Americans currently favor reasonable compromises on guns and abortion, so uncompromising voices speak for the left and right. Absolutists are “good copy,” as they say in journalism. And few people will subject themselves to the emotional trauma of hearing views they disagree with. Just as trees are worth more to the economy as lumber than as absorbers of CO2, Americans are worth more to media as hive-minded partisans. Algorithms are designed to learn your leanings and emotional triggers and reinforce what you already think. We all look at different feeds tailored to what the algorithms predict will make us happy or interest us, which keeps us online longer. Like the Yellow Journalism of a century ago, the more they feed us what we want, the more money they make on advertising. The more you show an inclination toward conspiracy theories, the more conspiracies you’ll see. Imagine if each student saw different chapters in this textbook as artificial intelligence altered them to suit their preferences in exchange for advertising revenue. That’s what’s already happening on social media, which is where many people get their news and their views of history (aka e-history). If 36 students in this course section researched the war between Israel and Hamas, not only would they get 36 different versions from different outlets, they could even get slightly different versions from the same outlet. That doesn’t mean that it’s impossible to understand events, but you need to triangulate your sources and understand both their leanings and your own, which takes effort.
Whereas the best minds two generations ago worked on weapons of mass destruction, today’s work on addicting you to social media and keeping you in a feedback loop — a free, closed loop ideal for politicians spreading propaganda among adult users. It offers little jolts of emotional reward, like a slot machine dropping you occasional quarters. The most valuable commodity in today’s economy is neither gold, crypto, nor oil, but rather our limited attention spans and the personal data companies can collect when we’re engaged and locked in. Personal data is our “coin of the realm” (currency), as they said in 18th-century England. This “attention extraction” model is arguably the biggest business ever and largely responsible for the booming stock market of the 21st century. Companies often have no clue where their ads land since they’re funneled through ad networks to wherever there’s traffic. Hopefully, ex-Facebook executive Tim Kendall overstated things when he prophesied in the 2020 docudrama The Social Dilemma that social media will inevitably lead to civil war. But it has already fueled and enabled ethnic cleansing in Myanmar, degraded civic discourse, stoked all manner of conspiracy theories of the type formerly limited to tabloids, and hastened civil unrest around the world, including insurrection against the U.S. government. When social media’s algorithms smell smoke, they rush to the fire and pour gasoline on it. Ex-Google designer Tristan Harris called social media a “race to the bottom of the brain stem.”
Facebook’s own research revealed how it could adjust levels of political violence and discord in countries like Hungary, Poland, Spain, and Taiwan almost as if turning a thermostat (countries vary in how much censorship they allow). In 2021, whistleblower Frances Haugen testified before Congress that Facebook altered its formula in ways that sowed divisiveness and that its own research revealed that Instagram contributed to anxiety and self-harm, especially among teenage girls. Social media has also worsened adolescent bullying, since the victim can’t escape at the end of the school day. The Chinese company TikTok limits exposure for Chinese teenagers to 40 minutes per day, whereas American teenagers are now pushing ~100 minutes per day. The resulting sleep deprivation alone contributes to psychological problems, regardless of content. Many of the industry’s inventors are understandably feeling guilty, as they initially set out just to get rich while we pushed “like” for puppy videos and pictures of our cousins’ Grand Canyon trip. Chris Wetherell lamented after his invention of the Retweet button that “we might have just handed a four-year-old a loaded weapon.” In an optional article below, Jonathan Haidt explains why a better metaphor might’ve been darts and how small subsets on the left and right used these darts to dominate public discourse in a way that’s silencing the center and weakening the fabric of American society.
Much of our political discourse is tweeting or venting on blogs with like-minded people or having profanity-laced exchanges with faceless adversaries in comment boxes that go nowhere. Former Facebook VP Chamath Palihapitiya said, “We have created tools that are ripping apart the social fabric of how society works.” For conservative columnist David Brooks, the Web accelerated the longstanding “paranoid style” in American politics that we touched on in Chapter 16. Now, we mostly just talk past each other, as shown in this image that, contrary to first impressions, wasn’t taken by the Hubble Telescope but rather purportedly maps 2016 election-oriented traffic in the Twitterverse:
What can we do to mitigate this dystopian development? In addition to the basic responsibilities of citizenship, like voting and paying taxes, Americans now have to filter news to find out what’s going on. Judge theories based on the quality of evidence, not the quantity (see Rear Defogger #25 on the tab above). “Do your own research” sounds like good advice (as educators, we encourage independent thinking) but, online, it’s a hop-skip-and-a-jump from D.Y.O.R. to just choosing your own reality. Going further out into the fringes of the Web doesn’t mean you’re getting closer to the truth. If you have any genuine interest in reality, then the first thing you should research is how to do research. That starts with examining evidence (primary sources) when possible and humble open-mindedness toward experts, who aren’t necessarily wrong or conspiring against you, but are usually acting imperfectly in good faith. Don’t assume that someone is wrong or dishonest just because they have expertise. Talk to, or email, a reference librarian who can direct you to the best online sources if you can’t visit a library. Guard against the “beginner’s bubble” of initial over-confidence that often accompanies D.Y.O.R. If you find the prospect of talking to a reference librarian too daunting, nerdy, or time-consuming, the key with most online sources is cross-checking against other sites. If you’re considering dabbling in reality, or are merely reality-curious, check your source’s reputation at Media Bias/Fact Check. English poet Alexander Pope, most famous for “to err is human; to forgive, divine,” also wrote that “a little learning is a dangerous thing.” Read smart people with diverse opinions.
Quality liberal and conservative media is part of the Anglo-American buffet table, too, as suggested by attorney Vanessa Otero in the chart below. The forenamed Media Bias/Fact Check is a more thorough site, referenced by sources as wide-ranging as Huffington Post on the left and Breitbart News on the right. The point here isn’t the accuracy of Otero’s 2016 diagram — with which presumably nearly anyone with a pulse would quibble (underscoring my point) — but rather that many Americans only feed off the edges of the buffet table, then share posts that others either already agree with or ignore. The y-axis is also a reminder that quality only overlaps with centrism incidentally, even though centrist media like the wire services play a crucial role. The chart is dated, as Trump’s presidency pushed some formerly centrist/near-left outlets like CNN further to the left. These Pew Research polls from 2019-20 indicate that the traditional TV networks (ABC, CBS, NBC), long since forgotten by millennials, still play a significant role. Online, there’s little constructive debate on the lower edges of the spectrum and more often cherry-picking or even bogus misinformation and disinformation, aka fake news (misinformation is false, whereas disinformation is intentionally false).
This distinction between misinformation and disinformation is important. All textbooks, for instance, including this one, contain some misinformation. It could hardly be otherwise unless every single primary and secondary source they consulted was perfect, combined with perfect interpretive skills from the author. Your STEM textbooks have a better theoretical chance at being airtight, but they’re covering moving targets as theories and technology evolve. That’s why printed textbooks keep errata lists of fixes for future editions. But this text contains zero disinformation. A bigger problem today is that too many people are conflating fake with biased, which are distinct concepts. It’s an important distinction because it’s hard to understand the world around you if you think that everything being reported in the mainstream media (MSM) is factually incorrect when it isn’t. Reporters in the MSM usually get fired when they make false reports, especially if deliberate. If you’re wondering about any current news, cross-check with fact-checking sites like the Pulitzer Prize-winning Politifact or Snopes before posting volatile material on social media. Politifact includes a page for social media hoaxes and another called Punditfact that, similar to Media Bias/Fact Check, assesses journalists instead of politicians. If possible, ask yourself whether a quote or idea is coming directly from a politician or is being filtered through a third party. Fewer than 10% of us will do anything as dramatically patriotic as serving in the military, but all of us have the chance to stay grounded in reality and man the civic fort while others indulge in fantasy. Are these fact-checking sites perfect? Undoubtedly not, but what is? Don’t let perfection be the enemy of good. Politifact and Snopes call people out across the political spectrum. Learning both sides of issues is key to bolstering your own side of the argument, for one thing. Imagine an attorney walking into the courtroom with no clue what evidence the opposition will present or what arguments they will make. Philosopher John Stuart Mill put it best when he wrote that “he who knows only his own side of the case knows little of that.” Whatever you read, listen to, or watch, tap the brakes enough for some “slow thinking” (deliberation, reasoning, etc.) since all of us are easier to manipulate in the instinctive, emotional fast-thinking mode, aka auto-pilot or System 1. The toxic business model described above depends on you staying in System 1, whereas true digital literacy requires System 2.
The term fake news was commonly used in the early 20th century by people across the media spectrum (see optional article below). As far back as ancient Greece, Plato wrote about free speech and propaganda as central challenges confronting democracy in his Republic (375 BCE). Irresponsible media spinning was out of control in free-speech Britain as 17th-century newspapers misinformed English readers with fake or exaggerated Catholic atrocities, stoking the Irish Confederate Wars. England invaded Ireland to avenge atrocities that mostly didn’t happen. Anglo-Irish novelist Jonathan Swift wrote: “Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect.” Franklin Roosevelt’s Secretary of State, Cordell Hull, updated that to: “A lie will gallop halfway round the world before the truth has time to pull its breeches on.” Participants in the Capitol Siege of January 6, 2021 used variations on this line as their legal defense, arguing that they didn’t know until too late that they were being lied to about the 2020 election being stolen. Further updating Swift and Hull’s old-school quips, commentators in The Social Dilemma cite research showing that fake news spreads ~6x faster than real news. Fake news lands on the front page; rebuttals and corrections land on page ten.
The idea, if not the exact term fake news, is as old as journalism itself. Students who’ve taken HIST 1301 might remember the epic mudslinging of the 1800 presidential election, when John Adams had his voters worrying that Thomas Jefferson was going to confiscate their Bibles, while Jefferson’s camp paid for favorable newspaper editorials and called Adams a “howling hermaphrodite.” In America’s formative years, there was no such thing as objective media. Later, in the Gilded Age, Joseph Pulitzer and William Randolph Hearst accused each other of publishing fake news, often accurately, as they competed for New York’s sensationalized Yellow Journalism market. The better part of the 20th century was, in retrospect, a unique time in journalism when readers valued objectivity. Most newspapers self-identified as Democratic or Republican with op-ed pages that leaned left or right, but the rest of the paper was neutral and factual, or at least aimed to be. Exceptions were the sensationalist tabloids near the grocery check-out. Historically, though, that ideal of objectivity has been more the exception than the rule. From the late 18th to the early 20th centuries, no one made much pretense toward objectivity. More recently, comedians used the term fake news to describe news skits on Saturday Night Live’s “Weekend Update,” the Daily Show, and the Colbert Report. Those were the innocent days. By the mid-2010s, Hearst was back in the building.
In modern America, media fragmentation is happening among an increasingly diverse population, with foreign powers able to manipulate voters, as Russia did in 2016 and as multiple countries attempt in every election. One protest and counter-protest in Houston between the “Heart of Texas” and “United Muslims of America” was manufactured from scratch by Russian trolls at the Internet Research Agency in St. Petersburg. Neither group actually existed, but protesters showed up, some armed. Whereas Soviets threatened the U.S. with nuclear weapons during the Cold War, modern Russians just push a few buttons, then sit back and watch giddily as Americans gnaw away at each other from within. There’s good news and bad. The good news is that, all else being equal, the younger the viewer, the savvier at filtering information, suggesting that digital literacy could improve. The bad news is that fake news will start looking more real as video footage gets less reliable in the deepfake era and bots spread convincing misinformation with programs like GPT-3/4. Still, there’s an unseen benefit to all this for those with the time and the fortitude to stomach it. If you can transcend your filter bubble and expose yourself to a wide spectrum of media, at least higher-grade versions of it, you can be better informed today than someone who simply watched the “straight” nightly news fifty years ago. There’s more good journalism today than ever.
Yet, it’s easy to see how unwary listeners reared in this hyperpartisan environment could believe in conspiracies like Trutherism, Birtherism, Pizzagate, Sandy Hook Denial, or Jade Helm 15, to name a few of dozens. These are conspiracies that would’ve been relegated to tabloids like the National Enquirer as recently as a generation ago. Journalist/historian Garrett Graff wrote that “media was so polluted it should be an EPA Superfund site.” America’s Overton Window of acceptable mainstream discourse widened as the market grew of people inclined to think along those lines. QAnon, the mother of all conspiracy theories (more accurately their child), contributed directly to January 6. Freedom of the press is a right that citizens should honor and be grateful for — the kind Americans have fought and died for — but we need to up our game when it comes to filtering news and thinking about a reality that might be drier than QAnon. Rex Tillerson, Trump’s Secretary of State, suggested that our republic depends on it, and January 6 vindicated his fears. Whereas infotainment just requires an audience that likes drama, a republic requires informed citizens.
Gerrymandering
Drawing congressional district boundaries also took on greater importance in American politics in the late 20th century. As with media bias, this too has a longer history, is impacted by technology, and transcends the Reagan-era conservative resurgence. Gerrymandering, named after founding father and Massachusetts governor Elbridge Gerry, whose party drew a salamander-shaped district in 1812 (right), is as old as American politics and stems from the problem of how to divide up a state’s congressional districts. Senators don’t present this problem because each state elects two statewide. However, with the House of Representatives, there is no perfect or fair way to map districts given the irregular shape of most states and the shifting population within them. Even in a square state like Wyoming, dividing the map into four equal squares wouldn’t yield equal populations, although Wyoming’s population is so low that it’s not a problem anyway: they have only one representative in the House, so the whole state is one district. Gerrymandering manipulates district borders to maximize one party’s capacity to win votes by herding opponents’ voters into as few districts as possible, or by spreading and diluting those votes across districts. This diagram, if a bit hyperbolic in using “steal” in its title, shows two simplified versions with right angles:
We could discuss gerrymandering at any point in the course but, by the late 20th century, software enhanced the efficiency of redistricting. Also, like the media deregulation mentioned in the previous section, enhanced gerrymandering has amplified partisanship. Both phenomena divide and sort us. Per the Voting Rights Act of 1965 and the Fourteenth Amendment, courts have generally struck down gerrymanders aimed at racial discrimination but sanctioned those aimed at partisan discrimination, unless it’s too extreme. Several states voted to transfer redistricting duties from legislatures to bipartisan commissions in the 2018 mid-terms, including swing states where it’s an easier sell, like Michigan, Virginia, Arizona, and Missouri, as well as Republican Utah and Democratic Colorado. Among larger states, Republican Texas and Democratic Illinois maintain partisan (legislative) gerrymandering, but Democratic California, owing to the good deeds of former governor Arnold Schwarzenegger (R), transferred redistricting to bipartisan citizen commissions. New York has a hybrid system, whereby the partisan legislature (Democratic) can’t tweak bipartisan commission-drawn maps more than 2% in their favor. In Ohio, gerrymandering bias must align roughly with the results of previous elections. So, in 2021, Republicans argued that since they’d won 13 of 16 previous statewide elections by ~8%, they should be able to arrange the districts so that they’d win 80% of the seats (13/16), rather than winning them by 8%. By that logic, had they won 16 of 16 previous elections 51-49%, they’d have had the right to rig the districts so that they’d win 100% of the seats in every election.
Ballotpedia maintains this map of which states use partisan legislatures versus independent commissions for redistricting. Allowing the party in power to draw up its own redistricting lines, as many states currently do, creates a situation where “politicians pick voters” nearly as much as voters pick politicians. GOP strategist Thomas Hofeller, who redrew North Carolina’s lines, talked openly about that strategy, saying that “redistricting is like an election in reverse.” He advised other Republican strategists to cover their tracks by avoiding email but, when he died in 2018, his liberal daughter found evidence on his hard drive that, for him, the point of gerrymandering in North Carolina was to suppress minority voting (NYT). But, in 2023, conservative SCOTUS justices John Roberts and Brett Kavanaugh joined the liberals in Allen v. Milligan, striking down the Alabama GOP’s new map as a racial gerrymandering violation of the 1965 Voting Rights Act and, in Moore v. Harper (2023), a 6-3 majority weakened the power of state legislatures to override state courts in gerrymandering and other electoral laws. In Democrat-controlled Illinois, the 4th Congressional District as of 2017 barely met the requirement that districts must be contiguous (below).
There, Republicans will likely control ~23% of House seats despite Trump winning 40% of the state’s vote in 2020. Wisconsin Republicans, meanwhile, drew non-contiguous districts, but their state’s supreme court shot down their 2023 map (ProPublica). Democrats in New York drew some highly imaginative districts despite their 2% rule, but a court ruled that the state redraw fairer lines, which helped Republicans upstate in 2022. Democrats helped themselves with redistricting in New Mexico and Nevada in 2022. Texas is a notorious example of partisan gerrymandering, awarded an F overall by the Princeton Gerrymandering Project for its 2021 maps. Democrats once drew up the lines to favor themselves and, since the Reagan Revolution of the 1980s, Republicans have done likewise. Texas Attorney General Ken Paxton (R) argued that, in compliance with Gill v. Whitford (2018), gerrymandering in the Lone Star State was merely intended to discriminate against Democrats rather than minorities. But there is overlap, since minorities have tended to vote Democratic since 1964. Austin, for instance, is the “bluest” (most liberal) city in Texas and one of the most liberal in the country — a “blueberry in a bowl of tomato soup,” as comedian Jon Stewart called it (writer Lawrence Wright says conservative Texans see Austin as a “spore of the California fungus that is destroying America”). Under former Congressman Tom DeLay (R), the GOP created the 25th District, aka the “Fajita Strip,” to condense Democratic voters into one district — writing that district off but limiting Democrats’ overall impact by combining Austin with the Rio Grande Valley.
When courts shot down the Fajita Strip, the GOP created a map whereby five of Austin’s six congressional representatives were Republican. The two basic ways to gerrymander are “packing” and “cracking,” and the GOP has used both on Austin. To pack is to herd like-minded constituents into one district, such as the Fajita Strip, to minimize their impact. To crack is to divide a city like Austin into the tips of multiple wedges that fan out into conservative districts large enough that the end result is a liberal city represented mostly by conservative members of Congress (see Districts #10 & #17 below). It’s illegal to crack minority voters per the Voting Rights Act (Section 2) and illegal to pack minority voters per the Fourteenth Amendment. After the 2020 census, the GOP reversed course with Austin and packed rather than cracked it in their new maps.
When packed rather than cracked, gerrymanders result in clear red and blue districts whose voters don’t elect centrist candidates. In the House of Representatives, that produces a collection of staunch partisans elected for the very purpose of doing battle with the opposing party rather than compromising, even on issues like immigration, guns, and police reform, where polls show big overlap between left and right. Many voters/constituents see compromise as a sign of weakness and force their candidates to pledge that they won’t, just as many Barack Obama supporters — and perhaps, subconsciously, even his opponents — saw the former president’s willingness to compromise as a weakness. If voters themselves aren’t more divided than they were in the past, they’re at least better sorted. They choose their own sets of experts and facts from the media buffet table, vote for candidates from districts deliberately made as partisan as possible and, according to some studies (but not all), have increasingly come to self-segregate by moving to “red and blue” areas to be around like-minded people. Consequently, congressional districts with “swing voters” have shrunk steadily over the last thirty-five years. In gerrymandered non-swing districts, one’s biggest threat is often someone more stridently conservative or liberal within his or her own party, not an opponent from the opposing party. A stark example is Trump supporters purging congressional Republicans who don’t support his claim to have won the 2020 election or who support bipartisan legislation. In a swing district, that tactic would only work at the primary level but might backfire in the general election. The lack of swing districts creates parties less open to compromise and debate “across the aisle,” as it’s often put. As chair of the Democratic National Committee (2017-21), Tom Perez, for instance, ruled that the right to choose (an abortion) isn’t open to negotiation among Democrats; pro-life Democrats, in other words, aren’t welcome to debate the issue at platform meetings or compromise with Republicans if elected. Any Republican who compromises with Democrats on bipartisan legislation is instantly labeled a cuckservative or RINO (Republican In Name Only) on social media and risks being outflanked in the next cycle’s primary election (within the party).

By weakening the role of partisan legislatures, Moore v. Harper (2023) is a promising first step in protecting democracy, defined here subjectively as the right of voters to choose politicians rather than vice-versa. Another obvious step that most Americans would likely favor, since it would apply to everyone equally, would be an outright nationwide ban on all partisan district mapping, though that would be opposed on states’ rights grounds. We already know from purple states and California that assigning redistricting to neutral parties works. That ban was in the For the People Act that passed the House in 2021 but died in the Senate, with Democrats unable to overcome a Republican filibuster. Prison gerrymandering also favors the GOP because minorities are often relocated from cities to rural prisons, where they count as part of that population but can’t vote, though that practice, too, is blocked in some states and not others (PPI).
The best scenarios are either for someone to introduce a stand-alone bill that puts opponents on the spot to defend states’ rights on behalf of something unpopular and undemocratic, or for Democrats to abuse their privilege so egregiously in Illinois that enough Republican voters agree to the nationwide ban that Democrats already support. In the meantime, gerrymandering is a classic example of a collective action problem, in which it’s in certain actors’ best interests to undermine what’s best for the group (climate change initiatives are another of many examples). A national ban wouldn’t be a panacea that fixes our democracy, not by a long shot, but it would provide an instant boost of political sanity and stability. Gone would be progressives campaigning to decriminalize shoplifting and cancel Abe Lincoln, along with Republicans in campaign ads blasting a thinly-disguised symbol of their opponent with a semi-automatic rifle (likewise, one Democrat in Washington state shot a papier-mâché elephant). Until we ban gerrymandering, it remains a constitutional glitch that magnifies partisanship.
The Democratic National Redistricting Committee (DNRC), founded in 2017, is using the courts to push back against the most brazen GOP gerrymandering, while also encouraging bipartisan districting in blue states of the sort “the Governator” Schwarzenegger supported in California. But SCOTUS sanctioned the South Carolina GOP’s gerrymandering in Alexander v. South Carolina NAACP (2024), arguing that it was partisan, not racial, despite its overlapping racial implications. Conservative justices raised the bar of evidence and argued that, going forward, plaintiffs must assume the good faith of legislatures drawing the maps. Justice Clarence Thomas (left) added in a solo concurrence that the Supreme Court should vacate any rulings even on racial gerrymandering because, in his view, all redistricting is outside its jurisdiction. Such a stance would reverse the precedent of the Court’s 5-4 ruling against Alabama in Allen v. Milligan (2023), which struck down an outright racial gerrymander as violating the 1965 Voting Rights Act.
1988 Election
Let’s return to the 1980s to explore more issues that impact us now. The post-Reagan era kicked off with what seemed then like a depressing campaign — one that served mainly to underscore the superficiality of media coverage and the tendency in democracies for campaigners to manipulate voters by appealing to their worst instincts. That wasn’t because either of the candidates was bad. The 1988 race pitted Reagan’s VP, George H.W. Bush, against Democrat Michael Dukakis of Massachusetts. Since New England is generally more liberal than the rest of the country, the Democrats balanced the ticket with Lloyd Bentsen of Texas, hoping to recapture the Austin-Boston magic of the 1960 Kennedy-Johnson ticket. Dukakis led midway through the summer, but Bush’s media consultant Roger Ailes (Chapter 16, future head of FOX News) and campaign manager, the forenamed Lee Atwater, came up with an ad attacking Dukakis’ weakest point besides his unfortunate photo-op in a tank.
As governor of Massachusetts, Dukakis oversaw a prison furlough program that allowed prisoners out on temporary weekend releases. One of the convicts, Willie Horton, broke into a home and raped a woman while on furlough. Horton was black, and a group called Americans for Bush flooded the airwaves with his mug shot (left), asking viewers if they wanted someone soft on crime. Atwater said the ad would “strip the bark off the little bastard” [Dukakis] and “make Horton Dukakis’ running mate” (for VP). Dukakis made the soft-on-crime spin worse by answering no to a tough debate question from CNN’s Bernard Shaw over whether he’d favor the death penalty for someone who raped and murdered his wife. While his answer was clear and he backed it up by arguing that studies showed the death penalty was not a deterrent, viewers were put off that the question hadn’t stirred deeper emotions in him.
Bush campaigners also made up stories that Dukakis burned American flags to protest the Vietnam War and that his wife was mentally ill, though Bush distanced himself from the false smears. When Atwater was diagnosed with brain cancer a couple of years later, he converted to Catholicism and issued an apology to Dukakis for the “naked cruelty” of the 1988 campaign. At the time, though, it was enough to pull his client ahead in the race, and Bush won the election with no real help from his old boss Reagan, from whom he’d distanced himself. The Horton ads, along with the Rodney King arrest and riots in 1991-92 and the O.J. Simpson murder trial of 1994-95 that we read about in the Civil Rights chapter, served as unpleasant reminders of America’s ongoing racial conflicts and undertones. In political campaigns, though, even thinly veiled racism tapered off for the most part between the 1990s and 2016, at least among the politicians themselves.
Crime & Punishment
Since such shenanigans are typical of political campaigns, none of this would normally be worth mentioning. However, like the 1964 campaign during the escalation of the Vietnam War, the 1988 campaign had a lasting impact that transcended merely deciding the next president. Fellow Democrats learned from Dukakis’ Willie Horton fiasco and vowed to be tougher on crime, building on Reagan’s efforts from the 1980s. Even before Horton, Democrats had “crossed the aisle” and joined Republicans in passing the Sentencing Reform Act of 1984, which abolished federal parole, and, contrary to the more recent defund-the-police noise from the far left, Democrats had long supported increased funding for staffing more police. The “New Jim Crow” of mass incarceration that ensued wasn’t initially hatched as a conspiracy, as sometimes spun, but rather resulted from both political parties and middle-class minorities trying to deal with the crack cocaine epidemic and resulting gang warfare that spiked homicide and robbery rates. There are still people who grew up in urban America in the 1980s and ’90s suffering from PTSD. But, in the late 20th century, American prisons increasingly filled with small-time offenders through mandatory minimum sentencing, whereby judges didn’t have the discretion to lower sentences, with the proportion of African American prisoners growing substantially.
Another important deregulation of the Reagan era was opening up prisons, prison services, and the probation/parole business to the private sector. In a potential conflict of interest, politicians meanwhile took lobbying money from the for-profit prison construction business, creating a triangle similar to the military-industrial complex Eisenhower warned about in Chapter 14. The 1994 Crime Bill, meanwhile, eliminated funding for college education within prisons, the thinking being that taxpayers shouldn’t fund free tuition when they were struggling to pay their own. That was short-sighted because studies showed that taxpayers had saved ~$5 for every $1 spent on prisoner tuition due to AA and BA degrees reducing recidivism rates. Education is a key component in reforming criminals.
However, another rare bipartisan consensus emerged by the late 2010s to reduce prison populations, which are higher in the U.S. than anywhere else in the world and are expensive (ca. 2019, prisoners cost ~$65k/yr each). There are many factors contributing to crowded prisons. While incarceration for drug possession is commonly blamed, only ~20% of today’s inmates are in for drug-related offenses (PPI). Federal judges increasingly offer Drug Court treatment options for non-violent offenders. But many people who would’ve been in mental hospitals forty years ago are homeless or in prison. Moreover, petty offenders are taking up too much space, forcing judges to reduce sentences and grant earlier paroles for violent criminals. Supporters of tougher sentencing point to charts like the one on the right and see direct causation: crime has gone down precisely because more criminals are behind bars. In that scenario, the cost of criminality has been shifted (and spread more evenly) from individual victims to taxpayers at large, and the streets are safer. Another theory for reduced crime rates is the lead-crime hypothesis, which posits that lead poisoning can increase violent behavior and that crime rates in America rose with the increased use of lead in gas, paint, and pipes and fell with lead’s abatement. The charts below don’t prove that causation, but the correlation is suggestive:
There are other factors impacting serious crime, including better surveillance (especially in the post-9/11 era), GPS-enhanced phone records, thorough cross-checking of databases (small-time criminals have already entered their fingerprints), and controversial broken windows strategies that aim to reduce serious crime by curbing petty crime and cleaning up garbage. Studies and experience have shown that improving physical surroundings (garbage, graffiti, broken windows, etc.) and strictly enforcing minor crimes lowers the rate of serious crimes like murder, rape, and armed robbery. On the other hand, it’s a short step from curbing minor crimes to racial profiling, and minor criminals take up valuable jail space if sentenced. In short: the flip-side of “broken windows” strategies is “stop-and-frisk.” Undoubtedly, numerous factors act simultaneously on something as complex as crime rates. Despite the big crime drop-off between 1990 and 2010, the U.S. was still the most violent developed nation, especially in the South.
Either way, increased prison populations had libertarians on the right and civil rights progressives on the left calling for penal reform, and the idea of using prisons just for serious criminals became popular among Democrats and Republicans alike, each looking for issues that resonate with working-class voters. In a rare modern case of bipartisan legislation, Congress and Trump passed the First Step Act in 2018, relaxing, among other things, mandatory minimum sentencing for non-violent offenders. In 2021, Joe Biden signed an executive order ending private prison contracts, but that order doesn’t currently apply to border detention centers. Incarceration rates are currently ~15% lower than they were in 2019 (USAFacts).
Violent crimes began to rise again in 2020-21. It wasn’t due to the infamous defund-the-police slogan, because that never really happened beyond moderate restructuring in a few blue cities. It was likely due to a combination of factors, including post-COVID stress, increased gun sales (now over a million per week), and the chilling effect of George Floyd’s murder and similar incidents, after which police tend to be less proactive to stay out of trouble and citizens less cooperative in helping police apprehend criminals. My reader can no doubt guess which factors the Left and Right will emphasize and/or ignore within their respective echo chambers. A key talking point for Trump in the 2024 election will be that violent crime rates have soared under Biden, even though they’ve been in slight decline since COVID (USAFacts).
Conclusion: the Reagan Revolution
We’ll discuss Reagan’s foreign policy much more in Chapter 22, having focused here on domestic policy. The Reagan Revolution shaped Republican Party politics from 1980 to 2016 and pushed the Democrats to the right economically. In some respects, we live in the shadow of the Reagan Revolution today. Just as conservatives won office and impacted policy during the New Deal order of 1933-1980, liberals haven’t been silent since 1980, and three Democrats have won the presidency: Bill Clinton, Barack Obama, and Joe Biden. But Republicans Eisenhower and Nixon were okay with the New Deal and even expanded the federal government with agencies like Health & Human Services (1953-), the Environmental Protection Agency (1970-), and the Drug Enforcement Administration (1973-). Likewise, Democrats Clinton, Obama, and Biden were no doubt liberal in some respects but pandered to Wall Street and never seriously suggested re-closing the gap between rich and poor to mid-20th-century levels by raising top tax rates significantly for individuals or corporations.
Language is one of the best indicators of overall trends. No aspiring Democratic politician called himself or herself a liberal or progressive between 1980 and 2012, while Republicans tripped over each other claiming to be the most conservative. In the 2000 primary, fellow Democrats accused New Jersey Senator Bill Bradley of being a liberal, and he dropped out of the race shortly thereafter. That’s a far cry from progressive Democratic candidates accusing Joe Biden of being too conservative in the 2020 primaries. Republicans had the overall momentum and upper hand from 1980 to at least 2008, with Democrats on the defensive.
As for the actual size and role of government, it hasn’t budged far in either direction since Reagan took office in 1980. In 2002, Congress consolidated some existing agencies under the Department of Homeland Security, and it added some smaller banking and consumer protection agencies after the financial meltdown of 2008. Beyond that, the biggest growth in government at the national level has been the post-9/11 increase of the National Security Agency’s (1952-) domestic eavesdropping and Obamacare’s mandates that insurers expand coverage and citizens be insured (the GOP dropped the individual mandate in 2017). There have been no dramatic increases or reductions in entitlement programs like Social Security beyond prescription drug coverage for Medicare (Bush 43), and tax rates stabilized, with the top bracket oscillating between 35-39.6%. Up until the 2008 economic crisis, the prevailing assumption was that regulation is bad and deregulation is good. There were even people who blamed the 2008 financial meltdown on too much regulation, even though the derivatives markets that imploded had virtually no oversight. The U.S. now ranks last among developed nations in upward mobility.
Because its jobs pay better, high finance lures more of the country’s top graduates than law, medicine, science, or industry. Rather than just fueling the economy by lending to other businesses and encouraging constructive investments, finance itself is the biggest business in America, and most investing is purely speculative, high-frequency, and short-term. These changes aren’t just the result of presidential administrations, but rather of overall structural changes in the economy, including trends toward globalization, outsourcing, etc. But the corporate-friendly politics of Reagan and his successors in both parties contributed to an era more financially conservative than postwar America — back in the direction of the late 19th century, but not as extreme. Has this historic shift run its course? With that very question at stake, pundits across cable, radio, the blogosphere, and chat rooms fight to shape our minds and future with a brew of information, misinformation, and disinformation.
Optional Listening & Reading:
Media Bias / Fact Check
Jonathan Haidt, “Why the Last Ten Years of American Life Have Been Uniquely Stupid,” Atlantic, 4.11.22
“Who Killed Vincent Chin?” PBS-POV, 1989
“Fake News, Lies & Propaganda: How To Sort Fact From Fiction,” University of Michigan Media Library Research Guide
Supplement: The New Yellow Journalism
Sean Illing, “Flood The Zone With Shit: How Misinformation Overwhelmed Our Democracy,” Vox, 2.6.20
Stephanie Bastek, “Of Panic & Paranoia,” American Scholar Smarty Pants Podcasts (7.23) 25 Min.
Guy Schleffer/Benjamin Miller, “The Political Effects of Social Media On Different Regime Types,” UT Texas National Security Review, Summer 2021
Backstory, “Stuck: A History of Gridlock,” Virginia Foundation for the Humanities
H.W. Brands, “What Reagan Learned From FDR,” History News Network, 5.15
Frank Hyman, “The Confederacy Was a Con Job On Whites, And Still Is,” McClatchy, 3.6.17
Robert Barnes, “Efforts To Limit Partisan Gerrymandering Falter At the Supreme Court,” Washington Post, 6.18.2018
McKay Coppins, “The Man Who Broke Politics,” Atlantic, 11.18
Peter Wehner, “The Party of Reagan is No More,” TIME, 3.10.16
Henry Olsen, “How the Right Gets Reagan Wrong,” Politico, 6.26.2017
Brian Rosenwald, “How Rush Limbaugh Broke the Old Media — And Built A New One,” The Week, 2.X.21
Bill Chappell, “Differing Narratives of Standoff Between Native American, High School Student” NPR, 1.21.19
Matthew Jordan, “A Century Ago, Progressives Were the Ones Shouting Fake News,” Conversation, 2.1.18
Jan Werner Müller, “The Myth of Social Media & Populism,” Foreign Policy, 1.24
John Lawrence, “How the Watergate Babies Broke American Politics,” Politico, 6.5.18
Maggie Astor & K.K. Rebecca Lai, “What’s Stronger Than A Blue Wave? Gerrymandered Districts,” New York Times, 11.29.18
Kaitlyn Tiffany, “So Maybe Facebook Didn’t Ruin Politics,” Atlantic, 7.27.23
Paul Lewis, “Our Minds Can Be Hijacked: The Tech Insiders Who Fear A Smartphone Dystopia,” Guardian, 10.6.17
Ford Fessenden & John Broder, “Examining the Vote [2000 Election],” New York Times, 11.12.2001
Amanda Robb, “Pizzagate: Anatomy of a Fake News Scandal,” Rolling Stone, 11.16.17
John Boehner, “Panic Rooms, Birth Certificates, and the Birth of GOP Paranoia,” Politico, 4.2.21
Adrien Chen, “The Fake News Fallacy: Old Fights About Radio Have Lessons For New Fights About the Internet,” New Yorker, 8.28.17
Anna Shectman, “Life in the Algorithm,” Yale Review, 12.11.23
Rogé Karma, “The Baffling Academic Debate Over Income Inequality,” Atlantic, 2.27.24
More Optional Reading, 1989-2000:
Bush 41, Clinton Democrats, Politics Gets Personal, 1994 Midterms & 1996 Election (Incl. Gingrich’s Contract With America), Monica Lewinsky Scandal, 2000 Election, More On Media Fragmentation
Bush 41
George H.W. Bush (hereafter referred to as Bush 41, since he was the 41st president) was popular midway through his presidency because of his handling of the Gulf War — so popular that most front-runners in the Democratic Party chose not to run in 1992. However, having inherited a large deficit from his predecessor Reagan, Bush broke a famous “read my lips” campaign promise and raised taxes to balance the budget. Cynical Democrats goaded him into it as an act of fiscal responsibility, then used it against him. Bush also alienated Republicans by beefing up the Clean Air Act (1963-), expanding federal research on pollution and stemming acid rain and ozone depletion. He likewise signed congressional legislation outlawing an important type of discrimination with the 1990 Americans With Disabilities Act. Mostly ignored by conservatives and liberals alike was Bush’s help in fighting HIV-AIDS. Like his son George W. Bush, who later authorized relief to African countries when he was president (2001-09), Bush the elder aided the American cities most afflicted by the disease through the 1990 Ryan White CARE Act. According to the younger Bush’s speechwriter, Michael Gerson, Bush 43’s Emergency Plan for AIDS Relief (PEPFAR) resulted from an alliance of liberal global-health advocates and evangelicals who held sway with key congressional Republicans. Bush 43 explained and defended the program by referencing Luke 12:48: “To whom much is given, much is required.”
The country dipped into a mild recession in 1991, and Bush seemed to some detached in his response, possibly because he was smart enough to realize that mild recessions happen all the time and aren’t the president’s fault. Americans, overall, overestimate what presidents can do to revive or harm the economy. It’s not run or handled by any one person and, if it were, that person would be the Chair of the Federal Reserve. Bush’s problem was more of a public relations crisis than anything else. He tried to connect to regular people by going to a grocery store but, when he seemed never to have seen a checkout scanner, it just added to his image as a detached blue blood. Bush stumbled into the 1992 campaign vulnerable, facing off against a little-known, young Democratic governor from Arkansas, Bill Clinton, and a colorful computer-mogul independent from Texas, Ross Perot (right), who supported protectionism (tariffs) and opposed free trade.
Clinton Democrats
Bill Clinton was prominent in the Democratic Leadership Council, a group of “third way” policy wonks who argued that the Democrats should stay competitive by moving to the right (toward the center) and acquiescing in the Reagan Revolution. Other than Jimmy Carter, the Democrats had been getting pummeled since 1968. At its best, the DLC represented a refreshing and realistic change beyond the usual partisan tire ruts, helping to rid the Democrats of their unconditional support for the welfare state and their anti-business reputation. At its worst, it just meant that Democrats would sell out to corporations and high finance, giving Wall Street virtual control over both parties. Liberals organized an anti-DLC conference in Washington under the motto Because One Republican Party Is Enough. While they advocated slightly higher taxes for the rich than Republicans did, the new Democrats appealed as much to the leisured classes as to the inner-city poor. Charles Schumer (D) of New York defended tax loopholes for hedge fund managers, claiming that, like all Senators, he needed to look out for the constituents of his state — in this case, apparently not the 99% of New Yorkers who aren’t billionaires.
A key difference for the new Clinton-led Democrats was their support of free trade. Unions had weakened considerably by the late 20th century and Democrats needed other sources of money, so they turned increasingly to corporate contributions. For leftists, the answer would’ve been to strengthen unions by resisting free trade; for centrist Democrats, the answer was to get realistic and start winning elections, then do their best to barricade against the more extreme aspects of the Reagan Revolution. Clinton Democrats also tacked toward the center culturally. Clinton pledged to stop illegal immigration and, speaking at civil rights leader Jesse Jackson’s Rainbow Coalition conference, cleverly distanced himself from more radical elements in his own party by condemning rapper Sister Souljah’s purported call for Blacks to kill Whites during the 1992 L.A. Riots. This technique, not invented in 1992, is now known as a “Sister Souljah Moment” and has since been employed by George W. Bush, John McCain, and Barack Obama.
For leftists leery of big cuts to welfare and privatization of student loans, Clinton seemed like “Republican Lite.” Right-wingers, meanwhile, aimed to spin the Arkansas governor as a communist anyway to prevent the Democrats from making inroads among the independent, centrist voters still known as “Reagan Democrats.” That, too, is a time-honored political tactic. Later, Barack Obama would be stuck in the same political purgatory, spun alternately as a commie or a corporate shill depending on who was doing the spinning. Clinton defined himself as an “Eisenhower Republican,” dedicated to balanced budgets, strong markets, and free trade. Indeed, Clinton’s long-term historical role was ratifying Reagan’s conservative revolution as a Democrat in the same way that Eisenhower ratified FDR’s New Deal as a Republican. Clinton saw globalization (free trade) as good for the American economy and a key to defusing world conflict, yet the Democrats’ support of globalization opened a window of opportunity for Ross Perot (I), whose protectionist-fueled populism foreshadowed Donald Trump’s nomination among Republicans 24 years later. The Democrats were quietly withdrawing their unconditional support for unionized labor and hoping that workers wouldn’t notice.
Politics Get Personal
The 1992 campaign also kicked off an era in which the public and media became consumed with candidates’ personal biographies more than ever before. Previous politicians were from the WWII era and came of age when journalists, for the most part, didn’t pry into politicians’ sex lives or religions. In JFK’s case, his Catholicism was controversial, but not his rampant adultery. By 1992, though, Baby Boomers like Clinton were running for office and people wanted to know more about their backgrounds. Clinton was an inveterate philanderer and women came forward claiming to have had affairs with him. Some were no doubt gold diggers, but there was too much smoke for no fire. Hillary conceded that he was “a hard dog to keep on the porch.” And what was Clinton doing in the 1960s? Was he in Vietnam? Was he a protester? Clinton was at Oxford on a Rhodes Scholarship and had smoked pot, but “not inhaled.” He avoided the Vietnam draft by pledging to join an ROTC program back home, then backed out of the pledge after drawing a high lottery number and stayed another year overseas. Republicans jumped on Clinton for being a draft dodger, but the charge came back to haunt them eight years later when George W. Bush (Bush 43) ran for president and people discovered he’d sat out the war in the Texas Air National Guard — unlike today, the Guard didn’t fight overseas in the 1960s. Donald Trump got college deferments and a medical disqualification (1-Y, later 4-F) for bone spurs in his heels. The GOP had a photo of Clinton from the early ’70s with a beard — enough to spin him as anti-establishment.
For his part, Clinton embraced more universally popular visions of the 1960s, including those of John Kennedy and the Civil Rights movement. The argument over “which sixties” cut both ways, though, because many voters had grown up in that era and not all of them held it against Clinton that he’d worn a beard, smoked pot, or hadn’t fought in Vietnam. In the three-way race, Clinton won the Electoral College and a plurality of the popular vote (43%), with independent Ross Perot garnering nearly 20% despite having dropped out of the race in July and re-entering only in October. While many commentators assume that Perot’s presence hurt Bush, there’s no solid evidence that he swung the election toward Clinton. Perot drew voters from both parties, though he attacked Bush more relentlessly than Clinton during the campaign.
A crusade against Clinton commenced with his election, more passionate and well-funded than those normally launched by political opponents. The same would occur with his successors, but never really had on this scale with his predecessors since the 1790s. Such venomous and one-sided dialogue became a hallmark of American politics from the 1990s forward, thriving in the deregulated era of cable TV and Internet described above. By and large, conservatives have used radio more effectively while liberals dominate comedy. Clinton’s biggest nemeses were Limbaugh and (during his second term) the Drudge Report, which blended fiction and reality. Limbaugh violated an unwritten code between journalists and politicians (i.e., kids are off-limits) by calling the Clintons’ daughter Chelsea ugly, the “dog of the White House.” Clinton’s team was young and inexperienced, and Republicans were able to “throw him off message” once he was in the White House by distracting him from the issues that had made him popular among voters, such as his centrist business policies. They forced the issue of gays in the military since they knew it was a no-win issue with the public. Clinton endorsed equal rights for gays in the military during his campaign but hedged in office, settling on the muddled don’t ask, don’t tell policy that pleased neither side.
After stumbling out of the gate on that, Clinton made another strategic error: granting too much responsibility to his wife Hillary in crafting healthcare insurance legislation. Hillary was plenty capable, but some opponents were leery of her as an educated career woman, symbolized by her pantsuits. The couple had even talked of a “two-for-one deal” on the campaign trail, and Hillary was the first First Lady to have an office in the West Wing alongside other senior staff. Other than Eleanor Roosevelt and Edith Wilson, who effectively managed access to the president after Woodrow’s 1919 stroke, First Ladies had stayed in the East Wing and avoided a meaningful role in policy. While their role is ambiguous, First Ladies (or future First Gentlemen) are not elected officials. It’s been understood that the president’s wife should limit herself to innocuous things like promoting literacy and physical fitness or discouraging drug use among kids, with maybe some ribbon-cutting here and there, occasionally necessitating a hard hat. At the same time, American healthcare insurance was a system sorely in need of repair.
1994 Mid-Terms & 1996 Election
Bill Clinton stumbled out of the gate for two other reasons in 1993 besides being thrown off message by right-wing media. First, like other state governors who became president in the post-Watergate era (especially Jimmy Carter), he brought with him to Washington a fairly inexperienced team. Second, his attempt to repair the nation’s troubled health insurance system, about which you’ll read more in the following chapter, mostly failed. Georgia congressman Newt Gingrich seized on Clinton’s first-term problems and spearheaded the Republican Revolution of 1994, whereby the GOP took over both houses of Congress in the mid-term elections and Gingrich became Speaker of the House. The Republicans hadn’t controlled the House since the early 1950s. The new majority’s leaders were predominantly southern, including the Georgian Gingrich and Texans Tom DeLay and Dick Armey. Presaging future threats against Barack Obama, North Carolina Senator Jesse Helms warned Clinton not to visit his state “without a bodyguard.” Just as Nixon presided during Lyndon Johnson’s Great Society era even though he was a Republican, the Democrat Clinton presided over a Republican Congress during the ongoing Reagan Revolution. Recognizing the threat posed by his centrism, Republicans trashed him as a communist. Partisan gridlock began to kick in as the growing debt inherited from previous administrations reared its head.
The U.S. had been “in the red” since 1969 and, as Senator and ex-Democrat Richard Shelby (R-AL) pointed out, President Reagan had run the country further into debt. Working across the aisle with Republicans like John Kasich of Ohio and taking advantage of the post-Cold War peace dividend (less need for military spending), Bill Clinton became the first president since 1969 to balance annual budgets (receipts = outlays), but the overall, long-term debt didn’t go away. Deficits are annual shortfalls, whereas the debt is the running total. Republicans wanted a balanced budget amendment but weren’t specific as to where they’d make cuts. Democrats laid out more specific ideas for cuts but wouldn’t chain themselves to an amendment, citing circumstances like wars or the Louisiana Purchase in which governments need to run deficits. The U.S. can’t really get its long-term budget under control without moderate tax hikes or cuts to entitlements, but a wide swath of Americans in both parties like Social Security/Medicare and, of course, don’t like taxes. Meanwhile, conservative activists like Grover Norquist pressure Republicans into signing pledges against raising taxes. In a democracy, blame for reckless budgets ultimately falls on the divided citizenry that, collectively, wants more for less in an atmosphere that discourages compromise.
Gingrich promised a Contract with America that would pass a balanced budget amendment, reform (reduce) welfare, make the day-to-day workings of Congress more efficient, roll back Social Security, and cut funding for environmental initiatives like the Superfund and Safe Drinking Water Act. He shortened Congress’s workweek to three days so that members could spend the other two “dialing for dollars” (raising money). Gingrich favored cutting benefits for the working poor and taxes for the rich, blaming increasing wealth disparity on “radical seculars.” In cutting back the size of congressional committees and limiting the power of senior committee chairs, Gingrich’s reforms seemingly made Congress leaner and more transparent. However, in a classic case of reform having unforeseen consequences, by the early 21st century it had become harder for party leaders to assert leadership in Congress, as increased transparency made everyone less likely to make deals that might alienate their constituents. Members didn’t respond to party “whips” and majority leaders so much as to lobbyists and the voters back home watching on C-SPAN. That transparency is good in a way, and more democratic, but it precludes dealmaking in the proverbial “smoke-filled back room.”
These reforms in the House of Representatives built on earlier attempts by the so-called “Watergate Baby” Democrats elected in the wake of the namesake scandal. They, too, objected to the glacial pace of the House with its seniority-dominated committees and unwritten codes against young Congressmen speaking up and initiating bills or amendments. After all, the supposed point of American democracy is to enable voters to effect change, especially in the lower house of Congress. But historian John Lawrence traces some of today’s lack of compromise to their infusion of political rights into the debate, which conservative Republicans later countered with their own rights:
Veterans of the long struggles for civil rights, for women, for children, for the environment, for people with disabilities, these new legislators articulated their agenda not merely as policy objectives but as constitutional and ethical “rights” with a profoundly moral dimension: a right to an abortion, a right to clean water and air, a right to consumer safety. While the practice occurred among liberals in the class of 1974, it increasingly appeared among conservatives as well: a right to gun ownership, a right to life for unborn fetuses, a right to lower taxes, a right to less government, a right to freedom from government regulation. Elevating policy goals to the status of rights would prove to be a crucial step in the evolution of ideological partisanship in the United States. The application of such a moral dimension to the framing of public issues served to diminish the attractiveness of compromise in pursuit of a common objective.
Rights are as American as apple pie and one thing nearly all Americans appreciate; they just appreciate different ones. In their purest, absolute forms, rights don’t lend themselves to compromise when applied to issues like abortion and guns, creating gridlock. Absolutists don’t pound their fists on the table and exclaim their qualified right to anything. Bipartisanship — the idea of the competing political parties working together “across the aisle” (of the Senate or House floors) on compromise — has generally been seen as a positive thing in American history, even though conflict and disagreement are natural and healthy parts of the system, too. But bipartisanship is frowned upon by most Democratic and Republican voters today, who want their representatives to take no prisoners in Washington, ideally taking over in one-party rule. Politicians rely on small donors who send their money mostly to far-left and far-right candidates on the understanding that they won’t compromise with the enemy. In 2014, the Arizona Republican Party censured longtime senator, 2008 presidential candidate, and Vietnam veteran John McCain for having cast too many bipartisan votes in his career; he was officially declared guilty, in other words, of having cooperated across the aisle and voted his mind. Liz Cheney, likewise, was disowned by Wyoming’s Republican Party for opposing Trump’s January 6 riot. As mentioned above, there’s a pro-life gag rule within the Democratic Party if not necessarily among its voters.
The 1980s, in retrospect, was the last decade (so far) in American history when politicians of opposing parties socialized together. Newt Gingrich’s popularity in the mid-1990s signaled a new type of confrontational politics that demonized opponents and discouraged the bipartisanship and compromise that had made Great Society and Reagan Revolution legislation possible in the 1960s and ’80s. The key was to hammer home simple messages that assume rivals aren’t acting in good faith. Gone were the days of Democratic-Republican whiskey, poker, and golf: Gingrich re-arranged schedules so that congressmen would fly home to their constituents over the weekend (ostensibly to stay in touch with citizens) rather than stay in Washington socializing with colleagues. He hoped that, instead of compromising, Republicans could win and hang on to all the branches, asserting one-party rule.
As a history Ph.D. and former professor, Gingrich understood how politicians can hammer home people’s worldviews through repetition and Manichaeism. Someone on the far other end of the political spectrum, Bolshevik Vladimir Lenin, said what translates to: “We can and must write in a language that sows among the masses hate, revulsion, and scorn toward those that disagree with us.” In 1990, Gingrich and his GOPAC action committee likewise issued a list of negative terminology: corrupt, betray, bizarre, cheat, devour, disgrace, greed, steal, sick, traitors, shallow, radical, pathetic, anti-flag — advising Republicans never to speak of Democrats without associating them with those terms, never to interpret a national tragedy or crime without tying it to Democrats, and not to live in Washington, D.C., so as to avoid falling in with Democrats socially. Occasionally, he would pen seemingly positive obituaries of deceased Democrats (e.g., Ted Kennedy), but only as a backdoor way to criticize living ones by comparison (Obama, in that case). Gingrich liked to quote Mao, who described politics as what translates to “war without blood.” Democrats, ideally, would become people that reasonable Americans couldn’t respect enough to compromise with.
GOPAC resisted all bipartisan legislation in the hope that the public would blame Democrats when Republicans sabotaged Congress by making it more dysfunctional. Henceforth, the threat of a government shutdown would loom over negotiations between Congress and the president. Gingrich correctly linked his fame to “noise” and conflict rather than legislation, since that’s what made the news. Echoing Ayn Rand, he declared to one reporter that “people like me are what stand between us and Auschwitz [Nazi concentration camp],” perpetuating a longstanding, baffling equation among the fringe right between Democrats and Nazis. Later, conservative commentators Glenn Beck and Ann Coulter warned that Obamacare would lead to another Holocaust. Likewise, Vladimir Putin unpersuasively framed his 2022 Ukrainian invasion as denazification, even though it had nothing to do with WWII and Ukraine’s president, Volodymyr Zelenskyy, is Jewish, had relatives victimized during the Holocaust, and supports Western democracy.
“Newtspeak,” a play on George Orwell’s Newspeak, mandated calling the opposition the “Democrat Party” or “Dems” because Gingrich feared that Democratic had positive connotations. He blamed congressional dysfunction and partisanship on “the party of socialism and anti-Semitism” — his label for the Democrats — rather than, more accurately, on himself. The comic strip Doonesbury called Gingrich’s memo the Magna Carta of attack politics. Gingrich’s strategy dovetailed with the proliferation of cable TV as Roger Ailes aligned FOX News with the GOP (Ailes also mentored MSNBC’s Rachel Maddow and even once offered to pay her a full-time salary to not work). Such partisanship, regardless of which side of the aisle it originates on, is usually accompanied by disingenuous complaints that it’s really the opposition that doesn’t want to cooperate, while criticism is deflected with whataboutism, as in what about when your side did something similar or worse? The Latin name for this argumentative fallacy is tu quoque, “you also.” George Washington warned in his 1796 Farewell Address about how partisanship makes the country more vulnerable to foreign interference:
It [partisanship] serves always to distract the public councils and enfeeble the public administration. It agitates the community with ill-founded jealousies and false alarms, kindles the animosity of one part against another, foments occasionally riot and insurrection. It opens the door to foreign influence and corruption, which finds a facilitated access to the government itself through the channels of party passions. Thus the policy and the will of one country are subjected to the policy and will of another.
Gingrich’s embrace of partisanship, combined with media fragmentation and the gerrymandering-driven decline of swing districts, proved remarkably effective at getting citizens to subconsciously place a higher premium on cultural and political tribalism than on patriotism or policy. The merit of any given policy is often reduced to whether it has a (D) or (R) attached to it. The overriding advantage of negativity and fear-mongering is that it raises more money than promises to craft legislation through compromise. Some partisanship is normal, and it runs deeper in American history than most people realize — the 1790s Newspaper War and the elections of 1800 and 1824 come to mind, for instance — but politics no doubt grew increasingly toxic starting in the 1990s, or at least toxic politics grew more mainstream as sensationalist media grew more profitable, feeding what commentators called the “outrage-industrial complex.”
Gingrich, by then out of office, led the infamous meeting at the Caucus Room steakhouse on Obama’s inauguration night in January 2009, at which the GOP resolved to block Obama’s attempts at bipartisanship (Frank Luntz organized the dinner). Obama had campaigned on unity and naively thought that he could work across the aisle to craft legislation. Instead, the GOP rebuffed him and cynically blamed him for being divisive, leading Obama to give up such efforts midway through his second term and circumvent Congress with executive orders. Tea Party suspicions that Republican congressmen John Boehner and Eric Cantor were negotiating a budget deal with President Obama ruined their careers as well. Partisanship is no longer a means to an end; hyper-partisanship is the end goal because it’s the most profitable stance in media and in campaign fundraising. During the January 6 uprising, some Republicans raised funds from supporters of the rioters even as they hid from those same rioters.
Back to the 1990s. What Gingrich’s Contract didn’t promise was to crack down on corporate lobbying, as many people riding the “reform” wave into Washington were there to cash in themselves. The new House Majority Whip after the 1994 election was Tom DeLay (R) of suburban Houston, who went into office promising to reform Washington and left as one of the most corrupt politicians of the modern era. Gingrich over-reached with his Contract, not taking into account that only 38% of Americans had voted in the 1994 midterm elections. Mid-terms are often misleading in that way. Clinton cherry-picked the popular portions of the Contract (welfare reform and the balanced budget — not as an amendment, but at least as a reality for a few years) and held firm against the rest.
Clinton faced Gingrich down and rode the momentum to victory in the 1996 election. He had good economic tailwinds at his back, including improving information technology, the aforementioned post-Cold War “peace dividend” of reduced military spending, heavy worker immigration, and Baby Boomers passing through their peak years of productivity. And Clinton played to the centrist popularity that had helped get him elected in 1992 by increasing police funding and reforming the worst abuses of the welfare system. Welfare recipients now faced benefit limits, had to look harder for a job, and couldn’t have more kids while on welfare. In 1996, Clinton defeated Republican Bob Dole, a WWII vet (injured in Italy) who ran a clean campaign. At the GOP’s summer convention, Dole played on his seniority and credibility, promising a “bridge to the past.” The Democrats held their convention two weeks later and promised a “bridge to the 21st century.” When it comes to conventions, it sometimes helps to go second.
Monica Lewinsky Scandal
Republicans stayed riveted on Bill Clinton’s sex life and purported criminal dealings throughout his two-term presidency. An independent counsel, an office created by the post-Watergate Ethics in Government Act (1978) after Nixon corrupted the regular Justice Department, investigated Clinton; such counsels were appointed by a special three-judge panel rather than by the administration under investigation. Attorney General Janet Reno had originally appointed Robert Fiske as lead investigator but, once the independent counsel law was reauthorized, the judicial panel replaced him with Ken Starr, partly because Fiske, though a Republican, had been hand-picked by Clinton’s own attorney general. The investigations focused on Whitewater, a 1978 real estate investment the Clintons had made in the Arkansas Ozarks with a shady developer, then later the associated suicide of their friend Vince Foster and, finally, Clinton’s unsettled sexual harassment case with Paula Jones. Anxious to put the issues behind him, Clinton himself had authorized the creation of the independent counsel investigation. Starr suspected all along that the original Whitewater charges against Clinton were bogus. Later, when defending Trump in his 2020 Senate impeachment trial, Starr lamented that impeachments had become partisan. However, outside sources like Richard Mellon Scaife, hoping to get Clinton impeached before his term expired, bankrolled anti-Clinton digging anyway through the Arkansas Project. Having run up against a dead end on Whitewater and Foster, the counsel’s hope was that, in investigating Clinton’s extramarital affairs (e.g., Paula Jones), they’d find he had revealed, or would reveal, something they didn’t yet know about Whitewater during “pillow talk” after sex. If a waste of taxpayer money, it at least spared Americans the boredom and effort of confronting al-Qaeda, climate change, and Wall Street instability.
For his part, Clinton couldn’t keep his drawers zipped up, even though he knew his opponents were pining to catch him in an affair. He struck up a relationship with a twenty-two-year-old intern named Monica Lewinsky. She told a co-worker about it, who told Starr’s counsel. They subpoenaed Clinton and videotaped his testimony so that he could remain in the White House while the grand jury viewed him, then slipped the tape to Fox News “by accident.” Under oath in a separate deposition involving Paula Jones, Clinton had claimed he “did not have sexual relations” with Lewinsky, but a semen stain on one of her dresses, obtained under warrant, suggested otherwise. Thus, Clinton was guilty of perjury, depending on how one defined sex exactly. Did a semen stain prove he’d had sex? Clinton admitted to an “improper physical relationship.” He claimed he hadn’t lied when he testified that he wasn’t in a relationship with Lewinsky because he wasn’t at the moment he was asked, famously saying “it depends upon what the meaning of the word is is.” Clinton was notorious for this kind of ambiguity, as evidenced by the non-inhaling aspect of his marijuana smoking. The perjury and obstruction of justice charges hinged on whether the alleged oral sex constituted sex the way intercourse did. The House of Representatives determined that it did and impeached Clinton, sending the trial to the Senate.
Here’s where things started going awry for the Republicans. They’d been trying to get rid of Clinton for two terms, which was virtually unprecedented in American history. Normally the opposing party just counters policy through the accepted checks-and-balances system and tries to win the next election; the Constitution doesn’t authorize impeaching presidents over policy disagreements or unpopularity. The GOP would get its wish, though, if perjury could be defined as one of the “high crimes and misdemeanors” that form the constitutional bar for impeachment. But the public was put off by the lurid details of the Starr Report, not understanding why Clinton’s investigators had taken such a detailed interest in his sex life.
Then journalists discovered that the Republicans’ ringleader, Newt Gingrich (R-GA), was having an affair with a young congressional aide as well, and they resurfaced the story that he’d asked his first wife for a divorce as she was recovering from cancer surgery. Ironically, Gingrich became the first victim of the Lewinsky scandal — yet another politician impaled by his own sword. After he stepped down, his designated replacement Bob Livingston (R-LA) admitted that he, too, had carried on extramarital affairs, and resigned in a remorseful press conference. Years later, it came out that Livingston’s eventual replacement Dennis Hastert (R-IL) had paid $1.7 million in blackmail to a wrestler he’d sexually abused as a high school coach. Hastert went to prison in 2016, technically for financial crimes connected to the hush money, since the statute of limitations on the abuse itself had expired.
By now, the public was also starting to realize that the Starr investigation had been a waste of time, because the investigators knew all along there was nothing to Whitewater, and the Clinton trial was looking like a witch-hunt. Did the U.S. really want to set a precedent whereby private citizens like Scaife could finance impeachment campaigns launched before anyone had any knowledge of criminal wrongdoing? After all, Clinton’s perjury didn’t cause the original investigation; it resulted from it. Why was there an investigation to start with? No good reason, as it turned out, and at most nothing that had anything substantive to do with his womanizing. The Senate sensed which way public sentiment was leaning and voted in Clinton’s favor, 55-45 on the perjury charge and 50-50 on obstruction, far short of the two-thirds needed to convict, as his approval ratings shot up to the highest of his presidency (73%). All the Democrats voted to acquit and ten Republicans broke ranks to join them on perjury. But Clinton’s goose was cooked, politically. He’d let down many Democratic voters, his VP Al Gore (whom he lied to about Lewinsky), and his wife, Hillary. He likely spent the remainder of his presidency sleeping on the couch.
The Lewinsky Scandal had substantive political fallout. First, it prevented Clinton from working with the GOP on reforming Social Security; some commentators think that Clinton and the Republicans had been working toward a deal to partially privatize it prior to his impeachment. Second, Republicans remembered that liberals gave Bill Clinton a pass on sexual transgressions, a memory that reemerged in 2016-17 as multiple women accused Donald Trump of harassment and he was even caught on tape bragging about adultery and groping (the Access Hollywood tape). Then news broke that he’d paid hush money to porn star Stormy Daniels. When liberals cried foul, conservatives retorted with the aforementioned whataboutism. Since both tribes are imperfect, the supply of whataboutism (as in, “well, what about…”) never runs dry, excusing any kind of behavior in a “race to the bottom.” Voltaire wrote that he’d made only one prayer in his life, “O Lord, make my enemies ridiculous,” and that God granted it. Then, during the 2017 #MeToo movement, when many women came forward with complaints against prominent actors, politicians, journalists, etc., some commentators reconsidered the Clinton scandal. In 2017 vocabulary, two of the four most recent presidents, Clinton and Trump, were sexual predators (depending on how one defines the term, so too were several past presidents, including Thomas Jefferson and John Kennedy). In retrospect, it might have been best for Democrats to force Clinton to step down. That would’ve allowed VP Al Gore to continue their agenda and perhaps put Gore in a better position to win the 2000 election, since incumbents often have an advantage.
2000 Election
When Al Gore, Jr. ran for president in 2000, he distanced himself from Clinton because of the Lewinsky Scandal. His opponents were George W. Bush (R) and Ralph Nader (G), the old consumer advocate of Nader’s Raiders fame from the 1960s who brought you seat belts (Chapter 16). Nader wanted to clean up Wall Street corruption, reduce CO2 emissions and, ironically, kick lobbyists out of Washington. Gore had to pander to farmers and coal miners to win the Democratic nomination, so the future subject of An Inconvenient Truth (2006) was ambiguous about reducing carbon emissions. Nader ended up siphoning some votes from Gore in the critical state of Florida, though only a tenth as many as the Democrats in that state who voted for Bush. But what made this election famous was the controversy on election night over who won and when. The media are in a tough spot calling states (naming the winner) before all the votes are in, because each network wants to be first but none wants to be wrong. The states themselves, with observers from both parties and independents, count and verify the votes and declare a winner.
Florida is a tough-to-predict “swing state,” with a sizable 25 all-or-nothing electoral votes at stake in 2000. Florida picked the winner in every presidential election between 1928 and 2016 except 1960 and 1992, and no Republican since Calvin Coolidge in 1924 has won the presidency without carrying the state. Like Ohio, it has a northern and a southern contingent culturally speaking, except that in Florida’s case, its South is up north in the panhandle and its North is down south with Yankee retirees. Complicating matters are Cuban refugees in south Florida who are swing voters themselves, except that in this case they were voting more Republican to punish Democrats for deporting six-year-old refugee Elián González back to Cuba. Elián escaped Cuba with his mother and others, but their boat capsized, killing everyone except Elián, whom fishermen found floating in an inner tube. Cuban Miami wanted him to stay in Florida with extended relatives, but Bill Clinton’s administration and Attorney General Janet Reno returned him to Cuba to reunite with his closest living relative, his father. González became a controversial symbol that politicians and media from all over the country argued about even if no one actually cared about the kid. He also likely cost Gore the presidency, since Gore inherited blame as Clinton’s VP.
The networks called Florida in Bush’s favor too early, and Gore even called Bush to congratulate him on his win, since the entire close election came down to Florida. Gore didn’t hold a press conference to concede the race, though, and he called Bush back to rescind his earlier concession when he started to catch up in Florida (with the other states tallied, Gore led the Electoral College 266-246 without Florida). With all the networks using blue on their maps to denote states Gore won, and red for Bush, the terms red and blue states entered the political vocabulary to represent Republican and Democrat, respectively, though earlier elections had assigned the two colors randomly. As you can see from the county map above, the GOP dominated sparsely-populated rural areas, while the Democrats did well on the coasts, in cities, and in areas with predominantly Hispanic and Black voters. Gore still won counties in south Florida, but not as decisively as he would have without the uproar over Elián González (counties don’t go all the way blue or red; all that matters to the electoral college is the state popular vote).
By midnight, Florida had narrowed and was too close to call. Bush wisely spun it as if he’d already won and was being challenged by Gore. Gore never should’ve made the first concession call, though, and the networks shouldn’t have called it in Bush’s favor. Having relatives at FOX, his brother Jeb as governor of Florida, and Florida’s Secretary of State Katherine Harris as his campaign co-chair helped Bush spin the election as a challenged victory rather than what it really was: a virtual tie. In a glaring if unplanned conflict of interest, the person in charge of the recount, Harris, worked directly for Bush. Gore wanted to recount either the entire state or at least certain disputed Democratic-leaning counties; Bush wanted to recount nothing, revealing a lot about what his camp suspected was the case. When Palm Beach County was recounted, officials found that many Democrats had accidentally voted for Republican Pat Buchanan because of the ballots’ confusing “butterfly” design.
Statewide, many voters of both parties had only partially poked through the circle next to their candidate, leaving a “hanging chad” — the little circle that dangles from the ballot without falling off — or a merely indented “dimpled chad” that the antiquated IBM punch-card counting machines couldn’t read. In the meantime, minorities claimed that white cops had intimidated them away from the polls (similar charges arose in Ohio, and toward Indians in South Dakota, in future campaigns).
The darkest moment — for those who value the American republic and see it as a global leader in democracy — came in Miami-Dade County, where Bush’s camp hired protesters to crash the county office and successfully intimidate officials into stopping their recount. This is the strategy Trump’s camp wanted to use in 2020, but counties in all states, Republican and Democratic, kept a buffer between the counters and protesters while allowing observers from both campaigns along with independents. Also different in 2020: Trump’s camp was trying to stop the original count rather than a recount (except in Arizona, where his dad was catching up), and Trump’s son Eric was calling for armed conflict. Eric Tweeted® a video of counters supposedly burning Trump ballots to undermine Americans’ trust in the voting process, but it was quickly revealed to be a fake. The Trump team also had more states to worry about and a contingent within the Republican Party, including his son-in-law, Jared Kushner, encouraging Trump to back down because the fight was futile.
In 2000, the GOP called on Watergate alumnus and future Trump operative Roger Stone, who understood the time-honored strategy of rationalizing a coup by accusing the other side of one. He later gloated that “one man’s dirty trick is another man’s civic participation.” Bush’s legal adviser, James Baker, flew rioters in from out of state to create enough havoc that citizens would want to stop the recount to save democracy and restore law and order. Miami’s mayor Alex Penelas, still miffed at his fellow Democrats over the Elián González affair, was AWOL rather than ordering police to clear the county office of the “Brooks Brothers Riot” (so named because Republicans are thought to dress better than Democrats). The key to any real law-and-order campaign worthy of the name is to incite criminal activity first, as seen in the AK-47-wielding Boogaloo Bois‘ role in Minneapolis in 2020. With no police protection, the election officials panicked, declared that they wouldn’t have time, despite a large team, to count 10k votes in four days, and called a halt to the recount. In 2020, Trump’s son-in-law Jared Kushner lamented not being able “to find a James Baker” to lead their fight in Pennsylvania.
The Supreme Court bought into the Bush campaign’s spin. In Bush v. Gore, SCOTUS stopped the recounts ordered by Florida’s Supreme Court: seven justices found that inconsistent recount standards violated the 14th Amendment’s equal protection clause, and five ruled that no constitutionally adequate recount could be completed in time. In the stay order that had halted the counting days earlier, Justice Scalia argued that counting votes of “questionable legality” would cast a cloud over Bush’s claimed victory, causing the new president “irreparable harm.” Why Bush was considered the victor in the first place went unexplained, though Watergate journalist Bob Woodward did find the Court’s equal protection argument sound. The decisive 5-4 vote to end the recount fell strictly along partisan lines, five conservative justices against four liberals. It was tortured logic and hypocritical given the GOP’s usual emphasis on states’ rights. When the recount stopped, Bush was ahead by 537 votes, and he won the Electoral College 271-266 despite losing the popular vote. Technically, Gore could’ve pressed on in Congress, but he conceded after the Court’s ruling.
Yet, even if the Republicans’ intent was to subvert democracy by preventing a recount, later independent recounts suggest that Bush might have won anyway. The Miami Herald and USA Today recounted the disputed counties and found that Bush won Florida by a small margin. When the National Opinion Research Center, hired by a consortium of media outlets, counted all the counties — what should have been done in the first place — it found that the winner depended on how the ballots themselves were counted (the hanging chads, etc.). Gore would’ve won the most restrictive but consistent scenario by 127 votes, and Bush would’ve won the most inclusive but consistent scenario by 110 votes. These studies don’t take voter intimidation into account; if there was even a shred of truth to those charges, it would’ve thrown the election off by more than these razor-thin margins. It was a tough loss for Gore, who defeated Bush nationally in the popular vote. He is one of five candidates in history to win the popular vote and lose the presidency.
The heated controversy surrounding Bill Clinton and the contested 2000 election dovetailed well with an increasingly confrontational media landscape and increased partisanship.
More On Media Fragmentation
As mentioned above, in addition to the basic responsibilities of citizenship, like voting and paying taxes, Americans now have to filter the news to find out what’s going on. Unfortunately, many of us aren’t up to the task, with the result that we not only disagree — perfectly normal and healthy, even necessary, in a democracy — but disagree without agreed-upon facts. For instance, the 2015 Iran Deal would’ve been complicated enough without huge swaths of Americans thinking that Obama’s administration paid Iran $150 billion in U.S. taxpayer money (ransom, in effect) rather than the truth: it unfroze ~$100 billion in Iranian assets locked up in overseas banks. During Merrick Garland’s failed Supreme Court nomination, many Americans believed in the Republicans’ mythical Biden Rule. Democrats later willfully misconstrued Florida legislation as a “Don’t Say Gay” bill and falsely claimed that the Texas GOP outlawed water breaks for construction workers in the summer heat (AP). Likewise, Sarah Palin’s death-panel hoax warped an already complicated Obamacare debate, with ~30% of Americans believing it.
In Spring 2020, the same percentage thought COVID-19 was introduced into human populations deliberately, with suspects ranging from the Chinese to Microsoft founder Bill Gates (PEW), while their arsonous kindred spirits in Europe scapegoated 5G phone towers. The virus might have leaked from the Wuhan Institute of Virology — the world’s biggest coronavirus research center — but, even so, there’s no evidence that its release was anything other than accidental. Of seven U.S. intelligence agencies studying its origins, five now favor the animal-spillover theory instead: that COVID-19 started in Wuhan’s Huanan Seafood Wholesale Market, from which 66% of early cases emerged (no early cases emerged in or around the WIV). It’s possible the virus traced to bats whose range shifted into southern China due to climate change and which then infected raccoon dogs or some other animal sold at the market. Mikki Willis’ Plandemic (2020) video misinformed millions more conspiracy-minded Americans precisely when they needed solid information, including dubious claims that masks and flu vaccines can cause coronavirus. The Web also “educated” us that snorting cocaine would cure COVID-19 and, early on, that “no one is actually sick.” Facing public pressure in July 2020, the Sinclair Group refrained from running a documentary on local stations blaming NIAID Director Anthony Fauci for starting the pandemic. Democrats understood that Fauci’s academic credentials and East Coast accent didn’t mean that he was spawned by Satan, but a 2021 New York Times-Gallup poll showed that they wildly overestimated the share of COVID-19 cases that resulted in hospitalization.
In another poll, respondents across the spectrum, especially liberals, overestimated the number of unarmed Blacks killed by police in 2019: most guessed between 100 and 1,000, whereas the real number was between 13 and 27, depending on the source, and most were unaware that the vast majority of such deaths involve civilians killing each other.
And liberal outlets were also slow to investigate the origins of the virus in Wuhan prior to PBS’s Frontline episode in 2021. On the right, in 2022, Tucker Carlson pushed vaccine and booster skepticism on his FOX viewers, who averaged ~65 years old, while refusing to share whether he’d been vaccinated himself. Misinformation caters to those who “do their own research” but don’t actually know how to research, partially because they start from the assumption that all “experts” are frauds.
At a 2019 Trump rally in Mississippi, a reporter asked a supporter if it mattered whether what he was hearing was true. The candid young man thought for a moment then said: “He tells you what you want to hear…And I don’t know if it’s true or not—but it sounds good, so [expletive] it.” (Atlantic) On the left, the post-truth era takes the form of “emotional truth,” as in the inflated stories of racial discrimination told by comedian Hasan Minhaj. This lack of agreed-upon reality, or dismissal of reality’s importance, is more dangerous territory than most people realize, especially in the political realm. Hannah Arendt’s study of totalitarian societies in the 20th century found that the ideal subjects of dictatorial rule are not committed to any particular cause like fascism, but rather people for whom the distinction between fact and fiction no longer exists. In 2022, George Santos (R-NY, right) won a seat in Congress not just by embellishing his résumé, but rather by running as a completely fictional character.
In 2018, Trump’s chief strategist Steve Bannon (left) told journalist Michael Lewis: “Anger and fear is what gets people to the polls…the Democrats don’t matter. The real opposition is the media. And the way to deal with them is to flood the zone with s**t.” We’ll give Bannon the benefit of the doubt that he wasn’t knowingly trying to sow chaos in Western society by doing exactly what Russia’s Vladimir Putin was simultaneously attempting with his “firehose of falsehood” (RAND) in an effort to weaken or even destroy the West. Like the left-wing Occupy Wall Street movement in 2011, Bannon was nihilistic insofar as he focused more on tearing down the existing system than on establishing realistic alternatives. Trump cynically launched his political career by promoting the Birther Movement that questioned Obama’s American citizenship, and he accused Obama of founding the ISIS terrorist group. During Obama’s presidency, the number of Americans who thought he was Muslim grew from ~20% to ~30% (he is a Hawaiian-born, non-denominational Christian with a Kenyan father and a white mother from Kansas). Billy Graham’s son, Franklin, was another prominent Birther. When he took office, Trump flipped the script by accusing his critics of fake news, unwittingly echoing George Orwell’s 1984 when telling his supporters that “what you’re seeing and what you’re reading is not what’s happening.” He told CBS’ Lesley Stahl that he trashed the media so that “when they say something negative about me, no one will believe them.”
As president, Trump falsely accused Hillary Clinton of rigging the 2016 election, suggested that liberals murdered conservative Supreme Court justice Antonin Scalia, and forwarded other dubious Tweets, including one claiming that Navy SEAL Team Six conspired to fake Osama bin Laden’s assassination and another showing a Muslim mob throwing a little boy off a building in London. When confronted about the fake Muslim video, Trump’s Press Secretary Sarah Huckabee Sanders said: “Whether it’s a real video, the threat is real and that is what the president is talking about.” Likewise, when Paul Gosar (R-AZ) posted an anime video depicting a character with his face murdering Alexandria Ocasio-Cortez (D-NY) by slashing her neck, his justification was that it was just a humorous way to educate Americans about the southern border, insofar as a recent “Marxist” spending package provided amnesty to undocumented workers and Ocasio-Cortez had voted for it. College football coach Mike Leach Tweeted a fake montage of an Obama speech — re-ordered, edited, and spliced into a different speech because, as he told ESPN’s E:60, he preferred the fake video to the real Obama and thought it would stimulate a better discussion on the proper role of government.
The post-truth era has arrived. But arguing about government policy based on mashup Obamas doesn’t enhance discussions. Similarly, if more old-fashioned misrepresentation than outright falsehood, Bernie Sanders patched together chunks of video to form a 2020 commercial suggesting that Obama supported him, though the two mostly didn’t get along. If the real Obama was too centrist for our liking, we can at least take solace that fictionalized Obamas will live on in perpetuity, like the posthumous Colonel Sanders® in Kentucky Fried Chicken commercials.
Limbaugh admired Trump’s knack for re-posting hoax conspiracies, then distancing himself the next day by saying that he was merely sharing others’ views. In keeping with that strategy, after accusing former VP Joe Biden of conspiring to murder all of Navy SEAL Team 6 to save Osama bin Laden, Trump distinguished between originating fake news and spreading it, arguing that the latter helped people “decide for themselves.” But while Trump argued the merits of spreading fake news, his Secretary of State Rex Tillerson warned about declining standards in a Spring 2018 graduation speech at the Virginia Military Institute: “If our leaders seek to conceal the truth or we as people become accepting of alternative realities that are no longer grounded in facts, then we as American citizens are on a pathway to relinquishing our freedom.” Disinformation and unsubstantiated conspiracy theories are a problem across the political spectrum, even if many of the prominent early examples came from the far right, as in other countries. At the height of the Truther Movement, many Americans believed that George W. Bush or the U.S. was behind the 9/11 attacks, or at least knew of them in advance, despite no quality evidence to support that (Polls).

Bannon and Trump’s campaign digital media director Brad Parscale (right) refined sketchy micro-targeting on social media. In the run-up to the 2016 election, with the help of consultants like Cambridge Analytica, they exposed rural Midwesterners to misinformation like Democrats favor welfare for Muslim polygamists. Facebook even provided coaching for politicians interested in micro-targeting, and CEO Mark Zuckerberg initially pledged to allow disinformation in future elections to support free speech and to avoid having tech companies moderate content, but then changed course when pressure mounted to censor fake news. Remember that Facebook, Instagram, and Twitter policies don’t concern the First Amendment, which applies to government regulation, not private companies. Initially, Facebook allowed Steve Bannon’s post calling for beheading Anthony Fauci and FBI Director Christopher Wray. During Roy Moore’s 2017 senatorial campaign in Alabama, one liberal posted disinformation indicating Moore supported banning alcohol (he lost the special election), triggering a debate within the Democratic Party as to whether they’d need to “fight fire with fire” to compete in the new environment. Internationally, though, early evidence indicates that social media favors authoritarianism and undermines democracy (see Schleffer & Miller’s optional article below). Parscale led Trump’s re-election campaign through July 2020 but texted on January 6, 2021 that he felt guilty because Trump had tried to start a civil war after losing.
The new media model favored partisanship, especially the movement conservatism hatched by William F. Buckley, Jr. (Chapter 16), who argued that, since there’s too much overlap and agreement among mainstream Americans on policy, it’s more politically effective to mobilize people along the lines of good vs. evil. According to historian Heather Cox Richardson, after the government revoked the Fairness Doctrine, FOX News fashioned its format less around dry policy than around the sort of narrative spun in Star Wars (1977), also aiming to push the same emotional triggers as pro wrestling.