“Reagan doesn’t have that presidential look” — United Artists Executive Rejecting Reagan for Lead Role in The Best Man (1964)
Liberalism’s wave started with FDR’s New Deal in the 1930s, crested with LBJ’s Great Society in the 1960s, and ran out of steam by the mid-1970s. Conservative Ronald Reagan won the 1980 presidential election by arguing that “In this present crisis, government is not the solution to our problem; government is the problem.” The Departments of Energy (1977-) and Education (1979-) expanded the federal bureaucracy somewhat, but the public mood was shifting back toward smaller government at all levels. Ralph Nader’s idea of creating a bureaucracy for consumer protection went nowhere. The Civil Aeronautics Board was abolished in 1985 and the Interstate Commerce Commission (railroads and trucking) in 1995, their safety-enforcement functions transferred to other agencies. To repurpose for liberalism what Winston Churchill said about World War II’s Battle of El Alamein: “this was not the end. It was not even the beginning of the end. But it was, perhaps, the end of the beginning.”
Does America really have a “big government”? Relative to the size of its economy, America’s government as of 2015 was fairly small by international standards, spending around 14.5% of GDP compared to Australia (18%), Germany (19%), Russia (19%), the United Kingdom (19%), and Canada (21%). These World Bank figures are lower than Office of Management & Budget figures (right), which usually hover around 20%. The U.S. leads the world in total expenditures, though, spending ~$3.8 trillion in 2016 while collecting $3.3 trillion in revenue. America spends more on its military than its next eight competitors combined and twice that of China and Russia combined (SIPRI), but doesn’t provide health insurance for those under 65, except through Medicaid, which is administered at the state level. Of course, spending and military power aren’t the only measures of a government’s size or reach; there’s also its legal/regulatory system.
Domestically, there’s a lot of noise about “tyranny” but, collectively, U.S. laws don’t stand out as overly oppressive compared to other nations’. Look at the chart on the right and think of how steadily the drumbeat has grown over the last decades about the government getting bigger and bigger. Some of that noise originates among those profiting from beating the drum, selling airtime on radio and cable. Some originates from quarters unfamiliar with real tyranny or suspicious of a “deep state” within intelligence agencies operating outside the public’s, or even the regular government’s, purview. Conspiracy theories are more entertaining and profitable than real knowledge or perspective. Still, many laws originate in agencies run by non-elected bureaucrats housed under the executive branch, such as the IRS, FDA, OSHA, FCC, and EPA, and it’s not surprising that citizens resist or resent these laws when bureaucrats don’t communicate well about why the rules are enacted, or when they administer them with a heavy hand. Governments grow because voting citizens across the political spectrum demand things, because agencies grow to police other agencies, because big companies lobby (i.e., bribe) politicians to pass regulations that small companies can’t afford to comply with, and because bureaucracies have a natural tendency to grow like fungi regardless of the first three reasons.
By the 1970s and ’80s, the Great Society era launched by Lyndon Johnson was waning, and the public was ready to send the pendulum back in the other direction, despite still wanting the services and protection government provided. You could trace one turning point in liberalism’s demise to New York City’s near bankruptcy, when the city ran out of money to pay public employees and had to look overseas to sell municipal bonds. While New York was the first city to tighten its budgetary belt, California was the first state, though in its case it cut taxes more than it cut spending. California passed Proposition 13 in 1978, a ballot initiative capping property taxes and limiting growth in assessed values, with increases beyond that requiring a two-thirds majority. The brewing conservative resurgence wasn’t just about cutting taxes, but also about reducing government intervention in the economy and reinjecting religion into politics. With the election of Ronald Reagan in 1980, the conservative revolution launched under Barry Goldwater in 1964 led to a fundamental changing of the guard in Washington and in many states.
Stagflation & Energy Crisis
By the 1976 election, the public wanted the most anti-Nixon, anti-Vietnam, anti-Watergate candidate they could find. They found him in Democrat Jimmy Carter, a Born-Again peanut-farming governor from Georgia untarnished by Washington politics. Watergate began a trend toward outsider candidates, resulting in state governors Carter, Reagan, Clinton, and Bush the Younger all winning the presidency. After defeating Gerald Ford in a tight 1976 election, Carter came to Washington with what some congressmen perceived to be a holier-than-thou attitude and didn’t work well with what he perceived to be corrupt Washington insiders. Like Richard Nixon, he had a fortress mentality in the White House, not initiating relations with congressmen on Capitol Hill. He alienated conservatives by creating the Department of Energy to try to wean the country off Arab oil (the GOP didn’t want more bureaucracy, and oil companies feared breakthroughs on alternative energy), and he alienated Great Society Democrats by overseeing deregulation and insisting on a balanced budget. In that way, Carter was more of an independent and fiscal (budgetary) conservative than a party-line Democrat. As a fiscal conservative, Carter alienated Democrats by refusing to go further into debt and Republicans by refusing to cut taxes. His main, seemingly intractable, problem was stagflation.
Normally prices don’t rise during a recession but, by the mid-1970s, the U.S. was mired in stagflation: the unlikely combination of inflation and high unemployment. LBJ’s Great Society and the lengthy Vietnam War raised the federal deficit, contributing to inflation. So, too, did President Nixon’s decoupling of American currency from the Gold Standard in 1971, after too many trade-surplus nations swapped greenbacks for a dwindling gold supply; American dollars had been convertible to gold dating back to the Bretton Woods Conference in 1944. Going off the Gold Standard created confidence, or fiat, currency instead. Greenbacks from then on were worth whatever people thought they were worth, based on their confidence in America’s solvency and survival. As for coins, they cost more to mint than the copper or nickel in them is worth. Caught between its two mandates (curbing inflation and supporting job growth), the Federal Reserve didn’t raise interest rates to stem inflation because it feared that would further weaken the job market.
As we saw in the previous chapter, OPEC (Organization of Petroleum Exporting Countries) embargoed oil in 1973 to show the West how dependent it had become. Then OPEC raised prices, and within a few short years oil climbed from $3 to $12/barrel. The Iranian Revolution that we’ll cover later in this chapter caused another big spike in oil prices in 1978-79. This coincided with increased dependence on foreign oil. Gasoline hit $1 per gallon for the first time in American history; even adjusted for inflation, that was far higher than the 5¢ it had cost in the 1950s.
To offset high prices and Peak Oil, the U.S. built the Trans-Alaska Pipeline from the North Slope, which was iced in much of the year, to the Port of Valdez, where tankers could ferry it to the Lower 48. That still wasn’t enough to offset the price hike. Many Americans overreacted by trying to hoard oil, not realizing that global price and local supply aren’t always linked. There was often no actual shortage in the pipeline. Either way, prices remained high, and oil is so important to industrialized societies that it can drive up inflation even during a recession, thus the stagflation. European countries tax oil enough to deliberately make it expensive and then use the taxes to pay for mass transit, encouraging people to conserve. The United States has historically devoted more to its military budget instead, partly to ensure the flow of cheap oil.
Detroit wasn’t well prepared to manufacture fuel-efficient cars in the 1970s. For years leading up to then, bigger was better. That coincided with the overall rise of European and Japanese industry from the ashes of WWII, and those countries were better at making smaller cars that got good mileage. The U.S. rebuilt them as industrial powerhouses after the war and succeeded beyond its expectations. They were now fully rebuilt, which was good for the world economy, but also meant that the U.S. wasn’t the only kid on the block. Among other things, that meant American unions would steadily weaken from that point forward, and most working-class families would need both parents working to keep up.
American manufacturing was entering a long, slow period of decline, as factory after factory in the Rust Belt of the industrial Northeast and Midwest shut down. That trend continued into the late 20th and early 21st century if you look at manufacturing jobs. But if you measure by output, American manufacturing has done well in recent decades. In fact, the U.S. is producing far more than ever; it’s just producing more with automation and lower-paid staff rather than union workers. There is even an unmet need for semi-skilled workers in American manufacturing that can only be met through immigration. As factory unions weakened, public unions suffered setbacks as well. Major cities like New York struggled with a combination of low taxes and well-pensioned public unions (policemen, firemen, sanitation workers, teachers, etc.).
President Carter’s pleas for Americans to conserve energy out of a sense of patriotism mainly fell on deaf ears. Carter learned that, while most Americans are patriotic when it comes to wars, they’re less enthusiastic about turning down the AC in the summer or the heat in the winter. Then a near meltdown at the Three Mile Island nuclear plant outside Harrisburg, Pennsylvania in 1979 dampened the public’s enthusiasm for atomic energy as a viable alternative to fossil fuels. Carter had trained as a nuclear engineer himself, serving in the Navy’s early nuclear submarine program, and visited the plant personally during the height of the crisis. Engineers staved off a meltdown of the inner core reactor, but the near miss spooked the public, and few new reactors went into construction afterward. Today, nuclear reactors supply ~20% of America’s electrical power.
The meltdown scenario was seemingly prophesied in a popular movie called The China Syndrome released just prior to the actual emergency. The film’s title refers to the fear that a fully melted core could sink into the earth, irradiating the soil and water around it. Problems with waste disposal and much worse crises in the USSR in 1986 (Chernobyl) and Japan in 2011 (Fukushima Daiichi) dampened the industry’s prospects despite the fact that it’s mostly carbon-free. President Eisenhower’s dream of always being within sight of one of the giant reactor chimneys as one drove across the country never happened, and neither did mini-reactors in each home or smaller devices to propel vacuum cleaners and other appliances. Nuclear-powered coils that could melt driveway and sidewalk snow didn’t happen in the 1950s, and they weren’t about to after Three Mile Island.
When Carter talked about the economic malaise the country was in, he came across as ineffectual in fixing the situation or as telling Americans something they didn’t want to hear. His administration was struggling to keep the basic cost of living and energy from rising faster than wages. For many Americans, especially white-collar workers or union workers with automatic cost-of-living adjustments (COLAs), the price-wage spiral allowed them to keep up (inflation requires wage spirals; otherwise no one would be able to afford the higher prices). However, the wage hikes were uneven. Some blue-collar workers lost ground, and retirees on fixed incomes saw their savings shrink at the rate of inflation — probably the cruelest effect of high inflation. Investors weren’t happy either; while stock prices rose in terms of nominal dollars, there was no real increase (adjusted for inflation) from 1965 to 1982.
In an inflationary environment, borrowing makes sense because the amount you’ll owe later is, in effect, less, so people who hang on to their jobs keep borrowing and spending, which drives inflation even more. In a desperate move to stop double-digit inflation, which stood at a staggering 13% by 1980, Federal Reserve Chair Paul Volcker raised interest rates dramatically, slowing the economy because fewer people could borrow, but at least reversing inflation’s rise. As the short-term borrowing rate between banks soared to 20%, the nation dipped into its worst recession since the 1930s, this one deliberately caused.
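The arithmetic behind that borrowing incentive can be sketched with a hypothetical loan (the dollar amounts are illustrative, not figures from this chapter): at 13% annual inflation, a fixed nominal debt loses roughly a third of its real value in just three years.

```python
# Sketch with hypothetical numbers: how sustained inflation erodes
# the real burden of a fixed nominal debt.
def real_value(nominal, inflation_rate, years):
    """Purchasing power of a future nominal sum, in today's dollars."""
    return nominal / (1 + inflation_rate) ** years

# Borrow $10,000 today and repay the same nominal amount in three years.
# At 1980's ~13% inflation, the repayment is worth only about $6,930 in
# today's dollars, so borrowers effectively come out ahead of lenders.
print(round(real_value(10_000, 0.13, 3), 2))
```

The same calculation explains why retirees on fixed incomes were the hardest hit: their savings were the lender's side of this trade.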
Some economists, led by Alan Blinder, argue that Volcker’s drastic actions were unnecessary because those prices would’ve subsided on their own, without government action. There were many causes of inflation in the 1970s besides low interest rates, and those would’ve naturally taken care of themselves, according to this argument. Just as municipalities struggled to make ends meet, so too the federal budget sagged under the collective weight of the Vietnam War, entitlement programs, the energy crisis, and rising food costs.
Slamming the brakes on the economy by raising rates also raised unemployment, which Volcker supporters like Milton Friedman claimed hovered naturally around 5% anyway. Unemployment shot up to nearly 11% (above). In truth, the country was in a bind that didn’t offer any easy solutions, and Carter sided with the conservative approach of Volcker and Friedman. America took its Volcker chemotherapy, killing inflation cells along with growth and employment cells. Things got worse before they got better, and the economy didn’t bottom out until 1982.
Carter made some changes that helped the economy long-term besides initiating the painful process of slowing inflation. After consulting with economists, he deregulated some industries that had been under the government’s control, including transportation (airlines, trucking, rail) and natural gas lines. Regional startups such as Southwest began to undersell big national airlines, challenging the five-headed, government-sanctioned oligopoly of United, Eastern, Braniff, American, and Delta. Communications followed the same trend, triggered by a 1974 anti-trust lawsuit that broke up Ma Bell in 1982-84 into regional “Baby Bells,” whose descendants include Verizon, AT&T, and CenturyLink. That opened up telecom for competitive pricing just before the advent of cell phones. Also, credit card companies won the right to charge unlimited interest rates. Carter also signed off on legislation allowing for the creation of Business Development Companies (BDCs) that gave small investors a tax-friendly way to invest in private businesses and riskier start-ups than those allowed in the SEC-regulated public stock markets. All this helped lay the foundation for economic recovery in the 1980s but wasn’t enough to help Carter at the time.
Meanwhile, Jimmy Carter had plenty of problems overseas to deal with. Building on Richard Nixon’s foundation, the U.S. officially normalized relations with China in 1979 but Nixon’s détente with the Soviets came unraveled under Carter. In Ethiopia, the Soviets gained influence in East Africa as a Marxist state killed hundreds of thousands in the Red Terror and various relocation schemes. The Soviets felt threatened by Carter’s emphasis on human rights and the arms race spiked dramatically because of better ICBMs (inter-continental ballistic missiles) and multi-warhead MIRVs (multiple independently targetable reentry vehicles) that, according to rumor at least, could be dosed with biological weapons. This LGM-118 “Peacekeeper” MIRV the U.S. tested over the Kwajalein Atoll divides into eight 300 kiloton warheads, each ~20x more powerful than the Hiroshima bomb if detonated.
In the 1970s, each side was testing submarine-launched ballistic missiles (SLBMs) and air-launched cruise missiles that could be loaded onto traditional bombers like America’s B-52s. The U.S. placed medium-range cruise missiles similar in design to the Nazis’ old V-1 “flying bombs” in southern England, loaded onto mobile launch pads in the payloads of trucks parked in underground bunkers. These relatively cheap warheads, each costing only ~¼ as much as a jet fighter, had a 2,000-mile range and could incinerate a small city, burning everyone to death within a ten-mile radius. In general, the Soviets focused more on size whereas the U.S. focused on accuracy. Each side additionally worked on neutron bombs that could wipe out life without destroying property, though the U.S. shelved plans to arm NATO with the new weapons due to public pressure. Near the end of his presidency, Carter announced that both sides had more than 5x as many warheads as they had in the early 1970s. A round of SALT (Strategic Arms Limitation Talks) between Carter and the Soviets slowed the madness some, limiting each side to 2,400 strategic delivery vehicles and convincing the Soviets to halt production on new MIRVs that carried up to 38 separate warheads.
Middle East Turmoil
Unfortunately, six months after the SALT II talks in Vienna, in December 1979, the Soviets invaded Afghanistan to support communist forces in a civil war there against the jihadist Mujahideen. In response, Carter embargoed agricultural trade, boycotted the 1980 Moscow Olympics, and issued a doctrine stating America’s intention to protect its oil interests in the Middle East. However, nothing could compel the Soviets to relent in their misguided quest to conquer Afghanistan, and arms reduction talks stalled.
West of Afghanistan, resentment had been building in Iran against the U.S. ever since 1953, when the CIA overthrew the country’s elected government and replaced it with a dictatorship (the Shah) that sold the West cheap oil. Unfortunately for Carter, he reaped what President Eisenhower and his successors sowed. The only accommodation the Shah had made to free speech was within mosques, so anti-Western sentiment fused with fundamentalist Islam there over the decades. When fundamentalist revolutionaries took over the country in 1979, the Shah escaped to Mexico, then sought cancer treatment in the U.S. Admitting the Shah was the straw that broke the camel’s back, and the new leaders seized the American embassy in Tehran, capturing diplomats and Marines in the process.
At first, Carter hoped the Iranian Hostage Crisis could divert Americans’ attention away from the domestic economy, but that backfired as the crisis wore on and ABC’s Nightline covered angry Iranians burning Uncle Sam in effigy on a nightly basis. For Americans still in a post-Vietnam funk, it was like getting salt poured in their wound. Finally, Carter ordered a military rescue, but a sandstorm compromised the mostly helicopter-based operation and a C-130 tanker aircraft crashed, killing eight. The fiasco only raised the prestige of revolutionary leader Ayatollah Khomeini, imprisoned by the Shah in 1963.
The one big feather in Carter’s foreign policy cap was negotiating peace between Israel and its most formidable rival, Egypt. Carter built on the Shuttle Diplomacy initiated by Henry Kissinger in the early 1970s, whereby the U.S. no longer supported Israel unconditionally but rather tried to broker peace between Israel and its neighbors. Building on an idea raised by CBS News’ Walter Cronkite in a split-screen interview with the leaders of Israel and Egypt, Menachem Begin and Anwar el-Sadat, Carter invited both to Camp David, Maryland for a retreat. At first, he had to walk from one end of the compound to the other to relay messages, but he eventually got both men in the same room to talk through interpreters. In the Camp David Accords, Israel agreed to return the Sinai Peninsula in exchange for Egypt’s recognition of Israel’s right to exist. The two have been at peace ever since, though the democratic revolution in Egypt in 2011 threatened the relationship. As for Sadat, his own army assassinated him during a parade for negotiating with Israel, underscoring the resistance to peaceful compromise that Middle Eastern leaders face among their own populations. A similar fate awaited Israeli leader Yitzhak Rabin after he laid out a framework for peace with Palestinians within Israel in the 1993 Oslo Accords. He was killed by a right-wing Israeli opposed to peace.
For Carter, his success with Israel and Egypt wasn’t enough to offset setbacks in Iran and Afghanistan. In retrospect, Afghanistan was causing the Soviets more harm than the Soviets were causing the U.S., but Iran plagued Carter as he approached reelection in 1980. After months of negotiations to get their assets unfrozen in American banks, Iranians released the hostages within minutes of when Carter left office. As President Reagan took the oath, the hostages hit the airport tarmac.
Morning In America
It’s hard to say whether the Iranian Hostage Crisis cost Carter re-election or not. By 1980, the time was right for the Reagan Revolution, as Americans were ready for a conservative change of pace. The Misery Index, as economists came to call the sum of the inflation and unemployment rates, set the stage for Republican victory by Californian Ronald Reagan over Carter in 1980. The actor and former liberal Democrat had turned to the right in the early Cold War and become governor of California in 1966 after campaigning for Barry Goldwater’s presidential run in 1964. With his telegenic charisma and jocular charm, “The Gipper” stole the show with a rousing speech at the 1976 GOP convention even as the party anointed Gerald Ford as its candidate. 1980 was a watershed election, on par with 1932 in terms of swinging the American political pendulum back toward the right, just as ’32 had swung it to the left. For the first time since the 1930s, Republicans managed to pry away a significant chunk of blue-collar workers. Many of these Reagan Democrats wanted to restore military pride or, in the case of some Christians, opposed the Democrats’ pro-choice abortion platform (the Supreme Court had legalized abortion in Roe v. Wade in 1973). In general, Reagan tapped into public skepticism about government agencies and programs launched under Johnson in the 1960s. As we read in Chapter 16, one commentator said that “Goldwater lost against the New Deal, but Reagan won against the Great Society.”
Reagan’s first speech after his nomination was in Philadelphia, Mississippi, where three civil rights workers were killed in 1964. Why Philadelphia? Obviously, Reagan and his advisers weren’t just throwing darts at a map and picking random towns instead of larger cities. While not endorsing segregation or violence, he used the occasion to remind the townspeople of how much he’d always appreciated their commitment to states’ rights in the context of a talk on the innocuous subject of education. Reagan never criticized Blacks directly, though he disliked the 1964 Civil Rights Act, but he exploited the public’s resentment toward people who were taking unfair advantage of the welfare system. He asked working-class white audiences if they were tired of working hard for their paycheck and then going to the grocery store and seeing a “strapping young buck” ahead of them in line with food stamps. Prior to the Civil War, “bucks” or “studs” referred to healthy male slaves whom masters purportedly encouraged to reproduce with “wenches.” Reagan popularized the term welfare queen during his 1976 campaign for women who took advantage of the government’s well-intentioned idea of paying single unemployed moms more than married couples. The term derived from a Chicago woman named Linda Taylor who, in 1974, was caught defrauding the government with multiple identities and sentenced to 2-6 years in prison. Reagan’s audience understood who the strapping young buck and welfare queen were. He wasn’t going to beat people over the head with explicit racism (he wasn’t stupid), but neither was he going to leave the old southern Democratic voters and George Wallace supporters on the table (because, again, he wasn’t stupid). In an infamous 1981 interview (YouTube), South Carolina GOP strategist Lee Atwater explained how his party won over racists without sounding overtly racist.
Reagan was continuing with a variation on the GOP’s Southern Strategy, begun under Nixon to siphon off the racists left over from the Democrats’ endorsement of civil rights in the 1960s. The concern over welfare abuse transcended race, though. In Hillbilly Elegy: A Memoir of a Family and Culture in Crisis (2016), J.D. Vance recalls how working-class Whites in the Appalachia of his youth resented other, lazier Whites who ate (and drank) better than they did by staying on the public dole permanently, spurring the workers to abandon the Democratic Party.
Republicans gained control of the South, fulfilling LBJ’s prophecy about the Civil Rights movement, partly through various Southern Strategies on race, partly through their general limited government philosophy (including cracking down on welfare abuse among all races), and partly through their new alliance with Christian Fundamentalists. Fundamentalism had been growing since the 1970s and abortion, legal since Roe v. Wade in 1973, gave Republicans an excellent wedge issue to galvanize their alliance around, along with Christian nationalism. Consequently, many Reagan Democrats, North and South, Protestant and Catholic, crossed the aisle and voted GOP for the first time, regardless of their economic class. Economically, they were either willing to sacrifice their interests on behalf of outlawing abortion and strengthening the military or were convinced that helping the wealthy would ultimately create working-class jobs through trickle-down economics (more below on Reaganomics).
Religion took on a revived role in American politics during the Carter-Reagan era. Carter was born-again and wanted to carry Christ’s message of peace into the real world. Reagan, too, was interested in the New Testament, especially its last book, the Book of Revelation, which he suspected might foreshadow an apocalyptic showdown between America and the USSR. He secured a lasting alliance between Christian Fundamentalists and the Republican Party. Today no candidate could run for office without fully explaining his or her faith. Mormons and Jews are more or less welcome to join the sweepstakes along with Christians; but it’s safe to say that Muslims, Hindus, New Agers, agnostics, and atheists need not apply.
Economics, though, was where the right-wing Reagan set himself apart the most from Democrats. Just as FDR wanted to jump-start the economy through government spending, Reagan’s supply-side economics reversed the concept of Keynesian stimuli, focusing not on government spending but on tax cuts, especially for the wealthy and corporations. Several major corporations were essentially on welfare throughout Reagan’s presidency because their tax rebates exceeded their tax bills. It’s difficult to tell how much the wealthy were actually paying on income taxes prior to 1980, but the top rates went from 70% when Reagan came into office down to 28% by 1988, so they were the biggest and most obvious beneficiaries of his election.
Was all this “Reagan-Hood” as his critics charged? In other words, did Reaganomics really steal from the poor and give to the rich, the opposite of Robin Hood? Yes and no. He helped the rich plenty, but his record was mixed on the poor. He cut food stamps, most forms of student aid (e.g. Pell Grants), and painkillers from disability coverage, leading to a black market in drugs like oxycodone. However, spending continued through Reagan’s presidency on most of the core New Deal entitlement programs and much of the welfare from the Great Society. Some tax burdens were shifted to the states but still came out of paychecks just the same. Really, Reaganomics kicked off an era when Americans continued to spend on core entitlements while voting themselves tax cuts. Reagan’s budget director, David Stockman (right), a follower of Austrian free-market economist Friedrich Hayek (Chapter 9), quickly learned the limits of what a full-blown Reagan economic revolution would entail. Stockman, the “father of Reaganomics,” realized that cutting most non-military spending would decimate “Social Security recipients, veterans, farmers, educators, state and local officials, [and] the housing industry…democracy had defeated the [free market] doctrine.”
The result of Reagan’s concession to core New Deal programs, when combined with increased military spending, was ballooning debt. Reagan candidly told American voters that he was willing to plunge the country into debt to win the Cold War if that’s what it took. Adjusted for inflation, Abraham Lincoln and Franklin Roosevelt are the runaway leaders in growing the size and cost of the federal government because of the Civil War and World War II. But aside from them, which presidents oversaw the most growth in the size of the national government? Surprisingly, George W. Bush (87%) and Reagan (82%) in non-inflation-adjusted numbers (Source: USGovernmentSpending.com). This is as good a time as any to remind ourselves that, while presidents submit budget proposals, under the Constitution Congress, not the president, is in charge of the nation’s purse strings.
Reagan’s supporters often claim that the debt-to-GDP ratio actually shrank under Reagan, meaning that the economy grew more than the debt, and the ratio of federal spending to the overall economy fell, but that’s not the case. The Debt-to-GDP ratio grew from ~30% to 40% in the 1980s (at the top of the chapter we looked at annual spending vs. GDP rather than debt). According to the theory of supply-side economics, lower tax rates were supposed to stimulate growth sufficiently to increase overall tax revenues but that didn’t happen. Nonetheless, the economy took off on a long bull run, lasting through the late 1990s.
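The ratio arithmetic is worth spelling out, since both the numerator and the denominator move. With hypothetical round numbers (not official Treasury figures), debt can grow alongside a growing economy and still push the ratio from ~30% toward ~40% whenever it grows faster than GDP:

```python
# Illustrative arithmetic with made-up round numbers: the debt-to-GDP
# ratio rises whenever debt outgrows the economy, even if both expand.
def debt_to_gdp(debt, gdp):
    return debt / gdp

gdp, debt = 3.0, 0.9                         # trillions (hypothetical)
start = debt_to_gdp(debt, gdp)               # 0.9 / 3.0 = 30%
end = debt_to_gdp(debt * 1.75, gdp * 1.30)   # debt +75%, GDP +30%
print(f"{start:.0%} -> {end:.0%}")           # prints "30% -> 40%"
```

This is why a growing economy alone didn’t keep the 1980s ratio flat: the debt side of the fraction grew faster.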
Did increased wealth “trickle down” to workers as supply-side advocates promised? Again, yes and no. There’s no doubt the economy grew over the next twenty years, and the booming stock market of the 1980s and 90’s helped workers tied to pensions and 401(k) retirement funds, along with stimulating overall growth. As the graph to the left shows, working classes didn’t suffer significant wage reductions on average between 1980 and 2007 (just before the Great Recession). The overall earning power of most workers stagnated, though, except for people in the upper 5-10%.
Many economists claim that it’s wrong to look at the economy like a pie and ask who is getting the biggest piece because the pie itself is growing. But regardless of the size of the pie, it is finite at any given moment, and the gap between rich and poor has widened between 1980 and the present, with most of the money trickling up. While the rich have gained more proportionally than the working and middle classes, the ultra-rich (top 1%) have gained far more than the rich, middle, or poor. The top 1% lost ground in the Great Recession of 2008-09, but when the slow recovery kicked in around 2010, they increased their lead over the bottom 99%. Much of that wealth is in the hands of entrepreneurs who’ve created jobs and products for the rest, but much of it has gone to investment bankers and hedge-fund managers who stash their earnings in offshore accounts to avoid paying taxes. For them, deregulating Wall Street was a welcome part of Reaganomics; the trickle-down part, not so much. They lobbied politicians to legislate tax loopholes for pennies-on-the-dollar. Tax shelters, combined with the fact that taxes are lower on investments than earned income, mean that most wealthy people now pay a lower effective tax rate than the middle and upper-middle classes.
By 2012, the richest 400 people in the U.S. had more money than the bottom 50% of the population, and wages stood at an all-time low as a percentage of GDP. One of the wealthiest, Warren Buffett, suggested that millionaires pay a 30% minimum tax rate regardless of whether their earnings are from work or investments (some already do). The majority of Americans favor the idea, but it’s difficult to see the Buffett Rule going anywhere given politicians’ need to win campaign donations from the very people (0.06% of the population) who would suffer under the heavy hand of “totalitarian government” in that scenario. Opponents argue that keeping taxes lower on investments than earned income spurs growth and that the revenue increase would be minor anyway.
Another key to the Reagan Revolution was deregulation. Reagan was adamant in his aforementioned philosophy that government is not the solution to our problem(s); government is the problem. The deregulatory trend started under Carter in the 1970s but gained momentum under Reagan. He rolled back environmental and workplace safety regulations and got rid of many rules governing banking and accounting. The financial changes, especially the evolution of Special Purpose Entities (SPEs), contributed to problems like the Savings & Loan Crisis, Michael Milken’s junk bond-related fraud, and Enron in the late 1990s. The taxpayer bailout of corrupt Savings & Loans cost Americans 3-4% of GDP between 1986 and ’96. On the other hand, allowing accountants to “cook the books” may have stimulated the economy, and Milken used some of the money he stole to help fund medical research. But, ideally, one purpose of accounting, other than running a business responsibly for your own sake, is so that other people (employees, investors, the IRS) can get a feel for what’s going on, not what isn’t. Reagan also cut funding for the Small Business Administration (1953-), the only government agency aimed at helping small entrepreneurs, but one that could be spun as yet more bureaucracy costing taxpayers money because of some failed loans.
Mergers were another hallmark of Reaganomics, as courts were hesitant to prosecute monopoly cases. Remember all the hullabaloo about trust-busting in the Progressive Era? In 1986, an upstart football league called the USFL, led by Donald Trump, sued the NFL in an antitrust case. A lower court determined that the USFL was right — the NFL did have a monopoly on pro football — and awarded the USFL a grand total of $1. While that particular case wasn’t influential, it symbolized the era. The Reagan Revolution paved the way for a wave of mergers in the 1980s and ’90s. Unions didn’t do much better than trust-busters in the 1980s. Shortly into Reagan’s presidency, air traffic controllers went on strike. He had the FAA order them back to work and fired the more than 11,000 who refused.
Deregulation also impacted communications in 1987, when the FCC (Federal Communications Commission) scrapped its 1949 Fairness Doctrine requiring TV and radio broadcasts to be fair and “tell both sides of a story.” Such a rule was arguably a violation of the First Amendment, but its retraction fragmented news into what it is today, where most conservatives and liberals just listen to spins on their own tribe’s websites, networks, or radio shows, with little center of gravity in the middle to rely on for “straight news.” Rush Limbaugh launched his radio show in 1988, a year after the deregulation, and almost everything now is what people used to refer to as op-ed: opinions and editorials. The advent of the Internet then obliterated any hope of an agreed-upon reality. People can Tweet® or vent on blogs with like-minded people, or have profanity-laced exchanges with faceless adversaries in comment boxes that go nowhere. In addition to the basic responsibilities of citizenship, like voting and paying taxes, Americans now have to filter news to find out what’s going on. Unfortunately, many of us aren’t up to the task, with the result that we not only disagree — perfectly normal and healthy in a democracy — but that we aren’t even disagreeing based on agreed-upon facts. Since it’s human nature and easier to confirm preconceptions, most people just choose their “truths” from a virtual buffet table of options, and at least some modern politicians are learning to take advantage of what commentators are calling the “post-truth” era. Here’s one person’s take on today’s Anglo-American media landscape, which you should feel free to dispute. The point here isn’t the accuracy of the diagram — with which presumably nearly everyone with a pulse would quibble — but rather that many Americans only feed off the edges of the buffet table, then share posts on social media that friends and relatives either already agree with or ignore.
There’s little to no constructive or intelligent debate within sites on the edges of the spectrum and often bogus misinformation or “fake news.”
Biased media is, historically, a return to the way journalism operated in the late 18th and early 19th centuries, when no one made any pretense toward objectivity, except that now it’s happening among a far broader, more diverse population. There’s an unseen benefit to all this if you have the time and fortitude to stomach it. If you expose yourself to a wide spectrum of media, at least higher-grade versions of it, you can be better informed today than someone who simply watched the “straight” nightly news forty or fifty years ago.
Finally, drawing and re-drawing congressional district boundaries took on greater importance in American politics in the late 20th century, though the practice has a longer history and transcends the conservative resurgence. Gerrymandering, named after Revolutionary-era Massachusetts politician Elbridge Gerry’s salamander-shaped district (right), is as old as American politics and stems from the problem of how to divide up a state’s congressional districts. Senators don’t present this problem because each state simply elects two statewide. With the House of Representatives, however, there is no perfect or fair way to map districts given the irregular shape of most states and the shifting populations within them. Even in a square state like Wyoming, dividing the map into four equal squares wouldn’t yield equal populations, although Wyoming’s population is so low that it’s not a problem anyway: with only one representative in the House, the whole state is one district. Gerrymandering maximizes one party’s capacity to win seats by herding opponents’ voters into as few districts as possible, or by spreading and diluting those votes across districts. This diagram, if a bit hyperbolic in using “steal” in its title (and misleading in that it should say elections, plural, since it refers to several within a state), shows two simplified versions with right angles:
We could discuss Gerrymandering at any point in the course but, by the late 20th century, computers enhanced the efficiency of redistricting. Also, like the media deregulation mentioned in the previous section, enhanced Gerrymandering has increased partisanship. Both phenomena divide and sort us. Per the Voting Rights Act of 1965, courts have generally struck down Gerrymanders aimed at racial discrimination but sanctioned those aimed at partisan discrimination. Obviously, there’s a lot of overlap between racial and partisan discrimination since minorities have tended to vote Democrat since 1964. The Supreme Court hasn’t ruled on purely partisan Gerrymandering since 2004 but will rule on a Wisconsin case in 2018. Allowing the party in power to draw up their own redistricting lines, as 38 states currently do, exacerbates the problem, creating a situation where “politicians pick voters” nearly as much as voters pick politicians.
Texas, one of the 38 yellow states above, is another notorious example. Democrats once drew up the lines to favor themselves and, since the Reagan Revolution of the 1980s, Republicans have done likewise. Austin, for instance, is the “bluest” (most liberal) city in Texas and one of the most liberal in the country — a “blueberry in a bowl of tomato soup,” as comedian Jon Stewart called it. Under former Congressman Tom DeLay (R), the GOP created the 25th District, aka the “Fajita Strip,” to condense Democratic voters into one district — writing that district off but limiting Democrats’ overall impact.
When courts shot down the Fajita Strip, the GOP created a map whereby five of Austin’s six congressional representatives were Republican. The two basic ways to Gerrymander are “packing” and “cracking,” and the GOP has used both on Austin. To pack is to herd like-minded constituents into one district, such as the Fajita Strip, to minimize their impact. To crack is to divide a city like Austin into the tips of multiple wedges that fan out into conservative districts large enough that a liberal city ends up represented by mostly conservative congressmen (see Districts #10 & #17 below).
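Packing and cracking can be shown with a toy calculation. The numbers below are invented: 500 voters split 40/60 between two parties, carved into five equal districts two different ways:

```python
# Toy gerrymandering illustration (hypothetical numbers).
# Each district is a (blue_votes, red_votes) pair; 100 voters per district,
# 200 blue and 300 red overall (a 40/60 split) in both maps.
def seats(districts):
    """Return (blue_seats, red_seats) — whoever has more votes wins a district."""
    blue = sum(1 for b, r in districts if b > r)
    return blue, len(districts) - blue

# Packing: herd most blue voters into one overwhelming district.
packed = [(90, 10), (30, 70), (30, 70), (25, 75), (25, 75)]

# Cracking: spread blue voters thinly so they fall short everywhere.
cracked = [(40, 60), (40, 60), (40, 60), (40, 60), (40, 60)]

print(seats(packed))   # blue wins only 1 of 5 seats
print(seats(cracked))  # blue wins 0 of 5 seats
```

Same voters, same statewide split; the map alone swings the delegation, which is why whoever draws the lines holds so much power.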
When packed rather than cracked, Gerrymanders result in clear red and blue districts whose voters don’t elect centrist candidates. In the House of Representatives, that results in a collection of staunch partisans elected for the very purpose of doing battle with the opposing party, not compromising. Many conservative voters see compromise as a sign of weakness and force their candidates to pledge that they won’t compromise, just as many Barack Obama supporters — and perhaps, subconsciously, even his opponents — saw the former president’s willingness to compromise as a weakness. Gerrymandering also increases the motivation to cheat or rig the system in those few swing counties that aren’t red or blue. If voters themselves aren’t more divided than they were in the past, they’re at least better sorted. They choose their own sets of experts and facts from the media buffet table, vote for candidates from districts deliberately made as partisan as possible, and have increasingly come to self-segregate by moving to “red and blue” areas to be around like-minded people. Consequently, congressional districts with “swing voters” have shrunk steadily over the last thirty-five years. In Texas, only District #23 above, stretching from the western suburbs of San Antonio all the way to El Paso, is currently up for grabs. This mostly Hispanic district is currently represented by ex-CIA officer, African-American Republican, and outspoken Trump critic Will Hurd, who was student body president at Texas A&M during the bonfire tragedy of 1999.
The post-Reagan era kicked off with what seemed at the time like a depressing campaign — one that served mainly to underscore the superficiality of media coverage and the tendency in democracies for campaigners to manipulate voters by appealing to their worst instincts. That wasn’t because either of the candidates was bad. The 1988 race pitted Reagan’s VP, George H.W. Bush, against Democrat Michael Dukakis of Massachusetts. Since New England is generally more liberal than the rest of the country, the Democrats balanced the ticket with Lloyd Bentsen of Texas, hoping to recapture the Austin-Boston magic of the 1960 Kennedy-Johnson ticket. Dukakis led midway through the summer, but Bush’s media consultant Roger Ailes (Chapter 16, future head of FOX News) and campaign manager Lee Atwater came up with an ad attacking Dukakis’ weakest point besides his unfortunate photo-op in a tank.
As governor of Massachusetts, Dukakis oversaw a prison furlough program that allowed prisoners out on temporary weekend passes. One of the convicts, Willie Horton, broke into a home and raped a woman. Horton was black, and a group called Americans for Bush flooded the airwaves with his mug shot (left), asking viewers if they wanted someone soft on crime. Atwater said the ad would “strip the bark off the little bastard” [Dukakis] and “make Horton Dukakis’ running mate” [for VP]. Dukakis made the soft-on-crime spin worse by answering no to a tough debate question from CNN’s Bernard Shaw over whether he’d favor the death penalty for someone who raped and murdered his wife. While his answer was clear and he backed it up by arguing that studies showed the death penalty was not a deterrent, viewers were put off that the question hadn’t stirred any visible emotion in him.
Bush campaigners also made up stories that Dukakis burned American flags to protest the Vietnam War and that his wife was mentally ill, though Bush distanced himself from the smears. When Atwater was diagnosed with brain cancer a couple of years later, he converted to Catholicism and issued an apology to Dukakis for the “naked cruelty” of the 1988 campaign. At the time, though, it was enough to pull his candidate ahead in the race, and Bush won the election with no real help from his old boss Reagan, with whom he had broken. The Horton ads, along with the Rodney King arrest and riots of 1991-92 and the O.J. Simpson murder trial of 1994-95 that we read about in the Civil Rights chapter, served as unpleasant reminders of America’s ongoing racial conflicts and undertones. In political campaigns, though, even thinly veiled racism tapered off for the most part between the 1990s and 2016, at least among the politicians themselves.
Crime & Punishment
Since such shenanigans are typical of political campaigns, none of this would normally be worth mentioning. However, like the 1964 campaign during the escalation of the Vietnam War, the 1988 campaign had a lasting impact that transcended just deciding the next president. Fellow Democrats learned from Dukakis’ Willie Horton fiasco and vowed to be tougher on crime. In a case of bipartisan agreement, they “crossed the aisle” and by and large supported Republican policies of tougher sentencing (Democrats had always supported increased funding for staffing more police). Consequently, American prisons filled with small-time offenders through mandatory minimum sentencing, whereby judges didn’t have the discretion to lower sentences, with the proportion of African American prisoners growing substantially.
However, another rare bipartisan consensus is emerging to reduce prison populations, which are larger in the U.S. than anywhere else in the world and cost taxpayers money in an era of tight budgets. There are many factors contributing to crowded prisons. While incarceration for drug possession is commonly blamed, only 15-20% of today’s inmates are in for drug-related offenses. Federal judges now offer Drug Court treatment options for non-violent offenders. But many people who would’ve been in mental hospitals forty years ago are homeless or in prison. Moreover, petty offenders are taking up so much space that judges are forced to reduce sentences and grant earlier paroles for violent criminals. Supporters of tougher sentencing point to charts like the one on the right and see direct causation: crime has gone down precisely because more criminals are behind bars. In that scenario, the cost of criminality has been shifted (and spread more evenly) from individual victims to taxpayers at large, and the streets are safer.
Undoubtedly that’s part of the explanation, even if the U.S. is still the most violent developed nation, especially in the South. However, there are other factors impacting serious crime, including better surveillance (especially in the post-9/11 era), GPS-enhanced phone records, more thorough cross-checking of databases against fingerprints that small-time criminals have already entered, and “broken windows” strategies that reduce serious crime by curbing petty crime and cleaning up garbage. Studies and experience have shown that improving physical surroundings (garbage, graffiti, broken windows, etc.) and strictly enforcing minor crimes lowers the rate of serious crimes like murder, rape, and armed robbery. But increased prison populations have libertarians on the right and civil rights progressives on the left calling for penal reform, and the issue has proven popular among Democrats and Republicans looking for a wedge issue among the working classes.
George H.W. Bush (hereafter referred to as Bush 41, since he was the 41st president) was popular midway through his presidency because of his handling of the Gulf War — so popular that most front-runners in the Democratic Party chose not to run in 1992. However, having inherited a large deficit from his predecessor Reagan, Bush broke his famous “read my lips” campaign promise and raised taxes to balance the budget. Cynical Democrats goaded him into it as an act of fiscal responsibility, then held it against him. Bush also alienated fellow Republicans by beefing up the Clean Air Act (1963-), expanding federal research on pollution and stemming acid rain and ozone depletion. Mostly ignored by conservatives and liberals alike was Bush’s help in fighting HIV-AIDS. Like his son George W. Bush, who later authorized relief to African countries as president (2001-09), Bush the elder aided the American cities most afflicted by the disease through the 1990 Ryan White CARE Act. Bush 41 also signed legislation outlawing an important type of discrimination with the 1990 Americans with Disabilities Act.
The country dipped into a mild recession in 1991, and Bush seemed to some detached in his response, possibly because he was smart enough to realize that mild recessions happen all the time and aren’t the president’s fault. Americans, overall, overestimate what presidents can do to revive or harm the economy. It’s not run or handled by any one person and, if it were, that person would be the Chair of the Federal Reserve. Bush’s problem was more a public relations crisis than anything else. He tried to connect with regular people by visiting a grocery store but, when he appeared never to have seen a checkout scanner, the stop just added to his image as a detached blue blood. Bush stumbled into the 1992 campaign vulnerable, facing off against a little-known young Democratic governor from Arkansas, Bill Clinton, and a colorful independent, Texas computer mogul Ross Perot, who supported protectionism (tariffs) and opposed free trade.
Bill Clinton was prominent in the Democratic Leadership Council, a group of “third way” policy wonks who argued that the Democrats should stay competitive by moving to the right (toward the center) and acquiescing in the Reagan Revolution. Other than Jimmy Carter, the Democrats had been getting pummeled since 1968. At its best, the DLC represented a refreshing and realistic change beyond the usual partisan tire ruts, helping to rid the Democrats of their unconditional support for the welfare state and their anti-business reputation. At its worst, it just meant that Democrats would sell out to corporations and high finance, giving Wall Street virtual control over both parties. Liberals organized an anti-DLC conference in Washington under the motto Because One Republican Party Is Enough. While they advocated slightly higher taxes on the rich than Republicans did, the new Democrats appealed as much to the leisured classes as to the inner-city poor. Charles Schumer (D) of New York defended tax loopholes for hedge-fund managers, claiming that, like all senators, he needed to look out for the constituents of his state — a constituency apparently not including the 99% of New Yorkers who aren’t billionaires.
A key difference for the new Clinton-led Democrats was their support of free trade. Unions had weakened a lot by the late 20th century and Democrats needed other sources of money, so they turned increasingly to corporate contributions. For leftists, the answer would’ve been to strengthen unions by resisting free trade; for centrist Democrats, the answer was to get realistic and start winning elections, then do their best to barricade against the more extreme aspects of the Reagan Revolution. Clinton Democrats also tacked toward the center culturally. Clinton pledged to stop illegal immigration and, in an interview with civil rights leader Jesse Jackson, Clinton cleverly distanced himself from more radical elements in his own party by condemning rap singer Sister Souljah’s purported call for Blacks to kill Whites during the 1992 L.A. Riots. This technique, not invented in 1992, is now known as a “Sister Souljah Moment” and has been employed by George W. Bush, John McCain, and Barack Obama.
For leftists leery of big cuts to welfare and the privatization of student loans, Clinton seemed like “Republican Lite.” Right-wingers, meanwhile, aimed to spin the Arkansas governor as a communist anyway to prevent the Democrats from making inroads among independent, centrist voters still known as “Reagan Democrats.” That, too, is a time-honored political tactic. Later, Barack Obama would be stuck in the same political purgatory, spun alternately as a commie or a corporate shill depending on who was doing the spinning. Clinton defined himself as an “Eisenhower Republican,” dedicated to balanced budgets, strong markets, and free trade. Clinton’s long-term historical role was ratifying Reagan’s conservative revolution as a Democrat in the same way that Eisenhower ratified FDR’s New Deal as a Republican. Clinton saw globalization (free trade) as good for the American economy and a key to defusing world conflict, yet the Democrats’ support of globalization opened a window of opportunity for Ross Perot, whose protectionist-fueled popularity foreshadowed Donald Trump’s nomination among Republicans 24 years later. The Democrats were quietly withdrawing their unconditional support for unionized labor and hoping that no one would notice.
1994 Mid-Terms & 1996 Election
Clinton stumbled out of the gate for two reasons in 1993. First, like other state governors who became president in the post-Watergate era (especially Jimmy Carter), he brought with him to Washington a fairly inexperienced team. Second, his administration’s attempt to repair the nation’s troubled health insurance system, about which you’ll read in the following chapter, mostly failed. Congressman Newt Gingrich of Georgia seized on Clinton’s first-term problems and spearheaded the Republican Revolution of 1994, whereby the GOP took over both houses of Congress in the mid-term elections and Gingrich became Speaker of the House of Representatives. The Republicans hadn’t controlled the House since the early 1950s. The new majority’s leaders were predominantly southern, including the Georgian Gingrich and Texans Tom DeLay and Dick Armey. Presaging future threats against Barack Obama, North Carolina Senator Jesse Helms warned Clinton not to visit his state “without a bodyguard.” Just as the Republican Nixon presided during Lyndon Johnson’s ongoing Great Society era, the Democrat Clinton presided over a Republican Congress during the ongoing Reagan Revolution. Partisan gridlock began to kick in as the growing debt inherited from previous administrations reared its head.
The U.S. had been “in the red” since 1969 and, as Senator and ex-Democrat Richard Shelby (R-AL) pointed out, President Reagan had run the country further into debt. Working “across the aisle” with Republicans like John Kasich of Ohio, Bill Clinton was the first president since 1969 to balance annual budgets (receipts = outlays), but the overall, long-term debt didn’t go away. Deficits are annual shortfalls, whereas the debt is the running total. Republicans wanted a balanced budget amendment but weren’t specific as to where they’d make cuts. Democrats laid out more specific ideas for cuts but wouldn’t chain themselves to an amendment, citing circumstances like wars or the Louisiana Purchase where governments need to run deficits. The U.S. can’t really get its long-term budget under control without moderate tax hikes or cuts to entitlements, but a wide swath of Americans in both parties like Medicare and want Social Security to kick in before they’re too elderly. Meanwhile, conservative policy wonks like Grover Norquist pressure politicians to sign pledges against raising taxes. In a democracy, blame for reckless budgets ultimately falls on a divided citizenry that, collectively, wants more for less in an atmosphere that discourages compromise.
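The deficit-versus-debt distinction is just a running sum: each year’s shortfall adds to the accumulated debt, so balancing the budget halts the growth but erases nothing. A small sketch with hypothetical figures:

```python
# Hypothetical annual budget balances in $billions (receipts minus outlays).
# A negative number is that year's deficit; the debt is the running total.
annual_balance = [-200, -150, -100, 0, 0, 50]  # later years balanced, then surplus

debt = 0
for balance in annual_balance:
    debt -= balance  # each deficit grows the debt; a surplus shrinks it

print(debt)  # 400: three balanced/surplus years still leave $400B of old debt
```

This is why Clinton’s balanced budgets of the late 1990s could coexist with a multi-trillion-dollar national debt: zeroing the annual deficit stops the pile from growing but doesn’t shovel it away.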
Gingrich promised a Contract with America that would pass a balanced budget amendment, reform (reduce) welfare, make the day-to-day workings of Congress more efficient, roll back Social Security, and cut funding for environmental initiatives like the Superfund and Safe Drinking Water Act. Though Gingrich favored cutting benefits for the working poor and taxes for the rich, he blamed increasing wealth disparity on “radical seculars.” In cutting back on the size of congressional committees and limiting the power of senior committee chairs, Gingrich’s reforms seemingly made Congress leaner and more transparent. However, in a classic case of how reform can have unforeseen consequences, by the early 21st century, it became harder for party leaders (or anyone, for that matter) to assert leadership in Congress.
The 1980s were, in retrospect, the last decade (so far) in American history when politicians of opposing parties socialized together. Newt Gingrich’s popularity in the mid-1990s signaled a new type of confrontational politics that demonized opponents and made bipartisanship difficult. Gone were the days of bipartisan whiskey, poker, and golf, as new rules sent congressmen home on the weekends. As a history Ph.D. and former professor, Gingrich understood how politicians can hammer home people’s worldviews through repetition. In 1990, he and his GOPAC action committee issued a list of negative terminology — corrupt, betray, bizarre, cheat, devour, disgrace, greed, steal, sick, traitors, shallow, radical — advising Republicans never to speak of Democrats without associating them with those terms. This “Newtspeak” mandated calling the opposition the “Democrat Party” or “Dems” because Gingrich feared the adjective Democratic had positive connotations. The comic strip Doonesbury called Gingrich’s memo the Magna Carta of attack politics. Gingrich’s negativity dovetailed well with the proliferation of cable TV as Roger Ailes aligned FOX News with the GOP. Such partisanship, regardless of which side of the aisle it originates on, is usually accompanied by disingenuous complaints that it’s the opposition that doesn’t want to cooperate.
What Gingrich’s Contract didn’t promise was to crack down on corporate lobbying, as many people riding the “reform” wave into Washington were there to cash in themselves. The new House Majority Whip in 1994 was Tom DeLay of suburban Houston, who went into office promising to reform Washington and left as one of the most corrupt politicians of the modern era. Gingrich overreached a bit with his Contract, not taking into account that only 38% of Americans had voted in the 1994 midterm elections. Clinton cherry-picked the popular portions of the Contract (welfare reform and the balanced budget — not as an amendment, but at least as a reality for a few years) and held firm against the rest.
Clinton backed Gingrich down and rode the momentum to victory in the 1996 election. He had good economic tailwinds at his back, including improving information technology, the post-Cold War “peace dividend” of reduced military spending, heavy worker immigration, and Baby Boomers passing through their peak years of productivity. And Clinton played to the centrist popularity that had helped get him elected in 1992 by beefing up police forces and reforming the worst abuses of the welfare system. Welfare recipients now faced benefit limits, had to look harder for a job, and couldn’t have more kids while on welfare. Clinton defeated Bob Dole, a WWII vet who ran a clean campaign. At the GOP’s summer convention, Dole played on his seniority and credibility, promising a “bridge to the past.” The Democrats held their convention two weeks later and promised a “bridge to the 21st century.” When it comes to conventions, it sometimes helps to go second.
Conclusion: the Reagan Revolution
The Reagan Revolution shaped Republican Party politics from 1980 to 2016 and pushed the Democrats to the right economically. In some respects, we live in the shadow of the Reagan Revolution today. Just as conservatives won office and impacted policy during the New Deal era of 1933-1980, liberals haven’t been silent since 1980, and two Democrats have won the presidency. But Republicans Eisenhower and Nixon were okay with the New Deal and even expanded the federal government with agencies like Health & Human Services (1953-), the Environmental Protection Agency (1970-), and the Drug Enforcement Administration (1973-). Democrats Clinton and Obama, meanwhile, were no doubt liberal in some respects but were friendly to Wall Street.
Language is one of the best indicators of overall trends. No aspiring Democratic politician would call himself or herself a liberal today, while Republicans trip over each other claiming to be the most conservative. In the 2000 primary, fellow Democrats accused New Jersey Senator Bill Bradley of being a liberal, and he dropped out of the race shortly thereafter. Republicans had the overall momentum and upper hand from 1980 to at least 2008, with Democrats on the defensive.
As for the actual size and role of government, it hasn’t budged far in either direction since Reagan took office in 1980. In 2002, the government consolidated a number of existing agencies under the Department of Homeland Security, and it added some smaller banking and consumer protection agencies after the financial meltdown of 2008. Beyond that, the biggest growth in government at the national level has been the post-9/11 increase of the National Security Agency’s (1952-) domestic eavesdropping and Obamacare’s mandates that insurers expand coverage and that citizens be insured. There have been no dramatic increases or reductions in entitlement programs like Social Security beyond prescription drug coverage for Medicare (Bush 43), and tax rates stabilized with the top bracket oscillating between 36% and 39.6%.
Up until the 2008 economic crisis, the prevailing assumption was that regulation is bad and deregulation is good. There were even people who blamed the 2008 financial meltdown on too much regulation, even though the derivatives markets that imploded had virtually no oversight. Federal income taxes show no signs of increasing to anything near pre-1980 levels. Corporations that contributed around 30% of the country’s revenues in the 1950s now pay 6%, with some of the bigger firms effectively on welfare (in 2010, General Electric earned $14 billion and paid Uncle Sam -$3.2 billion). The U.S. now ranks last among developed nations in upward mobility.
Because its jobs pay better, high finance lures more of the country’s top graduates than law, medicine, science, or industry. Rather than just fueling the economy by lending to other businesses and encouraging constructive investments, finance itself is the biggest business in America, and most investing is purely speculative, high-frequency, and short-term. These changes aren’t just the result of presidential administrations, but rather overall structural changes in the economy, including trends toward globalization, outsourcing, etc. But the corporate-friendly politics of Reagan and his successors in both parties contributed to an era more financially conservative than postwar America — back in the direction of the late 19th century, but not as extreme. Has this historic shift run its course? With that very question at stake, pundits on the contested cable, radio, and blogosphere outlets fight to shape our minds and future.