Bully Pulpit ~ (n): “An office or position that provides its occupant with an outstanding opportunity to speak out on any issue”
Here, we’ll pick up where we left off in Chapter 4 on the role of government in society, except that we’ll focus on the economy and unpack how the government’s economic role increased during the Progressive Era, both locally and nationally under presidents Theodore Roosevelt (R), William Howard Taft (R), and Woodrow Wilson (D). As we introduced in Chapter 2, commentators visualize the degree of government’s role in the economy as a spectrum, with a greater role on the left and a lesser role on the right. Like all diagrams, this one is a simplified abstraction, and there are more elaborate two-dimensional political spectrum models, but it’s a basic starting point.

In modern America, we associate this left-right political spectrum with liberals on the left and conservatives on the right. The Left favors more government intervention on behalf of workers and consumers, while the Right favors a freer, less regulated market. Liberals, for instance, brought about Social Security and the minimum wage in the 1930s and backed labor unions, whereas conservatives prefer lower taxes and less regulation and don’t support labor unionizing for higher wages. You might remember that liberal and left both start with L, and visualize Republican conservatives drinking RC Cola®. It’s important to get the basic idea of this political-economic spectrum, as it scaffolds into topics in future chapters like the 1930s New Deal. You’ll often see the French term laissez-faire (trans. “let it be”) used to describe the right-wing, free-market approach.
In modern politics left and right can signify one’s stand on a host of other issues — gender, church/state, guns, abortion, drugs, immigration, etc. — but, when push comes to shove, economics supersedes the culture wars. Social, racial, and religious issues weave in and out of the political parties over history, but this economic spectrum has held steady for roughly a century, with the bow in the middle of the tug-of-war rope shifting gradually left and right — to the left with the Progressive Era and New Deal and to the right with Ronald Reagan in the 1980s (Chapter 20). Most presidents don’t have a lasting impact on “moving the needle.” What’s the difference between moderates and those on the edges? Unlike economic libertarians, most Republicans support moderate levels of socialism, like Social Security, roads/bridges, and K-12 public schools. Unlike communists, liberals and democratic socialists want most of the economy to stay in private hands (businesses/corporations, shops, farms, etc.), but also want the capitalist engine to distribute wealth throughout society. Politically, democratic socialists favor democracy whereas communists favor dictatorships. Socialism is a slippery term because it’s sometimes used interchangeably with communism, but it can also mean any public sector of an otherwise capitalist economy, like schools, the post office, police, and fire departments, or safety nets like Social Security and Medicare/Medicaid. This confusion traps left-leaning Democrats who support, say, public health insurance, when reporters ask them whether they are socialists. Conservative spinners delight in leaving off the democratic qualifier in hopes that their audience will equate the word with totalitarian communist dictatorships like the Soviet Union, Cuba, Venezuela, or North Korea rather than the democracies of Europe, Canada, Japan, and Australia that provide healthcare or free college tuition.
When journalist/historian John Steele Gordon said that democratic socialism is an oxymoron, he was really just underscoring that democratic communism is a contradiction in terms, because democracy and communism have proven indisputably incompatible. In the Fox documentary The Unauthorized History of Socialism (2020), the first 55 minutes are a straightforward history of communism, but the last five use smoke-and-mirrors to sever modern American leftists from democratic socialists in Sweden and tie them to autocratic Venezuela’s failed economy. Yet conservatives have dictionaries on their side, be they Oxford or Webster’s, which define the term socialism broadly to include a centralized system whereby the government controls the means of production, a definition closer to communism.
Here’s more detail on a spectrum otherwise consistent with the one above. The redistribution of wealth refers to funding public services and programs (education, roads/bridges, parks, mail, disaster relief, disease control, welfare, etc.) through taxes, toward which wealthier Americans pay a higher amount if not a higher rate (more below). A quick look at the Google image page for “political spectrum” shows a bewildering array of one- and two-dimensional spectrums, with fascism (authoritarian ultra-nationalism) often seen on the far right. But economic libertarianism is a better fit on a political-economic spectrum, and libertarian shouldn’t be confused with liberal (more on both below), even if the two share the first five letters and sometimes overlap.
A good example of wealth redistribution is your reduced tuition at Austin Community College. The laissez-faire/libertarian approach would be to keep the government out of education altogether: if you want a good job or the type of education and training that often qualifies you for one, then “pull yourself up by the bootstraps” by working hard and paying your own way at private schools. Maybe the elite private colleges can hand out a few scholarships here and there to the deserving poor so as to encourage a semblance of meritocracy and upward mobility. Liberals and moderate conservatives argue that, realistically, we’re not all born onto an even playing field and that one role of government should be to help aspiring people pull themselves up by their bootstraps with subsidized public education or training; otherwise, we’re basically living in a quasi-aristocracy, if not quite as stuffy or inflexible as England’s traditional version. It is society’s role to offer that framework of opportunity, in other words, and, if we don’t, that comes back to bite us anyway in the form of an untrained workforce, higher crime rates/incarceration, and poverty. In the U.S., we fund public education with taxes for K-12 and partially for public colleges like ACC, at which your “full tuition” is ~ 5x lower than it would be without public subsidies/taxes (see below). Whether you think that’s too generous or not generous enough indicates where you fall along this political-economic spectrum, at least for one issue. Left-wingers argue that free community college tuition would boost upward mobility, while right-wingers argue that their property taxes are being wasted on students who mostly don’t finish their two-year degrees. Most voters fall somewhere in between, reflected in the budget compromise below.

We’ve never had a market completely free of government interference, nor could we.
Such an economy would be impossible in the modern age since there would be no protection of property rights, clean water, military protection, disease control, infrastructure for businesses to get around on or, so far at least, currency. There would be no laws governing contracts, liability or bankruptcy. As economist Robert Reich paradoxically put it, “Without government, there can be no free market.” Without government, we’d only barter goods, anyone could steal from anyone else, informal militias would protect us from foreign invaders or terrorists, and you would travel only by bartering tolls with private companies. Employers would hire from a pool of mostly illiterate workers since only the rich would be able to educate their children, while some fortunate kids would be properly home-schooled. The smoggy, toxic air and poison water wouldn’t bother you too much in the short run since you’d be preoccupied fielding incessant robocalls.
None of that stops politicians from making grandiose claims, like the claim that history has shown government can’t solve any problems or that free markets solve all of them. Bear in mind that the people making those claims are taking many basic government functions for granted and putting a rosy spin on how the economy operated in the 19th century, when the government had a smaller role. History doesn’t back up either generalization about government or free markets, and neither has it offered up any examples of communist countries that created strong economies by eliminating free markets altogether. Over the last two centuries, we’ve seen that capitalism spurs growth and innovation and has made most peoples’ lives more comfortable. And we’ve seen that free markets have flaws, including lopsided concentrations of wealth and poverty, monopolies, pollution, higher infant mortality, bad working conditions, and other externalities (economist jargon for costs that markets pass off onto bystanders). Thomas Jefferson, the Founding Father most commonly enlisted in support of freedom and limited government, defined a “wise and frugal government” as one that should “restrain men from injuring one another” and “leave them otherwise free to regulate their own pursuits of industry and improvement” (1801 Inaugural Address). That’s a balancing act because men sometimes injure each other in their own pursuits.
As the tension between Jefferson’s injuring and industry-pursuing unfolded during the Industrial Revolution, most countries arrived at some compromise we can roughly call regulated capitalism. The United States is one such country. Compromises generally please no one, and there are plenty of intelligent people who would favor more extreme options on either end of the spectrum: the left toward greater government control, or the right toward less. In the U.S., a favorite among the far right is novelist Ayn Rand (1905-1982), whose Objectivist philosophy advocated a libertarian, laissez-faire society unencumbered by altruism. After coming of age in Revolutionary Russia, Rand moved to the U.S. in 1925 and anticipated “greed is good” long before Gordon Gekko in Wall Street (1987). In Rand’s vision of minimal statism, citizens can’t use physical force to get what they want and a skeletal government exists to protect property rights. Beyond that, though, society should run on its own. In a passage from her most famous book, Atlas Shrugged (1957), the atheist philosopher inverted the Biblical phrase (KJV 1 Timothy 6:10) about money being the root of all evil: “Until and unless you discover that money is the root of all good, you ask for your own destruction. When money ceases to be the tool by which men deal with one another, then men become the tools of men. Blood, whips, and guns–or dollars. Take your choice–there is no other–and time is running out.” Given her Russian background, Rand suspected all governments of being inclined toward totalitarianism. The protagonists of Atlas Shrugged are a cabal of industrialists who destroy the government and its parasitical socialist “looters” and “moochers” so that they can rebuild it along minimalist lines.
The first line of their revised Constitution is: “Congress shall make no law abridging the freedom of production and trade.” Ayn Rand’s followers include Wikipedia founder Jimmy Wales, former Secretary of State Rex Tillerson, Senator Rand Paul (R-KY), former House Speaker Paul Ryan (R-WI), Donald Trump, and Trump/Brexit campaign donor and Breitbart investor Robert Mercer, along with former Federal Reserve Chair Alan Greenspan (1987-2006). Rand inspired influential GOP donors Charles and David Koch, who wanted to abolish Social Security, Medicare/Medicaid, and welfare and have wielded their influence to delay action on climate change.
In its purest form, libertarian anarcho-capitalism is tough to test with so few real-world examples — part of why its founding text is a novel. One attempt was the Republic of Minerva on a reef south of Fiji in the early 1970s — free of taxes, welfare, subsidies or any form of economic intervention — but it didn’t last long and is now underwater, perhaps ironically so. Another from the same era was the short-lived Abaco Independence Movement in the Bahamas. PayPal co-founder Peter Thiel funded Patri Friedman’s Seasteading Institute, which supports libertarian sovereignties on offshore islands, rigs, or reefs. Based on the example of Sealand, an abandoned WWII naval platform off England occupied by squatters, the United Nations won’t recognize such squatter sovereignty. The Citadel in northern Idaho was based on libertarian principles, but its emphasis was more political than economic. There’s a small but growing trend in this direction, and those interested should see Raymond Craib’s CounterPunch optional article below. In The Not So Wild, Wild West (2004, review), the Hoover Institution’s Terry Anderson and Peter Hill argue that low-government “free-market environmentalism” functioned in the Old West of their native Montana, though they don’t provide verifiable economic or crime statistics. While libertarian communities are scattered and scarce, we can say that Ayn Rand’s dichotomy between libertarianism and totalitarianism is false, since none of the nearly 200 countries on Earth are libertarian and most aren’t totalitarian (false dilemma fallacy). For reasons that confound historians of Revolutionary America, those who favor the individual over the collective also indulge fantasies of the Founders creating a society in 1776 with no rules or obligations. They did not, though it’s true that some of them envisioned a minimal role for the national government in relation to strong state governments and a smaller social safety net.
But there is ample primary source evidence that the Founders sought a balance between freedom and order, rather than just one or the other. Rules are the bedrock of any civilization just as they’re the essence of sports.
There is, though, a more moderate American Libertarian Party (1971- ) that is more fiscally conservative than Republicans and at least as culturally liberal as Democrats. Their basic goal is to minimize the government’s role in society as much as reasonably possible. Its platform includes a leaner federal government, an isolationist foreign policy (military for defensive purposes only), looser regulations on guns, and ending all drug prohibition. In different ways, anti-authoritarian rebellious streaks thread their way through the Republican and Democratic parties and among independents, forming a bigger part of the American Creed than that of other countries. These garden-variety forms of social and political libertarianism (as opposed to purely economic) are widespread and defy placement on the simple left-right spectrum above (e.g., being pro-choice is a mainstream libertarian stance). There are even left-wing versions of quasi-libertarianism, like libertarian socialism, which favors de-centralized control, and anarcho-syndicalism, which is too messy to unpack here but essentially an ideological justification for blowing things up.
If COVID-19 could think, it would’ve appreciated the libertarian “freedom to infect” that undermined collective efforts to stop its spread and/or discouraged vaccinations for anything other than sound medical reasons. Refined, utilitarian libertarianism argues for freedom from government interference as long as one’s actions don’t harm others. Cruder versions equate all rules with “Nazism” or advocate the right to reckless behavior dangerous to others on behalf of personal liberty and freedom from government interference. As many rebellious teenagers have discovered, default resistance to authority can be problematic if the advice of one’s parents is actually smart. Racists infiltrated the Libertarian Party recently because they were angry that the party had siphoned GOP votes in the 2020 election but, historically, libertarianism is neutral on race, favoring neither government policies aimed at minorities nor policies to counter discrimination (the Hill).
On the far left end of the spectrum, there’s no need for fictional speculation. We have historical examples of communism going far in the big-government direction, with the Soviet Union and other dictatorships that drowned in their own bureaucracy and stifled peoples’ ambition as they seized the means of production (farms, factories, businesses) to share the profit among all the people equally. Innovation has been virtually non-existent because such systems don’t reward risk, and the lack of profit motive undermines work ethic except among those truly dedicated to the cause. On the upside, communist societies lower crime and poverty, eradicate unemployment, and provide comprehensive healthcare coverage, albeit for mediocre healthcare. But they’ve also killed and imprisoned dissidents and starved people to death through misguided and poorly implemented economic planning. Communism’s main apostle, Karl Marx (image on right), also underestimated the capacity of capitalist democracies to appease workers through incremental and moderate reforms, trade unions, government pensions (e.g. Social Security), redistribution of wealth, child labor restrictions, free public education, and opportunities for upward mobility. Pollution, a byproduct of industrialization regardless of the political system, has been just as bad if not worse in communist countries as in capitalist ones. The most fundamental problem is that communists have been unable or unwilling to operate within a democratic political framework. For the New Left, we still haven’t seen genuine Marxism because real-world examples have been warped by the totalitarianism of Lenin, Stalin, Mao, Castro, Pol Pot, Kim Jong-un, etc. After all, in the Communist Manifesto (1848), Marx predicted the state’s eventual dissolution after workers seized the means of production. But that’s such a thorough list of despots, including some outright sociopaths, that it seems pure communism leads inevitably toward totalitarianism.
While technically it’s too soon to tell after a century+, dictatorships are likely intrinsic, or essential, to communism. We have small examples of communes like the Israeli Kibbutz that have worked well but, when it comes to the sort of idealized, non-totalitarian, state-level communism imagined by Marxists, there’s less track record than even the libertarian, oceanic outcroppings, which is to say, none.
Marx remains popular among some academics and young trustafarians, but no elected officials. In America, the bearded sage of communism lives on mainly via microeconomic terminology and Academic Marxism, which doesn’t necessarily advocate a leftist political stance so much as a theoretical emphasis on economics and class-conflict as the driving force of history or the best way to understand society. It’s impossible, in fact, to imagine modern economics or historiography (the study of history) without Marx because, regardless of how many people in those fields disagree with him, he applied theoretical frameworks that no one can ignore. Before Marx, “history” was usually just a chronicle of top-down events concerning dynastic changes or wars, etc., not a thoroughgoing analysis of how society, as a whole, evolves. No one, in other words, would’ve discussed workers or regular people in a history book.
However you may feel about pure left-wing and pure right-wing options — and thoughtful students should read widely with open minds — neither extreme is likely to play out in the U.S. anytime in the foreseeable future. Ayn Rand hasn’t gained mainstream traction and neither Rand Paul nor Paul Ryan suggested that we completely eliminate entitlements like Social Security-Medicare, though Ryan did say in 2009 that “we are now living in an Ayn Rand novel.” Donald Trump, often described as far right, has consistently pledged to never even reduce Social Security-Medicare benefits, let alone get rid of them, and moreover claims that all Democrats are communists.
In the meantime, the most practical way of approaching the role of government in the non-fictional, non-theoretical world is to debate the extent of government interference on an item-by-item basis, then argue it out in the public forum. Too many regulations can be counter-productive, and so can too few. Alan Greenspan himself came around to the latter notion after the financial meltdown of 2008-09, lamenting the deregulation of Wall Street he’d helped bring about. Such centrism is boring, tedious, and frustrating and its philosophers don’t enjoy the simple purity of Rand or Marx. True believers are in a perpetual state of disappointment that moderates aren’t being ambitious enough. Leftists, for instance, think near-left, moderate liberals are co-opted by the system and aren’t fighting hard enough against the Man, and right-wing commentators eviscerate moderate Republicans for selling out as Washington “insiders” or being part of the “swamp.” For Leftists like Noam Chomsky, America’s two-party system is a myth, with Republicans and moderate Democrats really constituting one pro-business party and, sure enough, many corporations just bribe both equally with donations to cover their bases. Moderates answer that they’re doing what they can within real rather than theoretical constraints — the so-called “politics of the possible.” They can claim that the back-and-forth of compromise has produced the best real-world results we’ve seen so far.
In the last chapter, we saw Americans striking a balance between freedom and order on white voting, education, entertainment, food, drugs, and alcohol. Here, we’ve started with a head-spinning whirlwind tour of the political spectrum, but now we’ll home in on how the U.S. regulated capitalism in the middle regions of that spectrum during that same Progressive Era. The central figure in the national government was combative and fiery president Teddy Roosevelt, whom one journalist described as a “steamboat in trousers.”
Though a writer and historian, “TR” didn’t worship the Founders and wasn’t hidebound by their original intention to create a smaller, weaker central government: “Our forefathers faced certain perils which we have outgrown. We now face other perils, the very existence of which it was impossible for them to foresee” (1905 Inaugural Speech). Founder Thomas Jefferson said as much himself in this 1816 letter, describing such “sanctimonious reverence” for the Constitution as “too sacred to be touched” as a man wearing a coat that fit as a boy. Though no fan of Jefferson, TR felt likewise and used his bully pulpit (public speeches, media, etc.) to take his case more directly to the American public than previous presidents. Modern politicians use a combination of Twitter®, press conferences, cable TV, and state-of-the-union addresses as their bully pulpits, just as Franklin Roosevelt used radio in the 1930s. Teddy Roosevelt used his speeches as a sounding board to broker what he called a square deal for everyone, both workers and management. He coined the phrase during the 1902 Coal Strike, the first strike in American history the government intervened in as a neutral arbitrator rather than on behalf of management. The Republican TR liked the ring of it and branded his overall economic policy the Square Deal, laying the foundation for his Democratic cousin Franklin Roosevelt’s New Deal in the 1930s. In 1902, though, Franklin was a sophomore at conservative Harvard and criticized Teddy’s actions for interfering with the free market. Still, it was getting cold and the future president was grateful for the arrival of coal on campus for the stoves.
Roots of Economic Intervention
People often cite Franklin Roosevelt’s more famous and substantial 1930s New Deal as the fork in the road where America strayed off the free-market path toward a regulatory state, but that happened gradually and started earlier. We’ve already seen the government affecting trade through tariffs (import taxes), giving away land to favored recipients (railroads, farmers, universities), and influencing labor relations by intervening militarily on the side of management to break strikes.
The Constitution’s Commerce Clause (aka “Interstate Commerce Clause”) gives the national government the right to regulate commerce between but not within the states. Relying mainly on the Commerce Clause for constitutional justification, the government established legal authority over railroads, banks, pipelines, and medicine between 1882 and 1906, long before Franklin Roosevelt arrived on the scene during the Great Depression of the ’30s. Another turning point was Swift & Co. v. United States (1905), involving Chicago meatpackers. That case helped establish the Commerce Clause precedent and, along with the lobbying efforts mentioned in the previous chapter (Roosevelt, Sinclair, Heinz, etc.), led to the meat industry’s regulation. The case also broke up a trust that wasn’t just one company monopolizing an industry but rather a series of companies colluding on pricing, agreeing to set a price floor. The same principle applied to railroads. Fulfilling the Populist Party’s goal, the Interstate Commerce Commission under Teddy Roosevelt regulated railroad rates with the 1903 Elkins Act and 1906 Hepburn Act. This, too, was part of TR’s Square Deal: he didn’t redistribute wealth to the poor like his cousin Franklin later would, but he used intervention to prevent big companies from exploiting working Americans. In an unprecedented action for a president, TR rode the rails to campaign on behalf of railroad regulation the same way a politician would campaign for election in a whistle-stop campaign.
The national government also outlawed alcohol and narcotics during the Progressive Era, both economic sectors in their own right. As the Industrial Revolution and immigration fueled rapid growth, these trends continued through the 1910s on national, state, and local levels. Below, we’ll cover national government intervention in the economy as it grew during the Progressive Era, including the Federal Reserve, Federal Income Tax, child labor laws, and trust-busting. These topics are dry but important. While our chapter title highlights Teddy Roosevelt as the movement’s figurehead, the Progressive Era includes William Taft and Woodrow Wilson’s presidencies and played out locally in states and cities.
Congress created the Federal Reserve in 1913, in response to the Panic of 1907, to manage currency. In previous panics, only private “lender of last resort” J.P. Morgan (above, center) stabilized the government’s gold standard (1895) and saved the banks by lending them his own cash to offset their losses (1907). But private lenders of last resort couldn’t be counted on to always be there in the future. With “the Fed,” common shorthand for the Federal Reserve, the government would serve that purpose, pooling the resources of thousands of member banks. Congress added the dual mandate of maximizing employment and stabilizing prices in 1977, though in those roles the Fed has struggled to exert anything more than an indirect influence. The 1907 mini-meltdown also compelled the government to regulate the stock market for the first time. Congress created the Federal Trade Commission in 1914, a precursor to today’s SEC, or Securities & Exchange Commission.
The Fed had moderate success in stabilizing the banking system, at least in comparison to the late 19th and early 20th centuries. The “in comparison” is a key qualifier because there were serious problems in 1929-1933, the 1970s, and 2008-09, though the Fed came to the rescue in 2008-09 rather than causing the problem. Unlike the two earlier national banks (1791-1833), the Fed is a non-speculating (non-investing) bank set up to distribute money from the Treasury Department’s Mint (coins) and Engraving Office (bills) to regular banks through twelve regional reserves that maintain some private control. The Dallas branch (#11 below), for instance, serves Austin. Regionalizing the branches keeps cash from concentrating in certain regions.
You could think of the Fed as where “banks bank,” serving as what most countries call their central bank. The U.S. hadn’t had a central bank since Andrew Jackson vetoed re-chartering the Second Bank of the U.S. in 1832, which is why progressive government reformers meeting on Georgia’s Jekyll Island had to fend off the “ghost of Jackson.” With the Federal Reserve, banks swap federal IOU’s (U.S. treasuries/bonds), municipal bonds, and mortgage-backed securities back and forth with the Fed in exchange for reserve cash through Open Market Operations — swap here meaning that they buy or sell these assets. After the Great Recession of 2008, the Fed purchased treasuries to infuse cash into the banking system, worrying critics that, by loading too much cash into the economy, it would set the stage for future inflation. Banks also borrow and lend to each other through the Fed-influenced “repo market.” Each member bank has to keep ~ 10% of its customers’ money “on reserve,” depending on its size, to stabilize the system, thus the name Federal Reserve.
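As a rough illustration of the ~10% reserve requirement and the overnight borrowing it drives, here is a minimal sketch in Python. The bank figures are hypothetical, and the real requirement has varied by bank size and era; this is only the arithmetic, not a model of actual Fed rules.

```python
# Toy sketch of the ~10% reserve requirement described above.
# All dollar figures are hypothetical; real requirements vary by bank size/era.

RESERVE_RATIO = 0.10  # ~10% of customer deposits must stay "on reserve"

def reserve_shortfall(deposits: float, reserves: float) -> float:
    """Cash a bank must borrow overnight (e.g., in the federal funds
    market) to meet the reserve requirement; 0.0 if already compliant."""
    required = deposits * RESERVE_RATIO
    return max(0.0, required - reserves)

# A bank holding $500M in deposits must keep $50M on reserve.
# With only $46M in its vault, it needs roughly a $4M overnight loan.
print(reserve_shortfall(500_000_000, 46_000_000))  # ≈ 4,000,000
```

A bank sitting just under the limit borrows from one with surplus cash, and the interest rate on that overnight loan is exactly the federal funds rate the Fed targets.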
The Fed works with the Secret Service to check for counterfeit bills and replaces torn and tattered bills (helpful, too, because cash is unsanitary). The Chicago branch alone destroys ~ $23 million in old bills per day, along with 50 counterfeits, though those numbers will likely drop as we move toward a more cash-less society. With $70-80 billion on-site at any given time, the Fed branches are high-security with nobody working alone and cameras everywhere. Federal Reserve member banks also fall under FBI jurisdiction when robbed.
Created by Congress, the semi-independent Federal Reserve distributes cash from the U.S. Treasury and funnels profit back into the Treasury rather than to shareholders. The Fed doesn’t exist to earn a profit, but rather to stabilize the economy by furnishing an elastic currency. After its role expanded during the New Deal with the 1935 Banking Act, the Fed’s Open Market Committee (FOMC) has cooled and stimulated the economy by influencing the short-term Target Federal Funds Rate: the interest rate on overnight loans that member banks with surplus cash make to those just under the 10% reserve limit. The FOMC influences that rate by buying and selling bonds and tweaking the reserve requirement, which in turn influences the interest rate the Treasury pays for bonds (U.S. Treasuries). Elastic currency is complicated, but understand the main concept depicted in the diagram on the right: the more bonds the Fed buys, the more cash in the banking system; the more cash in the system, the lower the interest rates, and vice-versa. It’s a target because the Fed manipulates the rates consumers pay for business loans, homes, cars, tuition, etc. indirectly, by setting the rates it charges banks. The Fed sets the discount rate it charges at its Discount Window for short-term overnight loans to banks. This is usually what the media is referencing when they say the Fed is raising or lowering interest rates. These benchmark rates, in turn, impact the prime interest rate (aka “prime”) that banks charge their favored customers. The Prime Rate is generally ~ 3% higher than the Discount Rate and is the lowest rate at which customers (non-banks) can borrow from commercial banks. The Wall Street Journal prime interest rate rose from 3.25% to 7.5% in 2022, but that’s a rough average, not the exact rate any one person necessarily gets on any one loan.
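The chain from the Fed’s benchmark rate to what a borrower actually pays can be sketched as a toy calculation. The ~3% prime spread comes from the paragraph above; the consumer margin is a hypothetical stand-in for the credit-rating adjustments banks actually make.

```python
# Toy sketch of how the Fed's benchmark rate ripples out to borrowers.
# The ~3% prime spread is from the text; the consumer margin is made up.

PRIME_SPREAD = 3.0  # prime runs roughly 3 points above the Fed's benchmark

def prime_rate(fed_benchmark: float) -> float:
    """Approximate prime rate given the Fed's benchmark rate (in %)."""
    return fed_benchmark + PRIME_SPREAD

def consumer_rate(fed_benchmark: float, margin: float) -> float:
    """Rate an individual might pay: prime plus a margin that depends
    on their credit rating (margin values here are hypothetical)."""
    return prime_rate(fed_benchmark) + margin

# When the Fed's rate sat near 0.25%, prime sat near 3.25%; as the Fed
# raised its rate toward 4.5% in 2022, prime climbed toward 7.5%.
print(prime_rate(0.25))  # 3.25
print(prime_rate(4.5))   # 7.5
```

The point of the sketch is the indirection: the Fed never sets your car-loan rate, but every loan in the chain is priced off the benchmark it does control.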
Open Market Operations and the Discount Window are Tools that enable the Fed to set monetary policy by moderating interest rates and controlling inflation (prices). History teaches us that inflation can destabilize society, as demonstrated in Weimar Germany in the 1920s and during the Age of Exploration, with its infusion of silver from the Americas into Europe. Higher rates also strengthen the U.S. dollar internationally, which is good for travelers and importers but bad for exporters.
The Fed’s manipulation of interest rates to smooth out boom and bust cycles is at the core of national macroeconomic monetary policy. Longtime Fed Chair William McChesney Martin (1951-1970) said the Fed’s role was to “take away the [spiked] punch bowl just as the party gets going.” In other words, once it has reignited the economy with “easy money” low-interest rates (more cash), it wants to rein things in with higher rates (less cash) to avoid triggering high inflation. The less money in the system, the more interest a bank is likely to charge customers for a loan, due to the basic law of supply and demand. Customers’ rates are also impacted by their own credit ratings (their history of paying bills on time). Since the 1960s, Americans have had the right to view their credit ratings. Borrowing money costs money, and how much has more impact on our lives than you might think, which is also why racially discriminatory subprime lending contributed to systemic racism in the 20th century.
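Just how much rates matter shows up in the standard fixed-rate loan amortization formula. A sketch, with a hypothetical $300,000, 30-year mortgage at two made-up rates chosen only to echo the 2022 climb described above:

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization formula: P*r / (1 - (1+r)^-n)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical $300,000, 30-year mortgage at two different rates:
low = monthly_payment(300_000, 0.03, 30)   # cheap-money era
high = monthly_payment(300_000, 0.07, 30)  # after rate hikes
print(f"At 3%: ${low:,.2f}/mo; at 7%: ${high:,.2f}/mo")
print(f"Extra cost over 30 years: ${(high - low) * 360:,.0f}")
```

A four-point rate swing changes the monthly payment by hundreds of dollars and the lifetime cost of the loan by hundreds of thousands, which is why Fed decisions ripple through everyday life.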
After 2008, the Fed under Chair Ben Bernanke kept rates low (0.00-0.25%), hoping to fuel more borrowing and economic growth in the wake of the Great Recession, though that also made it difficult for savers to earn interest on their money. They knew that such low rates would herd investors into stocks instead of safer alternatives like bonds or bank savings accounts/CDs (certificates of deposit). The Fed pumped $85 billion a month into the banking system by buying up mortgages and long-term Treasuries to keep yields low in a program called Quantitative Easing, or QE, and the economy slowly but surely mended. The Fed controls liquidity in credit markets, tightening or loosening lending rates, but can also undertake more unorthodox buying programs like QE to shore up banking and the stock market, or to encourage mild inflation (< 3%). When people said “money was free” during QE, they didn’t mean that the government was giving it away, but rather that rates were so low that borrowing was nearly free, meaning that those with the wherewithal larded up and used that money to invest or grow their companies.
Has the Federal Reserve done its job? It went to work quickly after it was set up in 1913-14, helping to stabilize the U.S. economy during World War I by shoring up the system. But the U.S. went off the gold standard during WWI, and the Fed struggled to balance the economy over the next twenty years as monetary policy seesawed, sometimes leaving the gold standard and other times adjusting the amount used to peg the dollar to gold. It reverted to the gold standard and stuck with it at the worst time, prior to the Stock Market Crash of 1929, decreasing the cash in circulation just as banks were running out of it and contracting the money supply. Then it doubled the reserve requirement (cash kept safe, out of investments) at an inopportune time in 1937, contributing to a recession within the recovery from the Great Depression.
The Fed also failed to raise interest rates in the late 1960s because they and President Lyndon Johnson didn’t want to weaken the economy, but inflation rose steadily and finally President Richard Nixon had to sever the dollar from the gold standard altogether in 1971. While the Fed is, in theory, independent from the executive branch, both Johnson and Nixon pressured the Fed to keep interest rates low to help the economy, and any president is well-served in the short-term by this “sugar high” of an economic boost, so their opinions are predictable. However, if employment is solid, low interest rates tend to boost borrowing and spending too much and cause inflation, and that very thing happened because of Johnson and Nixon’s influence, with the government itself borrowing to fund the Vietnam War and social programs. Inflation worsened throughout the 1970s with rising oil prices until Fed Chair Paul Volcker dramatically raised rates, deliberately causing a recession in the early 1980s to halt inflation. Then after a market crash in 2000 and 9/11 (2001), Alan Greenspan’s Fed kept rates low and pumped cash into the system between 2001 and ’05 even after the economy improved, helping to fuel a housing bubble — in that case, abandoning its mission to stabilize fluctuations in the economy.
History will judge the Fed’s massive infusion of cash into the system between 2008-2014, after that real estate bubble burst, and a second round during COVID-19. In late 2015, Chair Janet Yellen signaled that the Fed would reverse course. Their goal from 2015-19 was to slowly “take the punch bowl out of the party” by reversing quantitative easing, swapping treasuries back to banks for cash in an attempt to defuse inflation. In the meantime, the Fed held so many assets (including bonds, real estate mortgages, etc.), that the government (Treasury) made a lot of money from interest, which is good for citizens. In 2019-20, under Chair Jerome Powell, the Fed started gingerly lowering rates again, infusing the economy with cash. Meanwhile, President Trump — like most presidents more concerned with short-term growth than long-term inflation — pressured the Fed to lower rates faster yet, calling them “clueless, pathetic, boneheads.” COVID accelerated the loose monetary trend, with the Fed serving as the primary backstop to the economy and providing loans to businesses along with cash to banks. But, in 2020, the Fed had less slack in the rope because they’d already been lowering rates when the economy was strong. Joe Biden is one of those unfortunate presidents in office when the Fed pulls away the punch bowl (raises rates), and the party guests won’t be happy.
By flooding the system with easy money in the early 21st century, the Federal Reserve created the Everything Bubble that encouraged investing in real estate, stocks, private equity, and cryptocurrency. If you haven’t “cut the cord,” view any random business show on weekday cable and they’re more likely to be fretting about Fed rates than the actual earnings of the companies whose stock viewers are investing in. At the bubble’s peak, over half of the homes in Austin were bought by investors. If rates are low, it makes sense to borrow and invest and flip houses as much as possible, especially in growing areas. Just ask the Property Brothers.
Moving forward, we have significant inflation in America for the first time in decades so, based on what you’ve learned above, the Fed should raise interest rates and start swapping assets back for cash. But the Fed doesn’t have its hands on all the levers in the economy. Recent inflation results mainly from too much COVID relief money among consumers (the last round of $2.2 trillion in 2021 was, in retrospect, overkill, especially that which fell into the wrong hands), COVID-related shortages connected to disruptions in supply chains (especially globally), tight oil supplies because of the boycott of Russian oil after the invasion of Ukraine and refineries that aren’t operating at full pre-COVID capacity, food and fertilizer shortages caused by that same invasion, China tariffs initiated by Donald Trump and continued by Joe Biden, and worker shortages (which raise wages, leading to a price/wage spiral) — all combining in a perfect storm to make demand exceed supply. When considering financial matters, always consider supply and demand. Fixing all those problems quickly isn’t in the Fed’s wheelhouse because they really just control how much cash is in the economy, not how well supplied that economy is or how many willing workers it has. But they will do their part by raising rates, likely fueling a slowdown, though inflation was subsiding as of early 2023. The challenge is to raise rates just enough to curb inflation without causing a recession, aiming for a “soft landing.”
There is no good evidence, in case you’ve heard rumors to the contrary, that the Fed is a conspiring cabal. It’s transparent relative to other agencies, and we might even take heart that this apparatus that controls our economic lives is boring and can really only move in two directions, limiting bad decision-making. Technocracy is much-maligned but, in this case, makes sense if the host exercises good judgement with the punch bowl. Reader: since you’re likely working age, you can hopefully benefit from the wage-price spiral while it lasts, but it will be uneven across professions.
Federal Income Tax
On to another boring but important topic. Benjamin Franklin once said that “in this world, nothing is certain except for death and taxes.” Yet, the U.S. had no federal income tax prior to 1913, except briefly during the Civil War. Despite the Constitution’s Article I, Section 8 (aka the Taxing & Spending Clause), the Supreme Court declared an 1894 national tax unconstitutional in Pollock v. Farmers’ Loan & Trust Co. (1895). The Sixteenth Amendment and the Revenue Act of 1913 overturned that precedent, allowing the government to raise revenue in an era when it could no longer simply sell off western lands and depend on tariffs and bond sales. The revenue allows the government to conduct its basic functions, including maintaining a military, building infrastructure, and providing entitlements that include health insurance and a modest monthly pension for the elderly. Workers contribute to the latter via payroll deductions in their paychecks (itemized under FICA). Populists originated the national tax idea. The first (1913) brackets graduated, or progressed upward, from just 1% for the poor to 7% for the wealthy. This was a variation, for income, on what Jefferson advised for property taxes when he wrote James Madison that “Another means of silently lessening the inequality of property is to exempt all from taxation below a certain point, and to tax the higher portions of property in geometrical progression as they rise.” (TJ to JM, 10.28.1785) The Founders didn’t restrict the national government from taxing in Madison’s Bill of Rights, even though there was more noise about such a potential Constitutional amendment in the 1790s than there was about guns, religious freedom, or legal rights.
Today’s graduated brackets apply to everyone from the lower middle classes on up, with the rate of payments increasing as one moves up the scale, topping out at 37% for the upper bracket as of 2018 (here are the inflation-adjusted historic and 2015 rates). With recent legislation, the lowest bracket (earning roughly $10k-$38k) dropped from 15% to 12%. Contrary to popular belief, a worker who barely crosses a cut-off line does not end up losing money: people pay the higher rate only on the portion of income that falls in the higher bracket. No married couple, for instance, pays 37% on their entire income; they pay 37% only on income beyond $648k. The rates vary depending on whether someone files jointly, singly, etc. Everyone who works outside a pension system pays a 6.2% Social Security payroll tax that contributes toward their own retirement. Sales taxes or local taxes on food are regressive since the poor use a greater portion of their money for essentials. Lottery tickets basically funnel money from workers into government coffers, but much of it cycles back to programs like education that help the poor and middle classes.
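The marginal-bracket logic can be made concrete. The brackets below are a simplified, hypothetical schedule loosely modeled on the 2018 married-filing-jointly figures cited above (12% near the bottom, 37% above ~$648k), not the actual IRS tables:

```python
# Simplified, hypothetical brackets (lower bound, rate), loosely modeled
# on the 2018 married-filing-jointly figures in the text -- NOT the
# actual IRS tables.
BRACKETS = [
    (0,        0.10),
    (20_000,   0.12),
    (80_000,   0.22),
    (170_000,  0.24),
    (330_000,  0.32),
    (420_000,  0.35),
    (648_000,  0.37),
]

def income_tax(income: float) -> float:
    """Each rate applies only to the slice of income inside its bracket."""
    tax = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            tax += (min(income, upper) - lower) * rate
    return tax

# Crossing into the top bracket by $1 adds only ~37 cents of tax:
print(income_tax(648_001) - income_tax(648_000))  # ~0.37
# And the effective (overall) rate stays well under the top marginal rate:
print(income_tax(700_000) / 700_000)
```

Earning one dollar over a cutoff adds only that dollar times the marginal rate, which is why nobody loses money by getting a raise.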
For most of the 20th century, the top rates were high, usually above 50%, but they dropped dramatically in the 1980s. While today’s income taxes are graduated, the tax on investments — dividends and long-term capital gains — is only 15-20% for assets held over a year. Since the wealthy derive most of their income from investments rather than wages, their overall effective rates can be lower than those of middle-class earners paying 24-35%, though their totals are higher. Some are quick to complain of “class warfare” from anyone who doesn’t like this arrangement, but this regressive tax code even has some wealthy critics. Warren Buffett suggested a 30% overall bottom rate for the wealthy, defined as the top 0.3% (aka the Buffett Rule), and iconic conservatives Andrew Mellon (Treasury Secretary, 1921-32) and Ronald Reagan (President, 1981-89) favored taxing capital (investments) and labor (work) at the same rate. Buffett pointed out that, as a multi-billionaire, it was ridiculous that he paid a lower effective rate than his secretary. Currently, though, the bottom 99.7% is either fine with paying more, too fatalistic or apathetic to protest, thinks the lower capital gains rates spur the economy (“trickle-down”) or, most likely, doesn’t know exactly what’s going on.
For all taxpayers, there are numerous write-offs for homeownership, home improvement, charitable donations, work-related travel expenses, etc. that lower one’s reported income. Total deductions of $900 billion in 2013 would’ve paid for almost all of the combined cost of Medicare and Medicaid. While America has made strides in lessening racial and sexual discrimination, home-owning voters of both parties continue to support rigging the tax codes against renters. The Home Mortgage Interest Deduction goes back to the beginning in 1913, to encourage homeownership, and the Charitable Contribution Deduction started in 1917. Overall, Americans today pay slightly lower rates than they did for most of the 20th century, but not significantly lower if one includes local taxes (state, county, sales, etc.).
American corporate rates dropped from 35% to 21% in 2018, but few pay the full rate. Some route their profits through Bermuda, Puerto Rico, Ireland (e.g. Apple), or other “tax havens” (low-tax jurisdictions). One downside of economic globalization for revenue-seeking governments is this legal tax dodge, creating a “race to the bottom” because, no matter how low corporate rates go, there’s always some other country willing to provide a haven by charging even less. Domestically, states like Texas poach companies from liberal states like California, enticing them with lower taxes (e.g. Tesla, Inc.). The same holds for individuals. Liberal and conservative elites might disagree on some things, but they bond over this important detail as they party in the Caribbean.
Due to the corporate tax code’s complexity and the corruption of lobbying, rates are distributed unevenly from company to company, usually favoring bigger companies with more political pull. Some corporations pay the full amount while others are, in effect, on welfare insofar as their rebates exceed their bills. The big Wall Street banks avoid taxes with offshore havens. Goldman Sachs, with direct influence in the cabinets of Bill Clinton, Barack Obama, and Donald Trump, paid less than 1%. The overall American corporate tax average as of 2012 was 17.3% — just half the nominal 35%. It remains to be seen whether the drop from 35% to 21% will be accompanied by stricter enforcement and closed loopholes, but don’t hold your breath. According to Treasury Department estimates, if the U.S. got rid of all cheating, loopholes, and tax havens, the country could cut everyone’s taxes by 12% across the board and still balance the budget. The U.S. loses roughly $1 trillion annually to tax cheating. Yet angry, suspicious, or duped non-cheating voters have instead irrationally cut funding to the Internal Revenue Service, easily the most cost-effective agency in all of government. The IRS can be heavy-handed, but it returns at least $2 for every $1 spent and even more when it goes after cheaters. When the GOP took control of the House in January 2023, its first action was trying to repeal the Inflation Reduction Act of 2022 provision that cracks down on wealthy and corporate “non-compliers,” even though the workers who voted them into office aren’t the ones cheating.
Trump-appointed IRS Commissioner Charles Rettig clarified that there would be no stepped-up enforcement for non-compliers making less than $400k under the IRA, but Senator Ted Cruz (R-TX) warned FOX viewers that the IRS would descend on families and small businesses “like a swarm of locusts.” We should argue in good faith about what to spend on and who pays how much, but the only way to offset the revenue lost to tax dodgers is for the rest of us to pay more as the debt mounts.
Much of the lost revenue, though, comes from American corporations deferring taxes by keeping their overseas profits abroad in these tax havens rather than bringing them home. According to Bloomberg Businessweek, as of 2015 the top ten such tax dodgers alone — Microsoft, Apple, Oracle, Citigroup, Amgen, Qualcomm, JPMorgan Chase, Gilead Sciences, Goldman Sachs, and Bank of America — could’ve funded NASA, the Army Corps of Engineers, the Environmental Protection Agency, the Pentagon’s training and equipping of security forces in Afghanistan, the Transportation Security Administration, Social Security overhead, and the Departments of Justice, Commerce, Agriculture, Treasury, and Interior. In 2018, Amazon cleared $11.2 billion in profit and paid zero in corporate taxes, in its case because it reinvested profit in infrastructure. Joe Biden and the U.S. Treasury helped broker the Global Minimum Corporate Tax Rate (GMCTR) of 15% with 135 other countries, set to start in 2023 and intended to eliminate havens, but Congress backed out of the arrangement in 2022. The Inflation Reduction Act set the domestic corporate tax floor, after loopholes, at 15% (some companies will still pay 21%).
The 16th Amendment is an important watershed in American history. Since federal tax codes are graduated, it set up a mechanism for the government to redistribute wealth. As mentioned, for individuals that’s partially offset by capital investments being taxed at a lower rate than income but, even so, rich Americans pay far higher totals than poor. There are two ways to spin these statistics. The wealthy can complain that they pay a large proportion of revenues, while the rest can counter that the reason the wealthy pay so much in taxes is that they have such a high portion of the country’s money to begin with. The upper 1% have more money than the bottom 50% in America. For liberals, redistribution makes society more equitable and “evens the playing field” some. For conservatives, redistribution is unjust because graduated tax rates put the government in a permanent state of rewarding failure and punishing success, weakening everyone. While there are various rewards and punishments built into the tax codes (e.g., sin taxes on cigarettes and alcohol), most taxes aren’t punitive. They’re just being collected to fund the government, not encourage or discourage behavior. Taxes will remain at the heart of Republican-Democratic debates into the foreseeable future, regardless of what other issues come and go onto their platforms and who spins their party as “populist.” One thing voters in both parties agree on — other than accountants, who profit from the complexity of tax codes — is that they’d prefer to simplify the system by getting rid of all the complicated write-offs and loopholes…other than all the ones they like and profit from.
Government & Labor
Labor, meanwhile, focused on safety and hours more than wages in the Progressive Era. It wasn’t uncommon for blue-collar workers to put in 72 hours a week (6 days x 12 hours). One in eleven steelworkers died annually. Think about that. Would you work in a job where there was a 1/11 chance of dying in a given year? Those odds are worse than for soldiers or sailors in most wars or the most dangerous occupations today: lumberjack, fisherman, roofer, steelworker, pilot, and driver (taxi, truck, etc.). According to the Bureau of Labor Statistics (BLS), the chance of a logger or fisherman dying in a given year is just over one in a thousand — a hundred times safer than a steelworker a century ago.
At the Triangle Shirtwaist factory in New York’s Garment District, 146 girls and women died in 1911, unable to escape a fire because their bosses nailed the exit shut to keep out union organizers. When some made it out onto the fire escape, it collapsed under their weight. Others jumped to their death from the 11th-floor windows.
The fire led directly to building codes mandating multiple marked exits and outward-opening doors since many asphyxiated victims unable to get out an exit were trampled behind an inward-opening interior door. Happening in the middle of the day, just blocks from the New York Times office, the Triangle fire drew the public’s attention to child labor and union-busting, along with the Ludlow Massacre in Colorado in 1913 that we read about in the Gilded Age chapter.
Unions gained some strength, but still lacked the right to collective bargaining, so management could simply fire or abuse strikers or union organizers. The Triangle Fire underscored these problems. The fight for worker safety and shorter workweeks is a good example of how America’s three branches of government — legislative, executive, and judicial — can check each other’s interests, with no single branch able to override the other two. Congress and the presidency, the two branches most closely connected to the people, favored regulating the workplace while the judiciary, led by the Supreme Court, saw such regulations as unconstitutional.
The Keating-Owen Act of 1916 set the age limit for miners at 16 and factory workers at 14, but the Supreme Court struck down the law in Hammer v. Dagenhart (1918), a case where the plaintiff wasn’t management but rather a North Carolina family that needed their child to work. The Court wasn’t even friendly to state laws regulating worker hours unless they applied to women. In Lochner v. New York (1905), the Court shot down a state rule limiting the number of consecutive hours bakers could work without a break to 10 (several had collapsed from exhaustion and fallen into ovens). But in Muller v. Oregon (1908), the Court upheld a similar Oregon law that applied to female laundry workers. Courts and Congress locked horns on monopolies, or trusts, as well.
In 1904, Lizzie Magie, a stenographer/typist at Washington’s Dead Letter Office, patented the Landlord’s Game. Dice-rolling players circled the square board buying up properties, railroads, and utilities and paying and collecting rent, all the while hoping to avoid “going to jail.” Like Upton Sinclair with The Jungle, her point was to underscore the evils of capitalism. Magie hoped to alert players that, “In a short time…[they] will discover that they are poor because Carnegie and Rockefeller…have more than they know what to do with.” As in The Jungle’s case, the public largely missed the point and her creation became the most popular board game of all time when marketed by Parker Brothers as Monopoly® in 1935. It turns out that players loved nothing more than the satisfaction of bleeding their opponents dry and giving them a friendly shove down the road toward bankruptcy, especially satisfying against one’s siblings. The snowballing effect of wealth accumulation is fun to experience, if only vicariously in a fantasy version, and, today, Parker Bros. prints 30x as much Monopoly money annually as the U.S. Treasury prints real money. In real life, though, many Americans shared Magie’s hatred of trusts, as monopolies were commonly called. Congress responded by outlawing companies that acquired or maintained monopolies through unfair practices.
But the U.S. has a three-branch government and, just as the Supreme Court was lukewarm toward labor laws, so too it was skeptical toward the 1890 Sherman Antitrust Act aimed at monopolies. The Court came to an accommodation with the Theodore Roosevelt and William Howard Taft administrations since those presidents only enforced the law in egregious cases. As we saw in the first chapter, Standard Oil was an early target, broken up in 1911, and in 1915 a federal district court ruled in favor of what became 20th Century Fox, breaking up Thomas Edison’s film trust and ushering in the competitive Hollywood era.
TR’s first big target was J.P. Morgan’s railroad conglomerate Northern Securities in 1902. You may remember Morgan from Chapter 1 for backing Edison and, later, Tesla after he poached Westinghouse’s Tesla patents. He bailed out the economy in 1895 and 1907, as we saw above, leading to the creation of the Federal Reserve to take such responsibility out of individual hands. The Supreme Court went along with the government’s injunction against his railroad empire in a 5-4 ruling, but Roosevelt was still fuming that his own appointee, Civil War veteran and Associate Justice Oliver Wendell Holmes, dissented. Holmes argued that capitalism encouraged competition, then the Sherman Act outlawed whoever won the competition. Wasn’t horizontal integration a natural and fair goal of any big company? TR, conversely, saw monopolies as undermining competitive pricing and told the newspapers that he could “carve a judge with more backbone [than Holmes] out of a banana,” which was funny but didn’t directly address Holmes’ concern (TR no doubt would’ve held his own on social media). Roosevelt also resented the dismissive way that Morgan had tried to buy him off when he became president, in the same way that he’d negotiate with another industrial titan. When TR first broached the subject of breaking up U.S. Steel, J.P. Morgan chuckled and contemptuously telegrammed that he would send his man down to talk to Roosevelt and work out the problem. While Roosevelt’s courage was laudable, U.S. Steel ultimately won that case (filed in 1911, decided in the company’s favor in 1920).
Breaking up monopolies is a big intrusion into the free market but, left on their own, economies will naturally tend toward concentrations of power. Bigger companies gain efficiency advantages as they scale up, aka economies of scale, which can benefit consumers in the form of lower prices. Yet, if a company corners an entire market, they can raise prices to the disadvantage of consumers. The advantage of capitalism from the consumers’ perspective is competition and monopolies can destroy competition even as they bring efficiency and order to industries, the way Rockefeller did with Standard Oil. Many of today’s billionaires, like Bill Gates (former chair/CEO of Microsoft) or Mexican telecom mogul Carlos Slim, made their money by finding niches in the economy with barriers to competition. Wise investors love these “wide moats,” but they make it hard for smaller, up-and-coming firms to compete, leading to the capitalist “snake eating its own tail” as the anti-trust line of thinking goes.
Yet, the Sherman Act doesn’t outlaw monopolies outright, just those acquired through illegal or unfair practices. Also, as we saw in Chapter 1, patents award short-term monopolies through proprietary rights before the patent expires. The Clayton Anti-Trust Act of 1914 supplemented the Sherman Act by prohibiting any anti-competitive mergers. The ambiguity of illegal, unfair, and anti-competitive has complicated attempts by the Justice Department (Anti-Trust Division) and Federal Trade Commission to bring antitrust suits against companies like Microsoft (for exclusively bundling the Internet Explorer browser with its own operating system), Google (for favoring itself and its clients in searches), Amazon (for not allowing its vendors to sell their products cheaper on other platforms), and Apple (for conspiring with publishers to raise e-Book prices). After initially losing an appeal, the government essentially won the first “Browser War” against Microsoft in 1998-2000 through a compromise based on the Sherman Act, though they never broke the company up into two “baby Bills” as originally proposed. Given the dominance of its operating systems on PCs, we’ll never know — just as we can never know any counterfactual (what if?) history — whether Google would’ve emerged or Apple reemerged had Microsoft been allowed to continue bundling its browser and operating system. The anti-trust laws’ ambiguity frustrates the companies and their legal teams as they struggle to survive and compete with one another. In addition, globalization has brought American companies like General Electric and Microsoft under the jurisdiction of foreign agencies. The European Union blocked GE’s merger with Honeywell and continued to dog Microsoft after the second Bush administration settled out of court in America.
Today, Alphabet (Google) dominates Internet searching, Meta (Facebook, Instagram, Messenger, WhatsApp, etc.) dominates social media, and Amazon dominates online retail enough that each represents a concentration of power, especially with more artificial intelligence on the horizon. Still, their familiarity is convenient and their big scales allow them to develop features attractive to their customers. Big tech’s lawyers will be busy for the foreseeable future in endless cases involving anti-trust and patents. Two examples: Apple fending off legislation to ease the downloading and data sharing of outside apps on their phones; Samsung losing a 2017 case and paying Apple for use of its slide-to-unlock technology.
Progressive Politics: Bipartisan & Local
These Progressive Era battles did not break down neatly into Democrats versus Republicans. Today it’s not an oversimplification to classify most Democrats as liberal and Republicans as conservative, at least by American standards. We’ll explore that terminology more below because both terms have complicated histories, but most of us know roughly what they’ve come to mean. Each side even grades their own politicians and each other as to how loyal they stay to their party’s ideological principles. But such hardened categories weren’t defined in the early 20th century. Both parties had progressive and conservative elements within their ranks. Republicans led the charge nationally by regulating food and drugs, reforming the civil service (awarding government jobs by merit rather than mere patronage or the spoils system), and breaking up monopolies. Republicans also helped bring about public utilities, including sanitation, sewage, and water. Cleveland Democrat Tom Johnson pioneered the concept of publicly-owned utilities as a mayor elected by Populists and labor unions. Detroit’s Republican mayor Hazen Pingree grew vegetables in vacant lots, “Pingree’s Potato Patches,” to feed the poor during the 1890s depression.
The most famous and influential Progressive Republican at the state level was Wisconsin’s Robert La Follette, who served as governor and U.S. senator. “Fightin’ Bob” fought corporate influence on politicians and pioneered government-subsidized mass transit to reduce congestion and pollution. Wisconsin passed the country’s first environmental restrictions and La Follette capped sailors’ workweeks at 56 hours. As a presidential nominee of his own Progressive Party in 1924, La Follette won an impressive 16% of the vote. To this day, progressive politics are sometimes referred to as the Wisconsin Idea. Wisconsin pioneered the primary system to give voters a greater say in who the respective parties nominated to run in general elections.
Montana pioneered campaign finance reform by restricting corporate lobbying after the Anaconda Copper Mining Co. of Butte bought off its legislature. That law was overturned, though, in the wake of Citizens United v. FEC (2010), when Montana’s attempt to preserve it failed in American Tradition Partnership, Inc. v. Bullock (2012). A lot of Progressive politics was local rather than national, or a combination.
After the 1900 Galveston hurricane, one of the worst natural disasters in American history, local and national governments cooperated in rebuilding a seawall and dredging the Houston Ship Channel (1909-14), relocating the major Texas port inland as far as East Houston’s Turning Basin for better protection. Galveston’s city commission became a model as other towns hired city managers to run municipalities as a CEO would a company. Houston’s Ship Channel was a pioneering example of public/private partnership, as companies built warehouses and docks to augment evenly matched public funds from Harris County taxpayers and the Army Corps of Engineers to dredge Buffalo Bayou and Galveston Bay. President Woodrow Wilson officially opened the channel in 1914 and it served as an oil port during World War I. The federal government dredged the canal deeper in the 1920s and ’30s.
Colossal amounts of poverty, disorder, disease, and filth confronted New York City and Chicago as they grew into the nation's biggest cities. Modern Americans don't usually associate cities with animals and might think of the West first when it comes to horses, but draft animals drove American industry and commerce while steam, electrical, and gasoline power were still in their infancy. Aside from pulling burdens — including omnibuses, railed horsecars (streetcars), bread, beer, and milk — horses also walked treadmills to turn gears and pulled the winches, pulleys, and elevators that enabled building and dam construction. In New York, tens of thousands of horses each contributed around 15-35 lbs. of manure and two gallons of urine daily. "Muck handlers" tried to collect the over three million pounds deposited daily onto the streets, disposing of it or recycling it into fertilizer, but manure flowed down cobblestone joints and seeped into basements on rainy days, then dried into fly-infested piles in the heat, spreading typhoid. On hot, windy days, it blew through the air in pulverized form. The stench was nearly unbearable even for those accustomed to the era's olfactory standards. The wealthy built brownstone walk-ups with staircases high enough to avoid the piles. The country worked most of its 20+ million horses to death within 4-5 years, and New York officials dragged dead horses to Barren Island off Brooklyn to make glue, rendered fat, leather, and horsehair. Yet cities couldn't function without horses; today's urban horse population is less than 1% of what it was then. When an epizootic flu decimated Boston's horses in 1872, a fire that started small burned much of the city since few horses were available to pull pump-wagons.
New York City officials also estimated the feral pig population at ~ 20k. And, as in most cities since ancient times, humans threw excrement out the window onto the street. Disease epidemics (cholera, typhoid, malaria, etc.) swept through American cities on a regular basis, where packed residents used outhouses like those on a farm except with more traffic. Tuberculosis, or “consumption,” killed more Americans in the 19th century than influenza, AIDS, and polio combined in the 20th. To confront these challenges, New York was the first American city to collect garbage and build an extensive sewer and water system. Chicago, a city that went from 100 villagers to 1.7 million between 1830 and 1900, did likewise soon after to combat disease, especially after an 1854 cholera outbreak. The Raising of Chicago (below) involved lifting all buildings 4-14 ft. higher with hydraulic jackscrews to run a sewer system underneath.
Cities tackled crime along with waste and disease. American policing emerged partly from the Deep South's old slave patrols and immigration patrols in the Southwest, but mostly in urban America over the course of the 19th century. Real police departments, with full-time employees and procedures answerable to central governments, started in Boston in 1838, New York in 1845, and Chicago, New Orleans, Philadelphia, and Baltimore in the 1850s. By the 1880s, every major American city had a "thin blue line."
Referendums and initiatives, also called propositions, as in "Prop X," emerged locally, too, during the Progressive Era, pioneered as we saw in Chapter 2 by Populists. Voters decide directly, yes or no, rather than indirectly through politicians. For instance, if a town wants to build a new high school or light rail, it puts the issue on the ballot rather than arguing it out in the city council, asking voters whether it can issue a bond (borrow money) for such a project. South Dakota was the first state to put referendums and initiatives on its ballots in 1898. Portland, Oregon voted on so many referendums in the early 20th century that other Americans initially called such votes the "Oregon System." These referendums, or plebiscites as ancient Romans called them, are as close as we come in the U.S. to pure democracy as originally practiced by 6th c. BCE Athenian Greeks. The Greek demokratia combines demos ("the people") with kratos, loosely the people's "grip" on power. Ancient Athenians rotated their non-enslaved male population in and out of assemblies the way America requires its citizens to sit on juries. Working, instead, through politicians is variously called a republic, representative democracy, or (elsewhere) parliamentary democracy.
TR: That Damned Cowboy
At the national level, Republicans also led the Progressive charge, most famously with Theodore Roosevelt (1901-09) and his successor, William Howard Taft (1909-13). The Progressive Era culminated during the presidency of Democrat Woodrow Wilson, from 1913-21. Roosevelt was a strange mixture, politically, who would be impossible to classify today as liberal or conservative — part of what makes him so interesting. It wouldn’t be a huge stretch to say that he was a democratic socialist in the Republican Party. If a party or think tank tried to grade him on consistency to their principles, as they do today, he’d flunk and probably wad up the report card and throw it back in their face. The public would be curious to see such a swashbuckling alpha promoting wilderness conservation and universal health insurance.
On the one hand, TR advocated foreign wars for their own sake and white supremacy. Of American Indians, he said, "I don't go so far as to think that the only good Indians are dead Indians, but I believe nine out of ten are, and I shouldn't like to inquire too closely into the case of the tenth." He thought American minorities were inferior to white Gentiles, but he also thought that any individual who proved otherwise deserved equal opportunity, and he personally appointed African Americans to government positions. He created a scandal by inviting black activist and educator Booker T. Washington to the White House for dinner. Also on the liberal/progressive side of the ledger, Roosevelt thundered against corporate corruption, set aside wilderness for national parks, started the Forest Service, and supported universal healthcare insurance and women's suffrage.
Abraham Lincoln was first to cordon off wilderness as federal land (Yosemite in 1864) and the integrated National Park Service started in 1916 under Woodrow Wilson, but Roosevelt did more for the cause of wilderness protection overall than any other president (later, his cousin Franklin gave him a serious run for his money). TR signed the controversial Antiquities Act (1906), which allows the president to circumvent Congress to preserve certain federal land, a boon for modern Democrats. He doubled the number of national parks from five to ten (some built on land taken back from Indian reservations) and created game reserves, bird sanctuaries, national monuments, and the Forest Service to manage natural resources sustainably. He designated the Grand Canyon a National Monument to fend off copper and asbestos miners, and Wilson made it into a National Park in 1919. He set aside Mt. Olympus National Monument in Washington state in 1909 to preserve the region's elk (now called Roosevelt elk), and Franklin Roosevelt transformed that into Olympic National Park in 1938. Teddy Roosevelt also started the Bureau of Reclamation to manage western rivers and (later) hydro dams, especially on the Colorado. Today, the Colorado River (not the one that runs through Austin) supplies over 12% of Americans' drinking water and around 15% of the water used to grow the nation's crops. Overall, Roosevelt was responsible for preserving more square mileage of wilderness than the entire state of Texas.
While respectful of free enterprise, TR loved strong government and was openly confrontational toward corporate America in a way that no modern Republican or moderate Democrat would dare be. He relished busting monopolies. Today, Wall Street "butters the bread" of both parties and most voters have a stake in the stock market through a 401(k) or pension, making it unlikely we'll see another TR anytime soon, at least one who actually wins the presidency. While Roosevelt took campaign donations from Wall Street banks, he also worked for the American public rather than powerful lobbies. He pushed for campaign finance reform, spurring Congress's first attempt to rein in the political bribery that Mark Twain lampooned in The Gilded Age (TR himself was accused of taking money during the 1904 campaign). With his bombastic personality and high-register voice (listen to the video near the top of the chapter), the stocky, bespectacled fireplug was America's most dangerous president so far in terms of disrupting the economic status quo.
First, the captains of industry spent millions to keep Populist/Democrat William Jennings Bryan out of office in the 1896 election. Then, in 1900, Republicans tried to hide Roosevelt in the vice-presidency, a notoriously weak office, but were foiled when William McKinley was shot in 1901. As McKinley's political manager Mark Hanna had famously asked beforehand, "Don't you fools realize only one life stands between that damned cowboy and the White House?" Roosevelt's maverick attitude was even more remarkable given the government's relative weakness when he took office. It was small in comparison with corporate America, as depicted in the cartoon above with TR in the red shirt and titans like J.P. Morgan looming over him. He was up against some formidable companies, many of which had bigger war chests than the entire federal government.
Today, Wall Street and Washington are symbiotic (mutually beneficial) partners, with Washington keeping financial regulations light in exchange for insider trading information and campaign donations. Politicians of both parties crush average annual market returns in their own accounts. When Washington does regulate Wall Street, foreknowledge of those laws gives politicians and donors an edge. Threats of stronger regulations are an excellent way to elicit higher donations from lobbyists while appearing to do something in voters’ eyes. Roosevelt, unlike most politicians then and now, wasn’t for sale.
TR didn't necessarily side with labor or management, and that alone distinguished him from predecessors of both parties, who had sided with management. The unspoken but dramatic shift was Roosevelt's assumption that it was even within the government's realm to broker relations between businesses and workers (or consumers) in the first place. Despite TR being a Republican, that shift reshaped the 20th-century regulatory state and the modern American definition of liberalism as practiced by the Democratic Party.
Liberal & Conservative: Two Evolving Terms
Liberal, just to make things confusing, is a word that means different things in different times and places. Read this section slowly, as it can put your head on a swivel. In the 18th century, what's now called classical liberalism meant a combination of democratic political freedoms/rights and free-market economics. You could say that under liberalism, unlike authoritarianism (whether today's dictatorships or yesterday's monarchies), the main thrust of government is to protect rather than restrict individual rights. However, as Britain and the U.S. dealt with industrialization in the 19th century, they realized that free markets posed certain problems, or externalities, that the public wanted dealt with. Pollution is an obvious modern example. Scotsman Adam Smith, the godfather of free markets, conceded that externalities should be regulated in cases where regulation benefitted public welfare. Smith's real concern in the Wealth of Nations (1776), the book that launched modern economics, wasn't consumer-inspired government regulation that interrupted free markets, but rather governments beholden to mercantilist interests that disrupted the "invisible hand" of free trade to the public detriment, harming general prosperity. That's a subtle but major difference, and it has caused Smith to represent something he didn't believe in among those who either never cracked his 1,000-page doorstop, or did and deliberately misrepresent or cherry-pick him. He favored free trade over protectionism and tariffs, yet Adam Smith also lacked faith in even the best-intentioned governments to administer economic central planning or to fundamentally alter the invisible hand's allocation of resources, at home or abroad. Smith didn't offer a specific prescription for balancing the invisible hand's efficiency with the need to regulate externalities; he merely predicted, correctly, that modern economies would struggle to maintain the right balance.
Modern near-left liberals favor market-based economies overall, but think that the capitalist garden needs tending — what Franklin Roosevelt called "trimming the vines" in the 1930s — to deal with Smith's externalities. Conservatives prefer to let (economic) nature take its course and see intervention as likely to interfere with the market's natural function, similar to how a naturopath views drugs and surgery. At their most caricatured extremes, at least, right-wing conservatives are blind to capitalism's imperfections and left-wing liberals are blind to capitalism being the goose that lays golden eggs for society at large — the economic engine for everyone that finances education, the arts, scientific research, the welfare state, etc. — not just lining the pockets of Rich Uncle Pennybags, aka the "Monopoly Man." The goose fable has different versions across cultures, but let's say, for our purposes, that the goose is capitalism and the eggs are good jobs and a humane quality of life for workers. If a right-winger ignores everything but fattening the goose while a leftist doesn't sympathize with the challenges of bearing goslings, a moderate utilitarian might boringly advocate a healthy, sensibly-regulated goose laying as many eggs as possible. Communists just kill the goose, but then they run low on eggs. Having tortured this metaphor nearly beyond recognition, we'd best migrate to the next paragraph before we're debating whether the gaggle lives on a level playing field or whether fledglings are anatomically capable of pulling themselves up by their own bootstraps.
As free men increasingly won the right to vote in Britain and the U.S. during the 19th century, economic liberalism morphed from a pure free-market ideology into support for a mostly free market, increasingly regulated to tamp down externalities and other things people didn't like about completely free markets. These problems included, among other things, cyclical disruptions that the Federal Reserve was designed to mitigate, exploitative child labor, and monopolies. Liberalism evolved into a form of regulated capitalism that could be described as a reconciliation between capitalism and democracy. As people won the right to vote, workers and consumers voted to tinker with capitalism so as to mitigate its negative effects on them. This is what the Progressive Era's economic component was all about and how the influential New Republic magazine came to define liberal. New Republic editors were part of a group centered in the Dupont Circle neighborhood of Washington, D.C. that included supporters of Teddy Roosevelt's progressive agenda, along with Supreme Court Justice Oliver Wendell Holmes, Jr. and future Justice Felix Frankfurter. Eventually, Teddy's young cousin Franklin shed his conservative upbringing and joined the group.
Slowly but surely the government grew bigger, and it wasn't until after the 1980s Reagan Revolution that liberal became a tainted word that even most Democratic politicians avoided. They wouldn't highlight it in a campaign ad, for instance, especially in Texas. By the 1980s, a critical balance of voters thought it was time to pull back on the regulatory throttle. Marginalizing the term liberal was brilliant politics on the Republicans' part, putting Democrats on the defensive. By the 21st century, progressive Democrats often used liberal to refer to their center-left or near-left faction (e.g. Obama, Biden, Booker, Klobuchar) and progressive for those further left (e.g. Sanders, Warren, Ocasio-Cortez), yet moderate Democrats still accused progressives of being "too liberal," so that each wing was using the same word as a pejorative for the other, leaving anyone who was paying attention bewildered. Competing wings of the Democratic Party insulting each other with liberal no doubt complicates their marketing and branding and confuses the public, much to the delight of Republicans. Democrats should just use progressive and moderate, as in: progressive Democrats disagree with moderate Democrats' support for free trade, interventionist foreign policy (including drone attacks on terrorists), and market-based solutions to healthcare insurance and climate change. Conservatives, in turn, still use liberal to describe everyone left-of-center or, depending on which TV station they're on, just simplify things and conflate the entire Democrat Party with communism. Some words pass out of existence because of disuse; liberal should be retired from service because of too many contradictory and confusing uses. Another weird thing about the U.S. is that we use radical to describe the far left but not the far right.
Meanwhile, some conservatives got scared of the word capitalism, preferring free markets or free enterprise system, maybe because Karl Marx used capitalism negatively in the 1850s, though he wasn't the first. Texas public schools avoid the term capitalism even while indoctrinating students in the upside of free markets. The famous French diplomat Talleyrand said that "an important art of politicians is to find new names for institutions which under old names have become odious to the public."
Like liberalism, conservatism is a difficult-to-define term that's changed over time. Traditional American conservatives were less likely to favor interventionist over isolationist foreign policies than the more hawkish Republicans of 1941-2016, and less likely to define "pro-business" as either pro-Wall Street or pro-free trade — often favoring "main street" small businesses over corporations and protectionism (tariffs) over free trade. A recent incarnation of the main street variety is The American Conservative (2002- ), which critiques corporate power and corruption and advocates husbanding America's military strength, opposing interventions like Iraq in 2003. Conservatives in most eras are leery of change, while more "reactionary" conservatives favor restoring earlier political or social orders. Soldier/journalist Ambrose Bierce took a friendly jab at everyone in his Devil's Dictionary (1881-1906), defining a conservative as a statesman "who is enamored of existing evils, as distinguished from the Liberal, who wishes to replace them with others."
Aside from changing over time, liberal and conservative also have different meanings in international, as opposed to domestic, contexts. Here's where liberal gets really tricky. When American relations with Cuba began to improve in 2015 and Fidel Castro died in 2016, American commentators expressed hope that Cuba would "liberalize its economy." Wasn't it already ultra-liberal because it was communist? No; the term switches even as it crosses the narrow 90-mile strait between Florida and Cuba. Overseas, we hope for other countries to liberalize their economies in the old-fashioned, free-market sense of the word — to make them what, stateside, we'd call more conservative. Americans who favor overseas military interventions strive to protect what the rest of the world now defines as liberalism: representative government, free markets, and individual rights. You could define liberalism in the international sense as meaning democratic capitalism. Neoconservatives sent Americans into combat in Iraq in 2003 to fight for liberalism in this broader sense of the word. When the Taliban, al Qaeda, or ISIS hates on Western liberals, they mean all of us, the whole shebang. Benjamin Studebaker's excellent optional article in Chapter 22, "The Ungoverned Globe," wouldn't make sense unless the reader understood this broader definition. The shifting terminology is similar to the word Yankee. At home, Yankee meant fighting for the Union during the Civil War and now means being from the Northeast, especially New England (or, a pinstriped baseball player from the Bronx). But to the British we're all Yanks and, in Latin America, anyone from the U.S. is a Yanqui.
In 2019, a humorous exchange demonstrated the confusion over liberalism’s broader international and narrower domestic meanings. Authoritarian Russian President Vladimir Putin gloated that the “liberal idea” was “obsolete,” no doubt referring, prematurely one hopes, to the whole international experiment in democracy and, more specifically after WWII, the [U.S.-led/supported] liberal international order of free trade, rules-based organizations like the United Nations, and imperfectly-met ideals of protecting human rights, etc. Asked to respond to Putin’s attack on “Western-style liberalism,” U.S. President Donald Trump got his wires crossed between the different meanings of western and liberalism, affirming Putin by noting that Los Angeles and San Francisco were “sad to look at” because they were “run by liberal people.”
Maybe Webster’s Dictionary can help us:
In these definitions, Putin was talking about the first part of definition C whereas Trump was talking about the second part of C, made all the more confusing by Putin speaking to a British Financial Times reporter through a translator. Note: when I use the term West in later chapters, I don’t mean cowboys or California; I mean the “western world” of Europe, U.S., Canada, Australia, etc. It’s possible that the reason some Americans on the far right favored Putin and Russia over Ukraine in 2022 was because they were confused by talk on the news about defending western liberalism and didn’t realize the distinction between the term’s domestic and global use.
You've no doubt found this section rather difficult. You're not alone and, in fact, being confused by these terms just shows that you're thinking. It's sad that, while our language often has more than enough words for one concept — e.g. drunk has nearly 3k synonyms — liberal has been stretched to the breaking point to mean different and even contradictory things. Lawyers have yet another definition, closer to the classical, free-market sense. For instance, conservative legal scholars side with liberalizing gun laws, meaning making access to guns less restrictive, even though political liberals tend to support more restrictive gun laws. That's because in legal terminology, liberal and conservative retain their general rather than political definitions. Conservatives, for instance, are more liberal on the issue of tech censorship (favoring less) and liberals are more conservative about wearing masks during a pandemic. Just to add a cherry on top of this banana split of confusion, there's also neoliberal economics and neoconservative diplomacy, which don't mean quite what you'd think they might, but we won't bother trying to unpack that mess here.
In addition to evolving over time and space, these conservative and liberal labels tend to pigeonhole individuals. Out of necessity, you’ll see the terms plenty in the rest of the textbook as I describe political and cultural debates, but remember that such categories tend to stereotype people into clusters that often don’t correspond well to how any one person actually views the world. Most people have their own complex views that change as they grow older and don’t fit neatly into any one box. And most of us are combinations of liberal and conservative tendencies, regardless of how one defines them. That was true of Theodore Roosevelt (R), William Howard Taft (R), and the third Progressive-era president, Woodrow Wilson (D).
As of the 1912 election, the liberal and conservative tags didn't equate with Democrats and Republicans as they do today. There were conservative and progressive wings in each party, creating a de facto four-party system that necessitated bipartisan compromise and, in 1912, made for an interesting election. Teddy Roosevelt thought that his successor, William Howard Taft, had betrayed the progressive spirit. If we define progressivism by TR's own presidency, that charge was unfair, since Taft broke up more monopolies than Roosevelt, oversaw passage of the graduated national income tax, signed off on the creation of Montana's Glacier National Park, and, on his last day in office, created the Department of Labor. But Taft made cuts to Roosevelt's hard-earned Forest Service that exacerbated the damage of a three million-acre conflagration in the Northwest known as the Great Fire of 1910, inadvertently strengthening the conservation movement. More importantly, Taft wasn't as bold as TR about confronting moneyed interests and corporations head-on. Who was? Taft disliked being caught between various special interests and never grew into the sort of confrontational firebrand the public had come to love in Roosevelt. Famous for his Falstaffian personality and figure, Taft served one term and ended up as the only president in American history to move from the executive to the judicial branch, becoming Chief Justice of the Supreme Court in 1921.
Roosevelt, meanwhile, had retired to Africa to hunt big game and write a few books. J.P. Morgan quipped, "every American hopes that every lion will do its duty" [i.e., eat TR]. There for nearly a year, Roosevelt and his son Kermit killed hundreds of lions, elephants, and giraffes. But, as he breakfasted over week-old copies of the New York Times before going out to blast rhinos with his Winchester, TR couldn't help but get the sense that the corporate camels had nudged their noses too far under Taft's tent. Moreover, when Taft did get more aggressive with corporations, such as when he took on U.S. Steel's monopoly, TR took offense because he thought it made him look soft for having tolerated U.S. Steel. Poor Taft couldn't win, it seemed, in earning Roosevelt's approval. Roosevelt thought that Taft was overly deferential toward the Constitution and the separation of powers between the executive and judicial branches. He called the 330-lb. Taft a "fat-head and a flub-dub with a streak of the second-rate and the common in him…Taft has the brains of a guinea pig!" Roosevelt was bitter that he wasn't president anymore and itching to get back into the White House for a third term.
So Roosevelt resolved to do something no retired president since Martin Van Buren had done: he reentered politics under a different party banner and ran against the incumbent Taft. The robust 53-year-old proclaimed, "The parting of the ways has come…my hat is in the ring, the fight is on, and I am stripped to the buff." In terms of physique if not political leanings, this was a candidate Sarah Palin could've admired as much as she did the shirtless, tiger-hunting Vladimir Putin. TR eventually branded himself under the Progressive Party banner, but it wasn't so much an entirely separate party as the existing Republicans' more progressive plank — promoting national income and inheritance taxes on "fortunes," universal healthcare coverage, a minimum wage, women's suffrage, and organized labor, defending natural resources, and advocating a six-day/eight-hour work limit (48-hour week). We should avoid comparing too recklessly across an entire century because the context changed, but it's worth noting that Teddy Roosevelt was to the left of Barack Obama on healthcare insurance, aligned with Bernie Sanders, AOC, and Elizabeth Warren. TR also pushed for the same type of elderly pension and unemployment insurance that his cousin Franklin would reluctantly pass a quarter-century later with Social Security. TR argued that the public should be able to recall judges who struck down these liberal reforms as unconstitutional. He was channeling Wisconsin Republican Robert La Follette, arguing that some concessions toward democratic socialism would inoculate American capitalism against more radical options like communism. The GOP at the time was mostly a collection of state parties beholden to various corporate interests like railroads, mining, and timber. TR thought they needed a strong executive at the top who could appeal directly to the workers: "The Republican Party must stand for the rights of humanity, or else it must stand for special privileges."
Not all states had binding primaries at the time, in which the respective parties' voters chose their candidates directly. Only thirteen states held Republican primaries in 1912 and TR won nine. But party delegates nonetheless chose Taft at their summer convention in Chicago, infuriating Roosevelt. TR said of the GOP: "The dog has returned to its vomit." For him, if the existing parties were unresponsive, then creating a third party in 1912 was no different than the Republicans shaking up the old Democratic-Whig two-party system in 1854. Republicans, too, were once a third party. It was rough on President Taft, who called TR a "freak" and a "demagogue" but didn't take naturally to mudslinging and dirty politics. One reporter found him weeping on a train car after a speech, lamenting that he'd lost his best friend in Roosevelt. TR, on the other hand, ran up and down train-car aisles shadowboxing as he prepared to give whistle-stop speeches trashing his old buddy. No president had ever run for a third term (TR had served 1 ¾ terms), even though it was legal until 1951. George Washington had voluntarily stepped down after two terms and it became traditional not to go past two.
The 1912 presidential election occurred at the Progressive Era's height and was among the more interesting and important elections in American history. For starters, it was a four-horse race between Republican Taft, Progressive Roosevelt, Democrat Woodrow Wilson, and Socialist Eugene Debs. Debs had shifted further left since his days as a labor organizer in the 1890s, still devoted to small-d democracy (voting) but calling for the surrender of the "capitalist class." Roosevelt misrepresented Debs as an anarchist while cleverly stealing popular parts of his platform. Roosevelt's progressive GOP plank was basically advocating democratic socialism as well, and his faction became better known as the "Bull Moose Party" after a lunatic shot TR before a campaign speech in Milwaukee. Let me explain.
The would-be assassin, a New York saloonkeeper named John Schrank, disapproved of Roosevelt breaking Washington's two-term tradition and shot him in the chest at point-blank range with a Colt .38 caliber revolver. Schrank said he dreamt that William McKinley's ghost sat up in his coffin and pointed to VP Roosevelt as his killer, instructing Schrank to avenge him. Fortunately, the bullet hit Roosevelt's breast pocket, which contained a rolled-up 50-page speech and his glasses case. It lodged just below his skin, and Roosevelt went ahead and gave the speech after realizing that he wasn't coughing up blood, meaning the bullet hadn't punctured his lung. An aide announced what had happened before he began and someone yelled, "fake!" TR opened his coat, showed the crowd his bloodied shirt, and bellowed, "Ladies and gentlemen, I don't know whether you fully understand that I've been shot; but it takes more than that to bring down a Bull Moose." He spoke for ninety more minutes, contrasting his dedication to the people's rights with Woodrow Wilson's commitment to "the old flint-locked, muzzle-loaded doctrine of states' rights," then walked to the hospital. Schrank was sent to an insane asylum.
The 1912 election was a bit awkward for young New York state senator Franklin Delano Roosevelt. He was a distant cousin of his idol Teddy, but as a Democrat had committed to supporting Wilson. Franklin's wife Eleanor was Teddy's favorite niece and favored him over Wilson. Uncle Ted advocated the aforementioned liberal platform, along with more "square dealing," under the mantle of New Nationalism. Woodrow Wilson promised New Freedom, by which he meant less government regulation and a more decentralized, local economy. There was an obvious flaw in Wilson's reasoning: if capitalist economies have a natural tendency to concentrate power at the top in the form of monopolies, then why would deregulating the economy lead to smaller, local businesses? Why would less regulation break up monopolies?
The flaw wasn’t fatal, though, since Taft and Roosevelt mainly stole Republican votes from one another, while Debs siphoned some progressive votes from Roosevelt. Roosevelt out-polled Taft, but Wilson walked away the winner at 42%, however contradictory his platform, breaking the long post-Civil War grip of Bourbon Democrats on their party, as Wilson sided more with the Populists that Democrats had been recruiting and co-opting (he brought ex-Populist and Democratic candidate W.J. Bryan on as Secretary of State). Debs garnered 6% of the popular vote (white in the pie chart below), the high-water mark for democratic socialism in American history unless you want to count TR’s 27%, and the second-best showing for a Lefty behind La Follette’s 16% in 1924.
TR fell briefly into a depressive funk, then went on an expedition with his son Kermit to map an uncharted tributary of the Amazon, where he nearly died of an infection and malaria. He returned to the U.S., having seemingly aged five years in two months, to lambast Wilson’s administration for not getting into World War I sooner. It’s safe to say we may never see another Teddy Roosevelt.
The parties’ platforms weren’t fully clarified by 1912 and never will be; they’re constantly evolving. But the 1912 election weakened the GOP’s progressive wing with Roosevelt’s defection and loss, and it mostly died out with Herbert Hoover’s presidency of 1929-1933. It was gone altogether by 1964, though John D. Rockefeller’s grandson Nelson Rockefeller (R) was a relatively liberal New York governor from 1959-1973. Progressive legislation has emerged from Republican administrations since, but it’s never dominated a platform the way it did with Roosevelt’s breakaway party in 1912. Hoover ran to the left of Franklin Roosevelt in 1932 on some issues, but he was more of a centrist than a progressive. Also, Thomas Dewey’s 1948 GOP platform was to the left of Democrat Harry Truman’s on some issues, but that was much to the chagrin of congressional Republicans.
As for the Democrats, Woodrow Wilson wasn’t much of a liberal when he was elected, but he fast became one as the Progressive tide crested, reorienting himself to catch the wave in time for his 1916 reelection. Under Wilson, the U.S. enacted the Clayton Anti-Trust Act, implemented the federal income tax ratified at the end of Taft’s term, created the Federal Reserve and the National Park Service, and passed Congress’s first child labor laws. Toward the end of his presidency, women won the right to vote. Although he didn’t spearhead these initiatives (and opposed women’s suffrage at first), he projected a progressive improve-the-world spirit onto the international stage by trying to outlaw war and create a world police organization, the League of Nations, after World War I (next chapter).
Wilson also embodied the racial blind spot of Progressives with his appreciation for the Ku Klux Klan and reversal of Roosevelt’s policy of hiring minorities in the federal government. The Democrats carried the racist legacy of the Confederacy with them up through the mid-20th century, even as they assumed an otherwise pro-worker mantle by the 1930s.
Progressive legislation didn’t always originate with activists or the people suffering from harsh conditions. More often, establishment leaders sensed the tide of history and cut their losses by preemptively enacting moderate legislation. They were letting some steam out of the pot to preclude a boil-over in the form of a working-class uprising. In other words, they defused more radical options with compromise. Both Teddy Roosevelt (R) and his cousin Franklin Delano Roosevelt (D) saw things that way. While some of their more vocal critics saw them as loose cannons or communist Antichrists ruining America, they saw themselves as closet conservatives who were saving capitalism from itself, inoculating it against more drastic alternatives.
Patricia O’Toole, “The Speech That Saved Teddy Roosevelt’s Life,” Smithsonian (November 2012)
Andrew Glass, “Theodore Roosevelt Reviews Race Relations,” Politico (February 2017)
Elaine Godfrey, “Iowa Is What Happens When a Government Does Nothing,” Atlantic (December 2020)
P.J. O’Rourke, “Deciphering Fed Speak,” American Consequences (June 2017)
Paul Sagar, “The Real Adam Smith,” Aeon (January 2018)
Raymond Craib, “Egotopia,” CounterPunch (August 2018)