Bully Pulpit ~ (n): “An office or position that provides its occupant with an outstanding opportunity to speak out on any issue”
We've never had a market free of government interference, nor could we. Such an economy would be impossible in the modern age since there would be no currency, no protection of property rights, no clean water, military protection, or disease control, and no infrastructure for moving people and goods. There would be no laws governing contracts, liability, or bankruptcy. As economist Robert Reich put it, "Without government, there can be no free market." Without government, we'd only barter goods, anyone could steal from anyone else, informal militias would protect us from foreign invaders or terrorists, and travel would mean bartering tolls with private road owners. Employers would hire from a pool of mostly illiterate workers since only the rich could educate their children, though some fortunate kids would be properly home-schooled.
None of that stops politicians from making grandiose claims that history has shown government can't solve any problems, or that free markets solve all problems. Bear in mind that the people making those claims are taking many basic government functions for granted and putting a rosy spin on how the economy really operated in the 19th century, when government played a smaller role. History doesn't back up either generalization about government or free markets, nor does it offer examples of countries that created strong economies by eliminating free markets altogether. Over the last two centuries, we've seen that capitalism spurs growth and innovation and has made most people's lives more comfortable. And we've seen that free markets have flaws, including lopsided concentrations of wealth at the very top, monopolies, bad working conditions, and externalities like pollution (economist jargon for costs a transaction imposes on bystanders). Thomas Jefferson, the Founding Father most commonly enlisted in support of freedom and limited government, defined a "wise and frugal government" as one that should "restrain men from injuring one another" and "leave them otherwise free to regulate their own pursuits of industry and improvement" (First Inaugural Address, 1801). That's a balancing act because men sometimes injure each other in their own pursuits.
As the tension between injuring and industry-pursuing unfolded during the Industrial Revolution, most countries arrived at some compromise we can roughly call regulated capitalism. The United States is one such country. Compromises generally please no one, and there are plenty of intelligent people who would favor more extreme options on either end of the spectrum: the far left toward greater government control, the far right toward less. In the U.S., a favorite among the far right is novelist Ayn Rand (1905-1982), whose Objectivist philosophy advocated a libertarian, laissez-faire society unencumbered by altruism. After coming of age in Revolutionary Russia, Rand moved to the U.S. in the mid-1920s and anticipated "greed is good" long before Gordon Gekko preached it in Wall Street (1987). In Rand's vision of minimal statism, citizens can't use physical force to get what they want, and a skeletal government exists to protect property rights. Beyond that, though, society should run on its own. In this passage from her most famous book, Atlas Shrugged (1957), the atheist philosopher inverts the Biblical phrase (KJV 1 Timothy 6:10) about money being the root of all evil: "Until and unless you discover that money is the root of all good, you ask for your own destruction. When money ceases to be the tool by which men deal with one another, then men become the tools of men. Blood, whips, and guns–or dollars. Take your choice–there is no other–and time is running out." Given her Russian background, Rand suspected all governments of being inclined toward totalitarianism. The protagonists of Atlas Shrugged are a cabal of industrialists who destroy the government so that they can rebuild it along minimalist lines. Rand's followers include Wikipedia founder Jimmy Wales, former Secretary of State Rex Tillerson, Senator Rand Paul (R-KY), former House Speaker Paul Ryan (R-WI), Donald Trump's biggest campaign donor Bob Mercer, and former Federal Reserve Chair Alan Greenspan (1987-2006).
Influential GOP donors Charles and David Koch, likewise, would abolish Social Security, Medicare/Medicaid, and welfare.
In its purest form, libertarian anarcho-capitalism is tough to test with so few real-world examples, which is part of why its founding text is a novel. One attempt was the Republic of Minerva on a reef south of Fiji in the early 1970s, free of taxes, welfare, subsidies, or any form of economic intervention, but it didn't last long and is now underwater (perhaps ironically so). PayPal co-founder Peter Thiel promoted seasteading, the establishment of libertarian experimental societies on offshore islands, rigs, or reefs. Judging by the example of Sealand, an abandoned WWII sea fort off England occupied by squatters, the United Nations won't recognize such squatter sovereignty. The Citadel in northern Idaho is based on libertarian principles, though its emphasis is more political than economic. In The Not So Wild, Wild West (2004), the Hoover Institution's Terry Anderson and Peter Hill argue that low-government "free market environmentalism" functioned in the Old West of their native Montana, though they don't provide verifiable economic or crime statistics. While libertarian communities are scattered and scarce, we can say that Ayn Rand's dichotomy between libertarianism and totalitarianism is false since none of the nearly 200 countries on Earth are libertarian and most aren't totalitarian (a false dilemma fallacy).
There is a more moderate American Libertarian Party (1971- ), though, that is more fiscally conservative than Republicans and more culturally liberal than Democrats. Its platform includes a leaner federal government, isolationist foreign policy (military for defensive purposes only), looser regulations on guns, and ending drug prohibition.
On the other end of the spectrum, there's no need for fictional speculation. We have historical examples of going far in the big-government direction: the Soviet Union and other communist dictatorships drowned in their own bureaucracy and stifled people's ambition as they seized the means of production (farms, factories, businesses) on behalf of the people. On the upside, communist societies lowered crime and poverty and eradicated unemployment. But they also killed and imprisoned dissidents and starved people to death through misguided and poorly implemented economic planning. Communism's main apostle, Karl Marx, also underestimated the capacity of capitalist democracies to appease workers through incremental and moderate reforms, trade unions, government pensions (e.g. Social Security), redistribution of wealth, child labor restrictions, free public education, and opportunities for upward mobility. Pollution, a byproduct of industrialization regardless of the political system, has been just as bad if not worse in communist countries as in capitalist ones. For the New Left, we still haven't seen genuine Marxism because real-world examples have been warped by the totalitarianism of Lenin, Stalin, Mao, Castro, Pol Pot, Kim Jong-un, etc. After all, in the Communist Manifesto (1848), Marx predicted the eventual dissolution of the state after workers seized the means of production.
However you may feel about pure left-wing and right-wing options — and thoughtful students should read widely with open minds — neither is likely to play out in the U.S. anytime in the foreseeable future. Ayn Rand hasn't gained mainstream traction, and neither Rand Paul nor Paul Ryan has suggested that we completely eliminate entitlements like Social Security and Medicare, though Ryan did say in 2009 that "we are now living in an Ayn Rand novel." Marx remains popular among some academics and young trustafarians, but no elected officials. In the meantime, the most practical way of approaching the role of government in the non-fictional, non-theoretical world is to debate the extent of government interference on an item-by-item basis, then argue it out in the public forum. Too many regulations can be counter-productive, and so can too few. Alan Greenspan himself came around to such a notion after the financial meltdown of 2008-09, lamenting the deregulation of Wall Street he'd helped bring about. Such centrism is boring, tedious, and frustrating, and its philosophers don't enjoy the simple purity of Rand or Marx. True believers are in a perpetual state of disappointment that moderates aren't being ambitious enough. Leftists, for instance, think liberals aren't fighting hard enough against "the man," while right-wing commentators eviscerate moderate Republicans as sellout "insiders." Moderates answer that they're doing what they can within real rather than theoretical political constraints, and they can claim that the back-and-forth of compromise has produced the best real-world results we've seen so far. As the saying goes, the proof is in the pudding.
In the last chapter, we saw Americans striking a balance between freedom and order on voting, education, entertainment, food, drugs, and alcohol. In this chapter, we'll home in on the political economy of regulated capitalism during that same Progressive Era. The central figure in the national government was the combative and fiery President Teddy Roosevelt, whom one journalist described as a "steamboat in trousers."
Though a writer and historian himself, "TR" didn't worship the Founders and wasn't hidebound by their original intention to create a weaker central government: "Our forefathers faced certain perils which we have outgrown. We now face other perils, the very existence of which it was impossible for them to foresee" (1905 Inaugural Address). Founder Thomas Jefferson said as much himself in an 1816 letter, dismissing "sanctimonious reverence" for a Constitution "too sacred to be touched" by likening it to demanding that a grown man wear the coat that fit him as a boy. Though no fan of Jefferson, Teddy Roosevelt felt likewise and used the bully pulpit (public speeches, media, etc.) as a platform to broker what he called a square deal for everyone, both workers and management. He coined the phrase during the 1902 Coal Strike, the first strike in American history in which the government intervened as a neutral arbitrator rather than on behalf of management. The Republican TR liked the ring of it and applied Square Deal to his overall economic policy, laying the foundation for his Democratic cousin Franklin Roosevelt's New Deal in the 1930s. In 1902, though, Franklin was a sophomore at conservative Harvard and criticized Teddy's actions for interfering with the free market. Still, it was getting cold, and the future president was grateful when coal arrived on campus for the stoves.
Roots Of Economic Intervention
People often cite Franklin Roosevelt’s more famous and substantial New Deal of the 1930s as the fork in the road where America strayed off the free market path toward a regulatory state, but that happened gradually and started earlier. We’ve already seen the government affecting trade through tariffs (import taxes), giving away land to favored recipients (railroads, farmers, universities), and influencing labor relations by intervening militarily on the side of management to break strikes.
The Constitution's Commerce Clause (aka "Interstate Commerce Clause") gives the national government the right to regulate commerce between but not within the states. Relying mainly on the Commerce Clause for constitutional justification, the government established legal authority over railroads, banks, pipelines, and medicine between 1882 and 1906, long before Franklin Roosevelt arrived on the scene during the Great Depression of the 1930s. Another turning point was Swift & Co. v. United States (1905), involving Chicago meatpackers. That case helped establish the Commerce Clause precedent and, along with the lobbying efforts mentioned in the previous chapter (Roosevelt, Sinclair, Heinz, etc.), led to regulation of the meat industry. The case also broke up a trust that wasn't just one company monopolizing an industry but rather a series of companies colluding on pricing, agreeing to set a price floor. The same principle applied to railroads. Fulfilling the Populist Party's goal, the Interstate Commerce Commission under Teddy Roosevelt regulated railroad rates with the Elkins Act of 1903 and Hepburn Act of 1906 (the stronger Hepburn Act is what empowered the ICC). This, too, was part of TR's Square Deal: he didn't redistribute wealth to the poor like his cousin Franklin later would, but he used intervention to prevent big companies from exploiting working Americans. In an unprecedented action for a president, TR rode the rails in a whistle-stop campaign on behalf of railroad regulation, the same way a politician would campaign for election.
The national government also outlawed alcohol and narcotics during the Progressive Era, both economic sectors in their own right. As the Industrial Revolution and immigration fueled rapid growth, these trends continued through the 1910s on national, state, and local levels. Below, we'll cover national government intervention in the economy as it grew during the Progressive Era, including the creation of the Federal Reserve, the advent of the federal income tax, child labor regulation, and anti-monopoly legislation. These topics can be dry, but they're important; the first two in particular remain at the center of today's economic debates.
Congress created the Federal Reserve in 1913 in response to the Panic of 1907 to manage currency, stabilize prices, and control inflation. In previous panics, only private "lender of last resort" J.P. Morgan stabilized the government's gold standard (1895) and saved the banks by flooding them with cash (1907). Private lenders of last resort couldn't be counted on to always be there in the future; with "the Fed" (common shorthand for the Federal Reserve), the government itself would serve that purpose. Congress added the mandate of stabilizing employment in the 1970s, though the Fed has struggled in that role, exerting at most an indirect influence. The 1907 mini-meltdown also compelled the government to regulate business practices more broadly, creating the Federal Trade Commission in 1914, a precursor in spirit to today's SEC, or Securities & Exchange Commission.
The Fed has had moderate success in stabilizing the banking system, at least in comparison to the late 19th and early 20th centuries. "In comparison" is a key qualifier because there were serious problems in 1929-1933, the 1970s, and 2008-09. Unlike the two earlier national banks (1791-1833), the Fed is a non-speculating (non-investing) bank set up to distribute money from the Mint (coins) and the Bureau of Engraving and Printing (bills) to regular banks through twelve regional reserve banks that maintain some private control. Austin, for instance, is served by the Dallas branch (District #11). Regionalizing the banks keeps cash from concentrating in certain parts of the country.
You could think of the Fed as where "banks bank," serving as what most countries call their central bank. The U.S. hadn't had a central bank since Andrew Jackson vetoed the re-chartering of the Second Bank of the U.S. in 1832, which is why the Progressive reformers meeting on Georgia's Jekyll Island had to fend off the "ghost of Jackson." With the Federal Reserve, banks swap federal IOUs (U.S. Treasuries/bonds) back and forth with the Fed in exchange for reserve cash through Open Market Operations. After the Great Recession of 2008, for instance, the Fed purchased treasuries to infuse cash into the banking system, worrying critics that loading too much cash into the economy would set the stage for future inflation. Banks also borrow and lend to each other through the Fed. Each member bank has to keep roughly 10% of its customers' money on reserve (depending on size), thus the name. That adds stability to the system. The Fed can also relieve banks in danger of failing because customers are panicking and withdrawing their cash too quickly — the sort of run on banks that happened in 1907. Federal Reserve member banks also fall under FBI jurisdiction when robbed.
Created by Congress, the semi-independent Federal Reserve distributes cash from the U.S. Treasury and funnels profit back into the Treasury rather than to shareholders. The Fed doesn't really exist to earn a profit, but rather to stabilize the economy by furnishing an elastic currency. Since its role expanded during the New Deal of the 1930s, the Fed's Open Market Committee (FOMC) has contracted and expanded the economy by influencing the short-term Target Federal Funds Rate: the interest rate on the overnight loans that member banks with surplus cash make to those just under the 10% reserve requirement. The Fed steers that rate by buying and selling bonds and tweaking the reserve requirement, which in turn influences the interest rate the Treasury pays on its bonds (U.S. Treasuries). Elastic currency is complicated, but the main concept is simple: the more bonds the Fed buys, the more cash in the system; the more cash in the system, the lower the interest rates, and vice-versa. It's a target rather than a set interest rate because the Fed is manipulating rates through natural laws of supply and demand. The Fed also sets the discount rate it charges at its Discount Window for short-term overnight loans to banks; this is usually what the media means when it says the Fed is raising or lowering interest rates. These benchmark rates, in turn, affect the prime interest rate banks charge, influencing (but not actually setting) borrowing rates for home mortgages, cars, student loans, corporate debt, etc. As of this writing, the Wall Street Journal prime interest rate is 5.25%, though that's a rough average rather than the exact rate any one person gets on any one loan. The Fed's manipulation of interest rates to smooth out boom and bust cycles is at the core of national macroeconomics.
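The buy-bonds, lower-rates relationship can be sketched as a toy supply-and-demand model. This is only an illustration of the direction of the effect; the functions, numbers, and units below are invented for the sketch and are not anything the Fed actually computes.

```python
# A stylized toy model, NOT the Fed's actual mechanics: it only
# illustrates that buying bonds adds reserves and pushes rates down.
def open_market_purchase(reserves: float, bonds_bought: float) -> float:
    """The Fed buys Treasuries from banks, paying with new reserve cash."""
    return reserves + bonds_bought

def toy_funds_rate(reserves: float, loan_demand: float = 1_000.0) -> float:
    """More cash chasing the same overnight-loan demand means a lower rate."""
    return loan_demand / reserves  # a supply-and-demand caricature

reserves = 400.0                                  # arbitrary units
rate_before = toy_funds_rate(reserves)            # 2.5
reserves = open_market_purchase(reserves, 100.0)  # Fed buys bonds
rate_after = toy_funds_rate(reserves)             # 2.0: the rate fell
```

Selling bonds runs the same arithmetic in reverse, draining reserves and nudging the rate up, which is why the funds rate is a target the Fed steers toward rather than a number it sets by decree.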
Longtime Fed Chair William McChesney Martin (1951-1970) said the Fed's role was to "take away the punch bowl just as the party gets going." In other words, once the Fed has reignited the economy with "easy money" low interest rates (more cash), it wants to rein things in with higher rates (less cash) to avoid triggering high inflation. The less money in the system, the more interest a bank is likely to charge customers for a loan, per the basic law of supply and demand. Customers' rates are also affected by their own credit ratings (their history of paying bills on time). Since the Fair Credit Reporting Act of 1970, Americans have had the right to view their credit ratings.
After 2008, the Fed under Chair Ben Bernanke kept rates low (0.00-0.25%), hoping to fuel more borrowing and economic growth in the wake of the Great Recession, though the low rates also made it difficult for savers to earn interest on their money. The Fed also hoped that such low rates would herd investors into the stock market instead of safer alternatives like bonds or bank CDs (certificates of deposit). The Fed pumped $85 billion a month into the banking system by buying up mortgages and long-term treasuries to keep yields low in a program called Quantitative Easing (QE), and the economy slowly but surely mended. The Fed controls liquidity in credit markets, tightening or loosening lending rates, but can also undertake more unorthodox buying programs like QE to shore up banking and the stock market, or to encourage mild inflation (< 3%).
Has the Federal Reserve done its job? It went to work quickly after it was set up in 1913-14, helping to stabilize the U.S. economy during World War I by shoring up the system with hundreds of millions of dollars. But the U.S. went off the gold standard during WWI, and the Fed struggled to balance the economy over the next twenty years as monetary policy seesawed, sometimes leaving the gold standard and other times changing the amount used to peg the dollar to gold. It reverted to the gold standard and stuck with it at the worst time, just before the Stock Market Crash of 1929, contracting the money supply just as banks were running out of cash. Then it doubled the reserve requirement (cash kept safe, out of investments) at an inopportune time in 1937, contributing to a recession within the recovery from the Great Depression. The Fed also failed to raise interest rates in the late 1960s because it and President Lyndon Johnson didn't want to weaken the economy, but inflation rose steadily until President Richard Nixon finally severed the dollar from the gold standard altogether in 1971. While the Fed is, in theory, independent from the executive branch, both Johnson and Nixon pressured it to keep interest rates low to help the economy, and presidents are often well-served in the short term by this "sugar high" of an economic boost. However, if employment is solid, low interest rates tend to boost borrowing and spending too much and cause inflation, and that is exactly what happened under Johnson and Nixon's influence. Inflation worsened throughout the 1970s until Fed Chair Paul Volcker dramatically raised rates, deliberately causing a recession in the early 1980s to halt it.
Then after a market crash in 2000 and 9/11, Alan Greenspan’s Fed kept rates low and pumped cash into the system between 2001 and ’05 even after the economy improved, helping to fuel a housing bubble — in that case, abandoning its mission to stabilize fluctuations in the economy.
History will judge the Fed's massive infusion of cash into the system between 2008 and 2014, after the real estate bubble burst. The goal will be to "take away the punch bowl" by reversing the quantitative easing process as inflation rises, swapping treasuries back to banks for cash. In the meantime, the Fed held so many assets (bonds, real estate mortgages, etc.) that the government (Treasury) made a lot of money from interest. In late 2015, Chair Janet Yellen signaled that the Fed would reverse course. As of 2019, under Chair Jerome Powell, the Fed remains in a cautious contractionary mode, gingerly raising rates and draining money back out of the economy. Meanwhile, President Trump, like most presidents more concerned with short-term growth than long-term inflation, tries to pressure the Fed to keep rates low.
Federal Income Tax
Benjamin Franklin once said that "in this world, nothing is certain except death and taxes." Yet the U.S. had no federal income tax prior to 1913, except briefly during the Civil War era. The Supreme Court declared an 1894 national income tax unconstitutional in Pollock v. Farmers' Loan & Trust Co. (1895). The Revenue Act and 16th Amendment of 1913 overturned that precedent, allowing the government to raise revenue in an era when it could no longer simply sell off western lands and depend on tariffs and bond sales. The revenue allows the government to conduct its basic functions, including maintaining a military, building infrastructure, and providing entitlements such as health insurance and a modest monthly pension for the elderly. Workers contribute to the latter via payroll deductions in their paychecks (itemized under FICA). The Populists originated the national income tax idea. The first brackets in 1913 graduated, or progressed, upward from just 1% for the poor to 7% for the wealthy.
Today's graduated brackets apply to everyone from the lower middle class on up, with rates increasing as one moves up the scale and topping out at 37% for the upper bracket as of 2018. With recent legislation, the bracket covering roughly $10k-$38k dropped from 15% to 12%. Contrary to popular belief, workers can't hit a point where they barely move over a cut-off line and end up losing money: people only pay the higher rate on the extra portion of income that falls in the higher bracket. No one, in other words, pays 37% on their entire income. The rates vary depending on whether someone is filing jointly, singly, etc. Everyone who works outside a pension system pays a 6.2% Social Security payroll tax, which amounts roughly to contributing toward one's own retirement. Local taxes on food are regressive since the poor spend a greater portion of their money on essentials. Lottery tickets are especially regressive, basically funneling money from workers into government coffers, though much of it cycles back to programs like education that help the poor and middle classes.
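The bracket arithmetic above can be sketched in a few lines of Python. The thresholds below are approximate 2018-style single-filer figures used purely for illustration; exact cutoffs vary by year and filing status.

```python
# Illustrative graduated brackets (approximate 2018 single-filer values);
# each tuple is (upper limit of the bracket, marginal rate on that slice).
BRACKETS = [
    (9_525, 0.10),
    (38_700, 0.12),
    (82_500, 0.22),
    (157_500, 0.24),
    (200_000, 0.32),
    (500_000, 0.35),
    (float("inf"), 0.37),
]

def income_tax(income: float) -> float:
    """Tax owed under graduated brackets: only the slice of income
    falling inside each bracket is taxed at that bracket's rate."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax
```

Running `income_tax(600_000)` under these assumed brackets gives an effective rate of about 31%, not 37%: crossing a bracket line only exposes the dollars above the line to the higher rate, so earning one more dollar never shrinks your after-tax income.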
For most of the 20th century, the top rates were high, usually above 50%, but they dropped dramatically in the 1980s. While today's income taxes are graduated, the tax on investments — dividends and capital gains — is only 15-20% (plus a 3.8% Obamacare surtax for married couples filing jointly over $250k/yr.). Since the wealthy derive most of their income from investments rather than wages, their overall effective rates can be lower than those of middle-class taxpayers paying 24-35%, though their totals are higher. Some are quick to cry "class warfare" at anyone who doesn't like this arrangement, but this regressive feature of the tax code has some wealthy critics of its own. Warren Buffett has suggested a 30% overall bottom rate for the wealthy, defined as the top 0.3% (aka the Buffett Rule), and iconic conservatives Andrew Mellon (Treasury Secretary, 1921-32) and Ronald Reagan (President, 1981-89) favored taxing capital and labor at the same rate. Buffett pointed out that, as a multi-billionaire, it was ridiculous that he paid a lower effective rate than his secretary. Currently, though, the bottom 99.7% is either fine with paying more, too fatalistic or apathetic to protest, thinks the lower capital gains rates spur investment, or doesn't know exactly what's going on.
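To see why mostly-investment income can yield a lower effective rate than a salary despite a far bigger total bill, here is a minimal sketch. The 28% average wage rate, the 20% gains rate, and the dollar figures are invented for illustration; they are not actual IRS numbers for any real filer.

```python
# Toy comparison of effective rates, using the rough rates from the text:
# graduated rates on wages vs. ~20% on long-term capital gains.
def effective_rate(wages: float, capital_gains: float,
                   wage_rate: float, gains_rate: float = 0.20) -> float:
    """Blended effective rate when income mixes wages and investment
    gains; wage_rate is the filer's average rate on wage income."""
    total_income = wages + capital_gains
    total_tax = wages * wage_rate + capital_gains * gains_rate
    return total_tax / total_income

# A salaried professional: all wages, averaging ~28% (hypothetical).
worker = effective_rate(wages=150_000, capital_gains=0, wage_rate=0.28)
# An investor: nearly all income from capital gains taxed at 20%.
investor = effective_rate(wages=100_000, capital_gains=9_900_000,
                          wage_rate=0.35)
# The investor's blended rate lands near 20%, below the worker's 28%,
# even though the investor's total tax bill is vastly larger.
```

This is the arithmetic behind Buffett's complaint: the blended rate is pulled down toward the capital gains rate as investment income dominates, while the total dollars paid still dwarf a salaried worker's bill.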
Moreover, for all taxpayers there are numerous write-offs for home ownership, home improvement, charitable donations, work-related travel expenses, etc. that lower one's reported income. Total deductions of $900 billion in 2013 would've paid for almost all of the combined cost of Medicare and Medicaid. While America has made great strides in lessening discrimination, voters of both parties continue to support rigging the tax codes against renters. The Home Mortgage Interest Deduction goes back to the beginning in 1913, to encourage home ownership, and the Charitable Contribution Deduction started in 1917. Overall, Americans today pay slightly lower rates than they did for most of the 20th century, but not significantly lower if one includes local taxes (state, county, sales, etc.). American corporate rates dropped from 35% to 21% in 2018, but few pay the full rate. Some route their profits through Bermuda, Puerto Rico, Ireland (e.g. Apple), or other "tax havens" (low-tax jurisdictions). Because of the complexity of tax codes and the corruption of lobbying, rates are distributed unevenly from company to company, usually favoring bigger companies with more political pull. Some corporations pay the full amount while others are, in effect, on welfare insofar as their rebates exceed their bills. The big Wall Street banks avoid taxes with offshore havens; Goldman Sachs, with direct influence in the cabinets of Bill Clinton, Barack Obama, and Donald Trump, has paid less than 1%. The overall American corporate tax average as of 2012 was 17.3%, half of the supposed 35%. It remains to be seen whether the new drop from 35% to 21% will be accompanied by stricter enforcement and closed loopholes, but don't hold your breath. According to Treasury Department estimates, if the U.S. got rid of all cheating, loopholes, and tax havens, the country could cut everyone's taxes by 12% across the board and still balance the budget.
Yet, angry non-cheating voters have instead irrationally and consistently cut funding to the Internal Revenue Service, the most cost-effective agency in all of government; the IRS returns $6 for every $1 spent.
A lot of the lost revenue comes not from cheating but from American corporations deferring taxes by keeping their overseas profits abroad rather than bringing them home. According to Bloomberg Businessweek, as of 2015 the taxes deferred by the top ten such companies alone — Microsoft, Apple, Oracle, Citigroup, Amgen, Qualcomm, JPMorgan Chase, Gilead Sciences, Goldman Sachs, and Bank of America — could fund NASA, the Army Corps of Engineers, the Environmental Protection Agency, the Pentagon's training and equipping of security forces in Afghanistan, the Transportation Security Administration, Social Security overhead, and the Departments of Justice, Commerce, Agriculture, Treasury, and Interior.
The 16th Amendment is an important watershed in American history. Since federal tax codes are graduated, it set up a mechanism for the government to redistribute wealth. As mentioned, that's partially offset by the fact that capital investments are taxed at a lower rate than wages, but even so, rich Americans pay far higher totals than the poor. There are two ways to spin these statistics: the wealthy can complain that they pay a large proportion of revenues, while liberals can counter that the wealthy pay so much in taxes because they have such a high portion of the country's money to begin with. The upper 1% have more money than the bottom 50% in America. For many liberals, redistribution makes society more equitable and "evens the playing field" some. For many conservatives, redistribution is unjust because it puts the government in a permanent state of rewarding failure and punishing success, weakening everyone. This issue will remain at the heart of Republican-Democratic debates into the foreseeable future, regardless of what other issues come and go from their platforms. One thing voters in both parties agree on (accountants excepted, since they profit from the complexity of the tax code) is that they'd prefer to simplify the system by getting rid of all the complicated write-offs and loopholes…other than the ones they themselves like and profit from.
Government & Labor
Labor, meanwhile, focused on safety and hours more than wages in the Progressive Era. It wasn’t uncommon for blue-collar workers to put in 72 hours a week (6 days x 12 hours). One in eleven steelworkers died annually. Think about that. Would you work in a job where there was a 1/11 chance of dying in a given year? Those odds are worse than for soldiers or sailors in most wars or the most dangerous occupations today: lumberjack, fisherman, roofer, steelworker, pilot, and driver (taxi, truck, etc.). According to the Bureau of Labor Statistics (BLS), the chance of a logger or fisherman dying in a given year is just over one in a thousand — a hundred times safer than a steelworker a century ago.
At the Triangle Shirtwaist factory in New York’s Garment District, 146 girls and women died in 1911, unable to escape a fire because their bosses nailed the exit shut to keep out union organizers. When some made it out onto the fire escape, it collapsed under their weight. Others jumped to their death from the 11th-floor windows.
The fire led directly to building codes mandating multiple marked exits and outward-opening doors, since many victims were trampled or asphyxiated in the crush behind an inward-opening door. Happening in the middle of the day, just blocks from the New York Times office, the Triangle fire drew the public's attention to child labor and union busting, along with the Ludlow Massacre in Colorado in 1914 that we read about in the Gilded Age chapter.
Unions gained some strength but still lacked the right to collective bargaining, so management could simply fire or abuse strikers and union organizers. The Triangle fire underscored these problems. The fight for worker safety and shorter work-weeks is a good example of how America's three branches of government — legislative, executive, and judicial — can check each other's interests, with none of the three ever able to override the other two. Congress and the presidency, the two branches most closely connected to the people, favored regulating the workplace, while the judiciary, led by the Supreme Court, saw such regulations as unconstitutional.
The Keating-Owen Act of 1916 set the age limit for miners at 16 and factory workers at 14, but the Supreme Court struck down the law in Hammer v. Dagenhart (1918), a case where the plaintiff wasn’t management but rather a North Carolina family that needed their child to work. The Court wasn’t even friendly to state laws regulating worker hours unless they applied to women. In Lochner v. New York (1905), the Court shot down a state rule limiting bakers to ten-hour days (several had collapsed from exhaustion and fallen into ovens). But in Muller v. Oregon (1908), the Court upheld a similar state law that applied to female laundry workers. Courts and Congress locked horns on monopolies (or trusts) as well.
In 1904, a stenographer/typist at Washington’s Dead Letter Office named Lizzie Magie patented the Landlord’s Game. Dice-rolling players circled the square board buying up properties, railroads, and utilities and paying and collecting rent, all the while hoping to avoid “going to jail.” Like Upton Sinclair with The Jungle, her point was to underscore the evils of capitalism. Magie hoped to alert players that, “In a short time…[they] will discover that they are poor because Carnegie and Rockefeller…have more than they know what to do with.” As with The Jungle, the public largely missed the point and her creation became the most popular board game of all time when marketed by Parker Brothers as Monopoly® in 1935. It turns out that players loved nothing more than the satisfaction of bleeding their opponents dry on the road toward bankruptcy. The snowballing effect of wealth accumulation was fun to experience, if only vicariously. In real life, though, many Americans shared Magie’s hatred of trusts, as monopolies were commonly called. Congress responded by outlawing companies that acquired or maintained monopolies through unfair practices.
However, the U.S. has a three-branch government and, just as the Court was lukewarm toward labor laws, so too was it skeptical of the 1890 Sherman Antitrust Act aimed at monopolies. The Court came to an accommodation with the Teddy Roosevelt and William Howard Taft administrations since those presidents only enforced the law in egregious cases. As we saw in the first chapter, Standard Oil was an early target, broken up in 1911, and in 1915 a federal district court ruled in favor of what became 20th Century Fox, breaking up (Thomas) Edison’s Trust and ushering in the Hollywood era.
TR’s first big target was J.P. Morgan’s railroad conglomerate Northern Securities in 1902. The Supreme Court went along with the injunction in a 5-4 ruling, but Roosevelt was outraged that his own appointee, Civil War veteran and Associate Justice Oliver Wendell Holmes, wrote a dissenting opinion. Holmes’ problem with the Sherman Act was that capitalism encouraged competition, then outlawed whoever won the competition. TR, conversely, saw monopolies as undermining competitive pricing and told the newspapers that he could “carve a judge with more backbone [than Holmes] out of a banana.” Roosevelt also resented the dismissive way that Morgan had tried to buy him off when he became president, treating him as just another industrial titan to negotiate with. When TR first broached the subject of breaking up U.S. Steel, J.P. Morgan chuckled and contemptuously telegrammed that he would send his man down to talk to Roosevelt and work out the problem. While Roosevelt’s courage was laudable, U.S. Steel ultimately won the case that was filed against it in 1911.
Breaking up monopolies is a significant intrusion into the free market but, left on their own, economies will naturally tend toward concentrations of power. Bigger companies gain efficiency advantages as they scale up (aka economies of scale), which can benefit consumers in the form of lower prices. Yet, if a company corners an entire market, it can raise prices to the disadvantage of consumers. In other words, the advantage of capitalism from the consumers’ perspective is competition, and monopolies can destroy competition even as they bring efficiency and order to industries, the way Rockefeller did with Standard Oil. Many of today’s billionaires, like Bill Gates (former chair/CEO of Microsoft) or Mexican telecom mogul Carlos Slim, made their money by finding niches in the economy with barriers to competition.
Yet, the Sherman Act doesn’t outlaw monopolies outright, just those acquired through illegal or unfair practices. The Clayton Anti-Trust Act of 1914 supplemented the Sherman Act by prohibiting anti-competitive mergers. The ambiguity of illegal, unfair, and anti-competitive has complicated attempts by the Justice Department (Antitrust Division) and Federal Trade Commission to bring antitrust suits against companies like Microsoft (for exclusively bundling the Internet Explorer browser with its own operating system), Google (for favoring itself and its clients in searches), and Apple (for conspiring with publishers to raise e-book prices). The government won its case against Microsoft in 1998-2000 based on the Sherman Act, though it never broke the company up into two “baby Bills” as originally proposed, with Netscape losing market share to ordinary competition and changing technology anyway. The ambiguity of the terms frustrates the companies themselves and their legal teams as they struggle to survive and compete with one another. In addition, globalization has brought American companies like General Electric and Microsoft under the jurisdiction of foreign agencies. The European Union blocked GE’s merger with Honeywell and continued to dog Microsoft after the second Bush administration settled out of court in America with a wrist-slap.
Today’s tech giants present the Justice Department with new challenges because Google dominates Internet search and Facebook/Instagram dominates social media so thoroughly that each presents a dangerous concentration of power (political, even, rather than merely economic), especially with more artificial intelligence on the horizon. Still, their familiarity is convenient and their scale allows them to develop features attractive to their customers.
Progressive Politics: Bipartisan & Local
These Progressive Era battles did not break down neatly into Democrats versus Republicans. Today it’s not an oversimplification to classify most Democrats as liberal and Republicans as conservative, at least by American standards. We’ll explore that terminology more below because both terms have complicated histories, but most of us know roughly what they’ve come to mean. Each side even grades its own politicians and the other’s as to how loyal they stay to their party’s ideological principles. But such hardened categories weren’t defined in the early 20th century. Both parties had progressive and conservative elements within their ranks. Republicans led the charge nationally by regulating food and drugs, reforming the civil service (awarding government jobs by merit rather than mere patronage or the spoils system), and breaking up monopolies. Republicans also helped bring about public utilities, including sanitation, sewage, and water. Cleveland Democrat Tom Johnson pioneered the concept of publicly-owned utilities as a mayor elected by Populists and labor unions. Detroit’s Republican mayor Hazen Pingree grew vegetables in vacant lots, “Pingree’s Potato Patches,” to feed the poor during the 1890s depression.
The most famous and influential of the Progressive Republicans at the state level was Wisconsin’s Robert La Follette, who served as governor and U.S. senator. “Fightin’ Bob” fought corporate influence on politicians and pioneered government-subsidized mass transit to reduce congestion and pollution. Wisconsin passed the country’s first environmental restrictions and La Follette capped sailors’ workweeks at 56 hours. As the presidential nominee of his own Progressive Party in 1924, La Follette won an impressive 16% of the vote. To this day, progressive politics are sometimes referred to as the Wisconsin Idea. Wisconsin pioneered the primary system to give voters a greater say in who the respective parties nominated to run in general elections. Southern states also conducted informal straw poll caucuses as a way to exclude otherwise eligible black voters from the process. Montana pioneered campaign finance reform by restricting corporate political spending after the Anaconda Copper Mining Co. of Butte bought off its legislature. That law was overturned, though, in the wake of Citizens United v. FEC (2010), when Montana unsuccessfully defended it in American Tradition Partnership, Inc. v. Bullock (2012).
A lot of Progressive politics was local rather than national, or a combination. After the 1900 Galveston hurricane, local and national governments cooperated in rebuilding a seawall and dredging the Houston Ship Channel (1909-14), relocating the major Texas port inland as far as East Houston’s Turning Basin for better protection. Galveston’s city commission became a model as other towns hired city managers to run municipalities as a CEO would a company. Houston’s Ship Channel was a pioneering example of public/private partnership, as companies built warehouses and docks to augment evenly matched public funds from Harris County taxpayers and the Army Corps of Engineers used to dredge Buffalo Bayou and Galveston Bay. President Woodrow Wilson officially opened the channel in 1914 and it served as an oil port during World War I. The federal government dredged the channel even deeper in the 1920s and ’30s.
New York City and Chicago were confronted with colossal amounts of poverty, disorder, disease, and filth as they grew into the nation’s biggest cities. Modern Americans don’t usually associate cities with animals and might think first of the West when it comes to horses, but draft animals drove American industry and commerce as steam, electrical, and gasoline power were still in their infancy. Aside from pulling burdens — including omnibuses, railed horsecars (streetcars), bread, beer, and milk — horses also walked treadmills to turn gears and pulled the winches, pulleys, and elevators that enabled building and dam construction. In New York, tens of thousands of horses each contributed around 15-35 lbs. of manure and two gallons of urine daily. “Muck handlers” tried to collect, dispose of, and recycle into fertilizer the over three million pounds deposited daily onto the streets, but manure flowed down cobblestone joints and seeped into basements on rainy days, and dried into fly-infested piles in the heat, spreading typhoid. On hot, windy days, it blew through the air in pulverized form. The stench was nearly unbearable even for those accustomed to the standards of the time. The wealthy built brownstone walk-ups with staircases high enough to avoid the piles. Most of the country’s 20+ million horses were worked to death in the space of 4-5 years, and New York officials dragged dead horses to Barren Island off Brooklyn to make glue, rendered fat, leather, and horsehair. Yet, cities couldn’t function without them. When an epizootic flu epidemic decimated Boston’s horses in 1872, a small fire nearly burned the whole city as no horses were available to pull pump-wagons.
New York City officials also estimated the feral pig population at ~ 20k. And, as in every city since ancient times, humans threw excrement out the window onto the street. Disease epidemics (cholera, typhoid, malaria, etc.) swept through American cities on a regular basis. Tuberculosis, or “consumption,” killed more Americans in the 19th century than influenza, AIDS, and polio combined in the 20th. To confront these challenges, New York was the first American municipality to collect garbage and build an extensive sewer and water system. Chicago, a city that went from 100 villagers to 1.7 million between 1830 and 1900, did likewise shortly thereafter to combat disease, especially cholera after a terrible 1854 outbreak. The Raising of Chicago (below) involved lifting all buildings 4-14 ft. higher with hydraulic jackscrews to run a sewer system underneath.
Cities tackled crime along with waste and disease. Police departments emerged partially from the old slave patrols of the Deep South and immigration patrols in the Southwest, but mostly in urban America in the 19th century. Real police departments — with full-time employees and procedures that were answerable to central governments — started in Boston in 1838, New York in 1845, and Chicago, New Orleans, Philadelphia, and Baltimore in the 1850s. By the 1880s, every major American city had a “thin blue line.”
Referendums and initiatives, also called propositions, as in “Prop X,” emerged locally, too. In these cases, voters vote directly — yes or no — rather than indirectly through politicians. For instance, if a town wants to build a new high school or light rail, it puts the issue on the ballot directly rather than arguing it out in the city council, asking voters whether it can issue a bond (borrow money) for such a project. South Dakota was the first state to put referendums and initiatives on its ballots in 1898. Portland, Oregon voted on so many referendums in the early 20th century that other Americans initially called such votes the “Oregon System.” These referendums, or plebiscites as the ancient Romans called them, are as close as we come in the U.S. to pure democracy as originally practiced by the Athenian Greeks. Democracy combines the Greek demos (“the people”) with kratos (“power,” or loosely, “grip”). Ancient Athenians rotated their non-slave male population in and out of assemblies the way America requires its citizens to sit on juries. Working, instead, through politicians is variously called a republic, representative democracy, or (elsewhere) parliamentary democracy.
TR: That Damned Cowboy
At the national level, Republicans also led the Progressive charge, most famously with Theodore Roosevelt (1901-09) and his successor, William Howard Taft (1909-13). The Progressive Era culminated during the presidency of Democrat Woodrow Wilson, from 1913-21. Roosevelt was a strange mixture, politically, who would be impossible to classify today as liberal or conservative — part of what makes him so interesting. If a party or think tank tried to grade him on consistency to some principle, he’d flunk and probably roll the report card up in a wad and throw it back in their faces.
On the one hand, TR advocated foreign wars for their own sake and white supremacy. Of American Indians, he said, “I don’t go so far as to think that the only good Indians are dead Indians, but I believe nine out of ten are, and I shouldn’t like to inquire too closely into the case of the tenth.” He thought American minorities were inferior to white Gentiles, but he also thought that anyone who wasn’t inferior deserved an equal job and he personally appointed African Americans to government positions. He created a scandal by inviting black activist and educator Booker T. Washington to the White House for dinner. Also on the liberal/progressive side of the ledger, Roosevelt thundered against corporate corruption, set aside wilderness for national parks, started the Forest Service, and supported universal healthcare insurance and women’s suffrage.
Abraham Lincoln was first to cordon off wilderness as federal land (Yosemite in 1864), and the integrated National Park Service started in 1916 under Woodrow Wilson, but Roosevelt did more for the cause of wilderness protection overall than any other president, though as we’ll see, his cousin Franklin gave him a serious run for his money. TR signed the controversial Antiquities Act (1906) that allows the President to circumvent Congress on behalf of preserving certain federal land. He doubled the national parks from five to ten, created game reserves, bird sanctuaries, national monuments, and the Forest Service to manage natural resources sustainably (unlike National Parks, National Forests are not intended as sanctuaries). He designated the Grand Canyon a National Monument to fend off copper and asbestos miners and Wilson made it into a National Park in 1919. Roosevelt also started the Bureau of Reclamation to manage western rivers and (later) hydro dams, especially on the Colorado. Today, the Colorado River (not the one that runs through Austin) supplies over 12% of Americans’ drinking water and 15% of that used to grow crops. Overall, Roosevelt was responsible for preserving more square mileage of wilderness than the entire state of Texas.
While respectful of free enterprise, TR loved strong government and was openly confrontational toward corporate America in a way that no modern Republican or even Democrat would dare be. He relished busting monopolies. Today, Wall Street “butters the bread” of both parties and most voters have a stake in the stock market with a 401(k) or pension, making it unlikely we’ll see another TR anytime soon – at least one that actually wins the presidency. While Roosevelt took campaign donations from Wall Street banks, he also worked for the American public rather than powerful lobbies. He pushed for campaign finance reform, spurring Congress’s first attempt to rein in the political bribery that Mark Twain lampooned in the Gilded Age (TR was accused of taking money himself during the 1904 campaign). With his bombastic personality and high-register voice (listen to the video near the top of the chapter), the stocky, bespectacled fireplug was America’s most dangerous president in terms of disrupting the economic status quo.
First, the captains of industry spent millions to keep Populist/Democrat William Jennings Bryan out of office in the 1896 election. Then, in 1900, Republicans tried to hide Roosevelt in the vice-presidency — a notoriously weak office — but were foiled when William McKinley was shot in 1901. As McKinley’s adviser Mark Hanna famously asked beforehand, “Don’t you fools realize only one life stands between that damned cowboy and the White House?” Roosevelt’s maverick attitude was even more remarkable given the relative weakness of the government when he took office. It was small in comparison with corporate America, as depicted in the cartoon above with TR in the red shirt and titans like J.P. Morgan looming over him. He was up against some formidable companies, many of which had bigger war chests than the entire federal government.
Today, Wall Street and Washington are symbiotic (mutually beneficial) partners, with Washington keeping financial regulations light in exchange for insider trading information and campaign donations. Politicians of both parties crush average annual market returns in their own accounts. When Washington does regulate Wall Street, foreknowledge of those laws gives politicians and donors an edge. Threats of stronger regulations are an excellent way to elicit higher donations from lobbyists while appearing to do something in voters’ eyes. Roosevelt, unlike most politicians then and now, wasn’t for sale.
TR didn’t necessarily side with labor or management and that alone distinguished him from predecessors of both parties who sided with management. The unspoken but dramatic shift was Roosevelt’s assumption that it was even within the realm of the government to be brokering relations between businesses and workers (or consumers) in the first place. Despite TR being a Republican, that shift reshaped the regulatory state of the 20th century and the modern American definition of liberalism.
Liberal & Conservative: Two Evolving Terms
Liberal, just to make things confusing, is a word that means different things in different times and places. Read this section slowly as it can put your head on a swivel. In the 18th century, what’s now called classical liberalism meant a combination of democratic political freedoms/rights and free market economics. However, as Britain and the U.S. dealt with industrialization in the 19th century, they realized that free markets posed certain problems, or externalities, that the public wanted dealt with. Scotsman Adam Smith, commonly (if somewhat mistakenly) trotted out as the godfather of free markets, conceded that externalities should be regulated in cases where those regulations benefitted public welfare. Smith’s real concern in the Wealth of Nations (1776) — the book that launched the field of economics — wasn’t consumer-inspired government regulation that interrupted free markets, but rather governments beholden to mercantilist interests that disrupted the “invisible hand” of free trade to the detriment of the public and general prosperity. That’s a subtle but major difference that has caused Smith to represent something he didn’t believe in among those who either never actually cracked his 1,000-page doorstop or did and deliberately misrepresent or cherry-pick him. Yet, Adam Smith also lacked faith in even the best-intentioned governments to administer economic central planning or to fundamentally alter the invisible hand’s allocation of resources, at home or abroad. Smith didn’t offer up a specific prescription as to how to balance the efficiency of the invisible hand with the need for regulating externalities; he merely predicted that modern economies would struggle to maintain the right balance.
Modern near-left liberals think that the capitalist garden needs tending, or what Franklin Roosevelt called “trimming the vines” in the 1930s, to deal with Smith’s externalities. Conservatives prefer to let (economic) nature take its course and see intervention as likely to mess up the market’s natural function, similar to how a naturopath views drugs and surgery. At their most caricatured extremes, at least, right-wing conservatives are blind to capitalism’s imperfections and left-wing liberals are blind to capitalism being the goose that’s laying golden eggs for society at large — the economic engine for everyone that finances education, the arts, scientific research, the welfare state, etc. — not just lining the pockets of Rich Uncle Pennybags, aka the “Monopoly Man.”
As free men increasingly won the right to vote in Britain and the U.S. during the 19th century, economic liberalism morphed from a free-market ideology into a mostly free market increasingly regulated to tamp down some of these externalities, or things people didn’t like about completely free markets. These problems included, among other things, cyclical disruptions that the Federal Reserve was designed to mitigate, exploitive child labor, and monopolies. Liberalism morphed into what’s best described as a reconciliation between capitalism and democracy. As people won the right to vote, workers and consumers voted to tinker with capitalism so as to mitigate its negative effects on them. This is the type of liberalism the aforementioned Ayn Rand’s followers dislike and would like to get rid of. It’s also what the economic portion of the Progressive Era was all about and how the influential New Republic magazine came to define liberal. The New Republic editors were part of a group centered in the Dupont Circle neighborhood of Washington, D.C. that included supporters of Teddy Roosevelt’s progressive agenda, Supreme Court Justice Oliver Wendell Holmes, and future Justice Felix Frankfurter. Eventually, Teddy’s young cousin Franklin shed his conservative upbringing and joined the group.
Slowly but surely the government grew bigger and it wasn’t until after the Reagan Revolution of the 1980s that liberal became a dirty word that even Democratic politicians avoided like the plague. By then, there was a prevailing feeling among a critical balance of voters that it was time to pull back on the regulatory throttle. Marginalizing the term liberal was brilliant politics on the Republicans’ part, putting Democrats on the defensive. Meanwhile, conservatives got scared of the word capitalism, preferring free markets or free enterprise system, maybe because Karl Marx used capitalism negatively in the 1850s, though he wasn’t the first. You shouldn’t be scared away from either term regardless of your views. Texas public schools avoid the term capitalism while indoctrinating students to the upside of free markets.
Like liberalism, conservatism is a difficult-to-define term that’s changed over time. Traditional American conservatives were less likely to favor interventionist over isolationist foreign policies than today’s more hawkish Republicans, and less likely to define “pro-business” as either pro-Wall Street or pro-free trade — often favoring “main street” small businesses over corporations, or protectionism (tariffs) over free trade (more in Chapter 21). A recent incarnation of the main street variety is The American Conservative (2002- ), which critiques corporate power and corruption and advocates husbanding America’s military strength, opposing interventions like Iraq in 2003. Conservatives in most eras are leery of change, while more “reactionary” conservatives favor restoration of earlier political or social orders. Soldier/journalist Ambrose Bierce took a friendly jab at everyone in his Devil’s Dictionary (1881-1906), defining a conservative as a statesman “who is enamored with existing evils, as distinguished from the Liberal, who would replace them with others.”
Aside from changing over time, liberal and conservative also have different meanings in international, as opposed to domestic, contexts. Here’s where liberal gets really tricky. When American relations with Cuba began to improve in 2015, American commentators expressed hope that Cuba would “liberalize their economy.” Wasn’t it already ultra-liberal because it was communist? No; the term switches even as it crosses the narrow 90-mile strait between Florida and Cuba. Overseas, we hope for other countries to liberalize their economies in the old-fashioned, free-market sense of the word — to make them what, stateside, we’d call more conservative. Conservatives (and liberals) who favor overseas military interventions strive to protect what the rest of the world now defines as liberalism: representative government, free markets, and individual rights. Neoconservatives sent Americans into combat in Iraq in 2003 to fight for liberalism in the broader sense of the word. When the Taliban, al Qaeda or ISIS hates on Western liberals, they mean all of us.
In addition to evolving over time and space, these conservative and liberal labels tend to pigeonhole individuals. Out of necessity, you’ll see the terms plenty in the rest of the textbook as I describe political and cultural debates, but remember that such categories tend to stereotype people into clusters that usually don’t correspond well to how any one person actually views the world. Most people have their own complex views that change as they grow older and don’t fit neatly into any one box.
As of the 1912 election, the liberal and conservative tags weren’t associated with Democrats and Republicans as they are today. There were conservative and progressive wings in each party. Teddy Roosevelt thought that his successor, William Howard Taft, had betrayed the progressive spirit. If we define progressivism by TR’s presidency, that charge was crap, since Taft broke up more monopolies than Roosevelt, oversaw passage of the graduated national income tax, and signed off on the creation of Montana’s Glacier National Park. But Taft made cuts to Roosevelt’s hard-earned Forest Service that exacerbated the damage of a 3 million-acre conflagration in the Northwest known as the Great Fire of 1910 (inadvertently strengthening the conservation movement). More importantly, Taft wasn’t as bold as TR about confronting moneyed interests and corporations head on. Who was? Taft disliked being caught between various special interests and never grew into the sort of confrontational firebrand the public had come to love in Roosevelt. Famous for his Falstaffian personality and figure, Taft served one term and ended up as the only president in American history to move from the executive to the judicial branch, becoming Chief Justice of the Supreme Court in 1921.
Roosevelt, meanwhile, had retired to Africa to hunt big game and write a few books. J.P. Morgan quipped, “every American hopes that every lion will do its duty” [eat TR]. There for nearly a year, Roosevelt and his son Kermit killed hundreds of lions, elephants, and giraffes. But, as he breakfasted over a week-old New York Times before going out to blast rhinos with his Winchester, TR couldn’t help but get the sense that the corporate camels had nudged their noses too far under Taft’s tent. Moreover, when Taft did get more aggressive with corporations, such as when he took on U.S. Steel’s monopoly, TR took offense because he thought it made him look soft for having tolerated U.S. Steel. Poor Taft couldn’t win, it seemed, in earning Roosevelt’s approval. Roosevelt thought that Taft was overly respectful toward the Constitution and the separation of powers between the executive and judicial branches. He called the 330-lb. Taft a “fat-head and a flub-dub with a streak of the second-rate and the common in him…Taft has the brains of a guinea pig!” Really, TR was just bitter that he wasn’t president anymore and itching to get back in the White House for a third term.
So Roosevelt resolved to do something no retired president since Martin Van Buren had done: he reentered politics under a different party banner and ran against the incumbent Taft. The robust 53-year-old proclaimed, “The parting of the ways has come…my hat is in the ring, the fight is on, and I am stripped to the buff.” In terms of physique if not political leanings, this was a candidate that Sarah Palin could’ve admired as much as she did the shirtless, tiger-hunting Vladimir Putin (Russian President). TR eventually branded himself under the banner of the Progressive Party, but it wasn’t so much an entirely separate party as the more progressive plank of the existing Republicans — promoting national income and inheritance taxes on “fortunes,” universal healthcare coverage, a minimum wage, women’s suffrage, and organized labor, defending natural resources, and advocating a six-day/eight-hour work limit (48-hour week). He pushed for the same type of elderly pension and unemployment insurance his cousin Franklin would reluctantly pass a quarter century later with Social Security. TR argued that the public should toss out any judges who blocked these liberal reforms as unconstitutional. He was channeling Wisconsin Republican Robert La Follette, arguing that some concessions toward democratic socialism would inoculate American capitalism against more radical options like communism. The GOP at the time was mostly a collection of state parties beholden to various corporate interests like railroads, mining, and timber. TR thought they needed a strong executive at the top who could appeal directly to the workers: “The Republican Party must stand for the rights of humanity, or else it must stand for special privileges.”
Not all states had binding primaries at the time, wherein the respective political parties chose their candidates through voting. Only thirteen states had Republican primaries in 1912 and TR won nine. But party delegates nonetheless chose Taft at their summer convention in Chicago, infuriating Roosevelt. He said of the GOP: “The dog has returned to its vomit.” For TR, if the existing parties were unresponsive, then creating a third party in 1912 was no different than the Republicans stirring up the old Democratic-Whig two-party system in 1854. They too were once a third party. It was rough on President Taft, who called TR a “freak” and a “demagogue” but didn’t take naturally to mudslinging and dirty politics. One reporter found him weeping on a train car after a speech, lamenting that he’d lost his best friend in Roosevelt. TR, on the other hand, ran up and down caboose aisles shadow boxing as he prepared to give whistle-stop speeches trashing his old buddy. No president had ever run for a third term (TR had served 1 and ¾ terms), even though it was legal until 1951. George Washington had voluntarily stepped down after two terms and it became traditional to not go past two.
The 1912 Presidential Election occurred at the height of the Progressive era and was one of the more interesting and important elections in American history. For starters, it was a four-horse race between Republican Taft, Progressive Roosevelt, Democrat Woodrow Wilson, and Socialist Eugene Debs. Roosevelt’s progressive GOP splinter was basically advocating democratic socialism as well, and it became better known as the “Bull Moose Party” after he was shot before a campaign speech in Milwaukee. Let me explain.
The would-be assassin, a New York saloonkeeper named John Schrank, disapproved of Roosevelt breaking Washington’s two-term tradition and shot him in the chest from point-blank range with a Colt .38 caliber revolver. Schrank said he dreamt that William McKinley’s ghost sat up in his coffin and pointed to VP Roosevelt as his killer, instructing Schrank to avenge him. Fortunately, the bullet hit Roosevelt’s breast pocket, which contained a rolled-up 50-page speech and his glasses case. The bullet lodged just below his skin and Roosevelt went ahead and gave the speech after realizing that he wasn’t coughing up blood, meaning the bullet hadn’t punctured his lung. An aide announced what had happened before he began and someone yelled, “fake!” TR opened his coat, showed the crowd his bloodied shirt and bellowed, “Ladies and gentlemen, I don’t know whether you fully understand that I’ve been shot; but it takes more than that to bring down a Bull Moose.” He spoke for ninety more minutes, contrasting his dedication to peoples’ rights with Woodrow Wilson’s commitment to “the old flint-locked, muzzle-loaded doctrine of states’ rights,” then went to the hospital. They sent Schrank to an insane asylum.
The 1912 election was a bit awkward for young New York State Senator Franklin Delano Roosevelt. He was a distant cousin of his idol Teddy, but as a Democrat had committed to supporting Wilson. Franklin’s wife Eleanor was Teddy’s favorite niece and favored her uncle over Wilson. Uncle Ted advocated the aforementioned liberal platform, along with more “square dealing,” under the mantle of New Nationalism. Woodrow Wilson promised New Freedom, by which he meant less government regulation and a more decentralized, local economy. There was an obvious flaw in Wilson’s reasoning: if capitalist economies have a natural tendency to concentrate power at the top in the form of monopolies, then why would deregulating the economy lead to smaller, local businesses? Why would less regulation break up monopolies?
The flaw wasn’t fatal to Wilson’s campaign, though, since Taft and Roosevelt mainly stole Republican votes from one another, while Debs stole some progressive votes from Roosevelt. Roosevelt outpolled Taft, but Wilson walked away the winner at 42%, however contradictory his platform. Debs garnered 6% of the vote (white in the pie chart below), the high-water mark for an avowed socialist in American history (unless you want to count TR’s 27%), though still only the second-best showing for a Lefty, behind Robert La Follette’s 16% Progressive run in 1924.
TR fell briefly into a depressive funk, then went on a mapping expedition down an uncharted Amazon tributary, the River of Doubt, with his son Kermit, where he nearly died of an infection and malaria. He returned to the U.S., having seemingly aged five years in two months, to lambast Wilson’s administration for not getting into World War I quicker. It’s safe to say we’ll never see another Teddy Roosevelt.
The parties’ platforms weren’t fully clarified by 1912 and never will be; they’re constantly evolving. But the 1912 election weakened the GOP’s progressive wing with Roosevelt’s defection and loss, and the wing mostly died out with Herbert Hoover’s presidency of 1929-1933. It was gone altogether by Barry Goldwater’s conservative nomination in 1964, though John D. Rockefeller’s grandson Nelson Rockefeller (R) was a relatively liberal New York governor from 1959-1973. Progressive legislation has emerged from Republican administrations since, but it’s never dominated a platform the way it did with Roosevelt’s breakaway party in 1912. Hoover ran to the left of Franklin Roosevelt in 1932 on some issues, but he was really more of a centrist than a progressive. Likewise, Thomas Dewey’s 1948 GOP platform was to the left of Democrat Harry Truman’s on some issues, but that was much to the chagrin of congressional Republicans.
As for the Democrats, Woodrow Wilson wasn’t much of a liberal when he was elected but he fast became one as the Progressive tide crested, reorienting himself to catch the wave in time for his 1916 reelection. Under Wilson, the U.S. enacted the Clayton Antitrust Act, implemented the federal income tax authorized at the end of Taft’s term, created the Federal Reserve and National Park Service, and signed Congress’s first child labor laws. Toward the end of his presidency, women won the right to vote. Although he didn’t spearhead these initiatives (and opposed women’s suffrage at first), he projected a progressive improve-the-world spirit onto the international stage by trying to outlaw war and create a world police organization, the League of Nations, after World War I (next chapter).
Wilson also embodied the racial blind spot of Progressives with his appreciation for the Ku Klux Klan and reversal of Roosevelt’s policy of hiring minorities in the federal government. The Democrats carried the racist legacy of the Confederacy with them up through the mid-20th century, even as they assumed an otherwise pro-worker mantle by the 1930s.
Progressive legislation didn’t always originate with activists or the people suffering from harsh conditions. More often, establishment leaders sensed the tide of history and cut their losses by preemptively enacting moderate legislation. They were letting some steam out of the pot in order to preclude a boil-over in the form of a working-class uprising. In other words, they defused more radical options with compromise. Both Teddy Roosevelt (R) and his cousin Franklin Delano Roosevelt (D) saw things that way. While some of their more vocal critics saw them as loose cannons or communist Antichrists ruining America, they saw themselves as closet conservatives saving capitalism from itself, inoculating it against more drastic alternatives.
Patricia O’Toole, “The Speech That Saved Teddy Roosevelt’s Life,” Smithsonian (November 2012)
P.J. O’Rourke, “Deciphering Fed Speak,” American Consequences (June 2017)
Paul Sagar, “The Real Adam Smith,” Aeon (January 2018)