Republicanism, Democracy, and the Minimum Wage

President Barack Obama chose to make the minimum wage a focal point of the 2014 State of the Union Address. In it, he said that he would raise the minimum wage for federal contractors by executive order. Doing so for all wage earners in the country is beyond the president's purview, but he implored Congress to “give America a raise. Give ’em a raise.” The White House then published a white paper estimating that an increase in the federal minimum wage from $7.25 per hour to $9.00 per hour – and indexing it to inflation thereafter, such that the wage would automatically increase with price levels – “would directly boost wages for 15 million workers and reduce poverty and inequality.”

As of January 1, 21 states and the District of Columbia had minimum wage standards higher than the federal requirement. The highest of these is Washington state’s, at $9.32 per hour. Massachusetts is set to eclipse that: the state’s Senate voted on a bill in November 2013 that would raise the minimum wage to $9 per hour on July 1, 2014, $10 per hour on July 1, 2015, and $11 per hour on July 1, 2016. The wage for workers who rely primarily on tips would be set at half the new minimum wage. The bill passed 32-7 and will be heard in the state House of Representatives this year. Massachusetts’ minimum wage is also set to automatically increase to $0.10 above the federal rate, should the federal rate surpass the state’s current minimum wage. Furthermore, the advocacy group Raise Up Massachusetts claims that it has enough signatures to add a referendum to the ballot which, if passed, would raise the minimum wage to $10.50 per hour and make paid sick time mandatory.

The United States had no minimum wage for a longer period of time than it has had one. This article will provide an overview of the ideological precepts that led to the establishment of a minimum wage, an examination of the 1938 Fair Labor Standards Act, and data concerning subsequent alterations to the minimum wage.

Independence

John Adams argued in one of his later letters that the American Revolution began as early as 1620 with “the first plantation.” His meaning was that the guiding principle of the War for Independence was not just national sovereignty, but also personal independence. The British Empire at the time of the Revolution was divided in such a way that complete economic self-sufficiency was not possible for the nascent American Republic. The period of history we call the Industrial Revolution was already well underway in the British Isles, and North America was a major market for British manufactured goods. What little non-agricultural production existed at the time was performed mostly by artisans and craftsmen in small urban centers, along with their apprentices and journeymen.1

Economist Albert Wenger put together these nifty graphs from US Census data. The first shows wage-laborers as a percentage of the population, and the second shows the economic sectors these workers were employed in.

[Figure 1: Wage laborers as a percentage of the US population]

[Figure 2: Economic sectors in which wage laborers were employed]

In 1810, less than a third of the population sold their labor for a wage, and 75% of these were farmers. A faction of the nation’s founders, led by Alexander Hamilton, believed that in order to achieve meaningful political sovereignty, the United States had to attain economic independence. In 1791 Hamilton issued his “Report on Manufactures” to Congress. In it he argued for the imposition of a protective tariff on manufactured goods from England, and for the use of the new revenue to subsidize American industries until they were able to compete internationally.

Many founders were horrified by the implications of a growing class of people who worked for others. Early Americans were gaga over republicanism, and the efficacy of a republic was believed to be predicated on the civic virtues of its citizenry. Chief among these virtues was individualism, or independence. In his Notes on the State of Virginia, Thomas Jefferson characterized subsistence farmers, or yeomen, as the exemplars of republican values. Writing about slavery, Jefferson said:

 For if a slave can have a country in this world, it must be any other in preference to that in which he is born to live and labour for another: in which he must lock up the faculties of his nature, contribute as far as depends on his individual endeavours to the evanishment of the human race, or entail his own miserable condition on the endless generations proceeding from him. With the morals of the people, their industry also is destroyed.

That is to say that, to Jefferson, the horror of slavery was the absence of both liberty and independence. The founders feared the creation of a working class, already extant and extensive in England. Since there had never been a blood nobility in the North American colonies, it was believed that there need be no social classes. “Keep our workshops in Europe,” was the phrase. American social life was centered on the family and, foremost, the father. When the founders spoke of independence, they were speaking mostly of the head of a household’s independence from other men. That is, each father was a “miniature king.”2

It was the idea of men being dependent on a patriarch they were not related to, and the loss of virtue that dependence would entail, that most frightened early republicans about an economic conversion to manufacturing. Hamilton explicitly addressed this concern in his report, saying:

In places where those institutions [factories, mills] prevail, besides the persons regularly engaged in them, they afford occasional and extra employment to industrious individuals and families, who are willing to devote the leisure resulting from the intermissions of their ordinary pursuits to collateral labours, as a resource of multiplying their acquisitions or [their] enjoyments. The husbandman himself experiences a new source of profit and support from the encreased industry of his wife and daughters; invited and stimulated by the demands of the neighboring manufactories… Besides this advantage of occasional employment to classes having different occupations, there is another of a nature allied to it [and] of a similar tendency. This is — the employment of persons who would otherwise be idle (and in many cases a burthen on the community), either from the bias of temper, habit, infirmity of body, or some other cause, indisposing, or disqualifying them for the toils of the Country.

Translation: Wage labor is something that is going to be done mostly by women and children in their downtime so that they, and their fathers, will have some extra spending money. If a man is engaged in wage labor, it is because he is crippled, sick, a drunk, or just does not have a good Protestant work ethic. This became a commonly held belief, and later many philanthropic groups in American cities would attribute the condition of the poor solely to moral failing or racial inferiority.

Lincoln and Marx

It is interesting that the term “wage-slavery,” which would later become a favorite of Marxist polemics, predates Karl Marx’s active period by almost a century. The term appeared in a number of philosophic discussions and had been firmly adopted by American labor movements by the 1820s.

There has, of course, always been poverty in every civilization. American views on poverty began to clearly bifurcate into a camp that believed poverty was a result of moral failing, and another that viewed it as systemic. Early labor activists and union leaders like Thomas Skidmore – who founded the Workingmen’s Party in New York circa 1828 – framed the issues of the working poor in staunchly populist republican terms. Skidmore and his cohorts argued that American workers were being “kept in a humble state of dependence.”

Thousands of our people today in deep distress and poverty, dependent for their daily subsistence on a few among us, whom the unnatural operation of our own free and republican [institutions], as we are pleased to call them, has thus arbitrarily and barbarously made enormously rich… For he in all countries is a slave who must work more for another than that other must work for him. It does not matter how this state of things is brought about; whether the sword of victory hew down the liberty of the captive…or whether the sword of want extort our consent… through a denial to us of the materials of nature.3

Labor unions saw a rise in membership in the antebellum United States, but early republicans were encouraged by the fact that many wage-earning jobs were being filled by an increasingly large influx of immigrants. The Jacksonian Democrats came to identify with craftsmen, subsistence farmers, and later southern plantation owners. Unions were primarily local organizations until national organizations such as the Knights of Labor and the more conservative American Federation of Labor emerged in the 1880s.

Without getting into too much detail — since this is a history of the minimum wage and not specifically a history of unionization4 — unions before the Civil War were either conservative tribal groups that protected jobs in a specific trade in a specific community and usually for a specific ethnicity, or they were radical groups dedicated to the complete abolition of the wage labor institution.5

Even though many of the labor groups established around this time spoke in the language of “wage slavery” and “producers and non-producers,” subscribed to a labor theory of value, and even identified as socialists, they were not Marxist. Marxism, with its metahistory of dialectical materialism and its atheism, did not permeate working-class culture with much real success until the advent of the Industrial Workers of the World in the 20th century.

Past the Civil War and into the 20th century, from Richard T. Ely to Martin Luther King, progressive leadership was more likely to be possessed of an ecclesiastic teleology. It came to hold ideas espoused by Lincoln in his 1860 campaign, the motto of which was “free soil, free labor, free men.” Speaking in 1859, Lincoln put forth the philosophy that the GOP nominally identifies with to this day:

[I] do not deny that there is, and probably always will be, a relation between labor and capital. The error… is in assuming that the whole labor of the world exists within that relation… A large majority belong to neither class – neither work for others, nor have others working for them. Even in all our slave States except South Carolina, a majority of the whole people of all colors are neither slaves nor masters. In these free States, a large majority are neither hirers nor hired. Men, with their families – wives, sons and daughters – work for themselves, on their farms, in their houses, and in their shops, taking the whole product to themselves, and asking no favors of capital on the one hand, nor of hirelings or slaves on the other. It is not forgotten that a considerable number of persons mingle their own labor with capital – that is, labor with their own hands and also buy slaves or hire free men to labor for them; but this is only a mixed, and not a distinct, class…

The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him. This, say its advocates, is free labor – the just, and generous, and prosperous system, which opens the way for all, gives hope to all, and energy, and progress, and improvement of condition to all. If any continue through life in the condition of the hired laborer, it is not the fault of the system, but because of either a dependent nature which prefers it, or improvidence, folly, or singular misfortune.

As president, Lincoln firmly put his money where his mouth was by emancipating the Confederacy’s slaves and advancing the United States’ greatest act of industrialization to date – the construction of the transcontinental railroad. Lincoln was also responsible for the passage of the United States’ most “socialist” measure to that point – the Homestead Act of 1862.

The act allowed any person (including freed slaves) who was 21 years old or the head of a household, and who had not taken up arms against the United States, to apply for a free land grant in the American West. The grant could subsequently be retained as private property so long as, after a period of five years, it could be proven that the new owner had “improved” it — that is, by labor. This was a major effort to preserve the population of the United States as independent, and therefore morally virtuous, republican yeomen.6

Revolutions, Strike Waves, and Damned Cowboys

As industrialization and immigration multiplied the population of the United States over the ensuing decades, the country saw a rise in poverty and urban blight. Jacob Riis’s seminal work of photojournalism, How the Other Half Lives, showcased the squalid living conditions of New York City’s urban working poor. Riis wrote that the impetus for his investigation was the disastrous Franco-Prussian War (1870-71), which saw the destruction of the Second French Empire and the brief establishment of the doomed Paris Commune. He shared a fear with the Association for the Improvement of the American Poor, which wrote in its 44th Annual Statement that “reform may come in a burst of public indignation destructive to property and to good morals.” Riis went on to describe revolutions as representing

one solution of the problem of ignorant poverty vs ignorant wealth that has come down to us unsolved, the danger-cry of which we have lately heard in the shout that never should have been raised on American soil — the shout of “the masses against the classes” — the solution of violence. There is another solution, that of justice. The choice is between the two.

Indeed, the period of American history following Reconstruction saw a number of strikes culminate in violent clashes between unions and both private and government authorities. The Great Railroad Strike of 1877 occurred when the B&O Railroad company cut wages twice in a single year. It unleashed a cacophony of death and destruction between union members and militia in West Virginia, Maryland, Pittsburgh, and Chicago. The Haymarket Strike in Chicago took the lives of seven police officers and an unknown number of protestors in 1886. A 1969 report by Philip Taft and Philip Ross asserted that the United States had the most violent labor history of any industrial country. They wrote of a strike wave around the turn of the century:

In the 1890s violent outbreaks occurred in the North, South, and West, in small communities and metropolitan cities, testifying to the common attitudes of Americans in every part of the United States… Serious violence erupted in several major strikes in the 1890’s, the question of union recognition being a factor in all of them.7

Plainly enough, the increased interdependence of the states brought about by Reconstruction and the railroad, compounded by the new arrival of millions of immigrants and the economic stratification of society these developments entailed, had led members of the working class to unionize in an effort to guarantee themselves living wages.

The language and deeds of these people began to veer markedly away from the ideals of republican virtue and toward populist democracy. The efforts of workers to organize were sharply contested by owners of capital. Unions’ only recourse — the strike — often reached a fever pitch of mayhem and lawlessness. It was against this backdrop that President William McKinley was assassinated by Leon Czolgosz, a first-generation American and an anarchist. At his execution Czolgosz said, “I killed the president because he was an enemy of the good people – of the working people.”

In an attempt to isolate the extremely popular Theodore Roosevelt, the Republican Party had given him a very public but very powerless position: the vice-presidency. When McKinley was killed, Republican senator Marcus Hanna fumed, “Now, look, that damned cowboy is president of the United States!” Although Roosevelt came from a family of austere New Yorkers, he was able to project the image of a self-made common man. With a hyper-masculine persona and an air of approachability, Roosevelt became the first president to intercede in a strike on a union’s behalf.

In 1902 he arbitrated a coal-mining strike between a wildcat union and JP Morgan, which concluded with the denial of union recognition but the payment of much higher wages with shorter work days. The shorter workday and workweek often took primacy in union demands, especially in this period, whereas control of the workplace was the major concern of early republicans. Along with embarking on extensive “trust busting” campaigns that saw the dissolution of the nation’s largest railroad and oil monopolies, Roosevelt committed himself to creating a “Square Deal” between corporations, workers, and consumers.

The New Deal and the First Minimum Wage Laws

The notion that government would intervene to guarantee workers just compensation was first put into legislation by the Commonwealth of Massachusetts in 1912. The law was designed to protect working women, entitling them to be paid a living wage as determined by a special board. Similar laws were passed by fourteen states, the District of Columbia, and Puerto Rico between 1912 and 1919. Then, in 1923, a challenge to the minimum wage law in Washington, DC, was heard by the Supreme Court in Adkins v. Children’s Hospital. The Court found that the District’s law violated a right to negotiate one’s own contracts implied by the Due Process Clause of the Fifth Amendment. Chief Justice William Howard Taft wrote, dissenting:

Legislatures in limiting freedom of contract between employer and employee by a minimum wage proceed on the assumption that employees, in the class receiving the least pay, are not upon a full level of equality with their employer and in their necessitous circumstances are prone to accept pretty much anything that is offered. They are peculiarly subject to the overreaching of the harsh and greedy employer.

Striking down the DC minimum wage did not automatically repeal laws other states had enacted. The enforcement of these laws, however, was substantially weakened by the knowledge that the Supreme Court could strike them down in any subsequent lawsuit. Most of the minimum wage requirements were supervised by a statutory board on a case-by-case basis rather than legislated by a broad statute – as the current federal minimum wage is. There were some states, like Wisconsin, that had a minimum wage set by statute, but this was rare, and the minimum wage almost always pertained exclusively to women and children. Enforcement was often perfunctory, and oftentimes the penalty for noncompliance would be as mild as public shaming. Later studies of the Massachusetts law found that although it led to a wage increase for between 20 and 25 percent of working women, it also led to a rise in unemployment, and in many cases the replacement of girls with women and women with men.8

As the United States economy evolved, and its society changed, so too did the demands of its people. By the time of the 1929 stock market crash that caused the Great Depression, common Americans had fully made the transition from espousing “republican values” to demanding democracy and equality, and the two sets of values were distinct. Most labor unions were not in favor of abolishing the wage-labor system and moving toward agrarian autonomy; rather, they sought to win their members the highest possible wage for the fewest hours of work.

When the Depression began, the Hoover administration’s solution was for the United States to retreat into itself, isolating its economy from the toxic assets in Europe. The results were catastrophic: by the time Hoover left office in early 1933, prices were still deflating to the point where consumption was stifled. Exacerbating the urban manufacturing crisis was the devastation of the Midwest’s agriculture by drought and famine. Unemployment hovered above 26 percent.

This was the domestic circumstance Franklin Roosevelt was elected into. The international circumstance was worse. The masses had overthrown the classes in Russia and established a dictatorship of the proletariat. Josef Stalin had exiled Leon Trotsky and discarded his ideal of global revolution, focusing instead on “socialism in one country.” He was purging his government, collectivizing agriculture, whipping peasant farmers into industrial laborers, assassinating his opponents, forcibly deporting ethnic and religious minorities, and inflicting untold horror on the whole of the population.

Meanwhile, the German nation was rapidly descending into madness. On the fifth anniversary of the armistice that ended the First World War, Adolf Hitler attempted a putsch to overthrow the Weimar Republic. It was defeated. After being released from prison, Hitler and the National Socialists continued to engage in political activity by propagating canards: that the government was being undermined by Bolsheviks, that these were the same people who had “stabbed Germany in the back” to end the war, and that Jews were responsible for communism, the depression, and Germany’s defeat. The Nazis politicked their way into power, playing on increased political violence and pandering to and balancing the myriad German parties against each other. Hitler became chancellor in 1933, was granted plenary powers after the Reichstag fire that same year, and outlawed all other political parties.

This may seem a digression for an essay about the minimum wage, but it is not. Popular thinking was that the American Experiment itself was being assailed by the tribulations of the day. Material economic factors were seen to have impelled the revolutions in Russia and Germany. It was, in large part, because of poverty that the Nazis and Bolsheviks could make scapegoats out of any group of “undesirables” they chose. Isaiah Berlin wrote in retrospect in 1955:

The most insistent propaganda in the 1930s declared that humanitarianism, and liberalism, and democratic forces were played out and that the choice now lay between two bleak extremes: Communism and Fascism – the red or the black. To those who were not carried away by this patter the only light in the darkness was Mr. Roosevelt and the New Deal in the United States. At a time of weakness and mounting despair in the democratic world, Mr. Roosevelt radiated confidence and strength. He was the leader of the democratic world and even today of all the leaders of the 1930s upon him alone no cloud has rested – neither on him nor on the New Deal, which to European eyes still looks a bright chapter in the history of mankind. It was true that his great social experiment was conducted with an isolationist disregard for the outside world, but it was psychologically intelligible that America, which had come into being in reaction to the follies and evils of a Europe perpetually distraught by religious and national struggles, should try to seek salvation undisturbed by the currents of European life, particularly at a moment when Europe seemed about to collapse into a totalitarian nightmare.

Roosevelt and his “brain trust” defined the New Deal in “Three R’s” — Relief, Recovery, and Reform. When Washington was president, the State Department consisted of Thomas Jefferson and his secretary, but Roosevelt added dozens upon dozens of “Alphabet Agencies,” dedicated to those three R’s. Many of them were created in his first 100 days in office. The right of workers to unionize, engage in collective bargaining, and strike was first made law in 1935 by the National Labor Relations Act. Three years later, a bill called the “Fair Labor Standards Act” was signed into law.

The Fair Labor Standards Act established a minimum wage of $0.25 per hour and a maximum 44-hour workweek, required that time and a half be paid for hours in excess of that, and prohibited “oppressive child labor.” The sponsoring senator, Hugo Black – later a Supreme Court Justice – had initially advocated a thirty-hour workweek and a minimum wage of $0.40, but this was modified.

Of interest is that organized labor, particularly the American Federation of Labor, was opposed to the adoption of a minimum wage for men. Previous minimum wage laws had been targeted toward female employees, and the AFL was still committed to the ideal of a one-income family. Additionally, the AFL had already negotiated higher wages for its members. Nevertheless, once it was made law, the AFL fought vociferously to amend and preserve it. The 1938 minimum of a quarter an hour is worth about $4 in today’s dollars.

Increases and Problems

The major problem with the minimum wage is that it is not tied to any external benchmark. It does not automatically adjust to inflation or — as wages are meant to reflect — workers’ productivity. Rather, Congress must vote on each increase. The most substantial extensions of coverage occurred during the Kennedy administration in 1961, when coverage was broadened to retail, service, transit, construction, and gasoline station service employees; and again in 1966, under Lyndon Johnson’s “Great Society” programs, when coverage was broadened to “state and local government employees of hospitals, nursing homes, and schools, and to laundries, dry cleaners, and large hotels, motels, restaurants, and farms.”

Further amendments extended coverage to the remaining federal, state and local government employees who were not protected in 1966, to certain workers in retail and service trades previously exempted, and to certain domestic workers in private household employment.

Rather than go through the political maneuvering that took place for each raise, let’s take a peek at a few charts. The first shows the minimum wage with its increases converted into real dollars.

[Figure 3: The minimum wage over time, converted into real dollars]

The minimum wage reached its maximum value in real dollars in 1969, when it was $1.60 per hour, or $10.56 in 2014 dollars. Extrapolated over fifty 40-hour work weeks, that is about 90 percent of the poverty level.
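
For anyone who wants to check that arithmetic, here is a minimal sketch in Python. The comparison point, the 2014 federal poverty guideline for a family of four (about $23,850), is my assumption; the article’s source may use a different threshold or deflator, so the exact percentage could differ slightly.

```python
# Minimal sketch of the arithmetic above. The comparison point used here,
# the 2014 federal poverty guideline for a family of four (~$23,850),
# is an assumption; the article does not say which threshold its source uses.

hourly_2014_dollars = 10.56                 # peak real value cited above
hours_per_week = 40
weeks_per_year = 50
annual_earnings = hourly_2014_dollars * hours_per_week * weeks_per_year   # 21,120

poverty_guideline_2014 = 23_850             # assumed comparison point
print(f"Annual earnings: ${annual_earnings:,.0f}")
print(f"Share of poverty guideline: {annual_earnings / poverty_guideline_2014:.0%}")
# -> about 89%, i.e. roughly the 90 percent figure cited above
```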

Since it was first established, the minimum wage has been raised 28 times. In the ’60s and ’70s it was increased nearly every year, but it was not raised between 1981 and 1991, or between 1997 and 2007. Dean Baker asserts that the reason for the dramatic increase in the minimum wage between 1938 and 1968 was the corresponding increase in worker productivity, and that the cessation of regular increases allowed the gains to be largely wiped out by inflation. This chart illustrates what a minimum wage tied to non-farm productivity would look like:

[Figure 4: The minimum wage if it had been tied to non-farm productivity growth]

You read that correctly. If the minimum wage were adjusted to reflect the productivity of American workers who did not work on farms, it would be $16.24 per hour. If it incorporated farm workers as well, it would be $21.75 per hour.
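
To make the indexing mechanics concrete, here is a rough Python sketch under a stated assumption. The cumulative productivity multiplier is illustrative rather than the actual series Baker uses; swapping in the real index, and choosing between farm and non-farm measures and deflators, is what yields the $16.24 and $21.75 figures.

```python
# Rough sketch of a productivity-indexed minimum wage. The cumulative
# productivity multiplier below is an illustrative assumption, not the
# actual series Baker uses; the real result depends on which productivity
# measure (farm vs. non-farm) and which deflator are chosen.

def productivity_indexed_wage(base_real_wage, productivity_base, productivity_now):
    """Scale a base-year real wage by cumulative productivity growth since then."""
    return base_real_wage * (productivity_now / productivity_base)

# Start from the late-1960s peak expressed in 2014 dollars (about $10.56/hr,
# per the chart above) and assume, purely for illustration, that productivity
# has roughly doubled since then.
indexed = productivity_indexed_wage(10.56, productivity_base=1.0, productivity_now=2.0)
print(f"${indexed:.2f} per hour")
# -> $21.12 under this assumed doubling, in the same ballpark as the
#    $16.24-$21.75 range cited above
```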


1. If you dig through the Census, you’ll find the 1810 United States had a population of just a hair above 7 million, compared to about 300 million today. Therefore less than 2 million people were employees in 1810, compared to almost 150 million in 2010.

2. Gordon Wood, The Radicalism of the American Revolution (New York: Vintage, 1993).

3. Thomas Skidmore, “The Rights of Man to Property!” (New York, 1829).

4. Although it would be difficult to attribute the enactment of things like the minimum wage, laws against child labor, the eight-hour day, and the forty-hour week to forces that had nothing to do with organized labor.

5. For further reading on these subjects, see Noel Ignatiev, How the Irish Became White (New York: Routledge, 1995).

6. Eric Foner, Free Soil, Free Labor, Free Men (Oxford University Press, 1995).

7. Philip Taft and Philip Ross, “American Labor Violence: Its Causes, Character, and Outcome,” a report to the National Commission on the Causes and Prevention of Violence, 1969.

8. John Peterson and Charles Stewart, Employment Effects of Minimum Wage Rates (Washington, DC: American Enterprise Institute, 1969).