The current debate surrounding the minimum wage in the United States is characterized in part by disagreement about its original purpose. Those who support raising the minimum wage (usually to somewhere between $10 and $15 per hour) claim that it should provide workers with a reasonable standard of living. According to many opponents of such legislation, however, the minimum wage is meant to be a starting point for those with little job experience — a point from which workers can grow the value of their labor — and as such should not be expected to provide workers with a living wage. It will not be possible to make progress in this debate without first agreeing on what the minimum wage is for and what we can expect it to provide, which requires looking at the history of minimum wage legislation.
Minimum wage laws can be traced back to seventeenth-century England. In 1604, King James I approved an act that set a wage floor for textile workers. The next formal minimum wage laws were enacted in 1894 in New Zealand and in 1896 in the Australian colony of Victoria. The reason for the passage of these laws, as one legislator explained, was the belief that “it is absolutely imperative that municipalities and government bodies generally should fix a minimum wage…at a rate sufficient to enable the [employees] to provide reasonable comforts for themselves and their families.” This legislation may have been influenced in part by the views of Australian Cardinal Patrick Francis Moran, who expressed in 1891 that wages should allow for “the frugal support of [a worker’s] wife and family.”
Franklin D. Roosevelt implemented the first federal minimum wage in the United States as part of his New Deal, signing the Fair Labor Standards Act in 1938. FDR explicitly stated that this minimum wage was meant to be a living wage: “It seems to me to be equally plain that no business which depends for existence on paying less than living wages to its workers has any right to continue in this country.”
History clearly shows that the intention of minimum wage laws has always been to provide a living wage for society’s most vulnerable workers. The minimum wage is not meant only for middle-class suburban teenagers, as some would have us believe, but rather for everyone. In fact, FDR emphasized that his legislation was aimed at a living wage for “all workers, the white collar class as well as the men in overalls.” The question then becomes: what is a living wage?
Merriam-Webster defines a living wage as “a wage sufficient to provide the necessities and comforts essential to an acceptable standard of living.” For all FDR’s talk of a “living wage” (which he similarly defined as “the wage of a decent living”), the minimum wage he set was 25 cents per hour, the equivalent of about $4.20 today. Clearly, this is not enough to cover the costs of a worker’s basic “necessities and comforts,” raising the question of what exactly FDR meant by “a decent living.”
Cardinal Moran stated that a living wage would allow for the “frugal support” of a worker’s family. This seems to be a reasonable definition, but it doesn’t clarify just how frugal we should expect people to be. Should a full-time minimum wage worker be able to afford rent on a one-bedroom apartment, or should we expect them to find a more affordable living arrangement, such as living with roommates or family members? If they are a single parent, should we expect them to be able to provide for their children?
Forbes contributor Tim Worstall has a different conception of a living wage, arguing that $4.20 per hour is “indeed a living wage as it would be possible to live upon it.” He goes on to say, “I don’t say live well, or live not in poverty, but that $8,400 a year puts you into the top 20% of all global incomes.” Worstall’s very literal interpretation of the term “living wage” is interesting to consider, but it seems to deliberately ignore the fact that the cost of living varies around the world, and we would therefore expect wages in the United States to reflect its considerably higher cost of living.
President Obama recently made the case that the federal minimum wage should be raised, arguing “If you truly believe you could work full-time and support a family on less than $15,000 a year, go try it.” While Obama believes that the minimum wage should be a living wage for workers and their families, the $10.10 minimum wage he endorses would still leave a single parent with two children at or very close to the poverty line.
This all brings us back to the troubling question of what exactly we can or should expect the minimum wage to provide, and at what point we are demanding too much. Economist Alan Krueger concludes that it is reasonable to support a federal minimum wage of up to $12 per hour, but that a $15 minimum wage “would put us in uncharted waters, and risk undesirable and unintended consequences.”
A few cities have already made the leap of faith into murky waters: New York, Seattle, San Francisco, and Los Angeles have all passed legislation to incrementally raise their minimum wages to $15 per hour. While it is too early to predict the long-term effects of this legislation, some observers have noted a drop in restaurant-industry employment between January and June. While Tim Worstall and other classical liberals interpret this as evidence that any minimum wage harms the economy by reducing potential employment, a group of 210 top economists reminds us that “the weight of evidence from the extensive professional literature has, for decades, consistently found that no significant effects on employment opportunities result when the minimum wage rises in reasonable increments.” They argue that employment rates in cities like Seattle will not suffer in the long term, and thus endorse a $15 federal minimum wage.
It is worth noting that Seattle, New York, San Francisco, and L.A. all have relatively high costs of living. Despite his opposition to a $15 federal minimum wage, Alan Krueger admits, “some high-wage cities and states could probably absorb a $15-an-hour minimum wage with little or no job loss.” A $15 minimum wage might seem reasonable in these cities because it would bring the cost of rent to about a third of a full-time minimum wage worker’s overall income, which is the recommended rent-to-income ratio. That ratio is currently as high as 56 percent in Los Angeles, where the minimum wage is $9.00 per hour. The issue of regional cost-of-living discrepancies raises the question of whether it is useful to legislate a federal minimum wage at all, or whether it is better to require individual cities and states to establish minimum wages according to their respective costs of living.
To be sure, some of the opposition to raising the minimum wage comes from those who share Krueger’s and Worstall’s concern that it will have a negative impact on the economy. While these are valid concerns that should be addressed, the issue of whether minimum wage jobs were “intended” to support workers and their families should be laid to rest.
It is clear that throughout history, minimum wage legislation has consistently attempted to set a wage floor that could allow the most vulnerable and disadvantaged workers to meet their basic needs. However, there seems to be disagreement (or at least a lack of clarity) about what actually constitutes a living wage. Because raising the minimum wage is supported by most Americans and momentum behind it is unlikely to dissipate anytime soon, a more productive conversation should focus not on whether the minimum wage should exist, but rather on finding a reasonable way to ensure that it upholds its original purpose: to provide a decent standard of living, whatever that may be.