Minimum Wage Research Paper Starter

Minimum Wage

This article presents an overview of the debate over the rate of the federal minimum wage in the United States. The Fair Minimum Wage Act of 2007 established that the federal minimum wage would increase from $5.15 an hour in 2006 to $7.25 an hour in three stages by 2009. The debate about whether raising the minimum wage benefits low-skilled workers as a group has partially reverted to the form it took in the Progressive Era (roughly 1912 to 1923): what was then termed a "family wage" has recently been recast, more modestly, as an adequate living wage for an individual. Much of this debate implicitly concerns whether the private sector or the public sector should bear the burden of relieving poverty, and whether the government can (or should) tweak the economic system to encourage the market to absorb low-skilled workers effectively. According to revisionist studies of recent decades, the answer to both elements of this question is a mildly qualified "yes."

Keywords: Card-Krueger Theory; Deadweight Loss; Earned Income Tax Credit; Fair Minimum Wage Act of 2007; Family Wage; Liberty of Contract Principle; Living Wage; Monopsonistic Competition; Oppositional Subculture; Perversity Thesis; Poverty Line; Prevailing Wage; Real Wages; Reservation Wage

Overview

Historical Background

Before 1938, attempts to legislate a minimum wage at the federal level met resistance in the courts based on a "liberty of contract" principle. In short, the government was not prepared to interfere with the prerogatives of employers. Early state legislation dealing with a minimum wage requirement largely addressed the interests of female, immigrant, and child employees who worked in sweatshops and were not represented by unions (Herbert, 2006). Once the Depression arrived, the 25 percent unemployment rate was considered compelling grounds for state intervention and the establishment of a federal minimum wage (Levin-Waldman, 1998).

The real value (or purchasing power) of the minimum wage increased rapidly after 1949 and reached its peak in 1968. Its real value fell 20 percent between 1997 and 2006 and about 40 percent between 1968 and 2006. In other words, the minimum wage in 2006 would have been between $9 and $14 an hour had it been indexed to inflation beginning in 1968 to account for the rising cost of living. In the 1950s, the minimum wage was set at half the average wage of a production (that is, non-managerial, non-supervisory) worker; the prevailing wage of a production worker is now about $17 an hour. The idea of indexing the minimum wage to inflation has only recently been re-examined by Congress and the Democratic Party, and several states have enacted legislation to implement this policy (Pratsch & Sheth, 1999, p. 466; Uchitelle, 2006).
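The indexing arithmetic behind these figures can be sketched as follows. The 1968 nominal minimum of $1.60 an hour is historical; the CPI values are rounded approximations used only for illustration, not official data.

```python
# Illustrative sketch: indexing the 1968 minimum wage to 2006 dollars.
# The CPI-U figures below are rounded approximations (an assumption).
CPI_1968 = 34.8   # approximate annual average CPI-U, 1968
CPI_2006 = 201.6  # approximate annual average CPI-U, 2006

nominal_1968 = 1.60  # federal minimum wage in 1968, dollars per hour
indexed_2006 = nominal_1968 * (CPI_2006 / CPI_1968)
print(f"1968 minimum indexed to 2006 dollars: ${indexed_2006:.2f}")
```

The result, about $9.27 an hour, lands at the low end of the $9-to-$14 range cited above; the higher end of that range reflects indexing to average wages rather than to prices.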

Between 1949 and 1977, Congress increased the minimum wage whenever inflation caused its real value to fall below the level of income that one worker would require to keep a family of three at or near the poverty line. Since the early 1980s, however, a full-time job paying minimum wage has usually been just enough (about $15,000 a year now) to keep a single person with no children above the poverty line. In short, inequality of income has grown substantially since the 1970s (Dalmut, 2005, pp. 94-98). In 2012, the average restaurant-industry CEO earned 788 times what a minimum-wage worker made in a year (Eisenbrey, Sabadish, & Essrow, 2013).
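The roughly $15,000 annual figure follows from simple full-time arithmetic at the $7.25 federal rate:

```python
# Annual earnings for a full-time worker (40 hours/week, 52 weeks/year)
# at the $7.25 federal minimum wage.
hourly = 7.25
annual = hourly * 40 * 52
print(f"${annual:,.0f} per year")  # $15,080, roughly the $15,000 cited
```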

Minimum wage earners were worse off in 2013 than in 1997 or 1968, though their circumstances have improved since the implementation of the Fair Minimum Wage Act of 2007 (Elwell, 2013). According to one economic study, 15 percent of American workers earned the minimum wage in the 1980s; by 2012, that figure had declined to 4.7 percent (Rocheteau & Marat, 2007; U.S. Bureau of Labor Statistics, Minimum wage workers, 2013). Porter (2007) suggests that these figures more than double the actual decline in the number of minimum-wage workers. The discrepancy is probably explained by the fact that tens of millions of workers make slightly less or slightly more than the minimum wage. Tipped workers (such as servers) can legally be paid less than $2.50 an hour if gratuities raise their total pay to the minimum wage. Short-term agricultural workers, some minors, and some workers with disabilities can legally be employed for less than the minimum wage, and more than 1.5 million workers now earn less than the federal minimum.

Increasing the Minimum Wage

Between 13 and 27 million workers (10-20 percent of the workforce) received a raise when the Fair Minimum Wage Act was fully implemented in 2009 (Pratsch & Sheth, 1999; Economic Policy Institute, 2007). By one estimate, a third of the American workforce earned less than $7.25 an hour in 2006 (Uchitelle, 2006). About 15-20 percent more women and ethnic minorities than white males benefited from the raise (Economic Policy Institute, 2007). The vast majority of minimum-wage workers are employed in the service sector, which tends to be very stable: McDonald's and Burger King are unlikely to relocate to another state or nation if the minimum wage increases locally or nationally (Porter, 2007).

Some employers who pay roughly the minimum wage have consistently kept their wages slightly higher than the local standard in order to attract good employees. For similar reasons, other employers will not pay sub-minimum wages to workers (for example, to out-of-state minors) even when doing so would be legal. The theory is that a worker's perception of receiving a fair or advantageous wage will improve performance and reduce employee turnover, which is typically high and relatively costly in the service sector. This principle is often described in terms of shaping workers' incentives and acknowledging their reservation wage. A "reservation" wage denotes the minimum level of pay that a worker is willing to accept for a specific type of work. One might, for example, be willing to accept less pay (to lower one's reservation wage) for a job that is undemanding, familiar, or otherwise convenient (e.g., one that reduces travel expenses). A raise in the minimum wage is generally thought to attract more semi-skilled or semi-experienced workers to the low-paying job market and thereby increase competition for low-end jobs (Simon & Kaester, 2004; Falk, Fehr, & Zehnder, 2005).
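The reservation-wage decision rule described above amounts to a simple comparison, sketched below with purely hypothetical numbers: conveniences such as a short commute effectively lower the wage a worker will accept.

```python
# A worker accepts an offer only if the effective wage clears his or her
# reservation wage; convenience factors (short commute, familiar work)
# effectively lower that threshold. All dollar figures are hypothetical.
def accepts(offer, reservation_wage, convenience_discount=0.0):
    """Return True if the offer meets the (adjusted) reservation wage."""
    return offer >= reservation_wage - convenience_discount

print(accepts(7.25, 8.00))                             # False: offer too low
print(accepts(7.25, 8.00, convenience_discount=1.00))  # True: nearby job lowers the bar
```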

There is statistical evidence that substantial increases in the minimum wage have triggered relatively consistent patterns of unemployment among the least-skilled workers. The 90 percent raise in the minimum wage in 1947 resulted in about 15,500 job losses; the increase in 1972 resulted in 90,000 job losses; and after 1974, 120,000 long-term job losses can be linked to increases in the minimum wage. These effects tend to last from about one to five years (Wolfson & Belman, 2003). This pattern has been termed the "perversity thesis": a social policy meant to help poor workers inadvertently harms some of them (Pratsch & Sheth, 1999, p. 465).

Alternative approaches to increasing the minimum wage for the purposes of combating poverty include expanded tax credits (which exist in the form of the Earned Income Tax Credit for low-income workers with two or more children), direct or indirect subsidies to workers, and subsidies to employers. All of these approaches transfer more of the burden for relieving poverty from the private sector to public resources.

The Living Wage

About 150 cities or counties have enacted living wage measures that account for local prices. These measures usually apply only to workers who hold state contracts, such as home health-care workers, garbage collectors, and security staff (Shipler, 2004, p. 291; Greenhouse, 2007). Housing expenses in large urban areas are often twice as high as in rural areas of the same state (Dalmut, 2005). Establishing a mandatory living wage is not an alternative to increasing the minimum wage but rather a significant extension of essentially the same approach, one that primarily benefits semi-skilled workers.

About a hundred years ago, unions attempted to establish a similar standard under the principle of a "family wage," which was supposed to provide a male worker with enough income to support a wife and child. Most married women then did not earn income, and the notion of a family wage is now largely outdated. The nation's first statewide living wage legislation was enacted in Maryland: virtually all employees holding state contracts must receive a minimum of $8.50 an hour in rural areas and $11.30 an hour in the Baltimore-Washington corridor. Similar local legislation elsewhere requires an hourly wage as high as $14.75 for workers holding a state contract (Greenhouse, 2007).

A Washington State University study concluded that minimum wage increases at the state level have benefited 97 percent of low-skilled workers (Egan, 2007). The bulk of the evidence is not quite that encouraging. Until the early 1990s, the long-standing conventional argument among economists had been that a substantial increase in the minimum wage would price low-skilled workers out of the lowest level of the labor market; in other words, raising the minimum wage would encourage employers to eliminate the least-productive workers or stop hiring new entry-level workers. The recent revision of this argument, known as the Card-Krueger thesis (or paradox), asserts that a higher minimum wage may not necessarily increase the rate of unemployment and can have non-detrimental or even beneficial results across the board; that is, in terms of wages, fringe benefits (such as health care), and employer productivity. Prices almost invariably increase after a rise in the minimum wage, but according to most research, overall sales in the service sector have not suffered significantly.

The Card-Krueger Thesis: A Paradox?

This result is based on specific studies of the fast-food industry after state minimum-wage rates were increased by about 10 percent circa 1990. The conventional logic held that a 10 percent raise in the minimum wage would produce a 1 percent rise in unemployment among adults and a 2 or 3 percent rise among teens. Those who have investigated the Card-Krueger model further have usually concluded that a large increase in the minimum wage results in an extensive reorganization of the employment patterns of low-skilled workers, but not in a decrease in overall employment or earnings. Perhaps the most questionable element of the Card-Krueger model is that it applies best to states that were relatively prosperous before a higher minimum wage was established, whereas the conventional model may apply better to poorer states or to times of severe recession.
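The conventional elasticity logic in this paragraph can be made concrete: the predicted percent change in employment is the assumed elasticity multiplied by the percent change in the wage floor. The elasticities below restate the conventional assumptions cited in the text, not empirical results.

```python
# Conventional (pre-Card-Krueger) back-of-the-envelope calculation:
# percent change in employment = elasticity * percent change in minimum wage.
def employment_change(wage_increase_pct, elasticity):
    """Predicted percent change in employment under an assumed elasticity."""
    return elasticity * wage_increase_pct

print(employment_change(10, -0.1))  # adults: about a 1 percent decline
print(employment_change(10, -0.2))  # teens:  about a 2 percent decline
```

Card and Krueger's contribution was precisely to question whether these negative elasticities hold in practice for moderate wage increases.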

The number of scholarly studies of the minimum wage quadrupled around the time that Card and Krueger (1994; 1995) published their influential and controversial early work (Wolfson & Belman, 2003). Curiously, David Card and Alan B. Krueger had intended to confirm the consensus hypothesis among economists but arrived at largely antithetical conclusions. The Clinton administration used their work to support its efforts to raise the minimum wage.

Card and Krueger (1994; 1995) initially studied the effect of New Jersey's 1992 minimum-wage increase on 410 fast-food restaurants, using similar restaurants in the neighboring state of Pennsylvania, which did not increase its minimum wage, as a comparison group. They concluded that the New Jersey fast-food outlets were able to absorb the higher cost of labor by raising prices by 4 percent, that there was no loss of non-wage benefits (e.g., health care benefits) for workers, and that fast-food chains in New Jersey were able to open new franchises at their usual rate (Levin-Waldman, 1997; Ross, 2000; Porter, 2007). Only those New Jersey stores that already paid more than the new minimum wage employed fewer workers than before the rate was increased. Other stores hired or employed more employees than they had previously, and employment rates in New Jersey grew faster than those in Pennsylvania (Levin-Waldman, 1997; Porter, 2007).

This surprising result called for a revision of conventional logic in terms of "monopsonistic" competition, the inverse of the better-known condition of monopoly: a monopsonistic market is one in which there is only one buyer and many sellers (in this case, of labor). Under the conventional competitive model, a mandated wage above the prevailing wage should reduce employment; under monopsony, however, a modest minimum-wage increase can leave employment unchanged or even raise it. Card and Krueger did not call for a thorough revision of major elements of conventional economic theory, but their study opened the floodgates for such revisionism (Card & Krueger, 1995; Ross, 2000).

One possible and very straightforward interpretation of the Card-Krueger thesis is that the minimum wage has been excessively low since the 1980s in terms of the profits its recipients produced for employers. On this interpretation, employers...
