The Evolution of Compensation in a Changing Economy
Originally Posted: January 30, 2003

Over the course of the 20th century, American workers witnessed an evolution in compensation. Throughout the century, changes in methods of pay were usually stimulated by some form of imbalance caused by a crisis or demographic shift. For the 20th-century American worker, no crisis was greater than the Great Depression, a watershed in how employers paid their workers. But growth in unionization and the increase in the number of working women, among other shifts, also contributed to changes in pay practices. Payment for labor services has evolved from simple piecework pay to sophisticated contractual compensation packages. At the turn of the 20th century in America, few workers received anything more than wages as compensation for their labor services. By the close of the century, however, a typical worker received more than 25 percent of compensation in the form of benefits. These benefits, which were termed fringe benefits for most of the century, consisted of employer-paid items such as health, life, and unemployment insurance; retirement and savings plans; and holiday and vacation leave. Today, the benefit components making up the compensation package continue to evolve, with variable pay plans--such as profit-sharing and stock options--growing in importance. Additionally, emerging benefits, such as family care, are becoming widely available.

Structural change and American labor

For the first third of the 20th century, compensation for industrial workers was composed mainly of wages based on a worker's production performance, typically a piece rate paid on each unit produced. (This article focuses on compensation of industrial workers. Agricultural and domestic workers are excluded, as a substantial number received a significant portion of their compensation in kind. In-kind pay, such as room and meals, is not captured in most compensation surveys.)
The setting of piece rates for unit production was rarely prescribed by any formal managerial or industrial standard but was typically left to the discretion of the individual shop foreman. Because wage standards would not come until later--through legislation and union activity--many workers were at the mercy of current business conditions in their individual industries. Even the early-century industrial shifts in employment, driven by technological advances and product demand, had little impact on the way workers were paid. That is not to say that real wages were stagnant, however. Weekly earnings of workers in manufacturing increased moderately for most quinquenniums, with substantial increases coming during the World War I years, when labor markets were tight.1 (See chart 1 and table 1.) With no modern-day benefits, workers and their families bore the economic risks of sickness, unemployment, and old age. Household savings provided the main source of security, with charitable organizations sometimes helping. At this time, labor unions were actually reluctant to take up the cause of economic insurance benefits, as unions were averse to employers--or the government--meddling in such worker affairs. American labor unions and their members espoused freedom and independence, favoring a pro-labor capitalistic approach. Labor's stance was traceable to the many in the labor movement who had an agrarian heritage of self-sufficiency and independence, a heritage that provided little ideological rationale for bargaining for security benefits.
The sentiment of the time could be heard in the words of Samuel Gompers, president of the American Federation of Labor from 1886 to 1924,2 who argued in 1917 that compulsory benefits "...weaken independence of spirit, delegates to outside authorities some of the powers and opportunities that rightfully belong to wage earners, and breaks down industrial freedom by exercising control over workers through a central bureaucracy."3 Labor's attitude toward self-sufficiency and independence would not weaken until some 15 years later, under the devastation of the Great Depression.

The influence of Social Security legislation

The burden of the Great Depression would prove too great for households and charitable organizations to bear. At no time in modern America's history had such a large proportion of the workforce been without jobs; estimates of annual average unemployment approached 25 percent. The depth of the Depression would ultimately provide the catalyst for change in labor's attitude about self-sufficiency, which would, in turn, give way to changes in how American workers were paid. President Roosevelt's New Deal legislation provided sweeping change. In 1935, with so many having so little, the Federal Government passed, with the approval of labor, the Social Security Act (SSA). This legislation provided a nationwide system of social insurance that today still protects workers from loss of wages stemming from unemployment and old age. The 1935 SSA was the first thread of a public social security net that would limit the economic hardship of workers and their families. When first enacted, the SSA covered fewer than 60 percent of the workforce; but following several amendments, coverage soon expanded to more than 90 percent. Aside from increasing the numbers covered, amendments extended benefits to dependents and survivors and to the disabled, in 1939 and 1956, respectively.
The Act was broadened in scope in 1965 to provide medical coverage to the elderly retired. Social Security was the first nationwide legally required benefit. Although some States had previously enacted legislation requiring employers to provide workers' compensation benefits, no State had a program that protected workers' incomes through economic cycles or old age. The passage of the SSA and the hardships experienced during the Great Depression would pave the way for a series of changes in the composition of pay; but the drafting of this seminal act purposefully maintained, at least in part, labor's spirit of self-sufficiency. From its inception, the economic protections afforded under the SSA have been treated as social insurance in which participation was a right acquired by working, with the premiums shared equally by employer and employee through payroll taxes.

The right to bargain collectively

In the wake of the Great Depression, important pro-labor legislation was passed, but none was more fundamental than the National Labor Relations Act of 1935 (Wagner Act). The Wagner Act guaranteed the twin rights of workers to join labor unions and to bargain collectively. This act turned the tide for union labor, which had too often encountered court defeats in cases of management and union entanglements. The immediate impact of the Wagner Act can be seen in the increase in union membership. Union membership more than doubled between 1935 and 1940, rising from 3.8 million to 9 million--a stark turn of events from the declines experienced just a few years earlier. This quinquennium of growth would be matched by no other period in the history of American labor. The rapid growth in the strength of unions, numerically and financially, continued through the World War II years. After the war, unions--with their newfound strength--pressed hard for higher wages and, when their demands were not met, orchestrated widespread strikes that would, in the end, raise the public's ire.
Although the Wagner Act had prohibited unfair labor practices by management, there were no prohibitions on unions' behavior. Similar to the cries heard at the turn of the century for trust busting, the public demanded that Congress enact legislation to restrict and control union behavior. As an amendment to the Wagner Act, in 1947, Congress passed the Labor Management Relations (Taft-Hartley) Act, which specifically prohibited unfair union practices, such as jurisdictional and sympathy strikes and featherbedding. The Taft-Hartley Act also placed restrictions on union administration, contract contents, and health and safety strikes. After the passage of the Taft-Hartley Act reined in union power, however, two court cases came on its heels that would expand unions' bargaining scope to employer-provided benefits. Economic constraints and the accompanying inflationary pressures of World War II forged changes in the compensation practices of many employers. The War Labor Board, charged with maintaining price stability, placed restrictions on the cash-wage increases employers could offer. With a short supply of labor to meet a growing demand for war products,4 employers began offering nonwage benefits--including insurance, pension plans, and holiday and vacation leave--as a means to attract and retain workers. The War Labor Board encouraged these offerings, considering them fringe benefits with little inflationary potential. Once these benefits made their way into practice, however, workers began to regard them as mainstay components of compensation. Gompers' cry 30 years earlier that mandating benefits "weakens independence of spirit" had dissipated. In the postwar years, unions would fight not only for wage increases but also for benefits. The courts would prove instrumental in this fight. In the 1948 case of Inland Steel v.
NLRB, the court interpreted the right to bargain over working conditions, protected under the Wagner Act, to include the right to bargain for retirement benefits. In the 1949 case of W. W. Cross and Co. v. NLRB, the court came to the same conclusion for health insurance. These benefits would become mainstay compensation components of union contracts and would slowly emerge as part of nonunion compensation as well. (The growth of employer-provided benefits is described later in this section.)

Setting standards

Other important labor legislation was also passed in the wake of the Great Depression. The Davis-Bacon Act of 1931 and the Walsh-Healey Act of 1936, to name two, established wage standards for workers employed by contractors or subcontractors on public construction or in the provision of materials and supplies to the Federal Government. (Before these laws, formal wage standards of any kind had been uncommon.) The Fair Labor Standards Act (FLSA) of 1938, which remains one of the most significant acts regarding labor standards, set working-condition requirements for most workers engaged in or producing goods for interstate commerce. The FLSA set minimum wage, maximum hour, and overtime standards that employers had to follow. Additionally, the act set national rules for child labor at a critical time in history. (Child labor legislation had been evolving for some time in State houses, but falling real wages during the Great Depression precipitated a national restriction on the use of child labor.) The FLSA had a direct effect on compensation: it not only set minimum wage standards but also established provisions for overtime hours and pay that would become part of wage benefits for all nonexempt workers. In conjunction with the SSA, the FLSA wove an additional thread into the national social security net by legislatively setting a living wage and decent hours for American workers.
In 1949, the FLSA was amended to directly prohibit child labor; in 1958, the Welfare and Pension Disclosure Act was passed, setting reporting requirements for administrators of health insurance, pension, and supplementary plans; and, in 1959, the Labor-Management Reporting Act was passed, providing additional protection for the rights of union members. During the 1960s and 1970s, laws protecting workers against discrimination and laws protecting their health and safety were passed. Still other labor-related legislation dealt with taxation and standards for administering pension plans. Throughout these years, families were undergoing significant economic changes. Women, particularly married women with children, were a growing presence in the workforce. Between 1960 and 1995, the number of married working mothers grew from 6.6 million to 18 million. The number of single working mothers also grew, from 0.6 million in 1980 to 2.1 million in 1995.5 While these changes in families' work choices were occurring, the economy was shifting from goods-producing to service-producing industries, which led to disproportional growth in white-collar occupations--occupations in which unionization was not very common. As a result, changes in pay methods and working conditions would not be ushered in by unions, as they were at mid-century. Instead, legislative initiatives provided the framework for new workplace and compensation practices. The compositional change in families brought a desire for flexibility: flexibility in leave for family care and flexibility in the assortment of benefits employers provided. For the former, legislation helped with the passage of the Pregnancy Discrimination Act of 1978 and the Family and Medical Leave Act of 1993. For the latter, employers have begun to offer flexible benefit plans, allowing employees to tailor the benefits they receive.
With a rising number of two-earner families, conflicts in benefits received by families began to emerge. Perhaps the most important was double health insurance coverage. In terms of hourly costs, health insurance is the most expensive voluntary benefit employers offer, so it is economically prudent for employers not to spend on duplicate coverage. This--among other motivations--brought about flexible benefit packages, or cafeteria plans, which first emerged in the 1970s. Flexible benefit plans are arrangements in which employees are given an allotment of benefit costs with which to tailor individual benefit packages, selecting only those benefits most valuable to their specific needs. Although flexible benefit plans are still quite limited, they are growing in popularity. In 1986, only 2 percent of workers employed in medium and large private establishments were eligible to participate in a flexible benefit plan; by 1997, the share had grown to 13 percent.

Composition of pay

In early 2000, the average hourly cost of compensation for employers was $21.16, of which 82 percent consisted of wage payments, including paid leave and supplemental pay.6 The remaining 18 percent comprised hourly costs for nonwage supplements, such as health and life insurance, retirement and savings plans, and legally required benefits. As noted earlier, few workers at the turn of the 20th century received any form of nonwage benefits; indeed, these nonwage supplements to compensation were called fringe benefits for most of the century. The word fringe connoted that these components of pay were of little substance to the overall pay structure of workers. With nonwage benefits now accounting for nearly one-fifth of average compensation, they are anything but fringe. Measuring changes in components of pay across the 20th century is made difficult by the lack of a comprehensive and consistent series of compensation data.
Compensation studies undertaken through most of the century measured components of pay and targeted specific workers, such as mill and manufacturing workers, or worker categories, such as union or white-collar workers. Each of these studies had a specific purpose, frequently responding to labor issues of the day. However, the National Income and Product Accounts (NIPA) of the Bureau of Economic Analysis provided a consistent source of compensation data for the economy as a whole for the better part of the century. The NIPA provides aggregate estimates of wages and salaries, as well as supplements to wages and salaries. These supplements, in large part, are measures of nonwage benefits, including employer contributions for legally required benefits--such as Social Security and unemployment insurance--and voluntary benefits, such as health and life insurance, private pension plans, and profit-sharing plans. Supplements increased sharply through most of the decades of the 20th century, rising from 1.4 percent of compensation in 1929 (the earliest year for which these data are available) to 17.5 percent by the close of the century.7 The remaining sections of this issue discuss the major economic, political, and demographic influences on compensation during the 20th century. These sections track the growth of new forms and types of compensation, as well as the changes in the Bureau of Labor Statistics' compensation studies and the reasons for those changes. The final article explores future trends in employee compensation and the data collection challenges these trends might pose.
End Notes

1 See John T. Dunlop and Walter Galenson, eds., Labor in the Twentieth Century (New York, Academic Press, 1978), table 1.26.

2 Samuel Gompers was president of the American Federation of Labor from its inception in 1886 until his death in 1924, except for the year 1895.

3 "Some aspects of health insurance," Monthly Labor Review, May 1917, pp. 746-51.

4 Unemployment had fallen from 9.9 percent in 1941 to 1.2 percent in 1944. See Historical Statistics of the United States, Colonial Times to 1957, Series D46-47 (Bureau of the Census, 1960). See also Dunlop and Galenson, table 1.25.

5 See Statistical Abstract of the United States, 1998 (Washington, U.S. Department of Commerce, 1998), table 654.

6 Hourly costs of compensation were obtained from "Employer Costs for Employee Compensation," USDL 00-186 (Bureau of Labor Statistics, June 29, 2000), available on the Internet at http://www.bls.gov/news.release/History/ecec_292000.txt (visited June 14, 2001).

7 The NIPA measure of supplements to wages and salaries does not correspond exactly to the Bureau of Labor Statistics definition of benefits. For instance, the BLS Employer Costs for Employee Compensation series defines a broader scope of payments as benefits, including supplemental pay for overtime and shift differentials and paid leave for such items as holiday, sick, and vacation leave. These same payments are included among NIPA's wage and salary estimates.