
The Evolution of Compensation in a Changing Economy

by Thomas G. Moehrle
Bureau of Labor Statistics
This article was originally printed in the Fall 2001 issue of Compensation and Working Conditions.

Originally Posted: January 30, 2003

Over the course of the 20th century, American workers witnessed an evolution in compensation. Through the century, changes in methods of pay were usually stimulated by some form of imbalance caused by a crisis or demographic shift. For the 20th-century American worker, no crisis was greater than the Great Depression, a watershed in how employers paid their workers. But growth in unionization and the increase in the number of working women, among other shifts, also contributed to changes in pay practices.

Payment for labor services has evolved from simple piecework pay to sophisticated contractual compensation packages. At the turn of the 20th century in America, few workers would have received anything more than wages as compensation for their labor services. But by the close of the century, a typical worker received more than 25 percent of compensation in the form of benefits. These benefits, which were termed fringe benefits for most of the century, consisted of employer-paid items such as health, life, and unemployment insurance; retirement and savings plans; and holiday and vacation leave. Today, the benefit components making up the compensation package continue to evolve, with variable pay plans--such as profit sharing and stock options--growing in importance. Additionally, emerging benefits, such as family care, are becoming widely available.

Structural change and American labor

For the first third of the 20th century, compensation for industrial workers was composed mainly of wages that were based on a worker’s production performance, typically a piece rate paid on each unit produced. (This article focuses on compensation of industrial workers. Agricultural and domestic workers are excluded, as a substantial number received a significant portion of their compensation in kind. In-kind pay, such as room and meals, is not captured in most compensation surveys.)

The setting of piece rates for unit production was rarely prescribed by any formal managerial or industrial standards but was typically at the discretion of the individual shop foreman. Since wage standards would not come until later--through legislation and union activity--many workers were at the mercy of current business conditions in their individual industries.

Even the early-century industrial shifts in employment, driven by technological advances and product demand, had little impact on the way workers were paid. That is not to say that real wages were stagnant, however: weekly earnings of workers in manufacturing increased moderately in most quinquenniums, with substantial increases coming during the World War I years, when labor markets were tight.1 (See chart 1 and table 1.)

With no modern-day benefits, workers and their families bore the economic risks of sickness, unemployment, and old age. Household savings provided the main source of security, with charitable organizations sometimes helping. At this time, labor unions were actually reluctant to take up the cause of economic insurance benefits, as unions were averse to employers--or the government--meddling in such worker affairs. American labor unions and their members espoused freedom and independence, favoring a pro-labor capitalistic approach.

Labor’s stance was traceable to the many in the labor movement who had an agrarian heritage of self-sufficiency and independence that provided little ideological rationale for bargaining for security benefits. The sentiment of the time could be heard in the words of Samuel Gompers, president of the American Federation of Labor from 1886 to 1924,2 who argued in 1917 that compulsory benefits "...weaken independence of spirit, delegates to outside authorities some of the powers and opportunities that rightfully belong to wage earners, and breaks down industrial freedom by exercising control over workers through a central bureaucracy."3 Labor’s attitude towards self-sufficiency and independence would not weaken until some 15 years later, under the devastation of the Great Depression.

The influence of Social Security legislation

The burden of the Great Depression would prove too great for households and charitable organizations to bear. At no time in modern America’s history had such a large proportion of the workforce been without jobs; estimates of annual average unemployment approached 25 percent. The depth of the Depression would ultimately provide the catalyst for change in labor’s attitude about self-sufficiency that would, in turn, give way to changes in how American workers were paid.

President Roosevelt’s New Deal legislation provided sweeping change. In 1935, with so many people having so little, the Federal Government passed, with the approval of labor, the Social Security Act (SSA). The passage of this legislation provided a nationwide system of social insurance that today still protects workers from loss of wages stemming from unemployment and old age. The 1935 SSA was the first thread of a public social security net that would limit the economic hardship of workers and their families.

When first enacted, the SSA provided coverage for fewer than 60 percent of the workforce; but following several amendments, coverage soon expanded to more than 90 percent. Aside from increasing the number covered, amendments extended benefits to dependents and survivors in 1939 and to the disabled in 1956. The Act was broadened in scope in 1965 to provide medical coverage to the retired elderly through Medicare.

Social Security was the first nationwide legally required benefit. Although some States had earlier enacted legislation requiring employers to provide workers’ compensation benefits, no State had a program that protected workers’ incomes through economic cycles or old age. The passage of the SSA and the hardships experienced during the Great Depression would pave the way for a series of changes in the composition of pay; but the drafting of this seminal act purposefully maintained, at least in part, labor's spirit of self-sufficiency. From its inception, the economic protections afforded under the SSA have been treated as social insurance, in which participation is a right acquired by working and the premiums are shared equally by employer and employee through payroll taxes.

The right to bargain collectively

In the wake of the Great Depression, important pro-labor legislation was passed, but none was more fundamental than the National Labor Relations Act of 1935 (Wagner Act). The Wagner Act guaranteed the twin rights of workers to join labor unions and to bargain collectively. This act turned the tide for union labor, which had too often met defeat in the courts in disputes with management. The immediate impact of the Wagner Act can be seen in the increase in union membership, which swelled more than twofold between 1935 and 1940, rising from 3.8 million to 9 million--a stark reversal of the declines experienced just a few years earlier. This quinquennium of growth would be matched by no other period in the history of American labor.

The rapid growth in strength of unions, numerically and financially, continued through the World War II years. After the war, unions--with their newfound strength--pressed hard for higher wages and, when their demands were not met, orchestrated widespread strikes that would, in the end, raise the public’s ire. Although the Wagner Act had prohibited unfair labor practices by management, it placed no prohibitions on unions’ behavior. Similar to the cries heard at the turn of the century for trust busting, the public demanded that Congress enact legislation that would restrict and control union behavior. As an amendment to the Wagner Act, in 1947, Congress passed the Labor Management Relations (Taft-Hartley) Act, which specifically prohibited unfair union practices, such as jurisdictional and sympathy strikes and featherbedding. The Taft-Hartley Act also placed restrictions on union administration, on contract contents, and on strikes that imperiled national health and safety. Although the Taft-Hartley Act reined in union power, two court cases on its heels would expand unions’ bargaining scope to employer-provided benefits.

Economic constraints and the accompanying inflationary pressures of World War II forged changes in the compensation practices of many employers. The War Labor Board, charged with maintaining price stability, placed restrictions on the cash-wage increases employers could offer. With labor in short supply and demand for war products growing,4 employers began offering nonwage benefits--including insurance, pension plans, and holiday and vacation leave--as a means to attract and retain workers. The War Labor Board encouraged these offerings, considering them fringe benefits with little inflationary potential.

Once these benefits made their way into practice, however, workers began to regard them as mainstay components of compensation. Gompers’ cry of 30 years earlier that mandating benefits "weakens independence of spirit" had dissipated. In the postwar years, unions would fight not only for wage increases but also for benefits. The courts would prove instrumental in this fight. In the 1948 case of Inland Steel v. NLRB, the court interpreted the right to bargain over working conditions, protected under the Wagner Act, to include the right to bargain for retirement benefits. In the 1949 case of W. W. Cross and Co. v. NLRB, the court came to the same conclusion for health insurance. These benefits would become mainstay components of union contracts and would slowly emerge as part of nonunion compensation as well. (The growth of employer-provided benefits is described later in this section.)

Setting standards

Other important labor legislation was also passed in the wake of the Great Depression. The Davis-Bacon Act of 1931 and the Walsh-Healey Act of 1936, to name two, established wage standards for workers employed by contractors or subcontractors on public construction or in the provision of materials and supplies to the Federal Government. (Before these laws, formal wage standards of any kind had been uncommon.)

The passage of the Fair Labor Standards Act (FLSA) of 1938, which remains today one of the most significant acts regarding labor standards, set working-condition requirements for most workers engaged in interstate commerce or in producing goods for it. The FLSA set minimum wages, maximum hours, and overtime standards that employers had to follow. Additionally, the act set national rules for child labor at a critical time in history. (Child labor legislation had been evolving for some time in State houses, but falling real wages during the Great Depression precipitated a national restriction on the use of child labor.)

The FLSA had a direct effect on compensation: it not only set minimum wage standards but also established provisions for overtime hours and pay that would become part of wage benefits for all nonexempt workers. In conjunction with the SSA, the FLSA wove an additional thread into the national social security net by legislatively setting a living wage and decent hours for American workers.

In 1949, the FLSA was amended to directly prohibit child labor; in 1958, the Welfare and Pension Plans Disclosure Act was passed, setting reporting requirements for administrators of health insurance, pension, and supplementary plans; and, in 1959, the Labor-Management Reporting and Disclosure Act was passed, providing additional protection for the rights of union members.

During the 1960s and 1970s, laws protecting against discrimination and laws protecting the health and safety of workers were passed. Still other labor-related legislation dealt with taxation and standards for administering pension plans. Throughout these years, families were undergoing significant economic changes. Women, particularly married women with children, were a growing presence in the workforce. Between 1960 and 1995, the number of married working mothers grew from 6.6 million to 18 million. The number of single working mothers also grew markedly, from 0.6 million in 1980 to 2.1 million in 1995.5

While these changes in families’ work choices were occurring, the economy was shifting from goods-producing to service-producing industries, which led to disproportionate growth in white-collar occupations--occupations where unionization was not very common. As a result, changes in pay methods and working conditions would not be ushered in by unions, as they had been at mid-century. Instead, legislative initiatives provided the framework for new workplace and compensation practices.

The compositional change in families brought a desire for flexibility: flexibility in leave for family care and flexibility in the assortment of benefits employers provided. For the former, legislation helped, with the passage of the Pregnancy Discrimination Act in 1978 and the Family and Medical Leave Act of 1993. For the latter, employers began to offer flexible benefit plans, allowing employees to tailor the benefits they receive.

With a rising number of two-earner families, conflicts in the benefits families received began to emerge. Perhaps the most important was double health insurance coverage. In terms of hourly costs, health insurance is the most expensive voluntary benefit employers offer, so it is economically prudent not to spend employer dollars on duplicate coverage. This consideration--among other motivations--brought about flexible benefit packages, or cafeteria plans, which first emerged in the 1970s. Flexible benefit plans are arrangements in which employees are given an allotment of benefit costs with which to tailor individual benefit packages, selecting only those benefits most valuable to their specific needs. Although flexible benefit plans are still quite limited, they are growing in popularity: in 1986, only 2 percent of workers employed in medium and large private establishments were eligible to participate in a flexible benefit plan, but by 1997 that share had grown to 13 percent.

Composition of pay

In early 2000, the average hourly cost of compensation for employers was $21.16, of which 82 percent consisted of wage payments that included paid leave and supplemental pay.6 The remaining 18 percent consisted of hourly costs for nonwage supplements, such as health and life insurance, retirement and savings plans, and legally required benefits. As noted earlier, few workers at the turn of the 20th century received any form of nonwage benefits; in fact, these nonwage supplements to compensation were called fringe benefits for most of the century. The word fringe connoted that these components of pay were of little substance to the overall pay structure of workers. With nonwage benefits now accounting for nearly one-fifth of average compensation, they are anything but fringe.

Measuring changes in components of pay across the 20th century is made difficult by the lack of a comprehensive and consistent series of compensation data. Compensation studies undertaken through most of the century measured components of pay for specific groups of workers, such as mill and manufacturing workers, or for worker categories, such as union or white-collar workers. Each of these studies had a specific purpose, frequently responding to the labor issues of the day.

However, the National Income and Product Accounts (NIPA) of the Bureau of Economic Analysis provided a consistent source of compensation data for the economy as a whole for the better part of the century. The NIPA provides aggregate estimates of both wages and salaries, as well as supplements to wages and salaries. These supplements, in large part, are measures of nonwage benefits, including employer contributions for legally required benefits--such as Social Security and unemployment insurance--and voluntary benefits, such as health and life insurance, private pension plans, and profit-sharing plans. Supplements increased sharply through most of the century, rising from 1.4 percent of total compensation in 1929 (the earliest year for which these data are available) to 17.5 percent by the close of the century.7

The remaining sections of this issue discuss the major economic, political, and demographic influences on compensation during the 20th century. These sections track the growth of new forms and types of compensation. Additionally, these sections track the changes in the Bureau of Labor Statistics’ compensation studies and the reasons for these changes. The final article explores future trends in employee compensation and the data collection challenges these trends might pose.

Measuring Real Earnings over the Long Term

The evolution of the average hourly earnings of production workers in manufacturing--adjusted to reflect changes in the purchasing power of the dollar--might tempt one to announce that the real wage of factory workers quadrupled between 1909 and 1999.

There are, however, significant statistical issues that undermine confidence in that statement. First, the equivalence of the concepts of earnings, wages, and compensation has eroded tremendously, as this article documents in some detail. Second, there have been changes in the sheer technical quality of estimates of both earnings and prices, as this section documents briefly. Third, and most significantly, there exists great difficulty in making valid comparisons over long spans of time of the cost of living or its inverse--the purchasing power of cash earnings.

The average hourly earnings of production workers in manufacturing is one of the longest-running series in the Bureau of Labor Statistics (BLS) repertoire. Data on the earnings of factory workers were first published regularly starting with the January 1916 edition of the Monthly Labor Review. Additionally, similar data are available from BLS as far back as 1909 in less regular form, and economic historians have constructed estimates for years prior to that.

Naturally, there have been numerous efforts to improve the quality of the payroll survey estimates over the years. For example, BLS Bulletin 610, Revised Indexes of Factory Employment and Pay Rolls 1919 to 1933, was the Bureau’s first attempt at benchmarking survey estimates to adjust for any pronounced bias relative to trends in censuses of employment.

In the late 1940s, BLS addressed several methodological problems: making the estimates of average weekly earnings and average hourly earnings consistent with each other; using the link relative technique to eliminate inconsistencies due to changing samples; and using aggregate hours, instead of employment, as the weight for aggregating average hourly earnings to higher levels of industry detail.
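
To make the link relative idea concrete, the sketch below carries a published level forward one month using only establishments that reported in both months, so that sample turnover does not create artificial jumps in the series. It is a minimal illustration under assumed conditions, not the Bureau's production procedure; the establishment figures and function name are hypothetical.

    # Minimal sketch of the link relative technique (illustrative only).
    # The month-t level is the month t-1 level times the ratio of totals
    # reported by establishments present in BOTH months (the matched sample).

    def link_relative_estimate(prev_level, matched_prev, matched_curr):
        """Carry a series level forward one month.

        prev_level   -- published estimate for month t-1
        matched_prev -- month t-1 totals from establishments reporting in both months
        matched_curr -- month t totals from the same establishments
        """
        link = sum(matched_curr) / sum(matched_prev)
        return prev_level * link

    # Hypothetical payroll totals from three matched establishments:
    prev_reports = [12000.0, 8500.0, 20300.0]
    curr_reports = [12240.0, 8450.0, 20710.0]
    print(round(link_relative_estimate(100.0, prev_reports, curr_reports), 2))  # 101.47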

In the early 1960s, all industries became classified on the Standard Industrial Classification (SIC) basis, when nonmanufacturing was converted to the SIC system from the Social Security Board classifications. In 1961, work began on a comprehensive sample design stratified by establishment size, instead of sampling only establishments with employment above a certain industry-specific cutoff. And in 1966, the link and taper method came into routine use for the monthly calculation of hours and earnings.

In 1970, for the first time, the Current Employment Statistics (CES) program began publishing seasonally adjusted estimates of average hourly earnings, using the BLS Seasonal Factor Method. Seasonal factoring, or adjustment, permits more accurate interpretation of intra-year trends in economic time series by smoothing the regular month-to-month fluctuations caused by weather, holidays, and other factors. In the 1980s, the CES program continued to expand the survey sample and made additional changes in seasonal adjustment procedures and industry coding, as well as other technical changes. The number of establishments surveyed in the service sector doubled between 1979 and 1989, although sampling as a percent of the service-producing universe remained unchanged.
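
The mechanics of seasonal factoring can be illustrated with the classical ratio-to-moving-average decomposition, of which methods like the Bureau's were refinements. The sketch below is a simplified illustration, not the actual BLS Seasonal Factor Method; it assumes a positive monthly series beginning in January with at least three years of data.

    # Simplified ratio-to-moving-average seasonal factoring (illustrative).
    # Assumes a positive monthly series starting in January, length >= 36.

    def seasonal_factors(series):
        n = len(series)
        # A centered 12-month moving average approximates the trend.
        trend = {}
        for t in range(6, n - 6):
            ma1 = sum(series[t - 6:t + 6]) / 12.0   # months t-6 .. t+5
            ma2 = sum(series[t - 5:t + 7]) / 12.0   # months t-5 .. t+6
            trend[t] = (ma1 + ma2) / 2.0            # centered on month t
        # The ratio of each observation to the trend isolates the seasonal swing.
        by_month = [[] for _ in range(12)]
        for t, tr in trend.items():
            by_month[t % 12].append(series[t] / tr)
        factors = [sum(r) / len(r) for r in by_month]
        # Normalize so the twelve factors average to 1.
        mean = sum(factors) / 12.0
        return [f / mean for f in factors]

    def seasonally_adjust(series, factors):
        # Dividing by the month's factor removes the regular seasonal movement.
        return [x / factors[t % 12] for t, x in enumerate(series)]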

Starting in 1995, new sampling techniques were developed to achieve a genuinely random sample. Besides creating a new sample design, the CES program modified its estimation formulas. For hourly earnings, the link technique was kept, but weights were assigned to each sampled unit. (The use of weights replaced the use of size-based strata.) By the end of the decade, however, the new sample and new formulas were in use only in the wholesale trade division; the changes are to extend to the remaining divisions over the next few years.

As a result of these and other program improvements, the degree to which Current Employment Statistics estimates needed to be adjusted to benchmarks was reduced substantially. Bulletin 610, published in 1934, reported a cumulative bias of about 11 percent between 1923 and 1929. Today’s status is outlined in the monthly Employment Situation news release: "Over the past decade, the benchmark revision for total nonfarm employment has averaged 0.3 percent, ranging from zero to 0.7 percent."

Calculating real (inflation-adjusted) earnings requires a price index to deflate current dollars to a constant level of buying power. The most commonly used index for this purpose is the Consumer Price Index (CPI). Like the measure of unadjusted, or nominal, earnings, the CPI has a long history of development and improvement.
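
The deflation arithmetic itself is simple. With the CPI expressed on a base of 1982 = 100, real earnings for any period t are

    \text{real earnings}_t = \text{nominal earnings}_t \times \frac{100}{\mathrm{CPI}_t}

so a doubling of the price level halves the purchasing power attributed to a given nominal wage.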

Cost-of-living and retail price statistics appear in the Bureau’s history as early as 1891, and the first weighted retail price index was published in 1903. Since those early days, there is virtually no aspect of price index statistics that has not been improved. The number of monthly prices collected has grown from about 5,000 for the 30 principal items of food in the 1903 publication to about 70,000, grouped into 305 categories called entry level items. Additionally, the number of outlets sampled has grown from 800 in the earliest years of the index to about 30,000 retail and service establishments; and about 27,000 landlords and tenants provide data on housing units. Also, the number of localities for which data are collected has risen from 32 to 87.

Perhaps the most consistent element of the consumer price program’s scope has been its framework of a family’s living costs. The definition of the index family for the CPI used in the calculation of real wages--the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W)--has been fairly stable. In the earliest reports, the family was composed of two or more persons with a chief earner, either a wage earner or an earner working at a relatively low salary. The restriction to wage-earner families continues; but, in 1964, single-person families were included for the first time.

Another consistent characteristic of the Consumer Price Index program has been technical improvement. Starting in 1940, the Bureau of Labor Statistics conducted a full-scale revision of the Consumer Price Index, to take into account new population patterns, changes in the composition of consumer expenditures, and improvements in survey concepts and methods. Five revisions have followed, with the latest introduced in 1998. It is important, however, to recognize that many improvements in the CPI have been implemented outside the formal revision process. Some of the most important of these inter-revision changes were the adoption, in 1967, of the quality adjustment concept in handling automobile model changeovers; the shift in 1985 (1983 for the CPI for All Urban Consumers, or CPI-U) to a flow-of-services model for pricing owner-occupied housing; and the implementation of hedonic, or regression-based, quality adjustments, starting with apparel prices in 1991. Perhaps the most significant of the more modern improvements has been the adoption of a new functional form, the geometric mean, to calculate the average of prices of items within most CPI product categories. One effect of using geometric means is that the formula now adjusts, to some degree, for changes in consumption that one might assume would result from changes in the relative prices of items in the CPI market basket.
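
In schematic form (the notation and weighting are simplified here for illustration), a geometric-mean elementary index combines the price relatives of the n items in a category as

    I_t = \prod_{i=1}^{n} \left( \frac{p_{i,t}}{p_{i,0}} \right)^{w_i}, \qquad \sum_{i=1}^{n} w_i = 1,

in place of the arithmetic average \sum_i w_i \, p_{i,t}/p_{i,0} used previously. Because the geometric mean gives proportionally less influence to the items whose prices rise fastest, the index behaves as if consumers shift purchases toward relatively cheaper items.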

As a result of these and other improvements, the CPI-W today is a better measure of living costs than previously and is the best statistic to use to deflate one month’s or one year’s earnings estimates into dollars comparable with the dollars of adjacent (or at least close-by) months or years. But even with nearly perfect earnings estimates and price indexes, is it legitimate to compare the real earnings of 1909 with the real earnings of 1999?

Simply doing the arithmetic results in real earnings of $2.03 in constant 1982 dollars for 1909 and $8.26 in constant 1982 dollars for 1999. A more complex question is whether we can meaningfully compare--over a span of nearly a century--the standard of living purchased by even the most precisely measured nominal dollar deflated by even the most carefully constructed price index. The main issue is the vastly different character of actual consumption at widely separated points in time. For example, purchasing an Internet connection, at any price, would have been impossible in 1909; and something like a buggy whip has gone from a common tradesman’s tool to an item of esoteric taste.
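
A back-of-the-envelope version of that arithmetic is sketched below. The nominal-earnings and CPI inputs are illustrative stand-ins chosen only to reproduce the article's reported results; they are not official BLS figures.

    # Deflating nominal hourly earnings to constant 1982 dollars (illustrative).
    # The inputs below are hypothetical stand-ins, not official BLS figures.

    def to_constant_dollars(nominal, cpi, base=100.0):
        """Deflate nominal earnings with a CPI indexed to base = 100 in 1982."""
        return nominal * base / cpi

    print(round(to_constant_dollars(0.19, 9.4), 2))     # 1909: about $2.02
    print(round(to_constant_dollars(13.91, 168.4), 2))  # 1999: about $8.26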

Combining the changing definition of the average consumption bundle with changing notions of an adequate budget and with a changing level and composition of compensation means that there has been an increase in the measured real cost of a moderate standard of living. One avenue to explore toward an explanation is the possibility of using labor hours, rather than real dollars, as the metric.

Doing that arithmetic shows that a fair level of living for a typical cotton mill worker could be earned in 1909 with about 3,750 hours of labor, and that a median family budget for 1998 could be obtained in exchange for about 2,625 hours of work. Thus, if one can assume that the "fair" level of living in 1909 was no better than the median family budget of 1998, then one could conclude that workers in 1998 were better off. While this may show some improvement across the 90-year span, most of the old questions about comparability remain; and, in fact, new ones are raised. For one thing, the nature of work has changed, and increasing incomes have led to an increased taste for leisure time.
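
That calculation amounts to dividing the annual cost of a given living standard by the nominal hourly wage. The sketch below shows the arithmetic; the budget and wage figures are hypothetical placeholders chosen only to match the hours cited above, not historical data.

    # Hours of work needed to purchase a given living standard (illustrative).
    # Budget and wage inputs are hypothetical placeholders, not historical data.

    def hours_to_afford(annual_budget_cost, hourly_wage):
        return annual_budget_cost / hourly_wage

    print(round(hours_to_afford(712.50, 0.19)))      # 1909: about 3,750 hours
    print(round(hours_to_afford(35962.50, 13.70)))   # 1998: about 2,625 hours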

In the end, it is generally true that price indexes and measures of purchasing power are accurate only over short time horizons within which tastes, technologies, and economic structures are relatively homogeneous. Comparisons over longer periods, the interest they generate notwithstanding, will always be subject to noncomparability and misinterpretation, because the assumptions that underlie these comparisons--constancy of tastes and technology--are violated.

 

Thomas G. Moehrle
Senior Economist, Division of Compensation Data Estimation, Office of Compensation Levels and Trends. Telephone: 202-691-6237, E-mail: Moehrle_T@bls.gov

End Notes

1 See John T. Dunlop and Walter Galenson, eds., Labor in the Twentieth Century (New York, Academic Press, 1978), table 1.26.

2 Samuel Gompers was president of the American Federation of Labor from its inception in 1886 until his death in 1924, except for the year 1895.

3 "Some aspects of health insurance," Monthly Labor Review, May 1917, pp. 746-51.

4 Unemployment had fallen from 9.9 percent in 1941 to 1.2 percent in 1944. See Historical Statistics of the United States, Colonial Times to 1957, Series D46-47 (Bureau of the Census, 1960). See also Dunlop and Galenson, table 1.25.

5 See Statistical Abstract of the United States, 1998 (Washington, U.S. Department of Commerce, 1998), table 654.

6 Hourly costs of compensation were obtained from "Employer Costs for Employee Compensation," USDL 00-186 (Bureau of Labor Statistics, June 29, 2000), available on the Internet at http://www.bls.gov/news.release/History/ecec_292000.txt (visited June 14, 2001).

7 The NIPA measure of supplements to wages and salaries does not correspond exactly to the Bureau of Labor Statistics definition of benefits. For instance, the BLS Employer Costs for Employee Compensation series defines a broader scope of payments as benefits, including supplemental pay for overtime and shift differentials and paid leave for such items as holiday, sick, and vacation leave. These same payments are included among NIPA’s wage and salary estimates.