  • Essay / Human Resources Management: Past, Present and Future

    Table of contents

      • Summary
      • "Beginnings": 1400s to 1700s
      • "Personnel": 1800s
      • "Labor relations/human relations": 1900s to 1970s
      • "Strategic HRM": 1980s to today

    Summary

    What we call Human Resources Management (HRM) today has a long and eventful history. A number of key changes in the social and economic environment have affected the evolution of HRM, some of which we will highlight in the following sections. Although many historians of HRM begin with the 19th century, a period of rapid industrialization in the United States, we begin our study much earlier with the development of tribes and, later, the systems of apprenticeship and entrepreneurship of the late medieval period (Dulebohn, Ferris, & Stodd, 1995). One reason for this is that we wish to highlight changes in the employment relationship over time. This brief historical overview is not intended to be comprehensive; instead, it provides a context for appreciating the progress we have made in what we now call "HRM." A companion piece to this special issue, focusing on the present and future of HRM, will appear later this year.

    "Beginnings": 1400s to 1700s

    Historically, human resource management was probably the first management function to evolve, preceding functions such as finance, accounting, and marketing. Although not recorded, human resource management has arguably taken place since people first organized themselves into functional units such as tribes. As tribes formed and, in particular, as they shifted from hunting to agriculture, a division of labor undoubtedly emerged with the recognition of differences in the productivity of individuals. This was a form of division of labor in which different people occupied different roles in productive society.
Artisans capable of developing tools for farmers, supported by the productivity of others engaged in agriculture, undoubtedly emerged, and a natural division of labor arose. In short, the productivity of various trades and professions varied, and commerce evolved to take advantage of these variations. Whether managed by the natural functioning of a market and a distribution of productive roles, or by the decisions of a tribal leader, human resource management problems arose. In the late 18th century, the Industrial Revolution began in Europe and spread to the United States. This revolution completely changed the way individuals earned their living and led to a shift from an agricultural society to an industrial, manufacturing society. Human skills and craftsmanship were replaced by machines, and the factory system was born (Dulebohn et al., 1995). Factories and manufacturing dramatically improved production and changed labor relations. For example, these systems replaced the independent contractor system and created permanent employees employed by organizations. At the same time, this resulted in a rationalization of work and a different division of labor: workers who had been skilled contractors became machine tenders performing highly specialized, routine tasks. The new manufacturing system also required supervising large numbers of workers, and management practices tended to be autocratic and paternalistic (Dulebohn et al., 1995). Management had little concern for the safety or well-being of workers, and workers were controlled by force and fear (Slichter, 1919). This approach to management continued until the end of the 19th century.

    "Personnel": 1800s

    Around 1800, an English factory owner named Robert Owen changed a number of aspects of the employment relationship and developed "welfare at work" systems to improve both the social conditions and the working conditions of workers (Dulebohn et al., 1995).
In particular, he attended to the temperance and cleanliness of his workers, improved working conditions, and refused to employ young children (Davis, 1957). In some cases, these practices evolved into more elaborate paternalistic systems in which workers were offered company housing, company stores, company schools, apprenticeships, pensions, life and accident insurance, hospitals, and libraries (Davis, 1957). Workplace welfare systems can be defined as "anything for the comfort and improvement, intellectual or social, of the employees, over and above wages paid, which is not a necessity of the industry nor required by law" (U.S. Bureau of Labor, 1919, p. 8). These new systems were designed to promote good management and labor relations, increase productivity, and avoid worker conflict and unionization (Dulebohn et al., 1995). Not surprisingly, these practices paved the way for many of the employee benefits used today to attract, motivate, and retain workers. They also became the standard for many social benefit systems in Western countries. In the aftermath of the Civil War (1860s), conflicts between labor and management began to erupt. Employers wanted to thwart unions and believed that changes in working conditions would improve performance (Dulebohn et al., 1995). As a result, welfare-at-work programs proliferated, but these programs were actually designed to benefit businesses, not workers. As these programs grew in the late 1800s, organizations hired welfare secretaries to administer them; eventually the role of the welfare secretary evolved into that of employment manager and, later, "personnel manager." The primary functions of this role were to hire, fire, discipline, and reward employees, which meant that line managers no longer had to focus on managing and retaining the workforce.
Many organizations began to adopt paternalistic practices, but some employers mistreated employees, leading artisans and other workers to join protective societies later known as unions (Scarpello, 2008). Predictably, employers fought the growth of unions and took a number of steps to restrict unionization, including seeking court injunctions and forcing applicants to sign "yellow-dog" contracts stating that they would not join a union.

    "Labor relations/human relations": 1900s to 1970s

    With the advent of manufacturing, employers looked for ways to improve efficiency and productivity. Engineers (e.g., Frederick Taylor), industrial and organizational psychologists (e.g., Lillian Gilbreth), sociologists (e.g., Max Weber), and management scholars (e.g., Henri Fayol) focused on strategies to improve organizational effectiveness and developed new approaches to managing workers. For example, the scientific management approach favored by Frederick Taylor (1947) emphasized streamlining work by studying it scientifically, breaking it down into components, and determining the one best way to perform each task. This approach diminished worker autonomy and insisted that employees be closely monitored to ensure that they performed work exactly as prescribed. At the same time, Max Weber (1927) suggested that organizational effectiveness could be improved by using legitimate rules and authority systems. The new job designs and the resulting autocratic management systems created even greater levels of conflict between workers and organizations. In the 1930s, the Norris-LaGuardia Act (1932), the National Labor Relations Act (the Wagner Act, 1935), and other laws fostered the growth of unions. Due to increased unionization and the use of scientific management principles, personnel departments grew and focused on job analysis as a basis for selection, training, job evaluation, and employee compensation.
Additionally, the Wagner Act defined the New Deal system of industrial relations and "declared that the goal of public policy was to encourage the practice of collective bargaining, to eliminate inequality in the bargaining power of workers and introduce democratic due process rights for the industry" (Kaufman, 1993, p. 61). In response to these policies, industrial relations (IR) departments appeared in organizations to manage collective agreements (Dulebohn et al., 1995). World War II created exceptional demand for labor and temporarily slowed the growth of unions (Dulebohn et al., 1995). The war brought wage freezes and a ban on strikes, but afterward the need for HRM increased. The postwar period brought renewed interest in unions, and workers were determined to recoup lost wage increases. Additionally, federal labor laws and wage controls created increased demand for personnel services. The growing power of unions and labor unrest also resulted in the passage of the Taft-Hartley Act, which aimed to equalize power between workers and management. In the 1940s and 1950s, unions represented 47% of the American workforce, and 95% of businesses had at least one union (Dulebohn et al., 1995). At the same time, employers began hiring more educated personnel managers because of constraints imposed by unions and the need to manage a unionized workforce. In the 1930s, employment managers had begun to argue that conflicts were not inherent in employment relations but were caused by poor management and work systems. As a result, researchers conducted a series of experiments to examine the effects of different work systems on worker productivity (Roethlisberger & Dickson, 1939). These researchers found that social elements and worker needs had a significant impact on production and worker well-being. This new approach was called the human relations movement and emphasized that workers have social needs. The Human Relations approach has.