When to Think Like a Scientist
Balancing Scientific Rigor and Satisfying Business Needs in HR Analytics

IF 1.0 · Q4 (Psychology, Applied)
Greta Ontrup, Jana Moeschke, Ralf Buechsenschuss, Torsten Biemann
Greta Ontrup, Department of Work, Organizational and Business Psychology, Ruhr University Bochum, Germany; Jana Moeschke, Communardo Software GmbH, Bonn, Germany; Ralf Buechsenschuss, Zurich Insurance Company Ltd., Zürich, Switzerland, and University of Mannheim, Germany; Torsten Biemann, Department of Business Administration, Human Resource Management and Leadership, University of Mannheim, Germany

Published online: July 20, 2023 · https://doi.org/10.1026/0932-4089/a000418

Presentation of the Problem

Human resource analytics (HR-A) refers to a data-driven HR practice based on analyzing data related to employees, HR processes, and organizational performance (Marler & Boudreau, 2017). The results serve as a basis for decision-making on strategically relevant business issues (Falletta & Combs, 2020). HR-A further aspires to show how HR measures can create (monetary) value by linking "soft" people factors (e.g., employee engagement) to strategically relevant outcomes such as customer satisfaction (Falletta & Combs, 2020).

A prerequisite for using HR-A to improve decision-making is increased research and experimentation within the organization (Falletta & Combs, 2020; Peeters et al., 2020). In this sense, HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science. We argue that a key challenge for successful HR-A is to meet high scientific standards while also satisfying business needs in a direct way.

Concern: Balancing Scientific Rigor and Satisfying Business Needs

On the one hand, the use of scientific methods for generating value-adding insights fosters the expectation that rigorous methods will be used. Successful science is characterized by (a) rigorous research questions and hypotheses that derive logically from theory, (b) rigorous methods and research designs that allow valid conclusions to be drawn and eliminate alternative hypotheses, and (c) rigorous measures (Dipboye, 2007). Psychometric quality and fitness for use of the data are discussed as critical prerequisites for HR-A (Cai & Zhu, 2015; Levenson & Fink, 2017). Expressed poignantly by the phrase "garbage in, garbage out," good-quality data are a necessary condition for valuable insights (Levenson & Fink, 2017; Peeters et al., 2020).

On the other hand, stakeholders expect HR-A to solve business problems directly and quickly. A recent case study highlights how organizations prioritize "impression management and speed of output" of HR-A (Jörden et al., 2021, p. 11). Due to work overload, short deadlines, and tight budgets, HR-A needs to satisfy business needs quickly rather than rigorously (Jörden et al., 2021; Levenson & Fink, 2017). In line with this, a preoccupation with data quality distracts attention from the essentials of HR-A (Boudreau & Ramstad, 2004). Following this line of argument, an academic mindset (i.e., an emphasis on scientific rigor) even hinders HR-A, which should deliver practical "good enough" solutions to complex problems (Rasmussen & Ulrich, 2015). This adds to the belief that rigor comes at the cost of relevance. The two are often discussed as mutually exclusive, implying that rigorous research comes at the expense of practical relevance, rendering scientifically derived results useless for the "real world" (Palmer et al., 2009). Given this apparent tension between scientific rigor and the satisfaction of business needs, the question arises: Does HR-A need more or less scientific rigor to deliver value?

In this dialogue article, we argue that HR-A can profit from the use of rigorous methods, but that an academic mindset does not facilitate the satisfaction of business needs throughout all project phases. By comparing process steps in research and HR-A projects, we delineate similarities and differences, suggest when HR-A should adhere to high scientific rigor, and outline what negative consequences might arise if scientific standards are not met in these phases. We do not believe the scientific process to be the optimal blueprint for HR-A in general, but propose that the satisfaction of business needs should be prioritized in the other project phases. Thus, we argue that different capabilities are needed during different HR-A project phases.

The goal of this article is to stimulate a dialogue about similarities and differences between HR-A and research processes so that science and practice can achieve a shared mental model regarding potentially differing goals. We believe this to be a prerequisite for successful science–practice collaboration.
Further, we aim to provide (a) researchers with starting points for capabilities that might explain the impact and success of HR-A and (b) practitioners with an overview of necessary capabilities that can translate into trainable competencies for successful HR-A.

Presentation of the Differences Between HR-A and Research Projects

Table 1 shows a simplified process for HR-A and research projects. HR-A projects can range from descriptive to prescriptive analyses (Marler & Boudreau, 2017); we refer to sophisticated HR-A that goes beyond HR controlling.

HR-A differs significantly from research projects regarding the identification of the problem. As the novelty of the research question is not important for HR-A (e.g., companies might copy a project another company has done before), HR-A is phenomenon-driven rather than theory-driven. Stakeholders might point at potential problems (e.g., management forms a hypothesis that the HR-A team must investigate), or HR might detect anomalies in a report and thus form a hypothesis concerning a potential problem. An academic mindset would hinder effective projects, as HR-A teams are expected to contribute to strategy execution rather than to offer incremental insights on theory (Levenson, 2018; Rasmussen & Ulrich, 2015). In this project phase, the capability to prioritize strategically important questions is crucial.
Table 1. A comparison between HR-A and research projects along a simplified project process

(1) Identification of problem. HR-A project: short-, mid-, and long-term impact on strategic business challenges very important; phenomenon-driven. Research project: novelty of the research question very important; theory-driven.
(2) Derivation of research question. Both: theory-based derivation of research questions and hypotheses.
(3) Data collection and analysis. Both: data quality assessment and use of appropriate analytical procedures.
(4) Results. HR-A project: generalizability of findings beyond the organizational context does not matter. Research project: generalizability of findings very important.
(5) Implementation and evaluation. HR-A project: highly relevant to the organization. Research project: usually not part of the project.

Similarly, the interpretation of the results and the implementation and evaluation of outcomes differ. For HR-A, the generalizability of the findings does not matter; it might even be counterproductive for organizations to publish results that other companies can benefit from (cf. the resource-based view; Barney, 1996). Further, the implementation of derived measures and their evaluation are usually not part of research projects.

By contrast, the derivation of research questions and the collection and analysis of data should not differ between HR-A and research projects. Although the defined problem for an HR-A project is phenomenon-driven, the subsequent research questions/hypotheses can either be developed based on theory or investigated exploratorily. A process with high theoretical and methodological rigor would place an emphasis on a thorough literature review for deriving research questions and testable hypotheses and would ensure rigorous data quality assessment.
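A rigorous data quality assessment includes checking whether multi-item survey measures are internally consistent before their scores enter any model. The following is a minimal sketch using entirely simulated data; the four-item scale, its factor structure, and the sample size are illustrative assumptions, not figures from this article. Cronbach's alpha cleanly separates a coherent scale from a bundle of unrelated items mislabeled as one:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variance_sum = items.var(axis=0, ddof=1).sum()
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variance_sum / total_score_variance)

rng = np.random.default_rng(seed=1)
n_respondents = 500

# Four items driven by one common "commitment" factor: internally consistent.
factor = rng.normal(size=(n_respondents, 1))
consistent_scale = factor + 0.5 * rng.normal(size=(n_respondents, 4))

# Four mutually unrelated items treated as a scale: not consistent.
unrelated_items = rng.normal(size=(n_respondents, 4))

print(f"alpha (coherent scale):  {cronbach_alpha(consistent_scale):.2f}")
print(f"alpha (unrelated items): {cronbach_alpha(unrelated_items):.2f}")
```

In practice, an alpha well below conventional benchmarks (around .70) would be a signal to revisit the measure before drawing any conclusion from it.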
However, due to short deadlines, exploratory data analysis is oftentimes favored, and "interesting patterns" in the data are used for gaining insights (Levenson & Fink, 2017). The need for quick insights promotes the use of easily available data, since psychometric data assessment takes time. Yet the exploratory investigation of "good enough" data without a theoretical rationale and without a data quality assessment entails several drawbacks. We therefore suggest that during these project phases, high scientific rigor should be a key priority; otherwise, severe negative consequences might arise, as detailed in the following.

Justification of a Scientific Approach for Deriving Research Questions and Collecting and Analyzing Data

The importance of high scientific rigor for the derivation of research questions, data collection, and analysis can be illustrated with a hypothetical example. An international insurance company experiences considerable turnover among customer consultants and sets out to answer the question of what explains these high turnover rates. The HR-A team starts by reviewing the scientific literature on individual (e.g., job involvement), job-related (e.g., autonomy), and structural (e.g., training opportunities) predictors of turnover. Shortly afterwards, the team is pressed to present its analyses and recommended actions. Under this time pressure, the team starts the analysis immediately, which leads to a focus on already available data: individual and job-related factors from the HR information system (personnel, structural, and contractual data) and from annual feedback reports (commitment, job satisfaction). Exploratory analyses show that personnel and contractual factors explain most of the variance: Employees who work from home (WFH) and men are less likely to leave the company than employees working from the office and women. Based on these results, management decides to implement a new WFH policy and a female mentoring program.
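The risk in this kind of theory-free pattern mining can be made concrete with a small simulation (a hedged sketch with entirely made-up numbers; the employee and variable counts below are illustrative assumptions, not figures from the example). When many candidate predictors are screened against an outcome that none of them truly relates to, a conventional significance cutoff alone will still flag some of them:

```python
import numpy as np

rng = np.random.default_rng(seed=7)
n_employees, n_candidates = 400, 80

# Hypothetical screening exercise: a binary turnover flag and 80 candidate
# predictors generated independently of it, so no predictor truly matters.
turnover = rng.integers(0, 2, size=n_employees)
candidates = rng.normal(size=(n_employees, n_candidates))

# Mine for "interesting patterns": flag every predictor whose correlation
# with turnover exceeds the rough p < .05 cutoff of |r| > 2 / sqrt(n).
t = (turnover - turnover.mean()) / turnover.std()
c = (candidates - candidates.mean(axis=0)) / candidates.std(axis=0)
correlations = (c * t[:, None]).mean(axis=0)  # Pearson r per candidate
cutoff = 2.0 / np.sqrt(n_employees)
false_findings = int((np.abs(correlations) > cutoff).sum())

# By construction every hit is a false positive; with 80 candidates,
# roughly 5 % of them will clear the cutoff by chance alone.
print(f"{false_findings} of {n_candidates} unrelated predictors look 'significant'")
```

A theoretical rationale for which variables to include, and why, is what guards against turning such chance patterns into personnel decisions.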
The approach can be deemed reasonable given the time restrictions. Nevertheless, skipping the derivation of theory-based hypotheses and a thorough data quality assessment entails severe drawbacks.

A lack of theoretical underpinning for the expected effects entails ethical and legal challenges. If data are mined with the primary purpose of maximizing predicted outcomes (Braun & Kuljanin, 2015), interpreting the results and deriving recommendations from them can be challenging. A theoretical rationale for including variables prevents a focus on irrelevant (not job-related) and potentially biased (e.g., gender) factors (Tippins et al., 2021). In the example, there is no theoretical rationale, that is, no theory that explains¹ why gender might predict turnover. Yet the statistical relation (the "interesting pattern") can become the basis for personnel decisions (recruiting, mentoring, etc.). If effects cannot be explained by theory, deriving actions remains guesswork and is accompanied by potential ethical and legal conflicts. In the example, the effect of gender might be explained in multiple ways (less support, a discriminatory culture, etc.). As there is no theoretical framework to help explain the effect, the recommended actions might not be effective (a mentoring program will likely not remedy a discriminatory culture) and bear potential for ethical and legal concerns (e.g., if it leads to fewer promotions of women). Therefore, HR-A should aim not only at finding significant effects but also at understanding why the effects occur (Tippins et al., 2021).

Additionally, a theoretical framework ensures that all relevant factors are considered. Without a theoretical foundation, it is unclear why WFH and gender emerge as predictors. It is conceivable that WFH leads to more autonomy and that women receive less support than men; in these cases, the actions taken by management would be purposeful. It is also conceivable that other factors explain the effect.
Perhaps employees WFH and men have better access to (virtual) training offered at off-peak times, that is, at times when office workers commute and women are more likely to have care responsibilities. Although the actions taken by the organization would not be detrimental in this case, it would be more effective to invest in more flexible training opportunities.

Another drawback concerns statistical artifacts produced by exploratory analyses. Especially when big data are processed, small amounts of shared confounding variance can produce artifactual associations between otherwise unrelated variables (Smith & Nichols, 2018). Without a theoretical rationale for an expected effect, the risk of statistically significant but practically irrelevant findings increases. In that case, actions derived from the results will not improve the outcomes at stake.

Lastly, the omission of a psychometric assessment can have a major impact on the results. In the example, information on commitment and job satisfaction is derived from annual feedback reports. Although it is desirable to integrate data from various sources for HR-A, doing so necessitates a verification of their content validity (do they measure what they are supposed to measure?), reliability (do they assess the phenomenon of interest with precision?), and fitness for use (are they up to date and exhaustive?). When extracting information from existing data (i.e., data primarily collected for a different purpose), particular emphasis should be placed on ensuring content validity, that is, that concept and measure align (McAbee et al., 2017). If remarks from feedback reports are interpreted as commitment but in fact reflect team satisfaction, the conclusion that commitment is not a significant driver of turnover is false. Actions taken on such results might waste resources (e.g., valid data might show an effect of commitment, indicating a need for interventions) or might even be harmful (e.g., when commitment-supporting practices are discontinued).

Consequences for Research and Theory Building

We have argued that HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science, which necessitates balancing scientific rigor with the satisfaction of business needs. We suggest that for the derivation of research questions, data collection, and analysis, high scientific rigor can enhance HR-A success significantly. For identifying the problem, interpreting the results, and implementing and evaluating outcomes, however, the satisfaction of business needs should be prioritized. Thus, we posit that different capabilities predict HR-A success, depending on the project phase.

Table 2 offers suggestions for operationalizing the capabilities of satisfying business needs and of scientific rigor as predictors of HR-A success. Theorizing about the necessary capabilities and how they might be operationalized is important because there is a need to explain how HR-A works and which success factors contribute to successful HR-A (McCartney & Fu, 2022a, 2022b). The proposed capabilities also offer starting points for discussing which competencies HR analysts need. Based on empirical investigations of the relation between competencies and HR-A success, implications can be derived for the training of HR analysts or for teaching in degree programs such as work and organizational psychology or HR management. Work on the necessary competencies for the role of HR analysts is only emerging. We suggest that it is indispensable to offer rigorous statistical training as well as knowledge and skills training that ensures that business needs can be identified and stakeholder communication can be initiated. One important competency in this regard could be storytelling.
The importance of storytelling is increasingly emphasized (Fu et al., 2022), as it allows HR analysts to present analytical findings in a way that lets stakeholders grasp their importance. Storytelling competencies could be the link between methodological rigor and the simultaneous communication of the findings' relevance for satisfying business needs. We encourage theoretical work on the relation between these competencies and a dialogue on best-practice teaching methods for preparing students for the role of HR analysts.

Consequences for Practice

Despite the value propositions of HR-A, adoption rates in practice are low (McCartney & Fu, 2022b). Most organizations are far from a strategic use of HR-A (Weibel et al., 2019), and organizations in Europe feel less ready to apply HR-A than those in non-European countries (Guenole et al., 2017). We argue that the success of HR-A can be enhanced by the capability to balance the satisfaction of business needs and adherence to scientific standards depending on the project phase. HR-A teams might use the operationalizations specified in Table 2 to check whether their capabilities need development in this regard.
Table 2. Operationalization of HR-A capabilities

Satisfaction of business needs:
• A strategically important question is addressed
• The focus is on strategic capabilities rather than on short-term incremental improvements
• Results enable decision support for stakeholders and justify a business case and people-related investments

Rigorous derivation of research questions:
• There are theoretical assumptions about the investigated effects
• Testable hypotheses are formulated that derive logically from theory

Rigorous data quality assessment:
• Psychometric rigor is tested (reliability, validity, objectivity)
• Fitness for use is ensured (accuracy, timeliness, usability, completeness)

Science–practice collaborations have been proposed as a means to ensure the scientific rigor of HR-A, since organizational researchers can contribute methodological and analytical expertise (Angrave et al., 2016). However, for such collaborations to be successful, there needs to be a shared mental model regarding the goals of the project. As presented in Table 1, reaching a shared goal for an HR-A project and a research project can be challenging due to significant differences in problem definition, implementation, and evaluation. Thus, a focus should be placed on collaborative project execution (derivation of testable hypotheses, data quality assessment), for which academic expertise can be a valuable addition to HR-A execution. Such collaborations can also help leverage results for science, in the sense that findings from scientific studies can be tested by HR-A teams in different settings and contexts. In this way, we believe it is possible for practice to profit from science and for science to profit from practice.

References

Angrave, D., Charlwood, A., Kirkpatrick, I., Lawrence, M., & Stuart, M. (2016). HR and analytics: Why HR is set to fail the big data challenge. Human Resource Management Journal, 26(1), 1–11. https://doi.org/10.1111/1748-8583.12090
Barney, J. B. (1996). The resource-based theory of the firm. Organization Science, 7(5), 469. https://doi.org/10.1287/orsc.7.5.469
Boudreau, J. W., & Ramstad, P. M. (2004). Talentship and human resource measurement and analysis: From ROI to strategic organizational change. Marshall School of Business.
Braun, M. T., & Kuljanin, G. (2015). Big data and the challenge of construct validity. Industrial and Organizational Psychology, 8(4), 521–527. https://doi.org/10.1017/iop.2015.77
Cai, L., & Zhu, Y. (2015). The challenges of data quality and data quality assessment in the big data era. Data Science Journal, 14(2), 1–10. https://doi.org/10.5334/dsj-2015-002
Dipboye, R. L. (2007). Eight outrageous statements about HR science. Human Resource Management Review, 17(2), 96–106. https://doi.org/10.1016/j.hrmr.2007.04.001
Falletta, S. V., & Combs, W. L. (2020). The HR analytics cycle: A seven-step process for building evidence-based and ethical HR analytics capabilities. Journal of Work-Applied Management, 13(1), 51–68. https://doi.org/10.1108/JWAM-03-2020-0020
Fu, N., Keegan, A., & McCartney, S. (2022). The duality of HR analysts' storytelling: Showcasing and curbing. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12466
Guenole, N., Feinzig, S., Green, D., & Zhang, H. (2017). HR analytics readiness: How does Europe compare to the rest of the world? IBM Smart Workforce Institute.
Jörden, N. M., Sage, D., & Trusson, C. (2021). 'It's so fake': Identity performances and cynicism within a people analytics team. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12412
Levenson, A. (2018). Using workforce analytics to improve strategy execution. Human Resource Management, 57(3), 685–700. https://doi.org/10.1002/hrm.21850
Levenson, A., & Fink, A. (2017). Human capital analytics: Too much data and analysis, not enough models and business insights. Journal of Organizational Effectiveness: People and Performance, 4(2), 145–156. https://doi.org/10.1108/JOEPP-03-2017-0029
Marler, J. H., & Boudreau, J. W. (2017). An evidence-based review of HR analytics. The International Journal of Human Resource Management, 28(1), 3–26. https://doi.org/10.1080/09585192.2016.1244699
McAbee, S. T., Landis, R. S., & Burke, M. I. (2017). Inductive reasoning: The promise of big data. Human Resource Management Review, 27(2), 277–290. https://doi.org/10.1016/j.hrmr.2016.08.005
McCartney, S., & Fu, N. (2022a). Bridging the gap: Why, how and when HR analytics can impact organizational performance. Management Decision, 60(13), 25–47. https://doi.org/10.1108/MD-12-2020-1581
McCartney, S., & Fu, N. (2022b). Promise versus reality: A systematic review of the ongoing debates in people analytics. Journal of Organizational Effectiveness: People and Performance, 9(2), 281–311. https://doi.org/10.1108/JOEPP-01-2021-0013
Palmer, D., Dick, B., & Freiburger, N. (2009). Rigor and relevance in organization studies. Journal of Management Inquiry, 18(4), 265–272. https://doi.org/10.1177/1056492609343491
Peeters, T., Paauwe, J., & van de Voorde, K. (2020). People analytics effectiveness: Developing a framework. Journal of Organizational Effectiveness: People and Performance, 7(2), 203–219. https://doi.org/10.1108/JOEPP-04-2020-0071
Rasmussen, T., & Ulrich, D. (2015). Learning from practice: How HR analytics avoids being a management fad. Organizational Dynamics, 44(3), 236–242. https://doi.org/10.1016/j.orgdyn.2015.05.008
Smith, S. M., & Nichols, T. E. (2018). Statistical challenges in "big data" human neuroimaging. Neuron, 97(2), 263–268. https://doi.org/10.1016/j.neuron.2017.12.018
Tippins, N. T., Oswald, F. L., & McPhail, S. M. (2021). Scientific, legal, and ethical concerns about AI-based personnel selection tools: A call to action. Personnel Assessment and Decisions, 7(2), 1–22. https://doi.org/10.31234/osf.io/6gczw
Weibel, A., Schafheitle, S. D., & Ebert, I. L. (2019). Goldgräberstimmung im Personalmanagement? Wie Datafizierungs-Technologien die Personalsteuerung verändern [Gold-rush atmosphere in personnel management? How datafication technologies are changing human resource management]. Zeitschrift für Organisationsentwicklung, 3, 23–29. https://www.alexandria.unisg.ch/server/api/core/bitstreams/1063c47b-b26a-46cd-9e87-812db9a7dbf7/content

¹ The explainability of an effect (a logical framework) is not the same as explainable analytics (is the output of complex analyses understandable, and are decisions by models transparent?). Although explainable analytics is of great importance for ethical and legal HR-A, this part refers to the logical reasoning behind an expected effect.

Zeitschrift für Arbeits- und Organisationspsychologie A&O (2023). https://doi.org/10.1026/0932-4089/a000418. © 2023 Hogrefe Verlag.
2023https://doi.org/10.1026/0932-4089/a000418PDF ToolsAdd to favoritesDownload CitationsTrack Citations ShareShare onFacebookTwitterLinkedInReddit SectionsMorePresentation of the ProblemHuman resource analytics (HR-A) refers to a data-driven HR practice that is based on analyzing data related to employees, HR processes, and organizational performance (Marler & Boudreau, 2017). The results serve as a basis for decision-making on strategically relevant business issues (Falletta & Combs, 2020). It further concerns the aspiration to show how HR measures can create (monetary) value by linking “soft” people factors (e. g., employee engagement) to strategically relevant outcomes such as customer satisfaction (Falletta & Combs, 2020).A prerequisite for using HR-A to improve decision-making is increased research and experimentation within the organization (Falletta & Combs, 2020; Peeters, et al., 2020). In this, HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science. We argue that it is a key challenge for successful HR-A to meet high scientific standards but to also satisfy business needs in a direct way.Concern: Balancing Scientific Rigor and Satisfying Business NeedsOn the one hand, the use of scientific methods for generating value-adding insights fosters the expectation to use rigorous methods. Successful science is characterized by (a) rigorous research questions and hypotheses that derive logically from theory, (b) rigorous methods and research designs that allow valid conclusions to be drawn and eliminate alternative hypothesis, and (c) rigorous measures (Dipboye, 2007). Psychometric quality and fitness for use of the data are discussed as critical prerequisites for HR-A (Cai & Zhu, 2015; Levenson & Fink, 2017). 
Expressed poignantly by the phrase “garbage in, garbage out,” good quality data are a necessary condition for valuable insights (Levenson & Fink, 2017; Peeters et al., 2020).On the other hand, stakeholders expect HR-A to solve business problems in a direct and quick way. A recent case study highlights how organizations prioritize “impression management and speed of output” of HR-A (Jörden et al., 2021, p. 11). Due to work overload, short deadlines, and tight budgets, HR-A needs to satisfy business needs quickly rather than rigorously (Jörden et al., 2021; Levenson & Fink, 2017). In line with this, a preoccupation with data quality distracts attention away from the essentials of HR-A (Boudreau & Ramstad, 2004). Following this line of argumentation, an academic mindset (i. e., an emphasis on scientific rigor) even hinders HR-A, which should serve to deliver practical “good enough” solutions to complex problems (Rasmussen & Ulrich, 2015). This adds to the belief that rigor comes at the cost of relevance. The two are often discussed as mutually exclusive, implying that rigorous research comes at the expense of practical relevance, rendering scientifically derived results useless for the “real world” (Palmer et al., 2009). With this apparent tension between scientific rigor and the satisfaction of business needs, the question arises: Does HR-A need more or less scientific rigor to deliver value?In this dialogue article, we argue that HR-A can profit from the use of rigorous methods, but that an academic mindset does not facilitate satisfaction of business needs throughout all project phases. By comparing process steps in research and HR-A projects, we delineate similarities and differences and suggest when HR-A should adhere to high scientific rigor and what negative consequences might arise if scientific standards are not adhered to in these phases. 
We do not believe the scientific process to be the optimal blueprint for HR-A in general but propose that the satisfaction of business needs should be prioritized in other project phases. Thus, we argue that different capabilities are needed during different HR-A project phases.The goal of this article is to stimulate a dialogue about similarities and differences between HR-A and research processes so that science and research can achieve a shared mental model regarding potentially differing goals. We believe this to be a prerequisite for successful science–practice collaboration. Further, we aim to provide (a) researchers with starting points for capabilities that might explain the impact and success of HR-A and (b) practitioners with an overview of necessary capabilities that can translate into trainable competencies for successful HR-A.Presentation of the Differences Between HR-A and Research ProjectsTable 1 shows a simplified process of HR-A and research projects. HR-A projects can range from descriptive to prescriptive analyses (Marler & Boudreau, 2017); we refer to sophisticated HR-A that goes beyond HR controlling.HR-A differs significantly from research projects regarding the identification of the problem. As the novelty of the research question is not important for HR-A (e. g., companies might copy a project another company did before), HR-A is phenomenon-driven rather than theory-driven. Stakeholders might point at potential problems (e. g., management forms a hypothesis and HR-A team must investigate) or HR might detect anomalies in a reporting and thus forms a hypothesis concerning a potential problem. An academic mindset would hinder effective projects as HR-A teams are expected to contribute to strategy execution rather than to offer incremental insights on theory (Levenson, 2018; Rasmussen & Ulrich, 2015). In this project phase, the capability to prioritize strategically important questions is of importance. 
Table 1. A comparison between HR-A and research projects in terms of a simplified project process

(1) Identification of problem. HR-A project: short-, mid-, and long-term impact for strategic business challenges very important; phenomenon-driven. Research project: novelty of research question very important; theory-driven.
(2) Derivation of research question. Both: theory-based derivation of research question and hypotheses.
(3) Data collection and analysis. Both: data quality assessment and use of appropriate analytical procedures.
(4) Results. HR-A project: generalizability of findings beyond the organizational context does not matter. Research project: generalizability of findings very important.
(5) Implementation and evaluation. HR-A project: highly relevant to the organization. Research project: usually not part of the research project.

Similarly, the interpretation of the results and the implementation and evaluation of outcomes differ. For HR-A, the generalizability of the findings does not matter; it might even be counterproductive for organizations to publish results that other companies can benefit from (cf. resource-based view; Barney, 1996). Further, the implementation of derived implications and their evaluation is usually not part of research projects.

By contrast, the derivation of research questions and data collection and analysis should not differ between HR-A and research projects. Although the defined problem for an HR-A project is phenomenon-driven, the subsequent research questions/hypotheses can either be developed based on theory or investigated exploratorily. A process with high theoretical and methodological rigor would place an emphasis on a thorough literature review for deriving research questions and testable hypotheses and ensure rigorous data quality assessment.
However, due to short deadlines, exploratory data analysis is oftentimes favored, and “interesting patterns” in the data are used for gaining insights (Levenson & Fink, 2017). The need for quick insights promotes the use of easily available data, since psychometric data assessment takes time. Yet, the exploratory investigation of “good enough” data without a theoretical rationale and data quality assessment entails several drawbacks. Thus, we suggest that during these project phases, high scientific rigor should be a key priority, as otherwise severe negative consequences might arise, which are detailed in the following.

Justification of a Scientific Approach for Deriving Research Questions and Collecting and Analyzing Data

The importance of high scientific rigor for the derivation of research questions, data collection, and analysis can be illustrated with a hypothetical example. An international insurance company experiences considerable turnover among customer consultants and sets out to answer the question of what explains these high turnover rates. The HR-A team starts by reviewing scientific literature on individual (e.g., job involvement), job-related (e.g., autonomy), and structural (e.g., training opportunities) predictors of turnover. Shortly afterwards, the team is pressed to present its analyses and recommended actions. Due to the time pressure, the HR-A team starts the analysis immediately, which leads to a focus on already available data. Individual- and job-related factors from the HR information system (personnel, structural, contractual data) and annual feedback reports (commitment, job satisfaction) are available. Exploratory analyses show that personnel and contractual factors explain most variance: Employees who work from home (WFH) and men are less likely to leave the company, compared to employees working from the office and women. Based on the results, the management decides to implement a new WFH policy and a female mentoring program.
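The kind of quick, exploratory pattern hunt described in this hypothetical example can be sketched in a few lines. All records, field names, and group labels below are invented for illustration; the sketch merely shows how turnover rates broken down by readily available HR-system fields produce the “interesting pattern” the example turns on:

```python
# Hypothetical exploratory analysis: turnover rates per level of two already
# available HR-system fields. All records below are invented for illustration.
from collections import defaultdict

records = [
    # (works_from_home, gender, left_company)
    (True,  "m", False), (True,  "f", False), (False, "m", False),
    (False, "f", True),  (True,  "m", False), (False, "f", True),
    (False, "m", True),  (True,  "f", False), (False, "f", True),
    (True,  "m", False), (False, "m", False), (False, "f", False),
]

def turnover_rate(records, key):
    """Share of leavers per level of a single grouping variable."""
    counts = defaultdict(lambda: [0, 0])  # level -> [leavers, total]
    for rec in records:
        level = key(rec)
        counts[level][0] += rec[2]
        counts[level][1] += 1
    return {lvl: leavers / total for lvl, (leavers, total) in counts.items()}

by_wfh = turnover_rate(records, key=lambda r: r[0])
by_gender = turnover_rate(records, key=lambda r: r[1])
# The "interesting pattern": office workers and women leave more often.
print(by_wfh, by_gender)
```

Note what the sketch does not contain: no theoretical rationale for the grouping variables and no check of data quality, which is precisely the shortcut discussed next.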
The approach can be deemed reasonable given the time restrictions. Nevertheless, skipping the derivation of a theory-based hypothesis and a thorough data quality assessment entails severe drawbacks.

A lack of theoretical underpinnings for the expected effects entails ethical and legal challenges. If data are mined with the primary purpose of maximizing predicted outcomes (Braun & Kuljanin, 2015), the interpretation of and recommendations based on the results can be challenging. A theoretical rationale for including variables prevents a focus on irrelevant (not job-related) and potentially biased (e.g., gender) factors (Tippins et al., 2021). In the example, there is no theoretical rationale (a theory that explains1 why gender might predict turnover). Yet, the statistical relation (the “interesting pattern”) can become the reason for personnel decisions (recruiting, mentoring, etc.). If effects cannot be explained by theory, deriving actions is still guesswork and accompanied by potential ethical and legal conflicts. Regarding the example, the effect of gender might be explained in multiple ways (less support, a discriminatory culture, etc.). As there is no theoretical framework to help explain the effect, recommended actions might not be effective (a mentoring program will likely not remedy a discriminatory culture) and bear potential for ethical and legal concerns (e.g., if it leads to fewer promotions of women). Therefore, HR-A should aim not only at finding significant effects, but also at understanding the reasons why effects occur (Tippins et al., 2021).

Additionally, a theoretical framework also ensures that all relevant factors are considered. Without a theoretical foundation, it is unclear why WFH and gender emerge as predictors. It is conceivable that WFH leads to more autonomy and that women receive less support than men. In these cases, the actions taken by the management would be purposeful. It is also conceivable that other factors explain the effect.
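How easily an unmodeled third variable can manufacture such an effect is shown by a small, purely synthetic simulation (cf. the statistical-artifact argument of Smith & Nichols, 2018): two variables that never influence each other become measurably correlated once both share a modest amount of variance with a common cause. The variable labels in the comments are hypothetical, chosen only to mirror the running example:

```python
# Purely synthetic illustration: x and y never influence each other, but both
# share a small amount of variance with a third, unmodeled variable.
import random

random.seed(1)
n = 5000
confounder = [random.gauss(0, 1) for _ in range(n)]   # e.g., access to training
x = [0.3 * c + random.gauss(0, 1) for c in confounder]  # e.g., "WFH intensity"
y = [0.3 * c + random.gauss(0, 1) for c in confounder]  # e.g., "retention score"

def pearson(a, b):
    """Pearson correlation of two equally long sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

r = pearson(x, y)
# With n = 5000, even this modest shared variance yields a clearly nonzero
# correlation that a significance test would flag, although x never affects y.
print(f"r = {r:.3f}")
```

Acting on the x–y association directly would miss the common cause, which is exactly the situation sketched for WFH and gender in the following.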
Perhaps employees working from home and men have better access to (virtual) trainings offered at off-peak times, that is, times at which office workers commute and women are more likely to have care responsibilities. Although the actions taken by the organization would not be detrimental in this case, it would be more effective to invest in more flexible training opportunities.

Another drawback concerns statistical artifacts produced by exploratory analyses. Especially if big data are processed, small amounts of shared confounding variance can produce artifactual associations between otherwise unrelated variables (Smith & Nichols, 2018). Without a theoretical rationale for an expected effect, the risk of statistically significant but practically irrelevant findings increases. In this case, actions derived from the results will not lead to an improvement of the outcomes at stake.

Lastly, the omission of a psychometric assessment can have a major impact on the results. Information on commitment and job satisfaction is derived from annual feedback reports. Although it is desirable to integrate data from various sources for HR-A, doing so necessitates a verification of their content validity (do they measure what they are supposed to measure?), reliability (do they assess the phenomenon of interest with precision?), and fitness for use (are they up to date and exhaustive?). When extracting information from existing data (i.e., data that were primarily collected for a different purpose), particular emphasis should be placed on ensuring content validity, that is, that concept and measure align (McAbee et al., 2017). If remarks from feedback reports are interpreted as commitment but instead reflect team satisfaction, the conclusion that commitment is not a significant driver of turnover is false. Actions taken based on the results might waste resources (e.g., valid data might show an effect of commitment, indicating a need for interventions) or might even be harmful (e.
g., when commitment-supporting practices are discontinued).

Consequences for Research and Theory Building

We argued that HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science, which necessitates balancing scientific rigor with the satisfaction of business needs. We suggest that for the derivation of research questions, data collection, and analysis, high scientific rigor can enhance HR-A success significantly. However, for identifying the problem, interpreting the results, and implementing and evaluating outcomes, the satisfaction of business needs should be prioritized. Thus, we posit that different capabilities predict HR-A success, depending on the project phase.

Table 2 offers suggestions for operationalizing the capabilities of satisfying business needs and of scientific rigor as predictors for understanding their impact on HR-A success. Theorizing about necessary capabilities and how they might be operationalized is important, as there is a need to explain how HR-A works and which success factors contribute to successful HR-A (McCartney & Fu, 2022a, 2022b). The proposed capabilities also offer starting points for discussing what competencies HR analysts need. Based on empirical investigations of the relation between competencies and HR-A success, implications for the training of HR analysts or for teaching in degree programs, such as work and organizational psychology or HR management, can be derived. Work on the necessary competencies for the role of HR analysts is only emerging. We suggest that it is indispensable to offer rigorous statistical training as well as knowledge and skills training that ensures that business needs can be identified and stakeholder communication can be initiated. One important competency in that regard could be storytelling.
The importance of storytelling is increasingly emphasized (Fu et al., 2022), as it enables HR analysts to present analytical findings in a way that allows stakeholders to understand their importance. Storytelling competencies could be the link between methodological rigor and the simultaneous communication of the relevance of the findings for the satisfaction of business needs. We encourage theoretical work on the relation between these competencies and a dialogue on best-practice teaching methods for preparing students for the role of HR analysts.

Consequences for Practice

Despite the value propositions of HR-A, adoption rates in practice are low (McCartney & Fu, 2022b). Most organizations are far from a strategic use of HR-A (Weibel et al., 2019), and organizations in Europe feel less ready to apply HR-A than organizations in non-European countries (Guenole et al., 2017). We argue that the success of HR-A can be enhanced by the capability to balance the satisfaction of business needs and the adherence to scientific standards depending on the project phase. HR-A teams might use the operationalizations specified in Table 2 to check whether their capabilities need to be developed in this regard.
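As one concrete instance of the rigorous-data-quality-assessment capability listed among these operationalizations, a reliability check can be run on survey items before their scores enter any model. The sketch below computes Cronbach's alpha for a hypothetical four-item commitment scale; all ratings and the item set are invented, and a common rule of thumb treats alpha of at least .70 as acceptable:

```python
# Cronbach's alpha as a quick internal-consistency check before survey scores
# are used in an HR-A model. All ratings below are invented for illustration.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sum score
    return k / (k - 1) * (1 - sum(pvariance(col) for col in items) / pvariance(totals))

# Hypothetical 4-item commitment scale, six respondents, 1-5 rating scale
commitment_items = [
    [5, 4, 2, 5, 3, 4],
    [4, 4, 1, 5, 3, 5],
    [5, 3, 2, 4, 2, 4],
    [4, 5, 2, 5, 3, 4],
]
alpha = cronbach_alpha(commitment_items)
print(f"Cronbach's alpha = {alpha:.2f}")  # well above the .70 rule of thumb here
```

Internal consistency alone does not establish content validity; it only flags scales whose items do not hang together at all.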
Table 2. Operationalization of HR-A capabilities

Satisfaction of business needs:
• A strategically important question is addressed.
• The focus is on strategic capabilities rather than on short-term incremental improvements.
• Results enable decision support for stakeholders and justify a business case and people-related investments.

Rigorous derivation of research questions:
• There are theoretical assumptions about the investigated effects.
• Testable hypotheses are formulated that derive logically from theory.

Rigorous data quality assessment:
• Psychometric rigor is tested (reliability, validity, objectivity).
• Fitness for use is ensured (accuracy, timeliness, usability, completeness).

Science–practice collaborations have been proposed as a means to ensure the scientific rigor of HR-A, since organizational researchers can contribute methodological and analytical expertise (Angrave et al., 2016). However, for science–practice collaborations to be successful, there needs to be a shared mental model regarding the goals of the project. As presented in Table 1, we suggest that reaching a shared goal for an HR-A project and a research project can be challenging, due to significant differences in problem definition, implementation, and evaluation of the project. Thus, a focus should be placed on collaborative project execution (derivation of testable hypotheses, data quality assessment), for which academic expertise can be a valuable addition to HR-A execution. Such collaborations can also help leverage scientific results in the sense that findings from scientific studies can be tested by HR-A teams in different settings and contexts. In this way, we believe it is possible for practice to profit from science and for science to profit from practice.

Literatur

Angrave, D., Charlwood, A., Kirkpatrick, I., Lawrence, M., & Stuart, M. (2016).
HR and analytics: Why HR is set to fail the big data challenge. Human Resource Management Journal, 26(1), 1–11. https://doi.org/10.1111/1748-8583.12090
Barney, J. B. (1996). The resource-based theory of the firm. Organization Science, 7(5), 469. https://doi.org/10.1287/orsc.7.5.469
Boudreau, J. W., & Ramstad, P. M. (2004). Talentship and human resource measurement and analysis: From ROI to strategic organizational change. Marshall School of Business.
Braun, M. T., & Kuljanin, G. (2015). Big data and the challenge of construct validity. Industrial and Organizational Psychology, 8(4), 521–527. https://doi.org/10.1017/iop.2015.77
Cai, L., & Zhu, Y. (2015). The challenges of data quality and data quality assessment in the big data era. Data Science Journal, 14(2), 1–10. https://doi.org/10.5334/dsj-2015-002
Dipboye, R. L. (2007). Eight outrageous statements about HR science. Human Resource Management Review, 17(2), 96–106. https://doi.org/10.1016/j.hrmr.2007.04.001
Falletta, S. V., & Combs, W. L. (2020). The HR analytics cycle: A seven-step process for building evidence-based and ethical HR analytics capabilities. Journal of Work-Applied Management, 13(1), 51–68. https://doi.org/10.1108/JWAM-03-2020-0020
Fu, N., Keegan, A., & McCartney, S. (2022). The duality of HR analysts’ storytelling: Showcasing and curbing. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12466
Guenole, N., Feinzig, S., Green, D., & Zhang, H. (2017). HR analytics readiness: How does Europe compare to the rest of the world? IBM Smart Workforce Institute.
Jörden, N. M., Sage, D., & Trusson, C. (2021). ‘It’s so fake’: Identity performances and cynicism within a people analytics team. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12412
Levenson, A. (2018). Using workforce analytics to improve strategy execution. Human Resource Management, 57(3), 685–700. https://doi.org/10.1002/hrm.21850
Levenson, A., & Fink, A. (2017). Human capital analytics: Too much data and analysis, not enough models and business insights. Journal of Organizational Effectiveness: People and Performance, 4(2), 145–156. https://doi.org/10.1108/JOEPP-03-2017-0029
Marler, J. H., & Boudreau, J. W. (2017). An evidence-based review of HR analytics. The International Journal of Human Resource Management, 28(1), 3–26. https://doi.org/10.1080/09585192.2016.1244699
McAbee, S. T., Landis, R. S., & Burke, M. I. (2017). Inductive reasoning: The promise of big data. Human Resource Management Review, 27(2), 277–290. https://doi.org/10.1016/j.hrmr.2016.08.005
McCartney, S., & Fu, N. (2022a). Bridging the gap: Why, how and when HR analytics can impact organizational performance. Management Decision, 60(13), 25–47. https://doi.org/10.1108/MD-12-2020-1581
McCartney, S., & Fu, N. (2022b). Promise versus reality: A systematic review of the ongoing debates in people analytics. Journal of Organizational Effectiveness: People and Performance, 9(2), 281–311. https://doi.org/10.1108/JOEPP-01-2021-0013
Palmer, D., Dick, B., & Freiburger, N. (2009). Rigor and relevance in organization studies.
Journal of Management Inquiry, 18(4), 265–272. https://doi.org/10.1177/1056492609343491
Peeters, T., Paauwe, J., & van de Voorde, K. (2020). People analytics effectiveness: Developing a framework. Journal of Organizational Effectiveness: People and Performance, 7(2), 203–219. https://doi.org/10.1108/JOEPP-04-2020-0071
Rasmussen, T., & Ulrich, D. (2015). Learning from practice: How HR analytics avoids being a management fad. Organizational Dynamics, 44(3), 236–242. https://doi.org/10.1016/j.orgdyn.2015.05.008
Smith, S. M., & Nichols, T. E. (2018). Statistical challenges in “big data” human neuroimaging. Neuron, 97(2), 263–268. https://doi.org/10.1016/j.neuron.2017.12.018
Tippins, N. T., Oswald, F. L., & McPhail, S. M. (2021). Scientific, legal, and ethical concerns about AI-based personnel selection tools: A call to action. Personnel Assessment and Decisions, 7(2), 1–22. https://doi.org/10.31234/osf.io/6gczw
Weibel, A., Schafheitle, S. D., & Ebert, I. L. (2019). Goldgräberstimmung im Personalmanagement? Wie Datafizierungs-Technologien die Personalsteuerung verändern [Gold-rush atmosphere in personnel management? How datafication technologies are changing human resource management]. Zeitschrift für Organisationsentwicklung, 3, 23–29. https://www.alexandria.unisg.ch/server/api/core/bitstreams/1063c47b-b26a-46cd-9e87-812db9a7dbf7/content

1 The explainability of an effect (logical framework) is not the same as explainable analytics (is the output of complex analyses understandable / are decisions by models transparent?).
Although explainable analytics are of great importance for ethical and legal HR-A, this part refers to the logical reasoning behind an expected effect.
When to Think Like a Scientist
Free AccessWhen to Think Like a ScientistBalancing Scientific Rigor and Satisfying Business Needs in HR AnalyticsGreta Ontrup, Jana Moeschke, Ralf Buechsenschuss, and Torsten BiemannGreta OntrupGreta Ontrup, Department of Work, Organizational and Business Psychology, Ruhr University Bochum, Universitätsstraße 150, 44801 Bochum, Germany, [email protected]https://orcid.org/0000-0003-4720-1494Department of Work, Organizational and Business Psychology, Ruhr University Bochum, GermanySearch for more papers by this author, Jana MoeschkeCommunardo Software GmbH, Bonn, GermanySearch for more papers by this author, Ralf BuechsenschussZurich Insurance Company Ltd., Zürich, SwitzerlandDepartment of Business Administration, Human Resource Management and Leadership, University of Mannheim, GermanySearch for more papers by this author, and Torsten Biemannhttps://orcid.org/0000-0003-1728-6765Department of Business Administration, Human Resource Management and Leadership, University of Mannheim, GermanySearch for more papers by this authorPublished Online:July 20, 2023https://doi.org/10.1026/0932-4089/a000418PDF ToolsAdd to favoritesDownload CitationsTrack Citations ShareShare onFacebookTwitterLinkedInReddit SectionsMorePresentation of the ProblemHuman resource analytics (HR-A) refers to a data-driven HR practice that is based on analyzing data related to employees, HR processes, and organizational performance (Marler & Boudreau, 2017). The results serve as a basis for decision-making on strategically relevant business issues (Falletta & Combs, 2020). It further concerns the aspiration to show how HR measures can create (monetary) value by linking “soft” people factors (e. g., employee engagement) to strategically relevant outcomes such as customer satisfaction (Falletta & Combs, 2020).A prerequisite for using HR-A to improve decision-making is increased research and experimentation within the organization (Falletta & Combs, 2020; Peeters, et al., 2020). 
In this, HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science. We argue that it is a key challenge for successful HR-A to meet high scientific standards but to also satisfy business needs in a direct way.Concern: Balancing Scientific Rigor and Satisfying Business NeedsOn the one hand, the use of scientific methods for generating value-adding insights fosters the expectation to use rigorous methods. Successful science is characterized by (a) rigorous research questions and hypotheses that derive logically from theory, (b) rigorous methods and research designs that allow valid conclusions to be drawn and eliminate alternative hypothesis, and (c) rigorous measures (Dipboye, 2007). Psychometric quality and fitness for use of the data are discussed as critical prerequisites for HR-A (Cai & Zhu, 2015; Levenson & Fink, 2017). Expressed poignantly by the phrase “garbage in, garbage out,” good quality data are a necessary condition for valuable insights (Levenson & Fink, 2017; Peeters et al., 2020).On the other hand, stakeholders expect HR-A to solve business problems in a direct and quick way. A recent case study highlights how organizations prioritize “impression management and speed of output” of HR-A (Jörden et al., 2021, p. 11). Due to work overload, short deadlines, and tight budgets, HR-A needs to satisfy business needs quickly rather than rigorously (Jörden et al., 2021; Levenson & Fink, 2017). In line with this, a preoccupation with data quality distracts attention away from the essentials of HR-A (Boudreau & Ramstad, 2004). Following this line of argumentation, an academic mindset (i. e., an emphasis on scientific rigor) even hinders HR-A, which should serve to deliver practical “good enough” solutions to complex problems (Rasmussen & Ulrich, 2015). This adds to the belief that rigor comes at the cost of relevance. 
The two are often discussed as mutually exclusive, implying that rigorous research comes at the expense of practical relevance, rendering scientifically derived results useless for the “real world” (Palmer et al., 2009). With this apparent tension between scientific rigor and the satisfaction of business needs, the question arises: Does HR-A need more or less scientific rigor to deliver value?In this dialogue article, we argue that HR-A can profit from the use of rigorous methods, but that an academic mindset does not facilitate satisfaction of business needs throughout all project phases. By comparing process steps in research and HR-A projects, we delineate similarities and differences and suggest when HR-A should adhere to high scientific rigor and what negative consequences might arise if scientific standards are not adhered to in these phases. We do not believe the scientific process to be the optimal blueprint for HR-A in general but propose that the satisfaction of business needs should be prioritized in other project phases. Thus, we argue that different capabilities are needed during different HR-A project phases.The goal of this article is to stimulate a dialogue about similarities and differences between HR-A and research processes so that science and research can achieve a shared mental model regarding potentially differing goals. We believe this to be a prerequisite for successful science–practice collaboration. Further, we aim to provide (a) researchers with starting points for capabilities that might explain the impact and success of HR-A and (b) practitioners with an overview of necessary capabilities that can translate into trainable competencies for successful HR-A.Presentation of the Differences Between HR-A and Research ProjectsTable 1 shows a simplified process of HR-A and research projects. 
HR-A projects can range from descriptive to prescriptive analyses (Marler & Boudreau, 2017); we refer to sophisticated HR-A that goes beyond HR controlling.HR-A differs significantly from research projects regarding the identification of the problem. As the novelty of the research question is not important for HR-A (e. g., companies might copy a project another company did before), HR-A is phenomenon-driven rather than theory-driven. Stakeholders might point at potential problems (e. g., management forms a hypothesis and HR-A team must investigate) or HR might detect anomalies in a reporting and thus forms a hypothesis concerning a potential problem. An academic mindset would hinder effective projects as HR-A teams are expected to contribute to strategy execution rather than to offer incremental insights on theory (Levenson, 2018; Rasmussen & Ulrich, 2015). In this project phase, the capability to prioritize strategically important questions is of importance. Table 1 A comparison between HR-A and research projects in terms of a simplified project process Simplified processHR-A projectResearch project(1) Identification ofproblemShort-‍, mid-‍, long-term impact for strategic business challenges very important; phenomenon-drivenNovelty of research question very important; theory-driven(2) Derivation of research questionTheory-based derivation of research question and hypotheses(3) Data collectionand analysisData quality assessment and use of appropriate analytical procedures(4) ResultsGeneralizability of findings beyond organizational context does not matterGeneralizability of findings very important(5) Implementationand evaluationHighly relevant to the organizationUsually not part of research projectTable 1 A comparison between HR-A and research projects in terms of a simplified project process View as image HTML Similarly, the interpretation of the results and implementation and evaluation of outcomes differ. 
For HR-A, the generalizability of the findings does not matter; it might even be counterproductive for organizations to publish results that other companies can benefit from (cf. resource-based view; Barney, 1996). Further, the implementation of derived implications and their evaluation is usually not part of research projects.By contrast, the derivation of research questions and data collection and analysis should not differ between HR-A and research projects. Although the defined problem for an HR-A project is phenomenon-driven, the subsequent research questions/hypotheses can either be developed based on theory or investigated exploratorily. A process with high theoretical and methodological rigor would place an emphasis on a thorough literature review for deriving research questions and testable hypotheses and ensure rigorous data quality assessment. However, due to short deadlines, exploratory data analysis is oftentimes favored and “interesting patterns” in the data are used for gaining insights (Levenson & Fink, 2017). The need for quick insights promotes the use of easily available data, since psychometric data assessment takes time. Yet, the exploratory investigation of “good enough” data without a theoretical rationale and data quality assessment entails several drawbacks. Thus, we suggest that during these project phases, high scientific rigor should be a key priority, as otherwise severe negative consequences might arise, which are detailed in the following.Justification of a Scientific Approach for Deriving Research Questions and Collecting and Analyzing DataThe importance of high scientific rigor for the derivation of research questions, data collection, and analysis can be illustrated with a hypothetical example. An international insurance company experiences considerable turnover among customer consultants and sets out to answer the question of what explains these high turnover rates. 
The HR-A team starts by reviewing scientific literature on individual (e. g., job involvement), job-related (e. g., autonomy), and structural (e. g., training opportunities) predictors of turnover. Shortly afterwards, the team is pressed to present their analyses and recommended actions. Due to the time pressure, the HR-A team starts analysis immediately, which causes a focus on already available data. Individual- and job-related factors from the HR information system (personnel, structural, contractual data) and annual feedback reports (commitment, job satisfaction) are available. Exploratory analyses show that personnel and contractual factors explain most variance: Employees that work from home (WFH) and men are less likely to leave the company, compared to employees working from the office and women. Based on the results, the management decides to implement a new WFH policy and a female mentoring program. The approach can be deemed reasonable regarding the time restrictions. Nevertheless, skipping the derivation of a theory-based hypothesis and a thorough data quality assessment entails severe drawbacks.A lack of theoretical underpinnings for the expected effects entails ethical and legal challenges. If data are mined with the primary purpose of maximizing predicted outcomes (Braun & Kuljan, 2015), the interpretation and recommendations based on the results can be challenging. A theoretical rationale for including variables prevents a focus on irrelevant (not job-related) and potentially biased (e. g., gender) factors (Tippins et al., 2021). In the example, there is no theoretical rationale (a theory that explains1 why gender might predict turnover). Yet, the statistical relation (“interesting pattern”) can become the reason for personnel decisions (recruiting, mentoring, etc.). If effects cannot be explained by theory, deriving actions is still guesswork and accompanied by potential ethical and legal conflicts. 
Regarding the example, the effect of gender might be explained in multiple ways (less support, a discriminatory culture, etc.). As there is no theoretical framework to help explain the effect, the recommended actions might not be effective (a mentoring program will likely not remedy a discriminatory culture) and bear potential for ethical and legal concerns (e.g., if it leads to fewer promotions of women). Therefore, HR-A should aim not only at finding significant effects, but at understanding why the effects occur (Tippins et al., 2021).

Additionally, a theoretical framework ensures that all relevant factors are considered. Without a theoretical foundation, it is unclear why WFH and gender emerge as predictors. It is conceivable that WFH leads to more autonomy and that women receive less support than men. In these cases, the actions taken by management would be purposeful. It is also conceivable that other factors explain the effect. Perhaps employees working from home and men have better access to (virtual) trainings offered at off-peak times, that is, times at which office workers commute and women are more likely to have care responsibilities. Although the actions taken by the organization would not be detrimental in this case, it would be more effective to invest in more flexible training opportunities.

Another drawback concerns statistical artifacts produced by exploratory analyses. Especially when big data are processed, small amounts of shared confounding variance can produce artifactual associations between otherwise unrelated variables (Smith & Nichols, 2018). Without a theoretical rationale for an expected effect, the risk of statistically significant but practically irrelevant findings increases. In this case, actions derived from the results will not improve the outcomes at stake.

Lastly, the omission of a psychometric assessment can have a major impact on the results. In the example, information on commitment and job satisfaction is derived from annual feedback reports.
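A basic reliability check on such repurposed report data need not be costly. As a minimal sketch (the scale, items, and responses are invented; the formula is the standard Cronbach's alpha), one can at least verify that the items interpreted as "commitment" hang together before treating their mean as a commitment score:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha; items holds one list of scores per item,
    with respondents in the same order in every list."""
    k = len(items)
    item_var_sum = sum(statistics.pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]
    return k / (k - 1) * (1 - item_var_sum / statistics.pvariance(totals))

# Invented 3-item "commitment" scale, six respondents, 1-5 Likert responses:
items = [
    [4, 5, 3, 2, 4, 5],
    [4, 4, 3, 2, 5, 5],
    [5, 4, 2, 2, 4, 4],
]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")  # here ~.92; values below ~.70 would warrant caution
```

Note that a high alpha alone does not establish content validity: the items could reliably measure team satisfaction instead of commitment, which is why the validity checks discussed next are equally necessary.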
Although it is desirable to integrate data from various sources for HR-A, doing so necessitates verifying their content validity (do they measure what they are supposed to measure?), reliability (do they assess the phenomenon of interest with precision?), and fitness for use (are they up to date and exhaustive?). When extracting information from existing data (i.e., data that were primarily collected for a different purpose), particular emphasis should be placed on ensuring content validity, that is, that concept and measure align (McAbee et al., 2017). If remarks from feedback reports are interpreted as commitment but instead reflect team satisfaction, the conclusion that commitment is not a significant driver of turnover is false. Actions taken on the basis of such results might waste resources (e.g., valid data might show an effect of commitment, indicating a need for interventions) or might even be harmful (e.g., when commitment-supporting practices are discontinued).

Consequences for Research and Theory Building

We argued that HR-A constitutes an intersection between a science-oriented operational practice and an application-oriented HR science, which necessitates balancing scientific rigor with the satisfaction of business needs. We suggest that for the derivation of research questions, data collection, and analysis, high scientific rigor can significantly enhance HR-A success. However, for identifying the problem, interpreting the results, and implementing and evaluating outcomes, the satisfaction of business needs should be prioritized. Thus, we posit that different capabilities predict HR-A success, depending on the project phase. Table 2 offers suggestions for operationalizing the capabilities of satisfying business needs and scientific rigor for HR-A as predictors, in order to understand their impact on HR-A success.
Theorizing the necessary capabilities and how they might be operationalized is important because there is a need to explain how HR-A works and which success factors contribute to successful HR-A (McCartney & Fu, 2022a, 2022b). The proposed capabilities also offer starting points for discussing which competencies HR analysts need. Based on empirical investigations of the relation between competencies and HR-A success, implications can be derived for the training of HR analysts or for teaching in degree programs such as work and organizational psychology or HR management. Work on the competencies required for the role of HR analyst is only emerging. We suggest that it is indispensable to offer rigorous statistical training as well as knowledge and skills training that ensures that business needs can be identified and stakeholder communication can be initiated. One important competency in this regard could be storytelling. The importance of storytelling is increasingly emphasized (Fu et al., 2022), as it enables HR analysts to present analytical findings in a way that lets stakeholders understand their importance. Storytelling competencies could thus link methodological rigor with the communication of the findings' relevance for satisfying business needs. We encourage theoretical work on the relation between these competencies and a dialogue on best-practice teaching methods for preparing students for the role of HR analyst.

Consequences for Practice

Despite the value propositions of HR-A, adoption rates in practice are low (McCartney & Fu, 2022b). Most organizations are far from a strategic use of HR-A (Weibel et al., 2019), and organizations in Europe feel less ready to apply HR-A than organizations in non-European countries (Guenole et al., 2017). We argue that the success of HR-A can be enhanced by the capability to balance the satisfaction of business needs and adherence to scientific standards depending on the project phase.
HR-A teams might use the operationalizations specified in Table 2 to check whether their capabilities need to be developed in this regard.

Table 2. Operationalization of HR-A capabilities

Satisfaction of business needs
• A strategically important question is addressed
• The focus is on strategic capabilities rather than on short-term incremental improvements
• Results enable decision support for stakeholders and the justification of a business case and people-related investments

Rigorous derivation of research questions
• There are theoretical assumptions about the investigated effects
• Testable hypotheses are formulated that derive logically from theory

Rigorous data quality assessment
• Psychometric rigor is tested (reliability, validity, objectivity)
• Fitness for use is ensured (accuracy, timeliness, usability, completeness)

Science-practice collaborations have been proposed as a means of ensuring the scientific rigor of HR-A, since organizational researchers can contribute methodological and analytical expertise (Angrave et al., 2016). However, for science-practice collaborations to be successful, there needs to be a shared mental model regarding the goals of the project. As presented in Table 1, we suggest that reaching a shared goal for an HR-A project and a research project can be challenging, owing to significant differences in problem definition, implementation, and evaluation. Thus, a focus should be placed on collaborative project execution (derivation of testable hypotheses, data quality assessment), for which academic expertise can be a valuable addition. Such collaborations can also benefit science, in that results from scientific studies can be tested by HR-A teams in different settings and contexts.
In this way, we believe, it is possible for practice to profit from science and for science to profit from practice.

References

Angrave, D., Charlwood, A., Kirkpatrick, I., Lawrence, M., & Stuart, M. (2016). HR and analytics: Why HR is set to fail the big data challenge. Human Resource Management Journal, 26(1), 1–11. https://doi.org/10.1111/1748-8583.12090
Barney, J. B. (1996). The resource-based theory of the firm. Organization Science, 7(5), 469. https://doi.org/10.1287/orsc.7.5.469
Boudreau, J. W., & Ramstad, P. M. (2004). Talentship and human resource measurement and analysis: From ROI to strategic organizational change. Marshall School of Business.
Braun, M. T., & Kuljanin, G. (2015). Big data and the challenge of construct validity. Industrial and Organizational Psychology, 8(4), 521–527. https://doi.org/10.1017/iop.2015.77
Cai, L., & Zhu, Y. (2015). The challenges of data quality and data quality assessment in the big data era. Data Science Journal, 14(2), 1–10. https://doi.org/10.5334/dsj-2015-002
Dipboye, R. L. (2007). Eight outrageous statements about HR science. Human Resource Management Review, 17(2), 96–106. https://doi.org/10.1016/j.hrmr.2007.04.001
Falletta, S. V., & Combs, W. L. (2020). The HR analytics cycle: A seven-step process for building evidence-based and ethical HR analytics capabilities. Journal of Work-Applied Management, 13(1), 51–68. https://doi.org/10.1108/JWAM-03-2020-0020
Fu, N., Keegan, A., & McCartney, S. (2022). The duality of HR analysts' storytelling: Showcasing and curbing. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12466
Guenole, N., Feinzig, S., Green, D., & Zhang, H. (2017). HR analytics readiness: How does Europe compare to the rest of the world? IBM Smart Workforce Institute.
Jörden, N. M., Sage, D., & Trusson, C. (2021). 'It's so fake': Identity performances and cynicism within a people analytics team. Human Resource Management Journal. Advance online publication. https://doi.org/10.1111/1748-8583.12412
Levenson, A. (2018). Using workforce analytics to improve strategy execution. Human Resource Management, 57(3), 685–700. https://doi.org/10.1002/hrm.21850
Levenson, A., & Fink, A. (2017). Human capital analytics: Too much data and analysis, not enough models and business insights. Journal of Organizational Effectiveness: People and Performance, 4(2), 145–156. https://doi.org/10.1108/JOEPP-03-2017-0029
Marler, J. H., & Boudreau, J. W. (2017). An evidence-based review of HR analytics. The International Journal of Human Resource Management, 28(1), 3–26. https://doi.org/10.1080/09585192.2016.1244699
McAbee, S. T., Landis, R. S., & Burke, M. I. (2017). Inductive reasoning: The promise of big data. Human Resource Management Review, 27(2), 277–290. https://doi.org/10.1016/j.hrmr.2016.08.005
McCartney, S., & Fu, N. (2022a). Bridging the gap: Why, how and when HR analytics can impact organizational performance. Management Decision, 60(13), 25–47. https://doi.org/10.1108/MD-12-2020-1581
McCartney, S., & Fu, N. (2022b). Promise versus reality: A systematic review of the ongoing debates in people analytics. Journal of Organizational Effectiveness: People and Performance, 9(2), 281–311. https://doi.org/10.1108/JOEPP-01-2021-0013
Palmer, D., Dick, B., & Freiburger, N. (2009). Rigor and relevance in organization studies. Journal of Management Inquiry, 18(4), 265–272. https://doi.org/10.1177/1056492609343491
Peeters, T., Paauwe, J., & van de Voorde, K. (2020). People analytics effectiveness: Developing a framework. Journal of Organizational Effectiveness: People and Performance, 7(2), 203–219. https://doi.org/10.1108/JOEPP-04-2020-0071
Rasmussen, T., & Ulrich, D. (2015). Learning from practice: How HR analytics avoids being a management fad. Organizational Dynamics, 44(3), 236–242. https://doi.org/10.1016/j.orgdyn.2015.05.008
Smith, S. M., & Nichols, T. E. (2018). Statistical challenges in "big data" human neuroimaging. Neuron, 97(2), 263–268. https://doi.org/10.1016/j.neuron.2017.12.018
Tippins, N. T., Oswald, F. L., & McPhail, S. M. (2021). Scientific, legal, and ethical concerns about AI-based personnel selection tools: A call to action. Personnel Assessment and Decisions, 7(2), 1–22. https://doi.org/10.31234/osf.io/6gczw
Weibel, A., Schafheitle, S. D., & Ebert, I. L. (2019). Goldgräberstimmung im Personalmanagement? Wie Datafizierungs-Technologien die Personalsteuerung verändern [Gold-rush atmosphere in personnel management? How datafication technologies are changing human resource management]. Zeitschrift für Organisationsentwicklung, 3, 23–29. https://www.alexandria.unisg.ch/server/api/core/bitstreams/1063c47b-b26a-46cd-9e87-812db9a7dbf7/content

¹ The explainability of an effect (a logical framework) is not the same as explainable analytics (is the output of complex analyses understandable; are decisions by models transparent?). Although explainable analytics are of great importance for ethical and legal HR-A, this part refers to the logical reasoning behind an expected effect.

Zeitschrift für Arbeits- und Organisationspsychologie A&O (2023). https://doi.org/10.1026/0932-4089/a000418. © 2023 Hogrefe Verlag