{"title":"Demonstrating Research Quality","authors":"Marie Crowe, Paul Slater, Hugh McKenna","doi":"10.1111/jpm.13145","DOIUrl":null,"url":null,"abstract":"<p>It is a truism that research papers that take account of previous studies and report interesting findings are of little merit if the methodology lacks rigour. Such papers are of no value to nurses or nursing if the science is wrong. As editors, we often get frustrated when what initially appears to be an interesting paper has to be rejected because the authors did not convince us that the research approach was robust and systematic.</p><p>Rigorously designed and conducted research enables transparency and reproducibility, qualities valued by funders and governments. Journals play an important role in promoting such work. Because of our experience, the Journal of Psychiatric and Mental Health Nursing is publishing a series of articles to assist authors in producing research papers that demonstrate excellence in design and execution.</p><p>Quality mental health nursing research focuses on building evidence to improve the care of those with lived experience. However, research users, such as practitioners or policymakers, must have faith in what researchers produce. Much research is publicly funded, regardless of whether the grants come from governments (taxpayers) or charities (donations). It is in the interest of these bodies to ensure that money from the public purse only funds high-quality research.</p><p>As a result, many countries have developed processes for evaluating the quality of research conducted in their publicly funded universities. For instance, since 1986, the United Kingdom has run research assessment exercises every 5–7 years, with other countries emulating this. The initial drive for this was primarily economic and initiated by the Thatcher Government, which was fundamentally suspicious of academia and wanted to bring market forces to bear on all aspects of economic, social and cultural life (Harvey <span>2005</span>). As a consequence, academic research became a measurable commodity aligned with government business and linked to financial incentives. Universities are now regarded as key contributors to the performance of the economy.</p><p>While its origins were within the United Kingdom (Research Assessment Exercise, followed by the Excellence Framework [REF]), this approach to evaluating academic research has been copied in at least 20 other countries. For example, Hong Kong, Poland, Sweden and Norway. New Zealand and Australia have also conducted similar quality research assessments but have paused their planned upcoming evaluations. For some countries, the amount of annual research funding that a university obtains from the government is informed by the results of these research assessment exercises.</p><p>It is often the case that the quality of research is judged by an assessment of a university's research publications, research impact and research environment. We will concentrate here on the quality of publications, which is assessed on originality, significance and rigour. It will come as little surprise to readers that good quality research papers must adhere to these three criteria. 
However, it is surprising how many of the 700 manuscripts that are submitted to this journal each year do not mention these three words.</p><p>As far as originality is concerned, editors and reviewers are seeking evidence of the research having been built on previous studies to push forward the boundaries of knowledge on the topic. The questions asked include—was the study the first to investigate the topic, did it produce and interpret new findings, engage with new problems, develop new research methods or analytical approaches, provide new arguments, interpretations or insights, collect new types of data or advance theory, policy or practice? It is important that the authors make the case for originality and not have editors or reviewers second guessing this.</p><p>Authors should also make the case for significance or what may be referred to as the potential impact of the study's findings. How, for instance, do the findings advance or have the potential to advance knowledge, skills, scholarly thought or the development and understanding of practice, education, management or policy? The timeliness of the research may also suggest significance, as is its contribution to theory-building or theory-testing.</p><p>As alluded to above, without rigour, the integrity of originality and significance is undermined. Therefore, rigour is of specific importance to any evaluation of publication quality. It can be defined as the extent to which the research demonstrates intellectual coherence and integrity, using robust and appropriate concepts, analyses, sources, theories and/or methodologies. We will now deal with how rigour is assured in quantitative and qualitative research studies.</p><p>In quantitative research, the focus is on the application of theory, research designs, methodologies and procedures to generate hypotheses and test them. It aims to produce research findings with internal and external validity and reliability. Internal validity is concerned with how a study is designed, conducted and analysed and the procedural sureties we put in place to maximise the study's inferential impact. External validity is concerned with whether the findings of a study are generalisable to other settings. This also includes ecological validity, the measure of applicability to practice settings. It is these measures that raise a paper from parochial relevance to international excellence and world-leading importance, thereby increasing its publish-ability and research assessment worth.</p><p>The ‘a priori’ and objective nature of quantitative research means that issues affecting validity, such as inappropriate research design, sampling issues, poor instrument selection, limited statistical analysis, etc., need to be addressed before data collection commences. Once data collection begins, there is no opportunity for correction. Many irreversible internal and external validity errors were made because researchers did not consult a methodologist/statistician early enough in the conceptualisation stage. Remember, there are limits to what statistical techniques can correct. Journal reviewers are often left feeling that the authors could have substantially strengthened the paper for publication if they had a better understanding and application of quantitative methods.</p><p>Rigour in qualitative research involves providing an auditable decision trail of the design, conduct and reporting. A rationale needs to be provided for each step of the research process. 
Arguably, it is the research question that drives the design and reporting of the research. It must be clearly focused and supported by evidence. It also needs to be underpinned by a strong conceptual framework, which informs the selection of appropriate research methods. This enhances trustworthiness and minimises the researcher bias sometimes inherent in qualitative methodologies (Johnson, Adkins, and Chauvin <span>2020</span>).</p><p>The research question is foundational to the choice of conceptual framework, methodology and design of a qualitative study, and these facilitate the research process that best answers the research question. Furthermore, each phase of the study must be described in terms of its relevance to the research question. The research plan (recruitment, data collection and data analysis) should also systematically examine the research question within a relevant context.</p><p>One common problem seen in qualitative papers in research assessment exercises and reviews for publication is that the research question is not resolved in the findings. A well-designed interview (or other data collection method) will capture complex data pertinent to the research question. An interview that gets side-tracked (by data that may be interesting but not pertinent) or one that fails to interrogate the participants' descriptions in relation to the overarching research question will produce insubstantial data for analysis. The analysis should be driven by both the methodological approach and the research question, and the reporting of findings needs to reflect the outcomes of the analysis.</p><p>The analytic process is a systematic interpretation supported by evidence from the data. It needs to focus on interpretation rather than description alone, that is, what does the data mean in terms of the research questions and how does the analysis reflect the methodological approach? Sometimes, submissions to the Journal report under-analysed data. This may take the form of ‘themes’ that reproduce responses to the interview questions. These should be designed to elicit responses to the research question. It is an example of insufficient analysis when responses are reported without interpretation. The interpretation needs to be guided by the methodological approach or, in the case of thematic analysis within the conceptual framework. Following these rules enables others to ascertain whether the findings might be relevant to their clinical practice.</p><p>In conclusion, regardless of whether the research approach is quantitative or qualitative, the publication should show scientific excellence in design, method, execution and analysis. It should demonstrate a systematic and rigorous approach to its relationship with existing research, be adequately detailed for replication purposes, error sources are identified, accounted for and minimised and limitations are highlighted.</p><p>The Journal of Psychiatry and Mental Health Nursing is publishing a series of research methodology papers covering both quantitative and qualitative research approaches. For the former, they deal with philosophical paradigms, research designs, data collection methods and psychometric and statistical analysis techniques. For qualitative research, three papers focus on understanding the place of qualitative research in mental health nursing practice, conducting qualitative research and publishing qualitative research studies. 
The overall aim of these series is to enhance the quality of research papers submitted to the journal to increase understanding of the research process and improve the application of findings (internal/external validity) to produce good quality, rigorous research of high international relevance.</p><p>Marie Crowe and Paul Slater are Associate Editors JPMHN. Hugh McKenna is an Editor JPMHN.</p>","PeriodicalId":50076,"journal":{"name":"Journal of Psychiatric and Mental Health Nursing","volume":"32 3","pages":"686-688"},"PeriodicalIF":2.9000,"publicationDate":"2024-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/jpm.13145","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Psychiatric and Mental Health Nursing","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/jpm.13145","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"NURSING","Score":null,"Total":0}
It is a truism that research papers that take account of previous studies and report interesting findings are of little merit if the methodology lacks rigour. Such papers are of no value to nurses or nursing if the science is wrong. As editors, we often get frustrated when what initially appears to be an interesting paper has to be rejected because the authors did not convince us that the research approach was robust and systematic.
Rigorously designed and conducted research enables transparency and reproducibility, qualities valued by funders and governments. Journals play an important role in promoting such work. Because of our experience, the Journal of Psychiatric and Mental Health Nursing is publishing a series of articles to assist authors in producing research papers that demonstrate excellence in design and execution.
Quality mental health nursing research focuses on building evidence to improve the care of those with lived experience. However, research users, such as practitioners or policymakers, must have faith in what researchers produce. Much research is publicly funded, regardless of whether the grants come from governments (taxpayers) or charities (donations). It is in the interest of these bodies to ensure that money from the public purse only funds high-quality research.
As a result, many countries have developed processes for evaluating the quality of research conducted in their publicly funded universities. For instance, since 1986, the United Kingdom has run research assessment exercises every 5–7 years, with other countries emulating this. The initial drive for this was primarily economic and initiated by the Thatcher Government, which was fundamentally suspicious of academia and wanted to bring market forces to bear on all aspects of economic, social and cultural life (Harvey 2005). As a consequence, academic research became a measurable commodity aligned with government business and linked to financial incentives. Universities are now regarded as key contributors to the performance of the economy.
While its origins were within the United Kingdom (the Research Assessment Exercise, succeeded by the Research Excellence Framework [REF]), this approach to evaluating academic research has been copied in at least 20 other countries, including Hong Kong, Poland, Sweden and Norway. New Zealand and Australia have also conducted similar research quality assessments but have paused their planned upcoming evaluations. For some countries, the amount of annual research funding that a university obtains from the government is informed by the results of these research assessment exercises.
It is often the case that the quality of research is judged by an assessment of a university's research publications, research impact and research environment. We will concentrate here on the quality of publications, which is assessed on originality, significance and rigour. It will come as little surprise to readers that good quality research papers must adhere to these three criteria. However, it is surprising how many of the 700 manuscripts that are submitted to this journal each year do not mention these three words.
As far as originality is concerned, editors and reviewers are seeking evidence that the research builds on previous studies to push forward the boundaries of knowledge on the topic. The questions asked include: was the study the first to investigate the topic? Did it produce and interpret new findings, engage with new problems, develop new research methods or analytical approaches, provide new arguments, interpretations or insights, collect new types of data, or advance theory, policy or practice? It is important that the authors make the case for originality rather than leave editors or reviewers to second-guess it.
Authors should also make the case for significance, or what may be referred to as the potential impact of the study's findings. How, for instance, do the findings advance or have the potential to advance knowledge, skills, scholarly thought or the development and understanding of practice, education, management or policy? The timeliness of the research may also suggest significance, as may its contribution to theory-building or theory-testing.
As alluded to above, without rigour, the integrity of originality and significance is undermined. Therefore, rigour is of specific importance to any evaluation of publication quality. It can be defined as the extent to which the research demonstrates intellectual coherence and integrity, using robust and appropriate concepts, analyses, sources, theories and/or methodologies. We will now deal with how rigour is assured in quantitative and qualitative research studies.
In quantitative research, the focus is on the application of theory, research designs, methodologies and procedures to generate and test hypotheses. It aims to produce research findings with internal and external validity and reliability. Internal validity is concerned with how a study is designed, conducted and analysed, and with the procedural safeguards put in place to maximise the strength of the inferences the study can support. External validity is concerned with whether the findings of a study are generalisable to other settings; this includes ecological validity, the extent to which findings apply to practice settings. It is these qualities that raise a paper from parochial relevance to international excellence and world-leading importance, thereby increasing its publishability and its worth in research assessment.
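To make the reliability point concrete, the following is a minimal sketch (not drawn from the editorial; the data and function name are illustrative) of Cronbach's alpha, one routinely reported internal-consistency estimate for a multi-item instrument, written in Python:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        # items: respondents x items matrix of scores for a single scale.
        # A common rule of thumb treats alpha above roughly 0.7 as acceptable,
        # although the appropriate threshold depends on the instrument's purpose.
        k = items.shape[1]                               # number of items
        item_variances = items.var(axis=0, ddof=1)       # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical responses: five participants completing a four-item scale.
    responses = np.array([
        [3, 4, 3, 4],
        [2, 2, 3, 2],
        [4, 5, 4, 5],
        [1, 2, 1, 2],
        [3, 3, 4, 3],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")

Reporting an estimate of this kind, alongside evidence of validity, is one small way an author can demonstrate the procedural safeguards referred to above.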
The ‘a priori’ and objective nature of quantitative research means that issues affecting validity, such as inappropriate research design, sampling problems, poor instrument selection and limited statistical analysis, need to be addressed before data collection commences. Once data collection begins, there is no opportunity for correction. Many irreversible internal and external validity errors are made because researchers do not consult a methodologist or statistician early enough in the conceptualisation stage. Remember, there are limits to what statistical techniques can correct. Journal reviewers are often left feeling that the authors could have substantially strengthened the paper for publication if they had had a better understanding and application of quantitative methods.
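As an illustration of the kind of a priori planning described above (not part of the editorial itself; the effect size, alpha and power values are hypothetical assumptions), here is a minimal Python sketch of a sample-size calculation for a two-group comparison using the statsmodels power module:

    # A priori sample-size estimate for an independent-samples t-test.
    # The assumed values below are illustrative only; in practice they should be
    # justified from prior studies or pilot data, ideally with a statistician.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(
        effect_size=0.5,  # assumed standardised mean difference (Cohen's d)
        alpha=0.05,       # two-sided significance level
        power=0.80,       # desired probability of detecting the assumed effect
    )
    print(f"Approximate participants required per group: {n_per_group:.0f}")

Running and reporting such a calculation before recruitment is one simple way of documenting that sampling decisions were made prospectively rather than repaired after the fact.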
Rigour in qualitative research involves providing an auditable decision trail of the design, conduct and reporting. A rationale needs to be provided for each step of the research process. Arguably, it is the research question that drives the design and reporting of the research. It must be clearly focused and supported by evidence. It also needs to be underpinned by a strong conceptual framework, which informs the selection of appropriate research methods. This enhances trustworthiness and minimises the researcher bias sometimes inherent in qualitative methodologies (Johnson, Adkins, and Chauvin 2020).
The research question is foundational to the choice of conceptual framework, methodology and design of a qualitative study, and these facilitate the research process that best answers the research question. Furthermore, each phase of the study must be described in terms of its relevance to the research question. The research plan (recruitment, data collection and data analysis) should also systematically examine the research question within a relevant context.
One common problem seen in qualitative papers in research assessment exercises and reviews for publication is that the research question is not resolved in the findings. A well-designed interview (or other data collection method) will capture complex data pertinent to the research question. An interview that gets side-tracked (by data that may be interesting but not pertinent) or one that fails to interrogate the participants' descriptions in relation to the overarching research question will produce insubstantial data for analysis. The analysis should be driven by both the methodological approach and the research question, and the reporting of findings needs to reflect the outcomes of the analysis.
The analytic process is a systematic interpretation supported by evidence from the data. It needs to focus on interpretation rather than description alone, that is, what does the data mean in terms of the research question and how does the analysis reflect the methodological approach? Sometimes, submissions to the Journal report under-analysed data. This may take the form of ‘themes’ that simply reproduce responses to the interview questions, which should themselves be designed to elicit responses relevant to the research question. Reporting responses without interpretation is an example of insufficient analysis. The interpretation needs to be guided by the methodological approach or, in the case of thematic analysis, by the conceptual framework. Following these principles enables others to ascertain whether the findings might be relevant to their clinical practice.
In conclusion, regardless of whether the research approach is quantitative or qualitative, the publication should show scientific excellence in design, method, execution and analysis. It should demonstrate a systematic and rigorous approach to its relationship with existing research, be adequately detailed for replication purposes, identify, account for and minimise sources of error, and highlight limitations.
The Journal of Psychiatric and Mental Health Nursing is publishing a series of research methodology papers covering both quantitative and qualitative research approaches. For the former, the papers deal with philosophical paradigms, research designs, data collection methods, and psychometric and statistical analysis techniques. For qualitative research, three papers focus on understanding the place of qualitative research in mental health nursing practice, conducting qualitative research and publishing qualitative research studies. The overall aim of this series is to enhance the quality of research papers submitted to the journal, to increase understanding of the research process and to improve the application of findings (internal/external validity), thereby producing good quality, rigorous research of high international relevance.
Marie Crowe and Paul Slater are Associate Editors of JPMHN. Hugh McKenna is an Editor of JPMHN.