Title: A new structure for quality improvement reports
Authors: F Moss, R Thompson
Journal: Quality in Health Care (QHC), Vol 8, Issue 2, page 76
DOI: 10.1136/qshc.8.2.76
Published: 1999-06-01 (Journal Article)
Citations: 68
Abstract
Finding out how others' schemes to implement change succeed, or why they fail, can be extremely helpful. It can save time and effort and may accelerate improvements in service delivery. One of the stated aims of this journal is to publish such quality improvement reports alongside papers that report the results of relevant research. The editorial team are aware, through discussions with colleagues, papers presented at meetings, and local reports, that many people are involved in useful and informative quality improvement projects that could have valuable messages for others. And yet in the past seven and a half years we have published only 12 quality improvement reports, the most recent in December 1995. We rely on submitted reports, and one reason for this dearth of published quality improvement reports may be that people are simply too busy improving care to have time to write. But there may be other barriers. The standard form for papers in medical journals is the scientific IMRaD (introduction, methods, results, and discussion) structure. This is a convenient and helpful structure for writing about research. When writing a quality improvement report, however, this structure does not quite fit. For example, there will be a first methods section (when the measurements are made) and a first results section (when the results are analysed). There then follows a second methods section describing the implementation of change, perhaps a third methods section when the measurements are repeated to assess progress, and then a second results section describing the improvements. Writing quality improvement reports in this way may not only be difficult but may result in a paper that does not convey the lessons that others would find useful.
The editorial team has therefore developed a new structure (box) for describing quality improvement work that we think will reflect this work more accurately and that we hope will encourage authors to write about their experience. The first quality improvement report using this structure is published in this issue of Quality in Health Care (page 119). There is also another fundamental difference between quality improvement reports and reports of original research. Research seeks broadly to produce generalisable results. Thus, trials of thrombolytic treatment in acute myocardial infarction sought to determine whether thrombolysis reduced subsequent mortality, such that the results could be generalised to coronary care units and medical wards treating such patients. On the other hand, a local audit or quality improvement project that seeks to assess whether patients are appropriately treated with thrombolytic therapy does so to monitor and ensure the implementation of evidence based treatment in practice. The results of such a study are not generalisable to other coronary care units in the same way as the preceding research evidence, and for many this would suggest that the work is not publishable. We would disagree. The results may not be generalisable; they are unique to the unit where the audit was undertaken, and most probably to the time of the audit. Any identified problem needs local diagnosis, and local change must occur, to create improvement. But a well written and structured quality improvement report may include generalisable methods and strategies for change from which others undertaking similar audits would benefit. Thus, good quality improvement reports should offer a means of disseminating good practice, and there is little doubt in our minds that much that is good about such work is not yet widely reported. As a result, practitioners are denied the opportunity to learn from each other as the science of audit and quality improvement matures.
All quality improvement reports submitted to the journal will be peer reviewed, and the decision on acceptance will be made by the editorial team. Quality improvement reports do not necessarily have to report success, but all should contain lessons or messages that are relevant to others and that could help them in the process of improving care. Measurements need to be robust and rigorous, and results analysed and interpreted with care. Quality improvement reports should include reflection on the causes of deficiencies in care. Problems associated with implementing change should not be glossed over but described, and possible causes and solutions discussed. We hope that the new structure will encourage those with practical experience of quality improvement to write about it in a way that will help others. And we hope that readers will find the new quality improvement reports interesting and useful; please let us know.