A proposal for computed tomography–based algorithm for the management of radial head and neck fractures: the Proximal and Articular Radial fractures Management (PARMa) classification
{"title":"A proposal for computed tomography–based algorithm for the management of radial head and neck fractures: the Proximal and Articular Radial fractures Management (PARMa) classification","authors":"Filippo Calderazzi MD, PhD , Davide Donelli MD , Cristina Galavotti MD , Alessandro Nosenzo MD , Paolo Bastia MD , Enricomaria Lunini MD , Marco Paterlini MD , Giorgio Concari MD , Alessandra Maresca MD , Alessandro Marinelli MD","doi":"10.1016/j.jseint.2024.09.031","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Owing to the great variety of fracture patterns and limitations of the standard radiographic investigation, all the already available classification systems for radial head and neck fractures (RHNFs) are limited by a poor-to-moderate degree of intraobserver and interobserver reliability. Although computed tomography (CT) is being increasingly used to better understand the fracture characteristics, a CT-based classification system of RHNFs is still lacking. Therefore, in this agreement study, we aimed to propose a classification system based on two-dimensional and three-dimensional (2D/3D) CT to test the hypothesis that this classification has good intraobserver and interobserver reliability. We have also provided a treatment algorithm.</div></div><div><h3>Methods</h3><div>Our proposed classification—Proximal and Articular Radial fractures Management (PARMa)—is based on 2D/3D CT imaging. It is divided into four types based on different fractures patterns. The 2D/3D scans of 90 RHNFs were evaluated in a blinded fashion by eight orthopedic and one radiology consultant, according to the proposed classification. The first phase of observation aimed to estimate the interobserver agreement. The second phase involved a new observation, 4 weeks after the first analysis, and estimated the intraobserver reliability. The standard radiographs of these 90 fractures were also evaluated by the same observers, with the same timing and methods, based on the same classification. Cohen's Kappa was applied for intraobserver agreement. Fleiss's Kappa was used both within and among the evaluators. Kendall's coefficient of concordance was employed to determine the strength of association among the appraisers’ rankings. Furthermore, Krippendorff's alpha was chosen as an adjunctive analysis to assess between evaluators’ agreement.</div></div><div><h3>Results</h3><div>For the intraobserver agreement, Fleiss’ Kappa statistics confirmed the consistency (overall kappa values: 0.70-0.82). Cohen’s Kappa statistics aligned with Fleiss’ Kappa, with similar kappa values and significant <em>P</em> values (<em>P</em> < .001). For interobserver agreement, Fleiss’ Kappa statistics for between appraisers showed moderate-to-substantial agreement, with kappa values ranging from 0.54 to 0.82 for different responses. The results relating to the appraisers' observation of standard radiographs showed that the overall Fleiss’ Kappa values for intraobserver agreement ranged from 0.34 to 0.82, whereas Fleiss’ Kappa statistics for interobserver agreement ranged from 0.40 to 0.69.</div></div><div><h3>Conclusions</h3><div>The proposed classification system is expected to be reliable, reproducible, and useful for preoperative planning and surgical management. 
Both 2D and 3D CT allow the identification of the magnitude and position of displacement and articular surface involvement.</div></div>","PeriodicalId":34444,"journal":{"name":"JSES International","volume":"9 2","pages":"Pages 549-561"},"PeriodicalIF":0.0000,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JSES International","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666638324004432","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Medicine","Score":null,"Total":0}
Abstract
Background
Owing to the great variety of fracture patterns and the limitations of standard radiographic investigation, all currently available classification systems for radial head and neck fractures (RHNFs) show only poor-to-moderate intraobserver and interobserver reliability. Although computed tomography (CT) is increasingly used to better characterize these fractures, a CT-based classification system for RHNFs is still lacking. Therefore, in this agreement study, we propose a classification system based on two-dimensional and three-dimensional (2D/3D) CT and test the hypothesis that it has good intraobserver and interobserver reliability. We also provide a treatment algorithm.
Methods
Our proposed classification, Proximal and Articular Radial fractures Management (PARMa), is based on 2D/3D CT imaging and divides RHNFs into four types according to fracture pattern. The 2D/3D scans of 90 RHNFs were evaluated in a blinded fashion by eight orthopedic consultants and one radiology consultant, according to the proposed classification. The first phase of observation estimated interobserver agreement; the second phase, a new observation 4 weeks after the first, estimated intraobserver reliability. The standard radiographs of the same 90 fractures were also evaluated by the same observers, with the same timing and methods, using the same classification. Cohen's Kappa was applied for intraobserver agreement, and Fleiss' Kappa was used for both intraobserver and interobserver agreement. Kendall's coefficient of concordance was employed to determine the strength of association among the appraisers' rankings, and Krippendorff's alpha was chosen as an adjunctive measure of interobserver agreement.
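For readers who want to reproduce this style of agreement analysis, the sketch below computes the four statistics named above on simulated ratings. The 90-fracture, 9-observer, 4-type layout mirrors the study design, but the ratings themselves and the 85% concordance rate are illustrative assumptions, not the study's data.

```python
# A minimal sketch of the agreement statistics named in the Methods, run on
# simulated ratings; the layout mirrors the study design, but the data and the
# 85% concordance rate are assumptions.
# Requires: numpy, scipy, scikit-learn, statsmodels, and the `krippendorff` package.
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa
import krippendorff

rng = np.random.default_rng(0)
n_fractures, n_raters = 90, 9
truth = rng.integers(1, 5, n_fractures)            # "true" PARMa type, 1-4

# Each rater agrees with `truth` ~85% of the time, otherwise rates at random.
noise = rng.integers(1, 5, (n_fractures, n_raters))
ratings = np.where(rng.random((n_fractures, n_raters)) < 0.85,
                   truth[:, None], noise)

# Intraobserver agreement: Cohen's Kappa between one rater's two sessions.
session2 = np.where(rng.random(n_fractures) < 0.85,
                    ratings[:, 0], rng.integers(1, 5, n_fractures))
print("Cohen's kappa (rater 1, session 1 vs 2):",
      round(cohen_kappa_score(ratings[:, 0], session2), 2))

# Interobserver agreement: Fleiss' Kappa across all nine raters.
counts, _ = aggregate_raters(ratings)              # fractures x categories
print("Fleiss' kappa (9 raters):", round(fleiss_kappa(counts), 2))

# Krippendorff's alpha at the nominal level (raters as rows, units as columns).
print("Krippendorff's alpha:",
      round(krippendorff.alpha(reliability_data=ratings.T,
                               level_of_measurement="nominal"), 2))

# Kendall's W from rank sums: W = 12*S / (m^2 * (n^3 - n)); the tie correction
# is omitted for brevity (ties are frequent with only four categories).
ranks = np.apply_along_axis(rankdata, 0, ratings)  # rank within each rater
S = ((ranks.sum(axis=1) - ranks.sum(axis=1).mean()) ** 2).sum()
print("Kendall's W:",
      round(12 * S / (n_raters**2 * (n_fractures**3 - n_fractures)), 2))
```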
Results
For intraobserver agreement, Fleiss' Kappa confirmed consistency (overall kappa values: 0.70-0.82). Cohen's Kappa aligned with Fleiss' Kappa, with similar values and significant P values (P < .001). For interobserver agreement, Fleiss' Kappa showed moderate-to-substantial agreement between appraisers, with kappa values ranging from 0.54 to 0.82 across responses. For the standard radiographs, overall Fleiss' Kappa values ranged from 0.34 to 0.82 for intraobserver agreement and from 0.40 to 0.69 for interobserver agreement.
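Verbal labels such as "moderate-to-substantial" are conventionally read against the Landis and Koch (1977) benchmarks. The abstract does not state which scale the authors used, so the small helper below is an assumption offered only for orientation when reading the kappa ranges above.

```python
# A small helper, assuming the conventional Landis & Koch (1977) benchmarks;
# which scale the authors actually used is not stated in the abstract.
def landis_koch(kappa: float) -> str:
    """Map a kappa value to the Landis & Koch (1977) verbal label."""
    if kappa < 0:
        return "poor"
    for upper, label in [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
                         (0.80, "substantial"), (1.00, "almost perfect")]:
        if kappa <= upper:
            return label
    return "almost perfect"

for k in (0.34, 0.40, 0.54, 0.69, 0.70, 0.82):
    print(f"kappa = {k:.2f} -> {landis_koch(k)}")
# Under this scale the interobserver range 0.54-0.82 spans "moderate" to
# "almost perfect", consistent with the moderate-to-substantial summary above.
```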
Conclusions
The proposed classification system is expected to be reliable, reproducible, and useful for preoperative planning and surgical management. Both 2D and 3D CT allow the identification of the magnitude and position of displacement and articular surface involvement.