Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures
Pedro Henrique de Magalhães Tenório, Marcelo Marques Vieira, Abner Alberti, Marcos Felipe Marcatto de Abreu, João Carlos Nakamoto, Alberto Cliquet Júnior
Revista Brasileira de Ortopedia (English Edition), vol. 53, no. 6, pp. 703–706, November 2018. DOI: 10.1016/j.rboe.2017.08.024
Cited by: 2
Abstract
Objective
This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures.
Methods
Thirty observers, divided into three groups (senior orthopedic surgery residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After four weeks, the same observers re-evaluated the initial 52 radiographs in randomized order. Interobserver, intergroup, and intraobserver agreement were assessed using the kappa index, and kappa values were interpreted as proposed by Landis and Koch.
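For readers unfamiliar with the statistic, the sketch below is a minimal illustration (not code from the study) of computing Cohen's kappa for two hypothetical observers' AO ratings and mapping the result onto the Landis and Koch bands. The rating lists are invented placeholders, and the study's global figures across thirty observers would in practice call for a multi-rater statistic such as Fleiss' kappa.

```python
# Minimal sketch: pairwise Cohen's kappa with Landis & Koch interpretation.
# The AO subgroup labels below are illustrative placeholders, not study data.
from sklearn.metrics import cohen_kappa_score

observer_a = ["A2", "B1", "C2", "A3", "C1", "B2", "A2", "C3"]
observer_b = ["A2", "B2", "C2", "A3", "C2", "B2", "A1", "C3"]

# Cohen's kappa corrects raw percent agreement for chance agreement.
kappa = cohen_kappa_score(observer_a, observer_b)

def landis_koch(k: float) -> str:
    """Map a kappa value to the Landis & Koch (1977) agreement bands."""
    if k < 0.00:
        return "poor"
    if k <= 0.20:
        return "slight"
    if k <= 0.40:
        return "fair"
    if k <= 0.60:
        return "moderate"
    if k <= 0.80:
        return "substantial"
    return "almost perfect"

print(f"kappa = {kappa:.2f} ({landis_koch(kappa)} agreement)")
```

Under these bands, the study's interobserver value of 0.30 falls in the "fair" range and the hand surgeons' intraobserver value of 0.50 in the "moderate" range, matching the interpretations reported in the Results.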
Results
The global interobserver agreement for the AO classification was fair (0.30). All three groups showed fair interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50), while the resident group (0.30) and the orthopedic surgeon group (0.33) obtained fair levels.
Conclusion
The data obtained suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification of wrist fractures.