Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures

Pedro Henrique de Magalhães Tenório, Marcelo Marques Vieira, Abner Alberti, Marcos Felipe Marcatto de Abreu, João Carlos Nakamoto, Alberto Cliquet Júnior
{"title":"Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures","authors":"Pedro Henrique de Magalhães Tenório,&nbsp;Marcelo Marques Vieira,&nbsp;Abner Alberti,&nbsp;Marcos Felipe Marcatto de Abreu,&nbsp;João Carlos Nakamoto,&nbsp;Alberto Cliquet Júnior","doi":"10.1016/j.rboe.2017.08.024","DOIUrl":null,"url":null,"abstract":"<div><h3>Objective</h3><p>This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures.</p></div><div><h3>Methods</h3><p>Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons) classified 52 wrist fractures, using only simple radiographs. After a period of four weeks, the same observers evaluated the initial 52 radiographs, in a randomized order. The agreement among the observers, the groups, and intraobserver was obtained using the Kappa index. Kappa-values were interpreted as proposed by Landis and Koch.</p></div><div><h3>Results</h3><p>The global interobserver agreement level of the AO classification was considered fair (0.30). The three groups presented fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement level was moderated. The hand surgeon group obtained the higher intraobserver agreement level, although only moderate (0.50). The residents group obtained fair levels (0.30), as did the orthopedics surgeon group (0.33).</p></div><div><h3>Conclusion</h3><p>The data obtained suggests fair levels of interobserver agreement and moderate levels of intraobserver agreement for the AO classification for wrist fractures.</p></div>","PeriodicalId":101095,"journal":{"name":"Revista Brasileira de Ortopedia (English Edition)","volume":"53 6","pages":"Pages 703-706"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.rboe.2017.08.024","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Revista Brasileira de Ortopedia (English Edition)","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2255497118301290","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Objective

This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures.

Methods

Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using plain radiographs only. Four weeks later, the same observers re-evaluated the initial 52 radiographs in randomized order. Interobserver agreement (overall and within each group) and intraobserver agreement were calculated using the kappa index; kappa values were interpreted as proposed by Landis and Koch.
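For readers unfamiliar with the statistic, the sketch below (not from the paper) shows how Cohen's kappa is computed for two raters over the same set of cases, kappa = (p_o − p_e)/(1 − p_e), and how a value maps onto the qualitative bands of Landis and Koch. The study's pooled multi-observer figures would require a multi-rater generalization such as Fleiss' kappa; all data and names in the example are hypothetical.

```python
# Minimal sketch: Cohen's kappa for two raters, plus the
# Landis & Koch (1977) interpretation bands used in the study.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa between two raters over the same cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of cases where the raters match.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement, from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

def landis_koch(kappa):
    """Qualitative band for a kappa value, per Landis & Koch."""
    if kappa < 0:
        return "poor"
    bands = [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
             (0.80, "substantial"), (1.00, "almost perfect")]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "almost perfect"

# Hypothetical example: two observers assign AO types to six fractures.
a = ["A2", "A3", "B1", "C1", "C2", "A2"]
b = ["A2", "A3", "B2", "C1", "C3", "A2"]
k = cohens_kappa(a, b)
print(f"kappa = {k:.2f} ({landis_koch(k)})")  # kappa = 0.60 (moderate)
```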

Results

The global interobserver agreement of the AO classification was fair (0.30). All three groups presented fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50); the resident group (0.30) and the orthopedic surgeon group (0.33) obtained fair levels.
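As a check that the reported labels follow the Landis and Koch bands, the snippet below reuses the hypothetical landis_koch helper from the sketch above on the study's published kappa values.

```python
# Reuses landis_koch() from the sketch above; values are from the Results.
for name, k in [("interobserver, global", 0.30), ("residents", 0.27),
                ("orthopedic surgeons", 0.30), ("hand surgeons, intra", 0.50)]:
    print(f"{name}: kappa = {k:.2f} -> {landis_koch(k)}")
# -> fair, fair, fair, moderate
```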

Conclusion

The data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification of wrist fractures.
