Agreement on classification of clinical photographs of pigmentary lesions: exercise after a training course with young dermatologists.

IF 2.3 Q2 DERMATOLOGY
Simone Cazzaniga, Lucia De Ponti, Giorgio Maria Baratelli, Salvatore Francione, Carlo La Vecchia, Anna Di Landro, Andrea Carugno, Marco Di Mercurio, Lerica Germi, Giampaolo Trevisan, Mirko Fenaroli, Claudia Capasso, Michele Pezza, Pietro Dri, Emanuele Castelli, Luigi Naldi
{"title":"对色素性病变的临床照片分类的一致意见:与年轻皮肤科医生培训课程后的锻炼。","authors":"Simone Cazzaniga,&nbsp;Lucia De Ponti,&nbsp;Giorgio Maria Baratelli,&nbsp;Salvatore Francione,&nbsp;Carlo La Vecchia,&nbsp;Anna Di Landro,&nbsp;Andrea Carugno,&nbsp;Marco Di Mercurio,&nbsp;Lerica Germi,&nbsp;Giampaolo Trevisan,&nbsp;Mirko Fenaroli,&nbsp;Claudia Capasso,&nbsp;Michele Pezza,&nbsp;Pietro Dri,&nbsp;Emanuele Castelli,&nbsp;Luigi Naldi","doi":"10.4081/dr.2022.9500","DOIUrl":null,"url":null,"abstract":"<p><p>Smartphone apps may help promoting the early diagnosis of melanoma. The reliability of specialist judgment on lesions should be assessed. Hereby, we evaluated the agreement of 6 young dermatologists, after a specific training. Clinical judgment was evaluated during 2 online sessions, 1 month apart, on a series of 45 pigmentary lesions. Lesions were classified as highly suspicious, suspicious, non-suspicious or not assessable. Cohen's and Fleiss' kappa were used to calculate intra- and inter-rater agreement. The overall intra-rater agreement was 0.42 (95% confidence interval - CI: 0.33-0.50), varying between 0.12-0.59 on single raters. The inter-rater agreement during the first phase was 0.29 (95% CI: 0.24-0.34). When considering the agreement for each category of judgment, kappa varied from 0.19 for not assessable to 0.48 for highly suspicious lesions. Similar results were obtained in the second exercise. The study showed a less than satisfactory agreement among young dermatologists. Our data point to the need for improving the reliability of the clinical diagnoses of melanoma especially when assessing small lesions and when dealing with thin melanomas at a population level.</p>","PeriodicalId":11049,"journal":{"name":"Dermatology Reports","volume":"15 1","pages":"9500"},"PeriodicalIF":2.3000,"publicationDate":"2023-03-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/eb/80/dr-15-1-9500.PMC10099286.pdf","citationCount":"0","resultStr":"{\"title\":\"Agreement on classification of clinical photographs of pigmentary lesions: exercise after a training course with young dermatologists.\",\"authors\":\"Simone Cazzaniga,&nbsp;Lucia De Ponti,&nbsp;Giorgio Maria Baratelli,&nbsp;Salvatore Francione,&nbsp;Carlo La Vecchia,&nbsp;Anna Di Landro,&nbsp;Andrea Carugno,&nbsp;Marco Di Mercurio,&nbsp;Lerica Germi,&nbsp;Giampaolo Trevisan,&nbsp;Mirko Fenaroli,&nbsp;Claudia Capasso,&nbsp;Michele Pezza,&nbsp;Pietro Dri,&nbsp;Emanuele Castelli,&nbsp;Luigi Naldi\",\"doi\":\"10.4081/dr.2022.9500\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Smartphone apps may help promoting the early diagnosis of melanoma. The reliability of specialist judgment on lesions should be assessed. Hereby, we evaluated the agreement of 6 young dermatologists, after a specific training. Clinical judgment was evaluated during 2 online sessions, 1 month apart, on a series of 45 pigmentary lesions. Lesions were classified as highly suspicious, suspicious, non-suspicious or not assessable. Cohen's and Fleiss' kappa were used to calculate intra- and inter-rater agreement. The overall intra-rater agreement was 0.42 (95% confidence interval - CI: 0.33-0.50), varying between 0.12-0.59 on single raters. The inter-rater agreement during the first phase was 0.29 (95% CI: 0.24-0.34). When considering the agreement for each category of judgment, kappa varied from 0.19 for not assessable to 0.48 for highly suspicious lesions. Similar results were obtained in the second exercise. 
The study showed a less than satisfactory agreement among young dermatologists. Our data point to the need for improving the reliability of the clinical diagnoses of melanoma especially when assessing small lesions and when dealing with thin melanomas at a population level.</p>\",\"PeriodicalId\":11049,\"journal\":{\"name\":\"Dermatology Reports\",\"volume\":\"15 1\",\"pages\":\"9500\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2023-03-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/eb/80/dr-15-1-9500.PMC10099286.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Dermatology Reports\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4081/dr.2022.9500\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"DERMATOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Dermatology Reports","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4081/dr.2022.9500","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"DERMATOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Smartphone apps may help promote the early diagnosis of melanoma. The reliability of specialist judgment on lesions should therefore be assessed. Here, we evaluated the agreement among 6 young dermatologists after a specific training course. Clinical judgment was evaluated during 2 online sessions, held 1 month apart, on a series of 45 pigmentary lesions. Lesions were classified as highly suspicious, suspicious, non-suspicious, or not assessable. Cohen's and Fleiss' kappa were used to calculate intra- and inter-rater agreement. The overall intra-rater agreement was 0.42 (95% confidence interval, CI: 0.33-0.50), varying from 0.12 to 0.59 across single raters. The inter-rater agreement during the first phase was 0.29 (95% CI: 0.24-0.34). When considering the agreement for each category of judgment, kappa varied from 0.19 for not assessable to 0.48 for highly suspicious lesions. Similar results were obtained in the second exercise. The study showed a less than satisfactory agreement among young dermatologists. Our data point to the need to improve the reliability of clinical diagnoses of melanoma, especially when assessing small lesions and when dealing with thin melanomas at a population level.
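The abstract names the statistics but not their computation, so the following is a minimal, hypothetical sketch of how intra-rater agreement (Cohen's kappa between a rater's two sessions) and inter-rater agreement (Fleiss' kappa across raters within one session) could be computed for such 4-category ratings in Python. The random ratings, the category coding, and the session variable names are illustrative assumptions, not the study's data or code.

```python
# Minimal sketch with made-up ratings (not the study's data): intra- and
# inter-rater agreement using Cohen's and Fleiss' kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Category codes: 0 = highly suspicious, 1 = suspicious,
#                 2 = non-suspicious, 3 = not assessable
rng = np.random.default_rng(seed=42)
n_lesions, n_raters = 45, 6

# Placeholder ratings for the two sessions, shape (lesions, raters)
session1 = rng.integers(0, 4, size=(n_lesions, n_raters))
session2 = rng.integers(0, 4, size=(n_lesions, n_raters))

# Intra-rater agreement: Cohen's kappa between each rater's own two sessions
for r in range(n_raters):
    kappa = cohen_kappa_score(session1[:, r], session2[:, r])
    print(f"rater {r + 1}: intra-rater kappa = {kappa:.2f}")

# Inter-rater agreement within one session: Fleiss' kappa across all raters
counts, _ = aggregate_raters(session1)  # lesion-by-category count table
print(f"session 1 inter-rater Fleiss' kappa = {fleiss_kappa(counts):.2f}")
```

In practice, the point estimates would be reported with 95% confidence intervals, as in the abstract; those could be obtained, for example, by bootstrapping over lesions.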

Source journal: Dermatology Reports (DERMATOLOGY)
CiteScore: 1.40
Self-citation rate: 0.00%
Articles published: 74
Review time: 10 weeks