Original Investigation

Validity and Reliability of Dermoscopic Criteria Used to Differentiate Nevi From Melanoma: A Web-Based International Dermoscopy Society Study

Cristina Carrera, MD, PhD1,2; Michael A. Marchetti, MD1; Stephen W. Dusza, DrPH1; Giuseppe Argenziano, MD3; Ralph P. Braun, MD4; Allan C. Halpern, MD1; Natalia Jaimes, MD5; Harald J. Kittler, MD6; Josep Malvehy, MD2; Scott W. Menzies, MBBS, PhD7,8; Giovanni Pellacani, MD9; Susana Puig, MD2; Harold S. Rabinovitz, MD10; Alon Scope, MD11; H. Peter Soyer, MD12,13; Wilhelm Stolz, MD14; Rainer Hofmann-Wellenhof, MD15; Iris Zalaudek, MD15; Ashfaq A. Marghoob, MD1
Author Affiliations
1Dermatology Service, Department of Medicine, Memorial Sloan Kettering Cancer Center, New York, New York
2Melanoma Unit, Department of Dermatology, Hospital Clinic Barcelona, Institut d’Investigacions Biomèdiques August Pi i Sunyer, University of Barcelona, Centro de Investigacion Biomedica en red de enfermedades raras, Barcelona, Spain
3Dermatology Unit, Second University of Naples, Naples, Italy
4Department of Dermatology, University Hospital Zürich, Zürich, Switzerland
5Dermatology Service, Aurora Skin Cancer Center, Universidad Pontificia Bolivariana, Medellín, Colombia
6Department of Dermatology, Medical University of Vienna, Vienna, Austria
7Sydney Melanoma Diagnostic Centre, Sydney Cancer Centre, Royal Prince Alfred Hospital, Camperdown, Australia
8Discipline of Dermatology, The University of Sydney, New South Wales, Australia
9Center for Environmental, Genetic, and Nutritional Epidemiology, Department of Diagnostic, Clinical, and Public Health Medicine, University of Modena and Reggio Emilia, Modena, Italy
10Skin and Cancer Associates, Plantation, Florida
11Department of Dermatology, Sheba Medical Center, Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
12Dermatology Research Centre, The University of Queensland, Brisbane, Queensland, Australia
13School of Medicine, Translational Research Institute, Brisbane, Queensland, Australia
14Clinic for Dermatology, Allergology, and Environmental Medicine, Klinik Thalkirchner Straße Städt, Klinikum München GmbH, Munich, Germany
15Department of Dermatology, Medical University of Graz, Graz, Austria
JAMA Dermatol. 2016;152(7):798-806. doi:10.1001/jamadermatol.2016.0624.

Importance  The comparative diagnostic performance of dermoscopic algorithms and their individual criteria are not well studied.

Objectives  To analyze the discriminatory power and reliability of dermoscopic criteria used in melanoma detection and compare the diagnostic accuracy of existing algorithms.

Design, Setting, and Participants  This was a retrospective, observational study of 477 lesions (119 melanomas [24.9%] and 358 nevi [75.1%]), which were divided into 12 image sets that consisted of 39 or 40 images per set. A link on the International Dermoscopy Society website from January 1, 2011, through December 31, 2011, directed participants to the study website. Data analysis was performed from June 1, 2013, through May 31, 2015. Participants included physicians, residents, and medical students, and there were no specialty-type or experience-level restrictions. Participants were randomly assigned to evaluate 1 of the 12 image sets.

Main Outcomes and Measures  Associations with melanoma and intraclass correlation coefficients (ICCs) were evaluated for the presence of dermoscopic criteria. Diagnostic accuracy measures were estimated for the following algorithms: the ABCD rule, the Menzies method, the 7-point checklist, the 3-point checklist, chaos and clues, and CASH (color, architecture, symmetry, and homogeneity).
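As background to the diagnostic accuracy measures named above, sensitivity and specificity are derived from a 2x2 table of each algorithm's call (positive = "melanoma suspected") against the reference diagnosis. The sketch below is purely illustrative: the counts are hypothetical and are not the study's data.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Proportion of melanomas the algorithm flags as positive."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Proportion of nevi the algorithm correctly calls negative."""
    return tn / (tn + fp)

# Hypothetical counts: 100 melanomas and 300 nevi evaluated
tp, fn = 95, 5      # melanomas called positive / missed
tn, fp = 75, 225    # nevi called negative / over-called

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # -> sensitivity = 95.0%
print(f"specificity = {specificity(tn, fp):.1%}")  # -> specificity = 25.0%
```

A high-sensitivity, low-specificity profile like this hypothetical one means few melanomas are missed at the cost of many benign nevi being flagged.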

Results  A total of 240 participants registered, and 103 (42.9%) evaluated all images. The 110 participants (45.8%) who evaluated fewer than 20 lesions were excluded, resulting in data from 130 participants (54.2%), 121 (93.1%) of whom were regular dermoscopy users. Criteria associated with melanoma included marked architectural disorder (odds ratio [OR], 6.6; 95% CI, 5.6-7.8), pattern asymmetry (OR, 4.9; 95% CI, 4.1-5.8), nonorganized pattern (OR, 3.3; 95% CI, 2.9-3.7), border score of 6 (OR, 3.3; 95% CI, 2.5-4.3), and contour asymmetry (OR, 3.2; 95% CI, 2.7-3.7) (P < .001 for all). Most dermoscopic criteria had poor to fair interobserver agreement. Criteria that reached moderate levels of agreement included comma vessels (ICC, 0.44; 95% CI, 0.40-0.49), absence of vessels (ICC, 0.46; 95% CI, 0.42-0.51), dark brown color (ICC, 0.40; 95% CI, 0.35-0.44), and architectural disorder (ICC, 0.43; 95% CI, 0.39-0.48). The Menzies method had the highest sensitivity for melanoma diagnosis (95.1%) but the lowest specificity (24.8%) compared with any other method (P < .001). The ABCD rule had the highest specificity (59.4%). All methods had similar areas under the receiver operating characteristic curves.
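Intervals of the form "OR, 6.6; 95% CI, 5.6-7.8" reported above are conventionally computed as a Wald confidence interval on the log-odds scale from a 2x2 table. The following sketch shows that standard calculation with hypothetical counts; it is not a reconstruction of the study's analysis, which also adjusted for repeated ratings.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Wald 95% CI for an odds ratio from a 2x2 table:
    a = criterion present & melanoma, b = present & nevus,
    c = absent & melanoma,            d = absent & nevus."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical cell counts for one dermoscopic criterion
or_, lo, hi = odds_ratio_ci(80, 60, 40, 200)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note that the Wald interval requires all four cells to be nonzero; sparse tables typically call for a continuity correction or an exact method instead.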

Conclusions and Relevance  Important dermoscopic criteria for melanoma recognition were revalidated by participants with varied experience. Six algorithms tested had similar but modest levels of diagnostic accuracy, and the interobserver agreement of most individual criteria was poor.

Figures

Figure.
Comparison of the Diagnostic Accuracy of the Dermoscopic Algorithms

Receiver operating characteristic curves for 6 dermoscopic algorithms were evaluated. CASH indicates color, architecture, symmetry, and homogeneity.


