Original Investigation

Learning to Detect, Categorize, and Identify Skin Lesions: A Meta-analysis

Liam Rourke, PhD1; Sarah Oberholtzer, BSc2; Trish Chatterley, MLIS3; Alain Brassard, MD, FRCPC1
Author Affiliations
1Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
2Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Alberta, Canada
3Faculty of Pharmacy and Pharmaceutical Sciences, University of Alberta, Edmonton, Alberta, Canada
JAMA Dermatol. 2015;151(3):293-301. doi:10.1001/jamadermatol.2014.3300.

Importance  Educators use a variety of practices to train laypersons, medical students, residents, and primary care providers to diagnose skin lesions. Researchers have described these methods for decades, but there have been few attempts to catalog their scope or effectiveness.

Objective  To determine the scope and effectiveness of educational practices to improve the detection, categorization, and identification of skin lesions.

Data Sources  Literature indexed in MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, and BIOSIS Previews from inception until April 1, 2014, using terms cognate with skin disease, diagnosis, and education.

Study Selection  Studies in which the educational objective was operationalized as the ability to detect, categorize, or identify skin lesions, and the intervention was evaluated through comparisons of participants’ abilities before and after the intervention.

Data Extraction and Synthesis  Information about trainees, educational practices, educational outcomes, and study quality was extracted; it was synthesized through meta-analysis using a random effects model. Effect sizes were calculated by dividing the differences between preintervention and postintervention means by the pooled standard deviation (ie, standardized mean difference [SMD]). Heterogeneity was assessed using an I2 statistic.

Main Outcomes and Measures  Pooled effect size across all studies and separate effect sizes for each of the educational practices.

Results  Thirty-seven studies reporting 47 outcomes from 7 educational practices met our inclusion criteria. The pooled effect of the practices on participants’ abilities was large, with an SMD of 1.06 (95% CI, 0.81-1.31) indicating that posttest scores were approximately 1 SD above pretest scores. Effect sizes varied categorically between educational practices: the dermatology elective (SMD = 1.64; 95% CI, 1.17-2.11) and multicomponent interventions (SMD = 2.07; 95% CI, 0.71-3.44) had large effects; computer-based learning (SMD = 0.64; 95% CI, 0.36-0.92), lecture (SMD = 0.59; 95% CI, 0.28-0.90), pamphlet (SMD = 0.47; 95% CI, –0.11 to 1.05), and audit and feedback (SMD = 0.58; 95% CI, 0.10-1.07) had moderate effects; and moulage had a small effect (SMD = 0.15; 95% CI, –0.26 to 0.57).

Conclusions and Relevance  A number of approaches are used to improve participants’ abilities to diagnose skin lesions; some are more effective than others. The most effective approaches engage participants in a number of coordinated activities for an extended period, providing learners with the breadth of knowledge and practice required to change the mechanisms underlying performance.


Skin diseases result in billions of dollars in direct medical costs annually and a substantial toll on patients.1 Early detection, correct categorization, and accurate identification are pivotal for the successful treatment of skin diseases. Unfortunately, these skills are underdeveloped in the groups most centrally affected. Individuals at risk for skin cancer have difficulty detecting new or changing lesions, which increases the rates of morbidity and mortality.2 Their primary care providers have difficulty differentiating benign skin lesions from melanomas,3 resulting in unnecessary excisions and referrals, and medical students have difficulty learning to identify even a small number of common lesions.4

Concerned groups petition for more training, but it is not clear that training sharpens these abilities, or if it does, it is not clear what types of training are most effective. Two recent systematic reviews explore aspects of these questions. One found that the presentation of images with text improved laypersons’ knowledge and self-efficacy regarding skin self-examination; however, the effect on participants’ ability to detect lesions was inconclusive.5 The second review focused on primary care physicians and reported improvements in knowledge and confidence; however, of the 7 studies that measured participants’ abilities to categorize or identify lesions, 4 reported no improvement.6

Neither review provides a full examination of the issue. The purpose of the current review is to determine the full scope and effectiveness of educational practices that are used to improve participants’ abilities to detect, categorize, and identify skin lesions. As a review of the literature, with no involvement of human or animal subjects, this study did not require the approval of the University of Alberta’s Research Ethics Board.

Search Strategy

We conducted a meta-analysis of the literature cataloged in MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, and BIOSIS Previews from inception to April 1, 2014. We used terms cognate with the Medical Subject Headings skin disease, diagnosis, and education (see eFigure 1 in the Supplement for the full MEDLINE search strategy). References of retrieved articles were scanned for additional studies. We also examined the articles included in the 2 previous systematic reviews.5,6

Inclusion and Exclusion Criteria

We included studies that met 5 criteria:

  1. The educational objective was to improve participants’ ability to detect a skin lesion, categorize a lesion into a finite set of categories (eg, benign or malignant), or identify a lesion by its name, either through multiple-choice or constructed-response formats.

  2. The educational objective was operationalized as the participants’ ability to detect, categorize, or identify lesions presented with photographs or patients.

  3. The means through which the educational objectives were addressed was described in sufficient detail to allow classification.

  4. The effectiveness of the educational practice was evaluated through comparisons of participants’ abilities before and after the intervention or to the abilities of nonparticipants.

  5. Sufficient information to calculate a standardized mean difference (SMD) was available in the article or from the authors through a direct request.

We excluded studies in which:

  1. The intervention used to improve detection, categorization, or identification was not education but rather a technological method (eg, dermoscopy, computer-aided diagnosis).

  2. The objective of the educational intervention was to improve the treatment of a skin lesion rather than its detection, categorization, or identification.

  3. The objective was to improve the frequency with which participants sought clinical skin examinations, thus improving the rate of detection, categorization, or identification, but not the ability of participants to do so.

Study Selection

Two investigators (L.R. and S.O.) worked independently and in duplicate to screen all titles and abstracts returned by the search query, and subsequently, the full texts of items marked provisionally for inclusion. Disagreements were resolved through discussion.

Data Extraction

From the articles selected for inclusion, 2 reviewers (L.R. and S.O.) independently extracted information about:

  1. The population being trained. Across studies, participants were of 4 types: laypersons, medical students, residents, or primary care providers.

  2. The type of educational practice used to improve participants’ abilities.

  3. The educational objective, which was 1 of 3 types:
    detect, which was to discover the presence of a skin lesion on oneself, a patient, or a standardized patient;
    categorize, which was to assign a skin lesion to a finite set of categories (eg, benign or malignant); and
    identify, which was to provide the correct name of a lesion or select the correct name from a list of alternatives.

  4. The quality of studies, which was evaluated using the Medical Education Research Study Quality Instrument.7 The instrument highlights 6 dimensions of study quality: research design, sampling strategy, type of data collected, soundness of the measurement procedures, appropriateness and complexity of the data analysis procedures, and types of outcomes that are measured. Since its introduction in 2007, the psychometric properties of the Medical Education Research Study Quality Instrument have been investigated in multiple studies, and evidence of its validity and reliability is accruing.

Data Synthesis

Meta-analysis was performed using a random effects model. Effect sizes were calculated as SMD and computations were carried out in Review Manager 5.2 (The Cochrane Collaboration; https://tech.cochrane.org/revman/about-revman-5). When the data required to calculate an SMD were not included in an article, we requested them from the study’s authors; when the information was not forthcoming, we imputed the missing values using formulas recommended by The Cochrane Collaboration.8 Heterogeneity was assessed using an I2 statistic. An aggregate effect size was calculated, as were effect sizes for each of the educational practices. Additional subgroup analyses were planned a priori based on a review of previous meta-analyses of medical education topics. These included subgroups for duration, study design, population, and assessment task.
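The computations described above can be sketched in a few lines of code. This is an illustrative reimplementation, not the Review Manager source: the function names and the DerSimonian-Laird estimator are our assumptions about how a random effects model of this kind is typically computed, and the sample numbers are invented.

```python
import math

def smd(pre_mean, post_mean, pre_sd, post_sd, n_pre, n_post):
    """Standardized mean difference: (posttest - pretest) / pooled SD."""
    pooled_sd = math.sqrt(((n_pre - 1) * pre_sd ** 2 + (n_post - 1) * post_sd ** 2)
                          / (n_pre + n_post - 2))
    return (post_mean - pre_mean) / pooled_sd

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling (assumes >= 2 studies).

    Returns the pooled effect, its 95% CI, and the I^2 statistic (%).
    """
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Cochran Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]                  # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

For example, a study with a pretest mean of 3 and a posttest mean of 8, each with an SD of 2 across 20 participants, yields an SMD of 2.5; an I2 above 50% would flag the substantial heterogeneity reported below.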

Study Characteristics

The initial database queries returned 2758 unique items.5,6 Ultimately, 37 studies met all inclusion criteria (Table 1).4,9-43 A flowchart tracing the selection process is available as a supplement (eFigure 2 in the Supplement).

Table 1. Characteristics of Included Studies by Educational Practice
Methodologic Quality

The methodologic quality of the studies was measured with the Medical Education Research Study Quality Instrument. Conventionally, this instrument offers a possible score of 18; however, one of its dimensions—response rate—was not relevant to the designs included in this review. This reduced the total possible score to 16.5, which was standardized to present a score out of 18 that could be compared with other reports. Among the studies included in our review, scores ranged from 7.09 to 18.00, with a mean (SD) of 11.09 (1.97) (eTable 1 in the Supplement).
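The standardization of quality scores described above is simple proportional rescaling; as a sketch (the function name is ours, not from the instrument):

```python
def rescale_mersqi(raw, max_possible=16.5, conventional_max=18.0):
    """Rescale a raw quality score (out of 16.5, after dropping the
    response-rate dimension) onto the conventional 18-point scale."""
    return raw * conventional_max / max_possible

# A perfect raw score of 16.5 maps back to the conventional maximum of 18.
print(rescale_mersqi(16.5))  # prints 18.0
```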

Populations

Four types of learners participated in training. The frequency (f) with which they are represented in our review is as follows: medical students (f = 12), primary care providers (f = 10), laypersons (f = 9), and residents (internal medicine residents, f = 3; primary care residents, f = 2; and family medicine residents, f = 2).

Scope of Educational Practices

Seven educational practices were used to enhance participants’ skills. The practices, their descriptions, and the frequency with which they appear in the literature are presented in Table 2. Lecture was most frequent, while moulage was the least frequent.

Table 2. Educational Practices Used to Improve Participants’ Abilities to Diagnose Skin Lesions
Effect

The effect of the interventions, pooled across populations and educational practices, was large: SMD = 1.06 (95% CI, 0.81-1.31) (Figure 1). Examined by educational practice and presented in order of magnitude, the effect size for each practice was: multicomponent interventions, SMD = 2.07 (95% CI, 0.71-3.44); dermatology elective, SMD = 1.64 (95% CI, 1.17-2.11); computer-based learning, SMD = 0.64 (95% CI, 0.36-0.92); formal lecture, SMD = 0.59 (95% CI, 0.28-0.90); audit and feedback, SMD = 0.58 (95% CI, 0.10-1.07); pamphlet, SMD = 0.47 (95% CI, –0.11 to 1.05); and moulage, SMD = 0.15 (95% CI, –0.26 to 0.57) (Figure 2).

Figure 1.
Random Effects Meta-analysis of Education to Improve Participants’ Abilities to Detect, Categorize, or Identify Skin Lesions

This forest plot illustrates the effect of education on participants’ abilities. Some reports included separate analysis of (1) different types of trainees (residents or medical students), (2) different educational interventions (instruction with text, with images, or with text and images; computer-based learning [CBL]; or a dermatology elective [DE]), (3) different tasks (categorize or identify lesions), or (4) different types of dermatoses (lesions or eruptions). The results of these analyses appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.

Figure 2.
Subgroup Analysis by Educational Practice

This forest plot represents the random effects meta-analysis of the practices used to improve participants’ abilities to diagnose skin lesions. Some reports included separate analysis of different types of trainees, educational practices, tasks, or dermatoses. These results appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.


Examined by population, the effect sizes for trainee groups were: laypersons, SMD = 1.40 (95% CI, 0.36-2.45); medical students, SMD = 1.31 (95% CI, 0.95-1.67); residents (family medicine, primary care, and internal medicine), SMD = 0.64 (95% CI, 0.72-1.37); and primary care providers, SMD = 0.45 (95% CI, 0.30-0.60) (Figure 3).

Figure 3.
Subgroup Analysis by Trainee Population

This forest plot represents the effect of the educational practices on the 4 groups of trainees. Some reports included separate analysis of (1) different types of trainees (residents or medical students), (2) different educational interventions (instruction with text, with images, or with text and images; computer-based learning [CBL]; or a dermatology elective [DE]), (3) different tasks (categorize or identify lesions), or (4) different types of dermatoses (lesions or eruptions). The results of these analyses appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.


Heterogeneity was large (>50%) in the aggregate analysis and the analyses by educational practice and population. It was not attenuated by subgroup analyses, which included analyses by study design (single-group pre-post, randomized controlled trial, or controlled trial), study quality (low or high), task (detect, categorize, or identify), or response format on pretests and posttests (multiple choice or constructed response).

The purpose of this review was to investigate the scope and effectiveness of the educational practices that are commonly used to improve participants’ abilities to diagnose skin lesions. Five practices were recurrent in the literature: dermatology electives, lectures, computer-based learning, pamphlets, and multicomponent interventions. Dermatology electives and multicomponent interventions had large effects, improving participants’ abilities by 1½ to 2 SDs. Computer-based learning, lectures, and pamphlets had moderate effects, improving participants’ abilities by half a standard deviation.

Two issues are pertinent in the interpretation of these effect sizes. First, in 34 of the 37 studies, the educational intervention was compared with no intervention. Previous reviewers of medical education studies have shown that effect sizes are predictably large under these conditions.44 Second, despite substantial improvement during training, abilities often remain unsatisfactory. For example, in one evaluation of a dermatology elective, medical students identified on average 3 of 25 lesions at pretest and 8 of 25 at posttest.4 This gain yielded a large mean difference, yet after 4 weeks of intensive training, participants remained unable to identify 17 of 25 common lesions.

Setting aside these concerns, it is clear that across and within educational practices, larger effects were associated with approaches that engaged participants in a wider variety of activities for longer durations. Previous meta-analyses have uncovered similar associations when examining continuing medical education,45 Internet-based learning,46 and simulation training.47 Models of expertise in diagnostic image interpretation may account for the association between the variety of educational activities and learning. One model suggests that diagnosis draws on 3 types of knowledge: basic science (eg, physiology, anatomy, and microbiology), clinical (eg, manifestations of disease and epidemiology), and experiential (eg, exemplars and cases).48 In this frame, the dermatology elective—comprising supervised direct patient care, lectures, readings, demonstrations, and various types of rounds—is most likely to equip trainees with each type of knowledge. Although the dermatology elective is an option only for medical students and residents, multicomponent interventions incorporate a variety of educational activities, and they also generate large effects with laypersons and primary care providers.

In addition to the number of components, learning was associated with the duration of training—the median durations for multicomponent interventions and dermatology electives were longer than they were for the other practices. Duration, however, may be an index to the volume of lesions that trainees encounter, and volume may be the variable that is associated with learning. Several theories of expert visual diagnosis stress the importance of diagnosing large numbers of lesions in developing a mental library of lesions, mental prototypes of lesion categories, implicit rules for categorizing lesions, or changes in one’s visual information processing structures and mechanisms.49,50 This account is speculative. Few studies provided the precise number of lesions that participants encountered.

Despite the educational advantages of comprehensive approaches to training, dermatology educators have not focused on them exclusively. Many studies were conducted explicitly to investigate the effectiveness of brief, inexpensive interventions. Computer-based learning (median duration, 45 minutes), lectures (median duration, 45 minutes), and pamphlets (median exposure time, 5 minutes) yielded moderate effects across contexts and types of learners.

There are limitations to these conclusions. To synthesize the literature through meta-analysis, we excluded a large number of studies of educational practices in dermatology. Qualitative studies were excluded altogether, as were quantitative studies designed in a manner that did not produce statistics required for meta-analytic procedures. Furthermore, within this subset of studies, we excluded research on a broad range of outcomes that are studied regularly, including trainee knowledge, confidence, and lesion treatment.

In addition, heterogeneity was large in the main analysis and the analysis by educational practices. This draws attention to variance at 3 levels: effect sizes varied between educational practices, within each of the educational practices, and between participants engaged in a specific intervention. This is consistent with previous meta-analyses of medical education issues whose authors have identified several predictable sources of variation, including learners, content, instructional design, research methods, and outcome measures.45-47 Among the studies in the current review, each of these sources of variance was apparent. The main analysis encompassed 4 types of learners, 7 educational practices, several categories of lesions, 3 research designs, and 37 researcher-designed outcome measures. Subgroup analyses did not eliminate the heterogeneity because several sources of variance operated concurrently. Too few studies were available to support nested subgroup analyses. However, despite the prevalence of clinical, methodologic, and statistical heterogeneity, our estimates of the mean effect sizes for the main analysis and the analysis by educational practices were robust and interpretable.

A pervasive source of variance that should be addressed in subsequent studies is the difficulty of the diagnostic task. Between studies, participants were required variously to detect a lesion, categorize a lesion into 1 of 2 categories (eg, benign or malignant), categorize a lesion into one of several ordinal categories (eg, do nothing, keep an eye on it, show someone else, show physician at next visit, or show physician immediately), identify a lesion using multiple-choice format, or identify a lesion using constructed-response format. Although our subgroup analysis did not establish a systematic relationship, presumably these tasks are increasingly difficult for participants. Another source of variance in task difficulty was the type of lesions included in the tests. Among the studies that presented data on the participants’ abilities to diagnose specific types of lesions, it was apparent that some were more difficult to identify than others, and that for some of these lesions, abilities did not improve substantially through training.

The study quality instrument we used underscores this problem on the dimension labeled validity of evaluation instrument. Of the 3 points that are available for this dimension, the mode across the included studies was zero. Scoring well on this dimension would have meant providing evidence of the soundness of tests that were used to estimate participants’ abilities and thereby the effectiveness of the interventions. Subsequent studies should provide evidence that the tests are reliable measures of participants’ abilities and that the types of lesions, their instances, and quantity are adequately representative. Together, the test’s properties should lead to appropriate decisions about test-takers’ abilities. The American Board of Dermatology provides guidelines for test development, and the literature includes studies that exemplify this process.

Subsequent researchers should also consider framing their investigations in theories or models of visual information processing, perceptual learning, or diagnostic image interpretation. Researchers from diverse fields, including medical education, have been examining these questions for decades, and several conceptual frameworks are available to guide the design of studies and the interpretation of results.48-50 Such frameworks are useful for synthesizing disparate observations and building a body of literature that is unified, generalizable, and progressive.

The early detection and accurate diagnosis of skin lesions have a substantial effect on patient outcomes and health system resources. There are a number of approaches to imparting these skills, and a review of 4 decades of evaluative research suggests that some approaches are more effective than others. The most effective approaches engaged participants in a number of coordinated activities for a substantial period.

Accepted for Publication: August 23, 2014.

Corresponding Author: Liam Rourke, PhD, Department of Medicine, University of Alberta, 5-125 Clinical Sciences Building, 11350-83 Ave, Edmonton, AB T6G 2G3 Canada (lrourke@ualberta.ca).

Published Online: January 7, 2015. doi:10.1001/jamadermatol.2014.3300.

Author Contributions: Dr Rourke had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Rourke, Brassard.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Rourke, Oberholtzer.

Critical revision of the manuscript for important intellectual content: Rourke, Chatterley, Brassard.

Statistical analysis: Rourke.

Administrative, technical, or material support: Rourke, Oberholtzer, Chatterley.

Study supervision: Rourke, Brassard.

Conflict of Interest Disclosures: None reported.

Additional Contributions: Ben Vandermeer, MSc, Alberta Research Centre for Health Evidence, provided assistance with the statistical analysis. He was not compensated for his assistance.

1. Kalia S, Haiducu ML. The burden of skin disease in the United States and Canada. Dermatol Clin. 2012;30(1):5-18, vii.
2. Oliveria SA, Chau D, Christos PJ, Charles CA, Mushlin AI, Halpern AC. Diagnostic accuracy of patients in performing skin self-examination and the impact of photography. Arch Dermatol. 2004;140(1):57-62.
3. Chen SC, Bravata DM, Weil E, Olkin I. A comparison of dermatologists’ and primary care physicians’ accuracy in diagnosing melanoma. Arch Dermatol. 2001;137(12):1627-1634.
4. Aldridge RB, Maxwell SS, Rees JL. Dermatology undergraduate skin cancer training. BMC Med Educ. 2012;12:27.
5. McWhirter JE, Hoffman-Goetz L. Visual images for patient skin self-examination and melanoma detection. J Am Acad Dermatol. 2013;69(1):47-55.
6. Goulart JM, Quigley EA, Dusza S, et al; INFORMED (INternet curriculum FOR Melanoma Early Detection) Group. Skin cancer education for primary care physicians. J Gen Intern Med. 2011;26(9):1027-1035.
7. Reed DA, Beckman TJ, Wright SM, Levine RB, Kern DE, Cook DA. Predictive validity evidence for medical education research study quality instrument scores. J Gen Intern Med. 2008;23(7):903-907.
8. Higgins JP, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. http://handbook.cochrane.org. Accessed June 1, 2014.
9. Bukhari I, AlAkloby O. Evaluation of diagnostic skills of interns electively rotating at the dermatology department of King Fahad Hospital of the University in Alkhobar, Saudi Arabia. Internet J Dermatol. 2006;5(2). http://ispub.com/IJD/5/2/8520. Accessed May 4, 2014.
10. Dolev JC, O’Sullivan P, Berger T. The eDerm online curriculum. J Am Acad Dermatol. 2011;65(6):e165-e171.
11. Enk CD, Gilead L, Smolovich I, Cohen R. Diagnostic performance and retention of acquired skills after dermatology elective. Int J Dermatol. 2003;42(10):812-815.
12. Sherertz EF. Learning dermatology on a dermatology elective. Int J Dermatol. 1990;29(5):345-348.
13. Simon PE, Bergstresser PR, Eaglstein WH. Medical education and the dermatology elective. Int J Dermatol. 1977;16(9):760-763.
14. Whitaker-Worth DL, Susser WS, Grant-Kels JM. Clinical dermatologic education and the diagnostic acumen of medical students and primary care residents. Int J Dermatol. 1998;37(11):855-859.
15. Girgis A, Sanson-Fisher RW, Howe C, Raffan B. A skin cancer training programme. Med Educ. 1995;29(5):364-371.
16. Gerbert B, Bronstone A, Wolff M, et al. Improving primary care residents’ proficiency in the diagnosis of skin cancer. J Gen Intern Med. 1998;13(2):91-97.
17. Jain N, Anderson MJ, Patel P, et al. Melanoma simulation model: promoting opportunistic screening and patient counseling. JAMA Dermatol. 2013;149(6):710-716.
18. Robinson JK, Turrisi R. Skills training to learn discrimination of ABCDE criteria by those at risk of developing melanoma. Arch Dermatol. 2006;142(4):447-452.
19. Harris JM, Salasche SJ, Harris RB. Can Internet-based continuing medical education improve physicians’ skin cancer knowledge and skills? J Gen Intern Med. 2001;16(1):50-56.
20. Harris JM Jr, Salasche SJ, Harris RB. Using the Internet to teach melanoma management guidelines to primary care physicians. J Eval Clin Pract. 1999;5(2):199-211.
21. Jenkins S, Goel R, Morrell DS. Computer-assisted instruction versus traditional lecture for medical student teaching of dermatology morphology: a randomized control trial. J Am Acad Dermatol. 2008;59(2):255-259.
22. Gerbert B, Bronstone A, Maurer T, Berger T, McPhee SJ, Caspers N. The effectiveness of an Internet-based tutorial in improving primary care physicians’ skin cancer triage skills. J Cancer Educ. 2002;17(1):7-11.
23. Borland R, Mee V, Meehan JW. Effects of photographs and written descriptors on melanoma detection. Health Educ Res. 1997;12(3):375-384.
24. Brooks A, Predebon J, van der Zwan R. Perceptual strategies to improve skin cancer discriminations in naive observers. Public Health. 2001;115(2):139-145.
25. Girardi S, Gaudy C, Gouvernet J, Teston J, Richard MA, Grob JJ. Superiority of a cognitive education with photographs over ABCD criteria in the education of the general population to the early detection of melanoma: a randomized study. Int J Cancer. 2006;118(9):2276-2280.
26. Mickler TJ, Rodrique JR, Lescano CM. A comparison of three methods of teaching skin self-examinations. J Clin Psychol Med Settings. 1999;6(3):273-286. doi:10.1023/A:1026291705517.
27. Muslim TA, Naidou S. The effects of an educational intervention on the early management of oral lesions in the uMgungundlovu District in KwaZulu-Natal. South Afr J Epidemiol Infect. 2013;28(1):55-60.
28. Ahiarah A, Fox C, Servoss T. Brief intervention to improve diagnosis and treatment knowledge of skin disorders by family medicine residents. Fam Med. 2007;39(10):720-723.
29. Bedlow AJ, Cliff S, Melia J, Moss SM, Seyan R, Harland CC. Impact of skin cancer education on general practitioners’ diagnostic skills. Clin Exp Dermatol. 2000;25(2):115-118.
30. Bradley HB. Implementation of a skin cancer screening tool in a primary care setting. J Am Acad Nurse Pract. 2012;24(2):82-88.
31. Bränström R, Hedblad MA, Krakau I, Ullén H. Laypersons’ perceptual discrimination of pigmented skin lesions. J Am Acad Dermatol. 2002;46(5):667-673.
32. Carli P, De Giorgi V, Crocetti E, Caldini L, Ressel C, Giannotti B. Diagnostic and referral accuracy of family doctors in melanoma screening. Eur J Cancer Prev. 2005;14(1):51-55.
33. Cliff S, Bedlow AJ, Melia J, Moss S, Harland CC. Impact of skin cancer education on medical students’ diagnostic skills. Clin Exp Dermatol. 2003;28(2):214-217.
34. de Gannes GC, Ip JL, Martinka M, Crawford RI, Rivers JK. Early detection of skin cancer by family physicians: a pilot project. J Cutan Med Surg. 2004;8(2):103-109.
35. Dolan NC, Ng JS, Martin GJ, Robinson JK, Rademaker AW. Effectiveness of a skin cancer control educational intervention for internal medicine housestaff and attending physicians. J Gen Intern Med. 1997;12(9):531-536.
36. Goulart JM, Dusza S, Pillsbury A, Soriano RP, Halpern AC, Marghoob AA. Recognition of melanoma. J Am Acad Dermatol. 2012;67(4):606-611.
37. Grange F, Hédelin G, Halna JM, et al. Assessment of a general practitioner training campaign for early detection of cutaneous melanoma in the Haut-Rhin department of France [in French]. Ann Dermatol Venereol. 2005;132(12, pt 1):956-961.
38. Liebman TN, Goulart JM, Soriano R, et al. Effect of dermoscopy education on the ability of medical students to detect skin cancer. Arch Dermatol. 2012;148(9):1016-1022.
39. Seoane J, Varela-Centelles PI, Diz Dios P, Suárez Quintanilla JM, Aguado A. Experimental intervention study about recognition of erythroplakia by undergraduate dental students. Int Dent J. 1999;49(5):275-278.
40. Raasch BA, Hays R, Buettner PG. An educational intervention to improve diagnosis and management of suspicious skin lesions. J Contin Educ Health Prof. 2000;20(1):39-51.
41. Youl PH, Raasch BA, Janda M, Aitken JF. The effect of an educational programme to improve the skills of general practitioners in diagnosing melanocytic/pigmented lesions. Clin Exp Dermatol. 2007;32(4):365-370.
42. Garg A, Haley HL, Hatem D. Modern moulage: evaluating the use of 3-dimensional prosthetic mimics in a dermatology teaching program for second-year medical students. Arch Dermatol. 2010;146(2):143-146.
43. Mikkilineni R, Weinstock MA, Goldstein MG, Dube CE, Rossi JS. The impact of the basic skin cancer triage curriculum on providers’ skills, confidence, and knowledge in skin cancer control. Prev Med. 2002;34(2):144-152.
44. Cook DA. Randomized controlled trials and meta-analysis in medical education: what role do they play? Med Teach. 2012;34(6):468-473.
45. Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27(1):6-15.
46. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181-1196.
47. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978-988.
PubMed   |  Link to Article
de Bruin  AB, Schmidt  HG, Rikers  RM.  The role of basic science knowledge and clinical knowledge in diagnostic reasoning: a structural equation modeling approach. Acad Med. 2005;80(8):765-773.
PubMed   |  Link to Article
Rees  JL.  Teaching and learning in dermatology: from Gutenberg to Zuckerberg via way of Von Hebra. Acta Derm Venereol. 2013;93(1):13-22.
PubMed
Goldstone  RL, Braithwaite  DW, Byrge  LA. Perceptual learning. In: Seel  NM, ed. Encyclopedia of the Sciences of Learning. New York, NY: Springer; 2012:2580-2583.

Figures

Figure 1.
Random Effects Meta-analysis of Education to Improve Participants’ Abilities to Detect, Categorize, or Identify Skin Lesions

This forest plot illustrates the effect of education on participants’ abilities. Some reports included separate analyses of (1) different types of trainees (residents or medical students), (2) different educational interventions (instruction with text, with images, or with text and images; computer-based learning [CBL]; or a dermatology elective [DE]), (3) different tasks (categorize or identify lesions), or (4) different types of dermatoses (lesions or eruptions). The results of these analyses appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.

Figure 2.
Subgroup Analysis by Educational Practice

This forest plot represents the random effects meta-analysis of the practices used to improve participants’ abilities to diagnose skin lesions. Some reports included separate analyses of different types of trainees, educational practices, tasks, or dermatoses. These results appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.

Figure 3.
Subgroup Analysis by Trainee Population

This forest plot represents the effect of the educational practices on the 4 groups of trainees. Some reports included separate analyses of (1) different types of trainees (residents or medical students), (2) different educational interventions (instruction with text, with images, or with text and images; computer-based learning [CBL]; or a dermatology elective [DE]), (3) different tasks (categorize or identify lesions), or (4) different types of dermatoses (lesions or eruptions). The results of these analyses appear separately in the forest plot, and their difference is labeled. SMD indicates standardized mean difference.
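The captions above, together with the Data Extraction and Synthesis description, define the quantities plotted: a standardized mean difference (postintervention minus preintervention mean, divided by the pooled standard deviation) combined under a random effects model, with heterogeneity summarized by I². As an illustrative sketch only (not the authors' code), these quantities are commonly computed as follows; the DerSimonian-Laird estimator for the between-study variance is an assumption here, since the article does not name the specific estimator:

```python
import math

def smd(pre_mean, post_mean, pre_sd, post_sd, n_pre, n_post):
    """Standardized mean difference: (post - pre) / pooled SD."""
    pooled_sd = math.sqrt(((n_pre - 1) * pre_sd**2 + (n_post - 1) * post_sd**2)
                          / (n_pre + n_post - 2))
    return (post_mean - pre_mean) / pooled_sd

def random_effects_pool(effects, variances):
    """Pool study effects under a random effects model (DerSimonian-Laird).

    Returns the pooled estimate and the I^2 heterogeneity statistic (%).
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and I^2: proportion of variability beyond chance
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Between-study variance tau^2 (DerSimonian-Laird moment estimator)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    # Random-effects weights incorporate tau^2
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, i2
```

For example, a study reporting means of 10 before and 15 after training, with a pooled SD of 5, yields an SMD of 1.0; the more the per-study SMDs disagree beyond sampling error, the larger I² becomes and the more the pooled weights are flattened toward equality.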

Tables

Table 1. Characteristics of Included Studies by Educational Practice
Table 2. Educational Practices Used to Improve Participants’ Abilities to Diagnose Skin Lesions

References

Kalia S, Haiducu ML. The burden of skin disease in the United States and Canada. Dermatol Clin. 2012;30(1):5-18, vii.
Oliveria SA, Chau D, Christos PJ, Charles CA, Mushlin AI, Halpern AC. Diagnostic accuracy of patients in performing skin self-examination and the impact of photography. Arch Dermatol. 2004;140(1):57-62.
Chen SC, Bravata DM, Weil E, Olkin I. A comparison of dermatologists’ and primary care physicians’ accuracy in diagnosing melanoma. Arch Dermatol. 2001;137(12):1627-1634.
Aldridge RB, Maxwell SS, Rees JL. Dermatology undergraduate skin cancer training. BMC Med Educ. 2012;12:27.
McWhirter JE, Hoffman-Goetz L. Visual images for patient skin self-examination and melanoma detection. J Am Acad Dermatol. 2013;69(1):47-55.
Goulart JM, Quigley EA, Dusza S, et al; INFORMED (INternet curriculum FOR Melanoma Early Detection) Group. Skin cancer education for primary care physicians. J Gen Intern Med. 2011;26(9):1027-1035.
Reed DA, Beckman TJ, Wright SM, Levine RB, Kern DE, Cook DA. Predictive validity evidence for medical education research study quality instrument scores. J Gen Intern Med. 2008;23(7):903-907.
Higgins JP, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. http://handbook.cochrane.org. Accessed June 1, 2014.
Bukhari I, AlAkloby O. Evaluation of diagnostic skills of interns electively rotating at the dermatology department of King Fahad Hospital of the University in Alkhobar, Saudi Arabia. Internet J Dermatol. 2006;5(2). http://ispub.com/IJD/5/2/8520. Accessed May 4, 2014.
Dolev JC, O’Sullivan P, Berger T. The eDerm online curriculum. J Am Acad Dermatol. 2011;65(6):e165-e171.
Enk CD, Gilead L, Smolovich I, Cohen R. Diagnostic performance and retention of acquired skills after dermatology elective. Int J Dermatol. 2003;42(10):812-815.
Sherertz EF. Learning dermatology on a dermatology elective. Int J Dermatol. 1990;29(5):345-348.
Simon PE, Bergstresser PR, Eaglstein WH. Medical education and the dermatology elective. Int J Dermatol. 1977;16(9):760-763.
Whitaker-Worth DL, Susser WS, Grant-Kels JM. Clinical dermatologic education and the diagnostic acumen of medical students and primary care residents. Int J Dermatol. 1998;37(11):855-859.
Girgis A, Sanson-Fisher RW, Howe C, Raffan B. A skin cancer training programme. Med Educ. 1995;29(5):364-371.
Gerbert B, Bronstone A, Wolff M, et al. Improving primary care residents’ proficiency in the diagnosis of skin cancer. J Gen Intern Med. 1998;13(2):91-97.
Jain N, Anderson MJ, Patel P, et al. Melanoma simulation model: promoting opportunistic screening and patient counseling. JAMA Dermatol. 2013;149(6):710-716.
Robinson JK, Turrisi R. Skills training to learn discrimination of ABCDE criteria by those at risk of developing melanoma. Arch Dermatol. 2006;142(4):447-452.
Harris JM, Salasche SJ, Harris RB. Can Internet-based continuing medical education improve physicians’ skin cancer knowledge and skills? J Gen Intern Med. 2001;16(1):50-56.
Harris JM Jr, Salasche SJ, Harris RB. Using the Internet to teach melanoma management guidelines to primary care physicians. J Eval Clin Pract. 1999;5(2):199-211.
Jenkins S, Goel R, Morrell DS. Computer-assisted instruction versus traditional lecture for medical student teaching of dermatology morphology: a randomized control trial. J Am Acad Dermatol. 2008;59(2):255-259.
Gerbert B, Bronstone A, Maurer T, Berger T, McPhee SJ, Caspers N. The effectiveness of an Internet-based tutorial in improving primary care physicians’ skin cancer triage skills. J Cancer Educ. 2002;17(1):7-11.
Borland R, Mee V, Meehan JW. Effects of photographs and written descriptors on melanoma detection. Health Educ Res. 1997;12(3):375-384.
Brooks A, Predebon J, van der Zwan R. Perceptual strategies to improve skin cancer discriminations in naive observers. Public Health. 2001;115(2):139-145.
Girardi S, Gaudy C, Gouvernet J, Teston J, Richard MA, Grob JJ. Superiority of a cognitive education with photographs over ABCD criteria in the education of the general population to the early detection of melanoma: a randomized study. Int J Cancer. 2006;118(9):2276-2280.
Mickler TJ, Rodrique JR, Lescano CM. A comparison of three methods of teaching skin self-examinations. J Clin Psychol Med Settings. 1999;6(3):273-286. doi:10.1023/A:1026291705517.
Muslim TA, Naidou S. The effects of an educational intervention on the early management of oral lesions in the uMgungundlovu District in KwaZulu-Natal. South Afr J Epidemiol Infect. 2013;28(1):55-60.
Ahiarah A, Fox C, Servoss T. Brief intervention to improve diagnosis and treatment knowledge of skin disorders by family medicine residents. Fam Med. 2007;39(10):720-723.
Bedlow AJ, Cliff S, Melia J, Moss SM, Seyan R, Harland CC. Impact of skin cancer education on general practitioners’ diagnostic skills. Clin Exp Dermatol. 2000;25(2):115-118.
Bradley HB. Implementation of a skin cancer screening tool in a primary care setting. J Am Acad Nurse Pract. 2012;24(2):82-88.
Bränström R, Hedblad MA, Krakau I, Ullén H. Laypersons’ perceptual discrimination of pigmented skin lesions. J Am Acad Dermatol. 2002;46(5):667-673.
Carli P, De Giorgi V, Crocetti E, Caldini L, Ressel C, Giannotti B. Diagnostic and referral accuracy of family doctors in melanoma screening. Eur J Cancer Prev. 2005;14(1):51-55.
Cliff S, Bedlow AJ, Melia J, Moss S, Harland CC. Impact of skin cancer education on medical students’ diagnostic skills. Clin Exp Dermatol. 2003;28(2):214-217.
de Gannes GC, Ip JL, Martinka M, Crawford RI, Rivers JK. Early detection of skin cancer by family physicians: a pilot project. J Cutan Med Surg. 2004;8(2):103-109.
Dolan NC, Ng JS, Martin GJ, Robinson JK, Rademaker AW. Effectiveness of a skin cancer control educational intervention for internal medicine housestaff and attending physicians. J Gen Intern Med. 1997;12(9):531-536.
Goulart JM, Dusza S, Pillsbury A, Soriano RP, Halpern AC, Marghoob AA. Recognition of melanoma. J Am Acad Dermatol. 2012;67(4):606-611.
Grange F, Hédelin G, Halna JM, et al. Assessment of a general practitioner training campaign for early detection of cutaneous melanoma in the Haut-Rhin department of France [in French]. Ann Dermatol Venereol. 2005;132(12, pt 1):956-961.
Liebman TN, Goulart JM, Soriano R, et al. Effect of dermoscopy education on the ability of medical students to detect skin cancer. Arch Dermatol. 2012;148(9):1016-1022.
Seoane J, Varela-Centelles PI, Diz Dios P, Suárez Quintanilla JM, Aguado A. Experimental intervention study about recognition of erythroplakia by undergraduate dental students. Int Dent J. 1999;49(5):275-278.
Raasch BA, Hays R, Buettner PG. An educational intervention to improve diagnosis and management of suspicious skin lesions. J Contin Educ Health Prof. 2000;20(1):39-51.
Youl PH, Raasch BA, Janda M, Aitken JF. The effect of an educational programme to improve the skills of general practitioners in diagnosing melanocytic/pigmented lesions. Clin Exp Dermatol. 2007;32(4):365-370.
Garg A, Haley HL, Hatem D. Modern moulage: evaluating the use of 3-dimensional prosthetic mimics in a dermatology teaching program for second-year medical students. Arch Dermatol. 2010;146(2):143-146.
Mikkilineni R, Weinstock MA, Goldstein MG, Dube CE, Rossi JS. The impact of the basic skin cancer triage curriculum on providers’ skills, confidence, and knowledge in skin cancer control. Prev Med. 2002;34(2):144-152.
Cook DA. Randomized controlled trials and meta-analysis in medical education: what role do they play? Med Teach. 2012;34(6):468-473.
Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27(1):6-15.
Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181-1196.
Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978-988.
de Bruin AB, Schmidt HG, Rikers RM. The role of basic science knowledge and clinical knowledge in diagnostic reasoning: a structural equation modeling approach. Acad Med. 2005;80(8):765-773.
Rees JL. Teaching and learning in dermatology: from Gutenberg to Zuckerberg via way of Von Hebra. Acta Derm Venereol. 2013;93(1):13-22.
Goldstone RL, Braithwaite DW, Byrge LA. Perceptual learning. In: Seel NM, ed. Encyclopedia of the Sciences of Learning. New York, NY: Springer; 2012:2580-2583.

Multimedia

Supplement.

eFigure 1. Full MEDLINE Query

eFigure 2. PRISMA Flow

eTable 1. Summary of the Methodological Quality of the Studies
