Research Letter

Standardized Patient–Based Assessment of Dermatology Resident Communication and Interpersonal Skills

Stephanie Wang, BS1; Lynda Shadrake, BS2; Milena J. Lyon, MD1; Hajwa Kim, MS, MA3; Rachel Yudkowsky, MD, MHPE2; Claudia Hernandez, MD1
Author Affiliations
1Department of Dermatology, University of Illinois at Chicago
2Department of Medical Education, University of Illinois at Chicago Graham Clinical Performance Center
3University of Illinois at Chicago Center for Clinical and Translational Science
JAMA Dermatol. 2015;151(3):340-342. doi:10.1001/jamadermatol.2014.3646.

Effective physician-patient communication is essential for the delivery of quality dermatologic care. The Accreditation Council for Graduate Medical Education recognizes the importance of physician communication and interpersonal skills (CIS), identifying proficiency in these skills as a core competency in the Program Requirements for Graduate Medical Education in Dermatology.1 We developed and piloted a 6-station objective structured clinical examination (OSCE) using standardized patient (SP)–based assessments to assess CIS in dermatology residency programs.

This study was approved by the University of Illinois at Chicago (UIC) Institutional Review Board, which granted a waiver of informed consent; therefore, participants were not asked to provide consent.

Six dermatology CIS-OSCE scenarios (Table 1) were created by modifying previously published OSCEs assessing other specialties.2,3 Our CIS-OSCE was piloted with 12 UIC dermatology residents (4 postgraduate year [PGY] 2, 3 PGY-3, and 5 PGY-4). Standardized patients were trained to portray the scenarios and rate the residents’ ability to maintain a patient-centered approach across different communication tasks using the published and validated Revised UIC Communication and Interpersonal Skills (RUCIS) Scale, a 13-item instrument rated on a 4-point behaviorally anchored scale4 (where 1 indicates unacceptable; 2, minimally acceptable; 3, solid; and 4, exceptional). The scale was used as a formative assessment with no pass-fail score or predetermined proficiency level. Each station consisted of a 10-minute SP encounter, after which SPs assessed residents using the RUCIS Scale. The residents then received 10 minutes of SP feedback that focused on having them reflect on whether their behavior had been effective or ineffective.

Table 1. Dermatology Objective Structured Clinical Examination Scenarios for 6 Communication and Interpersonal Skills Tasks

Individual resident scores were calculated as the mean of the RUCIS Scale item scores for each resident across all cases. Case scores were calculated as the mean of the RUCIS Scale item scores across all residents for each case, and the overall case score was the mean of the case scores across all cases. Internal consistency reliability was measured with coefficient α, and a generalizability coefficient was calculated across cases. Statistical analyses were performed using SAS, version 9.2 (SAS Institute Inc).
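The published analysis was run in SAS; as a rough illustration only, the sketch below shows how the per-resident means, coefficient α, and a generalizability coefficient across cases could be computed from a residents × cases score matrix in Python. All data and names in the sketch are hypothetical, and treating coefficient α over the case-score matrix as the relative G coefficient assumes a fully crossed, single-facet (residents × cases) design.

import numpy as np

def coefficient_alpha(matrix: np.ndarray) -> float:
    """Cronbach's alpha for a residents x columns score matrix.

    alpha = k/(k-1) * (1 - sum of column variances / variance of row totals).
    For a fully crossed residents x cases design, this value computed over the
    case-score matrix can also be read as the relative G coefficient.
    """
    k = matrix.shape[1]
    column_vars = matrix.var(axis=0, ddof=1)     # variance of each case (or item)
    total_var = matrix.sum(axis=1).var(ddof=1)   # variance of each resident's total
    return k / (k - 1) * (1 - column_vars.sum() / total_var)

# Hypothetical data: 12 residents x 6 cases, each cell the mean RUCIS item
# score (1-4 scale) assigned by the SP to that resident on that case.
rng = np.random.default_rng(0)
case_scores = np.clip(rng.normal(2.9, 0.3, size=(12, 6)), 1.0, 4.0)

resident_scores = case_scores.mean(axis=1)    # mean across cases, one per resident
per_case_scores = case_scores.mean(axis=0)    # mean across residents, one per case
overall_case_score = per_case_scores.mean()   # mean of the case scores

print(f"Resident scores: {resident_scores.min():.1f}-{resident_scores.max():.1f}")
print(f"Overall case score: {overall_case_score:.1f}")
print(f"Coefficient alpha / G across cases: {coefficient_alpha(case_scores):.2f}")

This sketch only illustrates the arithmetic; the values reported in the study (resident scores of 2.6-3.2 and G = 0.87) come from the actual SP ratings, not simulated data.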

Results by station are shown in Table 2. Individual resident scores ranged from 2.6 to 3.2. Generalizability across cases was G = 0.87 using case scores. Regarding the SP feedback sessions, 10 of the 12 residents (83%) agreed or strongly agreed that the feedback was beneficial in providing insight into a patient’s interpretation and experience of the clinical encounter.

Objective structured clinical examination assessments using SPs offer several advantages when evaluating residents: standardization, objectivity, reproducibility, and direct comparison of skills across individuals.2 We found that implementation of our 6-station CIS-OSCE served as a helpful adjunctive method to test a resident’s CIS performance in common dermatology scenarios.

Limitations of our study included testing only 12 residents from 1 dermatology residency program and an uneven distribution of PGY-2, PGY-3, and PGY-4 residents, both of which could have affected performance results. The current study assessed residents using the established RUCIS Scale.4 With the transition of all dermatology training programs to the Dermatology Milestones Project educational framework (a joint effort of the Accreditation Council for Graduate Medical Education and the American Board of Dermatology), we anticipate the need for further research to determine specific Dermatology Milestones Project–based targets for resident CIS performance when using SPs.5 Targets for proficiency can be adjusted based on individual program directors’ preferences and available resources for remediation.

Standardized case scenarios provide faculty with a chance to observe a resident’s CIS that may be otherwise difficult to evaluate. The ability to directly compare resident performance allows programs to identify weaknesses and targets for curricular improvement. In addition, CIS cases portraying challenging communication situations give residents the opportunity to practice dealing with such scenarios in a safe and nonjudgmental environment. Furthermore, SPs can provide valuable feedback from a perspective not typically available to residents. Our findings suggest that SP-driven dermatology CIS-OSCEs have the potential to serve as a useful learning and assessment tool to enhance dermatology resident education.

Accepted for Publication: September 5, 2014.

Corresponding Author: Stephanie Wang, BS, Department of Dermatology, University of Illinois at Chicago, 808 S Wood St, Room 376 CME, Chicago, IL 60612 (swang36@uic.edu).

Published Online: November 12, 2014. doi:10.1001/jamadermatol.2014.3646.

Author Contributions: Ms Wang and Dr Hernandez had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Wang, Yudkowsky, Hernandez.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Wang, Lyon, Hernandez.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Kim, Hernandez.

Obtained funding: Hernandez.

Administrative, technical, or material support: Wang, Shadrake, Lyon, Yudkowsky.

Study supervision: Yudkowsky, Hernandez.

Conflict of Interest Disclosures: None reported.

Funding/Support: The project described was supported in part by the University of Illinois at Chicago Council for Excellence in Teaching and Learning, Curriculum and Instruction Grant and by the National Center for Advancing Translational Sciences, National Institutes of Health, through grant UL1TR000050 (Ms Kim).

Role of the Funder/Sponsor: The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Disclaimer: The views expressed herein are those of the authors and do not necessarily represent the official views of the National Institutes of Health.

References

1. ACGME Program Requirements for Graduate Medical Education in Dermatology. https://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/080_dermatology_07012014_u06152014.pdf. Accessed January 31, 2014.
2. Yudkowsky R, Alseidi A, Cintron J. Beyond fulfilling the core competencies: an objective structured clinical examination to assess communication and interpersonal skills in a surgical residency. Curr Surg. 2004;61(5):499-503.
3. Yudkowsky R, Downing SM, Sandlow LJ. Developing an institution-based assessment of resident communication and interpersonal skills. Acad Med. 2006;81(12):1115-1122.
4. Iramaneerat C, Myford CM, Yudkowsky R, Lowenstein T. Evaluating the effectiveness of rating instruments for a communication skills assessment of medical residents. Adv Health Sci Educ Theory Pract. 2009;14(4):575-594.
5. The Dermatology Milestones Project. A Joint Initiative of the Accreditation Council for Graduate Medical Education and the American Board of Dermatology; June 2014. http://acgme.org/acgmeweb/Portals/0/PDFs/Milestones/DermatologyMilestones.pdf. Accessed August 5, 2014.
