Original Investigation

Teaching the Simple Suture to Medical Students for Long-term Retention of Skill

Ethan Routt, BA1,2; Yasaman Mansouri, MD1; Ellen H. de Moll, BA1,3; Daniel M. Bernstein, MD1; Sebastian G. Bernardo, MD1; Jacob Levitt, MD1
Author Affiliations
1Department of Dermatology, Icahn School of Medicine at Mount Sinai, New York, New York
2Currently a medical student at The University of Hawaii at Manoa
3Currently a medical student at the University of Connecticut, Storrs
JAMA Dermatol. 2015;151(7):761-765. doi:10.1001/jamadermatol.2015.118.

Importance  Instructional methods for the simple suture technique vary widely and are seldom based on educational research. Published data indicate that video primers and structured instruction and evaluation decrease learning time and improve skill acquisition.

Objectives  To determine the amount of practice needed to attain simple suture proficiency and to identify the optimal teaching schedule for retention of skill.

Design, Setting, and Participants  First-year and second-year medical students at the Icahn School of Medicine at Mount Sinai with little to no suturing experience were randomly divided into 2 equal groups, with one being taught on day 1 and tested for proficiency on day 30 (control group) and the other being taught on day 1 and tested for proficiency on days 10, 20, and 30 (experimental group). Students were evaluated using the objective structured assessment of technical skills method and a checklist. Those initially not proficient on a given day were immediately prompted to practice and retest. This cycle continued until proficiency was achieved for that day. The study was conducted from April 7, 2014, to June 30, 2014.

Main Outcomes and Measures  Simple suture proficiency at 30 days and the mean number of practice sutures needed for proficiency on day 1.

Results  All students ultimately achieved proficiency. The mean (SD) number of practice sutures required to achieve proficiency at the initial training was 41 (15). Students in the control group had a 0% pass rate at the 30-day initial proficiency test, while students in the experimental group had a 91.7% pass rate at day 30 (P < .001). There were no differences in instructional time, cumulative number of sutures, or objective structured assessment of technical skills scores at proficiency between groups across the study.

Conclusions and Relevance  Single instructional sessions may not be sufficient to maintain simple suture proficiency over the course of a 30-day elective. We propose the use of preparatory instructional videos, followed by instructor demonstration to introduce the technique. Independent practice with intermittent evaluation and critique allows for skill acquisition and time efficiency at the initial training. Students should view instructional videos and practice at least 10 repetitions every 10 days to maintain their skill.

Medical students are generally taught how to suture before beginning the clinical portion of their education. However, the timing and process through which they are taught vary greatly across institutions, and the time from instruction to suturing in the clinic may be many months, with questionable skill retention. Teaching methods include large group instruction during preclinical training, small group instruction before specific clinical rotations, and individual instruction from physicians of various specialties. While the basic mechanics of placing a simple suture are universal, techniques vary widely among different instructors.

Most publications regarding evaluation of suturing technique come from the surgical literature and are based on data obtained from surgical residents.1-4 The objective structured assessment of technical skills (OSATS) is a validated checklist used for evaluation of competence in procedural skills that can be adapted for specific tasks, including wound closure. The OSATS has been used successfully to evaluate medical students performing laceration repair in a simulated environment.5

Supplemental video instruction of surgical technique has been demonstrated as superior to classroom instruction alone for procedural skills in surgery and dentistry.6-8 To this end, we created videos of how to perform a simple suture, including common mistakes and their negative consequences (https://sites.google.com/site/dermatologyeducation/residency/certification).

The aim of this study was 2-fold: (1) to identify the mean number of repetitions necessary to perform the simple suture competently; and (2) to demonstrate that one-time teaching sessions fail, whereas spaced practice of the simple suture yields retention of skill.

The study was conducted from April 7, 2014, to June 30, 2014. First-year and second-year medical students at the Icahn School of Medicine at Mount Sinai were recruited via classwide e-mail and were randomly divided into equal experimental and control groups. Students with prior suturing experience were excluded. Each student watched 2 instructional videos on the simple suture technique within 24 hours before their small group lesson (http://www.youtube.com/watch?v=c8-uU4gIMhQ&list=UUqDYUtn6USEApGMAexP_EQg and http://www.youtube.com/watch?v=gkGkU6SlxLA&list=UUqDYUtn6USEApGMAexP_EQg). The students were asked to review a written outline of the suturing procedure (supplementing the videos) (Table 1) and an OSATS scoring sheet2 (Table 2) so that they were familiar with how they would be evaluated.

Table 1. Simple Suture Scoring Rubric
Table 2. Global Rating Scoring Sheet

Our study was granted an exemption by the Mount Sinai Hospital institutional review board because it was determined to be similar to normal educational activities. Students heard a brief description of the risks and benefits of the study and gave oral consent to participate.

One student from the control group and one from the experimental group were assigned as a pair to 1 of 5 instructors (E.R., Y.M., E.H.M., D.M.B., and J.L.). Each instructor was certified in the suture technique by one seasoned dermatologist (J.L.) to standardize didactic messaging, as summarized in Table 1. Within each pair, students were shown the simple suture technique and were taught how to perform it properly using a disposable suturing tray (No. 747; Busse) and 3-0 polyglactin suture (VICRYL Plus; Ethicon) with a short 26-mm half-circle taper needle (Ethicon) and a simulated 10-cm wound on a manikin (IL Duomo; SimSkin).

Once clear on the proper technique, students practiced 10 times and were then evaluated by the instructor on the scoring rubric (Table 1) and the OSATS scoring sheet (Table 2). If the student successfully completed a simple suture 3 times in a row in less than 30 seconds per suture without mistakes, he or she was classified as proficient. If the student was not proficient, he or she was given feedback, was asked to perform 10 more repetitions, and was retested for proficiency. This cycle was repeated until the student was deemed proficient. If a student had already completed 40 repetitions, he or she was allowed to test at the instructor’s discretion, with all repetitions being counted.

Within 24 hours after the first instructional session, students watched both videos again to reinforce what they had learned. They were asked not to watch the videos at any time after the second required viewing and were told not to prepare in any way for the follow-up visits.

Students in the control group returned at a mean (SD) of 30 (2) days after their initial instructional session. Students in the experimental group returned at a mean (SD) of 10 (2), 20 (2), and 30 (2) days after their initial instructional session. Students were evaluated at these follow-up visits on the same scoring rubric and scoring sheet used at the first visit; however, they were allowed 1 failed suture in their attempt to achieve proficiency at the follow-up visit but still needed to achieve 3 consecutive proficient sutures. If a student failed to achieve proficiency, the instructor gave feedback on where the mistakes occurred, the student practiced half the number of attempts that had been necessary to achieve proficiency at the initial training, and the student was then retested. Each subsequent failure required another practice block of the same size and a retest until proficiency was achieved. The OSATS scores were recorded only once a student achieved proficiency. Each student was compensated $50 and was given his or her suturing instruments at the end of the study.
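The follow-up remediation cycle can be sketched the same way, again with hypothetical stand-ins (`retest`, `practice`, `give_feedback`) for the study's activities; each failed retest triggers feedback and a practice block of half the student's day 1 repetition count:

```python
def recertify(day1_reps, retest, practice, give_feedback):
    """Follow-up visit: retest, and on each failure practice half the
    day 1 repetition count before retesting again.

    retest() models the follow-up evaluation (3 consecutive proficient
    sutures, with 1 failed suture allowed in the attempt). Returns the
    number of remedial practice sutures performed.
    """
    remedial = 0
    while not retest():
        give_feedback()
        block = day1_reps // 2  # half of what proficiency took on day 1
        practice(block)
        remedial += block
    return remedial
```

Under this scheme, a student who needed 40 sutures on day 1 and fails one follow-up retest performs 20 remedial sutures before passing.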

The primary end point for the study was simple suture proficiency at a mean (SD) of 30 (2) days from the initial instruction date. The control group had a 0% pass rate at the end of 30 days, while the experimental group had a 91.7% pass rate. In the other measurements recorded, there was no statistically significant difference between the experimental and control groups at the initial training (day 1) or at the final evaluation (day 30), as shown in Table 3 and Table 4.
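The exact group sizes are not stated in this excerpt; assuming 12 students per group (an inference consistent with the 91.7% pass rate, ie, 11 of 12), the day 30 pass-rate comparison can be checked with a two-sided Fisher exact test using only the Python standard library:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed the observed table's.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    def p(k):  # probability of k in the top-left cell
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p(k) for k in range(lo, hi + 1) if p(k) <= p_obs * (1 + 1e-12))

# Assumed counts: control 0/12 proficient at day 30, experimental 11/12
p_value = fisher_exact_two_sided(0, 12, 11, 1)
```

With these assumed counts the two-sided P value is well below .001, consistent with the P < .001 reported in the abstract.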

Table 3. Day 1 Teaching Time, Suture Number, and OSATS Score
Table 4. Day 30 Pass Rate, Teaching Time, Suture Number, and OSATS Score

Once students had been shown the technique and understood its mechanics, the mean (SD) number of practice sutures necessary to achieve competence at baseline was 41 (15). The mean number of practice sutures necessary for competence at 30 days was similar between the experimental and control groups, as was the time necessary for competence at 30 days (Table 4). The cumulative mean number of sutures required for competence during 30 days was 74, and the mean time was 180 minutes (3 hours).

Safety outcomes were not formally recorded. However, no injuries were reported by the instructors during the study.

Our data demonstrate that spaced reinforcement of the simple suture technique is critical for retention over the course of 30 days. While total teaching time was the same for both groups across the study, students not engaged in spaced reinforcement cannot safely contribute in a clinical situation because of a lack of proficiency. Furthermore, smaller increments of teaching time are required to catalyze proficiency with spaced reinforcement compared with larger blocks for students who do not practice. As a practical matter, limited instructor time favors a spaced practice paradigm.

No matter how the instruction is divided, it would appear that all students reach similar subjective levels of proficiency in similar amounts of time. Three hours of instruction and 80 practice sutures seem to yield proficiency. These parameters should be considered when planning a surgical curriculum. The variable that separated the groups was their ability to correctly and quickly (and ostensibly safely) perform the simple suture between the initial training and the 30-day evaluation. If the initial instruction is immediately followed by supervised practical application on patients (which could constitute spaced reinforcement), 2 hours of instruction time with 40 practice sutures appear to be a minimum prerequisite in a surgical curriculum, as shown by the day 1 results (Tables 3 and 4).

Our results demonstrate a flaw in the single-session model of teaching clinical skills. One instructional session is not sufficient without further practice to maintain clinical skills. This concept of spaced learning, first reported in 1980 by Glenberg and Lehmann,9 has been well described in more recently published literature reviews and is supported by multiple randomized clinical trials.10-12 The physician educator who is teaching medical students basic clinical skills should use (1) a safe and well-structured curriculum; (2) technology (simulated skin and instructional videos) to reduce the time needed to teach, enhance retention, and reduce risk of injury; and (3) brief supervised follow-up practice to maintain skills that are infrequently used.

Recent studies13,14 have shown that using well-trained and standardized peer instructors can help to reduce the workload for the physician instructor. The use of peer instructors (ie, students previously certified by an instructor) following the initial instruction by a senior clinician may be an additional way to decrease the demand on physician instruction in the medical education setting and provides an additional benefit of reinforcing the skill for the peer instructor.

We are unaware of any study that has compared the separate educational interventions of our study, including video primers, spaced practice, and instructor feedback, in evaluation of procedural skills education. All were included based on published data supporting their value in education.1,5-12 Therefore, we sought to develop the most up-to-date and effective curriculum. This design limited our ability to evaluate which instructional technique was most effective. The number of medical students needed to compare the efficacy of all 3 separate interventions individually and a combination of them is prohibitively large for a single site. However, we believe that it would be valuable to determine how each of these interventions contributes to the efficiency of instruction and skill retention in medical students’ learning of basic procedural skills. A multicenter or multiyear study would help to determine this and significantly further the refinement of our suggestions based on this initial study.

Single instructional sessions may not be sufficient to maintain proficiency over the course of a 30-day elective. A lack of proficiency in basic procedural skills compromises the learning opportunities for medical students during their clinical rotations because they are unable to use what they were taught in the skills laboratory on patients. Most important, it endangers the welfare of poorly trained students, who are more likely to be injured in a clinical setting, and translates to worse outcomes for patient safety.

We propose the following schedule for teaching the simple suture to medical students. First, the student watches preparatory instructional videos and reviews evaluation tools (see the Methods section and Tables 1 and 2). Second, the instructor demonstrates the simple suture technique for 30 minutes and ensures that the student understands the proper mechanics. Third, the student performs 20 simple sutures, followed by evaluation for proficiency and guidance from the instructor. Fourth, the student practices an additional 20 simple sutures and is tested for proficiency by the instructor. If the student is not proficient, he or she should practice 10 repetitions and recertify after each 10 until proficiency is achieved. Fifth, the student should view instructional videos and practice at least 10 repetitions every 10 days to maintain his or her skill.

Accepted for Publication: January 14, 2015.

Corresponding Author: Jacob Levitt, MD, Department of Dermatology, Icahn School of Medicine at Mount Sinai, 5 E 98th St, Fifth Floor, Campus Box 1048, New York, NY 10029 (jacob.levittmd@gmail.com).

Published Online: March 18, 2015. doi:10.1001/jamadermatol.2015.118.

Author Contributions: Mr Routt and Dr Levitt had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Routt, Levitt.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Routt, de Moll, Levitt.

Critical revision of the manuscript for important intellectual content: All authors.

Obtained funding: Levitt.

Administrative, technical, or material support: Routt, Mansouri, Levitt.

Study supervision: de Moll, Bernstein, Levitt.

Conflict of Interest Disclosures: None reported.

Funding/Support: This study was supported by the Department of Dermatology, Icahn School of Medicine at Mount Sinai. Dr Mansouri was supported by the Geoffrey Dowling Fellowship, a grant from the British Association of Dermatologists.

Role of the Funder/Sponsor: The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Additional Contributions: Emilia Bagiella, PhD (Mount Sinai Hospital), performed the statistical analysis. Dr Bagiella received no compensation.

References

1. Morris MC, Gallagher TK, Ridgway PF. Tools used to assess medical students' competence in procedural skills at the end of a primary medical degree: a systematic review. Med Educ Online. 2012;17. doi:10.3402/meo.v17i0.18398.
2. Alam M, Nodzenski M, Yoo S, Poon E, Bolotin D. Objective structured assessment of technical skills in elliptical excision repair of senior dermatology residents: a multirater, blinded study of operating room video recordings. JAMA Dermatol. 2014;150(6):608-612.
3. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-278.
4. Khan MS, Bann SD, Darzi AW, Butler PE. Assessing surgical skill using bench station models. Plast Reconstr Surg. 2007;120(3):793-800.
5. Acton RD, Chipman JG, Gilkeson J, Schmitz CC. Synthesis versus imitation: evaluation of a medical student simulation curriculum via objective structured assessment of technical skill. J Surg Educ. 2010;67(3):173-178.
6. Aragon CE, Zibrowski EM. Does exposure to a procedural video enhance preclinical dental student performance in fixed prosthodontics? J Dent Educ. 2008;72(1):67-71.
7. Hauser AM, Bowen DM. Primer on preclinical instruction and evaluation. J Dent Educ. 2009;73(3):390-398.
8. Pape-Koehler C, Immenroth M, Sauerland S, et al. Multimedia-based training on Internet platforms improves surgical performance: a randomized controlled trial. Surg Endosc. 2013;27(5):1737-1747.
9. Glenberg AM, Lehmann TS. Spacing repetitions over 1 week. Mem Cognit. 1980;8(6):528-538.
10. Kerfoot BP, DeWolf WC, Masser BA, Church PA, Federman DD. Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med Educ. 2007;41(1):23-31.
11. Augustin M. How to learn effectively in medical school: test yourself, learn actively, and repeat in intervals. Yale J Biol Med. 2014;87(2):207-212.
12. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM. Online spaced education generates transfer and improves long-term retention of diagnostic skills: a randomized controlled trial. J Am Coll Surg. 2010;211(3):331-337.e1. doi:10.1016/j.jamcollsurg.2010.04.023.
13. Saleh M, Sinha Y, Weinberg D. Using peer-assisted learning to teach basic surgical skills: medical students' experiences. Med Educ Online. 2013;18:21065.
14. Wirth K, Malone B, Barrera K, Widmann WD, Turner C, Sanni A. Is there a place for medical students as teachers in the education of junior residents? Am J Surg. 2014;207(2):271-274.


Comment
Ever thus
Posted on August 8, 2015
Dr Jim Muir
Private practice, Brisbane Australia
Conflict of Interest: None Declared
This article nicely illustrates the truism that skills are only acquired through experience. Lessons should be taken from this when we think about medical education in general. To learn to be a doctor you still need to see undifferentiated patients, for whom you have responsibility in a supervised environment. Workshops, seminars, simulated patients etc only get you so far.
