Development and Psychometric Validation of the Minho Suture Assessment Scale (Minho-SAS) for Medical Students
DOI: https://doi.org/10.20344/amp.23567

Keywords: Clinical Competence; Educational Measurement; Students, Medical; Surveys and Questionnaires; Suture Techniques/education

Abstract
Introduction: Even though mastery of suturing is a core technical skill in surgical education, existing tools for its assessment often lack psychometric validation or are not specifically designed for undergraduate training. The aim of this study was to develop and validate the Minho Suture Assessment Scale (Minho-SAS), a structured instrument to evaluate fundamental suturing competencies in medical students. The research question was whether the Minho-SAS demonstrates validity and reliability as a psychometric tool.
Methods: The development process involved collaboration with multidisciplinary surgical teams and experienced practitioners to ensure content validity. Data from a cohort of medical students were used for psychometric evaluation. Dimensionality was assessed using parallel analysis, the Bayesian information criterion, unidimensional congruence, item unidimensional congruence, explained common variance, item explained common variance, and the mean of item residual absolute loadings. Validity based on internal structure was assessed with Rasch model analysis and factor analysis of the tetrachoric correlation matrix. Reliability was assessed using Rasch model standard errors of measurement to obtain a conditional reliability curve, together with Cronbach’s alpha and McDonald’s omega internal consistency coefficients.
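The internal consistency coefficients named above are standard closed-form quantities. As an illustration only (not the authors' actual analysis pipeline), a minimal Python sketch of Cronbach’s alpha from an item-score matrix, and of McDonald’s omega-total given one-factor loadings and uniquenesses, might look like:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def mcdonald_omega(loadings: np.ndarray, uniquenesses: np.ndarray) -> float:
    """Omega-total from a one-factor solution:
    (sum of loadings)^2 / ((sum of loadings)^2 + sum of uniquenesses)."""
    s = loadings.sum()
    return s**2 / (s**2 + uniquenesses.sum())
```

In practice the loadings would come from the factor analysis of the tetrachoric correlation matrix described above; here they are assumed inputs.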
Results: Analyses supported a unidimensional structure for the Minho-SAS. The single-factor solution explained 39.96% of variance, and Rasch measures accounted for 29.15% (16.43% by persons, 12.72% by items). Residual correlations, factor loadings, and item fit statistics were within acceptable ranges. Reliability indices were satisfactory: Rasch reliability = 0.706; McDonald’s omega = 0.889; Cronbach’s alpha = 0.883.
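The Rasch measures reported here rest on the dichotomous Rasch model, in which the probability of success on an item depends only on the difference between person ability and item difficulty, both on a logit scale. A minimal sketch:

```python
import math

def rasch_prob(theta: float, b: float) -> float:
    """Dichotomous Rasch model: probability that a person with ability
    theta succeeds on an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When ability equals difficulty the probability is exactly 0.5; item fit statistics (infit/outfit) compare observed responses against these model-expected probabilities.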
Conclusion: The Minho-SAS is a robust instrument specifically tailored for assessing fundamental suturing skills among medical students. Although Rasch model analysis yielded less favorable results than factor analysis, they remained within acceptable limits. While the instrument shows considerable promise, further evaluation of the Minho-SAS across diverse populations and educational settings is needed to confirm its broader applicability and impact in medical education and clinical practice.
License
Copyright (c) 2025 Acta Médica Portuguesa

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.