Academic year: 2022

Higher education pedagogical development project

The rationale for proposing an extended score interval in the assessment protocol in the Basic Surgical Skills course in Sweden

Stefan Acosta, Vascular Centre, Malmö, Skåne University Hospital.

Tutor: Anders Beckman, Lund University, Dept of Clinical Science, Malmö

Address for correspondence:

Stefan Acosta

Vascular Centre, Malmö-Lund

Skåne University Hospital

Sweden

E-mail: Stefan.acosta@telia.com


Abstract

Objective: The Basic Surgical Skills course, introduced from Great Britain to Sweden in 2002, has adopted the teaching concept of one safe and standardized surgical technique for the most elementary skills. The assessment score sheet has been criticized for having a too narrow score interval, 0 – 3, with little possibility to reflect training progression and provide proper constructive feedback to trainees. An extended score interval, 1 – 6, was proposed by the Swedish steering committee of the course. The aim of this study was to analyze the trainees' scores in the current and proposed score sheets.

Design: Participants were evaluated in the current and proposed assessment forms by instructors/observers (n=11) and by themselves during the first and second day. In each assessment form, 17 tasks were assessed, and six assessment forms were completed for each participant. Inter-rater agreement was expressed as percentage agreement and inter-rater reliability as intra-class correlation (ICC).

Setting: The course was delivered in April 2013 at Practicum, Lund, Skåne University Hospital, Sweden.

Participants: Sixteen residents in surgery, seven female and nine male, within their first year of training.

Results: The highest overall inter-rater agreement was 68% for the current and 48% for the proposed assessment form, between instructors and observers, and the lowest was 55% and 33%, respectively, between female trainees and instructors. The overall inter-rater reliability between the current and proposed score sheets after assessment by the instructors increased from an ICC of 0.38 on day 1 to 0.83 on day 2.


Conclusions: The proposed score sheet is more dynamic and has a better potential than the current score sheet to be a platform for more accurate scores, better feedback and an instrument for learning and retention of acquired technical surgical skills in the Basic Surgical Skills course.

Key words: surgical education, basic surgical skills, residents, score interval, assessment

Competencies: Practice-Based Learning and Improvement, Patient Care


Introduction

Surgical competence is dependent on technical skills as well as non-technical skills such as decision-making, communication, teamwork and leadership1. Effective teaching and learning in technical surgical skills courses should follow the pedagogic principles of constructive alignment2, where there is harmony between goal-directed teaching, teaching activities and assessment. It is well known that assessment per se is a driver for learning3. The form of assessment, however, summative or formative, has been used to serve different purposes. It seems that elements of both summative and formative assessment may be beneficial for learning4,5, whereas regular, less stressful formative assessments may be better for retention of technical skills6.

The Basic Surgical Skills course has adopted the teaching concept of one safe and standardized surgical technique for the most elementary skills. The course was introduced from Great Britain to Sweden in 2002. Since then, the course has been adapted to Swedish conditions and is mainly intended for surgical trainees within their first year of training. The aims and learning activities of the course have been modified, whereas the assessment score sheet has remained the same. The assessment score sheet (Appendix 1) has been criticized for having a too narrow score interval, scores 0 – 3, with little possibility to reflect training progression and provide proper constructive feedback to the trainees. Furthermore, the score "0" has, to our knowledge, never been given during our courses, nor has anybody failed, indicating that the assessment at this level of training should be formative rather than summative. Indeed, it was decided in a national steering meeting to revise the protocol into an entirely formative evaluation score sheet and to compare the proposed wider score interval of 1 – 6 (Appendix 2), adopted from the Direct Observation of Procedural Skills (DOPS) method7,8, against the current score sheet. The aim of this study was to analyze the trainees' scores in the current and proposed score sheets, assessed by the instructors, external observers and the trainees themselves, and to estimate inter-rater agreement within each respective score sheet and inter-rater reliability between the two score sheets.


Methods

Sixteen course participants were evaluated during the first and second day of the Basic Surgical Skills course (17 – 19 April 2013) at Practicum, Lund, Skåne University Hospital, Sweden. All instructors and external observers recruited for this study were experienced instructors. The instructors and external observers were informed, both in writing and orally, about the revised proposed protocol one week prior to the start of the study and again at the start of the study. The course participants were informed at the start of the study. One instructor and one observer were assigned to evaluate four participants at each of the four working stations, according to a protocol drawn up before the start of the study. Those serving as instructors on the first day served as observers on the second day, and vice versa. In all, there were eleven instructors or observers, of whom one was female. Technical skills in each specific task were assessed in the current and the proposed evaluation form independently by the instructor, the external observer and the course participants themselves (self-assessment). Oral and written scores were given at the end of each morning or afternoon session by the respective instructors to each participant. In that way, formal assessments were performed four times during the study. All 96 evaluation forms were completed. In each evaluation form, 17 tasks were assessed. Oral feedback from the trainees to the teachers was given at the end of each day, and written feedback according to the participant course evaluation form (Appendix 3) was returned to the teachers immediately after the end of the course.

Statistics

Age in women and men was expressed as median (range). Differences between groups were evaluated with the Mann-Whitney U test and related samples with the Wilcoxon signed-rank test; a p-value < 0.05 was considered significant. The inter-rater agreement (i.e. the extent to which assessors make exactly the same judgment about a subject) was evaluated with proportional agreement and expressed as percentage agreement. There were seventeen scores to be appointed in the current and proposed score sheet, respectively, and 272 (17 x 16) scores were given in the respective 16 score sheets; the overall inter-rater agreement between, for instance, instructors and observers in the current score sheet was calculated by summing up all the perfect matches and dividing by 272 (percentage agreement, see Table 1, Appendix 4). Inter-rater agreement was graded as follows: lack of agreement (0.00 – 0.30), weak agreement (0.31 – 0.50), moderate agreement (0.51 – 0.70), strong agreement (0.71 – 0.90) and very strong agreement (0.91 – 1.00)9. A floor or ceiling effect was considered to be present when >15% of participants received the lowest or highest score, respectively.

The inter-rater reliability (i.e. the consistency in the rating of subjects, although each subject is not given exactly the same rating by all assessors) between the current and proposed score sheets was evaluated with intra-class correlation (ICC) with 95% confidence intervals (CI) (two-way mixed model, consistency10). A value of > 0.7 was regarded as satisfactory11. The total summarized score in the current and proposed score sheet for each trainee during day 1 and day 2, respectively, by the instructors, observers and the trainees themselves, was calculated, and the reliability analysis was performed after entering, for instance, the instructors' total scores in the current and proposed score sheets of all trainees on day 1 (see Table 2, Appendix 5). The mean score of the specific tasks "instrument handling", "knot tying" and "suture technique" was calculated for all four time points in the current and proposed assessment score sheet, respectively, when written evaluation took place, and the development of acquired scores was graphically displayed (see Figures 1 – 5). Analyses were performed in SPSS, version 20.0, and Excel.
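The two quantities used throughout this study can be sketched in a few lines of code. The following is a minimal illustration (using numpy; the function names and the toy data are my own, and the SPSS "two-way mixed model, consistency" ICC corresponds to the single-measures ICC(C,1) computed here):

```python
import numpy as np

def percent_agreement(rater_a, rater_b):
    """Proportion of tasks on which two raters gave exactly the same score."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return float(np.mean(a == b))

def agreement_grade(p):
    """Verbal grade for a proportional agreement (per the grading in ref. 9)."""
    for limit, label in [(0.30, "lack"), (0.50, "weak"),
                         (0.70, "moderate"), (0.90, "strong")]:
        if p <= limit:
            return label
    return "very strong"

def icc_consistency(ratings):
    """Single-measures ICC, two-way mixed model, consistency: ICC(C,1).

    `ratings` is an (n_subjects x k_raters) array; here the two 'raters'
    can be a trainee's total score on the current and on the proposed sheet.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)    # between raters
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_r = ss_rows / (n - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e)
```

For example, perfectly parallel scores such as [[1, 2], [2, 3], [3, 4], [4, 5]] yield an ICC(C,1) of 1.0 even though the two raters never agree exactly, which is precisely the distinction between agreement and reliability drawn above.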


Results

The course participants

There was no difference in age between the nine male and seven female participants, with a median age of 35 years (range 28 – 43) and 31 years (range 29 – 39), respectively (p=0.41).

Inter-rater agreement between assessors

Inter-rater agreement between instructors and observers, and between assessors and participants, was higher for the current than for the proposed score sheet (Table 1). The inter-rater agreement between instructors and observers for knot tying on day one, am, in the current and proposed assessment score sheets was 44% and 69%, respectively. The inter-rater agreement between instructors and observers for knot tying on day two, pm, in the current and proposed assessment score sheets was 56% and 31%, respectively. The inter-rater agreement between instructors and observers for arterial patch anastomosis at the end of day one in the current and proposed assessment score sheets was 94% and 38%, respectively. The inter-rater agreement between instructors and observers for bowel anastomosis side to side at the end of day two in the current and proposed assessment score sheets was 44% and 56%, respectively (Appendix 4).

Inter-rater reliability between the current and proposed score sheets

The ICC between the current and the proposed score sheet was lower on day 1 than on day 2, particularly for the instructors (Table 2). The ICC between the two score sheets after assessment of knot tying on day one, am, was 0.50 (-0.43 – 0.82), 0.50 (-0.43 – 0.82) and 0.80 (0.43 – 0.93) for instructors, observers and the trainees themselves, respectively. The ICC between the two score sheets after assessment of knot tying on day two, pm, was 0.92 (0.78 – 0.97), 0.70 (0.13 – 0.89) and 0.75 (0.28 – 0.91) for instructors, observers and the trainees themselves, respectively (Appendix 5).

Assessment of repeated technical skills

The progression lines towards higher scores in instrument handling, knot tying and suture technique throughout the study were steeper in the proposed than in the current score sheet (Figures 1 – 5). The distribution of scores given by the instructors for assessment of "knot tying" according to the current score sheet at the first and last time points was score 1 (n=6) and score 2 (n=10), and score 2 (n=11) and score 3 (n=5), respectively (p=0.001). The highest score, score 3, was given in 31% (5/16) at the last time point. The distribution of scores given by the instructors for assessment of "knot tying" according to the proposed score sheet was score 2 (n=3), score 3 (n=12) and score 4 (n=1), and score 4 (n=4) and score 5 (n=12), respectively (p<0.001). The distribution of scores given by the instructors for assessment of "bowel anastomosis end to end" at day 2, am, and "bowel anastomosis side to side" at day 2, pm, according to the current score sheet was score 2 (n=14) and score 3 (n=2), and score 2 (n=6) and score 3 (n=10), respectively (p=0.011). The highest score, score 3, was given in 62% (10/16) for "bowel anastomosis side to side". The distribution of scores given by the instructors for assessment of "bowel anastomosis end to end" at day 2, am, and "bowel anastomosis side to side" at day 2, pm, according to the proposed score sheet was score 3 (n=5), score 4 (n=10) and score 5 (n=1), and score 3 (n=2), score 4 (n=5) and score 5 (n=9), respectively (p=0.001).

Gender perspectives on self assessments

The female participants assessed themselves with lower scores than the male participants (Figures 4 – 5), but this was only significant for knot tying at time point 3 in the proposed score sheet: the median score for females and males after self-assessment of knot tying at time point 3 in the proposed score sheet was 3 (range 3 – 4) and 4 (range 4 – 5), respectively (p=0.016), whereas the instructors scored 4 (range 3 – 4) and 4 (range 3 – 4), respectively (p=0.61). The inter-rater agreement in knot tying at the end of day 2 between female trainees and instructors, and between female trainees and instructors/observers, in the current assessment score sheet was 57% and 29%, respectively. The inter-rater agreement in knot tying at the end of day 2 between female trainees and instructors, and between female trainees and instructors/observers, in the proposed assessment score sheet was 71% and 43%, respectively. The inter-rater agreement in knot tying at the end of day 2 between male trainees and instructors, and between male trainees and instructors/observers, in the current assessment score sheet was 56% and 44%, respectively (Appendix 6). The inter-rater agreement in knot tying at the end of day 2 between male trainees and instructors, and between male trainees and instructors/observers, in the proposed assessment score sheet was 44% and 11%, respectively (Appendix 7).

Trainee feedback

The feedback mean scores for "content", "delivery", "materials" and "overall rating" were 4.93, 4.81, 4.37 and 4.87, respectively (scale 1 – 5). The specific learning activities "instrument handling", "knot tying", "suture technique" and "end-to-end bowel anastomosis" received feedback mean scores of 4.37, 4.68, 4.74 and 4.81, respectively.


Discussion

The proposed score sheet has a more extended score interval than the current score sheet, making it possible to use the scores as a more dynamic feedback instrument12-14, with larger room for improvement of scores in repeated assessments of the same task and for retention of acquired skills15. The proposed score sheet can be a more precise steering tool, better reflecting the actual level of acquired skills. The inter-rater reliability between the instructors' scorings in the current and proposed score sheets was low on day one, probably due to the very limited room for different scores in the current score sheet, whereas the higher inter-rater reliability on day two may reflect an effect of training with improved scores, which is better captured by the proposed score sheet. As expected, the extended score interval of the proposed score sheet led to a lower percentage of agreement between different assessors.

The current score interval of 0 – 3 is too narrow, and the score "0" is a strong symbol of failure. It is an unnecessarily repressive score in a formative assessment context, where the open, stimulating interaction between teacher and trainee is important to maintain16. It is important to distinguish such a repressive score from negative feedback, which may be as effective for surgical performance and motivation as positive feedback17. No instructor feedback at all is associated with inferior skills performance compared to when instructor feedback is given18. The other extreme, the perfect score of "3", should in practice be considered nearly impossible for trainees to achieve. Nevertheless, perfect scores were given to some extent at the final assessment in various technical learning activities in the current assessment sheet and, indeed, a clear ceiling effect was noted, questioning the validity of the current score interval. If a participant deteriorates during training, temporarily or permanently, it may be difficult to lower a score from "2" to "1". Hence, the effective score interval of 1 – 2 in the current score sheet has to be replaced by a revised score sheet. In accordance with our opinion, the Royal College of Surgeons in Great Britain has found it necessary to revise the Intercollegiate Basic Surgical Skills assessment scale and feedback. The revised, slightly extended scale is, however, similar to the old scale: the scale interval has been altered from 0 – 3 to 1 – 5. The interpretation of the revised scores 1 – 2 and 4 – 5 corresponds to the old scores 0 – 1 and 2 – 3, respectively, whereas the revised score "3" has been added. This intermediate revised score of "3" means that the participant performs satisfactorily, identifies occasional errors and needs some supportive assistance to correct these errors. There are, for example, extended score intervals of 1 – 9 (1 – 3 unsatisfactory, 4 – 6 satisfactory, 7 – 9 excellent) that may be even better, although not proven, in the teaching of manual technical skills19.

Assessment of technical surgical skills in trainees at the beginning of their specialization is usually based on procedure-specific checklists or global rating scales that may be applied to any type of surgical procedure20. The most valid and widely used global rating scale to test operative technical skills is the Objective Structured Assessment of Technical Skill (OSATS), originally developed for bench model simulations21. The global seven-item, five-point rating scale (1 – 5) assesses the trainee's respect for tissue, time and motion, instrument handling, instrument knowledge, use of assistants, flow of operation and knowledge of the specific procedure. This global rating scale has been shown to be a more appropriate method to test technical skills than procedure-specific checklists22-26. The global rating form in OSATS is, however, not directly applicable to the Basic Surgical Skills course, since only "instrument handling" and "use of assistants" have been defined in OSATS, whereas "knot tying", "suture technique", "bowel anastomosis", "abdominal wall closure", "arteriorrhaphy" and "patch anastomosis" have not. Development of defined assessment criteria for the scores 1 – 5 or 1 – 6 for each skill or procedure in the Basic Surgical Skills course may help instructors to give more accurate scores, improving inter-observer agreement, and to provide more precise and understandable feedback to the trainees. However, a dedicated task force would have to be constituted to deal with this challenging task, and these defined assessment criteria would have to be validated before implementation into the curriculum.

Furthermore, it is highly likely, at least during the first years of experience with such a scoring system, that it would lead to an unwanted lengthening of the course. In contrast, the national steering committee of the Basic Surgical Skills course found the DOPS methodology scale simple and directly applicable to the course in its present form. Indeed, the instructors and faculty of the course in the present study felt, independently of the objective results, that the proposed extended score interval of 1 – 6 was an improvement of the assessment and feedback form. Conversely, the results of the participant evaluation form showed that the trainees were very pleased with the teaching quality and the interaction with the teachers of the course. The face validity of the assessment seemed to be improved with the extended score interval. The faculty of the course has been stable throughout the years, yearly adjusting the aims, content and assessment for trainees in general surgery in Sweden. For instance, all learning activities related to orthopedic surgery have been removed from the original curriculum, and focus has moved towards knot tying and bowel surgery. The trainees rated the content of the course with the highest feedback mean score of all items in the participant evaluation form and were thus very pleased with the relevance and scope of the course and its level for the target group. Hence, the content validity of the assessment has improved greatly.

The assessment form for trainees is often mixed, both formative and summative, as with the global rating form in OSATS. Formative assessment may, however, be better for trainees, stimulating learning in a more relaxed manner together with the instructor and avoiding the excessive stress associated with summative assessment27,28. For this purpose, the revised proposed score sheet in the current study does not assess whether the trainee has "passed" or "failed".

The instructors and observers had a higher inter-rater agreement than that between assessors and trainees, which is understandable, since the instructors and observers had the same level of experience of teaching at the course. The female trainees rated themselves lower than the men in skill assessments, although non-significantly in several learning activities, which may be due to a type 2 statistical error. This discrepancy in self-assessment between genders is, however, well known29, and may be due to underestimation of the level of acquired skills by the females, overestimation by the males, or a combination of both. Scientific reports need to take this aspect into account. Consideration of gender differences in self-perception is also important when providing feedback to female surgical residents. Higher year of training, older age and non-European nationality were reported to be even more predictive than gender of accuracy in self-prediction and self-assessment30.

In conclusion, the proposed score sheet has a better potential than the current score sheet to be a platform for more accurate scores, avoiding ceiling effects, and to offer better feedback and an instrument for learning and retention of acquired technical surgical skills during days one and two of the Basic Surgical Skills course. It is suggested that the proposed score sheet replace the current one in the curriculum for this three-day course. The next step to improve inter-observer agreement and learning outcomes might be to develop valid assessment criteria for the scores 1 – 6 for the learning activities, although its full implementation would most likely require a highly committed, professional instructor staff.


References

1. Crossley J, Marriott J, Purdie H, Beard JD. Prospective observational study to evaluate NOTSS (Non-Technical Skills for Surgeons) for assessing trainees' non-technical performance in the operating theatre. Br J Surg 2011; 98: 1010 – 1020.

2. Biggs J, Tang C. Teaching for Quality Learning at University. Maidenhead: McGraw-Hill and Open University Press; 2011.

3. Sonnadara RR, Garbedian S, Safir O, Nousiainen M, Alman B, Ferguson P, Kraemer W, Reznick R. Orthopedic Boot Camp II: examining the retention rates of an intensive surgical skills course. Surgery 2012; 151: 803 – 7.

4. Yu TC, Wheeler BR, Hill AG. Clinical supervisor evaluations during general surgery clerkships. Med Teach 2011; 33: 479 – 84.

5. Massey S, Stallman J, Lee L, Klingaman K, Holmerud D. The relationship between formative and summative examinations and PANCE scores: can the past predict the future? J Physician Assist Educ 2011; 22: 41 – 5.

6. Macluskey M, Hanson C. The retention of suturing skills in dental undergraduates. Eur J Dent Educ 2011; 15: 42 – 6.

7. Oldfield Z, Beasley SW, Smith J, Anthony A, Watt A. Correlation of selection scores with subsequent assessment scores during surgical training. ANZ J Surg 2013; PMID: 23647783.

8. Barton JR, Corbett S, van der Vleuten CP. The validity and reliability of a Direct Observation of Procedural Skills assessment tool: assessing colonoscopic skills of senior endoscopists. Gastrointest Endosc 2012; 75: 591 – 7.

9. LeBreton JM, Senter JL. Answers to 20 questions about interrater reliability and interrater agreement. Organizational Research Methods 2008; 11: 815 – 852.

10. Cicchetti DV. Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment 1994; 6: 284 – 290.

11. Bland JM, Altman DG. Statistics notes: Cronbach's alpha. BMJ 1997; 314: 572.

12. Hattie J, Timperley H. The power of feedback. Rev Educ Res 2007; 77: 81 – 112.

13. Pandey VA, Wolfe JHN, Black SA, Cairols M, Liapis CD, Bergqvist D. Self-assessment of technical skill in surgery: the need for expert feedback. Ann R Coll Surg Engl 2008; 90: 286 – 90.

14. Norcini JJ. The power of feedback. Med Educ 2010; 44: 16 – 17.

15. Torkington J, Smith SGT, Rees BI, Darzi A. Surg Endosc 2001; 15: 1071 – 1075.


16. Laidlaw AH. Social anxiety in medical students: implications for communication skills teaching. Med Teach 2009; 31: 649 – 54.

17. Kannappan A, Yip DT, Lodhia NA, Morton J, Lau JN. The effect of positive and negative verbal feedback on surgical skills performance and motivation. J Surg Educ 2012; 69: 798 – 801.

18. Strandbygaard J, Bjerrum F, Maagaard M, Winkel P, Larsen CR, Ringsted C, et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized trial. Ann Surg 2013; 257: 839 – 44.

19. Prescott LE, Norcini JJ, McKinlay P, Rennie JS. Facing the challenges of competency-based assessment of postgraduate dental training: Longitudinal Evaluation of Performance (LEP). Med Educ 2002; 36: 92 – 97.

20. van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. Objective assessment of technical surgical skills. Br J Surg 2010; 97: 972 – 87.

21. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative "bench station" examination. Am J Surg 1996; 172: 226 – 230.

22. Beard JD, Jolly BC, Newble DI, Thomas WEG, Donnelly J, Southgate LJ. Assessing the technical skills of surgical trainees. Br J Surg 2005; 92: 778 – 82.

23. Chipman J, Schmitz C. Using objective structured assessment of technical skills to evaluate a basic skills examination curriculum for first-year surgical residents. J Am Coll Surg 2009; 209: 364 – 70.

24. Datta V, Bann S, Aggarwal R, Mandalia M, Hance J, Darzi A. Technical skills examination for general surgical trainees. Br J Surg 2006; 93: 1139 – 1146.

25. Niitsu H, Hirabayashi N, Yoshimitsu M, Mimura T, Taomoto J, Sugiyama Y, Murakami S, Saeki S, Mukaida H, Takiyama W. Using the Objective Structured Assessment of Technical Skills (OSATS) global rating scale to evaluate the skills of surgical trainees in the operating room. Surg Today 2012; PMID: 22941345.

26. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84: 273 – 278.

27. Beard JD, Choksy S, Khan S, on behalf of the Vascular Society of Great Britain and Ireland. Assessment of operative competence during carotid endarterectomy. Br J Surg 2007; 94: 726 – 730.

28. Hays R, Wellard R. In-training assessment in postgraduate training for general practitioners. Med Educ 1998; 32: 507 – 13.

29. Minter RM, Gruppen LD, Napolitano KS, Gauger PG. Gender differences in the self-assessment of surgical residents. Am J Surg 2005; 189: 647 – 50.

30. De Blacam C, O'Keeffe DA, Nugent E, Doherty E, Traynor O. Are residents accurate in their assessment of their own surgical skills? Am J Surg 2012; 204: 724 – 31.


Table 1. The overall inter-rater agreement between pairs of assessors when scoring in the trainees' current and proposed assessment sheets

Pairs of assessors                            Current assessment    Proposed assessment
                                              Agreement (%)         Agreement (%)
Instructors - observers                       68                    48
Self-assessment by females - instructors      55                    33
Self-assessment by males - instructors        61                    37

Table 2. The overall inter-rater reliability of instructors', observers' and self-assessment of trainees when scoring in the current and proposed assessment sheets during day 1 and day 2

Assessor          Day 1 ICC (95% CI)      Day 2 ICC (95% CI)
Instructors       0.38 (-0.77 – 0.78)     0.83 (0.51 – 0.94)
Observers         0.68 (0.08 – 0.89)      0.69 (0.10 – 0.89)
Self-assessment   0.77 (0.33 – 0.92)      0.83 (0.52 – 0.94)

ICC = Intra-class correlation


Figure 1. Instrument handling: Assessment by instructors

Figure 2. Knot tying: Assessment by instructors


Figure 3. Suture technique: Assessment by instructors

Figure 4. Knot tying: Assessment by female trainees


Figure 5. Knot tying: Assessment by male trainees


Appendix

Appendix 4. Inter-rater agreement between instructors and observers when scoring in the trainees' current and proposed assessment sheets

Technical learning event               Current sheet     Proposed sheet
                                       Agreement (%)     Agreement (%)
Instrument handling
  time point 1                         9/16 (56)         9/16 (56)
  time point 2                         15/16 (94)        10/16 (62)
  time point 3                         12/16 (75)        10/16 (62)
  time point 4                         8/16 (50)         7/16 (44)
Knot tying
  time point 1                         7/16 (44)         11/16 (69)
  time point 2                         12/16 (75)        7/16 (44)
  time point 3                         13/16 (81)        6/16 (38)
  time point 4                         9/16 (56)         5/16 (31)
Suture technique
  time point 1                         9/16 (56)         9/16 (56)
  time point 2                         15/16 (94)        5/16 (31)
  time point 3                         15/16 (94)        9/16 (56)
  time point 4                         8/16 (50)         5/16 (31)
Arteriotomy and closure                12/16 (75)        9/16 (56)
Arterial patch anastomosis             15/16 (94)        6/16 (38)
Abdominal wall incision and closure    9/16 (56)         7/16 (44)
Bowel anastomosis end to end           12/16 (75)        6/16 (38)
Bowel anastomosis side to side         7/16 (44)         9/16 (56)
Sum                                    184/272 (68)      130/272 (48)


Appendix 5. Inter-rater reliability of instructors', observers' and self-assessment of trainees when scoring in the current and proposed assessment sheets

Technical learning event             Instructor            Observer              Self-assessment
                                     ICC (95% CI)          ICC (95% CI)          ICC (95% CI)
Instrument handling
  time point 1                       0.25 (-1.1 – 0.74)    0.85 (0.57 – 0.95)    0.71 (0.17 – 0.90)
  time point 2                       0.0 (-1.9 – 0.65)     0.39 (-0.76 – 0.78)   0.47 (-0.51 – 0.82)
  time point 3                       0.79 (0.39 – 0.92)    0.79 (0.39 – 0.92)    0.84 (0.55 – 0.94)
  time point 4                       0.76 (0.31 – 0.92)    0.65 (0.0 – 0.88)     0.56 (-0.25 – 0.85)
Knot tying
  time point 1                       0.50 (-0.43 – 0.82)   0.50 (-0.43 – 0.82)   0.80 (0.43 – 0.93)
  time point 2                       0.49 (-0.47 – 0.82)   0.07 (-1.7 – 0.68)    0.62 (-0.09 – 0.87)
  time point 3                       0.43 (-0.64 – 0.80)   0.40 (-0.71 – 0.79)   0.74 (0.25 – 0.91)
  time point 4                       0.92 (0.78 – 0.97)    0.70 (0.13 – 0.89)    0.75 (0.28 – 0.91)
Suture technique
  time point 1                       0.30 (-0.99 – 0.76)   0.74 (0.25 – 0.91)    0.59 (-0.18 – 0.86)
  time point 2                       0.38 (-0.79 – 0.78)   0.38 (-0.79 – 0.78)   0.79 (0.39 – 0.93)
  time point 3                       0.12 (-1.5 – 0.69)    0.0 (-0.48 – 0.48)    0.92 (0.78 – 0.97)
  time point 4                       0.88 (0.64 – 0.96)    0.77 (0.35 – 0.92)    0.77 (0.34 – 0.92)
Arteriotomy and closure              0.30 (-0.99 – 0.76)   0.30 (-0.99 – 0.76)   0.74 (0.26 – 0.91)
Arterial patch anastomosis           0.50 (-0.43 – 0.82)   0.50 (-0.42 – 0.83)   0.82 (0.48 – 0.94)
Abdominal wall incision and closure  0.85 (0.57 – 0.95)    0.72 (0.20 – 0.90)    0.86 (0.61 – 0.95)
Bowel anastomosis end to end         0.62 (-0.10 – 0.87)   0.62 (-0.10 – 0.87)   0.70 (0.14 – 0.90)
Bowel anastomosis side to side       0.88 (0.67 – 0.96)    0.60 (-0.16 – 0.86)   0.72 (0.18 – 0.90)

ICC = Intra-class correlation


Appendix 6. Inter-rater agreement between self-assessment among female trainees and assessors when scoring in the current and proposed assessment sheets

                                     Current assessment sheet        Proposed assessment sheet
Technical learning event             Instructors   Instructors and   Instructors   Instructors and
                                     (%)           observers (%)     (%)           observers (%)
Instrument handling
  time point 1                       5/7 (71)      3/7 (43)          3/7 (43)      3/7 (43)
  time point 2                       6/7 (86)      6/7 (86)          1/7 (14)      1/7 (14)
  time point 3                       5/7 (71)      5/7 (71)          3/7 (43)      3/7 (43)
  time point 4                       1/7 (14)      0/7 (0)           3/7 (43)      1/7 (14)
Knot tying
  time point 1                       3/7 (43)      1/7 (14)          1/7 (14)      1/7 (14)
  time point 2                       4/7 (57)      1/7 (14)          1/7 (14)      0/7 (0)
  time point 3                       5/7 (71)      4/7 (57)          3/7 (43)      1/7 (14)
  time point 4                       4/7 (57)      2/7 (29)          5/7 (71)      3/7 (43)
Suture technique
  time point 1                       4/7 (57)      3/7 (43)          3/7 (43)      3/7 (43)
  time point 2                       4/7 (57)      4/7 (57)          3/7 (43)      1/7 (14)
  time point 3                       3/7 (43)      3/7 (43)          1/7 (14)      1/7 (14)
  time point 4                       3/7 (43)      1/7 (14)          1/7 (14)      1/7 (14)
Arteriotomy and closure              4/7 (57)      4/7 (57)          1/7 (14)      1/7 (14)
Arterial patch anastomosis           3/7 (43)      3/7 (43)          1/7 (14)      1/7 (14)
Abdominal wall incision and closure  4/7 (57)      4/7 (57)          3/7 (43)      1/7 (14)
Bowel anastomosis end to end         5/7 (71)      4/7 (57)          3/7 (43)      2/7 (29)
Bowel anastomosis side to side       3/7 (43)      3/7 (43)          3/7 (43)      3/7 (43)
Sum                                  66/119 (55)   51/119 (43)       39/119 (33)   27/119 (23)


Appendix 7. Inter-rater agreement between self-assessment among male trainees and assessors when scoring in the current and proposed assessment sheets

Technical learning event | Current: agreement with instructors (%) | Current: agreement with instructors and observers (%) | Proposed: agreement with instructors (%) | Proposed: agreement with instructors and observers (%)

Instrument handling
  time point 1 | 7/9 (78) | 3/9 (33) | 3/9 (33) | 2/9 (22)
  time point 2 | 8/9 (89) | 8/9 (89) | 3/9 (33) | 2/9 (22)
  time point 3 | 7/9 (78) | 7/9 (78) | 5/9 (56) | 3/9 (33)
  time point 4 | 4/9 (44) | 3/9 (33) | 2/9 (22) | 1/9 (11)
Knot tying
  time point 1 | 3/9 (33) | 1/9 (11) | 3/9 (33) | 3/9 (33)
  time point 2 | 5/9 (56) | 4/9 (44) | 2/9 (22) | 1/9 (11)
  time point 3 | 3/9 (33) | 2/9 (22) | 3/9 (33) | 2/9 (22)
  time point 4 | 5/9 (56) | 4/9 (44) | 4/9 (44) | 1/9 (11)
Suture technique
  time point 1 | 6/9 (67) | 3/9 (33) | 2/9 (22) | 1/9 (11)
  time point 2 | 7/9 (78) | 7/9 (78) | 4/9 (44) | 3/9 (33)
  time point 3 | 5/9 (56) | 5/9 (56) | 4/9 (44) | 3/9 (33)
  time point 4 | 4/9 (44) | 2/9 (22) | 4/9 (44) | 2/9 (22)
Arteriotomy and closure | 8/9 (89) | 5/9 (56) | 4/9 (44) | 2/9 (22)
Arterial patch anastomosis | 6/9 (67) | 6/9 (67) | 4/9 (44) | 2/9 (22)
Abdominal wall incision and closure | 3/9 (33) | 3/9 (33) | 2/9 (22) | 1/9 (11)
Bowel anastomosis end to end | 7/9 (78) | 6/9 (67) | 5/9 (56) | 2/9 (22)
Bowel anastomosis side to side | 6/9 (67) | 4/9 (44) | 4/9 (44) | 2/9 (22)

Sum | 94/153 (61) | 73/153 (48) | 56/153 (37) | 33/153 (22)
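The agreement columns in these appendices are exact-match counts between two sets of scores for the same tasks. A minimal sketch of that calculation, using the document's own n/N (%) reporting format; the function name and the sample scores are hypothetical:

```python
def percent_agreement(self_scores, assessor_scores):
    """Count tasks where the trainee's self-score exactly matches the assessor's score."""
    if len(self_scores) != len(assessor_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(a == b for a, b in zip(self_scores, assessor_scores))
    return matches, len(self_scores), 100 * matches / len(self_scores)

# Hypothetical scores for nine trainees on one task (proposed 1-6 scale)
m, n, pct = percent_agreement([5, 4, 4, 6, 3, 5, 4, 2, 5],
                              [5, 4, 3, 6, 4, 5, 4, 3, 5])
print(f"{m}/{n} ({pct:.0f})")  # → 6/9 (67)
```

Because exact matches become less likely as the number of score levels grows, some drop in raw agreement is to be expected when moving from the 0–3 scale to the proposed 1–6 scale.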


Course in Basic Surgical Technique – Assessment & feedback

Name: ________________________
City: ________________________
Date: ________________________

Open surgery, Day 1 (training am / pm)
Instrument handling
Knot tying
Suture technique
Arteriotomy & closure
Patch anastomosis

Comments:

Day 1 am: Passed □ Failed □   Instructor __________________________ (Surgeon)
Day 1 pm: Passed □ Failed □   Instructor __________________________ (Vascular Surgeon)

Open surgery, Day 2 (training am / pm)
Instrument handling
Knot tying
Suture technique
Abdominal wall incision and closure
Ligate mesenteric vessels
Dissection of lymphatic gland
Bowel anastomosis end to end
Bowel anastomosis side to side

Comments:

Day 2: Passed □ Failed □   Instructor __________________________ (Surgeon)

Laparoscopic surgery, Day 3 (training am / pm)
Functions of the stack
Open access to abdominal cavity
Camera handling-Port placement
Eye-hand-eye coordination
Bimanual manipulation-Cutting
Clips, cholangiography
Endostapling

Comments:

Day 3: Passed □ Failed □   Instructor __________________________ (Laparoscopic surgeon)

Scores: 3 = No errors observed; 2 = Single errors corrected by the participant; 1 = Single errors not corrected by the participant; 0 = Frequent errors observed and/or dangerous surgical technique


Course in Basic Surgical Skills – Assessment & feedback

Name: ________________________
City: ________________________
Date: ________________________

Open surgery, Day 1 (training am / pm)
Instrument handling
Knot tying
Suture technique
Art of assistance
Arteriotomy & closure
Patch anastomosis

Comments:

Day 1 am: Instructor __________________________ (Surgeon)
Day 1 pm: Instructor __________________________ (Vascular Surgeon)

Open surgery, Day 2 (training am / pm)
Instrument handling
Knot tying
Suture technique
Art of assistance
Abdominal wall incision and closure
Ligate mesenteric vessels
Dissection of lymphatic gland
Bowel anastomosis end to end
Bowel anastomosis side to side

Comments:

Day 2: Instructor __________________________ (Surgeon)

Laparoscopic surgery, Day 3 (training am / pm)
Functions of the stack
Open access to the abdominal cavity
Camera handling-Port placement
Eye-hand-eye coordination
Bimanual manipulation-Cutting
Clips, cholangiography
Endostapling

Comments:

Day 3: Instructor __________________________ (Laparoscopic surgeon)

Scores: 1 2 3 4 5 6, anchored from Unsatisfactory (1) through Satisfactory to Excellent (6)


Basic Surgical Skills participant evaluation form

Centre: ………
Course dates: ………

Overall course ratings

Please rate each aspect of the course listed below, by ticking the relevant box (5 4 3 2 1).
Key: 5 = very pleased; 3 = indifferent; 1 = very disappointed

1 Content – relevance; scope; level for target group
2 Delivery – teaching quality; participant interaction; faculty:participant ratio
3 Assessment – appropriateness
4 Materials – e.g. handbook/video; quality of presentation and content; usefulness
5 Resources – standard of technical instruments/consumables; workshop/seminar room
6 Administration – application/registration procedures and general organisation
7 Overall rating for the whole course

Course sessions ratings

Day one (5 4 3 2 1 for each session)
Introduction and statement of course objectives
Handling instruments
Knots
Knots continued
Handling sutures
Handling vessels (anastomoses and closure)
Handling vessels (vein graft patch)
Discussion and feedback

Day two (5 4 3 2 1 for each session)
The Aberdeen knot
Abdominal incision and closure
Handling tissues
Handling bowel 1 (end-to-end extramucosal anastomosis)
Handling bowel 2 (end-to-side anastomosis on immobile bowel)
Discussion and feedback


Day three (5 4 3 2 1 for each session)
Introduction to minimal access surgery
The laparoscopic stack
Open method of port insertion
Camera handling
Safe port management and pneumoperitoneum
Hand-eye-camera coordination
Grasping and manipulation skills
Diathermy safety
Advanced dexterity skills: loop ligation; diathermy skills exercises
Discussion and feedback

Comments: e.g. what did you like best about the course or what could be improved?

Thank you for taking the time to complete this form
