IJCRR - 4(7), April, 2012
Pages: 108-112
Date of Publication: 18-Apr-2012
A COMPARATIVE STUDY OF OBJECTIVE STRUCTURED PRACTICAL EXAMINATION WITH CONVENTIONAL PRACTICAL EXAMINATION
Author: D. Ravichandran, R. Shankar, Varun Malhotra
Category: Healthcare
Abstract: Background and Aim of the Study: The Objective Structured Practical Examination (OSPE), recommended by medical educationists, is a new concept in practical assessment. However, this tool of practical evaluation is not widely used in our country. OSPE improves the validity of the examination, as it tests both the process and the product with emphasis on individual competency1,5. Presently OSPE is used as a formative assessment tool in some medical colleges. Very few studies on OSPE have been conducted nationwide, so this study aimed to compare OSPE with the conventional examination pattern. Methodology: The study was conducted in the Department of Anatomy, VMKV Medical College, Salem, Tamil Nadu, India. One hundred first-year MBBS students were included in the study, and all of them took both the conventional pattern of examination and the OSPE. The mean marks obtained by the students in the conventional and OSPE methods were analysed by a paired Student's t test. A well-structured questionnaire was administered to the same students to obtain feedback on the OSPE process. Results: The marks obtained by the students in the conventional method were lower than those obtained through the OSPE. This difference was statistically significant (p<0.005). Analysis of the students' feedback showed that most students felt that OSPE was more useful and comfortable than the conventional pattern of examination. Conclusion: Testing a student's ability to integrate knowledge without subjectivity, scoring weighted by importance, a uniform and reproducible level of assessment, and constructive feedback to students for improvement are the advantages that justify the inclusion of OSPE as an important assessment tool in the field of medicine.
Keywords: OSPE, Conventional exams, Assessment, Feedback
Full Text:
INTRODUCTION
Practical assessment of students in the medical curriculum needs a better tool, one that is reliable, uniform and capable of differentiating between different categories of students1,2. The Objective Structured Practical Examination (OSPE) recommended by medical educationists is a new concept in practical assessment1. The term OSPE was introduced in the year 1975 as a modification of the Objective Structured Clinical Examination (OSCE) for the evaluation of preclinical subjects2,3. However, this tool of practical evaluation is not widely used in our country. The All India Institute of Medical Sciences has standardized this method4. The validity of the examination is improved in OSPE, as this tool tests both the process and the product with emphasis on individual competency1,5. Presently OSPE is used as a formative assessment tool in some medical colleges. A modification of OSPE called SOSPE (Semi-Objective Structured Practical Examination) is also currently being followed in some medical schools2,6.
Aim and objectives: The aim of the present study was to compare the scores obtained by medical students in OSPE and in the traditional or conventional practical examination in the subject of Anatomy, and to obtain the students' feedback on the OSPE process.
METHODOLOGY
The study was conducted in the Department of Anatomy, Vinayaka Missions Medical College, Salem, Tamil Nadu, India. All the first-year MBBS students (n=100) were included in the study. For these 100 students, a surprise conventional anatomy practical examination was conducted on a chapter (head and neck) on which they had recently written their theory internal assessment. The conventional examination comprised spotters in gross anatomy and histology, and viva voce in osteology, embryology and radiology. The same set of students also took the new Objective Structured Practical Examination a week later; as before, this too was a surprise test. The OSPE had five sections, namely dissection spotters, histology slides, embryology, osteology and surface anatomy. Each section had three stations, of which one was an observer station and two were response stations. The time allotted for each station was two minutes. Both the examinations were conducted for a total of 50 marks. The mean marks obtained by the students in the conventional and OSPE methods were analysed by a paired Student's t test. In addition, a well-structured questionnaire was administered to the same students to obtain feedback on the OSPE process. The results of the feedback were analysed with SPSS 16.0.
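The paired comparison described above can be sketched as follows. The score arrays below are illustrative placeholders, not the study's actual data; the t statistic is computed from the per-student score differences, as in a standard paired Student's t test.

```python
import math
import statistics

def paired_t(before, after):
    """Paired Student's t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-student differences (after - before)."""
    assert len(before) == len(after), "paired test needs matched scores"
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation (n - 1 denominator)
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical marks (out of 50) for five students under each format
conventional = [24, 28, 30, 22, 26]
ospe = [34, 38, 36, 35, 37]
t = paired_t(conventional, ospe)
print(round(t, 2))  # a large positive t indicates OSPE scores exceed conventional scores
```

In practice the resulting t value would be compared against the t distribution with n - 1 degrees of freedom to obtain the p value, which statistical packages such as SPSS report directly.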
RESULTS
The results of the comparison of marks obtained in the conventional practical test with those in the OSPE are presented in Table 1. The results of the feedback obtained from the students are presented in Table 2.
The marks obtained by the students in the conventional method were lower than those obtained through the OSPE. This difference was statistically significant (p<0.005). The difference arose in the osteology, surface anatomy and embryology sections (Table 1), whereas the marks in dissection spotters and histology slides were almost similar in both exams.
Seventy percent of the students strongly agreed that the OSPE was well structured, whereas only 6% strongly disagreed. Fifty-eight percent of students felt that OSPE was less frightening than regular exams, and 67% were of the opinion that OSPE is a useful practical exercise. Seventy percent of students felt that OSPE increases the chance of passing the examination.
DISCUSSION
An attempt was made to compare the marks obtained by first-year MBBS students through the conventional practical examination pattern and the Objective Structured Practical Examination (OSPE) method. We observed a significant variation in the overall marks obtained by the students under the two methods. The mean marks obtained by the conventional method were 26.63 ± 8.19 (out of 50), and the mean marks obtained by the OSPE method were 36.53 ± 3.60 (out of 50). The difference was statistically significant (p<0.001). The results show that the performance and scoring of students vary with different tools of assessment, and it is also clear that different tools of assessment test different abilities of the students7,8. A few other authors have observed similar findings in their studies1.
Comparison of marks between the two methods in every section of the test showed similar marks in the dissection spotters section and the histology section (Table 1). The questions asked in the spotter examination in Anatomy (both dissected parts and histology slides) are almost identical to those in the OSPE method; in other words, this part is well structured and free of examiner subjectivity. In contrast, the comparison of marks in the other sections, including osteology, surface anatomy and embryology, showed a statistically significant difference: the marks obtained by the students in OSPE in these sections were better than in the conventional method (Table 1). All these sections in the conventional method of assessment require interaction with examiners. Facing the examiners is a problem for most students; the conventional viva voce examination creates a sense of fear and lack of confidence in most students, leading to poor performance. In addition, the examiner bias involved in these conventional examinations grossly alters the result. Considering these facts, OSPE appears superior to other methods of assessment. Direct observation of the student's performance, increased objectivity and reduced subjectivity, equal time for all students, and scoring based on a checklist are the advantages of OSPE1. However, an interactive, in-depth analysis of the knowledge and skill of the student is possible in our conventional examination. The response stations of OSPE evaluate areas of knowledge, interpretation and problem solving. A few authors feel that OSPE, if properly structured and combined with a short written component, can replace the current clinical examination exercise taught in the preclinical years1. Praveen Singh et al9 noted that the OSPE/OSCE type of assessment was well accepted by first-year medical students. Other authors have reported that OSPE was well received by the students as a fair system of examination that eliminates examiner bias10,11.
Similarly, in our study OSPE was well accepted by the students.
CONCLUSION
Our results have shown that Objective Structured Practical Examination marks are similar to those of the conventional method in sections like dissection spotters and histology slides, but differ in sections like embryology, osteology and surface anatomy. This is because in conventional examinations these sections (embryology, osteology and surface anatomy) are conducted in a viva voce pattern, whereas in the OSPE method these sections are more structured and objective, eliminating the examiner's bias. Testing a student's ability to integrate knowledge without subjectivity, scoring weighted by importance, a uniform and reproducible level of assessment, and constructive feedback to students for improvement are the advantages that justify the inclusion of the Objective Structured Practical Examination as an important assessment tool in the field of medicine. Our feedback analysis shows that the OSPE is well received and appreciated by the students. The authors recommend future studies in this field with an integrated approach between the basic science departments in the preclinical years.
ACKNOWLEDGEMENTS
Authors acknowledge the immense help received from the scholars whose articles are cited and included in references of this manuscript. The authors are also grateful to authors / editors / publishers of all those articles, journals and books from where the literature for this article has been reviewed and discussed.
The authors thank the teaching and non-teaching staff of the department of Anatomy, Vinayaka Mission Medical College, Salem for their support in the conduct of the study.
References:
1. Mahajan AS, Shankar N, Tandon OP. The Comparison of OSPE with Conventional Physiology Practical Assessment. JIAMSE Vol 14, No 2.
2. Hasan S, Malik, Hamad A, Khan H, Bilal M. Conventional/Traditional Practical Examination (CPE/TDPE) versus Objective Structured Practical Evaluation (OSPE)/Semi-Objective Structured Practical Evaluation (SOSPE). Pak J Physiol 2009;5(1):58-64.
3. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education 1979;13:41-54.
4. Nayar U. Objective structured practical examinations. In: Bijlani RL, Nayar U (Eds). Teaching Physiology, Trends and Tools. All India Institute of Medical Sciences, New Delhi. 198; 151-159.
5. Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). JPGM 1993;39(2):82-4.
6. Gitanjali B. The other side of OSPE. Indian J Pharmacol 2004;36:388-9.
7. Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Medical Education 1988;22:325-334.
8. Nayar U, Malik SL, Bijlani RL. Objective structured practical examination: a new concept in assessment of laboratory exercises in preclinical sciences. Medical Education 1986;20:204-209.
9. Singh PR, Bhatt R, Singh S. Perceptions towards implementation of OSPE as an assessment tool in anatomy for undergraduates at a rural medical college in Western India. National Journal of Basic Medical Sciences 2010;II(1):54-60.
10. Watson AR, Houston IB, Close GC. Evaluation of an objective structured clinical examination. Arch Dis Child 1982;57:390-392.
11. Smith LJ, Price DA, Houston IB. Objective structured clinical examination compared with other forms of student assessment. Arch Dis Child 1984;59:1173-1176.