
Odovtos International Journal of Dental Sciences

On-line version ISSN 2215-3411Print version ISSN 1659-1046

Odovtos vol.24 n.2 San José May./Aug. 2022

http://dx.doi.org/10.15517/ijds.2021.47533 

Clinical research

Faculty perceptions on objective structured clinical exam in dental education


1Vice Dean, Dentistry program. Ibn Sina National College for Medical Studies, Jeddah Kingdom of Saudi Arabia. https://orcid.org/0000-0001-6250-4376

2Faculty of Dentistry, Department of Preventive Dental Sciences. Ibn Sina National College for Medical Studies, Jeddah Kingdom of Saudi Arabia. https://orcid.org/0000-0002-0890-0471

3Faculty of Dentistry, Department of Preventive Dental Sciences. Ibn Sina National College for Medical Studies, Jeddah Kingdom of Saudi Arabia. https://orcid.org/0000-0001-7712-8756

Abstract

The Objective Structured Clinical Examination (OSCE) uses standardized content and procedures to assess students across multiple domains of learning. This study aimed to assess the knowledge, attitudes, practices and observations of dental faculty regarding the OSCE. The survey was distributed to dental faculty members in randomly selected government and private institutions in Saudi Arabia. The questionnaire was pre-tested and consisted of 4 categories: general characteristics of respondents; knowledge on the utility of the OSCE in the curriculum and its reliability; attitudes regarding the OSCE on a 5-point Likert scale; and practices and observations on the OSCE through multiple choice questions (both single-answer and multiple-answer) and responses on a 5-point Likert scale. The sample size was determined to be 93 and the survey was sent electronically to 10 institutes. Of the 122 responses received, 101 complete responses from 7 institutions were considered. Faculty participation in the OSCE was high among evaluators, 94% (n=94), and administrators, 61% (n=61). The majority of respondents (62%) believed that the OSCE is best suited for competency-based education and for assessing cognitive skills (73%) and diagnostic interpretation (79%). Reliability of the OSCE can be increased by standardization of evaluators (77%), with the largest group (42%) believing that 6-8 stations are the minimum required in an OSCE. Institutional guidelines (49%) coupled with workshops (47%) were the preferred methods of preparation for the OSCE. The majority felt that the OSCE is most suitable for high-stakes exams (mean=3.37) and that it is an indispensable part of dental assessment (mean=3.78). The minimum number of stations reported for adequate reliability was lower than that reported in the literature, especially for high-stakes assessments. The logistics required to arrange an OSCE, and the difficulty of obtaining standardized patients, suggest that the OSCE should be used in select situations.

Keywords OSCE; Faculty; Knowledge; Practices; Reliability


Introduction

Assessment is important to determine whether students are achieving the outcomes designed for their respective courses and programs. Student assessment methods and strategies have undergone significant change over the years and remain in continuous flux. As the need grows to include innovative educational strategies and to focus on skill development, particularly in the dental field, methods have evolved ranging from simple pen-and-paper tests to more complex assessment strategies such as problem-based learning and standardized patients (1). While written exams are designed to assess cognitive knowledge, clinical exams may be confined to technical skills, with concerns about examiner variability and patient safety (2). The Objective Structured Clinical Examination (OSCE) is an assessment tool conceptualized by Harden in 1975 (3). From its nascent stage, the OSCE has evolved into a tool capable of assessing students across domains and skill sets.

In the broad sense, the OSCE encompasses a group of examinations that use multiple standardized stations, each of which requires candidates to use their clinical skills to complete one or more problem-solving tasks. The OSCE format often includes physical materials such as radiographs, photographs, models, and order/prescription writing (4). Variants under the broad umbrella of the OSCE include the Objective Structured Practical Examination (OSPE), the Objective Structured Long Examination Record (OSLER), and the Group Objective Structured Clinical Examination (GOSCE). Though their objectives may vary, these examinations retain all the characteristics of the original OSCE (5).

The implementation of the OSCE and its variants continues to increase in dental schools. National boards have adopted the OSCE for licensure exams, or are seriously contemplating doing so in the near future. Several guidelines have been developed on the basis of which the OSCE may be incorporated into different curricula. The design and implementation steps, however, vary, and standard procedures are often modified to suit prevailing conditions (6). The advent of the COVID-19 pandemic has affected the assessment of clinical competencies involving clinical procedures on patients (7). Dependence on the OSCE as a competency assessment under the prevailing conditions is, in all probability, set to increase.

The cornerstone of a successful OSCE is undoubtedly the faculty involved in its development and implementation. Faculty roles may range from administering the exam and setting questions to evaluating students and providing feedback. Steps taken by faculty, based on their beliefs and knowledge, may play a crucial role in the introduction and conduct of the OSCE. The aim of the present study is to assess the knowledge, attitudes, practices and observations of dental faculty members on the OSCE.

AIM

To assess knowledge, attitudes, practices and observations of dental faculty members on OSCE.

Materials and methods

This descriptive, cross-sectional study was approved by the ethical committee of Ibn Sina National College, Jeddah. The survey was conducted by means of a pre-tested questionnaire and was determined to be exempt by the ethical committee. The required sample size was calculated to be 93, assuming a 10% margin of error at a 95% confidence interval. A link to an anonymous web-based survey was created and distributed to 10 randomly selected government and private institutes (out of a total of 26) in Saudi Arabia. A reminder was sent 6 weeks after the initial round. The data gathered were confidential.
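The article does not show the sample-size calculation itself. A minimal sketch using Cochran's formula for a proportion, with an optional finite-population correction, reproduces a figure of 93 if one assumes a faculty population of roughly 2,500; that population figure is purely illustrative and is not reported in the study:

```python
import math

def cochran_sample_size(margin, z=1.96, p=0.5, population=None):
    """Cochran's sample size for estimating a proportion.

    margin: desired margin of error (e.g. 0.10 for 10%)
    z: z-score for the confidence level (1.96 for 95%)
    p: anticipated proportion (0.5 is the most conservative choice)
    population: if given, apply the finite-population correction
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2       # infinite-population estimate
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)       # finite-population correction
    return math.ceil(n0)

# 10% margin, 95% confidence: 97 without correction
print(cochran_sample_size(0.10))                    # → 97
# With an assumed (hypothetical) population of 2,500 faculty: 93
print(cochran_sample_size(0.10, population=2500))   # → 93
```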

The survey consisted of 4 categories. Part 1 covered the general characteristics of respondents and their experience in participating in or conducting OSCEs. Part 2 (6 questions) addressed knowledge on the utility of the OSCE in the curriculum and its reliability. Part 3 (5 questions) evaluated attitudes regarding the OSCE on a 5-point Likert scale (strongly agree to strongly disagree), and Part 4 (8 questions) assessed practices and observations on the OSCE through multiple choice questions (both single-answer and multiple-answer) and responses on a 5-point Likert scale. The first draft of the questionnaire was piloted with ten faculty members (who were excluded from the total), and modifications were made to its content and wording based on their suggestions to enhance content validity. Responding to the survey was taken as consent. The responses were collected electronically.

Statistical analysis: The responses were represented as number, percentage, mean and standard deviation. Statistical analysis was performed using IBM SPSS version 22. Chi-square test was applied to select questions from the questionnaire. p<0.05 was considered to be statistically significant.
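The study ran its chi-square tests in SPSS; as a self-contained illustration of the same Pearson statistic (without Yates continuity correction), the sketch below computes chi-square and, for the 2x2 case, the upper-tail p-value via the standard identity with the complementary error function. The contingency table shown is illustrative, not the study's data:

```python
import math

def pearson_chi2(table):
    """Pearson chi-square statistic and degrees of freedom for an
    R x C contingency table of observed counts (no Yates correction)."""
    rows, cols = len(table), len(table[0])
    total = sum(sum(r) for r in table)
    row_tot = [sum(r) for r in table]
    col_tot = [sum(table[i][j] for i in range(rows)) for j in range(cols)]
    chi2 = 0.0
    for i in range(rows):
        for j in range(cols):
            expected = row_tot[i] * col_tot[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2, (rows - 1) * (cols - 1)

def chi2_pvalue_1dof(chi2):
    """Upper-tail p-value of the chi-square distribution with 1 dof."""
    return math.erfc(math.sqrt(chi2 / 2))

# Illustrative 2x2 table: e.g. low/high OSCE experience vs disagree/agree
chi2, dof = pearson_chi2([[10, 20], [20, 10]])
p = chi2_pvalue_1dof(chi2)
print(round(chi2, 3), dof, round(p, 4))   # chi2 ≈ 6.667, dof = 1, p ≈ 0.0098
```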

Results

A total of 101 complete responses (out of 122 received) from 7 dental institutions were included in the study. Only respondents who had completed the questionnaire in all aspects were included in the final sample of analysed questionnaires, and responses from an institution were included only if it contributed more than 10 responses.

Male respondents outnumbered females, and the largest group of respondents was aged 36-40 (32%). Table 1 shows that a Masters degree was the most common qualification (71%), with assistant professor the most common designation (38%) and 6-10 years the most common experience bracket (32%). The responses were evenly distributed between government (48%) and private (52%) institutes. Most faculty had participated in 1-4 OSCEs (37%), but the largest share (39%) had never administered an OSCE. Table 2 shows faculty responses regarding knowledge, practices and observations (6 questions) with multiple options. Table 3A and Table 4 detail faculty practices and observations towards the OSCE, evaluated using multiple choice questions and a 5-point Likert scale, expressed as numbers and percentages of responses. Faculty attitudes towards the OSCE, evaluated on a 5-point Likert scale, are detailed in Table 5. Associations between select responses, which showed significant correlations between OSCE experience and both the utility of the OSCE in dental assessment and reliability testing being done after every exam, are presented in Table 6.

Table 1 The demographic data, gender, nationality, age, qualification, designation, degree in dental education, teaching experience, experience with OSCE, expressed in terms of numbers and percentages. 

Demographic data: Option, Number (Percentage)
Gender: Female 39 (39%); Male 62 (61%)
Nationality: Saudi 20 (20%); Non-Saudi 81 (80%)
Age: 30-35 22 (22%); 36-40 33 (32%); 41-45 22 (22%); 46-50 21 (21%); Above 51 3 (3%)
Qualification: Board 10 (10%); Masters 71 (70%); PhD 20 (20%)
Designation: Lecturer 23 (23%); Assistant professor 39 (38%); Associate professor 33 (33%); Professor 6 (6%)
Teaching experience (years): 0-5 5 (5%); 6-10 33 (32%); 11-15 31 (31%); 16-20 18 (18%); More than 20 14 (14%)
Institute: Government 52 (48%); Private 49 (52%)
How many OSCE exams have you participated in (as evaluator)? 0 6 (6%); 1-4 38 (37%); 5-8 29 (30%); 9-12 9 (9%); More than 12 19 (18%)
How many OSCE exams have you conducted (as an administrator, i.e. conducted the entire OSCE)? 0 40 (39%); 1-4 33 (33%); 5-8 16 (16%); 9-12 7 (7%); More than 12 5 (5%)

** percentages may not be 100% in all cases due to rounding.

Table 2 Faculty knowledge regarding OSCE expressed as number of respondents and percentages. 

Question: Options, Number (Percentage)
Which of the following models of curriculum is the OSCE best suited to?**
Objective based 40 (40%); Outcome based 44 (44%); Competency based 63 (62%)
Which of the following domains are best assessed by the OSCE?**
Knowledge 64 (61%); Cognitive skills 74 (73%); Interpersonal 46 (46%); IT and communication 34 (34%); Psychomotor 32 (32%)
Which of the following skills do you feel are best assessed by the OSCE?**
Patient examination 56 (55%); Case history taking 62 (61%); Diagnostic interpretation 80 (79%); Communication skills 60 (59%); Treatment planning 65 (64%); Clinical/preclinical skills 40 (40%)
Which of the following measures do you feel increase the reliability of an OSCE?**
Increasing the number of stations 37 (37%); Standardization of patients 65 (64%); Standardization (calibration) of evaluators 78 (77%); Reducing student anxiety 28 (28%); Increasing the overall duration of the exam 8 (8%)
What is the minimum number of stations you feel are needed in an OSCE?
1-5 9 (9%); 6-8 42 (41%); 9-10 35 (35%); 11-12 10 (10%); More than 12 5 (5%)
Which of the following methods were used by you while preparing for the OSCE?
I read for the OSCE myself from various sources and prepared for it 38 (38%); My department and/or colleagues explained the procedure and I adopted it 34 (34%); My institution has guidelines on how to conduct the OSCE and we followed them 49 (49%); My institution has guidelines and we attended workshops before conducting the OSCE 47 (47%); We followed the SCFHS guidelines on how to conduct the OSCE exam 13 (13%)

** percentages may not be 100% in all cases due to rounding.

** for questions with multiple responses, the responses are more than 101.

Table 3A Faculty practices and observations towards OSCE evaluated using multiple choice questions. 

Question: Options, Number (Percentage)
Which of the following teaching methods/strategies have you used to prepare students for an OSCE?
Lectures 51 (50%); Group discussions 54 (53%); Clinical/lab demonstrations 58 (57%); Case scenarios 74 (73%); Simulation 47 (46%); Clinical supervision 36 (35%)
Due to which of the following reasons would an OSCE NOT be the preferred assessment tool?**
Organizing an OSCE is a very difficult task 42 (42%); OSCE has no major benefits over the other methods of assessment 10 (10%); Clinical competency is a substitute for OSCE 38 (37%); OSCE is very expensive 16 (16%); OSCE is not being conducted in the proper way 34 (34%)

** for questions with multiple responses, the responses will not add up to 95.

** percentages may not be 100% in all cases due to rounding

Table 4 Faculty practices and observations towards OSCE evaluated on a 5 point Likert scale. 

Item Strongly Disagree Disagree Neutral Agree Strongly Agree Mean Std deviation
Reliability testing of the OSCE is done after every exam 4(4%) 14(14%) 29(29%) 45(46%) 9(9%) 3.62 0.773
Students from my institute are well prepared to attempt an OSCE exam: Since they have adequate knowledge and skills 3(3%) 3(3%) 16(16%) 65(64%) 14(14%) 3.86 0.788
Students from my institute are well prepared to attempt an OSCE exam: Since they have been sufficiently exposed to the OSCE format of exams 4(4%) 6(6%) 15(15%) 56(55%) 20(20%) 3.82 0.942
Cheating is less in OSCE compared to other forms of assessment 2(2%) 15(15%) 20(20%) 40(39%) 24(24%) 3.74 1.045
OSCE causes more stress among students than other types of assessment 4(4%) 25(25%) 27(27%) 33(32%) 12(12%) 3.24 1.001

Table 5 Faculty attitudes towards OSCE evaluated on a 5 point Likert scale. 

Item Strongly Disagree Disagree Neutral Agree Strongly Agree Mean Std deviation
OSCE can be used to cover multiple disciplines within dentistry in a single exam 11(11%) 3(3%) 13(13%) 41(40%) 33(32%) 3.79 0.921
OSCE is the most suitable high stakes assessment (Final exam, licensing exam) 8(9%) 11(11%) 27(26%) 44(43%) 11(11%) 3.37 1.068
OSCE MUST be compulsory for students to evaluate their clinical skills before they give the clinical competency exam (test) 5(5%) 6(6%) 23(22%) 54(53%) 13(13%) 3.63 0.951
OSCE can be used as a teaching tool. 4(4%) 5(5%) 16(16%) 49(48%) 27(27%) 4.01 0.808
OSCE is an essential and indispensable part of dental assessment 3(3%) 6(6%) 18(18%) 55(54%) 19(19%) 3.78 0.912

Table 6 Association between select responses with Chi-square test. 

Item How many OSCE exams have you participated in How many OSCE exams have you conducted
OSCE is the most suitable high stakes assessment (Final exam, licensing exam) 0.034* 0.836
OSCE MUST be compulsory for students to evaluate their clinical skills before they give the clinical competency exam (test) 0.179 0.798
OSCE can be used as a teaching tool. 0.163 0.773
OSCE is an essential and indispensable part of dental assessment 0.000* 0.053*
Reliability testing of the OSCE is done after every exam 0.003* 0.094*
Students from my institute are well prepared to attempt an OSCE exam: Since they have adequate knowledge and skills 0.132 0.474
Students from my institute are well prepared to attempt an OSCE exam: Since they have been sufficiently exposed to the OSCE format of exams 0.027* 0.275
A formative OSCE helps students to be better prepared for a summative OSCE 0.029* 0.758
Cheating is less in OSCE compared to other forms of assessment 0.095* 0.250
OSCE causes more stress among students than other types of assessment 0.422 0.062*

* p<0.05 was considered to be statistically significant.

Discussion

The current study collected responses from private and government dental colleges across the Kingdom, so the results may be considered fairly representative of the opinions of a diverse population. With more than 60% of the faculty having 6-15 years of experience, their opinions may represent a contemporary view of the OSCE. Only 6 respondents (approx. 6%) had never participated in an OSCE, and a substantial number (60%) had served as OSCE administrators. Together, these data suggest that the OSCE is an assessment tool used frequently in dental education, with robust faculty participation.

Responses indicate that faculty believed the OSCE is a flexible tool, capable of assessing students in diverse curriculum models, whether objective, outcome or competency based; the highest percentage of responses indicated suitability for a competency-based curriculum. A comparison with a written exam showed that the OSCE resulted in increased proficiency in the tested clinical competence (8). Indeed, the OSCE has been deemed a valuable mechanism to assess a curriculum, not only for its content but also for its effectiveness (9). A well designed OSCE is believed to be a precursor to the development of a competency-based curriculum, since it plays an important role in the evaluation process (10). The faculty responses appear to corroborate the findings of previous studies.

The national qualification framework in Saudi Arabia had previously designated 5 domains of learning: Knowledge, Cognitive skills, Interpersonal, IT and communication, and Psychomotor (11). Responses indicated that while all domains can be assessed by the OSCE, the highest responses were for the knowledge and cognitive domains. The OSCE is regarded as a ubiquitous tool capable of assessing across domains. Nevertheless, it is important to appreciate that some domains are better assessed by methods other than the OSCE. Application of knowledge in a theoretical context is best measured by MCQs (12). Similarly, the "Does" level of Miller's pyramid (13) is better assessed by work-based assessments such as the Mini-CEX or DOPS, as opposed to the OSCE, which primarily assesses the "Shows how" level in a simulated environment (14).

We probed further to evaluate specific skills or activities that may be assessed by the OSCE. The highest number of respondents believed that diagnostic interpretation is best assessed by the OSCE, while the lowest was clinical/preclinical skills. These findings are supported by previous studies, in which the authors concluded that although OSCEs are a valuable and versatile method of assessment in clinical disciplines, they are best suited to the assessment of diagnostic, interpretation and treatment planning scenarios and have limitations in the assessment of clinical operative procedures (15,16).

Several studies have been conducted on the reliability of the OSCE and have found it acceptable but not ideal (17). Amongst the measures employed to increase the reliability of an OSCE, most respondents agreed that standardization of the evaluators and standardization of the patients were essential. For example, one study reported that part-time faculty awarded higher scores than full-time faculty, emphasizing the need to calibrate evaluators (18). Indeed, the maximum drop in reliability score was accounted for by variation in student performance from station to station, probably reflecting the content and the evaluator in equal measure (19). Another major determinant of reliability is the test duration, including the time for each station and the number of stations (20). Poor reliability due to content specificity can be overcome by increasing the number of cases being tested (21) to achieve a Cronbach's alpha or generalisability value of 0.7 to 0.8 (22). The majority of faculty believed that the number of stations needed in an OSCE ranges from 6 to 10. Coupled with a large proportion of respondents agreeing that reliability testing is conducted after each OSCE, this number of stations would appear satisfactory, except for high-stakes examinations, where 14 to 20 stations have been recommended to achieve acceptable reliability (10).
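The Cronbach's alpha figure of 0.7 to 0.8 cited above can be computed directly from station-level scores. A minimal sketch of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), assuming one score column per station (the example data are illustrative, not from the study):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for internal-consistency reliability.

    scores: list of per-student tuples, one column per OSCE station (item).
    Uses population variances throughout; the choice cancels in the ratio
    as long as it is applied consistently.
    """
    k = len(scores[0])                        # number of stations (items)
    stations = list(zip(*scores))             # transpose: one tuple per station
    item_var = sum(pvariance(s) for s in stations)
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Two stations whose scores move together perfectly give alpha = 1.0
print(cronbach_alpha([(3, 4), (5, 6), (2, 3), (4, 5)]))   # → 1.0
# Weakly related stations give a lower alpha
print(round(cronbach_alpha([(2, 4), (3, 3), (4, 5), (5, 4)]), 4))
```

Values below the 0.7 threshold would, per the literature cited above, call for more stations or cases rather than a longer exam alone.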

The overall attitude regarding the OSCE is positive, and the majority agree that it is an indispensable part of dental assessment. A large proportion agreed that the OSCE is the most suitable format for high-stakes exams such as licensure examinations. The use of live patients for licensing examinations has been extensively debated (23), and arguments and resolutions for banning the practice have been made (24). Additionally, positive correlations have been reported between the written and OSCE components of a licensing exam (25) and during the transition between the preclinical and clinical parts of the curriculum (26). The literature on the OSCE, and its adoption by bodies such as the Canadian national board, appears to endorse the faculty's opinion that the OSCE should be incorporated more vigorously into high-stakes exams. It therefore appears natural that faculty wanted multiple disciplines and domains to be assessed within a single exam, probably to emulate the high-stakes format.

An institute, besides ensuring achievement of the learning outcomes, bears responsibility for preparing students for licensure exams. Faculty overall expressed confidence that their students are well prepared for an OSCE, owing to adequate skills and knowledge and to exposure to the OSCE format. A formative OSCE seems to be favoured by faculty, particularly by evaluators, as supported by previous research (27), in which case the OSCE can serve as a teaching tool as well (28,29).

While implementing an OSCE, faculty generally followed institutional guidelines supported by workshops. Faculty preparation is essential for calibrating assessment, providing feedback and planning logistics, and the resources developed by an institute can significantly affect faculty performance and, in turn, the OSCE itself. Student feedback, reliability scores and OSCE exam scores should be used by institutions to design faculty development programs.

The barriers to implementing an OSCE reported most often by faculty were the difficulty of obtaining standardized patients and the logistical difficulty of the task itself. Additionally, OSCE administrators believed that the OSCE causes more stress among students, and no significant association was found between the number of OSCEs conducted and the suggestion that a formative OSCE prepares students for a summative assessment. Together, these findings suggest that the OSCE must be used judiciously in the curriculum and should not be considered a panacea for all assessment-related problems.

Conclusion

Faculty participation in the OSCE is extensive, and faculty are well versed in its various aspects.

Faculty believe:

That the OSCE is a flexible tool that can combine multiple disciplines and domains in various curriculum models.

The OSCE assesses the "Shows how" level of Miller's pyramid and is hence important before a student actually performs a clinical procedure, in order to safeguard both the operator and the patient. Clinical skills, however, particularly in dentistry, are better assessed by work-based assessments.

Reliability testing is performed frequently, but the literature suggests that the ideal number of stations in an OSCE, particularly for high-stakes exams, may need to be increased.

Arranging standardized patients and logistics are formidable challenges in the implementation of the OSCE at the institute level.

References

Turner J.L., Dankoski M.E. Objective structured clinical exams: a critical review. Fam Med. 2008 Sep 1; 40 (8): 574-8. [ Links ]

Zayyan, Marliyya. Objective structured clinical examination: the assessment of choice. Oman Med J 2011; 2: (6) 219-22. [ Links ]

Harden R.M. What is an OSCE?. Med Teach. 1988 Jan 1; 10 (1): 19-22. [ Links ]

National Dental Examining Board of Canada. OSCE Examination. Available at: https://ndeb-bned.ca/en/accredited/osce-examination [ Links ]

Shumway J.M., Harden R.M. Association for Medical Education in Europe (AMEE) Education Guide No 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003; 25 (6): 569-84. [ Links ]

Vanka A., Wali O., Akondi B.R., Vanka S., Ravindran S. OSCE-A New Assessment Method for Pharmaceutical Education. Indian J Pharm Educ. 2018 1; 52 (4):S1-6. [ Links ]

Boursicot K., Kemp S., Ong T.H., Wijaya L., Goh S.H., Freeman K., Curran I. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020; 9. [ Links ]

Schoonheim-Klein M., Walmsley A.D., Habets L.L., Van Der Velden U., Manogue M. An implementation strategy for introducing an OSCE into a dental school. Eur J Dent Educ. 2005 ; 9 (4):143-9. [ Links ]

Zartman R.R., McWhorter A.G., Seale N.S., Boone W.J. Using OSCE-based evaluation: curricular impact over time. J Dent Edu. 2002; 66 (12): 1323-30. [ Links ]

Carraccio C., Englander R. The objective structured clinical examination: a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000; 154 (7): 736-41. [ Links ]

National Qualifications Framework for Higher Education in the Kingdom of Saudi Arabia. National Commission for Academic Accreditation & Assessment. Available at: https://www.mu.edu.sa/sites/default/files/National%20Qualifications%20Framework%20for%20HE%20in%20KSA.pdf [ Links ]

Khan K.Z., Ramachandran S., Gaunt K., Pushkar P. The objective structured clinical examination (OSCE): AMEE guide no.81. Part I: an historical and theoretical perspective. Med teach. 2013; 35 (9): e1437-46. [ Links ]

Miller G.E. The assessment of clinical skills/ competence/performance. Acad Med 1990; 65: S63-S67. [ Links ]

Baharin S. Objective structured clinical examination (OSCE) in operative dentistry course-its implementation and improvement. Procedia-Procedia Soc Behav Sci 2012; 60: 259-65. [ Links ]

Mossey P. Scope of the OSCE in the assessment of clinical skills in dentistry. Br Dent J. 2001; 190: 323-6. [ Links ]

Turner J.L. , Dankoski M.E. Objective structured clinical exams: a critical review. Fam Med. 2008; 40 (8): 574-8. [ Links ]

Park S.E., Kim A., Kristiansen J., Karimbux N.Y. The influence of examiner type on dental students' OSCE scores. J Dent Edu. 2015; 79 (1): 89-94. [ Links ]

Boulet J.R., McKinley D.W., Whelan G.P., Hambleton R.K. Quality assurance methods for performance-based assessments. Adv Health Sci Educ Theory Pract 2003; 8 (1): 27-47. [ Links ]

Newble D. Techniques for measuring clinical competence: Objective structured clinical examinations. Med Educ 2004; 38: 199-203 [ Links ]

Roberts C., Newble D., Jolly B., Reed M., Hampton K. Assuring the quality of high-stakes undergraduate assessments of clinical competence. Med Teach 2006; 28: 535-543. [ Links ]

Khan K.Z. , Gaunt K. , Ramachandran S. , Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013; 35 (9): e1447-63. [ Links ]

Gibbs A., Christ A. The Ethics of Using a Live Patient for Dental Board Exams. Available at: https://scholarscompass.vcu.edu/denh_student/17/ [ Links ]

Formicola A.J., Shub J.L., Murphy F.J. Banning live patients as test subjects on licensing examinations. J Dent Edu 2002; 66 (5): 605-9. [ Links ]

Gerrow J.D., Murphy H.J., Boyd M.A., Scott D.A. Concurrent validity of written and OSCE components of the Canadian dental certification examinations. J Dent Edu. 2003; 67 (8): 896-901 [ Links ]

Graham R., Bitzer L.A., Anderson O.R. Reliability and predictive validity of a comprehensive preclinical OSCE in dental education. J Dent Edu 2013 ;77 (2):161-7. [ Links ]

Lele S.M. A mini OSCE for formative assessment of diagnostic and radiographic skills at a dental college in India. J Dent Edu. 2011; 75 (12): 1583-9. [ Links ]

Brazeau C., Boyd L., Crosson J. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Academic medicine: J Assoc Am Med Coll. 2002 Sep; 77 (9): 932. [ Links ]

van der Vleuten C.P.M., Swanson D.B. Assessment of clinical skills with standardized patients: state of the art. Teach Learn Med 1990; 2 (2): 58-76. [ Links ]

Graham R. , Bitzer L.A. , Mensah F.M., Anderson O.R. Dental student perceptions of the educational value of a comprehensive, multidisciplinary OSCE. J Dent Edu 2014; 78 (5): 694-702. [ Links ]

Received: March 19, 2021; Accepted: April 24, 2021; pub: June 23, 2021

Creative Commons License This is an open-access article distributed under the terms of the Creative Commons Attribution License