Health Science Journal

  • ISSN: 1791-809X
  • Journal h-index: 61
  • Journal CiteScore: 17.30
  • Journal Impact Factor: 18.23
  • Average acceptance to publication time: 5-7 days
  • Average article processing time: 30-45 days
      • Fewer than 5 volumes: 30 days
      • 8-9 volumes: 40 days
      • 10 or more volumes: 45 days
Indexed In
  • Genamics JournalSeek
  • China National Knowledge Infrastructure (CNKI)
  • CiteFactor
  • CINAHL Complete
  • Scimago
  • Electronic Journals Library
  • Directory of Research Journal Indexing (DRJI)
  • EMCare
  • OCLC- WorldCat
  • MIAR
  • University Grants Commission
  • Geneva Foundation for Medical Education and Research
  • Euro Pub
  • Google Scholar
  • SHERPA ROMEO
  • Secret Search Engine Labs
Abstract

Software-Assisted Identification and Improvement of Suboptimal Multiple Choice Questions for Medical Student Examination

Gerovasili Vasiliki, Filippidis Filippos T, Routsi Christina and Nanas Serafim

Background: Multiple choice questions (MCQs) are often used to assess student achievement. Question content is typically chosen by the tutor according to their own judgment and experience. We aimed to develop an evaluation program for MCQs used in medical student examinations.

Method and Material: Specifically designed software was developed utilizing a database of all MCQs used to examine medical students. We evaluated 220 multiple choice questions administered to a population of 497 students. For each question, the Difficulty and Discrimination indices were calculated. The Discrimination index represents a question's discrimination ability, i.e., whether students who perform well overall also tend to answer it correctly. A logistic regression model was fitted to assess the association between the Difficulty and Discrimination indices. Nineteen questions with a Discrimination index lower than 0.20 were modified and given to 140 students.

Results: Of the 220 questions, 37 (16.8%) were of recommended difficulty, while 30 (13.6%) were of "high difficulty - not acceptable" and 54 (24.5%) of "high facility - not acceptable". Seventy-three questions (33.2%) were of excellent discrimination, while 53 (24.1%) were of bad discrimination. Questions that were too easy or too difficult were less likely to be of good/excellent discrimination (odds ratio = 0.18). The mean Discrimination index of the 19 modified questions improved significantly, from 0.06 to 0.26 (p<0.001).

Conclusions: Choosing MCQs according to the tutor's judgment alone is not sufficient to create an objective system for evaluating students' achievement. The use of specifically designed software can help identify and improve flawed questions.
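The abstract does not give the formulas behind the two item-analysis statistics, so the sketch below uses the standard classical-test-theory definitions: the Difficulty index as the proportion of students answering correctly, and the Discrimination index via the common upper-lower-group method, D = (U - L) / n. The function names, the group fraction, and the example data are illustrative assumptions, not the authors' actual software.

```python
def difficulty_index(responses):
    """Proportion of students answering the item correctly (0 to 1).
    responses: list of 0/1 correctness flags, one per student."""
    return sum(responses) / len(responses)

def discrimination_index(scores_and_responses, group_fraction=0.27):
    """Upper-lower Discrimination index D = (U - L) / n, where U and L are
    the numbers of correct answers on this item in the top- and bottom-
    scoring groups (each holding group_fraction of students) and n is the
    group size. The 27% cut is a conventional choice, not from the paper.
    scores_and_responses: list of (total_exam_score, item_correct) pairs."""
    ranked = sorted(scores_and_responses, key=lambda x: x[0], reverse=True)
    n = max(1, int(len(ranked) * group_fraction))
    upper = sum(correct for _, correct in ranked[:n])
    lower = sum(correct for _, correct in ranked[-n:])
    return (upper - lower) / n

# Hypothetical example: 10 students; the item is answered correctly
# mostly by the higher-scoring students, so it discriminates well.
data = [(95, 1), (90, 1), (85, 1), (80, 1), (75, 0),
        (70, 1), (65, 0), (60, 0), (55, 0), (50, 0)]
print(difficulty_index([correct for _, correct in data]))  # 0.5
print(discrimination_index(data, group_fraction=0.3))      # (3 - 0) / 3 = 1.0
```

Under this formulation, a Discrimination index below 0.20 (the threshold the authors used to flag the 19 questions for revision) means the item is answered almost as often by low scorers as by high scorers.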