
Testing... One, Two, Three: Can Online Multiple-Choice Exams Make the Grade?

By Lisa Neal / October 2005

My hobby, I'm embarrassed to admit, is taking free online courses. My primary motivation is seeing how organizations design courses on different topics for a variety of target audiences. I recently took a course offered for CMEs (Continuing Medical Education credits), which US physicians are required to earn. This was a course on Hepatitis, about which I knew little. Not only did I pass, but I passed without actually taking the course. Does this mean that I should have become a physician? That I'm a good test taker? Or, more disturbingly, that physicians who complete such courses might know about as much about Hepatitis as I do?

This particular course started off with instructions that told me there was a test at the end, and provided a link to that test. I did notice that I was invited (though in a smaller, non-bolded font) to proceed directly to the questions. (Anyone else guilty of skimming or skipping the instructions?)

I started reading the dense material and browsing the many links and references. While designing for physicians is different from designing for a general learner population not accustomed to as much reading and studying, I had trouble understanding what made this an online course rather than an online article.

Since I couldn't understand much of the material, I scrolled through the course and discovered the test at the end. What a challenge: given how hard it is to develop truly comprehensive multiple-choice questions, could I pass the test without reading the material, and, further, without a medical background?

On my first try I got five questions right out of six. Helpful explanations were provided for the reasoning behind each of my correct answers. The wrong one was simply marked "incorrect." I tried again and got it right the second time.

Admitting in advance to having no special expertise in developing multiple-choice questions, I have written a test for my readers below.

Was the Hepatitis course:

1) Well-designed for the target audience, and I am being too picky about the question design

2) Poorly designed since it wasn't testing an understanding of the content

3) Adequately designed given the difficulty and expense of the alternative: asking open-ended questions which a human has to grade

4) All of the above

5) None of the above

Personally, while I could argue for one, two, or three depending on my mood, I believe two is the right answer. (I added four and five only because that was how I learned to write test questions.)

I learned test-taking strategies for SATs, which US high school students take for college applications. Children in the US and many other countries learn test-taking skills at younger ages now due to standardized testing. Does this mean that, as these children grow, they will be even better than I was at passing tests without demonstrating any knowledge gain?

Common sense and good test-taking skills shouldn't be enough to earn course credit. I acknowledge that creating multiple-choice questions that demonstrate deep understanding is hard. But if it's too hard to do well when it matters, then use open-text questions and put the effort into making sure that students actually know something about the topic for which they are receiving credit.

About the Author
Lisa Neal is Editor-in-Chief of eLearn Magazine and an e-learning consultant.
