The online bar exam should not be easier, but competent work may look different than before because exam timing and tools changed.
Thousands of examinees successfully completed the novel October 2020 online bar exam. Hundreds of graders and bar examiners are now evaluating the assessment artifacts (essay answers, Performance Test answers, and MBE-style multiple-choice question responses) generated by the first-ever online bar exam. These artifacts are the evidence that will establish (or not) the minimum competency of these pioneer online examinees. Here is what the states should keep in mind.
The mechanisms of testing make a difference in the type and quality of artifacts generated under those testing conditions. Bar examiners should not expect examinee performance identical to that of the past, because the conditions, timing, and tools were different. To use a sports analogy, cutting the length of a basketball game would mean fewer shots taken and made, so team scores would be lower; it would not mean the players were lesser athletes. Likewise, changing the mechanics of a testing environment means competent performance may look different than it did before the change. Graders will have to look more closely, with a fresh perspective and an open mind, at the quality and content of answers produced under these testing conditions, not just at how these answers compare to those produced by past examinees under different conditions.
Let’s look at what changed:
Essay Scoring: Splitting the exam into shorter segments affected examinees' time-allocation strategies for the essay sections, and the tools for making notes and outlining answers changed radically. The effect of those changed mechanics is not yet well known; we have only just received the first set of exams to evaluate for that difference. Graders should, therefore, set expectations for this exam based on the testing conditions that actually applied. The fact that most examinees worked with unfamiliar tools and interfaces -- for example, online "scratch paper" -- for the first time, without good preparation and practice tools available to them, might mean, for example, that competent answers are somewhat shorter this time. Graders are still looking for the same thing -- minimum competence -- but the level of artifact quality that demonstrates minimum competence in a different testing setting must be realistic for that environment.
Performance Test Scoring: The essay comments above may apply even more strongly to the Performance Test, depending on the environment the exam provided. If, as in California, the examiners permitted physical scratch paper, that is considerably more akin to the customary testing environment than one requiring only limited digital scratch paper on an exam as complex as the PT. But… reviewing complex and lengthy documents by scrolling through computer windows, rather than flipping through or taking apart the traditional physical PT test packet, may take more time and/or make it harder to track down some types of information. We have only begun to understand how these changes affect performance, so graders should take that into account when "leveling" their expectations of the length and complexity of PT answers deemed "minimally competent."
MBE-Style Multiple Choice Question (MCQ) Scoring (we can’t call them “MBEs”): Like the essays and PTs, the mechanics of taking the MCQs changed dramatically for most test takers. Those who took online bar review courses had some experience with this environment, but it had never been used in bar exam testing before, and the exact test-taking interface was unavailable and unfamiliar to most examinees until a few weeks before the exam (when they had only a session or two to practice with it). One might, therefore, find that a competent examinee on this rigidly timed test misses a few more questions than the same applicant would in a traditional paper testing environment, where the mechanics are more familiar. Or not; we simply don’t know yet. It will be critical for the states to carefully re-evaluate their expectations about the score that demonstrates competence on the multiple-choice portion, and about the MCQs as a component of the overall score. Expectations may not change, but the question must be asked and answered with these testing conditions in mind.
The online exam will be just as rigorous and valid a measure of competence as the traditional exam if states properly account for these differences in grading and scoring. Minimum competence is still the standard, but we need to know whether the new mechanics require any adjustment to our expectations. (For example, can anyone write as much as in the past in answer to the PT, given the time it takes to manage the various file and library materials in a wholly online setting? It may be that there simply isn’t the time.) In the years ahead, once we understand the effects of the new mechanics and make the needed adjustments, this will all become as familiar and routine as the traditional system of in-person exams. For now, however, admissions authorities across the country should approach the grading and scaling of these October 2020 exams with due caution and respect for the impact of the changes in the testing environment.
Both New York and California have already announced an online option or a fully online exam for February 2021, and the NCBE has announced plans to make a full, proper MBE available for online testing in February 2021.
Greg Brandes is Dean of St. Francis School of Law, a non-profit online law school accredited by the Committee of Bar Examiners of the State Bar of California. He has written before on online bar exams, and is an expert in curriculum and assessment for law schools and the bar exam. Visit his SSRN page at: https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=4091929 and St. Francis School of Law at: http://stfrancislaw.com.
©2020 Gregory J. Brandes. All Rights Reserved.