
Academic Success Interventions Literature Review


Academic Success Interventions

 

How do legal writing and analysis interventions influence academic and bar success?

A few studies have explored the impact of taking legal writing and analysis courses on academic success and bar passage. In each case, the study assessed an intervention targeting students at the bottom of the class. Only one study described and assessed the pedagogy employed in the courses; it explored three different courses aimed at improving student performance on law school exams. Two of the three courses, Legal Reasoning and Legal Analysis, employed self-critique as a metacognitive tool to improve students’ writing skills. The studies that did not describe course pedagogy focused generally on the performance of course-takers compared to a control group.

Each study asserts that these legal writing and analysis interventions, targeted at students with the lowest class ranks, are effective in boosting law school academic performance or bar passage.


Florida International University College of Law

Florida International University College of Law’s Academic Excellence Program incorporates three legal writing and analysis academic success interventions: a first semester Introduction to the Study of Law course, a Legal Reasoning course in the second semester, and a Legal Analysis course in the third semester.


Introduction to the Study of Law is a required course for all students in their first semester of law school. The first unit of the course teaches law school study skills (e.g., outlining and time management), the second unit teaches legal analysis, and the final unit teaches how to prepare for law school exams.[1]

Legal Reasoning is a second semester course open to the bottom 20% of the class and focused on teaching legal analysis.[2] The articles do not state whether this is a required course. Students write and self-critique five essays over the semester, each modeled after an exam essay they could encounter in their required first-year doctrinal courses. Self-critique is used to “engage metacognition,” making students reflect on whether they understand a concept and consider their next steps if they do not.[3]

Legal Analysis is a third semester course, also designed for the bottom 20% of the class.[4] The articles do not state whether this is a required course. It is similar in structure to the Legal Reasoning course, with students writing and self-critiquing five essays, but the essays are based on the Evidence course.

Schulze reports that students in the Introduction to the Study of Law course who “participate… regularly” and “complete the exercises” perform better on first-semester exams than students who do not. The article does not define these terms.[5]

Similarly, students in the Legal Reasoning course who “work diligently” outperform similar students who do not take the course. The article does not define this term.[6]

Finally, students in the Legal Analysis course perform better in their Evidence and other courses.[7]

 


[1] Louis N. Schulze, Jr., Using Science to Build Better Learners: One School’s Successful Efforts to Raise Its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230, 235 (2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2960192.

[2] Id.; Raul Ruiz, Leveraging Noncognitive Skills to Foster Bar Exam Success: An Analysis of the Efficacy of the Bar Passage Program at FIU Law, 99 Neb. L. Rev. 141, 177 (2020), https://heinonline.org/HOL/P?h=hein.journals/nebklr99&i=146.

[3] Louis N. Schulze, Jr., Using Science to Build Better Learners: One School’s Successful Efforts to Raise Its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230, 235 (2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2960192.

[4] Id.; Raul Ruiz, Leveraging Noncognitive Skills to Foster Bar Exam Success: An Analysis of the Efficacy of the Bar Passage Program at FIU Law, 99 Neb. L. Rev. 141, 177 (2020), https://heinonline.org/HOL/P?h=hein.journals/nebklr99&i=146.

[5] Louis N. Schulze, Jr., Using Science to Build Better Learners: One School’s Successful Efforts to Raise Its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230, 235 (2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2960192.

[6] Id.

[7] Id.


Johns

University of Denver Sturm College of Law requires students with a 1L GPA of 2.6 or lower to take an Intermediate Legal Analysis course in their 2L year.[8] Other students may take the course as an elective. The course focuses on helping students perform better on law school exams, teaching legal analysis and writing skills along with some review of the first-year curriculum.


Johns analyzed bar passage data for 642 Denver students who graduated in May of 2008, 2009, or 2010 and took the Colorado bar exam that July. In the sample, 8.4% took the Intermediate Legal Analysis course.[9]

Graduates with a 1L GPA at or below 2.9 who took Intermediate Legal Analysis passed the bar exam at a higher rate than those in that 1L GPA range who did not participate in this intervention.[10]

Johns also utilized linear regression analysis to test the relationship between the Intermediate Legal Analysis course, LSAT score, law school GPAs, and bar exam score. The variables tested were: age; sex; underrepresented minority status; enrollment status; LSAT; 1L GPA; final GPA; and participation in the intervention.[11] Johns found that taking the 2L Intermediate Legal Analysis course was not a statistically significant predictor of bar exam score.[12]
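
The article does not reproduce Johns’s model beyond the list of variables, but a minimal sketch of a regression along the lines described, using statsmodels and hypothetical column names, might look like this:

```python
# Illustrative sketch only: a regression of the kind Johns describes.
# Column names and the input file are hypothetical; the article's actual
# coding of variables may differ.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed: one row per graduate with the variables Johns lists.
df = pd.read_csv("denver_bar_outcomes.csv")  # hypothetical file

model = smf.ols(
    "bar_score ~ age + C(sex) + C(urm_status) + C(enrollment_status)"
    " + lsat + gpa_1l + gpa_final + C(took_ila)",
    data=df,
).fit()

# The coefficient and p-value on took_ila correspond to the reported finding
# that course participation was not a statistically significant predictor.
print(model.summary())
```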


[8] Scott Johns, Empirical Reflections: A Statistical Evaluation of Bar Exam Program Interventions, 54 U. Louisville L. Rev. 35, 36 (2016), https://heinonline.org/HOL/P?h=hein.journals/branlaj54&i=41.

[9] Id. at 48–49.

[10] Id. at 54.

[11] Id. at 57.

[12] Id. at 59–60.


Zeigler et al.

New York Law School offers a second semester Principles of Legal Analysis course that is required for the bottom 40 students in each section after the end of their first semester (first year for evening students).[13] The course focuses on legal writing and analysis skills (identifying relevant facts and correctly applying legal rules to those facts) and incorporates weekly essay exams.[14] This course was originally developed as an experimental section: the bottom 40 students in one class section were required to take the course, while the bottom 40 students in the other two sections were not offered the course and served as a control group.[15] The course was taught as an experimental section for three spring semesters (1999–2001).


During this period, New York Law School also implemented other policy changes intended to improve bar passage rates. All law school exams were required to be closed book, to better mirror the New York bar exam, and professors were encouraged to include multiple-choice questions in their exams.[16]

When evaluating the experimental Principles of Legal Analysis course, Zeigler et al. verified that the LSAT scores and first-semester GPAs of the experimental section and the control groups were similar and found no significant differences. By the end of the second semester, after the intervention, the experimental section had a mean GPA of 2.51, while the control groups had mean GPAs of 2.26 and 2.21. The experimental section earned higher 1L GPAs in the next two intervention cohorts as well.[17] Zeigler et al. note that the experimental group had a lower attrition rate than the control groups, and the GPA difference may have been even larger without that difference.[18] The 1999 experimental section also passed the bar at a much higher rate than their control group peers (69% versus 55% and 36%).[19]


[13] Donald H. Zeigler et al., Curriculum Design and Bar Passage: New York Law School’s Experience, 59 J. Legal Educ. 393, 401 (2010), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1551477.

[14] Id. at 396–97.

[15] Id. at 397–98.

[16] Id. at 398.

[17] Id. at 397–98.

[18] Id. at 400.

[19] Id.


How do metacognitive and study skill interventions influence academic success?

A few studies have examined the impact of metacognitive interventions in the first or second year of law school on academic success. Metacognitive interventions are designed to teach students how to become better learners, providing them with knowledge of study skills and strategies and coaching them on how best to use them. Two of these studies examined the effects of specific strategies (class note-taking and receiving individualized feedback on exams and assignments), while two examined the effects of teaching metacognitive and self-regulated learning strategies as part of a law school course.

The studies examining the incorporation of metacognitive instruction into law school courses assert that these interventions can improve academic performance.

The study examining the effects of note-taking methods found that handwriting class notes instead of typing them on a laptop improved academic performance: students who took handwritten notes earned course GPAs averaging 0.2 points higher than those of students who typed their notes. The study exploring the effects of giving students individualized feedback also found evidence that it enhanced academic performance, with an average 0.134-point increase in course GPA for those who received feedback.


Florida International University College of Law

Florida International University College of Law’s Academic Excellence Program incorporates three academic success interventions: a first semester Introduction to the Study of Law course, a Legal Reasoning course in the second semester, and a Legal Analysis course in the third semester.


All three interventions were designed to promote metacognition and self-regulated learning among students.[20] Schulze writes that he expressly teaches 1L students to synthesize, outline, and self-test the knowledge obtained from their weekly readings.[21] Other strategies used and taught in the courses include retrieval practice, spaced repetition, and cognitive schema theory.[22]

Schulze reports that students in the Introduction to the Study of Law course who “participate… regularly” and “complete the exercises” perform better on first-semester exams than students who do not. The article does not define those terms.[23]

Similarly, students who take the Legal Reasoning course and “work diligently” outperform similar students who do not take the course. The article does not define this term.[24]

Finally, students who take the Legal Analysis course perform better in their Evidence and other courses.[25]


[20] Louis N. Schulze, Jr., Using Science to Build Better Learners: One School’s Successful Efforts to Raise Its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230, 232 (2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2960192.

[21] Id. at 240.

[22] Id. at 241, 243, 247.

[23] Id. at 235.

[24] Id.

[25] Id.


Gundlach and Santangelo

Gundlach and Santangelo examined the effects of metacognitive skills on success in law school.[26] Their sample was 129 1L students at the Maurice A. Deane School of Law at Hofstra University, all of whom were enrolled in two sections of a required first-semester Civil Procedure course.[27] One section received a metacognitive intervention, while the other served as a control group. The intervention and control sections had similar race and gender demographics, LSAT scores, and undergraduate GPAs.[28]


The researchers administered two quantitative surveys, Schraw and Dennison’s Metacognitive Awareness Inventory and Pintrich and De Groot’s Motivated Strategies for Learning Questionnaire, to students in both the control and intervention groups. Both surveys asked Likert-scale questions about students’ knowledge and use of various metacognitive strategies and were administered at the beginning and end of the semester.[29]

They also administered four qualitative, narrative-response questionnaires mid-semester, around exams, asking about law school-specific learning strategies.[30] The questionnaires were coded based on the student’s knowledge of and willingness to use learning strategies, ranging from “Not Engaging” to “Developing”.[31] Students were then assigned a Global code reflecting their development over the semester based on all four questionnaires.[32]

The intervention section received instruction about metacognition and learning strategies in their first class of the semester, and refreshers on those skills after midterm exams. The intervention section was also tested (non-graded) more frequently on their understanding of the materials than the control section.[33]

The researchers used class quartile and z-scores on the course exams (the number of standard deviations a raw exam score falls above or below the mean) to evaluate the relationship between the two quantitative survey scores and academic performance.[34] They found no relationship between metacognition scores on the surveys (at either the beginning or the end of the semester) and performance on the course exams.[35]
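
For readers unfamiliar with the transformation, a brief sketch of the z-score calculation described above, using hypothetical exam scores:

```python
# Sketch of the z-score transformation described above (hypothetical data).
import numpy as np

raw_scores = np.array([72.0, 85.0, 90.0, 64.0, 78.0])  # hypothetical exam scores

# z-score: number of standard deviations a raw score falls above or below the mean
z_scores = (raw_scores - raw_scores.mean()) / raw_scores.std(ddof=1)
print(z_scores)
```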

The researchers did find a positive relationship between a student’s Global code and their exam z-scores, although the relationship was smaller than that between LSAT score and exam z-score.[36]

As for the impact of the intervention on metacognitive skill development, the researchers found no difference in change between the intervention and control group survey scores from the beginning to the end of the semester.[37] The researchers theorized that many students in the control group may have been exposed to metacognitive strategy instruction through their peers, academic support faculty, or even through the administered questionnaires.[38]
 


[26] Jennifer A. Gundlach & Jessica R. Santangelo, Teaching and Assessing Metacognition in Law School, 69 J. Legal Educ. 156, 163–64 (2019), https://heinonline.org/HOL/P?h=hein.journals/jled69&i=167.

[27] Id.

[28] Id. at 165–66.

[29] Id. at 167.

[30] Id. at 167–68.

[31] Id. at 171–73.

[32] Id.

[33] Id. at 168–69.

[34] Id. at 178–79.

[35] Id.

[36] Id. at 179–80.

[37] Id. at 183.

[38] Id.


Murphy et al.

Evaluating the theory that laptops introduce distractions into the classroom and hamper learning, Murphy et al. examined the use of laptops versus handwritten notes in law school courses, and their relationship with academic performance.[39] Data came from 113 2L students in four sections of Constitutional Law and Evidence courses taught in Fall 2016 at Roger Williams University School of Law.[40] 2L students were chosen for the study because the researchers wanted a study population that had all been exposed to the same previous curriculum.[41] Each section was taught by a different professor. The sample was divided into four groups (Hand-writers/Memo, Hand-writers/No Memo, Computer/Memo, Computer/No Memo), with students in the Memo groups having previously read a memo advising against laptop usage in a 1L course.[42]


The authors used t-tests to determine statistical significance, along with difference-in-differences methods.[43]
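
The article’s exact computations are not reproduced here; a minimal sketch of a two-group t-test and a simple difference-of-group-means comparison, with hypothetical grades, might look like this (it is not the authors’ actual specification):

```python
# Illustrative sketch only; group labels and grade values are hypothetical and
# do not reflect the Murphy et al. data or their exact model.
import numpy as np
from scipy import stats

# Course grades (hypothetical) for the four groups described in the text.
hand_memo = np.array([3.4, 3.1, 3.6, 3.2, 3.5])
laptop_memo = np.array([3.1, 2.9, 3.3, 3.0, 3.2])
hand_no_memo = np.array([3.2, 3.0, 3.4])
laptop_no_memo = np.array([3.0, 2.9, 3.1, 3.2])

# Two-sample t-test comparing hand-writers and laptop users in the Memo groups.
t_stat, p_value = stats.ttest_ind(hand_memo, laptop_memo)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Difference-in-differences as a difference of group-mean differences:
# (hand-writers minus laptop users) in the Memo groups versus the No Memo groups.
did = (hand_memo.mean() - laptop_memo.mean()) - (
    hand_no_memo.mean() - laptop_no_memo.mean()
)
print(f"difference-in-differences estimate: {did:.3f}")
```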

Controlling for mean LSAT score in each group, Murphy et al. found that hand-writers outperformed laptop users in the courses. The groups were not controlled for 1L GPA, although Murphy et al. note that future studies should consider doing so because of its predictive effects.[44] The mean GPA for hand-writers was 0.2 points higher than that of laptop users in the Memo groups. Results for the non-Memo groups were not statistically significant, likely due to the small number of non-Memo hand-writers.[45] When results were disaggregated by LSAT quartile, hand-writers outperformed laptop users in every quartile, although the difference was statistically significant in only half the combinations.[46]


[39] Colleen P. Murphy et al., Note-Taking Mode and Academic Performance in Two Law School Courses, 69 J. Legal Educ. 207, 210–16 (2019), https://dx.doi.org/10.2139/ssrn.3134218.

[40] Id. at 217.

[41] Id. at 227.

[42] Id. at 218.

[43] Id. at 220–21.

[44] Id. at 227.

[45] Id. at 223.

[46] Id. at 224.


Schwarcz and Farganis

Schwarcz and Farganis examined the effects of receiving individualized feedback during the first year on law school performance, under the theory that formative feedback would enable students to better evaluate their own knowledge and progress.[47] Individualized feedback was defined as assignments or exams given before the conclusion of a course on which students received individual grades, comments, or oral feedback.


University of Minnesota Law School has 4–5 1L sections of 40–50 students who take all their classes together, but for scheduling reasons two sections sometimes take a specific course together. The authors identified professors who provide individualized feedback in their single-section courses, then treated the double-section courses as an experiment, comparing the half of the class that had received individualized feedback in its single-section courses with the half that had not.[48]

The authors identified eight double-sections with split feedback experiences between 2011 and 2015, containing 541 students.[49] The sample had the same mean LSAT and UGPA as the law school generally, and significance testing found no statistically significant differences between students in the paired sections.[50]

In all eight double-section courses, the feedback section outperformed the no-feedback section. The difference was not statistically significant within each individual course, but it was when the courses were aggregated.[51] The performance disparity was largest among the lowest-performing students.[52]

The authors performed linear regression analysis using distance-to-mean grade as the dependent variable.[53] They found that feedback is a statistically significant and positive predictor of course performance, with students who receive feedback predicted to see course grade improvements of 0.134 points.[54] The feedback effect was larger for students with below-median LSAT and UGPA.[55]
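
The authors’ code and variable definitions are not published in the article; a minimal sketch of a distance-to-mean regression of the kind described, with hypothetical column names and an assumed set of controls, might look like this:

```python
# Sketch of a distance-to-mean regression of the kind described above;
# the input file, column names, and controls are assumptions, not the authors' code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("double_section_grades.csv")  # hypothetical file

# Distance-to-mean grade: each grade relative to the mean grade in its course.
df["dist_to_mean"] = df["grade"] - df.groupby("course_id")["grade"].transform("mean")

# Regress on a prior-feedback indicator plus baseline controls.
model = smf.ols("dist_to_mean ~ C(received_feedback) + lsat + ugpa", data=df).fit()

# A positive, significant coefficient on received_feedback would correspond to
# the reported ~0.134-point course grade improvement.
print(model.summary())
```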

The authors propose that receiving feedback in one course helps students develop law school skills, including exam-taking skills, that they are able to take into their other courses.[56]


[47] Daniel Schwarcz & Dion Farganis, The Impact of Individualized Feedback on Law Student Performance, 67 J. Legal Educ. 152 (2017), https://dx.doi.org/10.2139/ssrn.2772393.

[48] Id. at 152–54.

[49] Id.

[50] Id. at 157.

[51] Id. at 158–59.

[52] Id. at 160.

[53] Id. at 162.

[54] Id. at 163–64.

[55] Id. at 165.

[56] Id. at 171.

 
