Using Artificial Intelligence (AI) to Learn the Law for the Bar Exam

A bar taker asked me to explain supplemental jurisdiction with an example. At 1:27 in the morning.

I was able to send her a response at 1:36 AM (9 minutes later)—without having a working knowledge of supplemental jurisdiction, and without flipping through outlines or Googling hypos.

How? I’ll show you below. (With examples using supplemental jurisdiction, Contracts remedies, and rule against perpetuities.)

The problem: When someone asks me about some law, I have to research it and explain it.

The same goes for you: if you’re confused about how a rule works, you have to make sense of it theoretically and look at examples and hypos. How are you going to memorize (and use) the rule without understanding it?

This takes time and mental processing. And I don’t have the energy, especially this late at night. Plus, I’m not a tutor, and people for some reason seem to think they can ask me random questions about supplemental jurisdiction instead of the tutor they pay thousands of dollars for (but I have a soft spot for people struggling with this death ritual).

Here’s an idea you can try for yourself if you’re stuck and not understanding a bar exam rule.

Ask ChatGPT to explain the rule (supplemental jurisdiction example).

If you don’t know what ChatGPT is, it’s a large language model trained to interact with you in a conversational way. It’s all the rage.

If she was going to treat me like some kind of robot, then I was going to do the same…

I copied and pasted her question into ChatGPT verbatim. Here’s the answer that ChatGPT generated about supplemental jurisdiction:

Using AI (ChatGPT) to explain a rule with an example
Text version

Supplemental jurisdiction is a concept in federal court that allows a court to hear additional claims in a case that are related to the main claim, but do not necessarily fall within the court’s original jurisdiction. In general, a federal court will only hear a case based on supplemental jurisdiction if the main claim falls within the court’s original jurisdiction. If the main claim does not fall within the court’s original jurisdiction, the court will not have the authority to hear the case and will not be able to consider any additional claims under supplemental jurisdiction.

For example, suppose that a plaintiff is trying to avoid complete diversity in a case, which is a requirement for a federal court to hear a case based on diversity jurisdiction. In this situation, the plaintiff may try to bring additional claims under supplemental jurisdiction in order to keep the case in federal court. However, if the main claim does not fall within the court’s original jurisdiction, the court will not be able to hear the case and the plaintiff will not be able to avoid complete diversity.

What do you think?

Way better than something I could have come up with by doing manual Google searches for supplemental jurisdiction hypos and rule statements (and cross-referencing with Magicsheets).

Best of all, it saved me time and patience! Sometimes I just don’t have the bandwidth because of a full-time job that keeps me up late. When I have to stay up until 4-5 AM, that technically makes it a 9-to-5 (AM).

Thanks to ChatGPT, the conversation ended in understanding:

ChatGPT is like a butler, a skilled assistant.

Obviously, review the answer for red flags (ChatGPT can be confidently wrong about the law) before accepting it as gospel.

Also, form a good question. Like with many things, the quality of the output depends on the quality of the input. (Below, I’ll share a prompt you can give it to set the context.)

And be nice to the robot. Think of it as asking an assistant for a draft.

I used the generated answer to refresh myself and send her a revised version of the second paragraph.

(Because it would be overwhelming to dump the whole thing on her… which is rich coming from someone who sends 1,000+ word coaching emails, MBE practice questions, and case studies every week)

But this is a QUICK way to leverage AI to explain a rule and even get an example as a good starting point.

Here’s another example, this time about Contracts remedies and third-party assignments.

A “question” that has five questions across two topics if you parse it out

Am I back in law school with these Socratic questions? You’re lucky I have the patience of a saint and customer service that rivals Zappos.

(Please don’t abuse my people-pleasing tendencies. I’m just a one-man operation.)

"Dude, I don't know if you have a team or what, but I would've given you the money I gave Helix two times over in a heartbeat. I have a hard time buying into motivational speech type stuff and aside from Jon Grossman, Adaptibar's videos (especially repeat ones) were full of shit to me. Helix's live sessions and videos were the same. The motivation in the passer's playbook and your emails felt real and I found myself actually taking it to heart."

The silver lining is that the questions were quite specific. Setting boundaries and parameters works well with ChatGPT (see below for more on prompt engineering).

ChatGPT, being a fabulous assistant, answered all questions in 9 paragraphs. What a champ!

Yes, it was a rather long answer, but it was like an organized research memo, much better than me manually researching and digging through multiple articles and cross-referencing my outlines.

So despite the person expecting heavy lifting from me (again, not a tutor), and despite having no obligation to answer, I was able to educate myself with ChatGPT’s answer and, in turn, respond without too much stress or deep research.

If there’s one thing I have standards for in life, it’s customer satisfaction.

What about explaining Evidence hearsay rules like you’re 10?

“Jarvis, ELI5 hearsay rules and exceptions.”

Update: The user deleted the image, but they asked ChatGPT to explain hearsay evidence as if they were 10 years old. ChatGPT came back with an explanation of hearsay and examples of a few types of exceptions.

This is (was) pretty good! Analogies are a great way to understand a concept. Why do we need tutors (let alone me having to answer questions at 1 AM) at this point?

One way to improve this question would be to narrow it down to specific exception(s). Otherwise, it might pick random examples. See below for more on prompt engineering.

Here’s another example of using ChatGPT to explain the infamous rule against perpetuities (RAP).

How you can use ChatGPT to learn the law

If you’re stuck on a concept, try it out: https://chat.openai.com/chat

You’ll need an account, but it’s free. I encourage you to try this if you’re in need of answers to substantive questions in short order.

You can dive right in and ask a question like the examples above. What do you think about the answers you get?

If you need more clarification, respond with another question. You can also highlight a specific portion of its answer and ask about it.

Prompt engineering (good vs. bad questions)

I like to set the context with some initial background before asking questions.

Here’s the initial prompt I provided ChatGPT, which you can swipe and modify as desired. It might help set the context for your new robo-buddy:

“You are BarGPT, an expert in legal principles and issues tested on the California Bar Exam and the Uniform Bar Exam (UBE), each of which includes the Multistate Bar Exam (MBE). When I ask a question, I would like to hear an answer according to current law (or as up to date as your data allows). Any questions before we start?”
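An aside for the programmers in the room: if you’d rather script this than paste it into the chat page every time, here’s a minimal sketch using OpenAI’s Python library. This is my own addition, not something from the screenshots above, and the model name is a placeholder assumption, so check the current docs for what’s available. The context-setting prompt goes in as the “system” message, and your actual question follows as the “user” message:

```python
# Minimal sketch: the same context-setting idea as the chat page, via the API.
# Assumes the openai Python package (v1+) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SETUP = (
    "You are BarGPT, an expert in legal principles and issues tested on the "
    "California Bar Exam and the Uniform Bar Exam (UBE), each of which includes "
    "the Multistate Bar Exam (MBE). When I ask a question, I would like to hear "
    "an answer according to current law (or as up to date as your data allows)."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; swap in whatever model you have access to
    messages=[
        {"role": "system", "content": SETUP},  # the context-setting prompt
        {"role": "user", "content": "Explain supplemental jurisdiction with an example."},
    ],
)

print(response.choices[0].message.content)
```

Either way, web page or script, the point is the same: give it the context up front so its answers stay anchored to bar-tested law.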

Remember that good answers require good questions. A good question doesn’t guarantee a good answer, but it increases the odds because it leaves less room for (mis)interpretation.

Good questions often have specific boundaries and parameters. This is called “prompt engineering.”

The examples above at least had some specificity to their questions. We can debate whether there is such a thing as a stupid question, but there are certainly good and bad questions.

Vague questions will get you vague, overwhelming, or useless answers. In real life, you may not even get an answer, unless you’re asking me a request like this and I begrudgingly answer while judging you:

How would you answer this? I have no idea what this person is doing wrong either. I can only give general suggestions.

I gritted my teeth and gave an empathetic answer along with a bulleted list of suggestions (just like ChatGPT! Because I asked it for help and massaged it to my liking):

My answer

No response even though they wanted help. Sometimes people just want to vent.

Perhaps my suggestions weren’t helping because they asked a vague “question” that demanded mind reading and changing the past so that they would have started MBE practice sooner.

(Update: I got a response praising my AI-assisted answer after following up for their thoughts.)

BTW, there is a better way to induce a good answer than asking for “tips” or “help.” One example: “If you were in my situation, what’s the next thing you would try doing from here?” That’s specific and easier to answer, rather than making the other person guess what you need. There’s less friction toward an answer because they don’t have to narrow down a universe of possibilities.

I could have rephrased their question like this of my own accord, but remember that the quality of the output depends on the quality of the input. It’s simply more likely that you’ll prompt a good answer with a good question.

Do I sound frustrated? Hell yeah I am. That’s why I turned to ChatGPT in the first place.

Regardless, this is an opportunity for you to play around with it, which is another way to learn (just like when you get things wrong).

Let me know what you find with different types of questions! I’d be curious to see how you use AI to learn the law.
