Ever since Artificial Intelligence (AI)-powered chatbots such as ChatGPT were released for public use, teachers have worried about students using them to aid, or even fully complete, their school assignments. This year, teachers at the School have taken stronger measures to prevent students from cheating with these tools, such as assigning more in-class work. “I’ve gotten much stricter about how to submit online work and things like version history, and in particular this year, English 11 is doing a lot more in-class, handwritten work,” said Jacob Leland, an English teacher at the School.
Fellow English teacher Benjamin Norton shares these concerns. Norton, who runs a largely tech-free classroom, said, “If there wasn’t AI, I’d be perfectly content with computer-typing [for larger assignments]. AI is just so hard to detect that I don’t want to spend a single moment having to think about it…I can’t tell you how many hours last year I lost wondering if a student had used AI on an assignment.”
When teachers do find that students have cheated with Large Language Models (LLMs) like ChatGPT, it’s not fun for anyone involved, according to Leland. “It’s a drag when I catch them; I have to give them a zero. They don’t like that, I don’t like that, we both have to do extra work, and they haven’t done the learning,” he said, adding that it’s not uncommon for him to catch students using it.
So why turn to AI? “Ultimately, much of this rests upon the commodification of education; the way that it’s so high stakes and so expensive,” said Norton, citing the “prioritization of grades over learning and a sort of ‘sink or swim’ feeling in education.” Given the School’s rigorous curriculum and extracurricular demands, many students feel that earning a better grade on individual assignments matters more than growing their skills. As a result, relying on AI to complete those assignments can sound appealing. “I used ChatGPT because I could save my effort and have the teachers see what they want,” said one anonymous senior at the School, reflecting on their junior year.
However, according to Leland, AI usage does students a disservice in the long run. He compares it to using a forklift to lift weights at the gym: the weight gets moved, but the lifter gains no strength.
Despite all of AI’s drawbacks in education, its use can be ethical and advantageous in specific scenarios, such as reviewing a topic covered while a student was absent from class, according to Norton. “I think it’s foolish to say ‘AI is bad, don’t ever use it’…If you use it to help walk down a road, I think that’s fine,” said Norton. “But if you use it to take your steps for you, then you’re missing the point.”
History teacher Yosup Joo agrees; he uses AI to help ensure his tests reflect lecture material, and he incorporates it into his students’ projects. For instance, since minors can’t consent to having research conducted on them, AI can generate data sets that his Psychology students then use to simulate scientific processes. “The line for me is if I imply that work that I’ve done is my own solely,” said Joo, adding that he doesn’t see an issue with using AI in the classroom “if you cite your sources, which is what we always teach our students to do.”
While precautions such as handwriting longer essays or completing more assessments in class can feel frustrating to many students, from a teacher’s perspective, AI-prevention measures are crucial for ensuring students build the skill sets that make their education worthwhile.