AI in the Classroom: Challenges and Opportunities

ChatGPT appeared on Michael Schlenker’s radar last November, when it first became available to the public. The Rivers science faculty member reports, “A student who had taken my AP computer science course the year before said, ‘Hey, Mr. Schlenker, have you checked out ChatGPT yet?’ And I said, ‘No. Have you?’ And she said, ‘Yeah, I just asked it to write my video game project from last year.’”

Schlenker asked how that had gone. The student replied, “So in, say, 10 seconds, it did three to four weeks of work.”

At that moment, Schlenker says, “[I realized] I’m going to have to rethink a lot of things.”

ChatGPT is an LLM—a large language model, a form of artificial intelligence (AI). Available for free to anyone with access to a computer, it has an extraordinary ability to mimic human language and to write computer code. ChatGPT can generate and edit content of all kinds, summarize material, and compose convincing love letters. It can invent computer games, solve math problems, make images, and create videos. The latest version, GPT-4, can even pass the notoriously difficult Environmental Science AP exam with a 5 or ace a bar exam.

New iterations continue to arrive, and other tech companies are rolling out AI applications of their own. Microsoft uses AI for Bing; Google’s version is Bard; Anthropic makes Claude. None of these are regulated by the government at this point, though in July the Federal Trade Commission opened an investigation into OpenAI over how it collects data, and the House of Representatives has held hearings on copyright and cybersecurity issues raised by AI. On the Senate side, Majority Leader Chuck Schumer has announced a series of nine “AI Insight Forums” this fall to explore both its promise and its dangers.

ChatGPT can leave users breathless with its speed and accuracy, but it can also generate content riddled with errors and biases. Some of its answers are simply made up—“hallucinations,” in tech-speak. And, of course, it tempts users to pass off its work as their own, whether it’s a letter of recommendation or a five-paragraph essay on Shakespeare.

These systems are very new and very powerful—and have huge implications for education at every level. Head of School Ryan S. Dahlem, who began his tenure on July 1, has already been thinking about what AI might mean for a school like Rivers. “My initial thought is what an exciting time this is to be in education,” he says. “We’ve got this remarkable new technology that is incredibly powerful. It’s paradigm-shifting, it’s disruptive, and it’s going to be a part of our students’ lives, in both the next chapter of their education and in their professional lives.”

Dahlem is confident that Rivers’ “culture of innovation and willingness to try new approaches, to pilot and to iterate,” will influence the school’s approach to AI. “We need to ask, How are we teaching students to harness this powerful tool? How are we learning alongside them? And how are we considering the ethical implications of the technology? AI presents a compelling opportunity in our role as educators and as models of lifelong learners.”

Michael Schlenker says AI is “a tool, like anything else.” But it’s a tool that has already changed, and will continue to change, what happens in the classroom. “It’s a different type of learning,” he says. 

Interacting with ChatGPT and using it to solve problems in the classroom means learning to ask it the right questions—a process called prompt engineering. Last spring, Schlenker’s class used ChatGPT to help build websites that could be of use to Rivers. “The prompt engineering side of things was big,” he says. “If you don’t include the right information in your prompt, you get stuff you can’t use or don’t know how to use. That was a big learning for us.”

One of the websites the students created hosted the Rivers student elections. “To build it, we had to work through security concerns and build logins to ensure data integrity,” Schlenker says. Another successful website managed the springtime schoolwide water-tag game. A third website was meant to create a lunch-line monitoring system, but the class ran out of time to get it up and running. Even so, says Schlenker, “Without ChatGPT, none of those websites would have happened. It was awesome.”

But while Schlenker, who has a background in engineering and teaches computer science, is intrigued by the potential of the technology, he’s also clear about its limitations. What AI can’t offer is the human interaction—the relationship between teacher and student—at the core of the Rivers experience, says Schlenker. “That can’t be replicated.”
 
A Matter of Mission
The promise and the challenges inherent in AI have been on the Rivers radar for some time. Last year, the school formed a task force, led by Director of Education Technology John Adams, to look at AI and its impact on teaching and learning at Rivers. “Rivers’ mission makes us well prepared to learn how to best utilize this technology,” says Adams, citing this section in particular:

We cultivate a caring, respectful, and collaborative environment that encourages student performance, including demonstration of logical thought, informed and articulate voice, creative vision, and integrity.

Adams and his fellow task-force members have spent many months pondering the proper role of AI in the classroom across all subjects. “It’s going to be so important for our teachers, in their respective disciplines, to think about it, to experiment, and keep learning about how it’s impacting their space, to make sure that we’re leveraging AI as an effective tool,” he says. “It’s about understanding how to ask really, really good questions to make sure you’re able to collect data that gives you meaningful results.”

The task force is intentionally interdisciplinary. “We have an English teacher on it who also teaches ethics,” says Adams; the group also includes the robotics teacher, the computer science teacher, the Grade 10 dean, and a history teacher. “We wanted to get a mix, not only in disciplines but in experience level with the software generation,” says Adams. “The goal is to take a holistic approach toward thinking about AI.”

The task force recently released a set of guidelines and principles that can be adapted to individual departments’ needs. But on a broader scale, “Rivers educators can play a significant role in shaping how this technology is used,” says Adams. “We are an amazing institution with a great mission and great people, and we can be at the forefront of using this effectively—if we continue collaborating, experimenting, trying new things, listening to our students, and seeing our students work with it.”
 
Navigating Uncharted Territory
AI is raising questions in many areas of the school. Melissa Dolan ’98, formerly a humanities teacher and three-sport coach at Rivers, directs Middle School curriculum development. Last spring, she helped pull together a professional development day focused on AI. During the planning, “it was fascinating to see the different lenses that we all brought based on the age of the kids we were working with and the discipline,” she says. “It’s certainly uncharted territory, not just at Rivers, but everywhere—in government, in economics, and society. It’s raising huge questions that we don’t have all the answers for. But as educators in the world we’ve been navigating over the past number of years with the pandemic, for better or for worse, we’re getting used to unfamiliar terrain.”

Dolan looks at AI with a middle-school lens, she says, taking into account students’ developmental needs. She says the media literacy course required for Grade 7 students is critically important right now, at a time when social media messages are coming at students from every direction. Media literacy equips them with tools that empower them to navigate the online world with an appropriately critical lens.

That begins with knowledge, Dolan says. Students need to understand ChatGPT’s algorithms, which enable its uncanny ability to mimic patterns of human speech and writing. “The algorithms weren’t brand new,” she says. “They were in existence on TikTok. But ChatGPT put them on steroids. And when those algorithms combine with adolescent development, they raise questions about identity development and mental health.”

Dolan isn’t quite ready to embrace AI, she says, but she recognizes that it must be addressed. “When it first came out, I thought ‘I don’t even want to.’ But this is the world we’re living in, and we want our students to be able to navigate it.”

AI is also shaking up the college application process, says Dave Lyons ’99, director of college counseling. It could potentially be used to write essays or letters of recommendation—but Lyons isn’t too worried about Rivers students getting swept up in AI-powered application interventions.

“The number-one lesson that I have learned doing this work, and the thing we tout all the time, is authenticity,” he says. “There is nothing more powerful in the process than a kid who can stand up to a college and say, ‘Here’s who I am. Take me or leave me. If you don’t want me, I’m on to the next one.’ What the colleges don’t like is the kids who are answering the question in the way that they think the colleges want them to answer it. That’s what ChatGPT does. It’s really about believing in yourself and putting yourself out there—the exact opposite of ChatGPT. You can’t write a prompt that’s going to reveal your authentic self.”
 
Raising Questions—and Finding Answers
Rivers alumni who work in the field of education are also grappling with the questions raised by AI. Jason Medeiros ’01 is principal of Hudson High School in Hudson, MA. He reports that—for now, at least—the school isn’t banning ChatGPT, as some other public school systems have. “We’ve opted to not necessarily press the panic button, although I’m not sure all of my teachers would agree,” says Medeiros. “We’re not trying to draft new policies or anything like that.” Rather, he says, the school is using existing policies as a template: “The idea is that if you use ChatGPT, it’s like using any other resource to plagiarize. If you’re caught using this, it’s no different from any other tool that you may use to gain an advantage.”

That’s the practical, boots-on-the-ground consideration, he says. But, he adds, “From a philosophical standpoint, I’m definitely worried about what AI means as a deterrent to students’ engaging in their own thinking.” Medeiros, who graduated from Dartmouth College and earned an M.A. at Stanford and an Ed.D. at Boston College, started his career in education as an English teacher, encouraging his students to think for themselves. AI’s potential to interrupt that developmental process gives him pause.

“ChatGPT’s doing everything for them in a way that other tools aren’t,” he says. “I worry about that not only on the educational front, but also as a society. Are we giving away our capacity to think for ourselves, to be creative for ourselves, to generate ideas for ourselves?”

Medeiros says that if educators don’t continue to create learning environments where students are invited and encouraged to take intellectual risks, we’re in trouble. “We cannot be educating students to be dependent on artificial intelligence to engage in higher-order thinking skills for them,” he says.

Other alums are more sanguine. Jason Gorman ’92 isn’t too concerned about AI upending education; in fact, he applauds ChatGPT’s capacity to act as a peer tutor for students, freeing up teachers to actually teach. Gorman is the founder of Jackrabbit Learning Experience, a consulting, design, and development agency that creates online courses and programs for its clients. The company has worked with a variety of healthcare organizations, nonprofits, and startups. In 2020, Jackrabbit worked with Boston University to imagine, design, and roll out fully online orientation, when all BU students were remote.

“The whole goal of education technology has been exactly what AI is able to do now: create better one-to-one support for students,” Gorman says. “My feeling is it unburdens teachers and allows them to have a deeper understanding in terms of where a student is going wrong, how they’re challenged, what blocks they need to fix.”

But Gorman agrees that managing AI thoughtfully and rigorously is important for Rivers—and he understands why some community members may approach it with anxiety or even fear.

“I deeply empathize with pretty much every reaction to AI,” he says. “It’s very, very complicated. It’s unprecedented. I think it probably is the most important invention of humankind, and it raises existential questions of all kinds. If we’re outsourcing our thinking and reasoning, then what does it mean to be human, if a nonhuman thing of our own creation can do that? Who are we, fundamentally?”

As luck would have it, that’s the kind of question a Rivers education is designed to answer.
 
This story, written by Catherine O’Neill Grace, first appeared in the fall 2023 issue of the Riparian, the Rivers School’s alumni magazine.