For some educators, the prospect of cheating and laziness has led to bans on artificial intelligence (AI) in schools. That’s the situation in Charlotte-Mecklenburg Schools. But North Carolina’s education technology officials now say that’s the wrong approach.
Instead, they say, teachers and students need to think of AI as a personal tutor, a time-saving assistant and a tool for future jobs.
“The most exciting part is when you can take this and customize the learning experience for individual students to make sure they stay engaged in their work,” said Vanessa Wrenn, chief information officer for the Department of Public Instruction. She’s one of the authors of a 34-page AI guidebook for schools released last week.
“Children are always a step ahead of us in technology,” Wrenn said. “And if we don’t teach them how to understand it and use it well, then they will not have the appropriate guidance on how to use it.”
Wrenn and other North Carolina officials started thinking about an AI strategy shortly after ChatGPT debuted in November 2022. The chatbot responds to queries and prompts submitted in plain English — no programming required — and generates essays, articles and other text. Other generative AI programs quickly followed, allowing users to create images, video and music.
Wrenn says some states quickly banned generative AI for fear of all the ways it could be misused. But she had the opposite reaction.
“There are so many good uses for generative AI to improve our students’ outcomes and improve our experience for teachers that I want to embrace it,” she said. “But let’s create guidance on how to use it safely, how to use it responsibly and how to use it well.”
Among the possibilities for using AI mentioned in the guide:
- Adjusting vocabulary used in classroom material to match students’ reading levels.
- Creating assignments or writing prompts tailored to students’ interests.
- Taking over routine administrative tasks to free more time for teaching and learning.
- Providing tutoring for students who need help outside school hours.
- Translating material for families who don’t speak English.
- Using voice-to-text and text-to-voice tools for students with disabilities.
- Helping students brainstorm ideas, explore topics and study for tests.
Student sees good and bad use
Nadia Sesay, a senior at Palisades High in southwest Charlotte, says she worries about AI based on the way she’s seen some classmates misuse it.
She says one student often won praise from the teacher for his essays. “And then behind closed doors he would always brag about how he got away with using AI,” she said.
Sesay says some students find that funny, but she worries about overreliance on the tool. “I’m hearing sophomores not even knowing how to write an essay without dependency of AI,” she said.
But as she applies to colleges, she has used AI to proofread essays and avoid repetitive phrasing.
“I went to AI and I was like, what is another way I can say ‘curated content for brands’ social media?’ And it provided 10 different ways I could say the same thing. So I was able to enhance my resume,” she said.
Sesay, whose family came from Sierra Leone and who will be a first-generation college student, says that helps “level the playing field” when she’s competing with students whose parents can afford college admission coaches.
AI skills are an equity issue
One of the state’s goals in promoting AI skills is to level the playing field for graduates, Wrenn said.
“The Future of Jobs report says that 40% of all of our jobs in five years are on an AI or machine learning trajectory. And AI will bring at least one million new jobs over the next five years,” she said.
According to the guide, “responsible implementation will prepare students for a future in which AI is sure to be integral to all aspects of their lives. However, ignoring generative AI, or not implementing it responsibly and equitably, can have the opposite effect, increasing the disparities that put many students at a disadvantage and increasing the digital divide.”
Ben Allred, chief innovation and technology officer for Cabarrus County schools, is an AI enthusiast — and an example of how AI skills can be useful on the job. When WFAE’s interview query landed in his inbox, he said he was using ChatGPT to write some formulas for Excel that he couldn’t figure out.
“It’s a thought partner. Like, ‘Hey, how do I do this?’ ” he said. “And it’ll say ‘Try this,’ and like, that didn’t work and it’s like, ‘Try this.’ And then, you know, you kind of learn something.”
He says he has used AI to prepare questions for job candidates — fully aware that savvy candidates are also using AI to prepare for job interviews — and to write difficult letters when employees fall short. Ironically, he says the computer even advises him on people skills.
“I can put things in like, ‘This is a person who’s going through some difficult personal struggles that is also struggling at work.’ And it coaches me to be kind to the person while I’m delivering the information! There’s just some really good stuff in there,” Allred said.
Plagiarism and cheating aren’t new
Virtually all schools already have policies in place related to plagiarism and cheating, which can happen whether someone misuses print-on-paper resources, online material or generative AI.
“As AI becomes more commonplace in all aspects of life, it is imperative that educators adapt to this new reality and rethink current attitudes about plagiarism and cheating. Teachers should educate students about the responsible use of generative AI, promoting the values of honesty, critical thinking, and originality in academic endeavors,” the guide says.
Wrenn says that means understanding that AI-generated material can be a starting point, but that AI can also produce material that looks authoritative but is just plain wrong. Teachers need to help students learn how to check facts and cite sources, she says.
Meanwhile, the guide advises teachers against relying on computers to catch students who might be tempted to rely on AI without attribution.
“The platforms that have come out there to detect if something is AI have already been found to have a high failure rate,” Wrenn said. She said teachers need to understand that such programs may incorrectly identify student work as computer-generated.
Support for making the transition
The state’s AI guide includes information about finding and choosing AI products that are tailored to education. It offers advice on writing good prompts, which can be the key to getting useful information. It even has graphics — generated with the help of AI — to drive home the point that AI, used correctly, is like an electric bike: The person remains in charge, but the device helps the rider move faster and farther.
The guide also acknowledges that even though the technology is advancing at lightning speed, school districts may need time to make sure staff are well trained and everyone — including students and parents — understands AI policies and guidelines.
Wrenn says DPI began training teachers over the summer and has regional consultants who can provide support for educators trying to figure out smart ways to use AI.
In the Charlotte region, districts are all at different starting points.
In Charlotte-Mecklenburg Schools, “Chat GPT and other AI are blocked from CMS devices for students and staff,” Communications Director Susan Vernon-Devlin reports.
After the state guidelines came out last week, CMS Chief Technology Officer Candace Salmon-Hosey said the district will create “a small working group” to look at AI, using the state’s material and national resources.
“We know that AI has a lot to offer. As with any new resource, we intend to be both cautious and courageous as we move forward,” she said in a written statement.
Iredell-Statesville Schools “has started to share some AI tools with our teachers and administrators … specifically focused on how AI can help teachers enhance their instruction and planning by creating assignments, assessments, and lesson plans,” according to Public Information Officer Jada Jonas. She says teachers have not been encouraged to use AI with students “since the majority of AI apps require the user to be 18 or older.” The state guide says the common age limit is 13.
Allred says Cabarrus County has never blocked use of generative AI, and to his knowledge has never encountered a serious problem with it. He cites the example of an English teacher who worked with a school technology coordinator to create a poetry lesson.
“So they had the AI, with prompts, write poetry. And then the students read it, analyzed it then compared it to what they would have written,” he said.
Allred says all new forms of technology, including the internet itself, go through a cycle that often starts with fear.
“They started with, ‘Oh no!’ And then it became, ‘This is neat!’ And then it became ubiquitous,” he said. “We’re probably 12 to 18 months to ubiquity with this.”