NANKIN — Mapleton Local Schools approved a policy officially dealing with generative artificial intelligence at its Dec. 11 regular board meeting.

That policy leaves use of generative AI in the classroom in the hands of individual teachers, according to district technology director Craig Wentworth.
“While AI may be used as reasonably necessary to supplement, aide, and/or assist students in their educational endeavors, AI shall not be used as a substitute for a student’s own critical thinking, analysis, and/or compositional creations. Nor shall it be used in a way which otherwise undermines the educational purpose of an assignment, as determined by the teacher,” the policy states.
“There are new tools released all the time,” Wentworth said. “It’s hard to stay up to date and be informed, but it’s really important.”
Mapleton Local Schools isn’t the only Ashland County school district grappling with how to deal with generative AI.
It all comes down to philosophies
At the university level, Ashland University started addressing the technology last spring. The faculty senate approved an AI-related change to the university's academic integrity policy at its first meeting of the 2023-24 academic year.
The change added a line to the academic integrity policy stating that, unless explicitly asked to use AI to complete an assignment, use of the technology would be considered an academic integrity violation.
Use in classrooms at AU remains up to professors’ discretion — similar to Mapleton’s newly approved policy. At AU, some professors have implemented it into their classrooms. Others have taken a more skeptical approach, according to previous Ashland Source reporting.
Other public K-12 districts started discussing generative AI recently, too.
Catherine Trevathan, the superintendent at Hillsdale Local Schools, said that district is putting together a toolkit for teachers about AI. It also plans to establish an AI committee to determine guidelines for student use.
At Ashland City Schools’ November board meeting, chief innovations officer Ben Spieldenner told the board he’d begun thinking about AI too. He added he’d work on determining whether the district needed a policy specifically dealing with the technology.
But, in an interview with Ashland Source, Spieldenner said the considerations around such a policy differ between a K-12 district and a university.
“[In higher ed] you’ve got students who choose to go to an institution because that institution focuses on X, Y and Z,” Spieldenner said.
“At the public schools, people don’t necessarily choose to go to the public school they’re at, but we have to provide them with an opportunity to have exposure to all of this stuff. It has to be in a safe way.”
In K-12, Spieldenner said the districts he's researched tend toward one of two extremes: banning the technology altogether or forcing it on teachers. The former, he said, is far more common than the latter.
According to T.J. Houston, who teaches cybersecurity at the Ashland County-West Holmes Career Center, those choices all come down to individual schools’ philosophies.
What does it look like in practice?
At the Career Center, and especially in Houston’s classes, students receive exposure to generative AI. Donne Copenhaver, the assistant principal, said it’s easy to jump at horror stories of students cheating with AI.
So, at the start of the year, Copenhaver said the Career Center offered professional development on the technology. Houston presented on how teachers could use it in the classroom.
Copenhaver and Houston said the tech has been especially helpful with UDL, or universal design for learning.
That teaching approach allows teachers to accommodate the needs of students with different learning styles by adapting their lesson plans. AI can make those adjustments to teachers’ content.
Still, Copenhaver said the important part is that teachers review the AI-generated material. Houston agreed.
“If you think you can be replaced by AI, then you should be,” Houston said. “You provide so much more as a teacher, coach or tutor than just content, which is what AI does.”
Some teachers at the Career Center have incorporated the tech into their classrooms more than others. Houston said he uses it in his classroom regularly, and hosts students weekly to discuss its implications.
His students have completed research projects on AI in medicine, used the technology to rewrite resumes and cover letters for different jobs, and outlined essays with it. They still write the papers themselves.
Logan Heichel, one of Houston’s students, said he has used AI to help with rewording definitions he doesn’t understand. He also has had the tech help with tasks like alphabetizing vocabulary lists.
The 17-year-old senior from Loudonville-Perrysville Schools said he finds it useful for that kind of help, but would be concerned about peers using it to write papers.
“If you cheat, you won’t learn anything,” Heichel said.
Houston and Copenhaver said they haven’t had issues with cheating at the school.
Copenhaver said that, from an administrator's perspective, it helps for teachers to have exposure to the technology. Knowing it well informs any disciplinary action that follows inappropriate use.
Still, even if cheating does happen, Houston and Copenhaver both view it as important that students can “fail safe.”
“I’d rather have them fail here where it’s safe and teach them to use the tool properly so they’re prepared for what’s next, whether that’s a career or college,” Houston said.
Still debating
Spieldenner, with Ashland City Schools, agreed that generative AI can give good feedback in some situations. He added it helps with tutoring, re-explains concepts well at different grade levels and aids with lesson planning.
For example, Spieldenner experimented with the technology by inputting a short paper into it to proofread. He received helpful feedback on grammar, mechanics and readability.

But, in his estimation, the new technology still isn’t ready for practical use across classrooms.
Spieldenner also has concerns about student privacy when it comes to AI, and said cheating remains a worry for him. He said it's not as widespread as he feared when ChatGPT first emerged, but it still exists.
The potential for students to cheat, he said, is especially concerning in K-12 education.
Students in that age range are still receiving the fundamentals they need to propel them forward in higher education. Cheating while those building blocks fall into place creates reason for concern, Spieldenner said.
Ashland City Schools remains in the process of determining what next steps to take.
At the district’s November board meeting, Spieldenner said more information would become available about the district’s thoughts after the new year.
Why does it matter?
Regardless of their progress on putting AI policy in place, Wentworth and Spieldenner agreed: School districts have a role to play in leading their communities through the conversation about generative AI and its future.
Spieldenner said students, parents and community members alike are watching the decisions the district makes about this technology. To him, that means the way the district approaches it requires care.
“I think it provides us with the opportunity to be able to show kids that if you have reckless abandon for technology, the consequences will be serious,” Spieldenner said.
“But, if you also ignore emerging technologies, the consequences will also be serious. Like anything, I think moderation is important.
“One of the core values for Ashland City Schools is better decisions, fewer regrets. And I think that is extremely important when it comes to artificial intelligence or any emerging technologies.”
