CANTON TOWNSHIP, Mich. – The emergence of artificial intelligence is pushing some area school districts to consider how it should be used in classrooms.
Wayne-Westland Community Schools told Local 4 it’s in the early stages of exploring AI for assignments and projects, with an emphasis on safety.
Interim Superintendent Dr. Catherine Cost said the district is working to create learning environments that are “authentic and safe,” and that any adoption of AI would require clear boundaries.
Cost said the district’s approach is centered on using AI as a support tool, not a replacement for student work.
“We want students to use it as a tool. We want our staff to use it as a tool, but we want them to be able to create and have the work be authentically theirs,” Cost said.
District committee reviewing platforms
Wayne-Westland leaders say the district is not yet using AI in classrooms, but the discussion is ongoing.
The goal is to prepare students for a world where AI will be widespread, while setting limits designed to protect students and preserve academic integrity, according to the district.
Recently, Wayne-Westland formed a committee of teachers, administrators, students and parents to evaluate platforms and determine what best fits students’ needs, Cost said.
Authenticity concerns: teachers watching the process
As districts weigh AI, one concern is determining when student work reflects independent thinking versus AI-generated output. Cost said educators often recognize when something does not match a student’s normal voice or ability level.
“Educators are passionate people and they know student to student what their capabilities are,” Cost said. “So, if a student were to turn in something that really isn’t in their voice, a teacher is going to recognize that pretty easily.”
She said some classroom practices can reinforce authenticity, including requiring drafts, peer editing, and one-on-one conferences to review work in progress rather than only final submissions. She also noted that authentic ability shows up in testing, where students must demonstrate skills on the spot.
Online safeguards already in place
Cost said the district’s “paramount concern” is student safety.
The district uses a monitoring system called Lightspeed to track student activity online, flag potential warning signs and alert adults to intervene when necessary. She said staff can also close browser tabs and redirect students to appropriate learning sites.
Cybersecurity expert urges policy “guardrails”
Rich Miller, president and CEO of STACK Cybersecurity, said districts should adopt a custom GPT that adheres to the school’s policies on acceptable use.
“They should consider how they’re going to build their guardrails,” Miller said. “That’s what the policies are. They should be a guardrail system.”
Miller said similar settings should be considered at home, with parents setting expectations for how children use AI tools.
Data and privacy risks: free vs. paid tools
Miller said a key issue in AI use is data collection, particularly for children using free tools.
“Every time we use it, essentially, it’s an opportunity for later on down the road for that information, that data, to be mined,” Miller said.
He believes parents and school leaders should ask what information is stored, whether tools are set to “learn” from user inputs, and what types of data — including voice or image-related information — may be collected over time.
“Let’s say a 15-year-old student and they’re doing all their assignments for the next three or four years, all of that data that they’re putting in the system’s learning about the person,” Miller said.
Miller said AI users will eventually begin receiving detailed, targeted advertising and marketing based on what the system has learned about them.
“It’s just like emails … a lot of the free accounts like Google and Yahoo and stuff, all of that data is mining for targeted advertising,” he said. “With AI, we’re just doing that on steroids.”