Education

More students are using AI tools, a new report shows. Are Fort Worth schools ready?

A new report says that teachers lack guidance on how to handle AI in schools.

Across the country, a growing number of students are using artificial intelligence tools for schoolwork — and many of them are getting in trouble for it.

Those are among the findings in a new report from a national digital rights nonprofit. The report, released Wednesday, also notes that while most districts across the country have offered teachers training on AI, most educators said they’ve never gotten any guidance on what to do if a student uses a program in unacceptable ways, including plagiarism.

Meanwhile, officials in Fort Worth school districts say they’re hammering out details on policies governing what acceptable use of AI looks like.

School AI policies have grown, but gaps remain

The report was based on surveys of middle and high school teachers conducted last fall by researchers from the Center for Democracy & Technology, a D.C.-based nonprofit. When generative AI tools like ChatGPT and DALL-E came into widespread use last year, school leaders were caught off guard, researchers wrote. But since then, districts have made huge progress in creating policies around the use of that technology, according to the report.

But despite that progress, most teachers told researchers that they’d never gotten guidance on how to handle student use of AI in their classrooms. Just 28% said their districts had told them what to do if they suspected a student was using AI in ways that weren’t allowed, such as having a program write a homework assignment for them. Only 37% said they’d gotten guidance on what responsible student use of AI looks like, and 37% said they’d received instructions on how to spot the use of AI in students’ assignments.

Elizabeth Laird, director of the nonprofit’s Equity in Civic Technology Project, said many districts have recommended that teachers use online AI detectors to catch students who use those programs to complete homework assignments. The problem, she said, is that AI detectors aren’t especially reliable. Other research suggests those programs are more likely to flag text written by non-native English speakers as AI-generated. That’s an especially big problem in Texas, whose schools serve one of the largest proportions of English learners in the country.

Laird, who was one of the paper’s authors, said districts need more detailed policies and better training for their teachers on how to handle AI, including what uses are acceptable and what’s off limits. When districts simply hand teachers AI detection software and don’t offer any guidance on what its limitations are, or what they should do when they find a student using AI inappropriately, it leaves teachers to navigate tricky situations on their own, she said.

Anthony Tosie, a spokesperson for the Northwest Independent School District, said the district temporarily blocked access to generative AI sites like ChatGPT last year to give school leaders time to weigh potential benefits and risks. Since then, the district has begun offering training to its teachers on how to use AI in the classroom, he said.

The district has rules in place covering AI and academic dishonesty as a part of its policy on the responsible use of technology. District leaders are working to develop a plan for how the technology fits into the district’s broader goals, he said.

Jessica Becerra, a spokesperson for the Fort Worth Independent School District, said the district doesn’t yet have a policy in place on the acceptable use of AI. But the district has assembled a committee to develop policies and regulations around the technology, she said.

AI-generated homework is a growing problem

Michael Sanks, an English teacher at Western Hills High School in Fort Worth ISD, said he hasn’t gotten much training or guidance from the district on how to use AI, or what to do when a student misuses it. But it’s a growing problem, he said: Almost weekly, he sees students turn in assignments written by an AI program. When he suspects a student has turned in an AI-generated assignment, he uses an AI detector to check.

But Sanks said he usually doesn’t need the detector to tell when one of his students has used ChatGPT or a similar program to write an assignment. AI programs produce only what users ask of them, so getting useful output requires giving them specific instructions. As a result, some students submit AI-generated assignments that are well written but don’t address the topic at hand, or do so only in passing, he said.

Even AI-written essays that are on topic often feel formulaic, he said. Those programs also tend to struggle with the kind of assignments he gives, which generally ask students to analyze text they’ve read in class. Assignments written by an AI are usually a surface-level paraphrase of the material, he said.

Sanks said his thinking on how to deal with AI-generated assignments has evolved over the past year. When he first started seeing students submit homework completed by ChatGPT, his first instinct was to give those students a zero. But then he realized that if he simply gave those students no credit, they lost out on the learning opportunity the assignment was supposed to represent. Now, when a student submits an assignment written by an AI, he gives them a zero, but also lets them redo the assignment.

Although he worries about how students are using the technology to get out of doing homework, Sanks noted that there are legitimate uses, both for students and teachers. There are AI programs that can give students feedback on assignments they’ve written themselves and point out ways for them to strengthen their writing. Some English teachers use AI to generate model sentences, which they can use as examples when they’re teaching students about grammar and sentence structure.

Later this year, Sanks plans to work with students in his dual-credit class on a project in which they’ll be required to use AI to complete the work. Before they begin work, he’ll talk with them about how to use the technology responsibly, he said. As AI becomes a bigger part of the world students are getting ready to walk into as adults, they need to be able to use it well, he said. Ultimately, he thinks that’s the best way for schools to treat AI — as neither good nor bad, but as a tool that students need to understand.

“For students especially, I don’t think it was the great liberator that they thought it was going to be,” he said. “And it hasn’t been the catastrophe that I think teachers thought it was going to be.”

This story was originally published March 27, 2024 at 5:00 AM.

Silas Allen
Fort Worth Star-Telegram
Silas Allen is a former journalist for the Star-Telegram