
February 26, 2026

Learning to Write with Robots

English Professors Consider AI’s Place in Classroom


Minnie Bardenhagen

Staff Writer

“Does artificial intelligence have a place in college writing curricula?”

 

That is the prompt I gave Google Gemini, to which it responded, “In 2026, the question is no longer if AI has a place in writing curricula, but how it is being integrated. It is becoming a permanent fixture in the process of writing.”

 

Even before NMC rolled out its AI policy last August, instructors in all areas of study had to consider the question of generative AI. The question is particularly fraught for writing-centered curricula.

 

Since ChatGPT rose to prominence in late 2022, stories of students using chatbots to do writing assignments for them have gone mainstream, and many instructors treated AI use and cheating as synonymous. That notion has shifted recently as more curricula have adopted AI as a tool, one that students entering the job market may find advantageous.

 

Ryan Bernstein, a curriculum and instructional designer and adjunct professor, requires AI use in his technical writing course. He noted that sounding robotic is “kind of the point” of technical writing, so AI is a tool that fits well.

 

Bernstein said he teaches students how to write effective prompts and to think critically about AI’s responses. He emphasized that AI is a writing tool, not a replacement for the writer.

 

“People are afraid it’s going to take [their] job, right?” Bernstein said. “There’s a common response to that fear, which is, ‘No, AI is not going to take your job. Someone who knows how to use AI is going to take your job… so that’s why I think it’s important that we practice it.’”

 

Michael Anderson, who chairs the Communications Department at NMC, said that from a general English composition perspective, norms for AI use in specific professions are still being debated.

 

“Employers clearly know about AI, and they’re interested in it,” Anderson said. “[Students] need to understand the tool and what it can and can’t do… but the degree to which it actually gets applied in work is very much up in the air.”

 

In his English courses, Anderson recommends NotebookLM to his students as a tool for research and outlining. He makes it clear that he does not want that to extend to AI writing for the student.

 

“The language that chatbots produce simply is not good writing,” Anderson said. “It doesn’t use any sources or examples or personality or any of the other things that we’re discussing in composition that are so important for actually getting people to attend to your argument or to your essay.”

 

When he finds that students over-rely on AI, Anderson engages with them directly. “Most of the time the students admit, ‘Oh, yeah, I used a chatbot to help me do this or that,’ and then we get to have that conversation.”

 

Most writing assignments begin with another task where AI comes into play: research. While AI can be helpful in research, Nicco Pandolfi, an NMC librarian, said it cannot engage with a source the way a student can.

 

“My advice would be, if you use these tools at all in research, to treat them as pre-research tools, similar to how you might use Wikipedia,” Pandolfi said. “They can help you discover material to read, but letting them ‘do your reading for you’ is a recipe for misunderstanding and learning loss.”

 

Bernstein, who co-leads Strategy 1, “Future-Focused Education,” of NMC’s 2026–2029 strategic plan, noted that faculty have been slow to address AI’s role.

 

“I think the most effective strategy is not to treat it as a punitive thing,” he said, “[but] to discuss it as an interesting tool that might be helpful and might be harmful.”

Bernstein noted the concerns that come with increased student AI use. “The biggest concern is that AI is going to remove that critical thinking process that a student has to go through to learn material and to demonstrate learning,” he said. 

 

Bernstein also said security is a concern for the college regarding AI. NMC has safety measures in place, such as a site license for Google Gemini that stops the platform from using students’ prompts to train its systems. 

 

NMC’s AI policy outlines several guardrails. The policy cautions instructors and students against submitting college data and intellectual property to AI platforms, and clarifies that students’ academic work is their own intellectual property. As for communication concerns, the policy asks for caution from instructors and students alike.

 

“Because GenAI applications are incapable of intent, feeling, judgment, or authentic human communication, overreliance on GenAI tools in the creation of communications can have serious consequences,” NMC’s AI policy states. 

 

Bernstein also raised the energy and water usage of generative AI platforms in his interview with the White Pine Press. Data centers use water to cool their servers and routers. The Lawrence Berkeley National Laboratory estimates that U.S. data centers consumed 17 billion gallons of water in 2023.

 

Despite ongoing ethical debates, AI has taken hold in academia.

 

“Does the calculator have a permanent place in learning math? Does a wrench have a permanent place in learning to be a mechanic?” Bernstein asked. “To some extent, there is permanence in [AI], but only as a tool.”

