AI tool uses dead professors to help students write better
Grammarly's new "Expert Review" feature is under fire for letting users get feedback from AI versions of real professors—including some who have passed away, like historian David Abulafia.
The tool, launched last summer, includes multiple AI agents: its "AI Grader" agent pulls publicly available instructor information to assign grades, while "Expert Review" draws on subject-matter experts and trusted publications to give feedback and revise writing.
But when academics spotted their late colleagues' names on the list, they pushed back. Kathleen Alves called the feature "literally digital necromancy," and many questioned whether it crosses an ethical line.
Implications for academic integrity and ethical considerations
This story hits close to home for students and anyone using AI tools in school. It's not just about tech—it's about respect, consent, and where we draw the line with AI in education.
Some universities are already tightening their rules on AI use after students were penalized for relying on tools like these.
The controversy shows how fast the conversation around AI and academic honesty is changing—and why it matters to pay attention.