In a recent article in The Conversation, Abby Cathcart, Melinda Laundon, and Sam Cunningham describe a new way of screening student evaluations so that instructors are not exposed to abusive feedback that could permanently remain on their record. The authors detail the development of the “Screenomatic” screening system, which works from an updatable dictionary of unacceptable terms and learns from user input, improving its auto-detection accuracy over time. The system also allows comments to be re-identified in cases of abuse, so that students who make inappropriate comments can be connected with mental health resources or reminded of the code of conduct. “[T]he cost of screening responses is nothing compared to the cost to individuals (including mental health or career consequences),” write the authors.
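The core idea described above, a screener built on an updatable dictionary of unacceptable terms that improves as human reviewers flag new ones, can be sketched roughly as follows. This is a minimal illustration only; the class and method names are hypothetical, and the article does not describe Screenomatic's actual implementation.

```python
import re


class CommentScreener:
    """Hypothetical sketch of dictionary-based comment screening.

    Not the authors' actual system: names, matching logic, and the
    word-level tokenization are assumptions made for illustration.
    """

    def __init__(self, blocked_terms):
        # Updatable dictionary of unacceptable terms.
        self.blocked_terms = {t.lower() for t in blocked_terms}

    def flag(self, comment):
        """Return True if the comment contains any blocked term."""
        words = set(re.findall(r"[a-z']+", comment.lower()))
        return bool(words & self.blocked_terms)

    def learn(self, term):
        """Add a reviewer-identified term so similar comments are
        auto-flagged in future (the 'learns from user input' step)."""
        self.blocked_terms.add(term.lower())


screener = CommentScreener(["idiot", "useless"])
print(screener.flag("The lecturer is an idiot"))  # caught by the dictionary
print(screener.flag("Lectures were dreadful"))    # not yet in the dictionary
screener.learn("dreadful")                        # reviewer flags a new term
print(screener.flag("Lectures were dreadful"))    # now auto-detected
```

The updatable-dictionary design matters here: each term a human reviewer flags is caught automatically from then on, which is how accuracy can improve over time without retraining anything.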