Genie Nicole Giaimo, Hofstra University
Recently, IWCA leadership put together a taskforce on AI and writing centers. The group has been working since spring 2024 and has published a blog post about its activities, such as surveying writing center community members, holding listening sessions, and sharing findings at the annual IWCA conference. In January, the taskforce gathered several dozen interested stakeholders to form working groups: reports and research, pedagogy and professional development, communications and outreach, and advocacy and policy.
Working groups often take time to develop results. The Advocacy and Policy group realized that we would need to wait for other groups to first share their findings in distilled, readable, and visual formats. While we hope to eventually share those findings, perhaps through another blog post on the IWCA website, we also wanted to share our own thoughts and the resources we have developed with the larger writing center community now, because we know that many of our colleagues are trying to navigate the ever-changing landscape of AI amid the other crises and challenges taking place in higher education.
So, below, we share narratives from practitioners in the field about how they are confronting AI in their writing centers and classrooms. We highlight the experiences of four K-12 educators (Kara Boltz, Stephanie Erickson, Meg Heyssel, & Melissa Ligh) who took a course on AI and education with Joe Essid, another member of the IWCA AI taskforce. We also share narratives from members of our working group, including Laura Hardin Marshall, who writes about how confronting AI in the writing center has changed over the last two and a half years, as the center and its tutors have steadily taken on more training and discursive work around supporting writers who use AI, whether individually or as part of assigned writing tasks. Clare Goulet (MSVU Halifax, Nova Scotia, Canada) shares a narrative that details their writing center's approach to addressing AI, along with guidance on tutor best practices and preliminary AI policies.
After these narratives, we provide two resources that we hope will be useful. The first, by Laura Hardin Marshall (Webster University), promotes ethical AI literacy among tutors and writing instructors. It cues people who teach and coach writing to consider several ways in which AI might be used to positive effect, while also identifying reasons why students might rely on AI to produce and complete writing assignments. The resource asks us, as writing educators, to challenge our assumptions about why students use AI and to have more thoughtful conversations about incorporating AI use into one's practice.
The second resource, “AI Use in Writing Centers: Decision-Making Workflow for Tutors” (Rachel Willis, University of Lynchburg), grew out of conversations with peer tutors who identified several challenging tutoring sessions that involved AI. One scenario the working group noted, which is not discussed in current research on AI and writing, involves students who resist using AI and feel frustrated or put on the spot when faculty require it in assignments. Here, we have an opportunity to think not only about how to document unethical use or push students to be more thoughtful about incorporating AI into their writing practices, but also about how tutors can handle sessions where a writer is required to use AI and that requirement runs counter to their ethical commitments and writing practices. Saurabh Anand (University of Georgia) created a visualization of the workflow that can be adopted as a training tool and quick reference for tutors who encounter AI use in a tutoring session and need to respond in real time.
The policy and advocacy working group will eventually share data that the taskforce collected from writing center practitioners about how AI is impacting their work. Before then, however, we believe it is important for individual practitioners to think about ways they want to engage in advocacy around AI and the teaching and tutoring of writing at their own institutions and in their school districts. Many of us work in higher education, which has accreditation requirements; others are educators in the K-12 system. Some schools have very open AI use policies, while others have strict non-use policies. In some cases, teachers encounter a paradoxical situation in which students use AI after school to complete homework, yet access to AI during the school day is restricted at the school or county level. Teachers working in these settings report that developing best practices for AI has proven difficult, given inflexible administrative policies and varying levels of funding and resources for technology and technological literacy programs. Moreover, these teachers note that they must cope with other stresses bearing down on K-12 education: state and federal mandates, anti-DEI backlash, overly large classes, understaffing, and the loss of experienced colleagues who are leaving the field. That environment makes it difficult to get any work done, let alone try new technologies.
We suggest contacting members of your community, such as accreditation boards, union representatives, internal institutional research and assessment teams, school boards, superintendents, school technology specialists, county technology departments, and political representatives, to express your opinions, concerns, and beliefs about the teaching of writing and AI use. For some, this might mean putting an emphasis on critical thinking and information literacy, two cornerstones of civic engagement and democracy, in curricula. For others, this might mean asking for more resources so that under-served school districts can create AI and technology literacy programs. For still others, this might mean convening stakeholders to explore and highlight the differences between what writing centers provide to writers and what GenAI provides. Or it might mean getting involved with collective bargaining committees to protect academic freedom and teacher expertise by limiting how AI is used for teacher evaluation, student feedback, and other essential tasks.
AI is becoming more intertwined with our technology systems (among other systems), and while many of us might feel resistance to fully embracing AI in our teaching and tutoring work, these resources can help us to have thoughtful and measured conversations with our tutors and colleagues about ethically incorporating AI into writing work. At the same time, we urge you to think about ways to advocate for ethical AI use in education and protect the rights of your workers, your students, and yourselves.
AI Policy and Advocacy Working Group Members:
- Genie Nicole Giaimo, Hofstra University (Working Group Chair)
- Saurabh Anand, University of Georgia
- Clare Goulet, MSVU Halifax, Nova Scotia, Canada
- Laura Hardin Marshall, Webster University
- Rachel Willis, University of Lynchburg
Narratives from the Field
- Laura Hardin Marshall, Webster University
Collaborengine: GenAI at the Mount Writing Centre
- Clare Goulet, MSVU Halifax, Nova Scotia, Canada
Four High School Teachers Contemplate AI
- Kara Boltz, Stephanie Erickson, Meg Heyssel, & Melissa Ligh
Introduction by Joe Essid, University of Richmond
School of Professional and Continuing Studies
Resources from the Field
- Dr. Laura Hardin Marshall, Webster University
- Dr. Carolyn I. Brown, Webster University
- With Assistance from Writing Center Coaches (Spring 2025)
Conflict Management Guide for Tutors Regarding AI Use
- Rachel Willis, University of Lynchburg