Writing Center Instruction for the Age of AI: A Tutors' Professional Development Workshop

Alexandra Krasova, Indiana University of Pennsylvania
Mahmoud Othman, Indiana University of Pennsylvania

Abstract

This paper explores the implications of Artificial Intelligence (AI) for writing center instruction and presents a professional development workshop designed to help writing center tutors discover the affordances and constraints of using AI in tutoring. As AI becomes increasingly integrated into the academic environment, writing center tutors face new challenges and opportunities in supporting students’ writing. The paper highlights key strategies for tutors to integrate AI awareness into their teaching practices, ensuring they remain effective in guiding students through the writing process while fostering academic integrity. Through a combination of theoretical insights and practical exercises, this professional development initiative promotes a balanced approach, emphasizing both the potential and the limitations of AI in the writing center context. The goal is to prepare tutors for the evolving landscape of academic writing and enhance their ability to support students in a technology-driven educational environment.

Keywords: AI policy, writing center practices, workshop, AI-assisted writing, tutor training

In the fall of 2022, a writer visited the Kathleen Jones White Writing Center at Indiana University of Pennsylvania (IUP) to schedule a writing tutorial. Their main objective for the session was to ensure that their piece sounded more human. During the session, the writer disclosed that their paper had been generated by ChatGPT and sought assistance to avoid detection by their professors. Although there was no existing policy on AI at the time, the tutor politely informed the writer that the center only worked with human-authored pieces. Following this incident, the director of the IUP Writing Center established an AI task force, with its first mission being the creation of an official AI policy for the center. As PhD candidates in Composition and Applied Linguistics, the task force members knew their AI policy should not violate the objectives of first-year composition classes. As a result, the policy states that AI-generated work is “not reflective of a student’s own understanding and effort and, thus, is not acceptable, unless authorized specifically by the instructor/administrator.” The Kathleen Jones White Writing Center supports student success and engages in creating AI policies for departments to implement in their classrooms.

While concerns about AI’s potential to reduce students’ engagement with writing are valid, writing center tutors as well as students have also explored its potential benefits. For instance, ChatGPT could serve as a writing coach or a source of inspiration (Kleiman, 2022), or it might “support bottom-up writing skills, freeing up time, space, and energy for more advanced aspects of composition” (Daniel et al., 2023, p. 37). Mollick and Mollick (2023) provide a list of ways students can engage with AI as a partner in their work. For instance, AI can assist students in writing by offering real-time feedback, suggesting improvements in grammar and style, and providing creative prompts, allowing them to refine their work while enhancing their writing skills. Moreover, AI has the potential not only to enhance writing processes, but to transform or even redefine them—much like Google Docs redefined collaborative writing by enabling real-time, location-independent co-authoring (Puentedura, 2013). In response to this growing body of literature emphasizing the potential benefits of AI, the writing center held a tutor training on AI to help tutors guide writers who use AI during their sessions. One way to make writing center tutors aware of students’ challenges and concerns, and to help them address those challenges, is to conduct a professional development workshop, which we did at the Kathleen Jones White Writing Center. The workshop was designed and delivered in February 2024 and aimed to highlight the importance of creating a unified AI policy to help tutors work with students who use AI tools to write their papers. It also aimed to help tutors discuss challenging situations involving AI that they might encounter in the writing center. One more goal of the workshop was to compare the feedback given by writing center tutors with feedback provided by ChatGPT in order to learn specific features of AI writing and to recognize its rhetorical moves. The workshop was titled Overview of AI Technology and Its Relevance to Writing Center Support and consisted of four main parts: (a) a discussion of challenging issues concerning AI technology; (b) a lecture and discussion introducing AI and opening a conversation about its use; (c) an activity focused on providing different types of feedback and comparing human and AI feedback; and (d) drafting a unified AI policy to help tutors work with students who use AI to write their papers.

Preparing for the Workshop

To prepare for the workshop, we took the following steps:

    1. Read and collect data about the affordances and constraints of AI tools in writing.
    2. Try different AI tools to understand how they support writing and to familiarize ourselves with their strengths and weaknesses.
    3. Review existing AI policies from writing centers at other institutions.

The literature provided valuable insights into the advantages and disadvantages of using AI in writing. For instance, Godwin-Jones (2022) found that AI tools like Grammarly are effective for proofreading, particularly in identifying spelling and lexical errors. Similarly, Tsufim and Pomerleau (2023) highlighted how students used ChatGPT and other AI tools to brainstorm ideas and structure their essays. Their study revealed that interacting with ChatGPT encouraged participants to think critically about their audience. Additionally, AlAfnan et al. (2023) offered insights into how AI can support various stages of the writing process, including brainstorming, drafting, and editing.

On the other hand, some studies underscore concerns about AI’s potential to undermine students’ writing abilities. For example, Fyfe (2023) surveyed students who used ChatGPT for assignments and found that more than half felt as though they were cheating, as the tool provided ideas or generated essays that felt inauthentic. Fyfe (2023) also noted that reliance on ChatGPT could hinder students’ abilities to synthesize, summarize, and critically engage with the literature—skills that are fundamental to writing courses. Vetter et al. (2024) proposed an ethical framework for integrating AI into student writing. They emphasized the need for students to critically analyze AI-generated content, as tools like ChatGPT can produce fabricated references. They also argued for fostering student agency, cautioning that over-reliance on AI-generated text could erode students’ confidence in their own writing. Finally, Lingard (2023) warned that ChatGPT might disrupt traditional writing practices by failing to properly cite sources and, in some cases, generating fake citations, further complicating its ethical use in academic contexts. These studies helped us to think more about the content of our workshop. 

In the second stage, we explored various generative AI tools, such as ChatGPT and Claude, to assess their capabilities in supporting writing tasks. During this process, we identified both useful features and notable limitations of these tools. One striking example occurred when we asked ChatGPT to define the concept of “writing transfer.” While it provided a clear and concise definition, it attributed the information to a fabricated source, highlighting a significant ethical and practical concern regarding the reliability of AI-generated content. On the other hand, ChatGPT demonstrated strong capabilities in formatting and correcting citation styles, making it a helpful tool for ensuring adherence to academic standards in referencing. Additionally, we observed that engaging in conversational interactions with ChatGPT and Claude could stimulate deeper thinking about how to approach specific writing tasks. By asking questions or presenting prompts, users could refine their ideas, explore alternative perspectives, and develop strategies for structuring their work. 

This dynamic conversational feature reflects how AI tools can act as virtual brainstorming partners, encouraging critical thinking and enhancing the writing process. Given that writing is often considered a social activity, these AI tools can also serve as effective collaborators. Chatbots can help generate ideas, refine arguments, or provide feedback, offering valuable support for writers at different stages of their work. However, their role should complement, not replace, human interaction and judgment to ensure the authenticity and ethical integrity of the writing process. To prepare the feedback-comparison activity, we agreed to use a sample paper written by an undergraduate student and to divide tutors into two groups: one group would use AI to develop feedback, whereas the other would provide feedback on the sample paper themselves. We contacted the undergraduate and asked for their permission to use their paper for training purposes, and they consented by email.

Finally, we gathered and examined AI policies to gain a comprehensive understanding of how writing centers and academic institutions are addressing the use of AI in writing. This involved reviewing policy documents from multiple sources, including universities, professional organizations, and writing support centers, to identify common approaches, best practices, and areas of concern. We discovered that the policies varied significantly in their approaches to AI usage. Some policies permitted students to use AI tools without providing any specific guidelines or ethical frameworks, leaving the responsibility largely to the user. Others took a stricter stance, outright prohibiting the use of AI in academic work. Meanwhile, some policies adopted a balanced approach, allowing AI to serve as a collaborative partner in the writing process but emphasizing that it should not replace the human writer’s role.

Workshop Part I. Discussion

The workshop started with a discussion of what tutors observe with regard to AI in the writing center and the challenges AI creates. We began by revisiting the incident in which a student came to the writing center seeking help to make an AI-generated paper sound more “human.” This situation sparked a variety of responses among the tutors. Some expressed concerns about addressing the challenges posed by AI-generated work, viewing it as a potential detriment to the student’s writing development. They emphasized how relying on AI could hinder the cultivation of critical writing skills. Others suggested that the issue should be escalated to the administration, as the use of AI in this context might violate Indiana University of Pennsylvania’s academic integrity policy. Meanwhile, some tutors admitted uncertainty about how to handle such a situation, highlighting the need for clearer guidance and institutional policies on the use of AI in writing. We then asked tutors whether they use AI themselves. Some stated that they use AI for academic purposes, while others generate artistic projects with its help. This open discussion helped to identify tutors’ personal experiences with AI as well as their attitudes toward those tools.

Workshop Part II. Lecture and Discussion

After the discussion, I provided information on AI, the release of ChatGPT, different stages of AI development, and types of AI, underlining that AI has existed for a long time; however, the release of ChatGPT sparked interest in and discussion about the use of AI in academic fields. Tutors seemed eager to learn about the different stages of AI development and the types of AI most frequently used in education.

We also discussed the affordances of AI use, such as stimulating thought, overcoming psychological obstacles during the brainstorming, proofreading, and revision stages of working with texts, and supporting creative multimodal projects that help students draw on a variety of modes beyond written text. Since the Kathleen Jones White Writing Center serves a large number of international, multilingual, and ESL students, we also reflected on the affordances of AI use for those writers. We specifically discussed how tools such as ChatGPT can help ESL students create translations and explanations in multiple languages while also creating space for developing linguistic expertise in those languages.

Challenges of Using AI Tools

Along with the affordances, we focused on the challenges of using AI tools; specifically, we were interested in what types of challenges AI creates for writing center tutors. We discussed ways to work with students submitting AI-generated texts, how AI impacts students’ writing skills, and how students may fail to see the importance of writing when they replace it with ChatGPT. Tutors actively engaged in the discussion and projected these situations onto their own appointments. They also added that some issues connected with AI might stem from the standardized version of English that those tools usually promote. Tutors agreed that AI language can marginalize different dialects and expressed interest in embracing international students’ diverse linguistic experiences during their tutoring sessions. One more challenge we highlighted during our discussion is unequal access to the most advanced AI tools. The issue tutors discussed is that some students and institutions can purchase more sophisticated versions of the technology than others, which creates societal inequality. This concern, articulated by the MLA-CCCC Joint Task Force on Writing and AI (2023), provoked contradictory opinions among writing center tutors. While several tutors agreed that inequality exists in academia, others argued that everyone should have equal access to educational tools, including AI.

Ethical Considerations of Using ChatGPT and Other AI Tools

The next topic we discussed was the ethical considerations of using ChatGPT and other AI tools. This topic was important for the following reasons: (1) tutors need to verify texts submitted to the writing center when they suspect the writing was generated by AI; (2) writing centers need to explain to students that sharing personal or private information with AI tools inappropriately can put that information at risk; and (3) tutors need to understand the long-term impacts of AI-generated content and be able to explain them to students.

The discussion of the ethical considerations of using AI started with the idea that “while ChatGPT can generate text that appears coherent and well-written, it may not always provide accurate or reliable information.” Tutors discussed ways to explain to students how they can verify the information provided by ChatGPT so that their texts are accurate and well-informed. We also mentioned that the ethical considerations of AI tools might be crucial for first-year composition classrooms, where students learn that asking ChatGPT to write a paper is similar to asking someone else to write it, which violates academic integrity (Cummings et al., 2024).

Continuing the conversation about the ethical considerations of AI, we also discussed how the disclosure of sensitive information can compromise students’ privacy, since such information may be accessed by other users. Writing center staff discussed ways to deliver this information to students as well as ways tutors can protect students from sharing their personal data. We further focused on the exclusion or discrimination of certain groups of people. For instance, when students use AI tools, they need to ensure that those tools do not discriminate against marginalized groups. We discussed how ChatGPT can exclude individuals with disabilities and what accessibility features need to be considered while working with those populations.

The last part of our discussion focused on the “potential long-term impacts” of AI adoption on society, namely, “employment, education, and communication.” Tutors discussed ways to explain to students their responsibility in this matter and to advocate for the responsible use of AI.

Workshop Part III. Activity—ChatGPT Feedback

The third part of the workshop was devoted to an activity that aimed to develop tutors’ professional skills and demonstrate the difference between human and AI-generated feedback on a student’s paper. For this activity, I used a sample submitted to the writing center by an undergraduate student for asynchronous feedback. I divided tutors into two groups: those who would provide their feedback manually and those who would provide feedback generated by ChatGPT. The first group understood the task and began working on it right away, since providing meaningful feedback manually takes more time. The second group followed these steps to make sure everyone was on the same page: (1) tutors opened a web browser and went to chat.openai.com; (2) they clicked log in or, if it was their first time using ChatGPT, selected sign up and registered an account; (3) in the message box, they entered their prompt, “Can you please give me feedback on this…”, along with the sample paper submitted by the student; (4) they waited for ChatGPT to finish generating a response; and (5) after reviewing the response, they copied and pasted it into their document.
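For readers who would rather script this step than work through the web interface, the same feedback request can be made programmatically. The sketch below is only an illustration, assuming the official openai Python package, an OPENAI_API_KEY environment variable, a placeholder model name, and a hypothetical sample_paper.txt file; the tutors in our workshop used chat.openai.com directly.

```python
# Minimal sketch (not part of the original workshop): requesting feedback on a
# student paper through the OpenAI API instead of the chat.openai.com interface.
# Assumes the official `openai` package (v1+) and an OPENAI_API_KEY environment
# variable; the model name and file path below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def get_ai_feedback(paper_text: str) -> str:
    """Send a student's draft to the chat model and return its feedback."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": "You are a writing center tutor giving feedback on a student draft.",
            },
            {
                "role": "user",
                "content": f"Can you please give me feedback on this paper?\n\n{paper_text}",
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Hypothetical file containing the sample paper used in the activity.
    with open("sample_paper.txt", encoding="utf-8") as f:
        print(get_ai_feedback(f.read()))
```

Because the prompt mirrors the one tutors typed into the chat window, such a script would return the same kind of feedback the second group collected, only in a repeatable form.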

Once both groups finished their tasks, they shared their feedback. Tutors were asked to keep the following question in mind while sharing: What is the difference between the feedback you would give as a tutor and the feedback given by ChatGPT? They were also asked to think about (1) how the feedback contributes to building a better writer and (2) how it would contribute to building the writer’s confidence.

The results we received were striking. Tutors found that AI focused exclusively on the written text and offered revisions or edits without engaging with the writer’s thought process or providing guidance for improvement. The feedback was primarily focused on surface-level corrections, such as grammar, punctuation, and style, but lacked the depth needed to foster the writer’s development. Additionally, ChatGPT’s feedback did not offer opportunities for the writer to engage with or reflect on their work, which is a key part of the learning process. Human feedback, on the other hand, encouraged dialogue and offered constructive suggestions that allowed the writer to grow and understand the reasoning behind the changes. Tutors also noted that the AI’s feedback often lacked empathy and did not account for the emotional aspect of writing, which human tutors typically address by acknowledging the writer’s effort and offering positive reinforcement.

In contrast, human feedback fostered a sense of collaboration, enabling the writer to take ownership of their work and build confidence in their writing abilities. This highlighted a significant difference in approach: AI, despite its efficiency, fell short in terms of personal connection and growth-oriented support. The main takeaway of this activity was that AI will not replace writing center tutors.

Workshop Part IV: Designing AI Policy

Finally, we showed the tutors various AI policies from different writing centers and asked them to analyze and discuss the policies in pairs, focusing on the pros and cons of each. Tutors were then asked to complete a Google form proposing a possible policy with regard to students bringing AI-generated papers to the writing center. In that policy design activity, tutors highlighted the following:

    • If a paper is strongly suspected or confirmed to be AI-generated, asynchronous appointments should not be used, so that tutors can have a dialogue with the student.
    • Tutors should always remind students of IUP’s AI policy and the consequences of violating academic integrity.
    • Have students pick out pieces of the AI-generated text that they like, then build their new piece of writing on those pieces.
    • Have face-to-face conversations encouraging the student to use AI only for idea generation, not for writing their papers.
    • Be supportive and non-judgmental instead of refusing the session.
    • Employ strategies similar to those used when working with a student who has plagiarized.
    • Be particularly careful when working with international students, especially because of differences in views and understandings of academic integrity.
    • Tutors do not have the right to use ChatGPT to give feedback on a writer’s work without the writer’s consent.

In addition, tutors agreed to revise the Writing Center’s previous AI policy. The revised policy reads: “Students cannot simply generate work using AI and submit it as their own. However, they are allowed to use AI at any stage of the writing process to assist with brainstorming, drafting, editing, or other tasks. It is important that students explicitly acknowledge their use of AI, ensuring transparency and maintaining academic integrity. This approach allows students to benefit from AI tools while still ensuring that the work reflects their own understanding and effort.”

Future Workshops

The purpose of the workshop was to demonstrate the importance of creating AI policies for the writing center to ensure inclusion and make tutors aware of AI integration into the academic world. Tutors had lively discussions about their personal use of AI, their experiences with students using AI, and challenges they experience while tutoring those students. Tutors learned how AI can be used in academic settings and what the affordances and risks of using AI are. 

The described workshop can be implemented in any writing center that focuses on AI policies to teach writing center tutors and staff to work with students who use AI to write their papers, help them integrate AI into their work in meaningful ways, and provide support for both students and tutors.

In addition, this workshop raised several questions about the role of AI in writing centers, such as:

    • How can writing centers ensure that AI tools are used ethically and responsibly by students?
    • What are the potential impacts of AI on the development of students’ writing skills over time?
    • How can tutors balance the use of AI in the writing process without compromising student learning and growth?
    • What kind of training and resources are needed for tutors to effectively integrate AI in writing instruction?
    • How can writing centers maintain academic integrity while allowing students to use AI tools for support?

Workshop Benefits

Along with the opportunity to share ideas and thoughts about AI in writing and composition classrooms, implementing this workshop in writing centers offers the following benefits:

    • This workshop offers insights and tips to leverage AI in the academic world, including exploring AI to provide professional feedback. 
    • This workshop considers the benefits and drawbacks of using AI to help tutors realize its risks and challenges along with the affordances. 
    • This workshop highlights the differences between human and AI feedback, which creates space for AI integration into writing center sessions.
    • This workshop teaches tutors to support student writers and help them become more knowledgeable about AI use by discussing ethical considerations.
    • This workshop contains a practical part that promotes discussion about the use of ChatGPT feedback. 

Thus, other writing centers would benefit from running a similar workshop situated in their institutional context. It would provide tutors and writing center staff with the opportunity to discuss their own fears and concerns regarding AI and also share their positive experiences with AI that can be applied during writing center sessions. One more advantage of this workshop is that it can be modified to be conducted in various writing centers on specific AI topics to provide support for tutors and students. 

References

AlAfnan, M. A., Dishari, S., Jovic, M., & Lomidze, K. (2023). ChatGPT as an educational tool: Opportunities, challenges, and recommendations for communication, business writing, and composition courses. Journal of Artificial Intelligence and Technology. https://doi.org/10.37965/jait.2023.0184

Cummings, R., Monroe, S., & Watkins, M. (2024). Generative AI in first-year writing: An early analysis of affordances, limitations, and a framework for the future. Computers and Composition, 71. https://doi.org/10.1016/j.compcom.2024.102827

Daniel, S., Pacheco, M., Smith, B., Burriss, S., & Hundley, M. (2023). Cultivating writerly virtues: Critical human elements of multimodal writing in the age of artificial intelligence. Journal of Adolescent & Adult Literacy, 67, 32–38. https://doi.org/10.1002/jaal.1298

Godwin-Jones, R. (2022). Partnering with AI: Intelligent writing assistance and instructed language learning. Language Learning & Technology, 26(2), 5–24.

Kleiman, G., & GPT-3. (2022, August 12). AI in writing class: Editor, co-author, ghostwriter, or muse? Medium. https://medium.com/@glenn_kleiman/ai-in-writing-class-editor-co-author-ghostwriter-or-muse-348532d896a6

Lingard, L. (2023). Writing with ChatGPT: An illustration of its capacity, limitations, and implications for academic writers. Perspectives on Medical Education, 12(1), 261–270. https://doi.org/10.5334/pme.1072

Vetter, M. A., Lucia, B., Jiang, J., & Othman, M. (2024). Towards a framework for local interrogation of AI ethics: A case study on text generators, academic integrity, and composing with ChatGPT. Computers and Composition, 71. https://doi.org/10.1016/j.compcom.2024.102831

MLA-CCCC Joint Task Force on Writing and AI (2023). https://aiandwriting.hcommons.org

Mollick, E., & Mollick, L. (2023). Assigning AI: Seven approaches for students, with prompts. arXiv preprint arXiv:2306.10052.

Fyfe, P. (2023). How to cheat on your final paper: Assigning AI for student writing. AI & Society. https://doi.org/10.1007/s00146-022-01397-z

Puentedura, R. R. (2013, May 29). SAMR: Moving from enhancement to transformation [Web log post]. Retrieved from http://www.hippasus.com/rrpweblog/archives/000095.html

Ranade, N., & Eyman, D. (2024). Introduction: Composing with generative AI. Computers and Composition, 71. https://doi.org/10.1016/j.compcom.2024.102834

Tsufim, F., & Pomerleau, L. (2023). More is less?: Using generative AI for idea generation and diversification in early writing processes. In Teaching and Generative AI. https://uen.pressbooks.pub/teachingandgenerativeai/chapter/more-is-less-using-generative-ai-for-idea-generation-and-diversification-in-early-writing-processes/
