AI tools in writing: A blessing for students or a blind spot?

Introduction

With AI-powered writing assistance becoming an everyday companion in higher education, one cannot help but wonder: are we conditioning learners to think more critically, or merely outsourcing their cognitive processes?

What once constituted light-touch support through Grammarly has now evolved into all-inclusive authorship delegation through QuillBot and ChatGPT. These tools are reshaping not only how students write but also how they learn, interact, and understand their relationship with knowledge. As someone who researches the teaching of digital literacies across engineering and technology, I have observed both the potential benefits and the drawbacks of these changes. I believe we are reaching an inflexion point that demands critical pedagogic consideration.


Generative AI and the rise of ready-made writing

It begins harmlessly enough, with a piece of software like Grammarly that corrects grammar, refines tone and enhances fluency. At best, this kind of support increases learners’ confidence (Younis et al., 2023). Such tools are reactive rather than generative.

QuillBot moves the needle further. It offers AI-driven summarising and paraphrasing, features most used by second-language learners who struggle to express complex academic ideas accurately. According to a recent multi-country survey, QuillBot is the second most widely used AI writing assistant after Grammarly (Ghafar et al., 2023), mainly in English as a Foreign Language (EFL) and English as a Second Language (ESL) contexts.

Then there is ChatGPT, which is capable not only of correcting grammar and paraphrasing text but also of creating whole assignments. With a single prompt, a student can have a ready-made answer on their mobile phone, complete with references (authentic or otherwise), proper formatting and a scholarly tone. This functionality has led many to use it as a means of producing work rather than learning (Raheem et al., 2023).


Fluent yet flawed: AI and the erosion of knowledge creation

In a digital systems module that I recently taught, one of the students submitted a report on electromagnetic (EM) sensors used for monitoring steel quality. At first, the report looked smooth and consistent, perhaps even too smooth. But then there was this long subsection on Magnetic Resonance Imaging (MRI) sitting smack in the middle. Up came the alarm bells.

The truth came to light in a short conversation: the student had turned to ChatGPT for help. The AI had ‘hallucinated’, producing a section that, while it sounded plausible, was entirely incorrect, conflating EM sensing in metals with MRI and presenting the latter as an example. Lacking the disciplinary insight to see how ill-suited this was, the student included it wholesale.


This is the situation described in previous weeks, where the AI tool not only gets things wrong but does so in a way that is easy to believe. Setiana and Chamalah (2025) argue that the fluency of AI output creates an illusion of scholarly security, particularly for novice writers.


It is no longer a question of if, but how, AI tools are used. Well over half (62%) of UK students have used generative AI tools to some extent, yet just 26% are confident that the output produced is correct or appropriate (JISC, 2023). This divide between fluency and critical literacy is arguably the most significant threat to meaningful learning within further and higher education.


In our Reading Club, this disconnect arose directly: are students learning with AI, or simply surrendering to it? As Al-Shaboul et al. (2024) point out, learners increasingly see writing not as an act of creating knowledge but as a problem to be solved with the help of AI. We cannot ignore that as a pedagogic shift.

The promise and peril of AI writing assistance

Should we therefore do away with the very affordances these tools offer? AI writing assistance can help students with dyslexia, ADHD and other forms of neurodivergence express themselves more articulately. It can assist international students who struggle with conventional academic writing standards. It can reduce writing anxiety (Aljuaid, 2024).


Nevertheless, blanket dependency carries liabilities. ChatGPT may encourage surface learning, raise the risk of plagiarism-by-paraphrasing, and restrict opportunities to build disciplinary identity through a genuine voice (Demirel, 2024).


Building critical AI literacy in higher education

Educational institutions present a different story altogether. Some have adopted AI detection software, while others have opted for blanket bans. However, there has been little effort to collaboratively develop AI usage policies with students that are inclusive, ethical and context-specific (Zulfa et al., 2023). The resulting confusion rewards those who go undetected while penalising those who transparently acknowledge their use of AI.


We can take a different, more realistic view: an evolutionary one, not a backward one. Instead of fearing AI, we need to make critical AI literacy part of the curriculum. Here is what that might look like.

AI as a co-author, not a back-end author. We can model responsible use of AI in the classroom and teach students how to use AI for brainstorming, critical thinking and editing.

Transparency about AI policies. Students want clarity about which resources they may use, so that they can acknowledge AI use without fear of retaliation. Policies and regulations, from national bodies down to individual universities, should include AI guidelines that support academic integrity unambiguously.

Critical thinking as the new foundation. Now that AI can do the writing, we need to rethink our approach. Educators must foreground metacognitive reflection, evaluative reasoning and curiosity: the things AI cannot replicate.


In this exciting new world of generative AI, we owe it to ourselves and our students to remember that writing is not just a product. It is a way of thinking, a space for critique, a means of developing identity and achieving purpose. These tools will continue to evolve: ChatGPT will become more accurate, QuillBot will introduce more nuanced paraphrasing, and Grammarly will offer ever more user-friendly features. The central question is whether students are still able to think critically about what they are doing. This poses a significant challenge for us as educators, and it requires our immediate attention and action.

