Commentary: We must become better writers in an age of ChatGPT

Using generative AI tools to string sentences together, whether for school or work, would be silencing your own voice, says Edson C Tandoc Jr of NTU Wee Kim Wee School of Communication and Information.

Relying on generative AI alone deprives us of opportunities to learn and improve. (Photo: iStock/fizkes)

SINGAPORE: Last semester, I taught a writing class at the Wee Kim Wee School of Communication and Information. The module included peer review sessions, where students read anonymised copies of their classmates’ writing assignments and gave comments.

For one workshop, I secretly included a feature article that no one in the class had written. Upon reading it, many students said the writing was decent, the flow was logical, and the copy was free of grammatical errors. 

But one said that it was too dry. Another said she wished the writing were more personal. Someone said that it “didn’t have a soul”.

It was an article written by ChatGPT, a generative artificial intelligence (AI) chatbot.

Hitting 100 million monthly active users only two months after it was launched, ChatGPT is considered one of the fastest-growing online platforms in history. The chatbot can be prompted to generate original texts in various formats, such as poems, essays, or even news and feature articles.

Like other language models, the chatbot can generate coherent and grammatically correct sentences in only a matter of seconds. It can also be prompted to write in a particular style, even that of a particular author.

But it also raises a lot of questions. For me, one concern is its impact on our understanding of original writing. If I had used ChatGPT to generate this commentary and prompted it to write using my style, would that count as my own writing? 


In February, the Centre for Information Integrity and the Internet (IN-cube) at Nanyang Technological University conducted an online survey and found that of 779 adult participants, about 46 per cent said they had used ChatGPT at least once in the past month, and nearly 15 per cent had used it more than five times.

Younger participants used it more often. Among the Gen Z respondents (21-25 years old), about 68 per cent had used it. Compare that with 52 per cent of millennials (26-41), 34 per cent of Gen Xers (42-57), and only 7 per cent of boomers (58-76).

Most of those who had used the chatbot had positive attitudes toward it, with about 75 per cent agreeing that ChatGPT was “helpful” and about 70 per cent agreeing that it saved time.

Indeed, ChatGPT can churn out writing in a fraction of the time a human needs. The assignment where I snuck in a ChatGPT-generated article required my students to visit Fort Siloso in Sentosa and write about their experience. 

While my students had one week to visit Fort Siloso and pen an 800-word article about it, ChatGPT did so in less than three minutes.

Others have argued that AI-powered writing tools can help users in several ways. They can check grammar and tackle writer’s block by generating an outline for an article. Outsourcing these tasks to AI allows writers to focus on the substance and content of their arguments or stories.


In addition to the Fort Siloso article that my students critiqued as impersonal, I also asked ChatGPT to write another version using a first-person perspective. It was similarly logical and decently written, but more engaging.

However, the article it generated was dishonest. In one paragraph, it wrote: “Upon entering the fort, I was greeted by a knowledgeable guide who gave me a brief overview of the fort's history.” How can an AI programme write about a first-hand experience it never had, describing sights it never saw?

A few of my students also pointed out factual mistakes. Indeed, when my colleagues tested ChatGPT by asking it to write short bios of faculty members, many details included were wrong.

When I prompted it to write a literature review and to include a reference list, it listed references that I could not find, and cited articles for information not discussed in them.

If I had used ChatGPT to generate this commentary and it included factual errors, who should be held accountable for any mistakes?

Bylines are not just about acknowledging the efforts and unique voices of authors - they are also about accountability. They identify the writer responsible for what is communicated in the article.


The popularity of ChatGPT and other AI chatbots isn’t surprising in a period of information overload. They can help us synthesise, organise and simplify data points into digestible and meaningful chunks.

However, ChatGPT presents a challenge for someone like me, who teaches a writing class where many assignments are done out of class. Tools that detect AI-written articles are available, though still in their infancy. The bigger challenge is getting my students to value originality and creativity, so that they choose to hone their writing skills and not take shortcuts. 

Plagiarism has been around since well before the rise of generative AI. In recent years, our faculty have come across websites offering to write essays or dissertations for a fee. But when we take these shortcuts to generate articles instead of writing our own work, we are ultimately short-changing ourselves.

I encourage students to find their own voice and discover their own style. In the process of doing so, they will feel frustrated, as distinguishing ourselves from other writers is not easy.

But doing so has never been this important, at a time when almost everyone’s voice can be heard or read online, and when almost everything is templatised. Originality is what will help our writing stand out from all the noise.



Generative AI may produce articles with perfect grammar, but it is through struggles, challenges and imperfections that writers learn to be better.

Students may not produce the best copy the first few times, and that’s fine. They can use ChatGPT to help them proofread sentences they’ve written, learning from the errors it may flag or correct. They can even use it for comparison, like what we’ve done in our workshops, to learn what makes for good, compelling writing.

But relying on generative AI alone to string sentences together - whether for a university writing assignment, a freelance gig or a PR campaign later on in professional life - deprives us of opportunities to learn and improve. It silences our own voices.

The technology is still evolving and will continue to improve. It is important that we continue to reflect on the extent to which we will allow these tools to change us.

Edson C Tandoc Jr is an Associate Professor and Associate Chair at the NTU Wee Kim Wee School of Communication and Information and the Director of the Centre for Information Integrity and the Internet (IN-cube).

Source: CNA/fl

