Commentary: Want ChatGPT to do your homework? Learn how to use it first

If students are to use artificial intelligence tools like ChatGPT appropriately, they need proper guidelines and literacy, say NIE’s Looi Chee Kit and Wong Lung Hsiang.

File photo. The rise of generative AI presents exciting opportunities for learning and education, but it is essential to invest in AI literacy to ensure success. (Photo: iStock/Edwin Tan)

SINGAPORE: The COVID-19 pandemic accelerated technology integration in education by nudging policymakers and educators to rethink teaching and learning practices. Now, artificial intelligence (AI) chat programme ChatGPT is pushing us to see the importance of AI literacy.

The advent of ChatGPT has set off heated debate in the education community, with much discussion in media, academic and professional circles.

Its potential has also been discussed in Parliament, with Education Minister Chan Chun Sing saying in a response on Feb 7 that AI tools like ChatGPT could be useful in the classroom when used appropriately.

Yet ChatGPT is just the tip of the iceberg. Other generative AI tools can transform text to images, videos, audio, code, scientific papers and algorithms.

Educators have raised concerns that students may use AI to cheat or do their homework. These are valid concerns, but there is a bigger picture here: generative AI tools have the potential to revolutionise the way we learn and educate.

Any potential advancement, however, relies on one critical factor: AI literacy.

CONTINUED GUIDANCE NEEDED ON AI-GENERATED CONTENT FOR SCHOOLS

AI literacy involves an understanding of AI, its capabilities, limitations and potential ethical implications. This literacy is essential to make informed decisions about the appropriate use of AI (including but not restricted to generative AI tools) in education.

As AI becomes increasingly integrated into our lives and the workplace, it is crucial to invest in AI literacy education to ensure that such tools are used to their full potential. Indeed, AI Singapore, which aims to boost the development of an AI ecosystem, has been actively promoting AI literacy education.

Aside from discussions on the pros and cons of generative AI in education by the media and academia, educators who have embraced AI in their classrooms have been keenly sharing what works and what does not.

Clear directives on guidelines and boundaries will be useful for schools and tertiary institutions in responding to AI-generated content, and will empower educators to incorporate AI in their teaching.

For example, teachers and faculty can be encouraged to use these tools to support their professional practices, but they should not rely on them entirely and should continue to apply their expertise and judgment in core tasks such as lesson preparation and assessment design.

It is also important to ensure that the use of such tools does not undermine the development of basic skills and concepts, especially for younger students.

FROM WOW TO HOW

One of the key aspects of AI literacy is understanding the limitations of these tools. Generative AI tools such as ChatGPT are trained on vast amounts of text data, which allows them to generate human-like responses.

However, it is essential to keep in mind that these tools may not always be accurate, neutral or appropriate. The model learns from the biases present in its training data, and these biases can sometimes lead to unintended consequences.

The Dunning-Kruger effect, a cognitive bias in which people with limited knowledge of a subject overestimate their competence in it, has been used to describe the five stages a ChatGPT user goes through.

In stage one, the user feels: Wow! Amazing! ChatGPT is awesome and can answer my questions. It quickly generates detailed and consistent answers, and it behaves like a superhuman.

In stage two, the user starts to doubt and realises that ChatGPT is a language model, ultimately a statistical tool that predicts plausible-sounding answers without really understanding language.

In stage three, the user realises that ChatGPT occasionally makes mistakes or provides inaccurate information.

In stage four, the user develops a deeper understanding: ChatGPT is useful when accuracy is not essential, but when the answer has to be reliable and useful, it cannot be trusted on its own.

In stage five, the “A-Ha!” moment arrives. The user realises that the true value of ChatGPT lies in its ability to improve productivity, and that it is important to recognise when ChatGPT gets things wrong.

Students and teachers can readily move up these stages if they have some basic form of AI literacy.

At the National Institute of Education (NIE), we have started offering courses on AI literacy to teachers. The courses are intended to help teachers develop a working awareness of AI, understand the potential and limitations of its uses in education and, in particular, consider the ethical aspects of AI application in education.

Teachers link the capabilities of AI with their pedagogical knowledge and experience to design an action plan for their schools, which may include fostering AI literacy among teachers and students, and selecting AI systems for education that benefit students.

MIND THE EDUCATION GAP

Generative AI will certainly change the way students learn. But discussions on its use in education must also consider whether it will deepen the divide between students with and without access to technology, or widen the gap between students with high AI literacy and those with little.

The availability of generative AI could potentially widen the learning divide if there is unequal access to these tools or if they are only affordable for certain groups of learners. For example, if some schools or students have access to more advanced generative AI than others, they may be able to produce higher-quality work and have an advantage over their peers.

However, it is also possible that the availability of generative AI tools could help narrow the learning divide by providing new opportunities for students to learn and create.

For example, AI tools can help students who may struggle with writing to generate high-quality text or provide personalised feedback on their work. This can help level the playing field for students who may face challenges in traditional learning environments.

While AI-literate students may be able to use these tools to enhance their learning, learners with less AI literacy may be more prone to simply copying and pasting AI-generated text without fully understanding it. This can have negative consequences for their learning and academic integrity, as well as for the overall quality of education.

The rise of generative AI presents exciting opportunities for learning and education, but it is essential to invest in AI literacy to ensure success.

By understanding the capabilities, limitations and ethical implications of these tools, individuals can make informed decisions about their use in education and prepare for a future where human-AI collaboration may be the norm. This will help create a better and more equitable future for all.

Looi Chee Kit is Professor of Education at the National Institute of Education (NIE), Nanyang Technological University (NTU), and co-Director of the Centre for Research and Development in Learning, NTU. Wong Lung Hsiang is Senior Education Research Scientist at the Centre of Research in Practice and Pedagogy, NIE.

Source: CNA/aj
