Reflections on ChatGPT
Chris Ivey / Insights
03 February 2023
Dear St Andrew’s community,
Over the past couple of weeks, there have been a multitude of news articles and opinion pieces written about ChatGPT and the impact it is going to have on us all. If you haven’t heard the news, here’s a simple summary from Tim Dodd, a commentator at The Australian.
For the first time we have a free, easily accessed AI bot which has close to human communication capacities, as well as the ability to mimic typically human capacities – such as essay writing, coding, joke telling, exam sitting and yarn spinning – in the blink of an eye. If that weren’t enough, it also has access to virtually all human knowledge.
ChatGPT and other similar products pose several profound challenges and opportunities for those of us involved in education. One is that ChatGPT can credibly do most assignments given to students – whether research tasks, essays or exams. Another is that they are a superb source of knowledge. Any student who decides to use an AI bot to learn rather than cheat immediately finds they have a free, and in most cases entirely reliable, tutor always at their elbow, capable of giving a coherent explanation of almost anything. The problem we now have in schools and universities is to decide what types of learning are relevant and how we assess in this new world. We should expect many things taught now to lose relevance while new fields appear. For example, one area of intense interest will be taming AI, stopping it from being used in ways that are unacceptable or unethical.
There have been various responses to the launch, from acceptance and embracing change right through to banning, which is what both the NSW and QLD state education departments have done. However, for many of us, it is still a cautious ‘watch and see’ as we ascertain how best to manage this change. It is best to ‘hasten slowly’, as the saying goes. I thought some simple reflections from me might be helpful, based on what I’ve read and discussed with colleagues both at St Andrews and across Australia:
- Good teachers can spot plagiarised work. In addition, in many subjects in a face-to-face school, the drafting process begins in the classroom, often quite rough and needing refinement. If a final submission were quite different from the draft worked on in class, it would raise a red flag.
- The program is somewhat limited when it comes to higher-level critical analysis or evaluation.
- ChatGPT does, however, have real potential to reduce teacher workload by quickly producing draft worksheets, quizzes and simpler tasks that establish the foundations of knowledge. Greater efficiency here may free teachers to apply their expertise to one-on-one assistance with students and to higher-level tasks.
- It is going to challenge educators to consider the style of assessment tasks they set. Any assessment that a student completes independently at home is now going to be compromised.
I think the biggest reassurance needed for our parent community is that we take a multi-modal approach to teaching and value the building of critical and creative thinking skills, so their child will not be able to 'game' the system and get ChatGPT to do all their work and cruise through school. We know that this would not benefit them in the long run.
The hard reality is we can’t stop this, and even if we could, or if we decided to ban it, that would be merely a temporary solution. There will be something else around the corner, so burying our heads in the sand is a little futile. Our job as educators is now going to include teaching students how to use it wisely, knowing they’re going to use AI in the workplace anyway. As one writer put it, “It’s like having a driving school but teaching people how to ride horses.”

Many may remember the introduction of calculators in the mid-1970s. I was in early primary school, but at the time calculators were viewed negatively, as something that would stop students thinking. What they did do, however, was free maths students from spending time on ‘easy’ calculations, which in turn allowed teachers to teach the ideas and methods, and how best to use a calculator, then move on to set more complicated assignments. Similarly, much of the current media discussion of AI and assessment has a negative bias; however, just as with calculators all those years ago, we will be able to leverage AI for learning, and it will be powerful.
The rationale in our Academic Integrity Policy remains consistent:
‘In stating its desire to develop graduates who possess integrity and a sense of personal integrity and accountability, the College values academic integrity to ensure that the academic achievements of its students are earned honestly and are trusted and valued by the student body, the broader College community and the educational community beyond the College.’
So at this point, my thinking is that in the short term we will need to update our policies to reflect the use of AI as part of our approach to managing plagiarism, and to modify assessments to ensure they remain valid and reliable indicators of student learning. It is also important to begin carefully identifying the benefits of AI in the learning process and integrating them into our teaching practices.
Change is good. It forces us to think, to reflect and to ask questions. As Mr Moller, our Director of Knowledge Services, quipped this week, technology can’t replace teachers, but if it can, it should! Of course, we acknowledge that all change requires us to consider how best to support the integrity and accountability of student learning.