AI in education – Staff and Student Perspectives 

In another post titled ‘AI in education – my own experiences’, I mentioned that I gave a staff presentation on the future of AI in education. In that post, I gave a brief summary of the purpose of the presentation: that we need to educate our students about AI technologies and incorporate them into our teaching and learning activities to better prepare our students for the future.

In this post, I want to go into more detail about that presentation, and also talk about the very different experiences and discussions I had when presenting on plagiarism and AI to the year 10 cohort.

In my staff presentation, I first gave a brief overview of what AI technologies are, some of their applications, and a simple, straightforward explanation of how they work. Some staff are already very familiar with these generative AI tools and utilise them in their own teaching and learning; others had mentioned to me previously that they had no idea how these tools worked or what their potential applications were.

I then briefly showed some excerpts from the plagiarised book reviews I had received, to illustrate the algorithmic thinking behind these tools and how their output doesn’t necessarily sound like student writing. I come to my TL career with experience as an English teacher and an EAL/D teacher, so I acknowledge that I have a somewhat better chance of detecting the difference between student writing and AI-generated text than teachers in non-writing-based KLAs. However, as students move into higher grades, their writing becomes more sophisticated and the abilities of these tools grow, so I expect I’ll have an increasingly hard time picking up plagiarism.

My school does not subscribe to plagiarism detection services such as Turnitin, and I know that these subscriptions can often be quite expensive for schools. Staff at the school have been under the assumption that Google’s plagiarism detection feature offered through Google Classroom is an adequate free solution to this growing dilemma. Before the presentation, one of the deputies and I set up a kind of experiment to test the plagiarism detection abilities of Google Classroom. I set up several ‘assignments’ through Google Classroom, across a range of KLAs, directly copying and pasting sections of the Australian Curriculum dot points for Stage 5 to use as the basis for the assignment questions. The deputy then ‘completed’ the assignments using ChatGPT and submitted them to the Google Classroom. I turned on ‘originality reports’ and reviewed the results.

Firstly, it is important to note that Google Classroom’s plagiarism detection feature, called ‘originality reports’, can only be used for five assignments per class. After that, schools must purchase the Teaching and Learning Upgrade.

The results of our plagiarism experiment were clear: not only did Google Classroom’s originality reports fail to detect a single instance of plagiarism in entirely AI-generated content copied and pasted directly from ChatGPT, they also failed to identify that I had copied and pasted the curriculum dot points from ACARA’s website to form the questions.

Now, I cannot speak for paid subscription tools such as Turnitin. However, regardless of the tool, the takeaway for me is clear: no matter how good a detection tool is or claims to be, it will always be one step behind the capabilities of these generative AI tools.

To account for the possibility that real students might be ‘smarter’ about the way they use these generative AI tools, putting more effort into camouflaging the output as their own work, I also showed originality reports from real student assignments. The result was the same: no plagiarism was detected.

I also spoke about how traditional source evaluation skills, such as the ‘CRAAP’ test, are less effective when it comes to generative AI tools, and how we can instead be giving students skills such as algorithmic literacy, alongside search skills such as how to use academic databases (Oddone, 2023, para. 7).

It is far more important, and a far better use of time and resources, to teach students about the ethical considerations and potential limitations of these tools, as well as the advantages they provide and how to use them effectively in their learning.

With that in mind, I want to next talk about the discussions I had with the year 10 cohort about academic integrity. 

As mentioned in my post ‘AI in education – my own experiences’, I worked with the year 10 group to complete a one-hour seminar on the topics presented in the HSC: All My Own Work program modules, before the students then completed the modules individually.

Firstly, I showed the students excerpts of the school’s academic integrity policy, followed by the NSW DoE’s academic integrity policy, and then the policies of the two universities that the majority of our students go on to attend (generally speaking, 100% of our year 12 graduates will attend university). I used a PowerPoint presentation with screenshots showing the excerpts. Even though the students are in year 10 and will only start their preliminary HSC course next year, I wanted to drive home the point that they need to be aware of these policies and expectations now, so that they engage in the correct practices early on and these become second nature. Each policy, from the school level up, takes a zero-tolerance approach to plagiarism, including plagiarism through generative AI tools, and I explained the consequences for plagiarism at each level.

Then we moved on to a group activity. There are five modules, so I split each class into five groups, one for each module, to facilitate group discussions which were then shared with the class.

The first thing I noticed about the HSC: All My Own Work program is that it feels dated. This is definitely something the DoE needs to update to address the academic integrity considerations facing current students.

Each module page had key questions which formed the subsections for that module. I used these questions as the basis for my discussion questions, but I modified them because I wanted to know what students actually think. I am well aware that students know what they should say and do when it comes to best practices for academic integrity, but I wanted this to be a more honest discussion and a learning experience for me as well as for them.

These discussions led to some very interesting points made by the students. 

In module 1, I asked the students why they cheat and whether they think the consequences for cheating are enough to deter them. In each class, a similar pattern emerged: about half the class said yes, the consequences were enough to deter them from cheating, and the other half said no. Those who said no believed that cheating only affects you if you get caught, and that the main reasons for cheating (pressure to perform from peers, themselves, school and home; an overwhelming workload; and lack of effort) carried greater weight than the possibility of getting caught. Some classes suggested that we should publicly (within the school intranet) shame students who were caught plagiarising. Other students disagreed, arguing that punishing students after the fact would not stop them from doing it or provide an effective solution.

I raised some of the ‘solutions’ I had read about in my research on AI in education, such as this article, which suggests shifting to more formative assessment models, including low-stakes, multi-part projects that have the benefit of reducing student anxiety, instead of relying on high-stakes summative assessments. The feedback from the students was mixed; some said this was a good idea, whereas other students believed that low-stakes assessments are seen as ‘not important’, so they are more likely to use generative AI tools to ‘get it over with’ and spend more time on other things, whereas high-stakes summative tasks are seen as more important and carry a greater risk if they plagiarise and get caught.

A student raised a really interesting point: teachers already don’t have enough time given all of the workload expectations, so adding more marking points for each assessment means we won’t have enough time to get through it all and it will be even more on our plates. This was a really considered point of view, and one worth thinking about if we are to redesign assessments and what ‘assessment’ means; we need to think about ourselves as well as the students.

In module 3, we spoke in more detail about plagiarism through generative AI tools. I showed them The University of Sydney’s academic integrity policy and explained that, like Sydney Uni, many universities are now utilising generative AI tools for teaching and learning in some aspects of their assessments (The University of Sydney, 2023). I stressed the importance of not relying on these tools, and that we still need to reference their use. The application of these tools within an academic context was surprising to students; the majority were enthusiastic about this development and agreed that high schools should follow suit. The students were also interested when I facilitated discussions on the possible ethical and factual limitations of these tools.

When we spoke about Creative Commons and copyright laws, the students were interested and surprised to learn how these impact them directly, for example when sharing content online through platforms such as TikTok, or when using grammar checkers to check and improve the grammar in their written work. Some students commented that they did not know about any of this and had not been taught about it prior to this class.

I think the key takeaway from this is that students are excited about the prospect of utilising AI tools in their education, and that this could be a great way to make learning more engaging for them. If universities are utilising these tools, we need to prepare students for university by getting them used to these new teaching methods.

We also need greater discussion with, and input from, students about how to implement these technologies into teaching, learning and assessment in ways that are most effective according to the students themselves.

Finally, despite what I thought, and I’m sure what many teachers think, students are using these technologies without really knowing about, or being taught, the ethical considerations and possible limitations (at least according to the students in my school). These are definitely key points students need to know. As long as we make direct links to how this impacts students and their lives, I think it is something they will listen to and hopefully consider in their own future use of these tools.

 

References:

Monash Australia. (2023, August). Generative AI and assessment. Learning and Teaching: Teach HQ. Retrieved August 1, 2023, from https://www.monash.edu/learning-teaching/teachhq/Teaching-practices/artificial-intelligence/generative-ai-and-assessment 

Oddone, K. (2023). Empowering school library staff to navigate the AI frontier. Connections, Term 3, 2023(126). https://www.scisdata.com/connections/issue-126/empowering-school-library-staff-to-navigate-the-ai-frontier/ 

The University of Sydney. (2023, October). Academic integrity breaches. Retrieved November 8, 2023, from https://www.sydney.edu.au/students/academic-integrity/breaches.html

 
