My experience with AI isn’t limited, but I also don’t think it is what it could be. I think that when utilized correctly, as Ilda and Ayodele discussed, there are so many advantages to be had with it. However, as we are still in the very early days, there has also been a lot of abuse of it.
In 2022 I was teaching Social Studies 30. We had an ongoing dialectic essay throughout the semester. We had certain criteria that needed to be met, like hand-written jot notes, a rough draft with edits, a good copy, and a resource list. The paper discussed modern issues in Canadian society where a student would have to argue the positions of both sides of their issue (three points for each side), and then offer their own perspective and a possible solution. A well-done essay totaled about 6–7 pages on average. I thought something of this nature would avoid the AI craze. There was so much personalization and a clear process to follow that it would be blatantly obvious if a student did use AI. To my pleasure, but also to my dismay, I wasn’t wrong.
One student handed in a single paragraph about how AI was going to take over the education system. I don’t know if he was trying to be meta or what with this topic. There were zero spelling errors, the information wasn’t cited, and it didn’t argue both sides. I failed them. I had grounds to stand on, of course, because it didn’t align with our rubric, but I was also uncomfortable with the fact that the student had tried to cheat by using AI. How could I prove it, though? I ran it through a couple of different AI checkers, but I didn’t have as stable a ground to stand on when it came to these. They weren’t regulated in any manner. I took it to my vice-principal and we discussed it together.
We compared it to his old writing and it didn’t line up, but the only real way we could get to the bottom of it was by questioning the student. The student was going to fail either way; this would decide whether or not they would be allowed the chance to credit complete. The VP pulled him into his office and the student admitted to using ChatGPT. The only real way, based on our current policies, to “get” the student was through an admission of guilt. The student said that everything they had read showed that they couldn’t get caught, and he had no idea how we knew. We had to explain the process of how we figured it out, and it became more obvious to them.
I feel we are potentially moving toward a better place now, but this has been my experience over the last two years: students using it improperly and not understanding that what the program generated was not adequate to hand in. I am a firm believer that AI can be a great tool to use in the classroom, but I don’t think we have done a great job, as a whole, of educating students on how to use it. Some teachers do a fantastic job of it, while others avoid it like the plague.
If we want to use it for good in the future, we certainly need to be addressing it to avoid situations like this. I think there will always be students who try to take the easy way out and will abuse it, but I always end up with at least one kid who tries to take the easy way out in other classes without AI. If we can mitigate the overall numbers and teach the majority of students to use it properly, I think we can lower the number of instances like my example above. I still believe there will always be someone who abuses it, though. I don’t know if that can be avoided.
One topic I liked from the debate was that some teachers are worried about ChatGPT and other AIs because they wreck the only way of assessing or teaching they have ever known. Too many teachers are heavily reliant on essays or a formal piece of writing. I do think this issue of AI misuse is a good thing in a way, because it forces some of those teachers to look inward at their own practices. Maybe there are other ways of doing things? Maybe we should have been doing those things all along?
This one was a lot more personal because of my example above. I feel like I went on a rant, but I wanted to showcase my own experience.
Hetterley
Thanks for sharing your personal experience, Greg. I know we have a lot of generative AI tools out there, and students are misusing them. I agree with you on training students to use these tools effectively. Students need to learn how to critically analyze responses from these tools and apply them to their own context. Personally, the most useful of these AI tools are those focused on teaching science lessons, because they actually help students visualize abstract and theoretical concepts. In terms of assessment, teachers need to really improve their assessment techniques. As teachers, we just need to develop ourselves and stay aligned with the emerging tools used in the education sector.
That is some story with your student, Greg! I agree that sometimes it just looks perfect, and when you read it, you just know that this could not have been written by this person! AI can be so deceptive at times. Using AI these days can help, but I do not have much experience with it!