All about AI

Working Smarter, Not Harder – Using AI to Automate the Mundane

“Outsource your workload, not your thinking.”

– Alec Couros

In this week’s video presentation, Dr. Alec Couros explored the use of generative A.I. in classrooms.  The presentation was quite compelling, as he demonstrated several practical applications throughout.  This seemed particularly relevant to the creation of my online course, as time always seems to be in short supply.

Does this mean that I should have used an AI chatbot to create my entire course?  Obviously, no.

First off, academic misconduct is still a thing (last time I checked) and would betray the entire enterprise of learning (which, being a teacher, I have a particular affinity for).  Secondly, Alec (for clarification: while I don’t know Dr. Couros personally, I took a class from him last semester and he was comfortable with students referring to him by his first name) was very clear that we (teachers, private citizens, corporations, or cats walking across keyboards) bear ultimate responsibility for any A.I.-derived materials that we choose to post.  Factual inaccuracies and bias cannot be eliminated from these algorithms, and any information that we distribute to our students must be thoroughly screened.  Essentially, content generated by A.I. must be adjudicated against our own knowledge base.  This means the average user cannot go into the process blind or simply trust the material (Alec cited the case of a B.C. lawyer who was caught citing non-existent case law generated by A.I.).  That said, what we can do with A.I. is automate the mundane, menial, and time-consuming tasks that take away from the real goal of teaching: building relationships with students and engaging in thoughtful and meaningful conversations.

Here are some of the examples that I felt could be applied to my mathematics course on buying and leasing vehicles:

  • Chatbots like ChatGPT can easily create multiple-choice questions when given a suitable prompt. For example, after watching Alec’s presentation I used one of the prompts listed in this Google document to create a list of multiple-choice questions on buying and leasing vehicles (a sample question is shown below).

Which of the following factors should you consider when deciding whether to buy or lease a vehicle?

  a) Expected mileage per year
  b) Current fuel prices
  c) Number of previous owners
  d) Vehicle color options

Answer: a) Expected mileage per year.

Explanation: Mileage limits are a crucial factor in leasing agreements, whereas they do not directly affect buying.

The question is reasonable, and I appreciate that the prompt cues the chatbot to include an explanation and rationale for the correct answer.  Even after accounting for the time spent verifying the accuracy of the questions, this feature would have saved me several hours.
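For anyone curious what automating this could look like at a larger scale, here is a minimal sketch in Python using the OpenAI client library. To be clear, this is not how I generated the question above (I simply pasted the prompt into ChatGPT); the model name, prompt wording, and question count below are my own assumptions for illustration only.

# A minimal sketch of generating multiple-choice questions with an LLM.
# Assumptions: the openai package is installed (pip install openai) and an
# API key is available in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "You are a high school mathematics teacher. Write 5 multiple-choice "
    "questions on the financial considerations of buying versus leasing a "
    "vehicle. For each question, give four options labelled a) to d), "
    "identify the correct answer, and add a one-sentence explanation."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice; any chat model works
    messages=[{"role": "user", "content": PROMPT}],
)

# The draft questions still have to be screened against my own knowledge
# base before they go anywhere near students.
print(response.choices[0].message.content)

The same pattern, with a different prompt, would also cover the discussion questions in the next item; either way, the output is a draft to be verified, not a finished product.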

  • In a similar vein, I used ChatGPT to create a series of short-answer and long-answer discussion questions on the pros and cons of buying vs. leasing. Here is an example of the output:

How do your personal financial goals and lifestyle preferences influence your decision to buy or lease a vehicle?

This is a decent question, but knowing my students, I would have to refine it somewhat.  It would need to be accompanied by an example to show what I was looking for, along with guidelines for the length of the response (or the number of points required).  The fact remains that this would be a useful jumping-off point.

  • Using A.I. to translate my lesson material into different languages would increase its accessibility, a capability I am not sure I could duplicate any other way. I am a bit leery of not being able to verify the output, though (harkening back to the issue of responsibility). A rough sketch of what this could look like follows this list.
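To sketch out the translation idea, the same scripted approach could be pointed at a block of lesson text. Again, this is only an illustration under my own assumptions (the model name, the excerpt, and French as the target language are all placeholders), and the output would still need to be checked by someone fluent in that language before I handed it to students.

# A minimal sketch of translating lesson material with an LLM.
# Same assumptions as the earlier sketch: openai installed, OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

# Placeholder excerpt from a lesson on leasing.
lesson_excerpt = (
    "When you lease a vehicle, you pay for the portion of its value that you "
    "use during the lease term, plus interest and fees."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[{
        "role": "user",
        "content": "Translate the following lesson text into French, keeping "
                   "the reading level appropriate for high school students:\n\n"
                   + lesson_excerpt,
    }],
)

# Without a fluent reader to verify this, I would hesitate to distribute it.
print(response.choices[0].message.content)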

Helping Students to Use A.I. Productively

Since A.I. tools are rapidly increasing in sophistication and proliferating throughout schools, successfully prohibiting their use feels to me like an act of futility.  As a general gauge, I find that once my mother (who is in her seventies) is freely using a technology, the genie is truly out of the bottle.  The other day she told me how she used ChatGPT to suggest a recipe based on what she had available in her cupboard, so needless to say, A.I. has hit the mainstream.

Like any tool, I think we need to discuss its inherent advantages and disadvantages with students.  Although this is not directly tied to the curricular outcomes of buying and leasing vehicles, I found it interesting that when I prompted Microsoft’s Copilot to create an image of “a young person buying their first vehicle at a dealership”, it gave me four very similar options (see an example below).

[A.I.-generated image: three people smile into the camera as they complete a car deal at a dealership. All three are Caucasian.]

Do you notice something about the ethnicity of the people in the A.I.-generated image?  Who seems to be missing (or not represented) in it?  What does this tell you about the data used to train the A.I. model that created it?

Another interesting point that Alec brought up was that the essay, which is easily created by A.I. tools, may no longer be the gold standard for demonstrating learning.  Instead, we might need to shift our efforts to understanding how to write effective prompts and how to screen the output for accuracy.  Communicating to students why we still need to read material to gain deep understanding will become a critical skill in a post-A.I. world.

But Cheating, Matt, WHAT ABOUT THE RAMPANT CHEATING!

As I noted earlier, winding back the hands of time is not a viable option.  Furthermore, as Alec noted in this video, A.I. detection tools are (at the time of this writing) ineffective at detecting plagiarism and disproportionately flag non-native English speakers in error.

I feel our best chance at mitigating cheating is to have open conversations with our students about the appropriate use of A.I., provide usable guidelines for citing it (and clear expectations for when it is not to be used), and build relationships with them so they respect the process of learning.  Some students will always choose to cheat.  Twenty years ago, I saw a student in a university chemistry final print a fake label on a cola bottle with chemical reactions replacing the ingredients list (he was caught).  In general, I believe a greater emphasis will need to be placed on process rather than product.  The pre-writing for a project will become as important as (if not more important than) the final paper.  It will also necessitate a shift away from generic essays toward writing that directly connects to one’s personal experiences.  For example, could A.I. have generated this blog post?  Would it be able to connect to Alec’s video or the projects I’ve created for this university course?  Maybe, but I doubt it.  As Alec put it, we need to have serious conversations about what is worth knowing.

Which brings me back to my course on buying and leasing vehicles.  Could students prompt A.I. to help them answer some of my questions?  Yes, but in the process of looking up this information, would they not be learning?  Isn’t the point to have them read and learn about buying and leasing?  As my course has an in-class component, calculation-based questions are completed with the instructor.  At some point, students will have to demonstrate their own understanding in a live environment.

One thought on “All about AI”

“I believe a greater emphasis will need to be placed on process rather than product.” That’s the point. If we keep chasing after what’s wrong, we’ll consistently miss the opportunities to learn through the available tools. In a way, we’d rather ban the use of AI than encourage its systematic application.

If educators think beyond the exams and grades, focusing on skills and their application to work, we will help learners embrace the power of AI without outsourcing their thinking to it.

A good reflection, Matt!
