AI and Human-Computer Interaction for Teachers

In teaching and learning about AI, balancing trust and over-trust.

GUEST COLUMN | by Michelle Zimmerman

It is okay to not know what to think about AI yet. If you are hesitant, you are not alone. But that pause in an unknown space should not keep you from being curious, or keep your curiosity from helping you find answers to your questions. This is one of the processes that make you effective as an educator: asking hard questions and pursuing knowledge until you find answers.

‘This is one of the processes that make you effective as an educator: asking hard questions and pursuing knowledge until you find answers.’

The Discomfort of the Unknown

We teach young people the importance of showing their work, asking questions, and collaborating. When we become more skilled in an area, it can be easy to forget the discomfort of the unknown. I do not want to forget what that feels like for my students. That means continuously being in spaces that evoke questions that may not have clear answers yet. The topic of Artificial Intelligence is a space where the definition will evolve over time as technology advances. Ethical considerations and bias will continue to raise questions faster than answers emerge.

Young people need you to model how to navigate a space like this, not because you are a computer science expert or have been coding since you were five, but because you can model how to ask hard questions that have no answers yet. You do not even have to know how to code to guide learning about AI. You can model how to transfer the knowledge you have from one domain to another, including how to approach learning things you do not know yet.

An Ongoing Stream of Questions and Healthy Skepticism

I want young people to see it is okay not to have all the answers. Not having all the answers means there is room for exploration, research, knowledge construction, collaboration, creative communication, and counterexamples.

Counterexamples are crucial to refining ideas while decreasing bias. It is easy to look for information that confirms what people already think or feel to be right or true. Seeking an example that runs counter to prior knowledge challenges that thinking and offers practice in flexibility: refining or redefining ideas, revising them, and even letting go of previously held incorrect assumptions.

These are skills the human mind has the capacity to practice and develop in ways machines are not currently good at. Not everyone has had the opportunity (or has given themselves the opportunity) to practice these skills.

Ask your learners. Ask the adults around you. What do they know? What perceptions do they have? Do their perceptions match yours? Could you define AI for someone if they asked you? These are good places to start.

Then, you can ask questions like:

  • Can a machine replace my job, or the jobs of my students in their futures?
  • What do I offer to society that is unique?
  • What am I spending my time on that can be replaced by a machine?
  • How do I know where my data is going and how it is used?
  • Are people at a disadvantage if they do not know a specific coding language?
  • Do different countries think about AI differently?
  • What goals do organizations have for AI, and are they different from the goals of end users?

AI can feel like an ambiguous, daunting topic with a lot of unknowns, something best saved for experts in computer science fields. Yet even the people I interviewed for Teaching AI: Exploring New Frontiers for Learning, people who have worked in the field of AI since the 1960s, had an ongoing stream of questions and healthy skepticism. The more you learn, the more you realize there is more to learn. Challenge yourself to make your own KWL chart: “what I (K)now, what I (W)ant to know, what I (L)earned.” This should not just be for children.

A Balance Between Trust and Over-Trust

The line between trusting and over-trusting technology can be a fine one.

Over-trusting AI means missing important discussions and learning opportunities, on topics ranging from bias and representation to accuracy in data sets, legal and ethical protocols, and the essential role of multiple perspectives and human curiosity.

Never trusting technology as a beneficial asset for education means missing the chance to prepare young people to be best positioned for a time and place that already uses AI.

I want young people to be prepared so they can be in the best position to contribute meaningful solutions to local and global challenges, now and in the future. It starts with your curiosity and a balanced approach to human-computer interaction.

Michelle Zimmerman is Director of Innovative Teaching and Learning Sciences at Renton Prep and author of Teaching AI: Exploring New Frontiers for Learning. She recently participated in a new traveling exhibit, Artificial Intelligence: Your Mind & The Machine, presented by the Museum of History & Industry in Seattle.
