Ethics and Bias

Main Idea

Students should understand the ethical implications and potential biases in AI systems. The scenario explores the societal impact of AI technologies and emphasizes the importance of ethical guidelines and responsible AI development practices to mitigate bias and ensure fairness. These issues can be explored hands-on through machine learning and facial recognition applications.


Creator: Lana Sattelmaier
Subject: Computer science
Length: 3 hours
Pedagogical Approach

Experiential learning
Participatory learning

Competences

C66 Ethical issues

C67 Limitations of AI

C68 Ethical considerations

C72 Algorithmic bias

C79 Personal information

Grades: 7–12
Technologies

Computers or tablets with internet access

Evaluation

Observe student engagement and proficiency in using the AI tools.


Learning Activities

Description: The students are asked whether they think that artificial intelligence (AI) can be biased and why. They then try out example web applications. This is followed by a discussion of AI bias in real-life scenarios, such as facial recognition software that discriminates against certain demographic groups. The teacher then explains how AI systems can inherit unintended biases from the data they were trained on; a small illustrative sketch follows below.
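One way to make this point concrete in class is a small, self-contained demonstration of how a model trained on historically skewed labels reproduces that skew. The sketch below is only an illustration, assuming Python with NumPy and scikit-learn is available on the students' computers; the group labels, thresholds, and numbers are invented for the example.

```python
# Toy demonstration: a model trained on historically unfair labels
# reproduces that unfairness. All numbers and group labels are invented.
# Assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)
n = 2000

# Two demographic groups with identical underlying "skill" distributions.
group = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# Historical labels were recorded unfairly: group 1 needed a higher
# skill score than group 0 to receive a positive outcome.
threshold = np.where(group == 1, 0.5, 0.0)
label = (skill > threshold).astype(int)

# The model is trained on these biased labels and learns the bias.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, label)
pred = model.predict(X)

for g in (0, 1):
    rate = pred[group == g].mean()
    print(f"Group {g}: predicted positive rate = {rate:.2f}")
# Despite identical skill distributions, group 1 receives noticeably
# fewer positive predictions, mirroring the bias in the training data.
```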

Students will complete a critical thinking exercise to identify potential biases in AI applications. They then discuss ethical considerations and the importance of responsible AI development and use. Finally, they discuss the impact of AI on privacy, security, and employment.

LA1: Contextualization (15 min)

The teacher introduces the purpose of the task:

  • Introduce the topic of ethics and bias in AI and its relevance.
  • Explain why this topic is important and how it fits into the larger context of the course.
  • Stimulate interest:
    • Short, engaging story or example that brings the topic to life (e.g., a well-known example of bias in an AI application).
  • Raise questions:
    • Ask initial questions that provoke thought and stimulate interest (e.g., “How would you feel if an AI treated you unfairly because of a bias?”).
    • Encourage students to ask their own questions about the topic.

LA2: Knowledge transfer (120 min)

  • Presentation of the content by the teacher in short intervals (see presentation)
  • Integration of small exercises and discussions in between:
    • Discussions in small groups or in plenary to reinforce what has been learned (e.g. discussions on ethical dilemmas).
    • (optional) Use of interactive methods such as quizzes, surveys or practical examples.

LA3: Reflection (30 min)

  • Discussion about what happened during the testing of the applications:
    • Students report on their experiences and challenges during the application tests.
    • Jointly analyze the test results and discuss possible improvements.
  • Discussion of what AIs can cause in real life and what ethical problems they entail:
    • Explanation of the practical implications of bias in AI applications (e.g. discrimination, injustice).
    • Discussion of ethical issues and how they can be addressed (e.g. transparency, fairness, accountability); a short sketch of one simple fairness measure follows this list.
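To anchor the fairness part of the discussion, a very small worked example of one possible fairness measure, the demographic parity difference, is sketched below. It assumes nothing beyond plain Python; the decisions and group labels are made up for illustration.

```python
# Minimal sketch of one fairness measure: demographic parity difference,
# i.e. the gap in positive-outcome rates between two groups.
# The decisions and group labels below are invented for illustration.
decisions = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]          # 1 = positive outcome (e.g. approved)
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

def positive_rate(decisions, groups, target):
    """Share of positive outcomes within one group."""
    outcomes = [d for d, g in zip(decisions, groups) if g == target]
    return sum(outcomes) / len(outcomes)

rate_a = positive_rate(decisions, groups, "A")
rate_b = positive_rate(decisions, groups, "B")
print(f"Group A positive rate: {rate_a:.2f}")                         # 0.60
print(f"Group B positive rate: {rate_b:.2f}")                         # 0.40
print(f"Demographic parity difference: {abs(rate_a - rate_b):.2f}")   # 0.20
```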

LA4: Outlook (10 min)

Draw a joint conclusion:

  • Summarize the most important points of the lesson.
  • Reflect on what has been learned and how it can be applied.

Materials

Presentation here