Wednesday, July 10: Day 7: Will AI Destroy the World?
Unit Description and Learning Goal:
Teacher to record a 2-5 minute introduction video and upload it in this section. Each unit must have:
- Pre-Learning Assessment (FOR Learning) for Unit 1 (could use Quiz/Assignment/Forum)
- Unit Evaluation(s) (OF Learning) for Unit 1 (could use Quiz/Assignment); for an assignment, be sure to include a rubric.
- Unit Learning Journal: Reflection (AS Learning) (could use Assignment)
Submission Format: PDF (include your name, date, and assignment title); double-spaced, Times New Roman, 12 pt font.
Decision theorist Eliezer Yudkowsky has a simple message: super-intelligent AI could probably kill us all. So the question becomes: Is it possible to build powerful artificial minds that are obedient, even benevolent? In a passionate talk, Yudkowsky explores why we need to act immediately to ensure smarter-than-human AI systems don't lead to our extinction.
This is the link to the video:
Working with a partner, watch the TED Talk and follow along with the transcript, which is downloadable here.
Then, summarize why Yudkowsky is so concerned about building superintelligent AI systems. What are some of his solutions to the potential problem(s)?
Later, you and your partner will share your thoughts with the rest of the class.
Your next assignment is to write a brief one-paragraph summary for each of five (5) movies and three (3) novels that depict AI technology threatening humanity.
In each paragraph, include the title, the type of AI technology that poses the threat, and how it affects people and society. If the humans survive, how do they cope and fight back?
In-class students make a written record, while online students submit a PDF document.
This is an in-class assignment; if more time is needed to complete it, it will be finished as homework.