Notes 8


8. Errors, Failures, and Risks

In this module, we will look at some of the uses of computers where they have very direct personal implications for our health and safety. Who is responsible for the use and abuse that can result from these relationships and interactions?

Computers themselves are not capable of original thought, but they can act as though they "think". Computers can "remember" vast amounts of information and apply it according to millions of rules defined by hundreds of programmers over decades. How does this all come together? What happens when there are conflicts within the rules or data and something goes wrong?

  • Podcast: Can 'friendly' AI save humans from irrelevance or extinction?
    - The podcast is an interview between Dan Farber, editor-in-chief of CNET News.com, and Eliezer Yudkowsky, co-founder of the Singularity Institute for Artificial Intelligence. Yudkowsky talks about the future of artificial intelligence and the tension between human and artificial intelligence development.


Objectives

In this module, students will

  • recognize the benefits and dangers associated with computers
  • select web sites that provide additional insight into the issues
  • examine issues raised in discussions
  • write thoughtful responses to questions asked


Study Guide: Errors, Failures, and Risks

These notes are a guide to reading and studying Chapter 8, Errors, Failures, and Risks, of the textbook.

As you complete the textbook reading for this lesson, here are some questions to get you thinking about the important concepts and information.

  • We are becoming more dependent on technology. What safeguards should there be to ensure that we are safe from this technology?
  • Who should be responsible in the case of technology doing damage or causing injury?
  • What computer errors are just annoying? What are some examples of serious computer errors?
  • What is the difference between a "design flaw" and a "bug"? Is one more serious than the other?
  • What legal remedies should be available in cases of computer hardware and software problems?
  • Are we too dependent on computers?
  • How reliable and accurate are computer models? Are there computer models that are "better" than real life testing?
  • Which people or organizations have helped make systems safer or reduced the negative consequences of errors?