31 October 2015

Australian Curriculum: 
Science as a Human Endeavour: 
Science and technology contribute to finding solutions to a range of contemporary issues; these solutions may impact on other areas of society and involve ethical considerations. 

STEM and Ethics

You may have seen the Volkswagen scandal in the news, or read about it, over the last few weeks.
http://www.9jumpin.com.au/show/60minutes/stories/2015/october/das-liars/
Image source: http://spectrum.ieee.org/cars-that-think/at-work/education/vw-scandal-shocking-but-not-surprising-ethicists-say
Volkswagen installed a software “defeat device” in 11 million Volkswagen and Audi diesel vehicles sold worldwide. An algorithm in the emissions-control module detected when a car was undergoing emissions testing: it ran the engine cleanly during tests and switched off full emissions control during normal driving. On the road, the cars produced up to 40 times the U.S. Environmental Protection Agency’s maximum allowed level of nitrogen oxides.
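To help students picture what a “defeat device” actually is, here is a deliberately simplified sketch, written in Python, of the kind of decision logic described above. It is not Volkswagen’s real code; the sensor readings, thresholds and function names are invented purely for illustration.

# Illustrative sketch only: NOT Volkswagen's actual software.
# The sensor cues and thresholds below are invented for teaching purposes.

def looks_like_emissions_test(speed_kmh, steering_angle_deg, wheels_driven):
    """Guess whether the car is running on a test dynamometer.

    On a lab test rig the driven wheels spin on rollers while the car is
    effectively stationary: the steering wheel barely moves and the other
    wheels do not turn. These are hypothetical examples of the kind of
    signals such software could check.
    """
    stationary_steering = abs(steering_angle_deg) < 1.0
    only_front_wheels_turning = wheels_driven == "front_only"
    return speed_kmh > 0 and stationary_steering and only_front_wheels_turning


def choose_emissions_mode(speed_kmh, steering_angle_deg, wheels_driven):
    """Return the emissions-control mode the engine software would select."""
    if looks_like_emissions_test(speed_kmh, steering_angle_deg, wheels_driven):
        return "full emissions control (clean test mode)"
    return "reduced emissions control (normal driving)"


if __name__ == "__main__":
    # On the test rig: wheels spinning, steering still, only the front axle driven.
    print(choose_emissions_mode(50, 0.2, "front_only"))
    # On the road: steering input and all wheels moving.
    print(choose_emissions_mode(50, 12.0, "all"))

A useful discussion point: at what stage in writing logic like this should an engineer have refused to continue, and who else in the company would have needed to know what the code was for?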
Who discovered this questionable practice?
What evidence was required to show that an artificial intelligence had been incorporated into the software controlling the diesel emissions? Why did Volkswagen accuse the US engineers of fabrication (making up data)? How did the engineers show falsification (distortion of the data)?
Australian Curriculum General Capability: Ethical Understanding
• Recognise and analyse behaviours that exemplify the dimensions and challenges of ethical concepts
• Reason about and analyse inconsistencies in personal reasoning and societal ethical decision making

• How does this scenario highlight ways that personal dispositions and actions had consequences for global citizens?
What Does “Responsible Innovation” Mean?
Does the Volkswagen scandal point to a need for a code of ethics for the creators of software applications?
Isaac Asimov, a scientist and science fiction author, first brought the issue of software ethics to public attention in a 1942 magazine story.
He developed the Three Laws of Robotics, a primitive code of behaviour for robots that is applicable to today's software as well:
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given by human beings, except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. (Asimov later added a Zeroth Law that takes precedence over the other three: A robot may not harm humanity, or, by inaction, allow humanity to come to harm. Volkswagen egregiously violated this law.)
Read more: http://www.theage.com.au/comment/volkswagen-scandal-software-developers-need-a-code-of-ethics-20151007-gk3w6m.html
What do you think about Isaac Asimov’s Three Laws of Robotics?
Should engineering graduates be taught engineering ethics so they have an understanding of professional and ethical responsibility?
How much control can a design engineer have over his or her product once it has reached the market?
Do you believe we have a compliance mindset when we purchase products internationally?
“Who is better equipped to understand the possible far reaching effects of these innovation processes than the engineers themselves?” What do you think about this statement?
