Abstract of Spring 2013 paper:
Is it safe to assume that the experts are correct and we have nothing to fear? The answer appears to depend on the complexity, design, and openness of the technology. It also seems to depend on whom you ask.
A doctor or a lawyer will most likely weigh professional experience over any symptom or piece of evidence brought before them. Yet these professions have become reliant on computational systems and the data they provide. This reliance has created a professional bias of trust without any qualified understanding of the underlying computer systems.
The pervasive nature of computational systems has created a comfortable culture that is unaware of their presence. It is difficult to find an entire day in which a person does not interact with some kind of microcontroller that senses and responds. Systems designed for specific uses can ultimately be placed in situations for which they were never intended. The complexity of these systems can also overshadow the risks facing the public.
Some technologies contain several layers of interconnected subsystems in which the compromise of any one part can result in a global failure. Without a segmented understanding, these faults may not be recognized until some detectable condition merits further investigation. Failures in critical systems can have far-reaching consequences. Given the nature of a fault, some outcomes could fall outside the realm of ethical comprehension.