Abstract

Accidents or malfunctions in automatic systems often raise the question of whether the system's designer could have foreseen the problem. In general, the opinions of experts are given more credence than those of non-experts, and if objective evidence shows that a malfunction could not have been foreseen even by experts, prediction is assumed to have been impossible. Experts can make sound judgments based on their knowledge of an automatic machine's coverage, whereas non-experts may underestimate that coverage and therefore handle the system cautiously. When a malfunction that no expert could foresee occurs in such a situation, and the outcome agrees by chance with a non-expert's forecast, engineers are questioned beyond reason about their "responsibility", a trend particularly marked in relation to computer systems. As described in this paper, the case in which an Okazaki City Library user was arrested is an appropriate case study for this problem. We discuss it as an internal security issue from the perspectives of automatic-machine design and engineering ethics.
