Opacity is an important information security property that characterizes whether the secret of a system can be inferred by an intruder who partially observes the system's behavior. In this paper, we address the enforcement of current-state opacity and initial-state opacity for discrete event systems modeled by automata. Specifically, given a partially observed discrete event system that is not current-state opaque (resp., initial-state opaque), an intruder observing the system's behavior may infer that the current state (resp., initial state) lies within the secret of the system. To prevent the exposure of secret information, we introduce a differential privacy mechanism that enforces opacity on a non-opaque system. In particular, the designed mechanism receives the observations generated by the system. Consider two observations, one of which can lead to the exposure of the secret while the other is generated at random. When either of the two observations is received by the mechanism, which provides differential privacy, the mechanism outputs a randomly modified observation to the intruder with approximately the same probability. In this way, the observation from which the secret could be inferred is confounded with one randomly generated by the system, so the intruder cannot deduce which observation was truly generated and received by the mechanism, even if the probabilistic generation rule of the mechanism is public.
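The indistinguishability property described above can be sketched with the standard $\varepsilon$-differential privacy inequality, adapted here to the two observations in question; the symbols $M$, $o$, $o'$, $S$, and $\varepsilon$ are illustrative notation rather than the paper's own:
\[
\Pr[M(o) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(o') \in S] \qquad \text{for every set } S \text{ of possible outputs},
\]
where $M$ denotes the randomized mechanism, $o$ is the secret-revealing observation, and $o'$ is the randomly generated one. A small privacy budget $\varepsilon$ means the output distributions induced by $o$ and $o'$ are close, so even an intruder who knows the probabilistic generation rule of $M$ cannot reliably tell which observation was actually received.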