Abstract

Security attacks present unique challenges to self-adaptive system design due to the adversarial nature of the environment. Game-theoretic approaches have been explored in security to model malicious behaviors and design reliable defenses for the system in a mathematically grounded manner. However, modeling the system as a single player, as done in prior works, is insufficient for a system under partial compromise and for the design of fine-grained defensive strategies, where the uncompromised, autonomous parts of the system can cooperate to mitigate the impact of attacks. To address these issues, we propose a new self-adaptive framework that incorporates Bayesian game theory and models the defender (i.e., the system) at the granularity of components. Under security attacks, the architecture model of the system is translated into a Bayesian multi-player game, where each component is explicitly modeled as an independent player while security attacks are encoded as variant types for the components. The optimal defensive strategy for the system is dynamically computed by solving for a pure-strategy equilibrium (i.e., the adaptation response) to achieve the best possible system utility, improving the resiliency of the system against security attacks. We illustrate our approach using an example involving load balancing and a case study on inter-domain routing.
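To make the translation concrete, the following is a minimal, hypothetical Python sketch of the core idea: each architectural component becomes a player, a possible compromise becomes an additional "compromised" type for that component, and the adaptation response is read off a pure-strategy Bayesian Nash equilibrium found by brute-force search. The component names, type priors, and payoffs are invented for illustration and do not reproduce the paper's actual model or implementation.

```python
from itertools import product

# Hypothetical toy game: a load balancer and a server, where the server may be
# compromised with some prior probability. All names and numbers are illustrative.
players = ["loadbalancer", "server"]
types = {"loadbalancer": ["normal"], "server": ["normal", "compromised"]}
prior = {"normal": 0.7, "compromised": 0.3}   # belief about the server's type
actions = {"loadbalancer": ["keep", "divert"], "server": ["serve", "attack"]}

def utility(player, server_type, acts):
    """Illustrative payoffs: diverting traffic only pays off if the server really
    is compromised; a compromised server gains by attacking traffic still routed to it."""
    lb, sv = acts["loadbalancer"], acts["server"]
    if player == "loadbalancer":
        if server_type == "compromised":
            return 1.0 if lb == "divert" else -1.0
        return 1.0 if lb == "keep" else -0.5
    if server_type == "compromised":
        return 1.0 if (sv == "attack" and lb == "keep") else 0.0
    return 1.0 if sv == "serve" else -1.0

def strategies(p):
    """A pure strategy maps each of a player's types to one action."""
    return [dict(zip(types[p], combo))
            for combo in product(actions[p], repeat=len(types[p]))]

def expected_utility(player, profile):
    """Expected utility over the server's possible types under the prior belief."""
    return sum(
        prior[t] * utility(player, t,
                           {p: profile[p][t] if p == "server" else profile[p]["normal"]
                            for p in players})
        for t in types["server"])

# Brute-force search for pure-strategy Bayesian Nash equilibria: keep every
# profile in which no player can improve its expected utility by unilaterally
# switching to another pure strategy.
equilibria = [
    profile
    for profile in (dict(zip(players, combo))
                    for combo in product(*(strategies(p) for p in players)))
    if all(expected_utility(p, profile) >= expected_utility(p, {**profile, p: alt})
           for p in players for alt in strategies(p))
]
print(equilibria)
```

In this toy instance, the equilibrium tells the load balancer which tactic maximizes expected system utility given its belief about whether the server is compromised, which is the role the adaptation response plays in the framework described above.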

Highlights

  • A self-adaptive system is designed to be capable of modifying its structure and behavior at run time in response to changes in its environment and the system itself [9,12]

  • We demonstrate how our approach can produce adaptation decisions under security attacks for the Znn website to enhance system utility

  • We adopt a greedy algorithm as the benchmark for this routing application and compare the system utility of the two approaches to demonstrate the advantage of the game-theoretic approach under security attacks


Summary

Introduction

A self-adaptive system is designed to be capable of modifying its structure and behavior at run time in response to changes in its environment and the system itself (e.g., variability in system performance, deployment cost, internal faults, and system availability) [9,12]. Various game-theoretic approaches have been explored in the security community for modeling interactions between the system and attackers as a game between a group of players (i.e., the system and multiple attackers, each as one player) and computing optimal strategies (i.e., a Nash equilibrium) for the system to minimize the impact of possible attacks and improve its resiliency against them [40,15,19,28]. These methods can be used to (1) model adversarial behaviors by malicious attackers [19], and (2) design reliable defenses for the system by using underlying incentive mechanisms to balance perceived risks in a mathematically grounded manner [15]. We advocate a security modeling approach in which an attack is modeled as the anomalous behavior of a system component that deviates from its expected behavior, as an alternative to the conventional approach in which attackers themselves are modeled as separate players. To this end, we propose a novel approach to improving the resiliency of self-adaptive systems against security attacks by leveraging game theory. Our contributions include a demonstration of the applicability of our approach through an example with load-balancing scenarios and a case study involving a network routing application with a proposed dynamic programming algorithm.
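As a rough illustration of the benchmark comparison used in the routing case study, the sketch below contrasts a greedy next-hop choice with a dynamic-programming computation of the best end-to-end path when some nodes carry a compromise penalty. The topology, utility values, and penalty are hypothetical, and this is not the paper's proposed algorithm or game formulation; it only illustrates why a locally greedy choice can miss the globally best route.

```python
import functools

# Hypothetical directed acyclic topology: node -> list of next hops.
topology = {
    "src": ["a", "b"],
    "a": ["c"],
    "b": ["c", "dst"],
    "c": ["dst"],
    "dst": [],
}

# Per-node utility of forwarding through that node; a node believed to be
# compromised contributes a large penalty (node "c" here).
node_utility = {"src": 0.0, "a": 3.0, "b": 1.0, "c": -5.0, "dst": 0.0}

def greedy_path(node, dst):
    """Greedy benchmark: repeatedly pick the locally best-looking next hop."""
    path = [node]
    while node != dst:
        node = max(topology[node], key=lambda n: node_utility[n])
        path.append(node)
    return path, sum(node_utility[n] for n in path)

@functools.lru_cache(maxsize=None)
def best_utility(node, dst="dst"):
    """Dynamic programming: best achievable utility (and path) from node to dst."""
    if node == dst:
        return node_utility[dst], (dst,)
    best = (max(best_utility(n, dst) for n in topology[node])
            if topology[node] else (float("-inf"), ()))
    return node_utility[node] + best[0], (node,) + best[1]

print(greedy_path("src", "dst"))   # greedy routes through the penalized node
print(best_utility("src"))         # DP finds the globally best path
```

On this toy topology the greedy choice commits to the high-utility neighbor "a" and ends up forced through the penalized node, while the dynamic program selects the path with the best total utility, which mirrors the kind of comparison reported in the evaluation.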

Running Example
Bayesian Game Theory
Self-Adaptive Framework Incorporating Bayesian Game Theory
Bayesian Game Through Model Transformation
Evaluation – Routing Games
Game Definition for Interdomain Routing
Dynamic Programming Algorithm
Experiment Setup & Results We demonstrate how our
Related Work
Conclusion and Future Work