Abstract

In recent years, a wide spectrum of database tuning systems has emerged to automatically optimize database performance. However, these systems require a significant number of workload runs to deliver a satisfactory level of database performance, which is time-consuming and resource-intensive. While many attempts have been made to address this issue by using advanced search optimizers, empirical studies have shown that no single optimizer can dominate the rest across tuning tasks with different characteristics. Choosing an inferior optimizer may significantly increase the tuning cost. Unfortunately, current practices typically adopt a single optimizer or follow simple heuristics without considering task characteristics. Consequently, they fail to choose the most suitable optimizer for a specific task. Furthermore, constructing a compact search space can significantly improve tuning efficiency. However, current practices neglect the setting of the value range for each knob and rely on a large number of workload runs to select important knobs, resulting in a considerable amount of unnecessary exploration in ineffective regions. To pursue efficient database tuning, in this paper we argue that it is imperative to have an approach that can judiciously determine a precise search space and a suitable search optimizer for an arbitrary tuning task. To this end, we propose OpAdviser, which exploits information learned from historical tuning tasks to guide search space construction and search optimizer selection. Our design can greatly accelerate the tuning process and further reduce the required workload runs. Given a tuning task, OpAdviser learns the geometries of the search space, including important knobs and their effective regions, from relevant previous tasks. It then constructs the target search space from these geometries according to task similarity computed on the fly, which allows for adaptive adjustment of the target space. OpAdviser also employs a pairwise ranking model to capture the relationship between task characteristics and optimizer rankings. This ranking model is invoked during tuning and predicts the best optimizer to use in the current iteration. We conduct extensive evaluations across a diverse set of workloads, where OpAdviser achieves 9.2% higher throughput and significantly reduces the number of workload runs, with an average speedup of ~3.4x compared to state-of-the-art tuning systems.
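To make the pairwise-ranking idea concrete, the sketch below shows one common way such a model can be realized: train a binary classifier on pairs of optimizers over historical tasks (label = whether optimizer i beat optimizer j on that task), then rank optimizers for a new task by counting predicted pairwise wins. The candidate optimizer pool, meta-features, and the gradient-boosting model here are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative sketch (not OpAdviser's actual code): a pairwise ranking model
# mapping task characteristics to an optimizer ranking.
from itertools import combinations

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

OPTIMIZERS = ["SMAC", "GA", "MBO", "DDPG"]  # hypothetical candidate pool


def build_pairwise_dataset(task_features, optimizer_scores):
    """task_features: (n_tasks, d) meta-features of historical tasks.
    optimizer_scores: (n_tasks, n_optimizers) performance of each optimizer."""
    X, y = [], []
    for f, scores in zip(task_features, optimizer_scores):
        for i, j in combinations(range(len(OPTIMIZERS)), 2):
            # Feature vector = task meta-features + signed encoding of the pair (i, j).
            pair_feat = np.zeros(len(OPTIMIZERS))
            pair_feat[i], pair_feat[j] = 1.0, -1.0
            X.append(np.concatenate([f, pair_feat]))
            y.append(int(scores[i] > scores[j]))  # 1 if optimizer i beats j on this task
    return np.asarray(X), np.asarray(y)


def rank_optimizers(model, task_feature):
    """Predict a ranking for a new task by counting pairwise wins."""
    wins = np.zeros(len(OPTIMIZERS))
    for i, j in combinations(range(len(OPTIMIZERS)), 2):
        pair_feat = np.zeros(len(OPTIMIZERS))
        pair_feat[i], pair_feat[j] = 1.0, -1.0
        x = np.concatenate([task_feature, pair_feat]).reshape(1, -1)
        if model.predict(x)[0] == 1:
            wins[i] += 1
        else:
            wins[j] += 1
    return [OPTIMIZERS[k] for k in np.argsort(-wins)]


# Toy usage with random data standing in for historical tuning tasks.
rng = np.random.default_rng(0)
feats = rng.random((50, 6))                    # 6 meta-features per historical task
scores = rng.random((50, len(OPTIMIZERS)))     # observed optimizer performance
X, y = build_pairwise_dataset(feats, scores)
model = GradientBoostingClassifier().fit(X, y)
print(rank_optimizers(model, rng.random(6)))   # e.g. ['MBO', 'GA', ...]
```

In a tuning loop, such a model would be queried each iteration with the current task's characteristics so the predicted best optimizer can propose the next configuration.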
