Title: Guiding Expert Database Tuning with Explainable AI
Author: Chai, Andrew Brian Frederick
Supervisor: Szlichta, Jarek
Type: Electronic Thesis or Dissertation
Dates: 2025-04-17; 2025-07-23
URI: https://hdl.handle.net/10315/43023
Subjects: Computer science; Machine learning; Explainable AI; Database; Data systems

Abstract: Modern database systems, such as IBM Db2, rely on cost-based optimizers to improve workload performance. However, their decision-making processes are difficult to interpret, and tuning them for specific workloads remains challenging due to their complexity, numerous configuration options, and interaction with unique workload characteristics. Additionally, database systems increasingly rely on black-box machine learning models within the optimizer and automatic tuning tools. These black-box models lack interpretability, hindering expert trust and debugging. We propose GEX, a system that provides interpretable insights into database optimizer behavior using explainable AI (XAI) techniques. We adapt XAI techniques for generating perturbation-based saliency maps from surrogate models to the domain of SQL queries. With GEX, we propose a framework in which saliency scores guide experts in system tuning tasks such as statistical view creation, configuration parameter adjustment, and query rewriting. We demonstrate the ability of GEX to capture and communicate optimizer behavior through experimental evaluation on these tasks using the TPC-DS benchmark and IBM Db2.

Rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.