Building temperature control poses significant modeling challenges owing to the complexity of internal systems and highly varied usage patterns; consequently, black-box modeling has become a prevalent approach in this domain. Black-box techniques broadly encompass deep learning and Bayesian optimization methods, each offering a distinct mechanism for deriving optimal control parameters: deep learning methods iteratively refine control parameters from extensive datasets, whereas Bayesian optimization employs Gaussian process regression and acquisition functions to pinpoint optimal solutions. This paper presents a comparative analysis of two innovative methods, one from each branch: Deep Reinforcement Learning (DRL) for integrated control of household temperature and electric vehicle charging, and Primal-Dual Contextual Bayesian Optimization (PDCBO) for adaptive thermal control in buildings. The study finds that the DRL approach, integrated with electric vehicle charging systems, excels in responsiveness and in reducing energy consumption. In contrast, the PDCBO method improves control by allowing constraints on energy usage and comfort to be defined explicitly and optimized continuously during operation, a capability built into its algorithmic design. Combining the strengths of both methods, namely electric vehicle integration and tailored adaptive control constraints, promises further improvements. This paper therefore proposes that dedicated control strategies be developed for different building types to address diverse real-world requirements effectively.
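As context for the Bayesian optimization branch discussed above, the following is a minimal sketch, not the paper's PDCBO method, of how Gaussian process regression and an acquisition function jointly select the next control setpoint to evaluate. The quadratic `energy_cost` objective, the kernel length scale, and the lower-confidence-bound acquisition are all illustrative assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy_cost(x):
    # Hypothetical stand-in for a building's measured energy cost at
    # temperature-setpoint fraction x (an assumption, not from the paper).
    return (x - 0.65) ** 2

def rbf(a, b, ell=0.2):
    # Squared-exponential covariance between two 1-D input arrays.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(x_obs, y_obs, x_query, jitter=1e-6):
    # Gaussian process regression: posterior mean and std at x_query.
    K = rbf(x_obs, x_obs) + jitter * np.eye(len(x_obs))
    k = rbf(x_obs, x_query)
    mu = k.T @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.einsum("ij,ij->j", k, np.linalg.solve(K, k))
    return mu, np.sqrt(np.maximum(var, 1e-12))

# Bayesian-optimization loop: fit the GP surrogate, minimize a
# lower-confidence-bound acquisition over a setpoint grid, query the
# true objective at the winner, and repeat.
x_obs = rng.uniform(0.0, 1.0, 3)
y_obs = energy_cost(x_obs)
grid = np.linspace(0.0, 1.0, 201)
for _ in range(20):
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    lcb = mu - 2.0 * sigma            # optimistic estimate of the cost
    x_next = grid[np.argmin(lcb)]     # most promising setpoint to try next
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, energy_cost(x_next))

best = x_obs[np.argmin(y_obs)]
print(f"best setpoint fraction found: {best:.3f}")
```

The acquisition function balances exploration (high posterior uncertainty) against exploitation (low predicted cost); PDCBO builds on the same surrogate-plus-acquisition machinery while additionally enforcing constraints on energy usage and comfort through a primal-dual scheme.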