
Data-Driven Policy Mapping for Safe RL-based Energy Management Systems

41 pages (main) + 5 pages (bibliography), 15 figures, 4 tables
Abstract

Increasing global energy demand and renewable integration complexity have placed buildings at the center of sustainable energy management. We present a three-step reinforcement learning (RL)-based Building Energy Management System (BEMS) that combines clustering, forecasting, and constrained policy learning to address scalability, adaptability, and safety challenges. First, we cluster non-shiftable load profiles to identify common consumption patterns, enabling policy generalization and transfer without retraining for each new building. Next, we integrate an LSTM-based forecasting module to anticipate future states, improving the RL agents' responsiveness to dynamic conditions. Lastly, domain-informed action masking ensures safe exploration and operation, preventing harmful decisions. Evaluated on real-world data, our approach reduces operating costs by up to 15% for certain building types, maintains stable environmental performance, and quickly classifies and optimizes new buildings with limited data. It also adapts to stochastic tariff changes without retraining. Overall, this framework delivers scalable, robust, and cost-effective building energy management.
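To illustrate the domain-informed action-masking idea, the following is a minimal sketch (not the paper's implementation): it assumes a hypothetical discrete battery (dis)charge action set, illustrative state-of-charge bounds, and a greedy Q-value selection; all names and parameters are placeholders.

```python
import numpy as np

# Hypothetical discrete action set: fraction of maximum (dis)charge rate
# applied this timestep (negative = discharge, positive = charge).
ACTIONS = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

def action_mask(soc, capacity=6.4, soc_min=0.1, soc_max=0.9, step_kwh=1.6):
    """Boolean mask of actions that keep the battery state of charge
    (soc, in kWh) within safe bounds. Capacity and bounds are illustrative."""
    next_soc = soc + ACTIONS * step_kwh          # predicted next state of charge
    return (next_soc >= soc_min * capacity) & (next_soc <= soc_max * capacity)

def masked_greedy_action(q_values, soc):
    """Pick the highest-Q action among those the mask allows."""
    mask = action_mask(soc)
    masked_q = np.where(mask, q_values, -np.inf)  # invalid actions can never win
    return int(np.argmax(masked_q))

# Example: with a nearly full battery, charging actions are masked out,
# so a discharge action is chosen even though charging has the highest raw Q.
q = np.array([0.2, 0.1, 0.0, 0.4, 0.9])
print(masked_greedy_action(q, soc=5.0))  # -> 0 (full discharge)
```

The same masking step can be applied during exploration (e.g., sampling only among valid actions), which is how such constraints prevent unsafe decisions from ever being executed.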

@article{zangato2025_2506.16352,
  title={Data-Driven Policy Mapping for Safe RL-based Energy Management Systems},
  author={Theo Zangato and Aomar Osmani and Pegah Alizadeh},
  journal={arXiv preprint arXiv:2506.16352},
  year={2025}
}