Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading

dc.authorscopusid: 58924652000
dc.authorscopusid: 55780618800
dc.contributor.author: Kıvanç, Ömer Cihan
dc.date.accessioned: 2024-05-25T11:28:11Z
dc.date.available: 2024-05-25T11:28:11Z
dc.date.issued: 2024
dc.department: Okan University
dc.department-temp: [Yavuz, Mete; Kivanc, Omer Cihan] Istanbul Okan Univ, Dept Elect & Elect Engn, TR-34959 Istanbul, Turkiye
dc.description.abstract: The concept of the prosumer has enabled consumers to participate actively in Peer-to-Peer (P2P) energy trading, particularly as Renewable Energy Sources (RESs) and Electric Vehicles (EVs) have become more accessible and cost-effective. In addition to P2P energy trading, prosumers benefit from the relatively high energy capacity of EVs through the integration of Vehicle-to-X (V2X) technologies, such as Vehicle-to-Home (V2H), Vehicle-to-Load (V2L), and Vehicle-to-Grid (V2G). Because of the complex pricing and energy exchange mechanisms of P2P energy trading and the presence of multiple EVs with V2X technologies, optimization of an Energy Management System (EMS) is required to allocate the required energy efficiently within the cluster. In this paper, a Deep Reinforcement Learning (DRL)-based EMS optimization method is proposed to optimize the pricing and energy exchange mechanisms of P2P energy trading without affecting the comfort of prosumers. The proposed EMS is applied to a small-scale cluster-based environment that includes six prosumers, P2P energy trading with novel hybrid pricing and energy exchange mechanisms, and V2X technologies (V2H, V2L, and V2G) to reduce overall energy costs and increase Self-Sufficiency Ratios (SSRs). A DRL algorithm based on multiple Double Deep Q-Network (DDQN) agents is implemented, and the environment is formulated as a Markov Decision Process (MDP) to optimize the decision-making process. Numerical results show that the proposed EMS reduces the overall energy costs by 19.18%, increases the SSRs by 9.39%, and achieves an overall SSR of 65.87%. Additionally, the numerical results indicate that model-free DRL methods, such as the DDQN-agent-based variant of the Deep Q-Network (DQN) Reinforcement Learning (RL) algorithm, promise to eliminate energy management complexities involving multiple uncertainties.
dc.identifier.citation: 0
dc.identifier.doi: 10.1109/ACCESS.2024.3370922
dc.identifier.endpage: 31575
dc.identifier.issn: 2169-3536
dc.identifier.scopus: 2-s2.0-85186983534
dc.identifier.scopusquality: Q1
dc.identifier.startpage: 31551
dc.identifier.uri: https://doi.org/10.1109/ACCESS.2024.3370922
dc.identifier.uri: https://hdl.handle.net/20.500.14517/1137
dc.identifier.volume: 12
dc.identifier.wos: WOS:001177024500001
dc.identifier.wosquality: Q2
dc.language.iso: en
dc.publisher: IEEE - Institute of Electrical and Electronics Engineers Inc.
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Costs
dc.subject: Optimization
dc.subject: Energy management
dc.subject: Vehicle-to-grid
dc.subject: Clustering algorithms
dc.subject: Heuristic algorithms
dc.subject: Vehicle-to-everything
dc.subject: Peer-to-peer computing
dc.subject: Energy exchange
dc.subject: Reinforcement learning
dc.subject: Deep reinforcement learning
dc.subject: Smart grids
dc.subject: Energy management system
dc.subject: peer-to-peer energy trading
dc.subject: vehicle-to-home
dc.subject: multi-agent reinforcement learning
dc.subject: deep reinforcement learning
dc.subject: smart grids
dc.title: Optimization of a Cluster-Based Energy Management System Using Deep Reinforcement Learning Without Affecting Prosumer Comfort: V2X Technologies and Peer-to-Peer Energy Trading
dc.type: Article
dspace.entity.type: Publication
relation.isAuthorOfPublication: a8a28b97-f9e7-4486-8767-ddba23bc6fee
relation.isAuthorOfPublication.latestForDiscovery: a8a28b97-f9e7-4486-8767-ddba23bc6fee