This study introduces a novel energy management model based on Deep Reinforcement Learning for IoT-based landslide early warning systems, aiming to achieve energy neutrality and to enhance system resilience, efficiency, and sustainability. Unlike traditional energy optimization methods, the proposed model employs a Deep Q-Network (DQN) to dynamically optimize the duty cycle of sensor nodes by leveraging real-time energy availability. By adaptively balancing energy harvesting and consumption, sensor nodes can maintain continuous operation even under highly variable environmental conditions, maximizing their performance during high-energy periods while preserving battery life when energy is limited. Extensive simulations using real-world solar radiation data demonstrate the model's superior capability in extending system longevity and operational stability compared to existing approaches. By addressing critical energy management challenges in landslide monitoring systems, this work enhances system reliability, scalability, and adaptability, offering a robust foundation for broader IoT applications deployed in energy-limited and dynamic environments. The proposed method represents a significant improvement over conventional techniques, as it autonomously optimizes energy resources to ensure the continuous and sustainable operation of IoT ecosystems.
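The core idea described in the abstract can be illustrated with a toy simulation. The sketch below is a deliberate simplification: it replaces the paper's DQN with tabular Q-learning over a discretized battery level, and the state bins, candidate duty cycles, consumption rate, harvesting model, and reward shaping are all hypothetical assumptions introduced for illustration, not the authors' design.

```python
import random

# Illustrative stand-in for the paper's DQN: tabular Q-learning over a
# discretized battery level. All constants below are assumptions.
ACTIONS = [0.1, 0.5, 0.9]        # candidate duty cycles (fraction of time awake)
N_STATES = 10                    # battery level discretized into 10 bins
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1


def step(battery, duty, harvest):
    """One control interval: consume energy proportional to the duty
    cycle, add harvested energy, clamp battery to [0, 1]."""
    consumed = 0.08 * duty
    battery = min(1.0, max(0.0, battery + harvest - consumed))
    # Reward sensing work, but penalize running the battery low
    # (a crude proxy for the energy-neutrality objective).
    reward = duty if battery > 0.2 else -1.0
    return battery, reward


def state_of(battery):
    return min(N_STATES - 1, int(battery * N_STATES))


random.seed(0)
Q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
battery = 0.5
for _ in range(5000):
    s = state_of(battery)
    # Epsilon-greedy action selection over duty cycles.
    if random.random() < EPS:
        a = random.randrange(len(ACTIONS))
    else:
        a = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
    harvest = random.uniform(0.0, 0.12)   # stand-in for variable solar input
    battery, r = step(battery, ACTIONS[a], harvest)
    s2 = state_of(battery)
    # Standard Q-learning update.
    Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])

# After training, inspect the duty cycle preferred at full battery.
best_high = ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[N_STATES - 1][i])]
```

In the full method the Q-table would be replaced by a neural network trained on real solar radiation traces, but the control loop (observe energy state, choose a duty cycle, receive a reward balancing sensing utility against depletion) has the same shape.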
Reinforcement learning, Internet of Things, energy management, optimization, landslide early warning systems
Primary Language | English |
---|---|
Subjects | Deep Learning, Neural Networks, Reinforcement Learning, Modeling and Simulation |
Section | Research Articles |
Authors | |
Publication Date | December 31, 2024 |
Submission Date | November 20, 2024 |
Acceptance Date | December 27, 2024 |
Published in Issue | Year 2024, Issue: 059 |