ENERGY-AWARE EVALUATION OF PERFORMANCE AND COST TRADE-OFFS BETWEEN FOG AND CLOUD DEPLOYMENTS FOR REAL-TIME IOT APPLICATIONS
DOI:
https://doi.org/10.25215/8194288797.46
Abstract
The integration of fog computing with cloud infrastructure has emerged as a promising paradigm for supporting real-time Internet of Things (IoT) applications. This study presents an energy-aware evaluation of the performance, cost, and network trade-offs between fog-based and cloud-based deployments in a Smart Parking system. Using iFogSim-based simulations, two deployment strategies were analyzed: (i) centralized cloud processing and (ii) distributed fog-layer processing. The results demonstrate that shifting computation from the cloud to fog nodes significantly reduces operational cost and overall latency while moderately increasing local energy consumption. Total system energy consumption remains relatively stable, because the energy redistributed across layers is offset by a corresponding reduction in cloud energy demand. Furthermore, network utilization increases under fog deployment, reflecting higher inter-node communication at the edge. The findings indicate that energy-aware fog computing can achieve an effective balance between responsiveness and energy efficiency for real-time IoT services such as Smart Parking. The energy analysis shows that total system energy is constrained by the high idle power consumption of the centralized Cloud resource, a constant overhead that dominates the total energy budget. A scaling analysis confirms that the Edge deployment maintains invariant latency and stable network utilization as the number of processing areas grows, demonstrating excellent scalability.
Published
2026-03-13
Issue
Section
Articles
