Management and control of energy usage and price using participatory sensing data
Date
2012
Abstract
A key change in the move to Smart Grids (SGs) is the use of dynamic pricing; this, together with less reliable energy from renewable resources, makes optimising electricity use highly complex. For smart devices to function in this environment, they must adapt to this complexity while maintaining the flexibility to handle changing patterns of user behaviour. Reinforcement Learning (RL) has been used to optimise the scheduling of dynamic resources in SGs. It is proposed to provide smart devices with knowledge of user intentions and actions by leveraging participatory sensing data. This, in consequence, allows devices in the SG to tailor their operational schedules to users' behaviour. Without this data, a device's operation would be interrupted by user activity, leading to suboptimal results. Participatory sensing provides both the monitoring of parameters affecting device operation (for example, temperature for a heating system) and access to detailed information about user behaviour and activity. The results obtained by our RL approach clearly indicate that participatory sensing data improve the performance of device scheduling when compared to static schemes, resulting in a dramatic price reduction.
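A minimal sketch of the kind of RL scheduling the abstract describes, assuming a tabular Q-learning formulation: the price profile, the occupancy signal (standing in for participatory sensing data), and the reward terms below are all invented for illustration and are not the paper's actual model.

```python
import random

HOURS = 24
ACTIONS = [0, 1]  # 0 = defer, 1 = run the device this hour

# Hypothetical dynamic price profile (cents/kWh) and occupancy signal
# (1 = user active, as might be inferred from participatory sensing).
price = [10 if 8 <= h < 20 else 4 for h in range((HOURS))]
occupied = [1 if 7 <= h < 23 else 0 for h in range(HOURS)]

Q = {(h, a): 0.0 for h in range(HOURS) for a in ACTIONS}
alpha, gamma, eps = 0.1, 0.9, 0.1

def reward(h, a):
    """Penalise cost and, heavily, interrupting the user (illustrative values)."""
    if a == 0:
        return -5.0          # cost of postponing the device's task
    r = -price[h]            # pay the dynamic price for this hour
    if occupied[h]:
        r -= 20.0            # large penalty: running would interrupt the user
    return r

random.seed(0)
for episode in range(5000):
    for h in range(HOURS):
        # Epsilon-greedy action selection over the two scheduling actions.
        a = random.choice(ACTIONS) if random.random() < eps else \
            max(ACTIONS, key=lambda x: Q[(h, x)])
        nxt = (h + 1) % HOURS
        best_next = max(Q[(nxt, x)] for x in ACTIONS)
        # Standard Q-learning update toward the one-step bootstrapped target.
        Q[(h, a)] += alpha * (reward(h, a) + gamma * best_next - Q[(h, a)])

# The learned policy concentrates run hours in cheap, unoccupied periods.
policy = [max(ACTIONS, key=lambda x: Q[(h, x)]) for h in range(HOURS)]
print(policy)
```

A static schedule that ignores the occupancy signal would either pay the peak price or the interruption penalty; here the learned policy defers during occupied and expensive hours, which is the qualitative effect the abstract attributes to combining RL with participatory sensing data.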
Description
peer-reviewed
Citation
3rd International Workshop on Agent Technologies for Energy Systems (ATES), at AAMAS 2012
Files
2012_Taylor.pdf
Adobe PDF, 265.79 KB
Funding Information
Science Foundation Ireland (SFI)
