Manually Running Scrape Util to Retrieve Past Egauge Data
After cloning scrape-util locally, open /tmp and create a new directory called projects. Copy into it any of the project folders found on the server; for the egauges in this case, I copied dhhl, marine-center-egauge, maui-egauge, and uhm-frog. Each of those project folders contains the necessary toml files.
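This setup step can also be scripted. The sketch below assumes the server's project folders have already been pulled down into a local directory named projects_from_server (a hypothetical name used only for illustration); the /tmp/projects path and the four project names come from this guide.

```python
import shutil
from pathlib import Path

# /tmp/projects is where the project folders are placed for this walkthrough.
projects = Path("/tmp/projects")
projects.mkdir(parents=True, exist_ok=True)

# Copy the egauge projects from a local download of the server's folders.
# "projects_from_server" is a hypothetical location; adjust it to wherever
# you copied the folders.
server_copy = Path("projects_from_server")
for name in ("dhhl", "marine-center-egauge", "maui-egauge", "uhm-frog"):
    shutil.copytree(server_copy / name, projects / name, dirs_exist_ok=True)
```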
To manually set the start and end times for the period you want scrape-util to pull, overwrite them directly in /src/acquire/egauge.py:
```python
# /src/acquire/egauge.py, lines 65-66 (default behavior)
start[gid] = nonce.get(gid,init)
stop[gid] = mkstop(start[gid])
```
For example, to cover 8/14/2017 through 8/15/2017 (as Unix timestamps):
```python
# /src/acquire/egauge.py, lines 65-66 (hard-coded range)
start[gid] = 1502668800
stop[gid] = 1502755200
```
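Those values are Unix epoch timestamps. If you need to compute them for a different window, something like the following works; this is a minimal sketch that assumes the timestamps are interpreted as UTC midnight, which matches the example values above. Adjust the timezone if your egauges report in local time.

```python
from datetime import datetime, timezone

# 8/14/2017 00:00 UTC -> 1502668800, 8/15/2017 00:00 UTC -> 1502755200
start = int(datetime(2017, 8, 14, tzinfo=timezone.utc).timestamp())
stop = int(datetime(2017, 8, 15, tzinfo=timezone.utc).timestamp())
print(start, stop)
```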
To query only specific egauges within a project such as dhhl, leave only the desired egauges in the config.toml file. For example, in /tmp/projects/dhhl/config.toml, the gauges table is set like this by default:
```toml
[acquire.egauge]
gauges = { 'e34111' = 34111, 'e34113' = 34113, 'e34104' = 34104, 'e34105' = 34105, 'e34106' = 34106, 'e34107' = 34107, 'e34108' = 34108, 'e34109' = 34109, 'e34110' = 34110, 'e790' = 790 }
```
That is, when scrape-util runs it will query every egauge listed under gauges. To query only egauge e34111, remove all the other entries:
```toml
[acquire.egauge]
gauges = { 'e34111' = 34111 }
```
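To double-check which egauges a run will actually query, you can load the config yourself. This is a sketch using the file path and table layout shown above; tomllib requires Python 3.11+, and older versions can use the third-party toml package instead.

```python
import tomllib  # Python 3.11+; use the `toml` package on older versions

# Path taken from the example above.
with open("/tmp/projects/dhhl/config.toml", "rb") as f:
    config = tomllib.load(f)

# With the trimmed config this prints {'e34111': 34111}.
print(config["acquire"]["egauge"]["gauges"])
```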
In combination with the hard-coded time range, this lets you query specific egauges over a desired timeframe; the output is saved as a CSV.