With istSOS you can configure the acquisition of new observations using a time-based scheduler.
The istSOS scheduler relies on the Advanced Python Scheduler library (APScheduler 2.1.2).
In the istSOS directory there is the scheduler.py script. When executed, it scans the services folder for files with the *.aps extension and, for each file found, schedules a time-based job according to your configuration.
To create a job, create a file (e.g. demo.aps) inside the folder of the istSOS instance for which you want the acquisition to be executed (e.g. services/demo/demo.aps).
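As a rough illustration of the scan step, the following stand-alone sketch shows how *.aps job files can be discovered one level below a services folder. The helper name and the throwaway directory layout are illustrative only; scheduler.py has its own implementation.

```python
import glob
import os
import tempfile

def find_aps_files(services_dir):
    """Collect all *.aps job files, one per service instance
    (e.g. services/demo/demo.aps)."""
    return sorted(glob.glob(os.path.join(services_dir, '*', '*.aps')))

# Build a throwaway layout mirroring services/demo/demo.aps
services = tempfile.mkdtemp()
os.makedirs(os.path.join(services, 'demo'))
open(os.path.join(services, 'demo', 'demo.aps'), 'w').close()

jobs = find_aps_files(services)
print(jobs)  # one entry, ending in demo/demo.aps
```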
Typically, a remote sensor sends its data to an FTP server, where the raw files are stored until they are loaded into istSOS. With the scheduler you can decide the acquisition frequency.
The next example is an .aps file that converts one proprietary CSV file (or more, if several are present in the folder) located in a predefined folder. (For more examples on how to implement a proprietary CSV file converter, see the Insertion of new observations page.)
To configure the acquisition interval between executions, check the APScheduler decorator syntax.
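For orientation, here is a sketch of the two decorators most commonly used in an .aps file, based on the APScheduler 2.x API (the job names are illustrative, and `sched` is not defined here: it is provided to the .aps file by scheduler.py, so this fragment is not runnable on its own):

```python
# Fragment of an .aps file: `sched` is provided by scheduler.py.

# Cron-style: run every hour on the hour
@sched.cron_schedule(minute=0)
def hourly_job():
    pass

# Interval-style: run every 30 seconds from a fixed start date
@sched.interval_schedule(seconds=30, start_date='2014-01-01 00:00')
def frequent_job():
    pass
```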
File location: /usr/local/istsos/services/demo/demo.aps
@sched.interval_schedule(minutes=10, start_date='2014-01-01 00:00')
def importMaggia():
    from scripts.converter import csv
    # Configure the converter
    conv = csv.CsvImporter('MAGGIA', {
            "headrows": 0,
            "separator": ",",
            "filenamedate": {
                "format": '%Y%m%d%H%M%S',
                "remove": ['maggia_', '.dat']
            },
            "datetime": {
                "column": 0,
                "format": '%Y-%m-%d %H:%M:%S',
                "tz": '+01:00'
            },
            "observations": [{
                "observedProperty": "urn:ogc:def:parameter:x-istsos:1.0:river:water:height",
                "column": 1
            }]
        },
        'http://localhost/istsos', 'demo',
        '/data/maggia', 'maggia_*.dat',
        debug=True,
        archivefolder='/data/archive/maggia'
    )
    # Convert the raw data to text/csv;subtype=istSOS
    if conv.execute():
        # Send the observations to istSOS
        conv.csv2istsos()
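To make the filenamedate section of the configuration concrete, here is a minimal stand-alone sketch of how a file name such as maggia_20140101120000.dat maps to a timestamp. The helper function is hypothetical; the real parsing happens inside CsvImporter.

```python
from datetime import datetime

def date_from_filename(name, fmt='%Y%m%d%H%M%S',
                       remove=('maggia_', '.dat')):
    # Strip the configured tokens, then parse what is left
    # with the strptime format from the configuration.
    for token in remove:
        name = name.replace(token, '')
    return datetime.strptime(name, fmt)

print(date_from_filename('maggia_20140101120000.dat'))
# -> 2014-01-01 12:00:00
```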
To run the scheduler:
cd /usr/local/istsos
python scheduler.py
Now the function will be executed every ten minutes, and the data will be converted using the Generic CSV converter.