ETL Jobs

An ETL job is a set of task definitions that the ETL engine follows to Extract, Transform, and Load data from one data source to another. A job contains an extract task that pulls the data from the source, a transform task that manipulates the data, and a load task that applies the data to the new location.
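The three tasks form a simple pipeline. The sketch below is purely illustrative; the class, field, and example values are assumptions and do not reflect the ETL engine's actual job format.

    from dataclasses import dataclass
    from typing import Any, Callable, List

    @dataclass
    class EtlJob:
        # Hypothetical structure for illustration only.
        extract: Callable[[], List[Any]]               # pull data from the source
        transform: Callable[[List[Any]], List[Any]]    # manipulate the extracted data
        load: Callable[[List[Any]], None]              # apply the data to the target

        def run_once(self) -> None:
            data = self.extract()
            data = self.transform(data)
            self.load(data)

    # Example: copy readings, converting units in the transform step.
    job = EtlJob(
        extract=lambda: [1000, 2000],                     # e.g. readings in W
        transform=lambda rows: [w / 1000 for w in rows],  # convert W to kW
        load=lambda rows: print("loaded:", rows),
    )
    job.run_once()   # prints: loaded: [1.0, 2.0]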

After the job is set up, you can configure it to run as a service. When running as a service, the job runs automatically at intervals based on the settings you enter on the Job tab. The Sleep Time Between Execution value (in seconds) defines how often the job runs; the default is 3600 seconds. This interval determines how soon new data appears in Dashboards and Reports. Use a smaller value if you want the job to run more frequently; for meter interval data, 900 seconds is usually fast enough for the updates. When increasing this value, consider the Building Operation TrendLog size: the job must run again before the oldest records are overwritten, otherwise TrendLog data is missed.
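As a rough sizing check, the sketch below compares the job's sleep time against how long a trend log can hold data before its circular buffer wraps. The record count and logging interval used here are assumed example values, not product defaults.

    # Illustrative sizing check (assumed example values, not product defaults):
    # a circular trend log holds record_count samples, one every log_interval_s seconds,
    # so it covers record_count * log_interval_s seconds before old records are overwritten.
    def trendlog_coverage_seconds(record_count: int, log_interval_s: int) -> int:
        return record_count * log_interval_s

    sleep_time_s = 3600                               # Sleep Time Between Execution (default)
    coverage_s = trendlog_coverage_seconds(500, 900)  # 500 records at 15-minute intervals = 450000 s

    # The job must run again well before the oldest unread records are overwritten.
    if sleep_time_s >= coverage_s:
        print("Increase the trend log size or reduce the sleep time")
    else:
        print(f"Trend log covers {coverage_s} s; job runs every {sleep_time_s} s")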

The ETL engine does not use any of the PME processes, so running a job has little impact on PME performance while data is loading. SQL Server supports multi-threading and can serve both ETL and PME requests at the same time. If SQL Server becomes overloaded, the ETL data buffer holds the extracted data until it can be loaded.