DPS - Data Processing Service platform

The ENEXSA Data Processing Service (DPS) platform is a comprehensive suite of functional modules that enable the configuration, testing and implementation of complex procedures for online data from a wide variety of sources.

When a detailed physics-based model or other digital twin is fed with online data, the calculation can only produce valid and correct results if the input data are COMPLETE, CORRECT and CONSISTENT.

The modular design and user-friendly graphical user interface of the DPS platform enable the user to configure complex workflows, including various pre- and post-processing steps, and to test and debug the entire workflow without any coding or in-depth IT knowledge.

Designed for real-life data quality

Through its modern web interface, the DPS lets the user build a sequence of process steps from individual 'unit operations' that are placed on the flowsheet and connected by simple mouse clicks.

The calculation flow can be triggered at user-defined points in time or time intervals, or event-based by 'watch-dog' criteria applied to one or several online signals. Various types of data sources can be linked to the process with modules based on published API information or standard communication protocols.
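To illustrate the idea of an event-based trigger, here is a minimal Python sketch of a 'watch-dog' criterion. All names and the specific rule (fire after a number of consecutive samples above a threshold) are invented for illustration; in the DPS such criteria are configured graphically, not coded.

```python
# Hypothetical sketch of a watch-dog criterion on one online signal.

def watchdog_triggered(signal_values, threshold, min_hits=3):
    """Fire the calculation flow once `min_hits` consecutive samples
    of the signal exceed `threshold`."""
    consecutive = 0
    for value in signal_values:
        if value > threshold:
            consecutive += 1
            if consecutive >= min_hits:
                return True
        else:
            consecutive = 0
    return False

# A short spike is ignored; a sustained excursion triggers the flow.
print(watchdog_triggered([1.0, 5.2, 1.1, 5.3, 5.4, 5.5], threshold=5.0))  # True
print(watchdog_triggered([1.0, 5.2, 1.1, 5.3, 1.2], threshold=5.0))       # False
```

Requiring several consecutive hits rather than a single sample keeps noisy signals from firing the workflow spuriously.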

Configurable modules for range and steady-state checking, together with logical operators, ensure that only valid inputs and reasonable replacement values are processed, or that the calculation is skipped if essential inputs are faulty or missing.
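The following Python sketch shows what such checks amount to. The function names, limits, and tolerance rule are hypothetical; in the DPS these checks are configured as modules on the flowsheet.

```python
# Hypothetical sketch of a range check with a replacement value,
# and a simple steady-state check over a time window.

def range_check(value, low, high, replacement=None):
    """Pass a value through a range check; fall back to a configured
    replacement. Returning None means an essential input is unusable
    and the calculation run should be skipped."""
    if value is not None and low <= value <= high:
        return value
    return replacement

def is_steady(window, tolerance):
    """Steady-state check: the signal's spread over a recent
    time window must stay within a configured tolerance band."""
    return (max(window) - min(window)) <= tolerance

# An implausible sensor reading is replaced by a reasonable default.
print(range_check(-999.0, low=0.0, high=500.0, replacement=25.0))  # 25.0
# A flat signal passes the steady-state check; a jumping one does not.
print(is_steady([80.1, 80.3, 80.2], tolerance=0.5))  # True
print(is_steady([10.0, 80.0], tolerance=0.5))        # False
```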

The actual calculation process begins with the mapping of the signals to the model parameters, which may also include side calculations defined in C# or Python, or variable type transformations. These parameters are then fed into a Calculation Job module that manages the execution of the simulation model.
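A minimal Python sketch of such a mapping step, with unit conversions as the side calculations. The signal names, parameter names, and mapping structure are invented for illustration and do not reflect the actual DPS configuration format.

```python
# Hypothetical mapping of online signals to simulation model parameters,
# with Python side calculations (here: unit conversions).

signals = {"T_in_degC": 185.0, "p_in_bar": 12.5}

mapping = {
    "inlet_temperature_K": lambda s: s["T_in_degC"] + 273.15,  # side calc
    "inlet_pressure_Pa":   lambda s: s["p_in_bar"] * 1e5,      # side calc
}

# The resulting parameter set would be handed to the Calculation Job module.
model_parameters = {param: fn(signals) for param, fn in mapping.items()}
print(model_parameters)
# {'inlet_temperature_K': 458.15, 'inlet_pressure_Pa': 1250000.0}
```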

Once the results of a job are returned to the processing scheme, the post-processing steps may involve further logical checks and customized calculations before they are mapped to output signals and written to one or several target data repositories.
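A post-processing step of this kind might look as follows in Python. The result keys, the derived efficiency signal, and the plausibility bounds are all hypothetical examples, not part of the DPS itself.

```python
# Hypothetical post-processing: derive an output signal from the job
# results and attach a logical plausibility check before writing it
# to the target data repository.

def postprocess(results):
    """Compute an efficiency output signal and flag implausible values."""
    eta = results["power_out_kW"] / results["heat_in_kW"]
    return {"efficiency": eta, "plausible": 0.0 < eta < 1.0}

output = postprocess({"power_out_kW": 38.0, "heat_in_kW": 100.0})
print(output)  # {'efficiency': 0.38, 'plausible': True}
```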

Join our live webinar titled "Avoiding Garbage In Garbage Out in Online Process Simulation", taking place on September 26th, 2024 at 3 p.m. GMT.


"Blaming customers for bad data quality is not a good strategy. Software systems must be capable of identi­fying faulty input data, and they must have proce­dures in place to generate reasonable replacement values to ensure that suffi­cient correct and useful results are produced."

Miguel Prokop, Manager Software Engineering


Want to get more information on this topic?

Explore our articles and other resources as a free download.

DPS - Data Processing Service

A user-friendly toolbox with functional modules to build complex processing for online data.