Data Reconciliation
Data reconciliation of indoor positioning data: Improving position data accuracy in a warehouse environment
This article focuses on improving indoor positioning data through data reconciliation. Indoor positioning systems are increasingly used for resource tracking to monitor manufacturing and warehouse processes, but measurement noise can degrade their performance. Redundant measurement, i.e., the use of multiple sensor tags that provide position data on the same resource, makes it possible to identify errors in the physical environment. When measurement data cover the entire physical environment, a map-based average measurement error can be determined, specifying at which points of the examined area the measurement data should be compensated and to what extent. This compensation is achieved through data reconciliation, which improves real-time position data by incorporating the position-dependent measurement error into the variance-covariance matrix. A case study in a warehouse environment demonstrates how discrepancies between the position data of two sensor tags mounted on forklifts can be used to identify layout-based errors. The algorithm is generally capable of handling the multi-sensor problem of indoor positioning systems.
Post Date: 2 July 2024
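As a minimal sketch of the reconciliation idea described above, the following snippet fuses the readings of two tags on the same forklift, weighting each tag by a position-dependent variance looked up from a hypothetical error map; the zones, variances and positions are illustrative assumptions, not values from the case study.

```python
import numpy as np

# Hypothetical map-based error model: average measurement variance (m^2)
# per zone of the examined area. Zones and values are illustrative only.
zone_variance = {"aisle": 0.04, "rack_edge": 0.25, "gate": 0.09}

def reconcile_position(p1, zone1, p2, zone2):
    """Variance-weighted fusion of two tag positions on the same forklift.

    The zone-dependent variances play the role of the diagonal elements of
    the variance-covariance matrix, so a tag in a less reliable zone
    contributes less to the reconciled position.
    """
    w1, w2 = 1.0 / zone_variance[zone1], 1.0 / zone_variance[zone2]
    return (w1 * np.asarray(p1) + w2 * np.asarray(p2)) / (w1 + w2)

tag_a = (12.3, 4.1)   # tag reading in an aisle zone
tag_b = (12.9, 4.6)   # tag reading near a rack edge, where errors are larger
print(reconcile_position(tag_a, "aisle", tag_b, "rack_edge"))
```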
Constrained Recursive Input Estimation of Blending and Mixing Systems
Blending and mixing processes are often supported by advanced process control systems to maximise margins from the available component and heat streams. Since these model-based solutions require accurate and reliable data, in weakly instrumented processes the unknown inlet concentrations and temperatures should be estimated from the measured outflows. This work presents a method for the reliable estimation of multiple input variables of process units. The key idea is to formulate the input estimation problem as a constrained recursive estimation task. The applicability of the method is illustrated on a benchmark model of a blending system, and its performance is compared to moving-window and Kalman-filter-based solutions. The results show the superior performance of the proposed method and confirm that a priori knowledge-based constraints improve the robustness of the estimates.
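The following is a minimal sketch of the idea, not the estimator of the paper: a scalar recursive (Kalman-type) update estimates one unknown inlet concentration of a toy blending model from noisy outlet measurements, and the a priori constraint that a mass fraction lies in [0, 1] is enforced by projection after each step. All flows, concentrations and noise levels are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative steady blending model: two inlet streams, one measured outlet.
F1, F2 = 8.0, 2.0            # known inlet flows
c1, c2_true = 0.30, 0.85     # c1 known, c2 unknown (to be estimated)

c2_hat, P = 0.5, 1.0         # initial estimate of c2 and its variance
r = 0.02 ** 2                # analyser noise variance (assumed)
h = F2 / (F1 + F2)           # sensitivity of the outlet concentration to c2

for _ in range(200):
    c_out = (F1 * c1 + F2 * c2_true) / (F1 + F2) + rng.normal(0.0, 0.02)
    k = P * h / (h * P * h + r)                   # recursive (Kalman-type) gain
    c2_hat += k * (c_out - (F1 * c1 + F2 * c2_hat) / (F1 + F2))
    P *= 1.0 - k * h
    c2_hat = min(max(c2_hat, 0.0), 1.0)           # constraint: mass fraction in [0, 1]

print(f"estimated inlet concentration: {c2_hat:.3f} (true: {c2_true})")
```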
Operating regime model based multi-objective sensor placement for data reconciliation
Although the number of sensors in chemical production plants is increasing thanks to the IoT revolution, it is still a crucial problem to decide what to measure and where to place the sensors so that the resulting sensor network is robust and provides the required information cost-effectively. This problem is especially relevant in flexible multi-purpose, multi-product production plants where there are significant differences among the operating regions. The present work aims at the development of a sensor placement methodology that utilizes the advantages of local linear models. Recognising the often conflicting nature of the key objectives of sensor placement, the problem is formulated as a multi-objective optimization task that takes into consideration the cost, estimation accuracy, observability and fault detection performance of the designed networks while simultaneously seeking the optimal solutions under multiple operating regimes. The effectiveness of the Non-dominated Sorting Genetic Algorithm-II (NSGA-II)-based solution of the defined problem is demonstrated through benchmark examples.
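NSGA-II itself is too long for a snippet, but its core, the selection of non-dominated designs under conflicting objectives, can be sketched on a toy exhaustive search; the sensor costs and the accuracy proxy below are purely illustrative assumptions.

```python
from itertools import combinations

# Toy design task: choose which of 5 flowmeters to install.
costs = [3.0, 2.0, 4.0, 1.0, 2.5]        # assumed sensor costs

def error_proxy(subset):
    """Crude stand-in for estimation accuracy: more sensors, better estimates."""
    return 1.0 / (1 + len(subset))

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

# Enumerate all designs with objectives (total cost, error proxy), both minimised.
designs = [((sum(costs[i] for i in s), error_proxy(s)), s)
           for k in range(1, 6) for s in combinations(range(5), k)]

pareto = [d for d in designs if not any(dominates(o[0], d[0]) for o in designs)]
for (cost, err), subset in sorted(pareto):
    print(f"sensors {subset}: cost = {cost:.1f}, error proxy = {err:.2f}")
```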
Role of Steady-State Data Reconciliation in Process Model Development
In the chemical and hydrocarbon industries, operational efficiency is improved by model-based solutions. Historical process data play an important role in the identification and verification of the models utilized by these tools. Since most of the information used consists of measured values, it is affected by errors that influence the quality of these models. Data reconciliation aims at reducing random errors to enhance the quality of the data used for model development, resulting in more reliable process simulators. This concept is applied to the development and validation of a complex process model and simulator of an industrial hydrogenation system. The results show the applicability of the proposed scheme in an industrial environment.
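For a linear steady-state system, this reconciliation reduces to the classical weighted least-squares projection onto the balance equations; below is a minimal sketch for a single splitter (F1 = F2 + F3) with assumed measurement variances.

```python
import numpy as np

# Mass balance of a splitter, A x = 0 with x = [F1, F2, F3]:
A = np.array([[1.0, -1.0, -1.0]])

x = np.array([101.8, 64.5, 39.1])       # raw measurements (inconsistent)
V = np.diag([4.0, 1.0, 1.0])            # assumed measurement variances

# Classical weighted least-squares reconciliation:
# x_hat = x - V A^T (A V A^T)^-1 (A x)
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x)
x_hat = x - correction

print("reconciled flows:", x_hat)        # now satisfies F1 = F2 + F3 exactly
print("balance residual:", A @ x_hat)    # ~0
```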
Synergy between DR and Principal Component Analysis (PCA)
In many practical applications, simulators used for planning, scheduling or operator training are too complex for direct use in real-time process monitoring; the structure of the related non-linear models does not support the low-cost and rapid implementation of process monitoring systems. We present a novel method that effectively utilizes these first-principles models for the development, maintenance and validation of multivariate statistical models. We demonstrate that the performance of Principal Component Analysis (PCA) models used for process monitoring can be improved by models used for data reconciliation. The synergy between data reconciliation and PCA has already been recognised (Amand et al. 2001): when reconciled data are used for PCA, the number of required principal components is reduced.
The aim of our research is to develop a method that can be applied in an industrial environment and combines the advantages of prior-knowledge-based models and data-driven multivariate statistical process monitoring tools.
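A small numerical illustration of the observation of Amand et al., using a toy splitter balance with simulated data (dimensions and noise levels are assumptions): reconciliation removes the noise component that violates the balance, so fewer principal components are needed to reach a given variance share.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, -1.0, -1.0]])                  # balance: F1 = F2 + F3

# Simulate balance-consistent data, then add measurement noise.
F2, F3 = rng.uniform(40, 60, 500), rng.uniform(20, 40, 500)
X_raw = np.column_stack([F2 + F3, F2, F3]) + rng.normal(0.0, 4.0, (500, 3))

# Reconcile every sample (unit variances assumed): project onto A x = 0.
P = np.eye(3) - A.T @ np.linalg.solve(A @ A.T, A)
X_rec = X_raw @ P.T

def n_components(X, share=0.95):
    """Number of principal components needed to explain the given variance share."""
    s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
    return int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), share) + 1)

print("PCs for 95% variance, raw data:       ", n_components(X_raw))
print("PCs for 95% variance, reconciled data:", n_components(X_rec))
```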
Data Reconciliation for iterative model development
Historical process data can be used to determine the unknown elements and parameters of first-principles models. Data reconciliation (DR) can improve the applicability of these data. We propose an iterative model building and data reconciliation procedure that continuously improves the quality of both the models and the data.
First, the raw data are reconciled based on a draft model of the process (mainly based on mass balances). The unknown parameters of the simulator are identified from these reconciled data. The reconciled data and the estimated parameters are then used as inputs of the process simulator, and the improved model is used again to reconcile the raw data. This procedure is repeated until there is no significant difference between the reconciled and the calculated process values.
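A schematic sketch of the loop on a toy reactor (A to B), where the only "simulator parameter" is the conversion re-identified from reconciled data in each pass; in this linear toy the procedure converges after two passes, while a nonlinear simulator would need more. Measurements, variances and the tolerance are assumed.

```python
import numpy as np

x_raw = np.array([100.0, 43.0, 52.0])   # measured F_A_in, F_A_out, F_B_out
V = np.eye(3)                           # assumed unit measurement variances

def reconcile(x, A):
    """Linear weighted least-squares DR: project x onto the constraints A x = 0."""
    return x - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x)

mass_balance = [1.0, -1.0, -1.0]        # draft model: F_A_in = F_A_out + F_B_out
A = np.array([mass_balance])
X_prev = None
for it in range(10):
    x_hat = reconcile(x_raw, A)
    X_new = x_hat[2] / x_hat[0]         # identify the conversion from reconciled data
    # Improved model: draft balance plus the identified reaction equation
    # F_B_out = X * F_A_in; it is used to reconcile the raw data again.
    A = np.array([mass_balance, [X_new, 0.0, -1.0]])
    if X_prev is not None and abs(X_new - X_prev) < 1e-9:
        break                           # reconciled and calculated values agree
    X_prev = X_new

print(f"passes: {it + 1}, conversion: {X_new:.3f}, reconciled flows: {x_hat.round(2)}")
```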
We used this method to monitor a time-varying process (catalyst deactivation).
Online Monitoring of Catalyst Deactivation Based on Data Reconciliation and Flowsheeting Simulator
Most chemical technologies (~80%) are based on heterogeneous catalytic reactions. Model-based tools used for the design, control and monitoring of these processes require accurate kinetic parameters of the catalytic reactions. Laboratory measurements and online analysers can be used to monitor decreasing catalyst activity; however, these measurements are affected by errors that influence the estimation of the kinetic parameters. To increase the robustness and accuracy of the estimation, we developed a method based on the integrated application of data reconciliation and flowsheeting simulation. The proposed technique is applied to an industrial hydrogenation system. The estimated reaction kinetic parameters can be utilized in the advanced process control of the process.
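As an illustration of the trend-estimation step only, the sketch below fits an exponential deactivation trend, k(t) = k0*exp(-lam*t), to hypothetical daily rate-constant values by log-linear least squares; the data and the deactivation model are assumptions, not the kinetics of the industrial system.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily rate-constant estimates reconciled from plant data;
# true trend k(t) = k0 * exp(-lam * t), all values illustrative.
t = np.arange(0.0, 120.0)                       # days on stream
k_obs = 0.9 * np.exp(-0.01 * t) * rng.lognormal(0.0, 0.05, t.size)

# Log-linear least squares: ln k = ln k0 - lam * t.
slope, intercept = np.polyfit(t, np.log(k_obs), 1)
k0, lam = np.exp(intercept), -slope
print(f"k0 = {k0:.3f}, deactivation rate = {lam:.4f} per day")
```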
Simultaneous Validation of Online Analyzers and Process Simulators by Process Data Reconciliation
We present a model-based algorithm for the simultaneous validation of online analysers and process simulators. Reconciled online and historical process data that satisfy the balance and model equations provide the opportunity to validate and improve process models and soft sensors. Accurate simulation results and laboratory measurements can be used for the validation of online analysers. Validated and reconciled data can be used for the iterative and interactive identification of the unknown parameters of the simulator, e.g. for the determination of kinetic parameters. The method can also be used for the monitoring and diagnostics of complex processes, because situations in which the operating conditions have changed significantly can be discovered. The approach is illustrated by the analysis of an industrial hydrogenation system. We present the proposed iterative and interactive procedure of model development and analyser validation, the applied data reconciliation method, and the details of the case study. The results show the applicability of the proposed scheme in an industrial environment and the benefits of the extracted information in the maintenance and monitoring of advanced model-based process engineering tools. The developed tool can increase operating efficiency, which is the key to reducing energy consumption and environmental impact. This is especially true in the hydrocarbon industry, where the operation of the technology is supported by process simulators and online-analyser-based advanced process control systems.
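A minimal sketch of the validation idea: the bias between a hypothetical online analyser and the reconciled/simulated reference is tracked with a rolling mean, and an alarm is raised when it leaves its statistical limit. Window length, noise levels and the injected drift are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical hourly data: reconciled/simulated reference vs. online analyser.
reference = 50.0 + rng.normal(0.0, 0.3, 500)
analyser = reference + rng.normal(0.0, 0.5, 500)
analyser[300:] += 0.004 * np.arange(200)       # injected slow analyser drift

bias = analyser - reference
window = 48                                    # rolling validation window (assumed)
rolling = np.convolve(bias, np.ones(window) / window, mode="valid")

limit = 4.0 * 0.5 / np.sqrt(window)            # conservative 4-sigma limit on the mean bias
alarms = np.flatnonzero(np.abs(rolling) > limit)
print("first drift alarm at sample:",
      alarms[0] + window - 1 if alarms.size else None)
```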
Information transfer between PCA and DR
The performance of model-based process monitoring systems highly depends on the quality of the model. Good PCA-based solutions require accurate and validated historical process data with high information content. However, measurements are always affected by errors, so pre-processing of the data is necessary to minimize random errors. The data reconciliation (DR) technique is a useful tool for this, because it uses the balance equations and physical-chemical laws, so the consistency of the data is ensured. We analysed the projection matrices of linear DR and found a strong relationship between DR and PCA. With our approach, not only can the projection matrix of PCA be calculated, but the similarity between the projection matrices of DR and PCA can also be measured. Furthermore, by using total least squares regression (based on principal components), the parameters of the balance equations can also be (re)calculated.
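The relationship can be sketched numerically: with unit variances, the projection matrix of linear DR is the orthogonal projector onto the null space of the balance matrix, and PCA of reconciled data recovers exactly the same subspace. The toy balance and noise level below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
A = np.array([[1.0, -1.0, -1.0]])                    # linear balance equation

# DR projection matrix (unit variances): P_dr = I - A^T (A A^T)^-1 A
P_dr = np.eye(3) - A.T @ np.linalg.solve(A @ A.T, A)

# Simulate noisy data and reconcile it.
F2, F3 = rng.uniform(40, 60, 1000), rng.uniform(20, 40, 1000)
X = np.column_stack([F2 + F3, F2, F3]) + rng.normal(0.0, 1.0, (1000, 3))
X_rec = X @ P_dr.T

# PCA of the reconciled data: loadings of the 2 leading components.
Xc = X_rec - X_rec.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:2].T                                         # 3x2 loading matrix
P_pca = W @ W.T                                      # projector onto the PCA subspace

# The two projectors coincide: the PCA subspace is the null space of A.
print("||P_dr - P_pca|| =", np.linalg.norm(P_dr - P_pca))
```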
Process Development Based on Model Mining and Data Reconciliation Techniques
During the last decade a major shift has begun in the process and chemical industry, driven by an urgent need for new tools that can support the optimization of process operations and the development of new production technologies. Approaches to this shift differ from company to company, but one common feature is the requirement of intensive communication between those who conduct research, design, manufacturing, marketing and management. Such communication should be centered on modeling and simulation, due to the need to integrate the whole product and process development chain at all time and scale levels of the company. To explore and transfer all the useful knowledge needed to operate and optimize technologies and business processes, we aimed at the development of a novel methodology for integrating heterogeneous information sources and heterogeneous models. The proposed methodology can be referred to as model mining, since it is based on the extraction and transformation of information not only from historical process data but also from different types of process models. The introduction of this novel concept required the development of new algorithms and tools for model analysis, reduction and information integration. In this paper we report our newest results related to the integration of multivariate statistical methods, data reconciliation and a priori models.
Life-cycle Modelling for Fault Detection - Extraction of PCA Models from Flowsheeting Simulators
The operation of chemical processes is often supported by flowsheeting simulators and process monitoring systems. In many practical applications, simulators used for planning, scheduling or operator training are too complex for direct use in real-time process monitoring; the structure of the related non-linear models does not support the low-cost and rapid implementation of process monitoring systems. In this paper we present a novel method that effectively utilizes these first-principles models for the development, maintenance and validation of multivariate statistical models. We demonstrate that the performance of Principal Component Analysis (PCA) models used for process monitoring can be improved by model-based data reconciliation. The applicability of the developed method is demonstrated on the Tennessee Eastman benchmark problem.
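A toy version of the idea (the Tennessee Eastman model is far beyond a snippet): training data are generated from an assumed first-principles "simulator", a PCA model is extracted from them, and new plant samples are monitored with the squared prediction error (Q) statistic. The model, dimensions and fault are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulator(n):
    """Toy first-principles 'flowsheet': 2 degrees of freedom drive 4 variables."""
    u = rng.uniform(0.0, 1.0, (n, 2))
    G = np.array([[1.0, 0.5], [0.2, 1.0], [1.2, 1.5], [0.7, 0.1]])
    return u @ G.T

# Build the PCA monitoring model from simulated (noise-free) data.
X = simulator(2000)
mu = X.mean(axis=0)
_, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
W = Vt[:2].T                                   # retain 2 PCs (the true model order)

def spe(x):
    """Squared prediction error (Q statistic) of a new plant sample."""
    r = (x - mu) - W @ (W.T @ (x - mu))
    return float(r @ r)

normal = simulator(1)[0] + rng.normal(0.0, 0.01, 4)
faulty = normal + np.array([0.0, 0.8, 0.0, 0.0])   # simulated sensor fault
print(f"SPE normal: {spe(normal):.4f}, SPE faulty: {spe(faulty):.4f}")
```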
Synergy between Data Reconciliation and Principal Component Analysis in Energy Monitoring
Monitoring of energy consumption is of central importance for the energy-efficient operation of chemical processes. Fault detection and process monitoring systems can reduce the environmental impact and enhance the safety and energy efficiency of chemical processes. These solutions are based on the analysis of process data. Data reconciliation is a model-based technique that checks the consistency of measurements against the balance equations. Principal component analysis is a similar multivariate model-based technique, but it utilises a data-driven statistical model. We investigate how information can be transferred between these models to obtain a more sensitive tool for energy monitoring. To illustrate the capability of the proposed method in energy monitoring, we provide a case study of heat balance analysis in the well-known Tennessee Eastman benchmark problem. The results demonstrate how balance equations can improve the energy management of complex process technologies.
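As a minimal sketch of how a balance equation can sharpen energy monitoring, the heat-balance residual of a hypothetical heat exchanger is monitored as an extra statistic next to the PCA scores; all physical properties, flows and the injected fault are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical heat exchanger: the heat released by the hot side,
# m_h*cp_h*(Th_in - Th_out), should equal the duty of the cold side.
cp_h, cp_c = 2.1, 4.18            # kJ/(kg K), assumed fluids
m_h, m_c = 5.0, 3.0               # kg/s, assumed flows
Th_in, Tc_in = 150.0, 20.0        # degC

n = 500
Th_out = 90.0 + rng.normal(0.0, 0.5, n)
Tc_out = Tc_in + m_h * cp_h * (Th_in - Th_out) / (m_c * cp_c) + rng.normal(0.0, 0.5, n)
Tc_out[350:] -= 4.0               # injected fault, e.g. a developing leak

# Heat-balance residual used as an extra monitored statistic next to the PCA model.
residual = m_h * cp_h * (Th_in - Th_out) - m_c * cp_c * (Tc_out - Tc_in)
mu, sigma = residual[:300].mean(), residual[:300].std()   # limits from healthy data
alarms = np.flatnonzero(np.abs(residual - mu) > 4 * sigma)
print("first energy-balance alarm at sample:", alarms[0] if alarms.size else None)
```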