News
Model-Centric Integration of Uncertain Expert Knowledge into Importance Sampling-Based Parameter Estimation
This study presents a model-based parameter estimation method for integrating and validating uncertain expert knowledge in simulation models. The parameters of the models of complex systems are often unknown due to a lack of measurement data. The experience-based knowledge of experts can substitute for the missing information, although such knowledge is usually imprecise. The novelty of the present paper is a method based on Monte Carlo (MC) simulation and importance sampling (IS) techniques for integrating uncertain expert knowledge into the system model. Uncertain knowledge about the model parameters is propagated through the system model by MC simulation in the form of a discrete sample, while IS helps to weight the sample elements with respect to imprecise knowledge about the outputs in an iterative cycle. Thereby, the consistency of expert judgments can be investigated as well. The contributions of this paper include an expert knowledge-based parameter estimation technique and a method for the evaluation of expert judgments according to the estimation results to eliminate incorrect ones. The applicability of the proposed method is introduced through a case study of an operating Hungarian waste separation system. The results verify that the assessments of experts can be efficiently integrated into system models and that their consistency can be evaluated.
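The weighting step can be illustrated with a short sketch. The toy model, the prior, and the expert's output distribution below are all hypothetical stand-ins, chosen only to show how a Monte Carlo sample is reweighted by importance sampling against an expert judgment about the output:

```python
import math
import random

random.seed(0)

def model(theta):
    # Toy stand-in for the simulation model: output depends on the unknown parameter.
    return 2.0 * theta

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Monte Carlo sample drawn from the expert's (imprecise) prior on the parameter.
prior_sample = [random.gauss(5.0, 2.0) for _ in range(5000)]

# Importance sampling step: weight each propagated sample element by how well
# the simulated output matches the expert's judgment about the output.
mu_out, sigma_out = 12.0, 1.0
weights = [gaussian_pdf(model(t), mu_out, sigma_out) for t in prior_sample]
total = sum(weights)
weights = [w / total for w in weights]

# Weighted estimate of the parameter, consistent with both pieces of knowledge.
theta_hat = sum(w * t for w, t in zip(weights, prior_sample))

# Effective sample size: a very low value flags inconsistent expert judgments.
ess = 1.0 / sum(w * w for w in weights)
```

A collapsing effective sample size would signal that the expert's prior on the parameter and the expert's judgment about the output contradict each other, which is the basis for the consistency check described above.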
Post Date: 22 October 2024
The Use of eXplainable Artificial Intelligence and Machine Learning Operation Principles to Support the Continuous Development of Machine Learning-Based Solutions in Fault Detection and Identification
Machine learning (ML) revolutionized traditional machine fault detection and identification (FDI), as complex-structured models with well-designed unsupervised learning strategies can detect abnormal patterns from abundant data, which significantly reduces the total cost of ownership. However, their opaqueness has raised human concern and motivated the eXplainable artificial intelligence (XAI) concept. Furthermore, the development of ML-based FDI models can be improved fundamentally with machine learning operations (MLOps) guidelines, enhancing reproducibility and operational quality. This study proposes a framework for the continuous development of ML-based FDI solutions, which contains a general structure to simultaneously visualize and check the performance of the ML model while directing the resource-efficient development process. A use case is conducted on sensor data of a hydraulic system with a simple long short-term memory (LSTM) network. The proposed XAI principles and tools supported the model engineering and monitoring, while additional system optimization can be made regarding input data preparation, feature selection, and model usage. The suggested MLOps principles help developers create a minimum viable solution and involve it in a continuous improvement loop. The promising results motivate further adoption of XAI and MLOps while endorsing the generalization of modern ML-based FDI applications with the human-in-the-loop (HITL) concept.
Post Date: 02 October 2024
Éva Kenyeres won the best student poster presentation award at the 27th CHISA conference in Prague
Congratulations to Éva Kenyeres, who participated in the 27th International Congress of Chemical and Process Engineering in Prague and won the best student poster presentation award with an interactive poster presentation on the topic "The Particle Filter is a Multifunctional Tool for Robust State Estimation and Optimization".
More information about the conference is available here.
Post Date: 05 August 2024
Time-dependent frequent sequence mining-based survival analysis
Frequent sequence mining is a valuable technique for identifying patterns and co-occurrences in event sequences. However, traditional approaches often neglect the temporal information associated with events, limiting their ability to capture the dynamics of event sequences. In this study, we propose a methodology that integrates frequent sequence mining with survival analysis to address this limitation. Frequent sequence mining captures the order and frequency of occurrence of typical events, while association rules highlight the relevant ones. In addition, survival analysis provides comprehensive temporal information between them. The approach also handles competing risks simultaneously, ensuring unbiased results. The output of the method is sequences of distribution functions of the elapsed time between the frequent and relevant events, which describe the time-varying confidence of the frequent sequences. The method also presents how time-varying confidence functions can be enhanced by explanatory variables and how their confidence interval can be determined using the bootstrapping method. The applicability of the approach is demonstrated using clinical data, specifically focusing on disease sequences.
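The notion of time-varying confidence can be sketched on a toy event log (the cases, events, and timestamps below are hypothetical): the confidence of a frequent sequence A → B at time t is the share of A-occurrences that are followed by B within t, which the empirical distribution of the elapsed times provides directly:

```python
from bisect import bisect_right

# Toy event log (hypothetical): per case, a list of (timestamp, event) pairs.
cases = {
    1: [(0, "A"), (3, "B")],
    2: [(0, "A"), (8, "B")],
    3: [(0, "A")],             # A occurs but B never follows (censored case)
    4: [(1, "A"), (2, "B")],
}

# Collect elapsed times between the antecedent A and the first subsequent B.
gaps = []
n_antecedent = 0
for events in cases.values():
    t_a = next((t for t, e in events if e == "A"), None)
    if t_a is None:
        continue
    n_antecedent += 1
    t_b = next((t for t, e in events if e == "B" and t > t_a), None)
    if t_b is not None:
        gaps.append(t_b - t_a)
gaps.sort()

def confidence_at(t):
    """Time-varying confidence of A -> B: share of A-cases where B arrived within t."""
    return bisect_right(gaps, t) / n_antecedent
```

Here the confidence grows from 0.25 at t = 1 to 0.75 for large t; the censored case 3 keeps it below 1, which is where proper survival analysis with competing risks refines this naive empirical estimate.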
Post Date: 19 July 2024
Multilayer Network-Based Evaluation of the Efficiency and Resilience of Network Flows
Supply chain optimization and resource allocation are challenging because of the complex dynamics of flows. We can classify these flows based on whether they perform value-added or nonvalue-added activities in our process. The aim of this article is to present a multilayered temporal network-based model for the analysis of network flows in supply chain optimization and resource allocation. Implementation of a multilayered network distinguishes value-added from nonvalue-added resource flows, enabling a comprehensive view of the flow of resources in the system. Incorporating weighted edges representing the probabilities of time-dependent flows identifies the resource needs and excesses at each supply site, addresses optimal transportation challenges for resource reallocation, and assesses the efficiency and robustness of the system by examining the overlaps in network layers. The proposed method offers a significant extension to the toolsets for network flow analysis, which has the potential to improve decision-making processes for organizations dealing with complex resource management problems. The applicability of the proposed method is demonstrated by analyzing the temporal network extracted from taxi cab flows in New York City. With the application of the method, the results indicate that significant reductions in idle times are achievable.
Post Date: 18 July 2024
Operationalization Management: Enhancing Life Cycle Management of Digital Twins
The recent progress in the development of Information Technology (IT) gave rise to a new wave of industrial transformation marked by cloud computing, the Industrial Internet of Things (IIoT), Big Data analytics, Industry 4.0 principles, and autonomous systems. Digital Twins are at the core of this revolution, bridging the physical world with its digital representation to optimize Cyber-Physical Production Systems (CPPS) in order to create more value. However, validating even the seemingly obvious theoretical advantages is challenging in a pilot project, let alone in a full production-scale Digital Twin. A further challenge is the need for model life-cycle management to preserve the benefits captured by the new Digital Twin-based technologies. This paper introduces a novel methodology, inspired by Operations-based frameworks and Model Engineering, that addresses these bottlenecks. It offers a unique solution for managing the monitoring and maintenance of simulation models in Digital Twin applications. The paper demonstrates the benefit of a surrogate-based automated flowsheet model fitting solution on a simplified refinery case study: it reduces the use of expensive simulations for model fitting and shortens the time required compared with direct simulation-based fitting, without losing accuracy.
Post Date: 18 July 2024
Machine Learning Classifier-Based Metrics Can Evaluate the Efficiency of Separation Systems
This paper highlights that metrics from the machine learning field (e.g., entropy and information gain) used to qualify a classifier model can be used to evaluate the effectiveness of separation systems. To evaluate the efficiency of separation systems and their operation units, entropy- and information gain-based metrics were developed. The receiver operating characteristic (ROC) curve is used to determine the optimal cut point in a separation system. The proposed metrics are verified by simulation experiments conducted on the stochastic model of a waste-sorting system.
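The analogy to classifier metrics can be sketched as follows, with a hypothetical feed composition and stream split: the output streams of a separator are treated exactly like the branches of a decision-tree split, so the weighted entropy reduction is the information gain of the separation step:

```python
import math

def entropy(fractions):
    """Shannon entropy (bits) of a material-composition distribution."""
    return -sum(p * math.log2(p) for p in fractions if p > 0)

def fractions(stream):
    total = sum(stream.values())
    return [v / total for v in stream.values()], total

# Hypothetical feed: 60 kg plastic + 40 kg metal, split into two output streams.
feed = {"plastic": 60.0, "metal": 40.0}
streams = [
    {"plastic": 54.0, "metal": 4.0},   # mostly-plastic stream
    {"plastic": 6.0, "metal": 36.0},   # mostly-metal stream
]

feed_fracs, feed_total = fractions(feed)
h_feed = entropy(feed_fracs)

# Weighted entropy of the output streams, exactly as for a classifier split.
h_out = 0.0
for s in streams:
    fr, total = fractions(s)
    h_out += (total / feed_total) * entropy(fr)

# Information gain: how much the separation step reduced mixing uncertainty.
info_gain = h_feed - h_out
```

A perfect separator would drive the output entropy to zero, so the information gain equals the feed entropy; an idle unit yields zero gain, which makes the metric a natural efficiency score for each operation unit.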
Post Date: 03 July 2024
Data reconciliation of indoor positioning data: Improve position data accuracy in warehouse environment
This article focuses on improving indoor positioning data through data reconciliation. Indoor positioning systems are increasingly used for resource tracking to monitor manufacturing and warehouse processes. However, measurement errors due to noise can negatively impact system performance. Redundant measurement involves the use of multiple sensor tags that provide position data on the same resource, to identify errors in the physical environment. If we have measurement data from the entire physical environment, a map-based average measurement error can be determined by specifying the points in the examined area where measurement data should be compensated and to what extent. This compensation is achieved through data reconciliation, which improves real-time position data by considering the measurement error in the actual position as an element of the variance-covariance matrix. A case study in a warehouse environment is presented to demonstrate how discrepancies in position data from two sensor tags on forklifts can be used to identify layout-based errors. The algorithm is generally capable of handling the multi-sensor problem in the case of indoor positioning systems. The key points are as follows:
• Layout-based error detection is performed using the measurement error of the indoor positioning system.
• This article shows how redundant measurements and data reconciliation can improve the accuracy of such systems.
• The accuracy of position data is improved with the layout-based error map using a data reconciliation algorithm.
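As a minimal sketch of the reconciliation idea (all coordinates and variances below are hypothetical), redundant tag measurements on the same forklift can be fused by weighting each with the inverse of its map-based variance, i.e. the corresponding diagonal element of the variance-covariance matrix:

```python
# Two tags on the same forklift report positions; each zone of the layout-based
# error map carries its own measurement variance (all numbers hypothetical).
tag_positions = [(10.2, 4.9), (9.6, 5.3)]
tag_variances = [0.04, 0.16]  # from the layout-based error map at each tag's zone

# Data reconciliation for redundant sensors: weight each measurement by the
# inverse of its variance (diagonal of the variance-covariance matrix).
weights = [1.0 / v for v in tag_variances]
wsum = sum(weights)
x_rec = sum(w * p[0] for w, p in zip(weights, tag_positions)) / wsum
y_rec = sum(w * p[1] for w, p in zip(weights, tag_positions)) / wsum

# The reconciled variance is smaller than either individual measurement's.
var_rec = 1.0 / wsum
```

The reconciled position is pulled toward the more trustworthy tag, and the reconciled variance is strictly below both input variances, which is the accuracy improvement the article quantifies.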
Post Date: 02 July 2024
Goal-oriented clustering algorithm to monitor the efficiency of logistic processes through real-time locating systems
Modern internal logistic systems face several challenges, from supply chain disruption to mass customization of marketed products. In such a highly dynamic scenario, Internet of Things technologies provide a reliable path to digitizing low-standardized systems and quantitatively monitoring their functioning. In addition, acquired measurements are often combined with machine learning methods to achieve improved data analytics. For this purpose, this work presents a digital architecture to detect logistic activities during order management. While an ultra-wideband-based real-time locating system acquires the positioning information of forklifts, a goal-oriented clustering algorithm called Industrial DBSCAN classifies process-driven operations during the shift. These insights represent valuable information for constantly evaluating the operational efficiency of logistic systems. The robustness and validity of the Industrial DBSCAN are tested from different perspectives. On the one hand, a quantitative benchmark with traditional clustering methods is performed, in which the proposed algorithm proves to be the most effective approach for detecting uptime forklift operations. On the other hand, a warehousing system proves the operational functioning of the algorithm. In this regard, a Tracking Management System interface is developed to achieve user-friendly process monitoring, where plant supervisors can analyze several internal logistic key performance indicators.
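The clustering step can be illustrated with a plain DBSCAN sketch. Note that this is the generic density-based algorithm, not the paper's goal-oriented Industrial DBSCAN variant, and the positions and thresholds are hypothetical:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each 2-D point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)
    dist2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    neighbors = lambda i: [j for j in range(len(points))
                           if dist2(points[i], points[j]) <= eps ** 2]
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1          # noise, e.g. a forklift ping while in transit
            continue
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:
                queue.extend(nbrs)   # core point: keep expanding the cluster
        cluster += 1
    return labels

# Hypothetical forklift positions: two dense activity zones plus one stray ping.
positions = [(0, 0), (0.2, 0.1), (0.1, 0.3), (5, 5), (5.1, 4.9), (4.9, 5.2), (10, 10)]
labels = dbscan(positions, eps=0.5, min_pts=3)
```

Dense groups of position fixes become activity zones while isolated fixes are marked as noise; the goal-oriented variant adds process-driven knowledge on top of this basic density logic.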
Post Date: 30 June 2024
Technology-enabled cognitive resilience: what can we learn from military operation to develop Operator 5.0 solutions?
The Operator 5.0 concept calls for the self-resilience of operators in Industry 5.0, including the cognitive aspect. Despite attempts to develop supporting technologies, the achieved results are loosely connected, lacking a comprehensive approach. In search of novel inspiration, this study turns to a chaotic environment where cognitive resilience is a firm standard: military operations. A systematic literature review in Scopus investigated how technology-enabled cognitive resilience is achieved in this context. The extracted details show vast technology support from field operations to the control space, against the individual or combined effect of stressors from the work environment, context, content, or the users themselves. These technologies exert indirect and direct influence on physical, mental, and cognitive aspects, creating a cognitive resilience effect. The concept of human-machine symbiosis is proposed, with a framework spanning technology development to resilience training, to inspire developers to define a broader scope and engineers to facilitate comprehensive adaptation of Operator 5.0 solutions.
Post Date: 27 June 2024
Finding multifaceted communities in multiplex networks
Identifying communities in multilayer networks is crucial for understanding the structural dynamics of complex systems. Traditional community detection algorithms often overlook the presence of overlapping edges within communities, despite the potential significance of such relationships. In this work, we introduce a novel modularity measure designed to uncover communities where nodes share specific multiple facets of connectivity. Our approach leverages a null network that is an empirical layer of the multiplex network rather than a random network; depending on the objective, it can be one of the network layers or the complement graph of one of them. By analyzing real-world social networks, we validate the effectiveness of our method in identifying meaningful communities with overlapping edges. The proposed approach offers valuable insights into the structural dynamics of multiplex systems, shedding light on nodes that share similar multifaceted connections.
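The idea of replacing the random null model with an empirical layer can be sketched as follows. The exact normalization of the proposed measure may differ, and the layers and partition here are hypothetical; the point is only that the expected-edge term of classical modularity is swapped for another layer's adjacency:

```python
def layer_modularity(adj, null_adj, communities):
    """Modularity-like score where the null model is an empirical network layer
    (one of the multiplex layers or its complement), not a random graph."""
    n = len(adj)
    two_m = sum(sum(row) for row in adj)  # 2m for an undirected adjacency matrix
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - null_adj[i][j]
    return q / two_m if two_m else 0.0

# Hypothetical 4-node multiplex: layer A is partitioned, layer B serves as null.
layer_a = [[0, 1, 1, 0],
           [1, 0, 1, 0],
           [1, 1, 0, 0],
           [0, 0, 0, 0]]
layer_b = [[0, 1, 0, 0],
           [1, 0, 0, 0],
           [0, 0, 0, 0],
           [0, 0, 0, 0]]

q = layer_modularity(layer_a, layer_b, communities=[0, 0, 0, 1])
```

A high score now rewards communities whose internal edges exist in layer A but not in the null layer, i.e. groups connected through the specific facet that distinguishes the two layers.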
Post Date: 24 June 2024
Network science and explainable AI-based life cycle management of sustainability models
Model-based assessment of the potential impacts of variables on the Sustainable Development Goals (SDGs) can bring great additional information about possible policy intervention points. In the context of sustainability planning, machine learning techniques can provide data-driven solutions throughout the modeling life cycle. In a changing environment, existing models must be continuously reviewed and developed for effective decision support. Thus, we propose to use the Machine Learning Operations (MLOps) life cycle framework. A novel approach for model identification and development is introduced, which involves utilizing the Shapley value to determine the individual direct and indirect contributions of each variable towards the output, as well as network analysis to identify key drivers and support the identification and validation of possible policy intervention points. The applicability of the methods is demonstrated through a case study of the Hungarian water model developed by the Global Green Growth Institute. Based on the exploration of the model for the water efficiency and water stress SDG indicators (SDG 6.4.1 and 6.4.2 over the examined period), water reuse and water circularity offer a more effective intervention option than pricing or the use of internal or external renewable water resources.
Post Date: 17 June 2024
Machine Learning-Supported Designing of Human–Machine Interfaces
The design and functionality of the human–machine interface (HMI) significantly affects operational efficiency and safety related to process control. Alarm management techniques consider the cognitive model of operators, but mainly only from a signal perception point of view. To develop a human-centric alarm management system, the construction of an easy-to-use and supportive HMI is essential. This work suggests a development method that uses machine learning (ML) tools. The key idea is that more supportive higher-level HMI displays can be developed by analysing operator-related events in the process log file. The obtained process model contains relevant data on the relationship of the process events, enabling a network-like visualisation. Attributes of the network allow us to solve the minimisation problem of the ideal workflow–display relation. The suggested approach allows a targeted process pattern exploration to design higher-level HMI displays with respect to content and hierarchy. The method was applied in a real-life hydrofluoric acid alkylation plant, where a proposal was made about the content of an overview display.
Post Date: 13 May 2024
Network-based visualisation of frequent sequences
Frequent sequence pattern mining is an excellent tool to discover patterns in event chains. In complex systems, events from parallel processes are present, often without proper labelling. To identify the groups of events related to the subprocess, frequent sequential pattern mining can be applied. Since most algorithms provide too many frequent sequences that make it difficult to interpret the results, it is necessary to post-process the resulting frequent patterns. The available visualisation techniques do not allow easy access to multiple properties that support a faster and better understanding of the event scenarios. To address this issue, our work proposes an intuitive and interactive solution to support this task, introducing three novel network-based sequence visualisation methods that can reduce the time of information processing from a cognitive perspective. The proposed visualisation methods offer a more information-rich and easily understandable interpretation of sequential pattern mining results compared to the usual text-like outcome of pattern mining algorithms. The first uses the confidence values of the transitions to create a weighted network, while the second enriches the adjacency matrix based on the confidence values with similarities of the transitive nodes. The enriched matrix enables a similarity-based Multidimensional Scaling (MDS) projection of the sequences. The third method uses similarity measurement based on the overlap of the occurrences of the supporting events of the sequences. The applicability of the method is presented in an industrial alarm management problem and in the analysis of clickstreams of a website. The method was fully implemented in a Python environment. The results show that the proposed methods are highly applicable for the interactive processing of frequent sequences, supporting the exploration of the inner mechanisms of complex systems.
Post Date: 13 May 2024
Éva Kenyeres was awarded 1st place at XXVII. Spring Wind Conference
The presentation of Éva Kenyeres, entitled Stochastic Algorithms in Process Engineering, was awarded 1st place by the jury in the Mathematics and IT section of the XXVII. Spring Wind Conference, held on 3-5 May this year at Óbuda University. The interdisciplinary event aims to gather PhD students in Hungary to share their ideas and receive valuable feedback from the jury, which points out gaps in the works to encourage their further development.
Post Date: 07 May 2024
Iterative experimental design and identifiability analysis of composite material failure models
The parameter identification of failure models for composite plies can be cumbersome due to multiple effects arising as consequences of brittle fracture. Our work proposes an iterative, nonlinear design of experiments (DoE) approach that finds the most informative experimental data to identify the parameters of the Tsai-Wu, Tsai-Hill, Hoffman, Hashin, max stress and Puck failure models. Depending on the data, the models perform differently; therefore, the parameter identification is validated by the Euclidean distance of the measured points to the closest ones on the nominal surface. The resulting errors provide a basis for ranking the models, which helps to select the best-fitting one. Following the validation, the sensitivity of the best model is calculated by partial differentiation, and a theoretical surface is generated. Lastly, an iterative design of the experiments is implemented to select the optimal set of experiments from which the parameters can be identified from the least data by minimizing the fitting error. In this way, the number of experiments required for the identification of a model of a composite material can be significantly reduced. We demonstrate how the proposed method selected the optimal experiments out of generated data. The results indicate that if the dataset contains enough information, the method is robust and accurate. If the dataset lacks the necessary information, novel material tests can be proposed based on the optimal points of the parameter sensitivity of the generated failure model surface.
Post Date: 27 April 2024
Explainable prediction of node labels in multilayer networks: a case study of turnover prediction in organizations
In real-world classification problems, it is important to build accurate prediction models and provide information that can improve decision-making. Decision-support tools are often based on network models, and this article uses information encoded by social networks to solve the problem of employee turnover. However, understanding the factors behind black-box prediction models can be challenging. Our question concerned the predictability of employee turnover, given information from the multilayer network that describes collaborations, together with perceptions assessing the performance of organizations, which indicate the success of cooperation. Our goal was to develop an accurate prediction procedure, preserve the interpretability of the classification, and capture the wide variety of specific reasons that explain positive cases. After feature engineering, we identified the variables with the best predictive power using decision trees and ranked them based on their added value, considering their frequent co-occurrence. We applied a Random Forest with the SMOTE balancing technique for prediction. We calculated the SHAP values to identify the variables that contribute the most to individual predictions. As a last step, we clustered the sample based on the SHAP values to fine-tune the explanations for quitting due to different background factors.
Post Date: 19 April 2024
Knowledge Graph-Based Framework to Support Human-Centered Collaborative Manufacturing in Industry 5.0
The importance of highly monitored and analyzed processes, linked by information systems such as knowledge graphs, is growing. In addition, the integration of operators has become urgent, both for cost reasons and from a social point of view. An appropriate framework for implementing the Industry 5.0 approach requires effective data exchange in a highly complex manufacturing network to utilize resources and information. Furthermore, the continuous development of collaboration between human and machine actors is fundamental for industrial cyber-physical systems, as the workforce is one of the most agile and flexible manufacturing resources. This paper introduces the human-centric knowledge graph framework by adapting ontologies and standards to model operator-related factors such as monitoring movements, working conditions, or collaborating with robots. It also presents graph-based data querying, visualization, and analysis through an industrial case study. The main contribution of this work is a knowledge graph-based framework that focuses on the work performed by the operator, including the evaluation of movements, collaboration with machines, ergonomics, and other conditions. In addition, the use of the framework is demonstrated in a complex use case based on an assembly line, with examples of resource allocation and comprehensive support in terms of the collaboration aspect between shop-floor workers.
Post Date: 17 April 2024
Analysis and Clustering-Based Improvement of Particle Filter Optimization Algorithms
This study highlights how particle filter optimization (PFO) algorithms can explore objective functions and the robustness of solutions near the optima. Improvements of the general algorithm are also introduced to increase search efficiency. Population-based optimization algorithms reach outstanding performance by propagating not only one but many candidate solutions. One novel representative of these methods is the PFO concept, which was created as an analogue of the particle filter state estimation algorithm. The PFO algorithm results in a probability distribution of the sample elements, which can represent the shape of the objective function. Several variants of the PFO can be found in the literature, but its elements are not clearly fixed because of its novelty. In the present study, a method is introduced to gain information on the shape of the objective function by following the propagation of the particles along the iterations. The contributions of the paper are as follows: 1) a comparative study examining the different variants of the algorithm, with some improvements introduced (e.g., weight differentiation) to increase the efficiency of the general PFO algorithm; 2) an investigation of the propagation of the particles to explore the shape of the objective function; 3) a clustering-based technique to obtain information about the local optima (e.g., their robustness). The results verify that the proposed method is applicable for finding local optima and evaluating their robustness, which is a promising prospect for robust optimization problems, where often not the global but a more stable local optimum gives the best solution.
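A minimal PFO iteration (weight, resample, perturb) might look like the following sketch; the toy objective, the Boltzmann-style weighting, and all parameters are illustrative assumptions, not the paper's exact algorithm:

```python
import math
import random

random.seed(1)

def objective(x):
    # Toy multimodal objective (to be minimized); global optimum near x ≈ 2.2.
    return (x - 2) ** 2 + 0.5 * math.sin(5 * x)

# Particle filter optimization sketch: iterate weight -> resample -> perturb.
n = 200
particles = [random.uniform(-5.0, 5.0) for _ in range(n)]
for _ in range(30):
    # Lower objective value -> higher weight (Boltzmann-style weighting).
    weights = [math.exp(-objective(p)) for p in particles]
    # Resampling concentrates the sample around promising regions...
    particles = random.choices(particles, weights=weights, k=n)
    # ...and perturbation (the "prediction" step) keeps exploring around them.
    particles = [p + random.gauss(0.0, 0.1) for p in particles]

best = min(particles, key=objective)
```

After the iterations, the particle cloud itself approximates a distribution shaped by the objective function, which is exactly what the clustering-based analysis of the particles exploits to characterize local optima and their robustness.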
Post Date: 17 April 2024
Particle filtering supported probability density estimation of mobility patterns
This paper presents a methodology that aims to enhance the accuracy of probability density estimation in mobility pattern analysis by integrating prior knowledge of system dynamics and contextual information into the particle filter algorithm. The quality of the data used for density estimation is often inadequate due to measurement noise, which significantly influences the distribution of the measurement data. Thus, it is crucial to augment the information content of the input data by incorporating additional sources of information beyond the measured position data. These other sources can include the dynamic model of movement and the spatial model of the environment, which influences motion patterns. To effectively combine the information provided by positional measurements with system and environment models, the particle filter algorithm is employed, which generates discrete probability distributions. By subjecting these discrete distributions to exploratory techniques, it becomes possible to extract more certain information compared to using raw measurement data alone. Consequently, this study proposes a methodology in which probability density estimation is not solely based on raw positional data but rather on probability-weighted samples generated through the particle filter. This approach yields more compact and precise modeling distributions. Specifically, the method is applied to process position measurement data using a nonparametric density estimator known as kernel density estimation. The proposed methodology is thoroughly tested and validated using information-theoretic and probability metrics. The applicability of the methodology is demonstrated through a practical example of mobility pattern analysis based on forklift data in a warehouse environment.
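The core step, a kernel density estimate built from probability-weighted particles instead of raw measurements, can be sketched as follows (the samples, weights, and bandwidth are hypothetical):

```python
import math

def weighted_kde(x, samples, weights, bandwidth):
    """Gaussian kernel density estimate where each sample carries a particle
    filter weight instead of a uniform 1/n contribution."""
    total = sum(weights)
    density = 0.0
    for s, w in zip(samples, weights):
        u = (x - s) / bandwidth
        density += (w / total) * math.exp(-0.5 * u * u) / (bandwidth * math.sqrt(2.0 * math.pi))
    return density

# Hypothetical particle filter output: 1-D positions with probability weights;
# the filter has already down-weighted the implausible outlier at 3.0.
samples = [1.0, 1.1, 0.9, 3.0]
weights = [0.4, 0.3, 0.25, 0.05]

d_near_mode = weighted_kde(1.0, samples, weights, bandwidth=0.3)
d_outlier = weighted_kde(3.0, samples, weights, bandwidth=0.3)
```

Because the weights already encode the movement and environment models, the resulting density is more compact around plausible positions than a KDE over the raw measurements would be.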
Post Date: 15 April 2024
Generally Applicable Q-Table Compression Method and Its Application for Constrained Stochastic Graph Traversal Optimization Problems
We analyzed a special class of graph traversal problems, where the distances are stochastic, and the agent is restricted to take a limited range in one go. We showed that both constrained shortest Hamiltonian pathfinding problems and disassembly line balancing problems belong to the class of constrained shortest pathfinding problems, which can be represented as mixed-integer optimization problems. Reinforcement learning (RL) methods have proven their efficiency in multiple complex problems. However, researchers concluded that the learning time increases radically as the state and action spaces grow. In continuous cases, approximation techniques are used, but these methods have several limitations in mixed-integer search spaces. We present the Q-table compression method as a multistep method with dimension reduction, state fusion, and space compression techniques that project a mixed-integer optimization problem into a discrete one. The RL agent is then trained using an extended Q-value-based method to deliver a human-interpretable model for optimal action selection. Our approach was tested in selected constrained stochastic graph traversal use cases, and comparative results are shown against the simple grid-based discretization method.
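The compression idea, projecting a mixed (continuous + discrete) state onto a coarse discrete key before a tabular Q-update, can be sketched as follows; the binning rule, states, and rewards are hypothetical, not the paper's exact scheme:

```python
from collections import defaultdict

def compress(state):
    """State-fusion sketch: bin the continuous part (remaining range) so that a
    mixed-integer state maps to a small discrete key."""
    remaining_range, node = state
    return (min(int(remaining_range // 10), 5), node)

q_table = defaultdict(float)
alpha, gamma = 0.5, 0.9

def q_update(state, action, reward, next_state, next_actions):
    """Standard Q-learning update performed on the compressed state keys."""
    key = (compress(state), action)
    best_next = max((q_table[(compress(next_state), a)] for a in next_actions),
                    default=0.0)
    q_table[key] += alpha * (reward + gamma * best_next - q_table[key])

# Two distinct raw states fuse into the same compressed Q-table entry, so
# experience from both updates the same, human-readable table row.
q_update((23.4, "A"), "go_B", -5.0, (11.0, "B"), ["go_C"])
q_update((27.9, "A"), "go_B", -5.0, (14.2, "B"), ["go_C"])
```

Keeping the Q-function tabular over the compressed keys is what preserves the human-interpretable action selection model the abstract mentions, at the cost of the resolution lost in the binning.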
Post Date: 01 April 2024
Disassembly line optimization with reinforcement learning
As environmental aspects become increasingly important, disassembly problems have moved into researchers' focus. The multiple criteria involved do not enable a general optimization method for the topic, but some heuristics and classical formulations provide effective solutions. Highlighting that disassembly problems are not the straight inverses of assembly problems and that the conditions are not standard, disassembly optimization solutions require human control and supervision. Considering that reinforcement learning (RL) methods can successfully solve complex optimization problems, we developed an RL-based solution for a fully formalized disassembly problem. Successful implementations of RL-based optimizers were already known, but we integrated a novel heuristic to target a dynamically pre-filtered action space for the RL agent (the DLOPTRL algorithm) and hence significantly raised the efficiency of the learning path. Our algorithm belongs to the Heuristically Accelerated Reinforcement Learning (HARL) method class. We demonstrated its applicability in two use cases, but our approach can also be easily adapted to other problem types. Our article gives a detailed overview of disassembly problems and their formulation, the general RL framework and especially Q-learning techniques, and a complete example of extending RL with a built-in heuristic.
Post Date: 11 March 2024
Machine learning-based decision support framework for CBRN protection
Detecting chemical, biological, radiological and nuclear (CBRN) incidents is a high priority task and has been a topic of intensive research for decades. Ongoing technological, data processing, and automation developments are opening up new potentials in CBRN protection, which has become a complex, interdisciplinary field of science. Accordingly, chemists, physicists, meteorologists, military experts, programmers, and data scientists are all involved in the research. The key to effectively enhancing CBRN defence capabilities is continuous and targeted development along a well-structured concept. Our study highlights the importance of predictive analytics by providing an overview of the main components of modern CBRN defence technologies, including a summary of the conceptual requirements for CBRN reconnaissance and decision support steps, and by presenting the role and recent opportunities of information management in these processes.
Post Date: 29 February 2024
Measuring factors affecting local loyalty based on a correlation network
Understanding the level of local loyalty is crucial for urban planners, as individuals who exhibit higher levels of loyalty are more likely to adopt a “voice” strategy and act in the interest of their community, while being less likely to relocate. This study aims to develop a methodology for assessing and determining the factors influencing local loyalty levels. It is presumed that different factors contribute to each level of local loyalty. Through the identification of loyalty components and potential drivers, a data-driven approach based on correlation networks was employed to identify critical factors influencing loyalty at varying levels. The methodology was applied in Veszprém, Hungary, the European Capital of Culture in 2023, using a representative survey. The findings reveal that while demographic variables exhibit a weak correlation with loyalty levels, residents living in the city centre tend to show higher loyalty. Factors associated with high local loyalty include well-being, employment opportunities, healthy social relationships, and strong family ties. Conversely, the least loyal group is characterized by weak connections with friends, neighbours, and colleagues, as well as living in unsafe environments.
Post Date: February 2024
Utility function-based generalization of sum of ranking differences–country-wise analysis of greenhouse gas emissions
The utility function-based sum of ranking differences (uSRD) method is proposed as a utility function-based multi-criteria decision analysis tool. Our idea is that the transformation functions can be represented by a utility function that can be aggregated with multi-attribute utility functions. We present a framework incorporating utility values as the basis for three different but interconnected analyses. The exemplary application focuses on greenhouse gas emissions and economic indicators of 147 countries. First, the uSRD is applied to the utility values to uncover the hidden relationships of the 40 indicators. A ranking of countries is established to see which countries perform best and worst in both emissions and economy. Lastly, mitigation actions are delegated to countries through a three-stage assignment that connects emissions to utilities, sectors, and mitigation actions. The results show that the uSRD excels as a support tool for decision-making.
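A core building block of the uSRD is the sum of ranking differences, which compares the ranking induced by one indicator against a reference ranking. A minimal sketch with toy utility values of four countries (the data and the reference column are illustrative assumptions, not from the paper; ties are broken by order rather than averaged):

```python
def ranks(values):
    """1-based descending ranks; ties broken by position (simple sketch)."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def srd(criterion, reference):
    """Sum of ranking differences of one criterion against the reference ranking."""
    rc, rr = ranks(criterion), ranks(reference)
    return sum(abs(a - b) for a, b in zip(rc, rr))

# Utility values of 4 hypothetical countries under one indicator vs. a reference
indicator = [0.9, 0.4, 0.7, 0.1]
reference = [0.8, 0.5, 0.6, 0.2]
```

An indicator that orders the countries exactly like the reference scores 0; the larger the SRD value, the less the indicator agrees with the reference.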
Post date: 28 February 2024
Neighborhood Ranking-Based Feature Selection
This article aims to integrate k-NN regression, false-nearest neighborhood (FNN), and trustworthiness and continuity (T&C) neighborhood-based measures into an efficient and robust feature selection method to support the identification of nonlinear regression models. The proposed neighborhood ranking-based feature selection technique (NRFS) is validated on three problems: a linear regression task, the nonlinear Friedman database, and the determination of the order of nonlinear dynamical models. A neural network is also identified to validate the resulting feature sets. The analysis of the distance correlation also confirms that the method is capable of exploring the nonlinear correlation structure of complex systems. The results illustrate that the proposed NRFS method can select relevant variables for nonlinear regression models.
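One way to picture the role of k-NN regression in neighborhood-based feature selection is to score each candidate feature subset by its leave-one-out k-NN error: features that preserve the neighborhood structure of the target yield low error. A toy sketch (this is not the actual NRFS algorithm, which also combines FNN and T&C measures; the data are invented):

```python
def knn_loo_error(X, y, features, k=1):
    """Leave-one-out k-NN regression error using only the selected features."""
    err = 0.0
    for i in range(len(X)):
        dists = []
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in features)
            dists.append((d, y[j]))
        dists.sort()
        pred = sum(v for _, v in dists[:k]) / k  # average of k nearest targets
        err += (pred - y[i]) ** 2
    return err / len(X)

# Toy data: y depends on feature 0 only; feature 1 is noise
X = [[0.0, 5.0], [1.0, 1.0], [2.0, 4.0], [3.0, 0.0], [4.0, 3.0]]
y = [0.0, 1.0, 2.0, 3.0, 4.0]
```

Ranking the single-feature subsets by this error correctly identifies feature 0 as the relevant one, since its neighborhoods predict the target far better than those of the noise feature.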
Post date: 05 February 2024
Comparison of different data and information fusion methods to improve the performance of machine learning models
The combined handling of information from fast and non-destructive spectroscopic measurement techniques enables more accurate and robust models. Machine learning algorithms that estimate quality parameters are sometimes not accurate enough, so additional information from different analytical measurements is needed to improve model accuracy. This research compares five data and information fusion techniques tested on spectroscopic results from various oil industry samples. We aim to show which fusion technique should be used and how to build models in this way. We used the boosting technique to compare which approach provides the most information from the measurements. We used mid-infrared and Raman spectral data from the same samples in this study and applied low, medium, high, and complex data fusion techniques. Our motivation is to combine and compare the treatment of different measurement techniques that can provide additional information. The difference between the five methods is the level at which the fusion takes place. At a low level, the variables come from spectral data, while at a medium level, we used individually created model results. Furthermore, at the complex level, we used the data of the models built together with the ensemble learning technique for the spectral data.
Based on our achievements in the oil industry, data fusion techniques can significantly improve the accuracy of machine learning models. In our case study, the best results were obtained with the fifth-level data fusion technique, where the developed model predicts the hydrocarbon/imide ratio of the additive with half the error obtained without data fusion.
Post Date: January 2024
Extension of HAAS for the Management of Cognitive Load
The rapid advancement of technology related to Industry 4.0 has brought about a paradigm shift in the way we interact with assets across various domains. This progress has led to the emergence of the concept of a Human Digital Twin (HDT), a virtual representation of an individual’s cognitive, psychological, and behavioral characteristics. The HDT has demonstrated potential as a strategic tool for enhancing productivity, safety, and collaboration within the framework of Industry 5.0. In response to this challenge, this paper outlines a process for tracking human cognitive load using the galvanic skin response as a physiological marker and proposes a novel method for managing cognitive load based on the extended Human Asset Administration Shell (HAAS). The proposed HAAS framework integrates real-time data streams from wearable sensors, user skills, contextual information, task specifics, and environmental and surrounding conditions to deliver a comprehensive understanding of an individual’s cognitive state, physical wellness, and skill set. Through the incorporation of skill sets; physical, physiological, and psychological variables; and task parameters, the developed HAAS framework enables the identification, management, and development of individual capabilities, thereby facilitating individualized training and knowledge exchange. The applicability of the developed framework is proven by experiments in the Operator 4.0 laboratory using the detailed HAAS parameters.
Post date: 30 January 2024
Fault Diagnostics Based on the Analysis of Probability Distributions Estimated Using a Particle Filter
This paper proposes a monitoring procedure based on characterizing state probability distributions estimated using particle filters. The work highlights what types of information can be obtained during state estimation and how the revealed information helps to solve fault diagnosis tasks. If a failure is present in the system, the output predicted by the model is inconsistent with the actual output, which affects the operation of the estimator. The heterogeneity of the probability distribution of states increases, and a large proportion of the particles lose their information content. The correlation structure of the posterior probability density can also be altered by failures. The proposed method uses various indicators that characterize the heterogeneity and correlation structure of the state distribution, as well as the consistency between model predictions and observed behavior, to identify the effects of failures. The applicability of the utilized measures is demonstrated through a dynamic vehicle model, where actuator and sensor failure scenarios are investigated.
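A standard indicator of the particle degeneracy described above is the effective sample size computed from the normalized particle weights: a sharp drop flags that most particles have lost their information content because the model no longer explains the measurements. A minimal sketch with illustrative weight vectors (not the paper's actual indicators):

```python
def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalized particle weights.
    N_eff is close to the particle count for a healthy, informative sample
    and collapses toward 1 when the weights degenerate after a fault."""
    s = sum(weights)
    norm = [w / s for w in weights]
    return 1.0 / sum(w * w for w in norm)

healthy = [0.25, 0.25, 0.25, 0.25]   # model consistent with measurements
faulty  = [0.97, 0.01, 0.01, 0.01]   # weights degenerate after a failure
```

Monitoring this quantity over time, together with correlation-structure measures, provides a simple numerical trigger for fault detection.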
Post date: 25 January 2024
Darányi, A., & Abonyi, J. (2024). Fault Diagnostics Based on the Analysis of Probability Distributions Estimated Using a Particle Filter. Sensors, 24(3), 719. https://doi.org/10.3390/s24030719
This book presents a comprehensive framework for developing Industry 4.0 and 5.0 solutions through the use of ontology modeling and graph-based optimization techniques. With effective information management being critical to successful manufacturing processes, this book emphasizes the importance of adequate modeling and systematic analysis of interacting elements in the era of smart manufacturing.
The book provides an extensive overview of semantic technologies and their potential to integrate with existing industrial standards, planning, and execution systems to provide efficient data processing and analysis. It also investigates the design of Industry 5.0 solutions and the need for problem-specific descriptions of production processes, operator skills and states, and sensor monitoring in intelligent spaces.
The book proposes that ontology-based data can efficiently represent enterprise and manufacturing datasets.
The book is divided into two parts: modeling and optimization. The semantic modeling part provides an overview of ontologies and knowledge graphs that can be used to create Industry 4.0 and 5.0 applications, with two detailed applications presented on a reproducible industrial case study. The optimization part of the book focuses on network science-based process optimization and presents various detailed applications, such as graph-based analytics, assembly line balancing, and community detection.
The book is based on six key points: the need for horizontal and vertical integration in modern industry; the potential benefits of integrating semantic technologies into ERP and MES systems; the importance of optimization methods in Industry 4.0 and 5.0 concepts; the need to process large amounts of data while ensuring interoperability and re-usability factors; the potential for digital twin models to model smart factories, including big data access; and the need to integrate human factors in CPSs and provide adequate methods to facilitate collaboration and support shop floor workers.
You can order this book at Springer
Objective well-being level (OWL) composite indicator for sustainable and resilient cities
Well-being is a critical element of the 2030 Agenda for Sustainable Development Goals. Given the complexity of the concept of well-being, it follows that its measurement requires complex, multivariate methods that can characterize the physical, economic, social and environmental aspects along with the mental state of a city. However, settlement-level analyses alone are not sufficient to make cities inclusive, safe, resilient and sustainable; it is also necessary to understand patterns within settlements. This work aims to present how the urban macrostructure of urban well-being indicators can be estimated based on GIS-based multilayer analysis. Open-source data, e.g. road networks, points of interest, green spaces and vegetation, are used to estimate urban well-being parameters such as noise levels, air quality and health-related impacts, supplemented by climate models to assess urban resilience and sustainability. The proposed methodology integrates 24 models into six categories, namely walkability, environment, health, society, climate change and safety, which are weighted based on a multilevel Principal Component Analysis to minimize information loss for aggregated composite indicators. The study revealed two main components of the macrostructure related to well-being in the studied city: one related to the geometrical features, while the other can be derived from the structure of the natural environment. In Veszprém, natural restoration of the detached-house, industrial, and downtown areas is recommended, including developments with green and blue infrastructural elements and nature-based solutions.
Post date: 09 January 2024
Research laboratory members took third place in Reman Challenge 2023
The Reman Challenge was hosted for the fifth time by BORG Automotive, with three students from the research laboratory taking part. The team of Abdulrahman Khalid, Timea Czvetkó and Gergely Halász finished in third place.
The challenge called for researching and innovating transformative solutions that can revolutionize operators' working conditions in remanufacturing operations, from virtual reality to human-robot collaboration, bounded only by the feasibility and scalability of the solutions.
Post date: 05 January 2024
Time-dependent sequential association rule-based survival analysis: A healthcare application
The analysis of event sequences with temporal dependencies holds substantial importance across various domains, including healthcare. This study introduces a novel approach that combines sequential rule mining and survival analysis to uncover significant associations and temporal patterns within event sequences. By integrating these techniques, we address the limitations linked to the loss of temporal information. The methodology extends traditional sequential rule mining by introducing time-dependent confidence functions, providing a comprehensive understanding of relationships between antecedent and consequent events. The incorporation of the Kaplan-Meier estimator of survival analysis enables the calculation of temporal distributions between events, resulting in time-dependent confidence functions. These confidence functions illuminate the probability of specific event occurrences considering temporal contexts. To present the application of the method, we demonstrate its usage within the healthcare domain. Analyzing the ICD-10 codes and the laboratory events, we successfully identified relevant sequential rules and their time-dependent confidence functions. This empirical validation underscores the potential of the methodology to uncover clinically significant associations within intricate medical data.
The study presents a unique methodology that integrates sequential rule mining and survival analysis.
The methodology extends traditional sequential rule mining by introducing time-dependent confidence functions.
The application of the method is demonstrated within the healthcare domain.
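In the special case of no censored observations, a Kaplan-Meier-based confidence function reduces to the empirical distribution of antecedent-to-consequent gap times, scaled by the rule's overall confidence. A toy sketch of such a time-dependent confidence function (the gap times and counts are invented for illustration):

```python
def time_dependent_confidence(gap_times, n_antecedents):
    """Return a step function conf(t): the share of antecedent events that
    were followed by the consequent within t time units. With no censoring,
    this empirical estimate coincides with the Kaplan-Meier-based value."""
    times = sorted(gap_times)
    def conf(t):
        return sum(1 for g in times if g <= t) / n_antecedents
    return conf

# Toy gaps (days) between a diagnosis code and a lab event: the diagnosis
# occurred 10 times, but only 8 occurrences were ever followed by the event
conf = time_dependent_confidence([1, 2, 2, 5, 7, 10, 30, 90], n_antecedents=10)
```

The function rises stepwise with time and plateaus at the classical (time-independent) confidence of the rule, here 0.8.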
Post date: 05 January 2024
Magyar Fuzzy Társaság Ifjúsági Díj / Hungarian Fuzzy Society Youth Award
András Darányi, PhD student of the Faculty of Engineering, received the Hungarian Fuzzy Society Youth Award. The prize was awarded in recognition of his research and scientific achievements in the field of soft computing methods.
Post date: 21 November 2023
Operator 4.0 research network: A key player in Gartner's top trends for 2024
Gartner has just published its top 10 strategic technology trends for 2024 report. The augmented-connected workforce is in 9th place; this is the main goal of the Operator 4.0 research network led by our research group.
Post date: 17 October 2023
Indoor Positioning-based Occupational Exposures Mapping and Operator Well-being Assessment in Manufacturing Environment
This research was motivated by the need for detailed information about the spatial and contextualized distribution of occupational exposures, which can be used to improve the layout of the workspace. To achieve this goal, the study emphasizes the need for position-related information and contextualized data. To address these concerns, the study proposes the use of Indoor Positioning System (IPS) sensors that can be further developed to establish a set of metrics for measuring and evaluating occupational exposures. The proposed IPS-based sensor fusion framework, which combines various environmental parameters with position data, can provide valuable insights into the operator’s working environment. For this, we propose an indoor position-based comfort level indicator. By identifying areas of improvement, interventions can be implemented to enhance operator performance and overall health. The measurement unit was installed on a manual material handling device in a real production environment and collected data using temperature, noise, and humidity sensors. The results demonstrated the applicability of the proposed comfort level indicator in a wire harness manufacturing setting, providing location-based information to enhance operator well-being. Overall, the proposed framework can be used as a tool to monitor the industrial environment, especially the well-being of shop floor operators.
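A comfort level indicator of this kind can be imagined as an aggregation of per-factor scores against comfort bands; the bands, the linear penalty, and the equal weighting below are illustrative assumptions, not the calibrated values of the study:

```python
def comfort_level(temp_c, noise_db, humidity_pct):
    """Toy position-based comfort sketch: score each measured factor in
    [0, 1] against an assumed comfort band and average the scores."""
    def band_score(value, lo, hi):
        # 1.0 inside the band, decaying linearly with distance outside it
        if lo <= value <= hi:
            return 1.0
        dist = lo - value if value < lo else value - hi
        return max(0.0, 1.0 - dist / (hi - lo))
    scores = [
        band_score(temp_c, 20, 24),       # assumed thermal comfort band (°C)
        band_score(noise_db, 0, 70),      # assumed acceptable noise (dB)
        band_score(humidity_pct, 30, 60), # assumed humidity band (%)
    ]
    return sum(scores) / len(scores)
```

Evaluating this score per indoor position yields a comfort map of the shop floor, pointing at zones where interventions are worthwhile.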
Post date: 12 October 2023
Assessing human worker performance by pattern mining of Kinect sensor skeleton data
The human worker is an indispensable factor in manufacturing processes. Traditional observation methods to assess their performance are time-consuming and expert-dependent, and detailed movement trajectories cannot be diagnosed with the naked eye. Industry 4.0 technologies can innovate that process with smart sensors paired with data mining techniques for automated operation and develop a database of frequent movements for corporate reference and improvement. This paper proposes an approach to automatically assess worker performance with skeleton data by applying pattern mining methods and supervised learning algorithms. A use case is performed on an electrical assembly line to validate the approach, with the skeleton data collected by Kinect sensor v2. By using supervised learning, the movements of workers in each workstation can be segmented, and the line performance can be assessed. The work movement motifs can be recognized with pattern mining. The mined results can be used to further improve the production processes in terms of work procedures, movement symmetry, body utilization, and other ergonomics factors for both short and long-term human resource development. The promising result motivates further utilization of easy-to-adopt technology in Industry 5.0, which facilitates human-centric data-driven improvements.
Post date: 12 September 2023
Current development on the Operator 4.0 and transition towards the Operator 5.0: A systematic literature review in light of Industry 5.0
The technology-driven Industry 4.0 (I4.0) paradigm, combined with human-centrism, sustainability, and resilience orientation, forms the Industry 5.0 (I5.0) paradigm, providing support for the workforce and enabling the Operator 4.0 (O4.0) approach. The I5.0 focuses can face unforeseen challenges, as the applicability and readiness of I4.0 solutions are still not well discussed in the literature. Therefore, structuring existing knowledge of O4.0 to prepare for the smooth transition toward Operator 5.0 (O5.0) is crucial. A systematic literature review is performed in the Scopus database, considering publications up to 31 December 2022. Bibliography Network Analysis (BNA), text mining techniques (i.e., Latent Dirichlet Allocation (LDA), BERTopic), and knowledge graphs (KG) were deployed on the retrieved abstracts. The full-text examination is carried out over papers matched by LDA and BERTopic. From the BNA result of 279 relevant papers, clusters of active researchers and topics were found, while text-mining results revealed trending and missing research directions. The extracted details from the full text of 81 papers reflected the coverage and development levels of O4.0 types with the preparation for resilience, human-centrism, and sustainability. Achieved results suggest that though the O5.0 transition is inevitable, I4.0 technologies are not ready with sufficient human factor integration. Missing research orientations include integrated sustainability from the human perspective and system resilience, concerning drivers and restrainers for technology adoption. To prepare for O5.0, the discussed O4.0 drivers can help to shape favorable conditions, and the restrainers should be mitigated before adopting human-centric technologies. Further study including grey literature is necessary to exploit more industrial and policy-making perspectives.
Post date: 04 August 2023
Heart Rate Variability Measurement to Assess Acute Work-Content-Related Stress of Workers in Industrial Manufacturing Environment—A Systematic Scoping Review
Background: Human workers are indispensable in the human–cyber-physical system in the forthcoming Industry 5.0. As inappropriate work content induces stress and harmful effects on human performance, engineering applications search for a physiological indicator for monitoring the well-being state of workers during work; thus, the work content can be modified accordingly. The primary aim of this study is to assess whether heart rate variability (HRV) can be a valid and reliable indicator of acute work-content-related stress (AWCRS) in real time during industrial work. Second, we aim to provide a broader scope of HRV usage as a stress indicator in this context. Methods: A search was conducted in Scopus, IEEE Xplore, PubMed, and Web of Science between 1 January 2000 and 1 June 2022. Eligible articles are analyzed regarding study design, population, assessment of AWCRS, and its association with HRV. Results: A total of 14 studies met the inclusion criteria. No randomized control trial (RCT) was conducted to assess the association between AWCRS and HRV. Five observational studies were performed. Both AWCRS and HRV were measured in nine further studies, but their associations were not analyzed. Results suggest that HRV does not fully reflect the AWCRS during work, and it is problematic to measure the effect of AWCRS on HRV in the real manufacturing environment. The evidence is insufficient for a reliable conclusion about the HRV diagnostic role as an indicator of human worker status. Conclusion: This review is valuable in the Operator 4.0 paradigm, calling for more trials to validate the use of HRV to measure AWCRS on human workers.
Post date: 11 July 2023
Sequence Compression and Alignment-Based Process Alarm Prediction
With the increasing complexity of production technologies, alarm management becomes more and more important in industrial process control. The overall safety of the plant relies heavily on the situation-aware response time of the staff. This kind of awareness has to be supported by a state-of-the-art alarm management system, which requires broad and up-to-date process-relevant knowledge. The proposed method provides a solution when such information is not fully available. With the utilization of machine learning algorithms, a real-time event scenario prediction can be gained by comparing the frequent event patterns extracted from historical event-log data with the actual online data stream. This study discusses an integrated solution, which combines sequence compression and sequence alignment to predict the most probable alarm progression. The effectiveness and limitations of the proposed method are tested using the data of an industrial delayed-coker plant. The results confirm that the presented parameter-free method identifies the characteristic patterns (operational states) and their progression with high confidence in real time, suggesting its wider adoption for sequence analysis.
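The combination of pattern matching and prediction can be pictured with a simple alignment score: the online alarm stream is aligned against each historical frequent pattern, and the unmatched tail of the best match is proposed as the most probable progression. A deliberately naive sketch (the real method uses sequence compression and a more refined alignment; the alarm patterns below are invented):

```python
def lcs_len(a, b):
    """Length of the longest common subsequence, a simple alignment score."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def predict_next(stream, patterns):
    """Align the online stream with each historical pattern and return the
    predicted continuation of the best-matching pattern."""
    best, best_score = None, -1
    for p in patterns:
        score = lcs_len(stream, p)
        if score > best_score:
            best, best_score = p, score
    return best[best_score:]  # naive continuation: the unmatched pattern tail

# Hypothetical frequent alarm scenarios mined from historical logs
patterns = [
    ["A", "B", "C", "D"],   # e.g. a typical shutdown scenario
    ["E", "F", "G"],        # e.g. a typical startup scenario
]
```

Given the online stream `["A", "B", "C"]`, the best-matching scenario is the first one, so the predicted next alarm is `"D"`.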
Post date: 28 June 2023
The human-centric Industry 5.0 collaboration architecture
While the primary focus of Industry 4.0 revolves around extensive digitalization, Industry 5.0 seeks to integrate innovative technologies with human actors, signifying an approach that is more value-driven than technology-centric. The key objectives of the Industry 5.0 paradigm, which were not central to Industry 4.0, underscore that production should not only be digitalized but also resilient, sustainable, and human-centric. This paper focuses on the human-centric pillar of Industry 5.0. The proposed methodology addresses the need for a human-AI collaborative process design and innovation approach to support the development and deployment of advanced AI-driven co-creation and collaboration tools. The method aims to solve the problem of integrating various innovative agents (human, AI, IoT, robot) in a plant-level collaboration process through a generic semantic definition, utilizing a time event-driven process. It also encourages the development of AI techniques for human-in-the-loop optimization, incorporating cross-checking with alternative feedback loop models. Benefits of this methodology include the Industry 5.0 collaboration architecture (I5arc), which provides new adaptable, generic frameworks, concepts, and methodologies for modern knowledge creation and sharing to enhance plant collaboration processes.
The I5arc aims to investigate and establish a truly integrated human-AI collaboration model, equipped with methods and tools for human-AI driven co-creation.
It provides a framework for the co-execution of processes and activities, with humans remaining empowered and in control.
The framework primarily targets human-AI collaboration processes and activities in industrial plants, with potential applicability to other societal contexts.
Post date: 18 June 2023
Machine learning-based soft-sensor development for road quality classification
Vibrations in road vehicles cause several harmful effects: health problems can occur for the passengers, and mechanical damage can occur to the vehicle components. Given the health, safety, and financial issues that arise, keeping the road network in good condition and detecting road defects as early as possible requires an extensive monitoring system. Related to this, our study presents the development of hardware and software for a low-cost, multi-sensor road quality monitoring system for passenger vehicles. The developed monitoring system can classify road sections according to their quality parameters into four classes. In order to detect vibrations in the vehicle, accelerometers and gyroscope sensors are installed at several points. Then, a machine learning-based soft-sensor development is introduced. Besides noise filtering, each data point is resampled by spatial frequency to reduce the velocity dependence. Subsequently, a decision tree-based classification model is trained using features from the power spectrum and principal component analysis. The classification algorithm is validated and tested with measurement data in a real-world environment. In addition to reviewing the accuracy of the model, we examine the correlation of the data measured in the cabin and on the suspension to see how much additional information is provided by the sensor on the axle.
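The spectral feature extraction step can be illustrated with a direct DFT band-energy computation, followed by a threshold rule standing in for the trained decision tree (the signal values, band limits, and class thresholds below are invented for demonstration, not the study's calibrated model):

```python
import cmath
import math

def band_energy(signal, lo, hi):
    """Energy of DFT bins lo..hi (exclusive) of a vibration signal window."""
    n = len(signal)
    energy = 0.0
    for k in range(lo, hi):
        coef = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                   for t in range(n))
        energy += abs(coef) ** 2 / n
    return energy

def road_class(signal):
    """Threshold stand-in for the trained decision tree: map the vibration
    band energy to one of four road quality classes (0 = best, 3 = worst)."""
    e = band_energy(signal, 1, len(signal) // 2)
    for cls, limit in enumerate((1.0, 10.0, 100.0)):
        if e < limit:
            return cls
    return 3

smooth = [0.0] * 16                                       # no vibration at all
rough = [2.0 * math.sin(math.pi * t / 2) for t in range(16)]  # strong bumps
```

In the real system, the decision tree is trained on such power-spectrum features (after spatial-frequency resampling and PCA) rather than on hand-picked thresholds.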
Post date: 9 June 2023
Identifying the links among urban climate hazards, mitigation and adaptation actions and sustainability for future resilient cities
Comprehensive and objective assessment methods need to be developed to create inclusive, safe, resilient and sustainable cities. Monitoring the evolution of sustainability and well-being in cities is important for researchers implementing the UN 2030 Agenda. This research explores and analyzes climate change hazards, adaptation and mitigation actions, and their implementation in 776 cities located in 84 different countries. The climate action co-benefits support the achievement of the sustainable development goals, which is comprehensively elaborated in this methodological development. The analyses are carried out based on the continuously updated Carbon Disclosure Project database. An open-source algorithm has been developed that represents the CDP database as a bit table and uses frequent itemset mining for the identification of global patterns of climate hazards, mitigation and adaptation actions, and their co-benefits; therefore, this paper offers an exploratory analysis tool that is suitable for monitoring climate actions. The most frequently identified mitigation actions in cities were energy planting (1444 actions) and on-site renewable production (644), while the most common actions for adaptation were tree planting (283) and flood mapping (267). Regarding city size, 41% of large metropolitan areas plan to develop mass transit actions, while the separate collection of recyclables is typical in 85% of towns. 56.2% of the CDP database actions support the sustainable cities and communities goal (SDG11) and 54.2% the climate action goal (SDG13), while the shares for the affordable and clean energy (SDG7) and gender equality (SDG5) goals are below 5%.
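The bit-table representation and frequent itemset mining step can be sketched as counting item and pair supports over city rows (the city rows, item names, and support threshold below are toy values, not CDP data):

```python
from itertools import combinations

def frequent_itemsets(rows, min_support):
    """Count singleton and pair supports over the bit-table rows and keep
    the itemsets whose relative support reaches the threshold."""
    counts = {}
    for row in rows:
        items = sorted(row)
        for size in (1, 2):
            for combo in combinations(items, size):
                counts[combo] = counts.get(combo, 0) + 1
    n = len(rows)
    return {c: k / n for c, k in counts.items() if k / n >= min_support}

# Each row: the climate-related entries reported by one city
cities = [
    {"flood", "tree_planting"},
    {"flood", "tree_planting", "mass_transit"},
    {"heat", "mass_transit"},
    {"flood", "tree_planting"},
]
freq = frequent_itemsets(cities, min_support=0.5)
```

Frequent pairs such as a hazard together with an adaptation action reveal the global co-occurrence patterns the paper mines from the full database.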
Post date: 9 June 2023
The Role of Information Management in CBRN Defence (Az információmenedzsment szerepe az ABV-védelemben)
Detecting chemical, biological and nuclear (CBRN) incidents is a high-priority task that has been intensively researched for decades. Continuous advances in technology, data processing, and automation keep opening new development potential in CBRN defence, which has by now become a complex, interdisciplinary field of science. Accordingly, chemists, physicists, meteorologists, military experts, programmers, and data scientists all contribute to the research. The key to effectively enhancing national CBRN defence capabilities also lies in continuous and targeted development along a well-structured concept. Our research aims to provide an overview of the main components of modern CBRN defence technologies, including a summary of the conceptual requirements of CBRN reconnaissance and decision support steps, and to present the role and latest opportunities of information management in these processes.
Post date: 19 May 2023
Matrix factorization-based multi-objective ranking–What makes a good university?
Non-negative matrix factorization (NMF) efficiently reduces high dimensionality for many-objective ranking problems. In multi-objective optimization, as long as only three or four conflicting viewpoints are present, an optimal solution can be determined by finding the Pareto front. When the number of objectives increases, the multi-objective problem evolves into a many-objective optimization task, where the Pareto front becomes oversaturated. The key idea is that NMF aggregates the objectives so that the Pareto front can be applied, while the Sum of Ranking Differences (SRD) method selects the objectives that have a detrimental effect on the aggregation, and validates the findings. The applicability of the method is illustrated by the ranking of 1176 universities based on 46 variables of the CWTS Leiden Ranking 2020 database. The performance of NMF is compared to principal component analysis (PCA) and sparse non-negative matrix factorization-based solutions. The results illustrate that PCA incorporates negatively correlated objectives into the same principal component. On the contrary, NMF only allows non-negative correlations, which enables the proper use of the Pareto front. With the combination of NMF and SRD, a non-biased ranking of the universities based on 46 criteria is established, where Harvard, Rockefeller and Stanford Universities are determined as the first three. To evaluate the ranking capabilities of the methods, measures based on Relative Entropy (RE) and Hypervolume (HV) are proposed. The results confirm that the sparse NMF method provides the most informative ranking. The results highlight that academic excellence can be improved by decreasing the proportion of unknown open-access publications and short distance collaborations. The proportion of gender indicators barely correlates with scientific impact.
More authors, long-distance collaborations, and publications with higher average scientific impact and citations strongly influence the university ranking in a positive direction.
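The aggregation idea can be illustrated with Lee-Seung multiplicative updates, a common way to compute an NMF (this is a generic sketch on a toy "university x indicator" matrix, not the exact solver, sparsity settings, or data of the study):

```python
import numpy as np

def nmf(X, r, iters=1000, seed=0):
    """Lee-Seung multiplicative updates: factor nonnegative X ~ W @ H,
    aggregating many objectives (columns) into r nonnegative components."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, r)) + 0.1
    H = rng.random((r, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)   # update H with W fixed
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)   # update W with H fixed
    return W, H

# Toy matrix with an obvious rank-2 nonnegative structure
X = np.array([[1.0, 0.0, 2.0, 0.0],
              [0.0, 1.0, 0.0, 2.0],
              [2.0, 0.0, 4.0, 0.0]])
W, H = nmf(X, r=2)
err = np.linalg.norm(X - W @ H)
```

Because both factors are constrained to be nonnegative, each component can only add up correlated indicators, which is exactly the property that keeps the aggregated objectives usable for Pareto-front analysis.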
Post date: 13 April 2023
Frequent pattern mining-based log file partition for process mining
Process mining is a technique for exploring models based on event sequences, growing in popularity in the process industry. Process mining algorithms assume that the processed log files contain events generated by only one unknown process, which can lead to extremely complex and inaccurate models when this assumption is not met. To address this issue, this article proposes a frequent pattern mining-based method for log file partitioning, allowing for the exploration of parallel processes. The key idea is that frequent pattern mining can identify grouped events and generate sub-logs of overlapping sub-processes. Thanks to the pre-processing of the log files, more compact and interpretable process models can be identified. We developed a set of goal-oriented metrics to evaluate the complexity of process mining problems and the resulting models. The applicability and effectiveness of the method are demonstrated in the analysis of process alarms of an industrial plant. The results confirm that the proposed method enables the discovery of targeted sub-process models by partitioning the log file using frequent pattern mining, and the effectiveness of the method increases with the number of parallel processes stored in the same log file. We recommend applying the method in every case where there is no clear start and end of the logged events so that the log file can describe different processes.
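Assuming the raw log has already been cut into short time-window segments, the frequent-pair grouping at the heart of the partitioning can be sketched as follows (the segment contents and the support threshold are invented for illustration, and the grouping is a simplified stand-in for full frequent pattern mining):

```python
from itertools import combinations

def partition_log(segments, min_count=2):
    """Group events that frequently co-occur in the same log segment and
    return one event group per inferred sub-process."""
    counts = {}
    for seg in segments:
        for pair in combinations(sorted(set(seg)), 2):
            counts[pair] = counts.get(pair, 0) + 1
    groups = []
    for (a, b), c in counts.items():
        if c < min_count:
            continue
        # merge any existing groups that share an event with this pair
        hit = [g for g in groups if a in g or b in g]
        merged = set.union({a, b}, *hit)
        groups = [g for g in groups if g not in hit] + [merged]
    return groups

# Two parallel processes interleaved in one log: A-B-C and X-Y
segments = [["A", "B", "C"], ["X", "Y"], ["A", "B", "C"], ["X", "Y"]]
sublogs = partition_log(segments, min_count=2)
```

Feeding each resulting group's events to a process mining algorithm as a separate sub-log is what yields the more compact and interpretable models described above.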
Post date: 03 April 2023
Surface Water Monitoring Systems—The Importance of Integrating Information Sources for Sustainable Watershed Management
The complex interactions of anthropogenic activities, climate change, sedimentation and wastewater input have significantly affected the aquatic environment and the entire ecosystem. Over the years, researchers have investigated water monitoring approaches, from traditional monitoring to integrated systems, to handle such environmental assessment and predictions based on warning systems. However, research into the selection and optimization of water monitoring systems through the combination of parallel approaches in terms of sampling techniques, process analysis and results is limited. The research objectives of the present study are to evaluate existing water monitoring systems based on the latest approaches and then provide insights into the factors affecting sensor implementation at sampling locations. Here we summarize the advancements and trends of various water monitoring systems as well as the suitability of sensor placement by reviewing more than 300 papers published between 2011 and 2022. The research highlights the urgency of an integrative approach to water monitoring systems that includes both water quality and water quantity models. A framework is proposed to incorporate all water monitoring approaches, sampling techniques, and predictive models to provide comprehensive information for environmental assessment. It was observed that model-based approaches for verification and data fusion can improve the performance of such systems. Furthermore, integrated systems that include a separate modeling approach through integrated, semi-mechanistic models, data science and artificial intelligence are recommended for the future. Overall, this study provides guidelines for achieving standardized water management by implementing integrated water monitoring systems.
Post date: 31 March 2023
Goal-Oriented Tuning of Particle Filters for the Fault Diagnostics of Process Systems
This study introduces particle filtering (PF) for the tracking and fault diagnostics of complex process systems. In process systems, model equations are often nonlinear and environmental noise is non-Gaussian. We propose a method for state estimation and fault detection in a wastewater treatment system. The contributions of the paper are the following: (1) A method is suggested for sensor placement based on the state estimation performance; (2) based on the sensitivity analysis of the particle filter parameters, a tuning method is proposed; (3) a case study is presented to compare the performances of the classical PF and intelligent particle filtering (IPF) algorithms; (4) for fault diagnostics purposes, bias and impact sensor faults were examined; moreover, the efficiency of fault detection was evaluated. The results verify that particle filtering is applicable and highly efficient for tracking and fault diagnostics tasks in process systems.
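The classical bootstrap particle filter at the core of such a scheme can be sketched as follows. This is a generic 1-D toy model with assumed noise levels, not the wastewater-treatment system of the study: particles are propagated through the motion model, weighted by the observation likelihood, and resampled to avoid degeneracy.

```python
import numpy as np

def bootstrap_pf(y, n_particles=2000, q=0.1, r=0.5, seed=1):
    """Bootstrap particle filter for a 1-D random-walk state
    x_t = x_{t-1} + N(0, q^2) observed as y_t = x_t + N(0, r^2).
    Returns the filtered state mean at each step."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    means = []
    for obs in y:
        # 1) propagate particles through the motion model
        particles = particles + rng.normal(0.0, q, n_particles)
        # 2) weight by the Gaussian observation likelihood
        w = np.exp(-0.5 * ((obs - particles) / r) ** 2)
        w /= w.sum()
        means.append(np.sum(w * particles))
        # 3) systematic resampling to avoid weight degeneracy
        positions = (rng.random() + np.arange(n_particles)) / n_particles
        idx = np.minimum(np.searchsorted(np.cumsum(w), positions),
                         n_particles - 1)
        particles = particles[idx]
    return np.array(means)

# Track a slowly drifting state from noisy measurements.
rng = np.random.default_rng(0)
true_x = np.cumsum(rng.normal(0, 0.1, 100))
y = true_x + rng.normal(0, 0.5, 100)
est = bootstrap_pf(y)
print(round(float(np.mean(np.abs(est - true_x))), 3))
```

The mean absolute tracking error typically ends up well below the raw sensor noise level, which is the property the goal-oriented tuning of the paper optimizes further.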
Post date: 09 March 2023
Data sharing in Industry 4.0 - AutomationML, B2MML and International Data Spaces-based solutions
The concept of a data ecosystem and Industry 4.0 requires high-level vertical and horizontal interconnectivity across the entire value chain. Its successful realization demands standardized data models to ensure transparent, secure and widely integrable data sharing within and between enterprises. This paper provides a PRISMA method-based systematic review of data sharing in Industry 4.0 via AutomationML, B2MML and International Data Spaces-based solutions. The interconnection of these data models and the ISA-95 standard is emphasized. This review describes the major application areas of these standards and their extensions as well as supporting technologies and their contribution towards horizontal integration and data ecosystems. It highlights how much value interconnected, exchanged and shared data have gained in recent years. Standardized data sharing mechanisms enable real-time, flexible and transparent communication, features that have become top requirements for gaining a competitive advantage. However, to foster the shift from within-company data communication towards a data ecosystem, IT- and people-oriented cultures must be well established to ensure data protection and digital trust. We believe that this review of standardized data exchange and sharing solutions can contribute to the development and design of Industry 4.0-related systems as well as support related scientific research.
Post date: 03 March 2023
3D Scanner-Based Identification of Welding Defects—Clustering the Results of Point Cloud Alignment
This paper describes a framework for detecting welding errors using 3D scanner data. The proposed approach employs density-based clustering to compare point clouds and identify deviations. The discovered clusters are then classified according to standard welding fault classes. Six welding deviations defined in the ISO 5817:2014 standard were evaluated. All defects were represented through CAD models, and the method was able to detect five of these deviations. The results demonstrate that the errors can be effectively identified and grouped according to the location of the different points in the error clusters. However, the method cannot separate crack-related defects as a distinct cluster.
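A minimal density-based clustering step of the kind used to group deviation points can be sketched as follows. This is a plain DBSCAN on synthetic 3-D points, not the paper's pipeline: dense groups of deviation points become candidate defect regions, while isolated points are marked as noise.

```python
import numpy as np

def dbscan(points, eps=0.5, min_pts=4):
    """Minimal density-based clustering (DBSCAN) of an (n, d) point
    array. Returns a label per point; -1 marks noise."""
    n = len(points)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    neigh = d2 <= eps ** 2            # neighbourhood matrix (includes self)
    core = neigh.sum(1) >= min_pts
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        stack = [i]                   # grow the cluster from a core point
        while stack:
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            if core[j]:
                stack.extend(np.flatnonzero(neigh[j] & (labels == -1)))
        cluster += 1
    return labels

# Two dense defect regions plus one far-away stray point.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal([0, 0, 0], 0.1, (30, 3)),
                 rng.normal([5, 5, 5], 0.1, (30, 3)),
                 [[20.0, 20.0, 20.0]]])
labels = dbscan(pts, eps=0.6, min_pts=4)
print(len(set(labels[labels >= 0])), int((labels == -1).sum()))  # → 2 1
```

Each discovered cluster would then be classified against the defect classes, e.g. by the spatial extent and location of its points.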
Post date: 01 March 2023
Multi-objective hierarchical clustering for tool assignment
Due to the limited tool magazine capacities of CNC machines, time-consuming tool changeovers result in inefficient equipment utilization. This study provides a method to minimize the changeovers by optimizing the allocation of the tools to the machines. The proposed algorithm is efficient as it approaches the tool assignment task as a multi-objective hierarchical clustering problem where the products are grouped based on the similarity of the tool demands. The novelty of the goal-oriented agglomerative clustering algorithm is that it is based on the Pareto optimal selection of the merged clusters. The applicability of the method is demonstrated through an industrial case study. The tool assignment problem has also been formulated as a bin-packing optimization task, and the results of the related linear programming were used as a benchmark reference. The comparison highlighted that the proposed method provides a feasible solution for large real-life problems with low computation time.
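A toy sketch of the Pareto-optimal merge selection idea follows, with hypothetical tool sets and a simple two-objective score; the paper's actual objectives and selection rule differ. At every agglomerative step, candidate merges are scored on several objectives and the next merge is drawn from the Pareto front of candidates.

```python
def pareto_front(candidates):
    """Return the Pareto-optimal items of a list of
    (objective_tuple, payload) pairs; every objective is minimized."""
    front = []
    for obj, item in candidates:
        dominated = any(all(o2 <= o1 for o1, o2 in zip(obj, other))
                        and other != obj
                        for other, _ in candidates)
        if not dominated:
            front.append((obj, item))
    return front

def cluster_tools(tool_sets, n_clusters):
    """Greedy agglomerative clustering of products by tool demand.
    Merge candidates are scored on two objectives to minimize:
    (merged tool-set size, negative Jaccard similarity). The next
    merge is taken from the Pareto front (here simply its first
    element, a stand-in for a goal-oriented selection rule)."""
    clusters = [set(t) for t in tool_sets]
    while len(clusters) > n_clusters:
        cands = []
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                a, b = clusters[i], clusters[j]
                jac = len(a & b) / len(a | b)
                cands.append(((len(a | b), -jac), (i, j)))
        _, (i, j) = pareto_front(cands)[0]
        clusters[i] = clusters[i] | clusters[j]
        del clusters[j]
    return clusters

# Products with overlapping tool demands; target two machine groups.
products = [{'T1', 'T2'}, {'T1', 'T2', 'T3'}, {'T7', 'T8'}, {'T7', 'T9'}]
groups = cluster_tools(products, 2)
print(sorted(sorted(g) for g in groups))
# → [['T1', 'T2', 'T3'], ['T7', 'T9', 'T8']] up to set ordering
```

Products in the same group share most of their tools, so assigning each group to one machine keeps the magazine small and the changeovers rare.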
Post date: 18 February 2023
Last Chance to Submit Your Paper to CoDIT'23: Extended Deadline Announced
The CoDIT’23 conference is the ninth edition in the International Conference on Control, Decision and Information Technologies series.
It will be held on 3-6 July 2023 in Rome, Italy.
The paper submission deadline has been extended until February 28, 2023.
The purpose of the conference is to provide a forum for technical exchange among scientists with interests in Control, Automation, Robotics, Optimization, Decision, Cybernetics, Computer Science and Information Technologies. The conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions and discuss future research directions. The technical program will include plenary lectures, regular technical sessions, and special sessions.
For more information please visit the website.
Post date: 01 February 2023
Tuan-anh Tran achieved excellent ranking at the IEEE HS Student Paper Contest
Every year, the IEEE Hungary Section (IEEE HS) announces a "Student Paper Contest" for students from higher education institutions. Tuan-anh Tran was awarded third place with a paper entitled "Retrofitting-Based Development of Brownfield Industry 4.0 and Industry 5.0 Solutions".
Post date: 05 January 2023
Demonstration Laboratory of Industry 4.0 Retrofitting and Operator 4.0 Solutions: Education towards Industry 5.0
One of the main challenges of Industry 4.0 is how advanced sensors and sensing technologies can be applied through the Internet of Things layers of existing manufacturing. This is the so-called Brownfield Industry 4.0, where the different types and ages of machines and processes need to be digitalized. Smart retrofitting is the umbrella term for solutions to show how we can digitalize manufacturing machines. This problem is critical in the case of solutions to support human workers. The Operator 4.0 concept shows how we can efficiently support workers on the shop floor. The key indicator is the readiness level of a company, and the main bottleneck is the technical knowledge of the employees. This study proposes an education framework and a related Operator 4.0 laboratory that prepares students for the development and application of Industry 5.0 technologies. The concept of intelligent space is proposed as a basis of the educational framework, which can solve the problem of monitoring the stochastic nature of operators in production processes. The components of the intelligent space are detailed through the layers of the IoT in the form of a case study conducted at the laboratory. The applicability of indoor positioning systems is described with the integration of machine-, operator- and environment-based sensor data to obtain real-time information from the shop floor. The digital twin of the laboratory is developed in a discrete event simulator, which integrates the data from the shop floor and can control the production based on the simulation results. The presented framework can be utilized to design education for the generation of Industry 5.0.
Post date: 03 January 2023
Hypergraph and network flow-based quality function deployment
Quality function deployment (QFD) has been a widely acknowledged tool for translating customer requirements into quality product characteristics, based on which product development strategies and focus areas are identified. However, although the QFD method considers the correlations and effects between development parameters, these are not directly implemented in the importance ranking of development actions. Therefore, the cross-relationships between development parameters and their impact on customer requirement satisfaction are often neglected. The primary objective of this study is to make decision-making more reliable by improving QFD with methods that optimize the selection of development parameters even under capacity or cost constraints, directly implement cross-relationships between development parameters, and support the visual identification of interactions. Therefore, QFD is approached from two directions that have proved efficient in operations research. 1) QFD is formulated as a network flow problem with two objectives: maximizing the benefit of satisfying customer needs using linear optimization, or minimizing the total cost of actions while still meeting customer requirements using a minimum-cost-flow assignment approach. 2) QFD is represented as a hypergraph, which allows an efficient representation of the interactions of the relationship and correlation matrices and the determination of essential factors based on centrality metrics. The applicability of the methods is demonstrated through an application study on the sustainable design of consumer electronic products, which highlights how the improvements contribute to different development strategies: linear optimization performed best in maximizing the satisfaction of customer requirements, the minimum-cost-flow assignment approach minimized the total cost, while the hypergraph-based representation identified the indirect interactions of development parameters and customer requirements.
Post date: 14 December 2022
Expert-Based Modular Simulator for Municipal Waste Processing Technology Design
One of the significant problems in our society is the handling and processing of the vast amount of waste produced by households and industrial processes. Nowadays, packaging material regulations are constantly changing, which can significantly impact the quality of municipal waste, requiring the continuous development and redesign of waste processing plants. Since only a few uncertain measurements (composition, mass, etc.) are available for this task, analysing and redesigning waste processing technologies is challenging. This research aims to develop a modelling and simulation concept that can integrate all the available information and can also handle the uncertainty of the measurements. The proposed modular modelling framework can serve as a basis for designing and redesigning the technologies needed to process ever-changing municipal waste. The most important steps of the framework are as follows: identifying the typical equipment, which forms the elements of the framework; building models of the elements; determining the characteristic parameters of the equipment; and exploring the possible relationships between the elements. The information needed to define the model parameters can be gathered from measurements, industrial experience, and expert knowledge; in many cases, the data obtained represent ranges. The stationary model framework applies efficiency factors and divides the solids into substreams based on expert knowledge. Furthermore, a modular simulator framework was developed to simulate technological schemes with various connections. The specifications of all widely used waste industrial equipment (shredders, air separators, sieves, and magnetic, eddy current, optical, and ballistic separators) were used to construct the developed simulator. This simulator can open new opportunities for the design of waste sorting technological networks. The model was calibrated based on expertise gained from operating the studied technology. The changes in the material parameters can be considered, and the modular simulator can lead to flexible waste sorting technologies capable of adapting to changes in governmental and environmental regulations. The main result of the work is that a methodology for designing a modular simulator, for model development, and for validation has been proposed, which provides the possibility to deal with uncertainty. All this is successfully presented through the analysis of an operating waste separation system.
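The efficiency-factor stream splitting mentioned above can be illustrated with a minimal separator element. The materials and efficiency values below are made up for the sketch; a real model would be calibrated from measurements and expert knowledge.

```python
def separator(feed, efficiency):
    """One modular separator element: splits an input stream
    (dict of material -> mass) into an accept and a reject stream
    using per-material transfer efficiencies, i.e. the fraction of
    each material routed to the accept stream."""
    accept = {m: mass * efficiency.get(m, 0.0) for m, mass in feed.items()}
    reject = {m: mass - accept[m] for m, mass in feed.items()}
    return accept, reject

# Air separator: light fractions mostly go to the accept stream.
feed = {'plastic_film': 40.0, 'paper': 30.0, 'metal': 30.0}
eff = {'plastic_film': 0.9, 'paper': 0.7, 'metal': 0.05}
accept, reject = separator(feed, eff)
print(round(sum(accept.values()) + sum(reject.values()), 6))  # → 100.0
```

Chaining such elements (shredder, sieve, magnetic separator, etc.) and feeding the accept/reject streams into downstream units gives exactly the kind of modular flowsheet the framework composes, with the mass balance preserved at every node.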
Post date: 08 December 2022
Tamás Ruppert has been awarded the VEAB Outstanding Young Researchers Award in the "engineering sciences" category
The prize is awarded to young researchers who have made significant achievements in the field of engineering or the living and non-living sciences.
The award ceremony will take place at the VEAB Headquarters of the Hungarian Academy of Sciences in a lecture session on 7 December 2022 at 14:00.
Post date: 25 November 2022
Researchers of Abonyilab achieved excellent ranking at the Institutional Scientific Student Conference
The University of Pannonia held the Institutional Scientific Student Conference on November 23, 2022. Éva Kenyeres and Ádám Ipkovich were awarded first place, and Gergely Lajos Halász took second place; therefore, they can participate in the National Scientific Student Conference. Mónika Gugolya was awarded third place.
Éva Kenyeres participated in the Faculty of Engineering - Engineering Sciences section with a paper entitled: Goal-oriented particle filter state estimation algorithm-based fault diagnostics of process systems.
Ádám Ipkovich participated in the Faculty of Engineering, Chemical and Chemical Industry - Modeling section with a paper entitled: Iterative Identifiability Analysis of Composite Material Failure Models.
Gergely Lajos Halász participated in the Faculty of Engineering - Engineering Sciences section with a paper entitled: Estimation of the operator comfort level and the layout information based on sensor fusion techniques.
Mónika Gugolya participated in the Faculty of Engineering, Chemical and Chemical Industry - Modeling section with a paper entitled: Collaborative work scheduling between humans and robots.
Post date: 25 November 2022
Machine learning-based software sensors for machine state monitoring - The role of SMOTE-based data augmentation
A method for flexible vibration sensor-based retrofitting of CNC machines is proposed. As different states leave different fingerprints in the power spectrum plane, the states of the machine can be distinguished based on the features extracted from the spectrum map. Because some states, such as tool replacement, are less frequent than others, such as the production state, monitoring the machine states is considered an imbalanced classification problem. The key idea is to use the Borderline Synthetic Minority Oversampling Technique (Borderline-SMOTE) to augment the data set. The concept is validated in an industrial case study. Soft sensors based on four machine learning algorithms, trained with and without SMOTE, were implemented to predict the states of the machine. The results show that the SMOTE-based data augmentation improved the performance of the models by 50%.
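The core SMOTE interpolation step can be sketched in a few lines on synthetic features. This shows plain SMOTE; Borderline-SMOTE additionally restricts generation to minority samples near the class boundary.

```python
import numpy as np

def smote(X_min, n_new, k=5, seed=0):
    """Generate n_new synthetic minority samples by linear
    interpolation between a randomly chosen minority point and one
    of its k nearest minority neighbours (the core SMOTE step)."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    d2 = ((X_min[:, None, :] - X_min[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]   # k nearest, excluding self
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        j = nn[i, rng.integers(nn.shape[1])]
        lam = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Oversample a 10-sample minority class (e.g. "tool replacement")
# before training the machine-state classifier.
rng = np.random.default_rng(1)
X_min = rng.normal(0, 1, (10, 6))
X_new = smote(X_min, n_new=40, k=3)
print(X_new.shape)  # → (40, 6)
```

Because every synthetic sample lies on a segment between two real minority samples, the augmented set stays inside the minority region instead of adding arbitrary noise.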
Post date: 21 November 2022
Cooperation patterns in the ERASMUS student exchange network: an empirical study
The ERASMUS program is the most extensive cooperation network of European higher education institutions. The network involves 90% of European universities and hundreds of thousands of students. The allocated budget and the number of travelers in the program are growing yearly. By considering the interconnection of institutions, the study asks how the program’s budget performs, whether the program can achieve its expected goals, and how it contributes to the development of a European identity, interactions among young people from different countries and learning among cultures. Our goal was to review and explore the elements of network structures that can be used to understand the complexity of the whole ERASMUS student mobility network at the institutional level. The results suggest some socioeconomic and individual behavioral factors underpinning the emergence of the network. While the nodes are spatially distributed, geographical distance does not play a role in the network’s structure, although parallel travelling strategies exist in terms of preferences for short and long distances. The European regions of the home and host countries also affect the network. One of the most considerable driving forces of edge formation between institutions is the subject areas represented by the participating institutions. The study finds that faculties of institutions are connected rather than the institutions themselves, and a multilayer network model is suggested to explore the mechanisms of those connections. The results indicate that the information uncovered by the study is helpful to scholars and policymakers.
Post date: 27 October 2022
Sectoral Analysis of Energy Transition Paths and Greenhouse Gas Emissions
The Paris Climate Agreement and the 2030 Agenda for Sustainable Development Goals declared by the United Nations set high expectations for the countries of the world to reduce their greenhouse gas (GHG) emissions and to be sustainable. In order to judge the effectiveness of strategies, the evolution of carbon dioxide, methane, and nitrous oxide emissions in countries around the world has been explored based on statistical analysis of time-series data between 1990 and 2018. The empirical distributions of the variables were determined by the Kaplan–Meier method, and improvement-related utility functions have been defined based on the European Green Deal target for 2030, which aims to decrease GHG emissions by at least 55% compared to 1990 levels. This study aims to analyze the energy transition trends at the country and sectoral levels and underline them with literature-based evidence. The transition trajectories of the countries are studied based on the percentile-based time-series analysis of the emission data. We also study the evolution of the sector-wise distributions of the emissions to assess how the development strategies of the countries contributed to climate change mitigation. Furthermore, the countries’ locations on their transition trajectories are determined based on their individual Kuznets curves. Runs and Leybourne–McCabe statistical tests are also evaluated to study how systematic the changes are. Based on the proposed analysis, the main drivers of climate mitigation and their effectiveness were identified and characterized, forming the basis for planning sectoral tasks in the coming years. The case study covers the analysis of two countries, Sweden and Qatar. Sweden has reduced its per-capita emissions by almost 40% since 1990, while Qatar has increased its emissions by 20%. Moreover, the defined improvement-related variables can highlight the highest increases and decreases in different aspects.
The highest increase was reached by Equatorial Guinea, and the most significant decrease was made by Luxembourg. The integration of sustainable development goals, carbon capture, carbon credits and carbon offsets into the databases establishes a better understanding of the sectoral challenges of energy transition and strategy planning, which can be adapted to the proposed method.
Post date: 25 October 2022
Interview with Tamás Ruppert about the relationship between human, robot and intelligent space today and in the future
The Hungarian magazine Gyártástrend published an interview with Tamás Ruppert about the next steps in the collaboration between humans and robots.
The detailed interview is available here.
Post date: 13 October 2022
Introducing the Operator 4.0 laboratory on the night of the scientists' event
This year we introduced our Operator 4.0 laboratory during the night of the scientists' event at the University of Pannonia. The demonstration was about collaboration, the digital twin, and human-centered solutions.
More details about the Operator 4.0 laboratory are available here.
Post date: 13 October 2022
Goal-oriented possibilistic fuzzy C-Medoid clustering of human mobility patterns—Illustrative application for the Taxicab trips-based enrichment of public transport services
The discovery of human mobility patterns of cities provides invaluable information for decision-makers who are responsible for redesign of community spaces, traffic, and public transportation systems and building more sustainable cities. The present article proposes a possibilistic fuzzy c-medoid clustering algorithm to study human mobility. The proposed medoid-based clustering approach groups the typical mobility patterns within walking distance to the stations of the public transportation system. The departure times of the clustered trips are also taken into account to obtain recommendations for the scheduling of the designed public transportation lines. The effectiveness of the proposed methodology is revealed in an illustrative case study based on the analysis of the GPS data of Taxicabs recorded during nights over a one-year-long period in Budapest.
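A sketch of the medoid-based fuzzy clustering idea follows, on a synthetic distance matrix. This shows plain fuzzy c-medoids; the possibilistic variant of the paper additionally relaxes the sum-to-one membership constraint, and the real distance would combine walking distance with departure-time differences.

```python
import numpy as np

def fuzzy_c_medoids(D, c, m=2.0, iters=30, init=None):
    """Fuzzy c-medoids on a precomputed distance matrix D (n x n).
    Medoids are actual data points, so the method works directly on
    trip-to-trip distances. Returns medoid indices and the fuzzy
    membership matrix U (c x n)."""
    medoids = np.array(init if init is not None else range(c))
    U = None
    for _ in range(iters):
        d = D[medoids] + 1e-9                  # (c, n) distances to medoids
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=0)                     # memberships sum to 1 per point
        # move each medoid to the point minimizing its weighted distance
        new = np.array([int(np.argmin((U[k] ** m) @ D)) for k in range(c)])
        if set(new) == set(medoids):
            break
        medoids = new
    return medoids, U

# Two synthetic trip groups; D holds pairwise Euclidean distances.
rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0, 0.3, (15, 2)), rng.normal(4, 0.3, (15, 2))])
D = np.sqrt(((pts[:, None] - pts[None, :]) ** 2).sum(-1))
medoids, U = fuzzy_c_medoids(D, c=2, init=[0, 15])  # one seed per group
print(sorted(int(i) // 15 for i in medoids))  # → [0, 1], one medoid per group
```

Because each cluster prototype is a real trip, the medoids themselves are directly interpretable as candidate stop locations and departure times for the designed transport lines.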
Post date: 10 October 2022
Simultaneous Process Mining of Process Events and Operator Actions for Alarm Management
Alarm management is an important task to ensure the safety of industrial process technologies. A well-designed alarm system can reduce the workload of operators in parallel with supporting production, which is in line with the approach of Industry 5.0. Using process mining tools to explore the operator-related event scenarios requires a goal-oriented log file format that contains the start and the end of the alarms along with the triggered operator actions. The key contribution of the work is a method that transforms the historical event data of control systems into goal-oriented log files used as inputs of process mining algorithms. The applicability of the proposed process mining-based method is presented through the analysis of a hydrofluoric acid alkylation plant. The detailed application examples illustrate how the extracted process models can be interpreted and utilized. The results confirm that applying the tools of process mining in alarm management requires a goal-oriented log-file design.
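The goal-oriented log construction can be sketched as follows, with hypothetical tags and a simplified attribution rule that assigns each operator action to every currently open alarm; the paper's actual transformation is richer.

```python
def build_alarm_traces(events):
    """Transform a flat control-system event list into per-alarm
    traces for process mining. Each event is (timestamp, tag, kind)
    with kind in {'ALM', 'ACT', 'RTN'}: alarm start, operator
    action, return-to-normal. A trace collects everything between
    an alarm's start and its end."""
    open_alarms = {}   # alarm tag -> trace under construction
    traces = []
    for ts, tag, kind in sorted(events):
        if kind == 'ALM':
            open_alarms[tag] = [f'{tag}-start']
        elif kind == 'RTN' and tag in open_alarms:
            trace = open_alarms.pop(tag)
            trace.append(f'{tag}-end')
            traces.append(trace)
        elif kind == 'ACT':
            # attribute the action to every currently open alarm
            for trace in open_alarms.values():
                trace.append(tag)
    return traces

events = [
    (1, 'TI101-HI', 'ALM'),
    (2, 'open-valve', 'ACT'),
    (3, 'TI101-HI', 'RTN'),
    (4, 'PI205-LO', 'ALM'),
    (5, 'start-pump', 'ACT'),
    (6, 'PI205-LO', 'RTN'),
]
print(build_alarm_traces(events))
# → [['TI101-HI-start', 'open-valve', 'TI101-HI-end'],
#    ['PI205-LO-start', 'start-pump', 'PI205-LO-end']]
```

The resulting traces, one per alarm episode, are exactly the kind of input a process discovery algorithm expects, so the operator-response patterns for each alarm type can be mined directly.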
Post date: 29 September 2022
Indicators for climate change-driven urban health impact assessment
Climate change can cause multiple potential health issues in urban areas, which are among the environments most susceptible to the presently increasing climate volatility. Urban greening strategies form an important part of the adaptation strategies that can ameliorate the negative impacts of climate change. We aimed to study the potential impacts of different kinds of greening against the adverse effects of climate change, including waterborne and vector-borne diseases, heat-related mortality, and surface ozone concentration, in a medium-sized Hungarian city. As greening strategies, large and pocket parks were considered, based on our novel location-identifier algorithm for climate risk minimization.
A method based on publicly available data sources, including satellite pictures, climate scenarios and the urban macrostructure, has been developed to evaluate the health-related indicator patterns in cities. The modelled current and future patterns of the indicators have been compared. The results can help the understanding of the possible future state of the studied indicators and the development of adequate greening strategies.
Another outcome of the study is that it is not the type of health indicator but its climate sensitivity that determines the extent to which it responds to temperature rises and how effective greening strategies are in addressing the expected problem posed by the factor.
Post date: 16 September 2022
Operator 4.0 community met at the ETFA conference in Stuttgart
The 27th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA) was held from 6 to 9 September 2022 in Stuttgart, Germany. During the special session organized by Tamás Ruppert at the conference, the Operator 4.0 research community finally met in person. The members came to Stuttgart from seven different countries.
More information about the special session entitled "Industry 5.0 - Augmenting the Human Worker in Balanced Automation Systems" is available here.
Post date: 12 September 2022
Hypergraph-based analysis and design of intelligent collaborative manufacturing space
A method for hypergraph-based analysis and the design of manufacturing systems has been developed. The reason for its development is the need to integrate the human workforce into Industry 4.0 solutions. The proposed intelligent collaborative manufacturing space enhances collaboration between the operators as well as provides them with valuable information about their performance and the state of the production system. The design of these Operator 4.0 solutions requires a problem-specific description of manufacturing systems, the skills and states of the operators, as well as of the sensors placed in the intelligent space for the simultaneous monitoring of the cooperative work. The design of this intelligent collaborative manufacturing space requires the systematic analysis of the critical sets of interacting elements. The proposal is that hypergraphs can efficiently represent these sets; moreover, studying the centrality and modularity of the resultant hypergraphs can support the formation of collaboration and interaction schemes and the formation of manufacturing cells. A fully reproducible illustrative example presents the applicability of this concept.
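One way such a centrality analysis could look is sketched below: a simple eigenvector centrality on the clique expansion of a small, hypothetical node-by-hyperedge incidence matrix. The paper may use different centrality notions; this only illustrates how hyperedge membership translates into a ranking of elements.

```python
import numpy as np

def hypergraph_centrality(B):
    """Eigenvector centrality on the clique expansion A = B @ B.T
    of a binary node-by-hyperedge incidence matrix B: nodes sharing
    a hyperedge become connected, weighted by the number of shared
    hyperedges. Computed by power iteration."""
    A = (B @ B.T).astype(float)
    np.fill_diagonal(A, 0.0)          # ignore self-participation
    v = np.ones(A.shape[0])
    for _ in range(200):
        v = A @ v
        v /= np.linalg.norm(v)
    return v

# 5 elements (operators/machines) and 3 hyperedges (shared tasks);
# element 1 takes part in every task.
B = np.array([[1, 0, 0],
              [1, 1, 1],
              [1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]])
c = hypergraph_centrality(B)
print(int(np.argmax(c)))  # → 1
```

The highest-centrality element is the one involved in the most shared tasks, a natural candidate around which to form a manufacturing cell or collaboration scheme.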
Post date: 06 September 2022
Challenges of the Fourth Industrial Revolution in HRM
As a result of the changes caused by Industry 4.0 and Industry 5.0, unknown or less prominent challenges will be the focus of the operation of organizations and will essentially transform current human resource management (HRM) and its framework and tools. This research aims to identify Industry 4.0 solutions and expected changes in the field of human resources (HR) and for employees and to outline emerging trends of Industry 4.0 that impact HR based on interviews with surveyed companies and a review of the relevant literature. Structured interviews were conducted in this research. After individually processing the responses of each interviewee, the responses were formulated by considering all interviews. This research points out that in terms of HR, recruitment and training are being most affected by the fourth industrial revolution, and changes in competencies and their development processes have begun. Hopefully, the discovered connections will inspire further research and provide useful information on the fields of Industry 4.0 and HR.
Post date: 30 August 2022
Information sharing in supply chains – Interoperability in an era of circular economy
In order to realize the goals of Industry 5.0 (I5.0), which has data interoperability as one of its core principles, future research on Supply Chain (SC) visibility has to be aligned with socially, economically and environmentally sustainable objectives. Within the purview of the circular economy, this paper indicates various aspects and implications of data sharing in SCs in light of the published research. Taking into consideration the heterogeneity of data sources and standards, this article also catalogs the major technologies employed for sharing data digitally across SCs.
Drawing on the published research from 2015 to 2021 and following the PRISMA framework, this paper presents the state of research in the field of data sharing in SCs in terms of standardization, optimization, simulation, automation, security and, more notably, sustainability. Using the co-occurrence metric, a bibliometric analysis has been conducted in which the collected research is categorized under various keyword clusters and regional themes. This article brings together two major themes in reviewing the research in the field. Firstly, the bibliometric analysis of the published articles demonstrates the contours of the current state of research and the future possibilities in the field. Secondly, in synthesizing the research on the foundations of sustainability within the CRoss Industry Standard Process for Data Mining (CRISP-DM) framework, this article deals with various aspects and implications of information sharing in SCs. By bringing these two themes together, this paper provides a prospective researcher with an overview of the research on information sharing in SCs, from the data standards actually in use to the modality and consequences of their application within the perspective of the circular economy. This article, in essence, indicates how all the aspects of data sharing in SCs may be brought together in service of the paradigm of I5.0.
Post date: 10 August 2022
Simulation of Sustainable Manufacturing Solutions: Tools for Enabling Circular Economy
At the current worrisome rate of global consumption, the linear economy model of producing goods, using them, and then disposing of them with no thought of the environmental, social, or economic consequences, is unsustainable and points to a deeply flawed manufacturing framework. Circular economy (CE) is presented as an alternative framework to address the management of emissions, scarcity of resources, and economic sustainability such that the resources are kept ‘in the loop’. In the context of manufacturing supply chains (SCs), the 6R’s of rethink, refuse, reduce, reuse, repair, and recycle have been proposed in line with the achievement of targeted net-zero emissions. In order to bring that about, the required changes in the framework for assessing the state of manufacturing SCs with regard to sustainability are indispensable. Verifiable and empirical model-based approaches such as modeling and simulation (M&S) techniques find pronounced use in realizing the ideal of CE. The simulation models find extensive use across various aspects of SCs, including analysis of the impacts, and support for optimal re-design and operation. Using the PRISMA framework to sift through published research, as gathered from SCOPUS, this review is based on 202 research papers spanning from 2015 to the present. This review provides an overview of the simulation tools being put to use in the context of sustainability in the manufacturing SCs, such that various aspects and contours of the collected research articles spanning from 2015 to the present, are highlighted. This article focuses on the three major simulation techniques in the literature, namely, Discrete Event Simulation (DES), Agent-Based Simulation (ABS), and System Dynamics (SD). With regards to their application in manufacturing SCs, each modeling technique has its pros and its cons which are evinced in case of data requirement, model magnification, model resolution, and environment interaction, among others. 
These limitations are remedied through the use of hybrids, wherein two or more modeling techniques are applied for the desired results. The article also indicates various open-source software solutions that are being employed in research and industry. This article, in essence, has three objectives: first, to present to prospective researchers the current state of research, the concerns that have been raised in the field of sustainability modeling, and how they have been resolved; second, to serve as a comprehensive bibliography of peer-reviewed research published from 2015 to 2022; and finally, to indicate the limitations of the techniques with regard to sustainability assessment. The article also indicates the necessity of a new M&S framework and its prerequisites.
Post date: 8 August 2022
Retrofitting-based development of brownfield Industry 4.0 and Industry 5.0 solutions
The ongoing Industry 4.0 is characterized by connectivity between the components of the manufacturing system. For modern machines, the Internet of Things (IoT) is a built-in function. In contrast, there are legacy machines in deployment functioning without digital communication. The need to connect them has grown in order to improve overall production efficiency. As building a new smart factory as a greenfield investment is a capital-intensive choice, retrofitting the existing infrastructure with IoT capability is more reasonable than replacing it. However, this so-called brownfield development, or retrofitting, requires specific prerequisites, e.g., a digitization status assessment, technical and connectivity development, management requirements, and operational needs, which represent a significant disadvantage: a lack of scalability. In the meantime, Industry 5.0 gives priority to human-centricity, which poses new challenges to the retrofitted system. Aware of these challenges, this paper provides a systematic overview of brownfield development regarding technical difficulties, supporting technologies, and possible applications for legacy systems. The research scope focuses on available Industry 4.0 advancements but also considers preparation for the forthcoming Industry 5.0. The proposed retrofitting project approach can serve as a guideline for manufacturers to transform their factories into intelligent spaces with minimal cost and effort while still gaining the most applicable solution for management needs. Future directions for further research in brownfield development for Industry 5.0 are also discussed.
Post date: 22 June 2022
Edge-Computing and Machine-Learning-Based Framework for Software Sensor Development
This research presents a framework that supports the development and operation of machine-learning (ML) algorithms to develop, maintain, and manage the whole lifecycle of software sensor models of complex chemical processes. Our motivation is to take advantage of ML and edge computing and to offer innovative solutions to the chemical industry for difficult-to-measure laboratory variables. The purpose of software sensor models is to continuously forecast the quality of products to achieve effective quality control, maintain stable plant production conditions, and support efficient, environmentally friendly, and harmless laboratory work. As the literature review shows, quite a few ML models have been developed in recent years to support the quality assurance of different types of materials. However, the problems of continuous operation, maintenance, and version control of these models have not yet been solved. The method uses ML algorithms and takes advantage of cloud services in an enterprise environment. Industry 4.0 technologies such as the Internet of Things (IoT), edge computing, cloud computing, ML, and artificial intelligence (AI) are the core techniques. The article outlines an information system structure and the related methodology based on data from a quality-assurance laboratory. During development, we encountered several challenges resulting from the continuous development of ML models and the tuning of their parameters. The article discusses the development, version control, validation, lifecycle, and maintenance of ML models, along with a case study. The developed framework can continuously monitor the performance of the models and increase the amount of data on which the models are built. As a result, with this solution, the most accurate, data-driven, and up-to-date models are always available to quality-assurance engineers.
Post date: 03 June 2022
Tamás Ruppert will be a presenter at the interdisciplinary DAAD Alumni & Friends Colloquium
Tamás Ruppert will present at DAAD Alumni & Friends Colloquium about Industry 5.0 - Human-Factors in semi-automated Manufacturing. The event takes place on June 2 via Webex.
The DAAD Alumni & Friends Colloquium takes place on June 2. Founded in 2016, the regional group Ruhr of DAAD Alumni & Friends has co-organized four colloquia per year since 2019, co-hosted by FH Dortmund. They invite DAAD fellows currently pursuing their research in Germany and other young researchers in NRW to present their ongoing projects.
More detail about the event is available here.
Post date: 22 May 2022
Tamás Ruppert attended the Industry 4.0 Symposium at the Warsaw University of Technology in Poland
Tamás Ruppert was a keynote speaker at the Industry 4.0 Symposium held at the Warsaw University of Technology in Poland on May 9-10. He presented the newest results of our research group on the Operator 4.0 topic and the Operator 4.0 research network.
More information about the Operator 4.0 research network is available here.
More information about the event is available here.
Post date: 10 May 2022
Multi-agent reinforcement learning-based exploration of optimal operation strategies of semi-batch reactors
The operation of semi-batch reactors requires caution because the feeding reagents can accumulate, leading to hazardous situations due to the loss of control ability. This work aims to develop a method that explores the optimal operational strategy of semi-batch reactors. Since reinforcement learning (RL) is an efficient tool to find optimal strategies, we tested the applicability of this concept. We developed a problem-specific RL-based solution for the optimal control of semi-batch reactors in different operation phases. The RL-controller varies the feeding rate in the feeding phase directly, while in the mixing phase, it works as a master in a cascade control structure. The RL-controllers were trained with different neural network architectures to define the most suitable one. The developed RL-based controllers worked very well and were able to keep the temperature at the desired setpoint in the investigated system. The results confirm the benefit of the proposed problem-specific RL-controller.
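The idea of learning an operating strategy by trial and error can be sketched with a toy example. The one-state thermal balance, reward, and discretization below are illustrative assumptions, not the reactor model or the cascade controller architecture of the study:

```python
import numpy as np

# Toy illustration, not the paper's reactor model: tabular Q-learning picks a
# feed rate to drive a hypothetical one-state thermal balance to the setpoint.
rng = np.random.default_rng(0)
T_SET = 350.0                          # temperature setpoint [K]
ACTIONS = np.array([0.0, 0.5, 1.0])    # candidate feed rates [kg/s]

def step(T, feed):
    # exothermic heat release proportional to feed; cooling toward 300 K
    return T + 10.0 * feed - 0.1 * (T - 300.0)

def bucket(T):                         # discretize temperature into 20 bins
    return int(np.clip((T - 300.0) / 5.0, 0, 19))

Q = np.zeros((20, len(ACTIONS)))
for episode in range(500):
    T = 320.0                          # cold start of each batch
    for _ in range(100):
        s = bucket(T)
        a = rng.integers(len(ACTIONS)) if rng.random() < 0.1 else int(Q[s].argmax())
        T_next = step(T, ACTIONS[a])
        reward = -abs(T_next - T_SET)  # stay close to the setpoint
        Q[s, a] += 0.1 * (reward + 0.95 * Q[bucket(T_next)].max() - Q[s, a])
        T = T_next

print("greedy feed rate in the cold start state:",
      ACTIONS[Q[bucket(320.0)].argmax()])
```

The learned greedy policy feeds aggressively while the reactor is cold and backs off near the setpoint, which is the qualitative behavior the paper's RL controller must reproduce in the feeding phase.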
Post date: 05 May 2022
Ádám Ipkovich took second place at Scientific Student Conference
The University of Pannonia held the Engineering Faculty Scientific Student Conference on 4 May 2022, where Ádám Ipkovich took second place in the modeling section with the paper "Neighborhood Ranking-based Model-free Feature Selection".
Post date: 04 May 2022
A multi-block clustering algorithm for high dimensional binarized sparse data
We introduce a multidimensional multiblock clustering (MDMBC) algorithm in this paper. MDMBC can generate overlapping clusters with similar values along clusters of dimensions. The parsimonious binary vector representation of multidimensional clusters lends itself to the application of efficient meta-heuristic optimization algorithms. In this paper, a hill-climbing (HC) greedy search algorithm has been presented that can be extended by several stochastic and population-based meta-heuristic frameworks. The benefits of the algorithm are demonstrated in a bi-clustering benchmark problem and in the analysis of the Leiden higher education ranking system, which measures the scientific performance of 903 institutions along four dimensions of 20 indicators representing publication output and collaboration in different scientific fields and time periods.
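The binary-vector representation and the hill-climbing search described above can be sketched as follows; the density-times-size score is an assumed stand-in for the MDMBC objective, and the data are a small planted-block example:

```python
import numpy as np

# Sketch of the parsimonious binary representation: a multidimensional cluster
# is a pair of 0/1 membership masks over rows and columns, and greedy
# hill climbing (HC) flips one bit at a time while the score improves.
X = np.zeros((8, 6), dtype=int)
X[:4, :3] = 1                        # planted dense block: rows 0-3, cols 0-2

def score(rows, cols):
    if rows.sum() == 0 or cols.sum() == 0:
        return 0.0
    block = X[np.ix_(rows.astype(bool), cols.astype(bool))]
    return block.mean() * np.sqrt(block.size)   # dense AND reasonably large

rows = np.ones(X.shape[0], dtype=int)           # start from the full matrix
cols = np.ones(X.shape[1], dtype=int)
best, improved = score(rows, cols), True
while improved:                                 # greedy one-bit-flip HC
    improved = False
    for mask in (rows, cols):
        for i in range(mask.size):
            mask[i] ^= 1                        # tentative membership flip
            s = score(rows, cols)
            if s > best:
                best, improved = s, True
            else:
                mask[i] ^= 1                    # revert: no improvement

print(rows, cols)                               # recovers the planted block
```

Because candidate solutions are plain bit vectors, the same flip-and-score move can be embedded in the stochastic and population-based meta-heuristic frameworks mentioned in the abstract.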
Post date: 01 April 2022
Gergely Marcell Honti, a PhD graduate, took first place in the "Digitalization contexts of the data economy" competition
Gergely Marcell Honti was awarded first place in the "Digitalization contexts of the data economy" competition on 30 March 2022.
Digitális Jólét Nonprofit Kft. and Neumann Nonprofit Közhasznú Kft. announced a call for proposals entitled "Digitalization contexts of the data economy". The call for applications was addressed to students and recent graduates who have written their thesis, diploma thesis, or doctoral dissertation on the digitalization context of the data economy.
Post date: 30 March 2022
János Abonyi was awarded the Knight's Cross of the Order of Merit of Hungary
The Parliamentary and Strategic State Secretary of the Ministry of Innovation and Technology, Tamás Schanda, presented awards on 17 March 2022. On the occasion of the event, Dr. János Abonyi was awarded the Knight's Cross of the Order of Merit of Hungary in recognition of his outstanding research and teaching work in the field of complex systems modelling and data mining, and his achievements in the development of the engineering education system of the University of Pannonia.
Post date: 17 March 2022
Processing indoor positioning data by goal-oriented supervised fuzzy clustering for tool management
Indoor positioning systems allow real-time tracking of tool locations. Tool utilization can be calculated based on positional data of the storage and manufacturing areas. Due to the uncertainty of the position measurements, estimation of the state of the tools is problematic when the distance between the examined zones is less than the estimation error. We propose a goal-oriented supervised fuzzy clustering algorithm that utilizes the activity state of the tool, as the algorithm simultaneously maximizes the spatial distribution probability and the probability of a specific activity state occurring in a cluster. By weighting data points according to the time spent in the related states and positions, the resulting cluster weights can be interpreted as tool utilizations. The applicability of the developed method is presented through the processing of position data from crimping tools used by a wire harness manufacturer.
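The weighting idea, where each position sample counts in proportion to the time spent there so that aggregated cluster memberships read as utilizations, can be sketched with a time-weighted fuzzy c-means step. The zones, dwell times, and initial centers below are synthetic, and the activity-state term of the actual algorithm is omitted:

```python
import numpy as np

# Synthetic sketch: two spatial zones, with dwell times used as sample
# weights in the fuzzy c-means (FCM) updates, so the final weighted
# membership shares can be read as tool utilizations per zone.
rng = np.random.default_rng(0)
pos = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),     # "storage" zone
                 rng.normal([5, 5], 0.3, (40, 2))])    # "manufacturing" zone
w = np.r_[np.full(40, 1.0), np.full(40, 3.0)]          # dwell times [h]

m = 2.0                                                # fuzzifier
centers = np.array([[0.0, 0.0], [1.0, 1.0]])           # rough initial guess
for _ in range(50):
    d = np.linalg.norm(pos[:, None] - centers[None], axis=2) + 1e-12
    u = 1.0 / d ** (2 / (m - 1))                       # FCM memberships
    u /= u.sum(axis=1, keepdims=True)
    um = w[:, None] * u ** m                           # time-weighted update
    centers = um.T @ pos / um.sum(axis=0)[:, None]

utilization = (w[:, None] * u).sum(axis=0) / w.sum()   # share of weighted time
print(np.round(utilization, 3))
```

With three times as much dwell time in the second zone, the weighted membership share of that cluster comes out near 0.75, i.e., the cluster weight directly reflects utilization.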
Post date: 26 February 2022
Factor analysis, sparse PCA, and Sum of Ranking Differences-based improvements of the Promethee-GAIA multicriteria decision support technique
The Promethee-GAIA method is a multicriteria decision support technique that defines the aggregated ranks of multiple criteria and visualizes them based on Principal Component Analysis (PCA). In the case of numerous criteria, the PCA biplot-based visualization does not reveal how a criterion influences the decision problem. The central question is how the Promethee-GAIA-based decision-making process can be improved to gain more interpretable results that reveal more of the characteristic inner relationships between the criteria. To improve the Promethee-GAIA method, we suggest three techniques that eliminate redundant criteria, clearly outline which criterion belongs to which factor, and explore the similarities between criteria. These methods are the following: A) principal factoring with rotation and communality analysis (P-PFA), B) the integration of Sparse PCA into the Promethee II method (P-sPCA), and C) the Sum of Ranking Differences method (P-SRD). The suggested methods are presented through an I4.0+ dataset that measures the Industry 4.0 readiness of NUTS2-classified regions. The proposed methods are useful tools for handling multicriteria ranking problems when the number of criteria is large.
Post date: 25 February 2022
3D Scanning and Model Error Distribution-Based Characterisation of Welding Defects
The inspection of welded structures requires particular attention due to the many aspects that define the quality of the product. Deciding on the suitability of welds is a complex process. This work proposes a method that can support this qualification. The paper presents a state-of-the-art data-driven evaluation method and its application in the quality assessment of welds. Image processing and CAD modelling software were applied with the Iterative Closest Point algorithm to generate a reference model, from which datasets representing the model errors can be derived. The results demonstrate that the distributions of these variables characterise the typical welding defects. Based on the automated analysis of these distributions, it is possible to reduce the turnaround time of testing, thereby improving the productivity of welding processes.
Post date: 08 February 2022
János Abonyi and Tamás Ruppert are Special Issue Editors of "Industry 5.0 - the Human Factors in Semi-automated Manufacturing"
With the rapid development of innovative technologies such as artificial intelligence methods, big data, and cloud computing, the new concept of Industry 5.0 has been revolutionizing production and logistics systems by introducing collaborative processes and data-based operator support (the so-called Operator 4.0). This Special Issue aims to disseminate advanced research in the theory and application of collaboration in the manufacturing industries (also known by some experts as Industry 5.0).
Topics of interest include, but are not limited to:
Human–machine interface in IIoT for industrial applications;
Digital Twin, Device Models, Adaptive- and Automation-Models;
Human–machine interfaces (HMI) and SCADA supervisory systems;
Human factors, industrial ergonomics, and safety in smart maintenance;
Industrial applications of the Internet of Things;
AI- or ML-based maintenance;
Risk analysis for Industry Production Systems;
Smart Manufacturing;
Smart logistics related to industrial applications;
Cyber-physical systems;
Industrial sensor networks;
Combinations of sensors/sensor networks and Augmented Reality in industrial environments;
Real-time locating in production and logistics;
Process modelling and simulation.
Link to the Special Issue "Industry 5.0 - the Human Factors in Semi-automated Manufacturing"
Post date: 07 February 2022
Network-Based Topological Exploration of the Impact of Pollution Sources on Surface Water Bodies
We developed a digital water management toolkit to evaluate the importance of the connections between water bodies and the impacts caused by pollution sources. By representing water bodies in a topological network, the relationship between point loads and basic water quality parameters is examined as a labelled network. The labels are defined based on the classification of the water bodies and pollution sources. The analysis of the topology of the network can provide information on how the possible paths of the surface water network influence the water quality. The extracted information can be used to develop a monitoring- and evidence-based decision support system. The methodological development is presented through the analysis of the physical-chemical parameters of all surface water bodies in Hungary, using the emissions of industrial plants and wastewater treatment plants. Changes in water quality are comprehensively assessed based on the water quality data recorded over the past 10 years. The results illustrate that the developed method can identify critical surface water bodies where the impact of local pollution sources is more significant. One hundred six critical water bodies have been identified, where special attention should be given to water quality improvement.
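The topological idea, representing water bodies as a directed network and tracing the downstream reach of labelled pollution sources, can be sketched as follows; all water-body names and source labels here are made up:

```python
from collections import defaultdict

# Schematic sketch: edges point downstream between water bodies, pollution
# sources are labels on nodes, and reachability along the flow direction
# identifies which bodies each source can influence. Bodies reached by
# several sources are candidates for priority monitoring.
downstream = defaultdict(list)
for a, b in [("creek_A", "river_1"), ("creek_B", "river_1"),
             ("river_1", "river_2"), ("brook_C", "river_2"),
             ("river_2", "lake")]:
    downstream[a].append(b)

sources = {"creek_A": "industrial_plant", "brook_C": "wastewater_plant"}

def reach(node):
    seen, stack = set(), [node]
    while stack:                      # depth-first walk along flow direction
        for nxt in downstream[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Count how many pollution sources can affect each water body
impact = defaultdict(int)
for src in sources:
    for body in reach(src):
        impact[body] += 1

critical = sorted(b for b, k in impact.items() if k > 1)
print(critical)                       # bodies hit by more than one source
```

In the paper the same kind of path analysis is run on the full labelled network of Hungarian surface water bodies, which is how the critical bodies are identified.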
Post date: 13 January 2022
We are on the Cover Story of Data journal
The article "Learning Interpretable Mixture of Weibull Distributions - Exploratory Analysis of How Economic Development Influences the Incidence of COVID-19 Deaths" by R. Csalódi, Z. Birkner, and J. Abonyi is the cover story of the Data journal, Volume 6, Issue 12, 125 (2021).
The journal is available here.
The article is available here.
Post date: December 2021
Data-driven business process management-based development of Industry 4.0 solutions
Business process management (BPM) supports the management and transformation of organizational operations. This paper provides a structured guideline for improving data-based process development within the BPM life cycle. We show how Industry 4.0-induced tools and models can be integrated within the BPM life cycle to achieve more efficient process excellence and evidence-based decision-making. The paper demonstrates how standards of machine learning (CRISP-ML(Q)), BPM, and tools of design science research can support the redesign phases of Industry 4.0 development. The proposed methodology is carried out on an assembly company, where the proposed improvement steps are investigated by simulation and evaluated by relevant key performance indicators.
Post date: 10 December 2021
Comprehensible Visualization of Multidimensional Data: Ranking Differences-Based Parallel Coordinates
A novel visualization technique is proposed for the sum of ranking differences (SRD) method based on parallel coordinates. An axis is defined for each variable, on which the data are depicted row-wise. When the data points are connected, the lines may intersect. The fewer the intersections between two variables, the more similar they are and the clearer the figure becomes. Therefore, the visualization depends on the technique used to order the variables. The key idea is to employ the SRD method to measure the degree of similarity of the variables, establishing a distance-based order. The distances between the axes are not uniformly distributed in the proposed visualization; their closeness reflects similarity according to the SRD values. The proposed algorithm identifies false similarities through an iterative approach, where the angles between the SRD values determine the side on which a variable is plotted. The algorithm is provided as MATLAB/Octave source code. The proposed tool is applied to study how the sources of greenhouse gas emissions can be grouped based on the statistical data of countries. A comparison with multidimensional scaling (MDS)-based ordering is also given. The use case demonstrates the applicability of the method and the synergies of incorporating the SRD method into parallel coordinates.
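The role of SRD in ordering the axes can be sketched as follows. Using the row-wise mean as the reference ranking is one common choice and an assumption here, and the data matrix is synthetic:

```python
import numpy as np

# Minimal sketch of the SRD measure: each column is ranked, and its distance
# from a reference ranking (here the row-wise mean) is the sum of absolute
# rank differences; columns are then ordered by this score for plotting.
def ranks(v):
    return np.argsort(np.argsort(v))        # 0-based ranks (no ties here)

def srd(column, reference):
    return int(np.abs(ranks(column) - ranks(reference)).sum())

X = np.array([[1.0, 1.1, 9.0],
              [2.0, 2.2, 1.0],
              [3.0, 2.9, 5.0],
              [4.0, 4.5, 2.0]])
ref = X.mean(axis=1)                        # reference: row-wise average
scores = [srd(X[:, j], ref) for j in range(X.shape[1])]
order = np.argsort(scores)                  # axis order for the plot
print(scores, order)
```

Variables with small SRD values rank the rows almost like the reference does; placing the axes in `order`, with spacing proportional to the SRD differences, yields the parallel-coordinate layout described in the abstract.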
Post date: 11 December 2021
János Abonyi is the Special Issue Editor of "Soft Sensors 2021-2022"
The Special Issue solicits papers that cover the development, validation, application, and maintenance of software sensors. The potential topics include, but are not limited to:
Data-driven modeling for soft sensor development (from classical system identification and multivariate chemometric techniques to deep learning);
Semi-mechanistic and first-principle models in soft sensor development (including grey box models);
Validation and maintenance of soft sensors;
Control-oriented soft sensor development (e.g., inferential control, sensorless control);
Applications in fault detection and diagnosis and monitoring of complex processes;
Applications in state estimation, control, and optimization (e.g., sensorless motor control, nonlinear model predictive control);
Special applications in process analytical technology (PAT), manufacturing, chemical, bio, pharmaceutical, oil, and process engineering.
Link to the Special Issue "Soft Sensors 2021-2022"
Post date: December 2021
The Applicability of Reinforcement Learning Methods in the Development of Industry 4.0 Applications
Reinforcement learning (RL) methods can successfully solve complex optimization problems. Our article gives a systematic overview of the major types of RL methods and their applications in the field of Industry 4.0 and provides methodological guidelines for determining the approach best suited to different problems; moreover, it can serve as a point of reference for R&D projects and further research.
Post date: 30 November 2021
Learning Interpretable Mixture of Weibull Distributions - Exploratory Analysis of How Economic Development Influences the Incidence of COVID-19 Deaths
This paper presents an algorithm for learning local Weibull models whose operating regions are represented by fuzzy rules. The applicability of the proposed method is demonstrated by estimating the mortality rate of the COVID-19 pandemic. The reproducible results show that there are significant differences between the mortality rates of countries due to their economic situation, urbanization, and the state of the health sector. The proposed method is compared with the semi-parametric Cox proportional hazards regression method. The distribution functions of the two methods are close to each other, so the proposed method provides efficient estimates.
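The structure of such a model, local Weibull survival functions blended by fuzzy memberships over a covariate, can be sketched as follows; all parameters below are illustrative choices, not fitted values from the paper:

```python
import numpy as np

# Illustrative sketch of the model structure: two local Weibull survival
# models whose operating regions are Gaussian fuzzy sets over a (normalized)
# economic covariate x; the prediction is the membership-weighted blend.
def weibull_surv(t, lam, k):
    return np.exp(-(t / lam) ** k)          # Weibull survival function

def memberships(x, centers=(0.2, 0.8), sigma=0.2):
    w = np.exp(-0.5 * ((x - np.array(centers)) / sigma) ** 2)
    return w / w.sum()                      # normalized fuzzy memberships

t = np.linspace(0.1, 30, 5)
params = [(10.0, 1.5), (20.0, 2.5)]         # (scale, shape) per local model

for x in (0.1, 0.9):                        # low vs high development level
    w = memberships(x)
    S = sum(wk * weibull_surv(t, lam, k) for wk, (lam, k) in zip(w, params))
    print(f"x={x}: S(t) ~ {np.round(S, 3)}")
```

A country with a low covariate value is governed almost entirely by the first local model and a high one by the second, so the fuzzy rules make the mixture directly interpretable.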
Post date: 26 November 2021
Ontology-Based Analysis of Manufacturing Processes: Lessons Learned from the Case Study of Wire Harness Production
Effective information management is critical for the development of manufacturing processes. This paper aims to provide an overview of ontologies that can be utilized in building Industry 4.0 applications. The main contributions of the work are that it highlights ontologies that are suitable for manufacturing management and recommends the multilayer-network-based interpretation and analysis of ontology-based databases. This article not only serves as a reference for engineers and researchers on ontologies but also presents a reproducible industrial case study that describes the ontology-based model of a wire harness assembly manufacturing process.
Post date: 19 November 2021
Mixture of Survival Analysis Models-Cluster-Weighted Weibull Distributions
Survival analysis is a widely used method to establish a connection between a time to event outcome and a set of variables. The goal of this work is to improve the accuracy of the widely applied parametric survival models. This work highlights that accurate and interpretable survival analysis models can be identified by clustering-based exploration of the operating regions of local survival models. The key idea is that when operating regions of local Weibull distributions are represented by Gaussian mixture models, the parameters of the mixture-of-Weibull model can be identified by a clustering algorithm. The proposed method is utilised in three case studies. The examples cover studying the dropout rate of university students, calculating the remaining useful life of lithium-ion batteries, and determining the chances of survival of prostate cancer patients. The results demonstrate the wide applicability of the method and the benefits of clustering-based identification of local Weibull models.
Post date: 11 November 2021
Identification of sampling points for the detection of SARS-CoV-2 in the sewage system
A suitable tool for monitoring the spread of SARS-CoV-2 is to identify potential sampling points in the wastewater collection system that can be used to monitor the distribution of COVID-19-affected clusters within a city. The applicability of the developed methodology is presented through the description of the 72,837 population equivalent wastewater collection system of the city of Nagykanizsa, Hungary, and the results of the analytical and epidemiological measurements of the wastewater samples. The wastewater sampling was conducted during the third wave of the COVID-19 epidemic. It was found that the overlap between the road system and the wastewater network is high, at 82%. It was shown that the proposed methodological approach, using the tools of network science, confidently determines the zones of the wastewater collection system and provides the ideal monitoring points to achieve the best sampling resolution in urban areas. The strength of the presented approach is that it estimates the network based on publicly available information. It was concluded that the number of zones or sampling points can be chosen based on the relevant epidemiological intervention and mitigation strategies. The algorithm allows for continuous, effective monitoring of the population infected by SARS-CoV-2 in small-sized cities.
Post date: 29 October 2021
Janos Abonyi was invited to join as a program committee member at the 8th International Conference on Control, Decision and Information Technologies (CoDIT), 2022
Janos Abonyi serves as a program committee member at the 8th International Conference on Control, Decision and Information Technologies (CoDIT).
The aim of CoDIT is to be a forum for technical exchange amongst scientists with interests in Control, Optimization, Decision, all areas of Engineering, Computer Science, and Information Technologies. The conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions, and discuss future research directions.
The conference will be held in Istanbul, Turkey, on May 17-20, 2022.
Post date: 26 October 2021
Janos Abonyi was invited to join as a program committee member at Innovations in Bio-Inspired Computing and Applications (IBICA) 2021 conference
Janos Abonyi serves as a program committee member at the 12th International Conference on Innovations in Bio-Inspired Computing and Applications (IBICA).
The aim of IBICA is to provide a platform for world research leaders and practitioners to discuss the "full spectrum" of current theoretical developments, emerging technologies, and innovative applications of Bio-inspired Computing. Bio-inspired Computing is currently one of the most exciting research areas, and it continuously demonstrates exceptional strength in solving complex real-life problems.
The conference will be held online on December 16-18, 2021.
Post date: 07 September 2021
Janos Abonyi was invited to join as a program committee member at Soft Computing and Pattern Recognition (SoCPaR) 2021 conference
Janos Abonyi serves as a program committee member at the 13th International Conference on Soft Computing and Pattern Recognition (SoCPaR).
The conference aims to bring together worldwide leading researchers and practitioners interested in advancing the state-of-the-art in Soft Computing and Pattern Recognition, for exchanging knowledge that encompasses a broad range of disciplines among various distinct communities. It is hoped that researchers and practitioners will bring new prospects for collaboration across disciplines and gain inspiration to facilitate novel breakthroughs. The themes for this conference are thus focused on "Innovating and Inspiring Soft Computing and Intelligent Pattern Recognition".
The conference will be held online on December 15-17, 2021.
Post date: 07 September 2021
Janos Abonyi was invited to join as a program committee member at Information Assurance and Security (IAS) 2021 conference
Janos Abonyi serves as a program committee member at the 17th International Conference on Information Assurance and Security (IAS).
The conference theme is "Innovative Cyber Security: Protecting National Borders".
The conference aims to bring together researchers, practitioners, developers, and policy makers in multiple disciplines of information security and assurance to exchange ideas and to learn the latest development in this important field.
The conference is organized by Machine Intelligence Research Labs (MIR Labs) and will be held online on December 14-16.
Post date: 07 September 2021
Janos Abonyi was invited to join as a program committee member at Nature and Biologically Inspired Computing (NaBIC) 2021 World Congress
Janos Abonyi serves as a program committee member at the 13th World Congress on Nature and Biologically Inspired Computing (NaBIC).
NaBIC 2021 is organized to provide a forum for researchers, engineers, and students from all over the world to discuss the state of the art in machine intelligence and to address various issues in building human-friendly machines that learn from nature. The conference theme is “Nurturing Intelligent Computing Towards Advancement of Machine Intelligence”.
The conference will be held online on March 15-17, 2021.
Post date: 07 September 2021
Janos Abonyi was invited to join as a program committee member at Hybrid Intelligent Systems (HIS) 2021 conference
Janos Abonyi serves as a program committee member at the 21st International Conference on Hybrid Intelligent Systems (HIS).
The objectives of HIS 2021 are: to increase the awareness of the research community of the broad spectrum of hybrid techniques, to bring together AI researchers from around the world to present their cutting-edge results, to discuss the current trends in HIS research, to develop a collective vision of future opportunities, to establish international collaborative opportunities, and as a result to advance the state of the art of the field.
The conference will be held online on March 14-16, 2021.
Post date: 07 September 2021
Data-driven comparative analysis of national adaptation pathways for Sustainable Development Goals
Since the declaration of the Sustainable Development Goals (SDGs) in 2015, countries have begun developing and strategizing their national pathways for the effective implementation of the 2030 Agenda. The sustainable development targets set out how the world’s nations must move forward so that sustainable development is not an ideal vision but a workable, comprehensive environmental, economic, and social policy. This work aims to analyze each country's progress towards achieving the sustainable development goals. In addition to a static presentation of the achievements that countries can show, the changes over time are also compared, allowing countries to be grouped according to their current state. A sophisticated SDG performance measurement tool has been developed to support this analysis, which automatically processes the entire UN Global SDG Indicators database using exploratory data analysis, frequent itemset mining, and network analysis. Based on the trend analysis of the percentiles, the values of the indicators achievable by 2030 are also derived. The analyses were performed on the time-series data of 1319 disaggregated official SDG indicators.
Most of the world's countries have achieved the greatest success in SDG12 and SDG10 since the declaration of the 2030 Agenda. In the field of climate change (SDG13), 26 countries can count on significant achievements. However, SDG6, SDG2, and SDG1 face significant challenges globally, as they have typically seen only minor progress in recent years. Examined at the indicator level, indicators 1.4.1, 5.6.2, 6.b.1, 10.7.2, and 15.4.2 improved in all countries of the world, while indicators 2.a.1, 9.4.1, 2.1.1, 2.1. and 12.b.1 have predominantly deteriorated. According to the forecast for 2030, Australia and the United States can reduce their per capita CO2 emissions, while some countries in Africa, Asia, and the Middle East are expected to increase their emissions.
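The extrapolation step behind such 2030 forecasts can be sketched with a simple linear trend fit; the yearly indicator values below are hypothetical stand-ins for UN data, and a straight-line fit is a simplified stand-in for the percentile-based trend analysis:

```python
import numpy as np

# Illustrative sketch of trend-based forecasting: fit a linear trend to an
# indicator's yearly values and project the value attainable by 2030.
years = np.arange(2015, 2022)
co2_pc = np.array([16.2, 16.0, 15.7, 15.6, 15.1, 14.3, 14.5])  # hypothetical

slope, intercept = np.polyfit(years, co2_pc, 1)   # least-squares trend line
forecast_2030 = slope * 2030 + intercept
print(f"trend: {slope:.3f}/yr, projected 2030 value: {forecast_2030:.2f}")
```

Repeating this per indicator and per country, and reading off the projected 2030 values, gives the kind of country-level forecasts summarized in the abstract.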
Post date: 20 August 2021
Event-Tree Based Sequence Mining Using LSTM Deep-Learning Model
During the operation of modern technical systems, the use of LSTM models for the prediction of process variable values and system states is widespread. The goal of this paper is to expand the application of LSTM-based models to obtain additional information from their predictions. In the proposed method, the output layer is interpreted as a probability model over transitions, and a prediction tree of sequences is created instead of a single predicted sequence. By further analyzing the prediction tree, we can take risk considerations into account, extract more complex predictions, and analyze which event trees are yielded by different input sequences; that is, for a given state or input sequence, the upcoming events and the probabilities of their occurrence are considered. In an online application, by utilizing a series of input events and the probability trees, it is possible to predetermine subsequent event sequences. The applicability and performance of the approach are demonstrated on a dataset in which the occurrence of events is predetermined, as well as on further datasets generated with a higher-order decision tree-based model. The case studies simply and effectively validate the performance of the created tool, as the structure of the generated tree and the determined probabilities reflect the original dataset.
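The tree-expansion idea can be sketched as follows. A fixed transition table stands in here for the LSTM's softmax output (which in the real model is conditioned on the whole input sequence), and the events and probabilities are made up:

```python
# Sketch of building a prediction tree from next-event probabilities: each
# path is a candidate event sequence, its probability is the product of the
# predicted transition probabilities, and unlikely branches are pruned.
P = {"a": {"a": 0.1, "b": 0.7, "c": 0.2},   # stand-in for the LSTM softmax
     "b": {"a": 0.6, "b": 0.1, "c": 0.3},
     "c": {"a": 0.3, "b": 0.3, "c": 0.4}}

def expand(prefix, prob, depth, threshold=0.05):
    if depth == 0:
        return [(prefix, prob)]
    leaves = []
    for nxt, p in P[prefix[-1]].items():
        if prob * p >= threshold:           # prune unlikely branches early
            leaves += expand(prefix + [nxt], prob * p, depth - 1, threshold)
    return leaves

tree = expand(["a"], 1.0, depth=2)          # two-step event tree from "a"
tree.sort(key=lambda leaf: -leaf[1])
for seq, p in tree[:3]:
    print("->".join(seq), round(p, 3))
```

Ranking the leaves by probability is what allows the risk-aware reading of the tree: the operator sees not one predicted sequence but all sufficiently probable upcoming event sequences at once.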
Post date: 16 August 2021
Contrast and brightness balance in image enhancement using Cuckoo Search-optimized image fusion
Many vision-based systems suffer from poor levels of contrast and brightness, mainly because of inadequate and improper illumination during the image acquisition process. As a result, the information required from the acquired image is not available for the particular application. In general, it is hard to achieve a balance between the improvement of contrast and brightness in image enhancement. By introducing nature-inspired optimization into image enhancement, the best features of the image are utilized, and the complexity related to the nonlinearity of images can be handled under various constraints, such as a balance between contrast and brightness. In this work, a novel automatic image enhancement method that finds a balance between contrast and brightness is developed using Cuckoo Search-optimized image fusion. First, the Cuckoo Search-based optimization algorithm generates two sets of optimized parameters. These parameter sets are used to generate a pair of enhanced images: one with a high degree of sharpness and contrast, the other bright and improved without loss of detail. The two enhanced images are then fused to obtain an output image in which contrast and brightness are in balance. The effectiveness of the proposed method is verified by applying it to standard images (CVG-UGR image database) and lathe tool images. Experimental results demonstrate that the proposed method performs better with regard to both contrast and brightness quality, and yields improved quality evaluation metrics compared to conventional techniques.
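The fusion step can be illustrated with a minimal sketch; here a fixed fusion weight stands in for the parameters the paper obtains via Cuckoo Search, and the tiny grey-level images are invented:

```python
# Sketch: fusing two differently enhanced images into one balanced output.
# The Cuckoo Search optimization of the paper is replaced by a fixed fusion
# weight (an assumption for illustration); images are plain nested lists of
# grey-level intensities in [0, 255].

def fuse(img_contrast, img_bright, w=0.5):
    """Pixel-wise weighted fusion of two enhanced versions of an image."""
    return [[w * a + (1 - w) * b
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_contrast, img_bright)]

def mean_brightness(img):
    pixels = [p for row in img for p in row]
    return sum(pixels) / len(pixels)

high_contrast = [[0, 255], [255, 0]]   # sharp but harsh version
bright = [[120, 200], [200, 120]]      # bright but flat version

fused = fuse(high_contrast, bright, w=0.5)
print(mean_brightness(fused))          # 143.75, between the two inputs
```

In the actual method, the weight (and the enhancement parameters behind each input image) would be chosen by the optimizer to maximize a quality metric rather than fixed by hand.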
Post date: 15 July 2021
Quality vs. quantity of alarm messages - How to measure the performance of an alarm system
Despite significant efforts to measure and assess the performance of alarm systems, to this day, no silver bullet has been found. The majority of the existing standards and guidelines focus on the alarm load of the operators, either during normal or upset plant conditions, and only a small fraction takes into consideration the actions performed by the operators. In this study, an overview of the evolution of alarm system performance metrics is presented, and the current data-based approaches are grouped into seven categories based on the goals of and the methodologies associated with each metric. Deriving from our categorical overview, the terminological differences between the academic and industrial approaches to alarm system performance measurement are reflected. Moreover, we highlight how heavily the performance measurement of alarm systems is skewed towards quantitative metrics instead of qualitative assessment, invoking the threat of excessive alarm reductions resulting from such a unilateral approach. The critical aspects of the qualitative performance measurement of alarm systems are demonstrated through the comparison of the alarm system of an industrial hydrofluoric acid alkylation unit before and after the alarm rationalization process. The quality of the alarm messages is measured via their informativeness and actionability; in other words, how appropriate the parameter settings are for everyday work and how actionable the messages are for the operators of the process.
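As a minimal illustration of the quantitative alarm-load metrics mentioned above, the sketch below bins alarm timestamps into 10-minute windows and reports the average and peak counts per window; the timestamps are invented:

```python
# Sketch: a quantitative alarm-load metric of the kind discussed above.
# Alarm timestamps (in seconds) are binned into 10-minute windows; the
# average and peak alarm counts per window are reported. Data are made up.

from collections import Counter

def alarm_load(timestamps, window_s=600):
    """Count alarms per fixed window; return (average, peak) per window."""
    if not timestamps:
        return 0.0, 0
    bins = Counter(int(t // window_s) for t in timestamps)
    n_windows = max(bins) + 1          # windows from t=0 to the last alarm
    avg = sum(bins.values()) / n_windows
    return avg, max(bins.values())

alarms = [5, 30, 610, 615, 620, 640, 1300]   # a chattering burst near 10 min
avg, peak = alarm_load(alarms)
print(avg, peak)
```

A purely quantitative view like this flags the burst but says nothing about whether the messages were informative or actionable, which is exactly the imbalance the study criticizes.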
Post date: 5 July 2021
Genetic programming-based symbolic regression for goal-oriented dimension reduction
The majority of dimension reduction techniques are built upon the optimization of an objective function aiming to retain certain characteristics of the projected datapoints: the variance of the original dataset, the distance between the datapoints or their neighbourhood characteristics, etc. Building upon the optimization-based formalization of dimension reduction techniques, the goal-oriented formulation of projection cost functions is proposed. For the optimization of the application-oriented data visualization cost function, a multi-gene genetic programming (GP)-based algorithm is introduced to optimize the structures of the equations used for mapping high-dimensional data into a two-dimensional space and to select the variables that are needed to explore the internal structure of the data for data-driven software sensor development or classifier design. The main benefit of the approach is that the evolved equations are interpretable and can be utilized in surrogate models. The applicability of the approach is demonstrated on the benchmark wine dataset and in the estimation of product quality in a diesel oil blending technology based on an online near-infrared (NIR) analyzer. The results illustrate that the algorithm is capable of generating goal-oriented and interpretable features, and the resultant simple algebraic equations can be directly implemented in applications when there is a need for computationally cost-effective projections of high-dimensional data, as the resultant algebraic equations are computationally simpler than other solutions such as neural networks.
Post date: 08 June 2021
Indoor Positioning Systems Can Revolutionise Digital Lean
The powerful combination of lean principles and digital technologies accelerates waste identification and mitigation compared with traditional lean methods. The new digital lean (also referred to as Lean 4.0) solutions incorporate sensors and digital equipment, yielding innovative solutions that extend the reach of traditional lean tools. The tracking of flexible and configurable production systems is not as straightforward as that of a simple conveyor. This paper examines how the information provided by indoor positioning systems (IPS) can be utilised in the digital transformation of flexible manufacturing. The proposed IPS-based method enriches the information sources of value stream mapping and transforms positional data into key performance indicators used in Lean Manufacturing. The challenges of flexible and reconfigurable manufacturing require dynamic value stream mapping. To handle this problem, a process mining-based solution is proposed. A case study shows how the proposed method can be employed for monitoring and improving manufacturing efficiency.
Post date: 07 June 2021
Sparse PCA supports exploration of process structures for decentralized fault detection
With the ever-increasing use of sensor technologies in industrial processes, and more data becoming available to engineers, fault detection and isolation activities in the context of process monitoring have gained significant momentum in recent years. A statistical procedure frequently used in this domain is Principal Component Analysis (PCA), which can reduce the dimensionality of large data sets without compromising the information content. While most process monitoring methods offer satisfactory detection capabilities, understanding the root cause of malfunctions and providing the physical basis for their occurrence have remained challenging. The relatively new sparse PCA techniques represent a further development of PCA in which not only is the data dimension reduced but the data is also made more interpretable, revealing clear correlation structures among variables. Hence, taking a step forward from classical fault detection methods, in this work a decentralized monitoring approach is proposed based on a sparse algorithm. The resulting control charts reveal the correlation structures associated with the monitored process and facilitate a structural analysis of the faults that occur. The applicability of the proposed method is demonstrated using data generated from the simulation of the benchmark vinyl acetate process. It is shown that the sparse principal components, as a foundation for a decentralized multivariate monitoring framework, can provide physical insight into the origins of process faults.
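A crude sketch of the idea of sparse loadings (not the specific algorithm of the paper): the first principal component of a toy covariance matrix is found by power iteration and its small loadings are then truncated, which exposes the correlated variable group:

```python
# Sketch: how sparsifying principal-component loadings exposes variable
# groups. The first PC is found by power iteration on a toy covariance
# matrix, then small loadings are zeroed; this thresholding is a crude
# stand-in for the sparse PCA algorithms referenced above.

def matvec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def power_iteration(cov, iters=200):
    """Dominant eigenvector of a symmetric matrix by power iteration."""
    v = [1.0] * len(cov)
    for _ in range(iters):
        w = matvec(cov, v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def sparsify(loadings, threshold=0.2):
    """Zero out small loadings, then re-normalize the vector."""
    s = [x if abs(x) >= threshold else 0.0 for x in loadings]
    norm = sum(x * x for x in s) ** 0.5
    return [x / norm for x in s]

# Toy covariance: variables 0-1 strongly correlated, variable 2 nearly independent.
cov = [[1.0, 0.9, 0.05],
       [0.9, 1.0, 0.05],
       [0.05, 0.05, 1.0]]

pc1 = sparsify(power_iteration(cov))
print([round(x, 2) for x in pc1])   # variable 2 drops out of the component
```

The zeroed loading is what makes the component interpretable: the chart built on it monitors only the correlated pair, which is the essence of the decentralized monitoring idea.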
Post date: 25 May 2021
Test Plan for the Verification of the Robustness of Sensors and Automotive Electronic Products Using Scenario-Based Noise Deployment (SND)
The targeted shortening of sensor development requires short and convincing verification tests. The goal of developing novel verification methods is to avoid or reduce excessive testing and to identify tests that guarantee that the assumed failure will not happen in practice. In this paper, a method is presented that results in the test loads of such a verification. The method starts with the identification of the robustness-related requirements for the product, using precise descriptions of the use case scenarios in which the product is assumed to be working. Based on the logic of the Quality Function Deployment (QFD) method, a step-by-step procedure has been developed to translate the robustness requirements, through the change in design parameters, the phenomena causing them, and the physical quantities acting as the causes of these phenomena, into the test loads of the verification. The developed method is applied to the test plan of an automotive sensor. The method is general and can be used for any part of a vehicle, including mechanical, electrical and mechatronic ones, such as sensors and actuators. Nonetheless, the method is applicable in a much broader application area, even outside the automotive industry.
Post date: 12 May 2021
Regional development potentials of Industry 4.0: Open data indicators of the Industry 4.0+ model
This paper aims to identify the regional potential of Industry 4.0 (I4.0). Although the regional background of a company significantly determines how the concept of I4.0 can be introduced, the regional aspects of digital transformation are often neglected in the analysis of I4.0 readiness. Based on the analysis of I4.0 readiness models, the external regional success factors of the implementation of I4.0 solutions are determined. An I4.0+ (regional Industry 4.0) readiness model, a specific indicator system, is developed to foster medium-term regional I4.0 readiness analysis and foresight planning. The indicator system is based on three types of data sources: (1) open governmental data; (2) alternative metrics like the number of I4.0-related publications and patent applications; and (3) the number of news stories related to economic and industrial development. The indicators are aggregated to the statistical regions (NUTS 2), and their relationships are analyzed using the Sum of Ranking Differences (SRD) and Promethee II methods. The developed I4.0+ readiness index correlates with regional economic, innovation and competitiveness indexes, which indicates the importance of boosting regional I4.0 readiness.
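The Promethee II ranking step can be sketched as a net-outranking-flow computation; the "usual" 0/1 preference function, the region names, indicator values and weights below are all illustrative assumptions:

```python
# Sketch: ranking alternatives (here, regions) by a Promethee II net
# outranking flow, using the "usual" preference function (1 if strictly
# better, else 0). Region names, scores and weights are made up.

def net_flows(scores, weights):
    """Net outranking flow per alternative; higher criterion values are better."""
    regions = list(scores)
    n = len(regions)
    flows = {}
    for a in regions:
        phi = 0.0
        for b in regions:
            if a == b:
                continue
            # weighted preference of a over b, minus that of b over a
            pref_ab = sum(w for w, x, y in zip(weights, scores[a], scores[b]) if x > y)
            pref_ba = sum(w for w, x, y in zip(weights, scores[a], scores[b]) if y > x)
            phi += pref_ab - pref_ba
        flows[a] = phi / (n - 1)
    return flows

scores = {"Region A": [0.8, 0.6, 0.7],
          "Region B": [0.5, 0.9, 0.4],
          "Region C": [0.3, 0.2, 0.9]}
weights = [0.5, 0.3, 0.2]          # criterion weights, summing to 1

flows = net_flows(scores, weights)
for region in sorted(flows, key=flows.get, reverse=True):
    print(region, round(flows[region], 3))
```

The net flows sum to zero across alternatives, and sorting by them yields the complete ranking that a composite readiness index is built on.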
Post date: 19 April 2021
Data describing the relationship between world news and sustainable development goals
The data article presents a dataset and a tool for news-based monitoring of the sustainable development goals defined by the United Nations. The presented dataset was created by structured queries of the GDELT database based on the categories of the World Bank taxonomy matched to sustainable development goals. The Google BigQuery SQL scripts and the results of the related network analysis are attached to the data to provide a toolset for the strategic management of sustainability issues. The article demonstrates the dataset on the 6th sustainable development goal (Clean Water and Sanitation). The network formed based on how countries appear in the same news can be used to explore potential international cooperation. The network formed based on how topics of the World Bank taxonomy appear in the same news can be used to explore how the problems and solutions of sustainability issues are interlinked.
Post date: 24 March 2021
The Applicability of Big Data in Climate Change Research: The Importance of System of Systems Thinking
The aim of this paper is to provide an overview of the interrelationship between data science and climate studies, and to describe how sustainability and climate issues can be managed using Big Data tools. Climate-related Big Data articles are analyzed and categorized, revealing an increasing number of applications of data-driven solutions in specific areas, while broad integrative analyses receive less focus. Our major objective is to highlight the potential of the System of Systems (SoS) theorem, as the synergies between diverse disciplines and research ideas must be explored to gain a comprehensive overview of the issue. Data and systems science enable a large amount of heterogeneous data to be integrated and simulation models to be developed, while considering socio-environmental interrelations in parallel. The improved knowledge integration offered by System of Systems thinking, or climate computing, is demonstrated by analysing the possible inter-linkages of the latest Big Data application papers. The analysis highlights how data and models focusing on specific areas of sustainability can be bridged to study the complex problems of climate change.
Post date: 17 March 2021
We are delivering a course about "Sensing the future of science"
The course helps participants develop the ability to generate research ideas with potential social and economic impact.
The course material covers the concepts of technology scouting and the exploration of information sources to find 'evidence of the future in the present'.
You can download the course material from: here
Post date: 17 March 2021
Modelling for Digital Twins - Potential Role of Surrogate Models
The application of white box models in digital twins is often hindered by missing knowledge, uncertain information and computational difficulties. Our aim was to review the difficulties and challenges regarding the modelling aspects of digital twin applications and to explore the fields where surrogate models can be utilised advantageously. In this sense, the paper discusses what types of surrogate models are suitable for different practical problems and introduces the appropriate techniques for building and using these models. A number of examples of digital twin applications from both continuous processes and discrete manufacturing are presented to underline the potential of utilising surrogate models. The surrogate models and model-building methods are categorised according to the area of application. The importance of keeping these models up to date throughout their whole model life cycle is also highlighted. An industrial case study is presented to demonstrate the applicability of the concept.
Post date: 7 March 2021
Integrated Survival Analysis and Frequent Pattern Mining for Course Failure-Based Prediction of Student Dropout
A data-driven method is proposed to identify frequent sets of course failures that students should avoid in order to minimize the likelihood of dropping out of their university training. The overall probability distribution of dropout is determined by survival analysis. This result can only describe the mean dropout rate of the undergraduates. However, due to failures in different courses, the chances of dropout can vary greatly, so the traditional survival model is extended with event analysis. The study paths of students are represented as events in relation to the lack of completion of the required subjects in each semester. Frequent patterns of backlogs are discovered by mining frequent sets of these events. The prediction of dropout is personalised by classifying the success of the transitions between the semesters. Based on the explored frequent item sets and classifiers, association rules are formed, providing estimates of the success of the continuation of the studies in the form of confidence metrics. The results can be used to identify critical study paths and courses. Furthermore, based on the patterns of individual uncompleted subjects, the chance of continuation can be predicted in every semester. The analysis of the critical study paths can be used to design personalised actions minimizing the risk of dropout, or to redesign the curriculum with the aim of reducing the dropout rate. The applicability of the method is demonstrated through the analysis of the progress of chemical engineering students at the University of Pannonia in Hungary. The method is suitable for the examination of more general problems assuming the occurrence of a set of events whose combinations may trigger a set of critical events.
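The survival-analysis step can be illustrated with a plain Kaplan-Meier estimator; the student records below (semesters observed, dropout flag, with non-dropouts censored at their last observed semester) are invented:

```python
# Sketch: the survival-analysis step -- a Kaplan-Meier estimate of staying
# enrolled. Each record is (semesters observed, dropped_out); students
# still studying at the last observation are censored. Data are made up.

def kaplan_meier(records):
    """Return [(time, survival probability)] at each dropout time."""
    times = sorted({t for t, event in records if event})
    surv, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for obs, _ in records if obs >= t)
        dropouts = sum(1 for obs, event in records if obs == t and event)
        surv *= 1 - dropouts / at_risk
        curve.append((t, surv))
    return curve

students = [(2, True), (3, True), (3, False), (5, True),
            (6, False), (6, False), (7, False), (8, False)]
for t, s in kaplan_meier(students):
    print(f"after semester {t}: P(still enrolled) = {s:.3f}")
```

This mean curve is exactly what the paper argues is insufficient on its own, motivating the extension with course-failure event patterns to personalise the risk.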
Post date: 24 February 2021
Frequent Itemset Mining and Multi-Layer Network-Based Analysis of RDF Databases
Triplestores or resource description framework (RDF) stores are purpose-built databases used to organise, store and share data with context. Knowledge extraction from a large amount of interconnected data requires effective tools and methods to address the complexity and the underlying structure of semantic information. We propose a method that generates an interpretable multilayered network from an RDF database. The method utilises frequent itemset mining (FIM) of the subjects, predicates and objects of the RDF data, and automatically extracts informative subsets of the database for the analysis. The results are used to form layers in an analysable multidimensional network. The methodology enables consistent, transparent, multi-aspect-oriented knowledge extraction from the linked dataset. To demonstrate the usability and effectiveness of the methodology, we analyse how the science of sustainability and climate change is structured using the Microsoft Academic Knowledge Graph. In the case study, the FIM forms networks of disciplines to reveal the significant interdisciplinary science communities in sustainability and climate change. The constructed multilayer network then enables an analysis of the significant disciplines and interdisciplinary scientific areas. To demonstrate the proposed knowledge extraction process, we search for interdisciplinary science communities and then measure and rank their multidisciplinary effects. The analysis identifies discipline similarities, pinpointing the similarity between atmospheric science and meteorology as well as between geomorphology and oceanography. The results confirm that frequent itemset mining provides informative sampled subsets of RDF databases which can be simultaneously analysed as layers of a multilayer network.
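The FIM step can be sketched with a minimal Apriori; the toy "papers", each a set of disciplines, are invented stand-ins for the subject/predicate/object sets extracted from the RDF store:

```python
# Sketch: the frequent-itemset step that groups disciplines co-occurring
# in publications. A minimal Apriori over toy "discipline sets"; the real
# inputs would be subject/predicate/object sets from the RDF database.

from itertools import combinations

def apriori(transactions, min_support):
    """Return {itemset: support} for all frequent itemsets."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    frequent = {}
    k_sets = [frozenset([i]) for i in sorted(items)]
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        level = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(level)
        # candidate (k+1)-itemsets from unions of frequent k-itemsets
        k_sets = list({a | b for a, b in combinations(level, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

papers = [{"meteorology", "atmospheric science"},
          {"meteorology", "atmospheric science", "oceanography"},
          {"geomorphology", "oceanography"},
          {"meteorology", "atmospheric science"}]

for itemset, support in sorted(apriori(papers, 0.5).items(), key=lambda kv: -kv[1]):
    print(sorted(itemset), support)
```

Each frequent itemset found this way would become the seed of one layer of the multilayer network described above.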
Post date: 23 February 2021
Industry 4.0-Driven Development of Optimization Algorithms: A Systematic Overview
The Fourth Industrial Revolution means the digital transformation of production systems. Cyber-physical systems allow for the horizontal and vertical integration of these production systems as well as the exploitation of the benefits via optimization tools. This article reviews the impact of Industry 4.0 solutions concerning optimization tasks and optimization algorithms, in addition to the identification of the new R&D directions driven by new application options. The basic organizing principle of this overview of the literature is to explore the requirements of optimization tasks, which are needed to perform horizontal and vertical integration. This systematic review presents content from 900 articles on Industry 4.0 and optimization as well as 388 articles on Industry 4.0 and scheduling. It is our hope that this work can serve as a starting point for researchers and developers in the field.
Post date: 13 February 2021
The intertwining of world news with Sustainable Development Goals: An effective monitoring tool
This study aims to bring about a novel approach to the analysis of Sustainable Development Goals (SDGs) based solely on the appearance of news. Our purpose is to provide a monitoring tool that enables world news to be detected in an SDG-oriented manner, with multilingual as well as wide geographic coverage. The association of the goals with news is based on the World Bank Group Topical Taxonomy, from which the selection of search words approximates the 17 development goals. News is extracted from the GDELT Project (Global Database of Events, Language and Tone), which gathers both printed and online news from around the world. 60 851 572 relevant news stories were identified in 2019. The intertwining of world news with SDGs as well as connections between countries are interpreted, highlighting that even in the most SDG-sensitive countries, only 2.5% of the news can be attributed to the goals. Most of the news about sustainability appears in Africa as well as East and Southeast Asia; moreover, the most negative news tone is typically observed in Africa. In the case of climate change (SDG 13), the United States plays a key role in both the share of news and the negative tone. Using the tools of network science, it can be verified that SDGs can be characterized on the basis of world news.
This news-centred network analysis of SDGs identifies global partnerships as well as national stages of implementation towards a sustainable socio-environmental ecosystem. In the field of sustainability, it is vital to form the attitudes and environmental awareness of people, which strategic plans cannot address but can be measured well through the news.
Post date: 05 February 2021
The impact of the fourth industrial revolution on the turnover of competences
Continuous change can be observed in many areas of life, especially in connection with the fourth industrial revolution currently unfolding in practice. Through its technological innovations, Industry 4.0 is significantly changing the labour market and workplaces. Adapting to the ongoing and expected changes is therefore unavoidable, yet it is difficult to say which competences will be needed for this in the future. The aim of the research is to determine, while identifying Industry 4.0 solutions, the change in competence needs based on structured interviews conducted with the companies examined. The research reveals that the turnover of competences and the related development processes have begun. Hopefully, the explored relationships will inspire further research, provide guidance for developing and renewing training programmes for employees, and serve as useful information in the fields of HRM and Industry 4.0, supporting the elaboration and implementation of competence development and HR development strategies.
Post date: 12 January 2021
László Nagy took second place in the IEEE HS Student Paper Contest
László Nagy took second place in the IEEE HS Student Paper Contest with the article entitled: "Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing".
Post date: 12 January 2021
Estimation of machine setup and changeover times by survival analysis
The losses associated with changeovers are becoming more significant in manufacturing due to the high variance of products and requirements for just-in-time production. The study is based on the single minute exchange of die (SMED) philosophy, which aims to reduce changeover times. We introduced a method for the analysis of these losses based on models that estimate the product- and operator-dependent changeover times using survival analysis. The root causes of the losses are identified by significance tests of the utilized Cox regression models. The resulting models can be used to design a performance management system that considers the stochastic nature of the work of the operators. An anonymized manufacturing example related to the setup of crimping and wire cutting machines demonstrates the applicability of the method.
Post date: 23 December 2020
János Abonyi presented on the applicability of data science and machine learning in water management at the National Conference on Water Value and Digital Water Management
The virtual meeting was held on 2-3 December 2020, organized by MaSzeSz (Hungarian Water and Wastewater Association).
The conference discussed the real value of sustainable water utility services, knowledge-based management, reducing the large gap in costs between small and large settlements, increasing the value and social recognition of water services, the international value of the domestic water industry, digital data and information management of municipal water management, and future professionals.
More details about the conference are available here.
Post date: 11 December 2020
Machine Learning Based Analysis of Human Serum N-glycome Alterations to Follow up Lung Tumor Surgery
The human serum N-glycome is a valuable source of biomarkers for malignant diseases, already utilized in multiple studies. In this paper, the N-glycosylation changes in human serum proteins were analyzed after surgical lung tumor resection. Seventeen lung cancer patients were involved in this study, and the N-glycosylation pattern of their serum samples was analyzed before and after the surgery using capillary electrophoresis separation with laser-induced fluorescent detection. The relative peak areas of 21 N-glycans were evaluated from the acquired electropherograms using machine learning-based data analysis. Individual glycans as well as their subclasses were taken into account during the course of the evaluation. For the data analysis, both discrete (e.g., smoker or not) and continuous (e.g., age of the patient) clinical parameters were compared against the alterations in these 21 N-linked carbohydrate structures. The classification tree analysis resulted in a panel of N-glycans which could be used to follow up on the effects of lung tumor surgical resection.
Post date: 09 December 2020
Application possibilities of the data science toolbox in identifying and addressing the challenges of climate change
Walking through the most important questions in the field of sustainability science, we have shown what research and development activities will be needed in the future so that the toolbox of data science can provide effective help in understanding and managing the complex problems of climate change. We reviewed the currently successful data-driven applications with the help of keyword analysis. Our analysis illustrated that Big Data is an increasingly widely used tool in climate science, yet there are few truly comprehensive, integrative analyses that actually exploit the advantages of this technology. With our study, we would like to draw attention to the System of Systems (SoS) principle, since the drivers and impacts of climate change can only be recognized, and the impacts can only be adapted to and withstood, if the synergies between new research directions are recognized and explored in time.
Post date: December 2020
Real-Time Locating System in Production Management
Real-time monitoring and optimization of production and logistics processes significantly improve the efficiency of production systems. Advanced production management solutions require real-time information about the status of products, production, and resources. As real-time locating systems (also referred to as indoor positioning systems) can enrich the available information, these systems started to gain attention in industrial environments in recent years. This paper provides a review of the possible technologies and applications related to production control and logistics, quality management, safety, and efficiency monitoring. This work also provides a workflow to clarify the steps of a typical real-time locating system project, including the cleaning, pre-processing, and analysis of the data, to provide a guideline and reference for the research and development of indoor positioning-based manufacturing solutions.
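One way positional data becomes a production KPI, as discussed above, is a time-in-zone computation; the zone layout and the position trace below are illustrative assumptions:

```python
# Sketch: turning raw indoor-position samples into a simple utilization KPI.
# Each sample is (timestamp_s, x, y); zones are axis-aligned rectangles
# (x0, y0, x1, y1). The zone layout and the trace are made up.

ZONES = {"assembly": (0, 0, 5, 5), "buffer": (5, 0, 10, 5)}

def zone_of(x, y):
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "outside"

def time_in_zones(trace):
    """Accumulate seconds spent in each zone between consecutive samples."""
    totals = {}
    for (t0, x, y), (t1, _, _) in zip(trace, trace[1:]):
        z = zone_of(x, y)
        totals[z] = totals.get(z, 0.0) + (t1 - t0)
    return totals

trace = [(0, 1, 1), (30, 2, 1), (60, 6, 2), (90, 7, 2), (120, 1, 4)]
print(time_in_zones(trace))
```

Summing interval durations by zone is the kind of pre-processing step the reviewed workflow places between data cleaning and KPI reporting.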
Post date: 26 November 2020
Interview with János Abonyi in the historical journal Sic Itur ad Astra
Its topic is the network as metaphor and model: the expanded toolbox of the social sciences.
A conversation with network researchers János Abonyi and Balázs Lengyel about current trends in network theory.
The interview is available at this link.
Post date: 22 November 2020
Data describing the regional Industry 4.0 readiness index
The data article presents a dataset suitable for measuring regional Industry 4.0 (I4.0+) readiness. The I4.0+ dataset includes 101 indicators with 248 958 observations, aggregated to the NUTS 2 statistical level, based on open data in the fields of education (ETER, Erasmus), science (USPTO, MA-Graph, GRID), government (Eurostat) and media coverage (GDELT). The indicators cover the I4.0-specific domains of higher education and lifelong learning, innovation, technological investment, the labour market and technological readiness. A composite indicator, the I4.0+ index, was constructed by the Promethee method to rank regions regarding their I4.0 performance. The index is validated against economic (GDP) and innovation indexes (Regional Innovation Index).
Post date: 27 October 2020
Techtogether engineering competition
The Techtogether engineering competition was held at the Industry Days 2020 event, where the research group proposed a task for students to solve. Students had to present exciting industrial solutions on the topic of 'Improving the digital twin of production systems and increasing efficiency based on data analysis'.
Post date: 26 October 2020
Technology meetup 06.10.2020 - 18:00
Why is it worth going back to school? What can those who apply for our latest trainings learn?
These questions were answered at the Technology Meetup held in Veszprém on the 6th of October. Gyula Dörgő and Tamás Ruppert presented the work of the research group and introduced the Industry 4.0 engineering training and the Automotive Quality Academy to the audience.
Post date: 26 October 2020
Decision trees for informative process alarm definition and alarm-based fault classification
Alarm messages in industrial processes are designed to draw attention to abnormalities that require timely assessment or intervention. However, in practice, alarms are arbitrarily and excessively defined by process operators, resulting in numerous nuisance and chattering alarms that are simply a source of distraction. Countless techniques are available for the retrospective filtering of alarm data, e.g., adding time delays and deadbands to existing alarm settings. As an alternative, in the present paper, instead of filtering or modifying existing alarms, a method for the design of alarm messages that are informative for fault detection is proposed, built on the principle that the generated alarm messages should be optimal for fault detection and identification in the first place. The methodology utilizes a machine learning technique, the decision tree classifier, which provides linguistically well-interpretable models without the modification of the measured process variables. Furthermore, an online application of the defined alarm messages for fault identification is presented using a sliding window-based data preprocessing approach. The effectiveness of the proposed methodology is demonstrated through the analysis of a well-known benchmark simulator of a vinyl acetate production technology, whose complexity is considered sufficient for the testing of alarm systems.
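The core idea, choosing an alarm limit as a classification-optimal split, can be sketched with a single Gini-based decision stump (the paper uses full decision trees; the temperature readings and fault labels below are invented):

```python
# Sketch: choosing an alarm limit on one process variable as the
# decision-tree split that best separates labelled normal and faulty
# samples. A single Gini-optimal stump stands in for the full trees of
# the paper; the data are made up.

def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_alarm_limit(values, labels):
    """Return the threshold minimizing weighted Gini impurity of the split."""
    pairs = sorted(zip(values, labels))
    best_score, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2   # midpoint candidate
        left = [l for v, l in pairs if v <= thr]
        right = [l for v, l in pairs if v > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best_score:
            best_score, best_thr = score, thr
    return best_thr

# temperature readings with fault labels (1 = faulty period)
temps = [80, 82, 85, 90, 101, 104, 108]
faulty = [0, 0, 0, 0, 1, 1, 1]
print(best_alarm_limit(temps, faulty))   # 95.5, a clean normal/fault cut
```

Here the stump recovers a limit that perfectly separates the labelled periods, which is the sense in which such "cuts" make the resulting alarms informative for fault classification.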
Note to practitioners: Process-specific knowledge can be used to label historical process data into normal operating and fault-specific periods. Alarm generation should be designed to be able to detect and isolate faulty states. Using decision trees, optimal "cuts" or alarm limits for the purpose of fault classification can be defined utilizing a labelled dataset. The results apply to a variety of industries operating with online control systems and are especially timely in the chemical industry.
Post date: 22 October 2020
Local newspaper reported about our professional engineer training in Industry 4.0 and the Automotive Quality Academy
Link to the Industry 4.0 Professional Engineer Training website
Link to the Automotive Quality Academy website
Post date: 22 October 2020
Directions of membrane separator development for microbial fuel cells: A retrospective analysis using frequent itemset mining and descriptive statistical approach
To increase the efficiency of microbial fuel cells (MFCs), the separator (mostly a membrane) placed between the electrodes or their compartments is considered of high importance, besides several other biotic and abiotic factors (e.g. configuration, mode of operation, types of inoculum and substrate). Nafion-based proton exchange membranes (PEMs) are the most widespread, although these materials are often criticized on various technological and economic grounds. Therefore, to find alternatives to Nafion, the synthesis, development and testing of novel/commercialized membrane separators with enhanced characteristics have been hot topics. In this study, the goals were to assess membrane-installed MFCs in a retrospective manner and to reveal the trends, applied practices, frequent setups, etc. via Bayesian classification and frequent itemset mining algorithms. Thereafter, a separate discussion was devoted to examining the current standing of research related to the major membrane groups used in MFCs and to evaluating, in light of the big picture, how the various systems behave in comparison with each other, especially compared to those applying Nafion PEMs. It was concluded that some membrane types seem to be competitive with Nafion; however, the standardization of experiments would enable a more unambiguous comparison of studies.
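The core of frequent itemset mining is counting which combinations of experimental choices co-occur often enough across studies. A brute-force sketch under invented data follows; the membrane types and setup attributes are hypothetical examples, not the paper's dataset:

```python
from itertools import combinations
from collections import Counter

# Hypothetical records of MFC experimental setups; each transaction
# lists the membrane type and operating choices reported in a study.
setups = [
    {"Nafion", "dual-chamber", "acetate"},
    {"Nafion", "dual-chamber", "wastewater"},
    {"SPEEK", "single-chamber", "acetate"},
    {"Nafion", "dual-chamber", "acetate"},
    {"ceramic", "single-chamber", "wastewater"},
]

def frequent_itemsets(transactions, min_support):
    """Brute-force itemset mining: count every item combination and
    keep those whose support reaches the threshold."""
    counts = Counter()
    for t in transactions:
        for k in range(1, len(t) + 1):
            for combo in combinations(sorted(t), k):
                counts[combo] += 1
    n = len(transactions)
    return {c: v / n for c, v in counts.items() if v / n >= min_support}

result = frequent_itemsets(setups, min_support=0.6)
print(result)
```

Production-grade miners (Apriori, FP-Growth) prune the search space instead of enumerating all combinations, but the support counting shown here is the same.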
Post date: 21 October 2020
Janos Abonyi was invited to join the program committee of the Evolutionary Multi-Criterion Optimization (EMO) 2021 conference
Janos Abonyi serves as a program committee member at the 11th International Conference on Evolutionary Multi-Criterion Optimization (EMO).
The conference aims to bring together the EMO and Multiple Criteria Decision-Making (MCDM) communities and other related fields, with a focus on solving real-world problems in government, business and industry.
The conference will be held in hybrid form on March 28-31, 2021, in Shenzhen and online.
Post date: 12 October 2020
Integration of real-time locating systems into digital twins
Cyber-physical model-based solutions should rely on digital twins in which simulations are integrated with real-time sensory and manufacturing data. This paper highlights the benefits of information fusion with real-time locating systems (RTLS) and demonstrates how position and acceleration data can be utilised for the simulation-based analysis of product-specific activity times. Thanks to the real-time connection of the RTLS and adaptive simulation models, the proposed digital twin can continuously predict the production status and provide information for the monitoring of production performance. The presented industrial case study demonstrates how the resulting Simulation 4.0 concept supports the analysis of human resource effectiveness (HRE) in an assembly process.
Post date: 06 October 2020
Are Regions Prepared for Industry 4.0? The Industry 4.0+ Indicator System for Assessment
The concept of Industry 4.0 is spreading worldwide, and readiness models exist to determine organizational or national maturity. On the other hand, the regional perspective of the digital transformation is yet to be widely researched, although it significantly determines how the concept of Industry 4.0 can be introduced to organisations. This book identifies the regional aspect of Industry 4.0 and provides a regional (NUTS 2-classified) Industry 4.0 indicator system model based on open data sources. This new model serves as a tool to evaluate regional economies in support of governmental decisions. It also provides territorial councils with a decision-support tool for field investment decisions. Finally, the model offers investors a heat map with which to evaluate how successfully regional economies implement Industry 4.0 solutions.
Post date: 23 September 2020
Development of manufacturing execution systems in accordance with Industry 4.0 requirements: A review of standard- and ontology-based methodologies and tools
This work presents how recent trends in Industry 4.0 (I4.0) solutions are influencing the development of manufacturing execution systems (MESs) and analyzes what kinds of trends will determine the development of the next generation of these technologies. This systematic and thematic review provides a detailed analysis of I4.0-related requirements in terms of MES functionalities and an overview of MES development methods and standards because these three aspects are essential in developing MESs. The analysis highlights that MESs should interconnect all components of cyber-physical systems in a seamless, secure, and trustworthy manner to enable high-level automated smart solutions and that semantic metadata can provide contextual information to support interoperability and modular development. The observed trends show that formal models and ontologies will play an even more essential role in I4.0 systems as interoperability becomes more of a focus and that the new generation of linkable data sources should be based on semantically enriched information. The presented overview can serve as a guide for engineers interested in the development of MESs as well as for researchers interested in finding worthwhile areas of research.
Post date: 01 September 2020
Pairwise comparison based Failure Mode and Effects Analysis (FMEA)
The proposed method supports the determination of the severity (S), occurrence (O), and detection (D) indices of Failure Mode and Effects Analysis (FMEA). Previously evaluated and previously unstudied risks are compared pairwise. The analysis of the resulting pairwise comparison matrix provides information about the consistency of the risk evaluations and allows the estimation of the indices of the previously unevaluated risks. The advantages of the method include:
The pairwise comparison facilitates the identification of risks that are otherwise difficult to evaluate
The inconsistency of existing FMEA studies can be highlighted and systematically reduced
The method can be generalized to a wide range of grading problems
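A minimal sketch of the pairwise comparison analysis, assuming Saaty-style reciprocal judgments and eigenvector-based prioritization; the comparison matrix below is invented for illustration and does not come from an actual FMEA study:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the severity (S) index of
# three failure modes: entry (i, j) expresses how much more severe
# failure mode i is judged to be than failure mode j (reciprocal matrix).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector gives the priority weights of the failure modes.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam_max = eigvals.real[k]
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Saaty's consistency index: 0 for perfectly consistent judgments;
# large values flag inconsistent evaluations that should be revisited.
n = A.shape[0]
ci = (lam_max - n) / (n - 1)
print("priority weights:", np.round(weights, 3), "CI:", round(ci, 4))
```

The same machinery flags inconsistency: if an expert says mode 1 is 3 times as severe as mode 2 and mode 2 twice as severe as mode 3, but rates mode 1 only 5 (not 6) times as severe as mode 3, CI moves away from zero.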
Post date: 01 August 2020
Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing
Assembly line balancing improves the efficiency of production systems by the optimal assignment of tasks to operators. The optimisation of this assignment requires models that provide information about the activity times, constraints and costs of the assignments. A multilayer network-based representation of the assembly line-balancing problem is proposed, in which the layers of the network represent the skills of the operators, the tools required for their activities and the precedence constraints of their activities. The activity–operator network layer is designed by a multi-objective optimisation algorithm in which the training and equipment costs as well as the precedence of the activities are also taken into account. As these costs are difficult to evaluate, the analytic hierarchy process (AHP) technique is used to quantify the importance of the criteria. The optimisation problem is solved by a multi-level simulated annealing algorithm (SA) that efficiently handles the precedence constraints. The efficiency of the method is demonstrated by a case study from wire harness manufacturing.
Post date: 05 June 2020
We were featured in Innotéka magazine!
A portrait of Dr. János Abonyi entitled "Vonzódni az ismeretlenhez" ("Drawn to the Unknown") was published in the May issue of Innotéka magazine.
The article is available at the following link:
The May issue is available at the following link:
Post date: 11 May 2020
Multilayer network based comparative document analysis (MUNCoDA)
The proposed multilayer network-based comparative document analysis (MUNCoDA) method supports the identification of the common points of a set of documents, which deal with the same subject area. As documents are transformed into networks of informative word-pairs, the collection of documents form a multilayer network that allows the comparative evaluation of the texts. The multilayer network can be visualized and analyzed to highlight how the texts are structured. The topics of the documents can be clustered based on the developed similarity measures. By exploring the network centralities, topic importance values can be assigned. The method is fully automated by KNIME preprocessing tools and MATLAB/Octave code.
•Networks can be formed based on informative word pairs of multiple documents
•The analysis of the proposed multilayer networks provides information for multi-document summarization
•Words and documents can be clustered based on node similarity and edge overlap measures
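The word-pair network construction and the edge-overlap comparison in the bullets above can be sketched with the standard library alone; the two short documents and the Jaccard similarity choice are illustrative assumptions, not the MUNCoDA implementation:

```python
from itertools import combinations

# Two toy documents on the same subject area (invented for illustration).
docs = {
    "doc_A": "renewable energy policy supports solar energy adoption",
    "doc_B": "solar energy adoption depends on national energy policy",
}

# One network layer per document: edges are co-occurring word pairs.
layers = {}
for name, text in docs.items():
    words = set(text.split())
    layers[name] = {tuple(sorted(p)) for p in combinations(words, 2)}

# Edge-overlap (Jaccard) similarity between two layers is one simple way
# to measure how similarly the documents are structured.
a, b = layers["doc_A"], layers["doc_B"]
jaccard = len(a & b) / len(a | b)
print(f"edge overlap: {jaccard:.2f}")
```

In the full method the word pairs are filtered for informativeness and the layers are stacked into a single multilayer network, so node centralities can then rank topic importance across all documents at once.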
Post date: 29 April 2020
Focal points for sustainable development strategies—Text mining-based comparative analysis of voluntary national reviews has been published!
Post date: 11 March 2020
Countries have to work out and follow tailored strategies for the achievement of their Sustainable Development Goals (SDGs). By the end of 2018, more than 100 voluntary national reviews (VNRs) had been published. Text mining algorithms transform the reviews into networks of keywords to identify country-specific thematic areas of the strategies and to cluster countries that face similar problems and follow similar development strategies. The analysis of 75 VNRs has shown that SDG5 (gender equality) is the most discussed goal worldwide, covered in 77% of the analysed reviews. SDG8 (decent work and economic growth) is the second most studied goal, with 76%, while SDG1 (no poverty) is the least discussed, mentioned in only 48% of the documents, and SDG10 (reduced inequalities) in 49%. The results demonstrate that the proposed benchmark tool is capable of highlighting which kinds of activities can contribute significantly to achieving sustainable development.
Prof. Janos Abonyi was invited to join the program committee of the 7th edition of the International Conference on Time Series and Forecasting (ITISE 2020)
Post date: 22 February 2020
ITISE 2020 (7th International Conference on Time Series and Forecasting) seeks to provide a discussion forum for scientists, engineers, educators and students on the latest ideas and realizations in the foundations, theory, models and applications of interdisciplinary and multidisciplinary research, encompassing computer science, mathematics, statistics, forecasting, econometrics, etc., in the field of time series analysis and forecasting.
The aim of ITISE 2020 is to create a friendly environment that could lead to the establishment or strengthening of scientific collaborations and exchanges among attendees. Therefore, ITISE 2020 solicits high-quality original research papers (including significant work-in-progress) on any aspect of time series analysis and forecasting, in order to motivate the generation and use of knowledge and new computational techniques and methods for forecasting in a wide range of fields.
Post date: 16 February 2020
The school is organized at the University of Catania, Italy, by the Department of Electrical Electronics and Computer Science and the Cometa Consortium.
It consists of a series of lectures given by leading scientists in the field, aiming to provide a comprehensive treatment from background material to advanced results. The school is especially directed at PhD students and young researchers interested in the diverse aspects of the theory and applications of complex networks in science and engineering. The school aims to encourage cross-disciplinary discussions between participants and speakers and to start new joint research.
Assembly magazine reported on our methodology developed for activity time monitoring:
Post date: 13 February 2020
Industry 4.0 and the digital manufacturing revolution are all about collecting—and, more importantly, acting on—data gathered from the assembly process in real time. That’s all well and good when data is coming from sensors, vision systems, fastening tools and other electronic devices. But, how can engineers gather real-time data on largely manual assembly processes, such as wire harness assembly?
To solve this problem, we developed a software- and sensor-based system to measure activity times and performance on a wire harness assembly line. To ensure a real-time connection between assembler performance and varying product complexity, our system relies on fixture sensors and an indoor positioning system (IPS). Our goal was to create a system that could continuously estimate the time consumed by the various elementary activities that make up wire harness assembly. Our system creates a model of a task, compares estimated activity times to the actual performance of assemblers, and generates early warnings when their productivity decreases.
J. Abonyi, T. Ruppert, Monitoring Activity During Wire Harness Assembly, Assembly, 2020
Mixtures of QSAR Models – Learning Application Domains of pKa Predictors has been published!
Post date: 11 February 2020
Quantitative structure-activity relationship (QSAR) models predict physical properties or biological effects based on the physicochemical properties or molecular descriptors of chemical structures. Our work focuses on the construction of optimal linear and nonlinear weighted mixtures of individual QSAR models to improve prediction performance. We highlight how splitting the application domain with a nonlinear gating network in a "mixture of experts" model structure is suitable for determining the optimal domain-specific QSAR model, and how the optimal QSAR model for certain chemical groups can be identified. The input of the gating network is formed freely from the various molecular structure descriptors and/or even the predictions of the individual QSAR models. The applicability of the method is demonstrated on the pKa values of the OASIS database (1912 chemicals) by combining four acidic pKa predictions of the OECD QSAR Toolbox. According to the results, the prediction performance was enhanced by more than 15% (RMSE) compared to the predictions of the best individual QSAR model.
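The mixture-of-experts combination can be illustrated with a softmax gating network; the predictions, descriptor vector and gating weights below are invented numbers (in practice the gating weights are learned), not values from the OASIS study:

```python
import numpy as np

# Hypothetical pKa predictions of three individual QSAR models for one
# chemical, plus a two-element molecular descriptor vector for the gate.
predictions = np.array([4.2, 4.8, 5.1])
descriptors = np.array([1.0, 0.5])

# Illustrative gating parameters (learned from data in the real method);
# softmax turns the gating scores into mixing proportions that sum to 1.
W = np.array([[0.8, -0.2],
              [0.1, 0.4],
              [-0.5, 0.3]])
scores = W @ descriptors
gates = np.exp(scores) / np.exp(scores).sum()

# The mixture output is the gate-weighted combination of the experts,
# so different regions of descriptor space favour different QSAR models.
mixed_pka = float(gates @ predictions)
print(f"mixed prediction: {mixed_pka:.2f}")
```

Because the gates depend on the descriptors, the same trained mixture can lean on one expert for, say, phenols and on another for carboxylic acids, which is exactly the domain-splitting behaviour described above.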
Janos Abonyi invited to serve as a Committee member!
Post date: 10 February 2020
Dr. Janos Abonyi has been invited to serve as Technical Program Committee member in the IEEE Wireless Africa 2020 conference.
IEEE Wireless Africa 2020 is sponsored by the IEEE Vehicular Technology Society and will be hosted in South Africa on 29-30 November 2020.
The conference aims to provide a platform for wireless researchers to share their results, call for comments and collaborations, and exchange innovative ideas on leading edge research in wireless technologies.
A multilayer and spatial description of the Erasmus mobility network has been published!
Post date: 06 February 2020
The Erasmus Programme is the biggest collaboration network consisting of European Higher Education Institutions (HEIs). The flows of students, teachers and staff form directed and weighted networks that connect institutions, regions and countries. Here, we present a linked and manually verified dataset of this multiplex, multipartite, multi-labelled, spatial network. We enriched the network with institutional socio-economic data from the European Tertiary Education Register (ETER) and the Global Research Identifier Database (GRID). We geocoded the headquarters of institutions and characterised the attractiveness and quality of their environments based on Points of Interest (POI) data. The linked datasets provide relevant information to grasp a more comprehensive understanding of the mobility patterns and attractiveness of the institutions.
Conference presentation
Tamás Ruppert and Róbert Csalódi attended the 7th International Conference on Industrial Engineering and Applications (ICIEA) in Paris from 15 to 17 January 2020. Their presentations were rated by the judges as the best in their respective sessions.
Network-Based Analysis of Dynamical Systems has been published!
Post date: 15 January 2020
The key idea of this book is that the dynamical properties of complex systems can be determined by the effective calculation of specific structural features by network science-based analysis. Furthermore, certain dynamical behaviours can originate from the existence of specific motifs in the network representation or can be determined by network segmentation. Although the applicability of the methods and results was highlighted in the operability analysis of dynamical systems, the proposed methods can be utilised in various fields that will be mentioned at the end of each chapter.
Fuzzy activity time-based model predictive control of open station assembly lines is published!
Post date: 14 December 2019
The sequencing and line balancing of manual mixed-model assembly lines are challenging tasks due to the complexity and uncertainty of operator activities. The control of cycle time and the sequencing of production can mitigate the losses due to non-optimal line balancing in the case of open-station production, where the operators can work ahead of schedule and try to reduce their backlog. The objective of this paper is to provide a cycle time control algorithm that can improve the efficiency of assembly lines in such situations based on a specially mixed sequencing strategy. To handle the uncertainty of activity times, a fuzzy model-based solution has been developed. As the production process is modular, the fuzzy sets represent the uncertainty of the elementary activity times related to the processing of the modules. The optimistic and pessimistic estimates of the completion of activity times extracted from the fuzzy model are incorporated into a model predictive control algorithm to ensure the constrained optimization of the cycle time. The applicability of the proposed method is demonstrated based on a wire-harness manufacturing process with a paced conveyor, but the proposed algorithm can handle continuous conveyors as well. The results confirm that the proposed algorithm is widely applicable in cases where a production line of a supply chain is not well balanced and the activity times are uncertain.
Network analysis dataset of System Dynamics models is published!
Post date: 01 November 2019
This paper presents a tool developed for the analysis of networks extracted from system dynamics models. The developed tool and the collected models were used and analyzed in the research paper "Review and structural analysis of system dynamics models in sustainability science". The models, developed in Vensim, Stella, and InsightMaker, are converted into networks of state variables, flows, and parameters by the developed Python program, which also performs model reduction and modularity analysis and calculates the structural properties of the models and their main variables. The dataset covers the results of the analysis of nine models in sustainability science used for policy testing, prediction and simulation.
Constrained Recursive Input Estimation of Blending and Mixing Systems is published!
Post date: 23 October 2019
Blending and mixing processes are often supported by advanced process control systems to maximise margins from the available component and heat streams. Since these model-based solutions require accurate and reliable data, in weakly instrumented processes the unknown inlet concentrations and temperatures should be estimated based on the measured outflows. This work presents a method for the reliable estimation of multiple input variables of process units. The key idea is that the input estimation problem is formulated as a constrained recursive estimation task. The applicability of the method is illustrated on a benchmark model of a blending system, and its performance is compared to moving window and Kalman filter-based solutions. The results show the superior performance of the proposed method and confirm that the a priori knowledge-based constraints improve the robustness of the estimates.
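The constrained recursive estimation idea can be illustrated on a toy two-inlet blending unit. Everything below (the flows, the true concentrations, the simple recursive-averaging update) is a simplified assumption for demonstration, not the paper's actual formulation:

```python
import numpy as np

# Toy blending unit: the outlet concentration is the flow-weighted mean
# of two inlet concentrations; the first inlet (true value 0.3) is unknown.
true_inputs = np.array([0.3, 0.7])
flows = np.array([0.4, 0.6])
rng = np.random.default_rng(1)

estimate = 0.5  # initial guess for the unknown inlet concentration
for k in range(1, 201):
    y = flows @ true_inputs + rng.normal(0, 0.01)  # noisy outlet measurement
    residual = y - flows[1] * true_inputs[1]       # remove the known inlet's share
    innovation = residual / flows[0] - estimate
    estimate += innovation / k                     # recursive averaging update
    estimate = min(max(estimate, 0.0), 1.0)        # a priori bound: 0 <= c <= 1

print(f"estimated inlet concentration: {estimate:.2f}")
```

The clipping step is where the a priori knowledge enters: a physically impossible concentration is projected back onto the feasible interval, which is what makes the recursive estimate robust to noisy early measurements.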
Introduction to Data Analysis Course
Post date: 18 October 2019
We are delivering a data analysis course at MOL Value Chain Academy.
Data-driven multilayer complex networks of sustainable development goals is published!
Post date: 08 October 2019
This data article presents the formulation of a multilayer network for modelling the interconnections among the sustainable development goals (SDGs) and targets, and includes the correlation-based linking of the sustainable development indicators with the available long-term datasets of The World Bank (2018). The spatial distribution of the time series data allows the creation of country-specific sustainability assessments. In the related research article, "Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment", the similarities of the SDGs for ten regions have been modelled in order to improve the quality of strategic environmental assessments. The datasets of the multilayer networks are available on Mendeley.
Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox is published!
Post date: 08 October 2019
The network science-based determination of driver nodes and sensor placement has become increasingly popular in the field of dynamical systems over the last decade. In this paper, the applicability of the methodology in the field of life sciences is introduced through the analysis of the neural network of Caenorhabditis elegans. Simultaneously, an Octave and MATLAB-compatible NOCAD toolbox is proposed that provides a set of methods to automatically generate the relevant structural controllability and observability associated measures for linear or linearised systems and compare the different sensor placement methods.
Genetic programming-based development of thermal runaway criteria is published!
Post date: 08 October 2019
Common thermal runaway criteria (e.g. divergence criterion and the Maxi criterion) may predict a thermal runaway unreasonably as the Maximum Allowable Temperature (MAT) is not taken into account. This contribution proposes a method for the goal-oriented construction of reactor runaway criteria by Genetic Programming (GP). The runaway prediction problem is formulated as a critical equation-based classification task, and GP is used to identify the optimal structure of the equations that also take into account the MAT. To demonstrate the applicability of the method, tailored criteria were developed for batch and continuous stirred-tank reactors. The resultant critical equations outperform the well-known criteria in terms of the early and accurate indication of thermal runaways.
Review and structural analysis of system dynamics models in sustainability science is published!
Post date: 08 October 2019
As the complexity of sustainability-related problems increases, it is more and more difficult to understand the related models. Although numerous models have been published recently, their automated structural analysis is still absent. This study provides a methodology to structure and visualise the information content of these models. The novelty of the present approach is the development of a network analysis-based tool for modellers to measure the importance of variables, identify structural modules in the models and measure the complexity of the created model, thus enabling the comparison of different models. An overview of 130 system dynamics models from the past five years is provided. The typical topics and complexity of these models highlight the need for tools that support the automated structural analysis of sustainability problems. For practising engineers and analysts, nine models from the field of sustainability science, including the World3 model, are studied in detail. The results highlight that, with the help of the developed method, experts can identify the most critical variables of sustainability problems (like arable land in the World3 model) and can determine how these variables are clustered and interconnected (e.g. population and fertility are key drivers of global processes). The developed software tools and the resulting networks are all available online.
Learning and predicting operation strategies by sequence mining and deep learning (full paper) is published!
Post date: 15 June 2019
The operators of chemical technologies are frequently faced with the problem of determining optimal interventions. Our aim is to develop data-driven models by exploring the consequential relationships in the alarm and event-log databases of industrial systems. Our motivation is twofold: (1) to facilitate the work of the operators by predicting future events and (2) to analyse how consequential the event series are. The core idea is that machine learning algorithms can learn sequences of events by exploring connected events in databases. First, frequent sequence mining is utilised to determine how the event sequences evolve during operation. Second, a sequence-to-sequence deep learning model is proposed for their prediction. The long short-term memory (LSTM) unit-based model is capable of evaluating rare operating situations and their consequential events. The performance of this methodology is presented through the analysis of the alarm and event-log database of an industrial delayed coker unit.
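The frequent sequence mining step can be sketched by counting consecutive event pairs in a log and predicting the most common follower; the alarm/event names and the toy log below are invented, and a real implementation would mine longer sequences and use the LSTM for prediction:

```python
from collections import Counter, defaultdict

# Hypothetical alarm/event log of an industrial unit, ordered in time.
events = ["high_temp", "valve_open", "high_temp", "valve_open",
          "low_flow", "pump_on", "high_temp", "valve_open", "low_flow"]

# Count consecutive event pairs: frequent pairs show which
# interventions typically follow which alarms.
bigrams = Counter(zip(events, events[1:]))

# A naive predictor: given the current event, return the most
# frequently observed follower.
followers = defaultdict(Counter)
for prev, nxt in zip(events, events[1:]):
    followers[prev][nxt] += 1

prediction = followers["high_temp"].most_common(1)[0][0]
print(f"after 'high_temp' expect: {prediction}")
```

The gap the paper addresses is precisely where this naive counter fails: rare operating situations with few observed followers, which the sequence-to-sequence LSTM can still evaluate by generalising across similar contexts.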
A Review of Semantic Sensor Technologies in Internet of Things Architectures is published!
Post date: 15 June 2019
Intelligent sensors should be interconnected in a seamless, secure, and trustworthy manner to enable automated high-level smart applications. Semantic metadata can provide contextual information to support the accessibility of these features, making it easier for machines and humans to process the sensory data and achieve interoperability. The unique overview of sensor ontologies according to the semantic needs of the layers of IoT solutions can serve as a guideline for engineers and researchers interested in the development of intelligent sensor-based solutions. The explored trends show that ontologies will play an even more essential role in interlinked IoT systems, as interoperability and the generation of controlled linkable data sources should be based on semantically enriched sensory data.
Operating regime model based multi-objective sensor placement for data reconciliation is published!
Post date: 15 June 2019
Although the number of sensors in chemical production plants is increasing thanks to the IoT revolution, it