News

Particle filtering supported probability density estimation of mobility patterns

This paper presents a methodology that aims to enhance the accuracy of probability density estimation in mobility pattern analysis by integrating prior knowledge of system dynamics and contextual information into the particle filter algorithm. The quality of the data used for density estimation is often inadequate due to measurement noise, which significantly influences the distribution of the measurement data. Thus, it is crucial to augment the information content of the input data by incorporating additional sources of information beyond the measured position data. These other sources can include the dynamic model of movement and the spatial model of the environment, which influences motion patterns. To effectively combine the information provided by positional measurements with system and environment models, the particle filter algorithm is employed, which generates discrete probability distributions. By subjecting these discrete distributions to exploratory techniques, it becomes possible to extract more certain information compared to using raw measurement data alone. Consequently, this study proposes a methodology in which probability density estimation is not solely based on raw positional data but rather on probability-weighted samples generated through the particle filter. This approach yields more compact and precise modeling distributions. Specifically, the method is applied to process position measurement data using a nonparametric density estimator known as kernel density estimation. The proposed methodology is thoroughly tested and validated using information-theoretic and probability metrics. The applicability of the methodology is demonstrated through a practical example of mobility pattern analysis based on forklift data in a warehouse environment.
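The core idea, estimating the density from probability-weighted particles rather than from raw measurements, can be sketched with a weighted Gaussian kernel density estimate. This is a minimal illustration, not the paper's implementation; the grid, bandwidth and toy particle set are assumptions.

```python
import numpy as np

def weighted_kde(x_grid, particles, weights, bandwidth=0.5):
    """Weighted Gaussian kernel density estimate.

    Instead of weighting raw measurements equally, each particle
    contributes in proportion to its particle-filter weight.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()          # normalize to a distribution
    # Gaussian kernel evaluated between every grid point and every particle
    diffs = (np.asarray(x_grid)[:, None] - np.asarray(particles)[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / (bandwidth * np.sqrt(2 * np.pi))
    return kernels @ weights                   # weighted mixture of kernels

# Toy example: particles near 0, one outlier downweighted by the filter
particles = [0.0, 0.1, -0.1, 5.0]
weights = [0.3, 0.3, 0.3, 0.1]
grid = np.linspace(-2, 2, 5)
density = weighted_kde(grid, particles, weights)
```

Because the outlier at 5.0 carries little weight, it barely distorts the estimated density, which is the compactness gain the abstract describes.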

Post Date: 15 April 2024

Generally Applicable Q-Table Compression Method and Its Application for Constrained Stochastic Graph Traversal Optimization Problems

We analyzed a special class of graph traversal problems, where the distances are stochastic, and the agent is restricted to a limited travel range in one go. We showed that both constrained shortest Hamiltonian pathfinding problems and disassembly line balancing problems belong to the class of constrained shortest pathfinding problems, which can be represented as mixed-integer optimization problems. Reinforcement learning (RL) methods have proven their efficiency on multiple complex problems. However, researchers have concluded that the learning time increases radically as the state and action spaces grow. In continuous cases, approximation techniques are used, but these methods have several limitations in mixed-integer search spaces. We present the Q-table compression method, a multistep method combining dimension reduction, state fusion, and space compression techniques that projects a mixed-integer optimization problem onto a discrete one. The RL agent is then trained using an extended Q-value-based method to deliver a human-interpretable model for optimal action selection. Our approach was tested on selected constrained stochastic graph traversal use cases, and comparative results are shown against a simple grid-based discretization method.
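The Q-value-based training that the compressed table feeds can be illustrated with plain tabular Q-learning on a toy stochastic graph. The compression steps themselves are the paper's contribution and are omitted here; the graph, costs and hyperparameters are illustrative assumptions.

```python
import random

def q_learning_shortest_path(edges, start, goal, episodes=2000, alpha=0.1,
                             gamma=0.95, eps=0.2, seed=0):
    """Tabular Q-learning for a shortest-path problem with stochastic edge costs.

    edges: {node: {next_node: (mean_cost, noise)}} -- cost sampled per step.
    Returns the learned Q-table {(node, next_node): expected cost-to-go}.
    """
    rng = random.Random(seed)
    q = {(u, v): 0.0 for u, nbrs in edges.items() for v in nbrs}
    for _ in range(episodes):
        node = start
        while node != goal:
            nbrs = list(edges[node])
            if rng.random() < eps:                      # explore
                nxt = rng.choice(nbrs)
            else:                                       # exploit (minimize cost)
                nxt = min(nbrs, key=lambda v: q[(node, v)])
            mean, noise = edges[node][nxt]
            cost = mean + rng.uniform(-noise, noise)    # stochastic distance
            future = 0.0 if nxt == goal else min(q[(nxt, v)] for v in edges[nxt])
            q[(node, nxt)] += alpha * (cost + gamma * future - q[(node, nxt)])
            node = nxt
    return q

# Tiny graph: A->B->D is cheaper on average than A->C->D
edges = {"A": {"B": (1.0, 0.2), "C": (4.0, 0.2)},
         "B": {"D": (1.0, 0.2)},
         "C": {"D": (1.0, 0.2)},
         "D": {}}
q = q_learning_shortest_path(edges, "A", "D")
best_first_step = min(["B", "C"], key=lambda v: q[("A", v)])
```

The resulting Q-table is the human-interpretable artifact: each entry directly states the expected cost-to-go of an action.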


Post Date: 01 April 2024

Disassembly line optimization with reinforcement learning

As environmental aspects become increasingly important, disassembly problems have moved into researchers' focus. The multiple criteria involved do not allow for a general optimization method, but some heuristics and classical formulations provide effective solutions. Since disassembly problems are not straight inverses of assembly problems and their conditions are not standard, disassembly optimization solutions require human control and supervision. Considering that reinforcement learning (RL) methods can successfully solve complex optimization problems, we developed an RL-based solution for a fully formalized disassembly problem. Successful implementations of RL-based optimizers were already known, but we integrated a novel heuristic that targets a dynamically pre-filtered action space for the RL agent (the DLOPTRL algorithm) and hence significantly raises the efficiency of the learning path. Our algorithm belongs to the class of Heuristically Accelerated Reinforcement Learning (HARL) methods. We demonstrated its applicability in two use cases, but our approach can also be easily adapted to other problem types. Our article gives a detailed overview of disassembly problems and their formulation, the general RL framework and especially Q-learning techniques, and an instructive example of extending RL with a built-in heuristic.
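The heuristic pre-filtering idea behind DLOPTRL can be sketched as masking the action space with precedence constraints before the agent chooses. The sketch below uses a fixed value table in place of learned Q-values; all part names and values are illustrative, not the paper's data.

```python
def feasible_actions(remaining, precedence):
    """Pre-filter: a part can be removed only once all its predecessors are gone.

    remaining: set of parts still attached; precedence: {part: set of prerequisites}.
    """
    return [p for p in remaining if not (precedence.get(p, set()) & remaining)]

def greedy_disassembly(parts, precedence, value):
    """Toy policy: among heuristic-filtered actions, always take the highest value.

    A trained RL agent would replace `value` with learned Q-values; the
    filtering step is what shrinks the action space at every state.
    """
    remaining, sequence = set(parts), []
    while remaining:
        actions = feasible_actions(remaining, precedence)
        if not actions:               # deadlock guard (inconsistent precedence)
            break
        best = max(actions, key=lambda p: value[p])
        sequence.append(best)
        remaining.remove(best)
    return sequence

# Cover must come off before the battery or the board can be removed
precedence = {"battery": {"cover"}, "board": {"cover"}}
value = {"cover": 1, "battery": 5, "board": 3}
seq = greedy_disassembly(["cover", "battery", "board"], precedence, value)
```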


Post Date: 11 March 2024

Utility function-based generalization of sum of ranking differences–country-wise analysis of greenhouse gas emissions

The utility function-based sum of ranking differences (uSRD) method is proposed as a utility function-based multi-criteria decision analysis tool. Our idea is that the transformation functions can be represented by a utility function that can be aggregated with multi-attribute utility functions. We present a framework incorporating utility values as the basis for three different but interconnected analyses. The exemplary application focuses on greenhouse gas emissions and economic indicators of 147 countries. First, the uSRD is applied to the utility values to uncover the hidden relationships among the 40 indicators. Second, a ranking of countries is established to see which country performs best and worst in both emissions and the economy. Lastly, mitigation actions are delegated to countries through a three-stage assignment that connects emissions to utilities, sectors, and mitigation actions. The results show that the uSRD excels as a support tool for decision-making.
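The classical SRD core that uSRD generalizes can be sketched in a few lines: rank the objects by each criterion, rank them by a reference column (here the row-wise average), and sum the absolute rank differences. The utility transformations of uSRD are omitted; the data are toy values.

```python
def ranks(values):
    """Rank positions (1 = smallest); ties broken by order of appearance."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def srd(criterion, reference):
    """Sum of ranking differences between one criterion and the reference column."""
    rc, rr = ranks(criterion), ranks(reference)
    return sum(abs(a - b) for a, b in zip(rc, rr))

# Three objects scored by two criteria; the reference is their row-wise average
crit_a = [0.2, 0.9, 0.5]
crit_b = [0.9, 0.1, 0.4]
reference = [(a + b) / 2 for a, b in zip(crit_a, crit_b)]
srd_a = srd(crit_a, reference)   # larger SRD = criterion disagrees more
srd_b = srd(crit_b, reference)   # with the consensus ranking
```

A criterion with a small SRD closely follows the consensus ranking, which is how SRD flags which indicators agree with (or distort) an aggregation.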


Post date: 28 February 2024

Neighborhood Ranking-Based Feature Selection

This article aims to integrate k-NN regression, false-nearest neighborhood (FNN), and trustworthiness and continuity (T&C) neighborhood-based measures into an efficient and robust feature selection method to support the identification of nonlinear regression models. The proposed neighborhood ranking-based feature selection technique (NRFS) is validated on three problems: a linear regression task, the nonlinear Friedman dataset, and the problem of determining the order of nonlinear dynamical models. A neural network is also identified to validate the resulting feature sets. The analysis of the distance correlation also confirms that the method is capable of exploring the nonlinear correlation structure of complex systems. The results illustrate that the proposed NRFS method can select relevant variables for nonlinear regression models.
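One neighborhood-based ingredient, scoring a candidate feature subset by leave-one-out k-NN regression error, can be sketched as follows. This is a simplified stand-in for the full NRFS ranking; the dataset and the scoring function are illustrative assumptions.

```python
import math

def knn_loo_error(X, y, features, k=2):
    """Leave-one-out k-NN regression error using only the selected features.

    A lower error suggests the feature subset captures the input-output
    relation; this is the kind of neighborhood score used to rank subsets.
    """
    n, err = len(X), 0.0
    for i in range(n):
        dists = []
        for j in range(n):
            if j == i:
                continue
            d = math.sqrt(sum((X[i][f] - X[j][f]) ** 2 for f in features))
            dists.append((d, y[j]))
        dists.sort(key=lambda t: t[0])
        pred = sum(v for _, v in dists[:k]) / k    # average of k nearest targets
        err += (pred - y[i]) ** 2
    return err / n

# y depends only on feature 0; feature 1 is pure noise
X = [[0.0, 0.7], [0.1, 0.2], [0.2, 0.9], [0.3, 0.4], [0.4, 0.1], [0.5, 0.8]]
y = [x[0] * 10 for x in X]
err_relevant = knn_loo_error(X, y, features=[0])
err_noise = knn_loo_error(X, y, features=[1])
```

The relevant feature yields a much smaller neighborhood-prediction error than the noise feature, which is exactly the signal a neighborhood-ranking selector exploits.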

Post date:  05 February 2024

Fault Diagnostics Based on the Analysis of Probability Distributions Estimated Using a Particle Filter 

This paper proposes a monitoring procedure based on characterizing state probability distributions estimated using particle filters. The work highlights what types of information can be obtained during state estimation and how the revealed information helps to solve fault diagnosis tasks. If a failure is present in the system, the output predicted by the model is inconsistent with the actual output, which affects the operation of the estimator. The heterogeneity of the probability distribution of states increases, and a large proportion of the particles lose their information content. The correlation structure of the posterior probability density can also be altered by failures. The proposed method uses various indicators that characterize the heterogeneity and correlation structure of the state distribution, as well as the consistency between model predictions and observed behavior, to identify the effects of failures. The applicability of the utilized measures is demonstrated through a dynamic vehicle model, where actuator and sensor failure scenarios are investigated.
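One simple heterogeneity indicator of this kind is the effective sample size (ESS) of the particle weights, which collapses when a fault makes model predictions inconsistent with the measurements. The paper uses a broader indicator set; this sketch shows only the ESS, with toy weight vectors.

```python
def effective_sample_size(weights):
    """Effective sample size of a particle set: 1 / sum(w_i^2) for normalized w.

    Under a fault, predictions mismatch measurements, the weights
    degenerate, and the ESS drops far below the particle count -- a
    simple heterogeneity indicator for monitoring.
    """
    s = sum(weights)
    norm = [w / s for w in weights]
    return 1.0 / sum(w * w for w in norm)

healthy = [0.25, 0.25, 0.25, 0.25]     # consistent model: near-uniform weights
faulty = [0.97, 0.01, 0.01, 0.01]      # degenerate weights after a fault
ess_healthy = effective_sample_size(healthy)
ess_faulty = effective_sample_size(faulty)
```

A threshold on the ESS (relative to the particle count) then gives a first fault alarm before any structural analysis of the posterior is needed.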


Post date:  25 January 2024

Darányi, A., & Abonyi, J. (2024). Fault Diagnostics Based on the Analysis of Probability Distributions Estimated Using a Particle Filter. Sensors, 24(3), 719. https://doi.org/10.3390/s24030719

This book presents a comprehensive framework for developing Industry 4.0 and 5.0 solutions through the use of ontology modeling and graph-based optimization techniques. With effective information management being critical to successful manufacturing processes, this book emphasizes the importance of adequate modeling and systematic analysis of interacting elements in the era of smart manufacturing.

The book provides an extensive overview of semantic technologies and their potential to integrate with existing industrial standards, planning, and execution systems to provide efficient data processing and analysis. It also investigates the design of Industry 5.0 solutions and the need for problem-specific descriptions of production processes, operator skills and states, and sensor monitoring in intelligent spaces.

The book proposes that ontology-based data can efficiently represent enterprise and manufacturing datasets.

The book is divided into two parts: modeling and optimization. The semantic modeling part provides an overview of ontologies and knowledge graphs that can be used to create Industry 4.0 and 5.0 applications, with two detailed applications presented on a reproducible industrial case study. The optimization part of the book focuses on network science-based process optimization and presents various detailed applications, such as graph-based analytics, assembly line balancing, and community detection.

The book is based on six key points: the need for horizontal and vertical integration in modern industry; the potential benefits of integrating semantic technologies into ERP and MES systems; the importance of optimization methods in Industry 4.0 and 5.0 concepts; the need to process large amounts of data while ensuring interoperability and re-usability factors; the potential for digital twin models to model smart factories, including big data access; and the need to integrate human factors in CPSs and provide adequate methods to facilitate collaboration and support shop floor workers. 

You can order this book at Springer

Objective well-being level (OWL) composite indicator for sustainable and resilient cities

Well-being is a critical element of the 2030 Agenda for Sustainable Development Goals. Given the complexity of the concept of well-being, its measurement requires complex, multivariate methods that can characterize the physical, economic, social and environmental aspects of a city along with its mental state. However, settlement-level analyses alone are not sufficient to make cities inclusive, safe, resilient and sustainable; it is also necessary to understand patterns within settlements. This work aims to present how the urban macrostructure of urban well-being indicators can be estimated based on GIS-based multilayer analysis. Open-source data, e.g. road networks, points of interest, green spaces and vegetation, are used to estimate urban well-being parameters such as noise levels, air quality and health-related impacts, supplemented by climate models to assess urban resilience and sustainability. The proposed methodology integrates 24 models into six categories, namely walkability, environment, health, society, climate change and safety, which are weighted based on a multilevel Principal Component Analysis to minimize information loss in the aggregated composite indicators. The study revealed two main components of the macrostructure related to well-being in the studied city: one related to geometrical features, while the other can be derived from the structure of the natural environment. In Veszprém, natural restoration of the detached-house area, the industrial area and the downtown is recommended, including developments with green and blue infrastructural elements and nature-based solutions.


 Post date: 09 January 2024 

Research laboratory members took third place in Reman Challenge 2023

The Reman Challenge was hosted for the fifth time by BORG Automotive, with three students from the research laboratory taking part. The team of Abdulrahman Khalid, Timea Czvetkó and Gergely Halász finished in third place.

The challenge called for researching and innovating transformative solutions that can revolutionize operators' work conditions in remanufacturing operations. From virtual reality to human-robot collaboration, the only boundaries were the feasibility and scalability of the solutions.


Post date: 05 January 2024 

Time-dependent sequential association rule-based survival analysis: A healthcare application

The analysis of event sequences with temporal dependencies holds substantial importance across various domains, including healthcare. This study introduces a novel approach that combines sequential rule mining and survival analysis to uncover significant associations and temporal patterns within event sequences. By integrating these techniques, we address the limitations linked to the loss of temporal information. The methodology extends traditional sequential rule mining by introducing time-dependent confidence functions, providing a comprehensive understanding of relationships between antecedent and consequent events. The incorporation of the Kaplan-Meier estimator of survival analysis enables the calculation of temporal distributions between events, resulting in time-dependent confidence functions. These confidence functions illuminate the probability of specific event occurrences considering temporal contexts. To present the application of the method, we demonstrated its usage within the healthcare domain. Analyzing ICD-10 codes and laboratory events, we successfully identified relevant sequential rules and their time-dependent confidence functions. This empirical validation underscores the potential of the methodology to uncover clinically significant associations within intricate medical data.
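The time-dependent confidence of a rule can be derived from the Kaplan-Meier estimator as 1 - S(t), the probability that the consequent has occurred by time t after the antecedent. A minimal sketch on toy antecedent-to-consequent durations (censoring encoded as event = 0); the data and rule are illustrative, not from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from (time, observed-event) pairs.

    times: time from antecedent to consequent (or to censoring);
    events: 1 if the consequent occurred, 0 if censored.
    Returns [(t, S(t))] at each observed event time.
    """
    data = list(zip(times, events))
    at_risk, surv, curve = len(data), 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for tt, e in data if tt == t and e == 1)   # events at t
        n = sum(1 for tt, _ in data if tt == t)              # leaving risk set at t
        if d:
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= n
    return curve

# Days from a diagnosis (antecedent) to a lab event (consequent); 0 = censored
times = [2, 3, 3, 5, 8, 8]
events = [1, 1, 0, 1, 1, 0]
curve = kaplan_meier(times, events)
# Time-dependent confidence of the rule: P(consequent by t) = 1 - S(t)
confidence_by_5 = 1.0 - dict(curve)[5]
```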


Post date: 05 January 2024 

Magyar Fuzzy Társaság Ifjúsági Díj / Hungarian Fuzzy Society Youth Award



András Darányi, PhD student of the Faculty of Engineering, received the Hungarian Fuzzy Society Youth Award. The prize was awarded in recognition of his research and scientific achievements in the field of soft computing methods.


Post date: 21 November 2023

Operator 4.0 research network: A key player in Gartner's top trends for 2024

Gartner has just published its Top 10 Strategic Technology Trends for 2024 report. The augmented connected workforce ranks 9th, and advancing it is the main goal of the Operator 4.0 research network led by our research group.


Post date: 17 October 2023

Indoor Positioning-based Occupational Exposures Mapping and Operator Well-being Assessment in Manufacturing Environment

This research was motivated by the need for detailed information about the spatial and contextualized distribution of occupational exposures, which can be used to improve the layout of the workspace. To achieve this goal, the study emphasizes the need for position-related information and contextualized data. To address these concerns, the study proposes the use of Indoor Positioning System (IPS) sensors that can be further developed to establish a set of metrics for measuring and evaluating occupational exposures. The proposed IPS-based sensor fusion framework, which combines various environmental parameters with position data, can provide valuable insights into the operator’s working environment. For this, we propose an indoor position-based comfort level indicator. By identifying areas of improvement, interventions can be implemented to enhance operator performance and overall health. The measurement unit was installed on a manual material handling device in a real production environment and collected data using temperature, noise, and humidity sensors. The results demonstrated the applicability of the proposed comfort level indicator in a wire harness manufacturing setting, providing location-based information to enhance operator well-being. Overall, the proposed framework can be used as a tool to monitor the industrial environment, especially the well-being of shop floor operators.
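A position-tagged comfort indicator of this kind can be sketched as a weighted distance of each sensed parameter from an ideal value. The ideal values, tolerable spans and weights below are purely illustrative assumptions, not the calibrated indicator of the study.

```python
def comfort_index(temp_c, noise_db, humidity_pct,
                  ideal=(22.0, 45.0, 50.0), spans=(8.0, 30.0, 30.0),
                  weights=(0.4, 0.35, 0.25)):
    """Toy comfort indicator in [0, 1] for one position-tagged reading.

    Each sensed parameter is scored by its normalized distance from an
    (assumed) ideal value, and the scores are combined with (assumed)
    weights; 1.0 means fully comfortable, 0.0 fully uncomfortable.
    """
    readings = (temp_c, noise_db, humidity_pct)
    score = 0.0
    for value, ref, span, w in zip(readings, ideal, spans, weights):
        penalty = min(abs(value - ref) / span, 1.0)   # clip to [0, 1]
        score += w * (1.0 - penalty)
    return score

comfortable = comfort_index(22.0, 45.0, 50.0)   # reading at the ideal point
harsh = comfort_index(32.0, 85.0, 20.0)         # hot, loud and dry corner
```

Tagging each score with the IPS position then yields the spatial comfort map the framework uses to spot workspace areas needing intervention.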


Post date: 12 October 2023

Assessing human worker performance by pattern mining of Kinect sensor skeleton data

The human worker is an indispensable factor in manufacturing processes. Traditional observation methods to assess their performance are time-consuming and expert-dependent, and it is impossible to diagnose the detailed movement trajectory with the naked eye. Industry 4.0 technologies can innovate that process with smart sensors paired with data mining techniques for automated operation, building a database of frequent movements for corporate reference and improvement. This paper proposes an approach to automatically assess worker performance from skeleton data by applying pattern mining methods and supervised learning algorithms. A use case is performed on an electrical assembly line to validate the approach, with the skeleton data collected by a Kinect v2 sensor. By using supervised learning, the movements of workers at each workstation can be segmented, and the line performance can be assessed. The work movement motifs can be recognized with pattern mining. The mined results can be used to further improve the production processes in terms of work procedures, movement symmetry, body utilization, and other ergonomic factors for both short- and long-term human resource development. The promising result motivates further utilization of easy-to-adopt technology in Industry 5.0, which facilitates human-centric data-driven improvements.


Post date: 12 September 2023

Current development on the Operator 4.0 and transition towards the Operator 5.0: A systematic literature review in light of Industry 5.0

The technology-driven Industry 4.0 (I4.0) paradigm, combined with human-centrism, sustainability, and resilience orientation, forms the Industry 5.0 (I5.0) paradigm, providing support for the workforce and enabling the Operator 4.0 (O4.0) approach. The I5.0 focus areas can face unforeseen challenges, as the applicability and readiness of I4.0 solutions are still not well discussed in the literature. Therefore, structuring existing knowledge of O4.0 to prepare for a smooth transition toward Operator 5.0 (O5.0) is crucial. A systematic literature review was performed in the Scopus database, considering publications up to 31 December 2022. Bibliography Network Analysis (BNA), text mining techniques (i.e., Latent Dirichlet Allocation (LDA) and BERTopic), and a knowledge graph (KG) were deployed on the retrieved abstracts. A full-text examination was carried out on the papers matched by LDA and BERTopic. From the BNA result of 279 relevant papers, clusters of active researchers and topics were found, while the text-mining results revealed trending and missing research directions. The details extracted from the full text of 81 papers reflected the coverage and development levels of O4.0 types with respect to preparation for resilience, human-centrism, and sustainability. The results suggest that though the O5.0 transition is inevitable, I4.0 technologies are not ready with sufficient human factor integration. Missing research orientations include integrated sustainability from the human perspective and system resilience, concerning the drivers of and restrainers on technology adoption. To prepare for O5.0, the discussed O4.0 drivers can help to shape favorable conditions, and the restrainers should be mitigated before adopting human-centric technologies. A further study including grey literature is necessary to exploit more industrial and policy-making perspectives.


Post date: 04 August 2023

Sequence Compression and Alignment-Based Process Alarm Prediction

With the increasing complexity of production technologies, alarm management becomes more and more important in industrial process control. The overall safety of the plant relies heavily on the situation-aware response time of the staff. This kind of awareness has to be supported by a state-of-the-art alarm management system, which requires broad and up-to-date process-relevant knowledge. The proposed method provides a solution when such information is not fully available. With the utilization of machine learning algorithms, a real-time event scenario prediction can be gained by comparing the frequent event patterns extracted from historical event-log data with the actual online data stream. This study discusses an integrated solution, which combines sequence compression and sequence alignment to predict the most probable alarm progression. The effectiveness and limitations of the proposed method are tested using the data of an industrial delayed-coker plant. The results confirm that the presented parameter-free method identifies the characteristic patterns (operational states) and their progression with high confidence in real time, suggesting its wider adoption for sequence analysis.
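The alignment step can be illustrated with a longest-common-subsequence score between the online alarm stream and candidate frequent patterns, predicting the best-matching pattern's remaining tail. The patterns are hard-coded toys here (in the method they would come from compressing historical logs), and treating the matched events as a prefix is a simplifying assumption.

```python
def lcs_len(a, b):
    """Length of the longest common subsequence (classic alignment score)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def predict_progression(online, patterns):
    """Match the online alarm stream to the best-aligned frequent pattern
    and return that pattern together with its remaining (predicted) tail.

    Simplification: the matched events are assumed to form a prefix of
    the pattern, so the tail is everything after the matched length.
    """
    best = max(patterns, key=lambda p: lcs_len(online, p) / len(p))
    matched = lcs_len(online, best)
    return best, best[matched:]

patterns = [["A1", "A2", "A3", "A7"], ["B1", "B2", "B5"]]
best, predicted_tail = predict_progression(["A1", "x", "A2"], patterns)
```

Even with the unrelated alarm "x" interleaved, the stream aligns to the first pattern and the method predicts "A3" and "A7" as the likely progression.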


Post date: 28 June 2023

The human-centric Industry 5.0 collaboration architecture

While the primary focus of Industry 4.0 revolves around extensive digitalization, Industry 5.0, on the other hand, seeks to integrate innovative technologies with human actors, signifying an approach that is more value-driven than technology-centric. The key objectives of the Industry 5.0 paradigm, which were not central to Industry 4.0, underscore that production should not only be digitalized but also resilient, sustainable, and human-centric. This paper focuses on the human-centric pillar of Industry 5.0. The proposed methodology addresses the need for a human-AI collaborative process design and innovation approach to support the development and deployment of advanced AI-driven co-creation and collaboration tools. The method aims to solve the problem of integrating various innovative agents (human, AI, IoT, robot) in a plant-level collaboration process through a generic semantic definition, utilizing a time event-driven process. It also encourages the development of AI techniques for human-in-the-loop optimization, incorporating cross-checking with alternative feedback loop models. Benefits of this methodology include the Industry 5.0 collaboration architecture (I5arc), which provides new adaptable, generic frameworks, concepts, and methodologies for modern knowledge creation and sharing to enhance plant collaboration processes.


Post date: 18 June 2023

Machine learning-based soft-sensor development for road quality classification

Vibrations in road vehicles cause several harmful effects: health problems can occur for the passengers, and mechanical damage can occur to the vehicle components. Given the health, safety, and financial issues that arise, keeping the road network in good condition and detecting road defects as early as possible requires an extensive monitoring system. Related to this, our study presents the development of hardware and software for a low-cost, multi-sensor road quality monitoring system for passenger vehicles. The developed monitoring system can classify road sections into four classes according to their quality parameters. In order to detect vibrations in the vehicle, accelerometers and gyroscope sensors are installed at several points. Then, a machine learning-based soft-sensor development is introduced. Besides noise filtering, each data point is resampled by spatial frequency to reduce the velocity dependence. Subsequently, a decision tree-based classification model is trained using features from the power spectrum and principal component analysis. The classification algorithm is validated and tested with measurement data from a real-world environment. In addition to reviewing the accuracy of the model, we examine the correlation of the data measured in the cabin and on the suspension to see how much additional information is provided by the sensor on the axle.
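The speed-independent feature extraction can be sketched as band powers of the spectrum computed over spatial frequency (cycles per metre) rather than time. The band edges, sample spacing and signals below are illustrative assumptions; a decision tree would then be trained on such features.

```python
import numpy as np

def band_power_features(accel, sample_spacing_m=0.05,
                        bands=((0.0, 2.0), (2.0, 10.0))):
    """Band powers of a vibration signal resampled over distance.

    Using spatial frequency (cycles per metre) instead of time removes
    the dependence on vehicle speed; the band edges are illustrative.
    """
    spectrum = np.abs(np.fft.rfft(accel)) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=sample_spacing_m)  # cycles per metre
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands]

# Smooth road: slow undulation; rough road: added short-wavelength bumps
x = np.arange(256) * 0.05                      # positions in metres
smooth = np.sin(2 * np.pi * 0.5 * x)           # 0.5 cycles per metre
rough = smooth + 2.0 * np.sin(2 * np.pi * 6.0 * x)
f_smooth = band_power_features(smooth)
f_rough = band_power_features(rough)
```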


Post date: 9 June 2023

Identifying the links among urban climate hazards, mitigation and adaptation actions and sustainability for future resilient cities

Comprehensive and objective assessment methods need to be developed to create inclusive, safe, resilient and sustainable cities. Monitoring the evolution of sustainability and well-being in cities is important for researchers implementing the UN 2030 Agenda. This research explores and analyzes climate change hazards, adaptation and mitigation actions and their implementation in 776 cities located in 84 different countries. The climate action co-benefits support the achievement of the sustainable development goals, which is comprehensively elaborated in this methodological development. The analyses are carried out based on the continuously updated Carbon Disclosure Project (CDP) database. An open-source algorithm has been developed that represents the CDP database as a bit table and uses frequent itemset mining for the identification of global patterns of climate hazards, mitigation and adaptation actions and their co-benefits; therefore, this paper offers an exploratory analysis tool that is suitable for monitoring climate actions. The most frequently identified mitigation actions in cities were energy planting (1444 actions) and on-site renewable production (644), while the most common actions for adaptation were tree planting (283) and flood mapping (267). Regarding city size, 41% of large metropolitan areas plan to develop mass transit actions, while the separate collection of recyclables is typical in 85% of towns. 56.2% of the CDP database actions support the sustainable cities and communities goal (SDG11) and 54.2% the climate action goal (SDG13), while affordable and clean energy (SDG7) and the gender equality goal (SDG5) appear in below 5%.
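The bit-table-plus-frequent-itemset idea can be sketched with a brute-force miner over toy city records; at CDP scale Apriori-style pruning would be required. The hazard and action names are illustrative.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Brute-force frequent itemset mining over a small bit-table-like dataset.

    Each transaction is the set of hazards/actions reported by one city;
    an itemset is frequent if it appears in at least min_support cities.
    (Apriori-style pruning would be needed at CDP scale; this is a sketch.)
    """
    items = sorted(set().union(*transactions))
    frequent = {}
    for size in range(1, len(items) + 1):
        found = False
        for combo in combinations(items, size):
            s = set(combo)
            support = sum(1 for t in transactions if s <= t)
            if support >= min_support:
                frequent[combo] = support
                found = True
        if not found:        # no frequent itemset of this size -> stop growing
            break
    return frequent

cities = [{"heat", "flood", "tree_planting"},
          {"heat", "tree_planting"},
          {"flood", "flood_mapping"},
          {"heat", "flood", "flood_mapping"}]
patterns = frequent_itemsets(cities, min_support=2)
```

A frequent pair such as ("flood", "flood_mapping") is exactly the kind of global hazard-to-action pattern the analysis extracts from the CDP bit table.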


Post date: 9 June 2023

Az információmenedzsment szerepe az ABV-védelemben (The role of information management in CBRN defense)

The detection of nuclear, biological and chemical (NBC; Hungarian: ABV) incidents is a task of paramount importance that has been intensively researched for decades. Continuous advances in technology, data processing and automation keep opening up new development potential in NBC defense, which has by now become a complex, interdisciplinary field. Accordingly, chemists, physicists, meteorologists, military experts, programmers and data scientists all contribute to the research. The key to effectively increasing Hungary's NBC defense capabilities also lies in continuous and targeted development along a well-structured concept. The aim of our research is to provide an overview of the main components of modern NBC defense technologies, to summarize the conceptual requirements of NBC reconnaissance and the related decision-support steps, and to present the role and latest possibilities of information management in these processes.


Post date: 19 May 2023

Matrix factorization-based multi-objective ranking–What makes a good university?

Non-negative matrix factorization (NMF) efficiently reduces high dimensionality for many-objective ranking problems. In multi-objective optimization, as long as only three or four conflicting viewpoints are present, an optimal solution can be determined by finding the Pareto front. When the number of objectives increases, the multi-objective problem evolves into a many-objective optimization task, where the Pareto front becomes oversaturated. The key idea is that NMF aggregates the objectives so that the Pareto front can be applied, while the Sum of Ranking Differences (SRD) method selects the objectives that have a detrimental effect on the aggregation, and validates the findings. The applicability of the method is illustrated by the ranking of 1176 universities based on 46 variables of the CWTS Leiden Ranking 2020 database. The performance of NMF is compared to principal component analysis (PCA) and sparse non-negative matrix factorization-based solutions. The results illustrate that PCA incorporates negatively correlated objectives into the same principal component. On the contrary, NMF only allows non-negative correlations, which enables the proper use of the Pareto front. With the combination of NMF and SRD, a non-biased ranking of the universities based on the 46 criteria is established, with Harvard, Rockefeller and Stanford Universities as the first three. To evaluate the ranking capabilities of the methods, measures based on Relative Entropy (RE) and Hypervolume (HV) are proposed. The results confirm that the sparse NMF method provides the most informative ranking. The results highlight that academic excellence can be improved by decreasing the proportion of unknown open-access publications and short-distance collaborations. The proportion of gender indicators barely correlates with scientific impact. On average, more authors, long-distance collaborations, and publications with higher scientific impact and more citations strongly influence a university's ranking in a positive direction.
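Why aggregation makes the Pareto front usable again can be illustrated directly: with only a few non-negative aggregated components, the set of non-dominated alternatives stays small. The sketch below finds the front for toy two-component scores; the NMF aggregation step itself is omitted.

```python
def pareto_front(points):
    """Indices of non-dominated points (larger is better on every objective).

    A point is dominated if some other point is at least as good on every
    component; with many raw objectives almost nothing is dominated, which
    is why the paper first aggregates objectives into a few components.
    """
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] >= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Universities scored on two aggregated components (e.g. impact, collaboration)
scores = [(0.9, 0.2), (0.7, 0.7), (0.2, 0.9), (0.5, 0.5), (0.1, 0.1)]
front = pareto_front(scores)
```

Here only three of the five alternatives are non-dominated, so the front still discriminates, which is the property the NMF aggregation restores.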


Post date: 13 April 2023 

Frequent pattern mining-based log file partition for process mining

Process mining is a technique for exploring models based on event sequences, growing in popularity in the process industry. Process mining algorithms assume that the processed log files contain events generated by only one unknown process, which can lead to extremely complex and inaccurate models when this assumption is not met. To address this issue, this article proposes a frequent pattern mining-based method for log file partitioning, allowing for the exploration of parallel processes. The key idea is that frequent pattern mining can identify grouped events and generate sub-logs of overlapping sub-processes. Thanks to the pre-processing of the log files, more compact and interpretable process models can be identified. We developed a set of goal-oriented metrics to evaluate the complexity of process mining problems and the resulting models. The applicability and effectiveness of the method are demonstrated in the analysis of process alarms of an industrial plant. The results confirm that the proposed method enables the discovery of targeted sub-process models by partitioning the log file using frequent pattern mining, and the effectiveness of the method increases with the number of parallel processes stored in the same log file. We recommend applying the method in every case where there is no clear start and end of the logged events so that the log file can describe different processes.
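The partitioning idea can be sketched by grouping event types that frequently co-occur in a sliding window and emitting one sub-log per group. This window-based co-occurrence is a crude stand-in for the paper's frequent pattern mining, and the log is a toy with two non-interleaved sub-processes.

```python
from collections import Counter
from itertools import combinations

def partition_log(log, window=3, min_support=3):
    """Split one event log into sub-logs of frequently co-occurring event types.

    Event types that co-occur inside a sliding window at least min_support
    times are merged into the same group (union-find style); each group
    then defines one sub-log for separate process discovery.
    """
    pairs = Counter()
    for i in range(len(log) - window + 1):
        for a, b in combinations(sorted(set(log[i:i + window])), 2):
            pairs[(a, b)] += 1
    parent = {e: e for e in set(log)}
    def find(e):
        while parent[e] != e:
            e = parent[e]
        return e
    for (a, b), count in pairs.items():
        if count >= min_support:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb        # merge the two groups
    groups = {}
    for e in set(log):
        groups.setdefault(find(e), set()).add(e)
    return [[e for e in log if e in members] for members in groups.values()]

# Two sub-processes logged back to back: A1->A2 and B1->B2
log = ["A1", "A2", "A1", "A2", "B1", "B2", "B1", "B2"]
sublogs = sorted(partition_log(log))
```

Each sub-log can then be fed to a process discovery algorithm on its own, yielding two compact models instead of one tangled "spaghetti" model.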


Post date: 03 April 2023 

Surface Water Monitoring Systems—The Importance of Integrating Information Sources for Sustainable Watershed Management

The complex interactions of anthropogenic activities, climate change, sedimentation and the input of wastewater have significantly affected the aquatic environment and the entire ecosystem. Over the years, researchers have investigated water monitoring approaches, from traditional monitoring to integrated systems, to handle such environmental assessment and predictions based on warning systems. However, research into the selection and optimization of water monitoring systems through a combination of parallel approaches in terms of sampling techniques, process analysis and results is limited. The research objectives of the present study are to evaluate the existing water monitoring systems based on the latest approaches and then provide insights into the factors affecting sensor implementation at sampling locations. Here we summarize the advancement and trends of various water monitoring systems as well as the suitability of sensor placement by reviewing more than 300 papers published between 2011 and 2022. The research highlights the urgency of an integrative approach to water monitoring systems, including both water quality and water quantity models. A framework is proposed to incorporate all water monitoring approaches, sampling techniques, and predictive models to provide comprehensive information for environmental assessment. It was observed that model-based approaches, serving as verification and data fusion layers, have the ability to improve the performance of the systems. Furthermore, integrated systems that include a separate modeling approach through integrated, semi-mechanistic models, data science and artificial intelligence are recommended for the future. Overall, this study provides guidelines for achieving standardized water management by implementing integrated water monitoring systems.


Post date: 31 March 2023

Goal-Oriented Tuning of Particle Filters for the Fault Diagnostics of Process Systems

This study introduces particle filtering (PF) for the tracking and fault diagnostics of complex process systems. In process systems, model equations are often nonlinear and environmental noise is non-Gaussian. We propose a method for state estimation and fault detection in a wastewater treatment system. The contributions of the paper are the following: (1) A method is suggested for sensor placement based on the state estimation performance; (2) based on the sensitivity analysis of the particle filter parameters, a tuning method is proposed; (3) a case study is presented to compare the performances of the classical PF and intelligent particle filtering (IPF) algorithms; (4) for fault diagnostics purposes, bias and impact sensor faults were examined; moreover, the efficiency of fault detection was evaluated. The results verify that particle filtering is applicable and highly efficient for tracking and fault diagnostics tasks in process systems.
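The core of the tracking step above can be conveyed with a minimal bootstrap (SIR) particle filter. The sketch below assumes an illustrative scalar random-walk state model and direct noisy observations, not the wastewater-treatment model or the tuning method of the paper; all parameter values are assumptions for demonstration.

```python
import random
import math

def bootstrap_pf(measurements, n_particles=500, process_std=0.5, meas_std=1.0, seed=1):
    """Track a scalar state with a bootstrap (SIR) particle filter.

    Assumes a random-walk state model x_k = x_{k-1} + w_k and a direct,
    noisy observation y_k = x_k + v_k (illustrative choices, not the
    process model used in the paper).
    """
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in measurements:
        # 1) Propagate each particle through the state-transition model.
        particles = [p + rng.gauss(0.0, process_std) for p in particles]
        # 2) Weight particles by the Gaussian measurement likelihood.
        weights = [math.exp(-0.5 * ((y - p) / meas_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # 3) State estimate: weighted mean of the particle cloud.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # 4) Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Track a slowly drifting state observed through noise.
true_states = [0.1 * k for k in range(30)]
obs_rng = random.Random(0)
ys = [x + obs_rng.gauss(0.0, 1.0) for x in true_states]
est = bootstrap_pf(ys)
```

Tuning in this setting means choosing `n_particles`, `process_std` and `meas_std`; the paper's sensitivity-analysis-based approach selects such parameters with the estimation goal in mind.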


Post date: 09 March 2023

Data sharing in Industry 4.0 - AutomationML, B2MML and International Data Spaces-based solutions

The concept of a data ecosystem and Industry 4.0 requires high-level vertical and horizontal interconnectivity across the entire value chain. Its successful realization demands standardized data models to ensure transparent, secure and widely integrable data sharing within and between enterprises. This paper provides a PRISMA method-based systematic review of data sharing in Industry 4.0 via AutomationML, B2MML and International Data Spaces-based solutions. The interconnection of these data models and the ISA-95 standard is emphasized. The review describes the major application areas of these standards and their extensions, as well as supporting technologies and their contribution towards horizontal integration and data ecosystems. It highlights how much value interconnected, exchanged and shared data have gained in recent years. Standardized data sharing mechanisms enable real-time, flexible and transparent communication, features that have become top requirements for gaining a competitive advantage. However, to foster the shift from within-company data communication towards a data ecosystem, IT- and people-oriented cultures must be well established to ensure data protection and digital trust. We believe that this review of standardized data exchange and sharing solutions can contribute to the development and design of Industry 4.0-related systems, as well as support related scientific research.


Post date: 03 March 2023

3D Scanner-Based Identification of Welding Defects—Clustering the Results of Point Cloud Alignment

This paper describes a framework for detecting welding errors using 3D scanner data. The proposed approach employs density-based clustering to compare point clouds and identify deviations. The discovered clusters are then classified according to standard welding fault classes. Six welding deviations defined in the ISO 5817:2014 standard were evaluated. All defects were represented through CAD models, and the method was able to detect five of these deviations. The results demonstrate that the errors can be effectively identified and grouped according to the location of the different points in the error clusters. However, the method cannot separate crack-related defects as a distinct cluster. 
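The density-based clustering idea behind grouping point-cloud deviations can be sketched with a minimal DBSCAN pass. This is a generic illustration on hypothetical 2-D points, not the paper's 3-D pipeline; `eps` and `min_pts` values are assumptions.

```python
from collections import deque

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN: label each 2-D point with a cluster id, or -1 for noise."""
    def neighbors(i):
        x, y = points[i]
        return [j for j, (a, b) in enumerate(points)
                if (a - x) ** 2 + (b - y) ** 2 <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1           # provisionally noise
            continue
        labels[i] = cluster
        queue = deque(seeds)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cluster  # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:   # j is itself a core point: expand
                queue.extend(nb)
        cluster += 1
    return labels

# Two dense deviation clusters plus one isolated outlier point.
pts = [(0, 0), (0.3, 0.1), (0.1, 0.4), (5, 5), (5.2, 5.1), (4.9, 5.3), (10, 10)]
labels = dbscan(pts, eps=1.0, min_pts=3)
```

In the welding context, each dense cluster of deviating points is a candidate defect region that can then be classified against the standard fault classes; isolated points are treated as noise.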

Post date: 01 March 2023

Multi-objective hierarchical clustering for tool assignment

Due to the limited tool magazine capacities of CNC machines, time-consuming tool changeovers result in inefficient equipment utilization. This study provides a method to minimize the changeovers by optimizing the allocation of the tools to the machines. The proposed algorithm is efficient as it approaches the tool assignment task as a multi-objective hierarchical clustering problem where the products are grouped based on the similarity of the tool demands. The novelty of the goal-oriented agglomerative clustering algorithm is that it is based on the Pareto optimal selection of the merged clusters. The applicability of the method is demonstrated through an industrial case study. The tool assignment problem has also been formulated as a bin-packing optimization task, and the results of the related linear programming were used as a benchmark reference. The comparison highlighted that the proposed method provides a feasible solution for large real-life problems with low computation time. 
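The grouping idea can be illustrated with a crisp, single-objective agglomerative pass over hypothetical products and tool sets. The Pareto-optimal merge selection that is the paper's novelty is deliberately omitted here; similarity is plain Jaccard on tool demands, and all names are made up.

```python
def jaccard(a, b):
    """Similarity of two tool sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def agglomerate(tool_demands, n_clusters):
    """Greedy agglomerative clustering of products by tool-demand similarity.

    Each cluster keeps the union of its members' tool sets, so a merged
    cluster tells us which tools one machine would need to serve the group.
    (A crisp single-objective sketch; the Pareto-optimal merge selection
    of the paper is not reproduced.)
    """
    clusters = [({name}, set(tools)) for name, tools in tool_demands.items()]
    while len(clusters) > n_clusters:
        # Find the most similar pair of clusters and merge them.
        i, j = max(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: jaccard(clusters[ab[0]][1], clusters[ab[1]][1]),
        )
        names = clusters[i][0] | clusters[j][0]
        tools = clusters[i][1] | clusters[j][1]
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append((names, tools))
    return clusters

# Hypothetical products and the CNC tools each one requires.
demands = {"P1": {"T1", "T2"}, "P2": {"T1", "T2", "T3"}, "P3": {"T7", "T8"}, "P4": {"T7", "T9"}}
groups = agglomerate(demands, n_clusters=2)
```

Products with overlapping tool demands end up on the same machine, which is exactly what reduces the number of tool changeovers.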


Post date: 18 February 2023

Last Chance to Submit Your Paper to CoDIT'23: Extended Deadline Announced 

The CoDIT’23 conference is the ninth edition in the series of the International Conference on Control, Decision and Information Technologies.

It will be held on 3–6 July 2023 in Rome, Italy.

The paper submission deadline has been extended until February 28, 2023.

The conference's purpose is to be a forum for technical exchange amongst scientists with interests in Control, Automation, Robotics, Optimization, Decision, Cybernetics, Computer Science and Information Technologies. The conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions and discuss future research directions. The technical program will include plenary lectures, regular technical sessions, and special sessions.

For more information, please visit the website.


Post date: 01 February 2023

Tuan-anh Tran achieved an excellent ranking at the IEEE HS Student Paper Contest

Every year, the IEEE Hungary Section (IEEE HS) announces a "Student Paper Contest" for students from higher education institutions. Tuan-anh Tran was awarded third place with a paper entitled "Retrofitting-Based Development of Brownfield Industry 4.0 and Industry 5.0 Solutions".


Post date: 05 January 2023

Demonstration Laboratory of Industry 4.0 Retrofitting and Operator 4.0 Solutions: Education towards Industry 5.0

One of the main challenges of Industry 4.0 is how advanced sensors and sensing technologies can be applied through the Internet of Things layers of existing manufacturing. This is the so-called Brownfield Industry 4.0, where the different types and ages of machines and processes need to be digitalized. Smart retrofitting is the umbrella term for solutions to show how we can digitalize manufacturing machines. This problem is critical in the case of solutions to support human workers. The Operator 4.0 concept shows how we can efficiently support workers on the shop floor. The key indicator is the readiness level of a company, and the main bottleneck is the technical knowledge of the employees. This study proposes an education framework and a related Operator 4.0 laboratory that prepares students for the development and application of Industry 5.0 technologies. The concept of intelligent space is proposed as a basis of the educational framework, which can solve the problem of monitoring the stochastic nature of operators in production processes. The components of the intelligent space are detailed through the layers of the IoT in the form of a case study conducted at the laboratory. The applicability of indoor positioning systems is described with the integration of machine-, operator- and environment-based sensor data to obtain real-time information from the shop floor. The digital twin of the laboratory is developed in a discrete event simulator, which integrates the data from the shop floor and can control the production based on the simulation results. The presented framework can be utilized to design education for the generation of Industry 5.0.


Post date: 03 January 2023

Hypergraph and network flow-based quality function deployment

Quality function deployment (QFD) is a widely acknowledged tool for translating customer requirements into quality product characteristics, based on which product development strategies and focus areas are identified. However, while the QFD method considers the correlations and effects between development parameters, these are not directly incorporated into the importance ranking of development actions. Therefore, the cross-relationships between development parameters and their impact on customer requirement satisfaction are often neglected. The primary objective of this study is to make decision-making more reliable by improving QFD with methods that optimize the selection of development parameters even under capacity or cost constraints, directly incorporate the cross-relationships between development parameters, and support the visual identification of interactions. Therefore, QFD is approached from two directions that have proved efficient in operations research. 1) QFD is formulated as a network flow problem with two objectives: maximizing the benefits of satisfying customer needs using linear optimization, or minimizing the total cost of actions while still meeting customer requirements using a minimum-cost-flow assignment approach. 2) QFD is represented as a hypergraph, which allows an efficient representation of the interactions of the relationship and correlation matrices and the determination of essential factors based on centrality metrics. The applicability of the methods is demonstrated through an application study on the sustainable design of consumer electronic products, which highlights the contribution of the improvements to different development strategies: linear optimization performed best in maximizing the satisfaction of customer requirements, the minimum-cost-flow assignment approach minimized the total cost, while the hypergraph-based representation identified the indirect interactions of development parameters and customer requirements.

Post date: 14 December 2022 

Expert-Based Modular Simulator for Municipal Waste Processing Technology Design

One of the significant problems in our society is the handling and processing of the vast amount of waste produced by households and industrial processes. Nowadays, packaging material regulations are constantly changing, which can significantly impact the quality of municipal waste, requiring the continuous development and redesign of waste processing plants. Since only a few uncertain measurements (composition, mass, etc.) are available for this task, analysing and redesigning waste processing technologies is challenging. This research aims to develop a modelling and simulation concept that can integrate all the available information and also handle the uncertainty of the measurements. The proposed modular modelling framework can serve as a basis for designing and redesigning the technologies needed to process ever-changing municipal waste. The most important steps of the framework are as follows: identifying the typical equipment items, which form the elements; building models of the elements; determining the characteristic parameters of the equipment; and exploring the possible relationships between the elements. The information needed to define the model parameters can be gathered from measurements, industrial experience, and expert knowledge; in many cases, the data obtained represent ranges. The stationary model framework applies efficiency factors and divides the solids into substreams based on expert knowledge. Furthermore, a modular simulator framework was developed to simulate technological schemes with various connections. The specifications of all widely used waste industrial equipment (shredders, air separators, sieves, and magnetic, eddy current, optical, and ballistic separators) were used to construct the simulator, which can open new opportunities for the design of waste sorting technological networks. The model was calibrated based on expertise gained from operating the studied technology.
Changes in the material parameters can be considered, and the modular simulator can lead to flexible waste sorting technologies capable of adapting to changes in governmental and environmental regulations. The main result of the work is a proposed methodology for designing a modular simulator, together with model development and a validation method, which provides the means to deal with uncertainty. All this is demonstrated through the analysis of an operating waste separation system.
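A single simulator module of the kind described above can be sketched as a mass-balance split driven by per-material efficiency factors. The material names and efficiency values below are illustrative assumptions, not data from the studied plant.

```python
def separate(stream, efficiency):
    """Split a mixed waste stream (kg per material) into accept/reject
    substreams using per-material transfer efficiencies, in the spirit of
    the stationary, efficiency-factor-based modules described above.
    Material names and efficiency values are illustrative assumptions.
    """
    accept = {m: mass * efficiency.get(m, 0.0) for m, mass in stream.items()}
    reject = {m: mass - accept[m] for m, mass in stream.items()}
    return accept, reject

# A hypothetical air separator that carries most light fractions into the
# accept (light) stream and leaves heavy fractions in the reject stream.
infeed = {"paper": 100.0, "plastic_film": 40.0, "glass": 60.0}
eff = {"paper": 0.9, "plastic_film": 0.85, "glass": 0.05}
light, heavy = separate(infeed, eff)
```

Chaining such modules with different efficiency tables is what lets the simulator evaluate alternative sorting-technology schemes, with mass conserved at every node.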

Post date: 08 December 2022

Tamás Ruppert has been awarded the VEAB Outstanding Young Researchers Award in the "engineering sciences" category

The prize is awarded to young researchers who have made significant achievements in the field of engineering or the living and non-living sciences.

The award ceremony will take place at the VEAB Headquarters of the Hungarian Academy of Sciences in a lecture session on 7 December 2022 at 14:00.

Post date: 25 November 2022 

Researchers of Abonyilab achieved excellent ranking at the Institutional Scientific Student Conference

The University of Pannonia held the Institutional Scientific Student Conference on November 23, 2022. Éva Kenyeres and Ádám Ipkovich were awarded first place, while Gergely Lajos Halász took second place; they can therefore participate in the National Scientific Student Conference. Mónika Gugolya was awarded third place.

Éva Kenyeres participated in the Faculty of Engineering – Engineering Sciences section with a paper entitled: Goal-oriented particle filter state estimation algorithm-based fault diagnostics of process systems.

Ádám Ipkovich participated in the Faculty of Engineering, Chemical and Chemical Industry – Modeling section with a paper entitled: Iterative Identifiability Analysis of Composite Material Failure Models.

Gergely Lajos Halász participated in the Faculty of Engineering – Engineering Sciences section with a paper entitled: Estimation of the operator comfort level and the layout information based on sensor fusion techniques.

Mónika Gugolya participated in the Faculty of Engineering, Chemical and Chemical Industry – Modeling section with a paper entitled: Collaborative work scheduling between humans and robots.


Post date: 25 November 2022 

Machine learning-based software sensors for machine state monitoring - The role of SMOTE-based data augmentation

A method for flexible vibration sensor-based retrofitting of CNC machines is proposed. As different states leave different fingerprints in the power spectrum plane, the states of the machine can be distinguished based on the features extracted from the spectrum map. Because some states, such as tool replacement, are less frequent than others, such as the production state, monitoring the machine states is treated as an imbalanced classification problem. The key idea is to use the Borderline Synthetic Minority Oversampling Technique (Borderline-SMOTE) to augment the data set. The concept is validated in an industrial case study. Soft sensors based on four machine learning algorithms, with and without SMOTE, were implemented to predict the states of the machine. The results show that SMOTE-based data augmentation improved the performance of the models by 50%.
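The interpolation step at the heart of SMOTE can be sketched in a few lines. This is plain SMOTE on hypothetical 2-D feature vectors; the Borderline variant used in the paper additionally restricts oversampling to minority samples near the class boundary, which is omitted here.

```python
import random
import math

def smote(minority, n_new, k=2, seed=0):
    """Generate synthetic minority samples by interpolating between each
    sample and one of its k nearest minority neighbors (plain SMOTE; the
    Borderline variant additionally restricts this to samples near the
    class boundary).
    """
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbors of x (excluding x itself).
        nn = sorted((p for p in minority if p is not x),
                    key=lambda p: math.dist(x, p))[:k]
        z = rng.choice(nn)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, z)))
    return synthetic

# Hypothetical spectral features of a rare machine state (e.g. tool replacement).
minority = [(1.0, 2.0), (1.2, 1.9), (0.9, 2.2), (1.1, 2.1)]
new_samples = smote(minority, n_new=5)
```

Because each synthetic point lies on a segment between two real minority samples, the augmented set stays inside the minority region of the feature space instead of duplicating existing points.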


Post date: 21 November 2022 

Cooperation patterns in the ERASMUS student exchange network: an empirical study

The ERASMUS program is the most extensive cooperation network of European higher education institutions. The network involves 90% of European universities and hundreds of thousands of students, and the allocated budget and number of travelers in the program are growing yearly. By considering the interconnection of institutions, the study asks how the program's budget performs, whether the program can achieve its expected goals, and how it contributes to the development of a European identity, to interactions among young people from different countries and to learning among cultures. Our goal was to review and explore the elements of network structure that can be used to understand the complexity of the whole ERASMUS student mobility network at the institutional level. The results suggest some socioeconomic and individual behavioral factors underpinning the emergence of the network. Although the nodes are spatially distributed, geographical distance does not play a decisive role in the network's structure, although parallel travelling strategies exist, i.e., preferences for short- and long-distance mobility. The European regions of the home and host countries also affect the network. One of the most considerable driving forces of edge formation between institutions is the set of subject areas represented by the participating institutions. The study finds that it is the faculties of institutions, rather than the institutions themselves, that are connected, and a multilayer network model is suggested to explore the mechanisms of those connections. The results indicate that the information uncovered by the study is helpful to scholars and policymakers.


Post date: 27 October 2022 

Sectoral Analysis of Energy Transition Paths and Greenhouse Gas Emissions

The Paris Climate Agreement and the 2030 Agenda for Sustainable Development Goals declared by the United Nations set high expectations for the countries of the world to reduce their greenhouse gas (GHG) emissions and to be sustainable. In order to judge the effectiveness of strategies, the evolution of carbon dioxide, methane, and nitrous oxide emissions in countries around the world has been explored based on statistical analysis of time-series data between 1990 and 2018. The empirical distributions of the variables were determined by the Kaplan–Meier method, and improvement-related utility functions have been defined based on the European Green Deal target for 2030, which aims to decrease GHG emissions by at least 55% compared to 1990 levels. This study aims to analyze the energy transition trends at the country and sectoral levels and underline them with literature-based evidence. The transition trajectories of the countries are studied based on percentile-based time-series analysis of the emission data. We also study the evolution of the sector-wise distributions of the emissions to assess how the development strategies of the countries have contributed to climate change mitigation. Furthermore, each country's location on its transition trajectory is determined based on its individual Kuznets curve. Runs and Leybourne–McCabe statistical tests are also evaluated to study how systematic the changes are. Based on the proposed analysis, the main drivers of climate mitigation and their effectiveness were identified and characterized, forming the basis for planning sectoral tasks in the coming years. The case study analyzes two countries, Sweden and Qatar. Sweden has reduced its per capita emissions by almost 40% since 1990, while Qatar has increased its emissions by 20%. Moreover, the defined improvement-related variables can highlight the largest increases and decreases in different respects.
The highest increase was reached by Equatorial Guinea, and the most significant decrease was achieved by Luxembourg. The integration of sustainable development goals, carbon capture, carbon credits and carbon offsets into the databases establishes a better understanding of the sectoral challenges of energy transition and strategy planning, which can be adapted to the proposed method.
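In the absence of censored observations, the Kaplan–Meier estimate of the survival function reduces to one minus the empirical distribution function, so the percentile-based idea above can be conveyed with a plain ECDF. The emission values below are illustrative, not data from the study.

```python
def empirical_cdf(values):
    """Empirical distribution function: the fraction of observations <= x.

    Without censoring, the Kaplan-Meier estimator reduces to 1 - ECDF, so
    this sketch conveys the idea behind the percentile-based trajectory
    analysis (the emission values used here are illustrative).
    """
    xs = sorted(values)
    n = len(xs)

    def cdf(x):
        return sum(1 for v in xs if v <= x) / n

    return cdf

# Hypothetical per capita emission values for a set of countries.
emissions = [2.1, 4.5, 6.3, 7.8, 9.0, 12.4, 15.2, 20.0]
F = empirical_cdf(emissions)
```

A country's position in this distribution, tracked year by year, is what yields the percentile-based transition trajectories discussed above.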


Post date: 25 October 2022 

Interview with Tamás Ruppert about the relationship between human, robot and intelligent space today and in the future 

The Hungarian magazine Gyártástrend published an interview with Tamas Ruppert about the next steps in the collaboration between humans and robots.

The detailed interview is available here.


Post date: 13 October 2022 

Introducing the Operator 4.0 laboratory on the night of the scientists' event

This year, we introduced our Operator 4.0 laboratory during the night of the scientists' event at the University of Pannonia. The demonstration covered collaboration, the digital twin, and human-centered solutions.

More details about the Operator 4.0 laboratory are available here.


Post date: 13 October 2022 

Goal-oriented possibilistic fuzzy C-Medoid clustering of human mobility patterns—Illustrative application for the Taxicab trips-based enrichment of public transport services

The discovery of human mobility patterns of cities provides invaluable information for decision-makers who are responsible for redesign of community spaces, traffic, and public transportation systems and building more sustainable cities. The present article proposes a possibilistic fuzzy c-medoid clustering algorithm to study human mobility. The proposed medoid-based clustering approach groups the typical mobility patterns within walking distance to the stations of the public transportation system. The departure times of the clustered trips are also taken into account to obtain recommendations for the scheduling of the designed public transportation lines. The effectiveness of the proposed methodology is revealed in an illustrative case study based on the analysis of the GPS data of Taxicabs recorded during nights over a one-year-long period in Budapest.
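The reason medoids (rather than means) matter here can be shown with the simplest crisp relative of the proposed algorithm: brute-force k-medoids, where each cluster center is an actual data point, e.g. a real trip origin near a station. This sketch omits the possibilistic and fuzzy memberships of the paper, and the coordinates are hypothetical.

```python
import math
from itertools import combinations

def k_medoids(points, k):
    """Exhaustive k-medoids: choose the k data points that minimize the
    total distance of every point to its nearest medoid. (A crisp,
    brute-force relative of the paper's possibilistic fuzzy c-medoid
    algorithm -- practical only for tiny examples, but it shows why
    medoids are useful: cluster centers are real observations, not
    abstract averages.)
    """
    def cost(medoids):
        return sum(min(math.dist(p, m) for m in medoids) for p in points)

    best = min(combinations(points, k), key=cost)
    assignment = [min(best, key=lambda m: math.dist(p, m)) for p in points]
    return best, assignment

# Hypothetical trip start locations around two public transport hubs.
pts = [(0, 0), (0.2, 0.1), (0.1, 0.3), (4, 4), (4.1, 3.8), (3.9, 4.2)]
medoids, assign = k_medoids(pts, k=2)
```

In the transport application, each medoid is an actual location, so it can be matched directly against walking distance to existing stations; the fuzzy and possibilistic memberships of the full algorithm additionally express how typical or atypical each trip is.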


Post date: 10 October 2022 

Simultaneous Process Mining of Process Events and Operator Actions for Alarm Management

Alarm management is an important task to ensure the safety of industrial process technologies. A well-designed alarm system can reduce the workload of operators while supporting production, which is in line with the approach of Industry 5.0. Using process mining tools to explore operator-related event scenarios requires a goal-oriented log file format that contains the start and end of the alarms along with the triggered operator actions. The key contribution of the work is a method that transforms the historical event data of control systems into goal-oriented log files used as inputs of process mining algorithms. The applicability of the proposed process mining-based method is presented through the analysis of a hydrofluoric acid alkylation plant. Detailed application examples illustrate how the extracted process models can be interpreted and utilized. The results confirm that applying the tools of process mining in alarm management requires a goal-oriented log-file design.
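The log transformation idea can be sketched as pairing each alarm's start and end records into a case and attaching the operator actions raised in between. The event schema (time, kind, tag) and the tag names below are hypothetical, not the control-system format used in the paper.

```python
def build_cases(events):
    """Group a flat, time-ordered control-system event stream into cases.

    Each case starts at an alarm's start record, collects the operator
    actions raised while that alarm is active, and closes at its end
    record. The (time, kind, tag) fields are hypothetical, not the schema
    used in the paper.
    """
    open_cases = {}  # alarm tag -> list of events in the running case
    cases = []
    for time, kind, tag in events:
        if kind == "alarm_start":
            open_cases[tag] = [(time, "alarm_start", tag)]
        elif kind == "alarm_end" and tag in open_cases:
            case = open_cases.pop(tag)
            case.append((time, "alarm_end", tag))
            cases.append(case)
        elif kind == "operator_action":
            # Attribute the action to every alarm currently active.
            for case in open_cases.values():
                case.append((time, "operator_action", tag))
    return cases

stream = [
    (1, "alarm_start", "TI101-HI"),
    (2, "operator_action", "ack"),
    (3, "operator_action", "setpoint_change"),
    (4, "alarm_end", "TI101-HI"),
    (5, "alarm_start", "PI202-LO"),
    (6, "alarm_end", "PI202-LO"),
]
cases = build_cases(stream)
```

Each resulting case is one trace in the event log, which is exactly the unit that process mining algorithms consume to discover alarm-handling patterns.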


Post date: 29 September 2022

Indicators for climate change-driven urban health impact assessment

Climate change can cause multiple potential health issues in urban areas, the environment most susceptible to the presently increasing climate volatility. Urban greening strategies form an important part of the adaptation strategies that can ameliorate the negative impacts of climate change. We aimed to study the potential impacts of different kinds of greening against the adverse effects of climate change, including waterborne and vector-borne diseases, heat-related mortality, and surface ozone concentration, in a medium-sized Hungarian city. As greening strategies, large and pocket parks were considered, based on our novel location-identifier algorithm for climate risk minimization.

A method based on publicly available data sources, including satellite pictures, climate scenarios and the urban macrostructure, has been developed to evaluate health-related indicator patterns in cities. The modelled current and future patterns of the indicators have been compared. The results can support the understanding of the possible future state of the studied indicators and the development of adequate greening strategies.

Another outcome of the study is that it is not the type of health indicator but its climate sensitivity that determines the extent to which it responds to temperature rises and how effective greening strategies are in addressing the expected problem posed by the factor.


Post date: 16 September 2022

Operator 4.0 community met at the ETFA conference in Stuttgart

The 27th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA) was held from 6 to 9 September 2022 in Stuttgart, Germany. During the special session organized by Tamas Ruppert at the conference, the Operator 4.0 research community finally met in person. The members came to Stuttgart from seven different countries.

More information about the special session entitled "Industry 5.0 - Augmenting the Human Worker in Balanced Automation Systems" is available here.


Post date: 12 September 2022

Hypergraph-based analysis and design of intelligent collaborative manufacturing space

A method for hypergraph-based analysis and the design of manufacturing systems has been developed. The reason for its development is the need to integrate the human workforce into Industry 4.0 solutions. The proposed intelligent collaborative manufacturing space enhances collaboration between operators and provides them with valuable information about their performance and the state of the production system. The design of these Operator 4.0 solutions requires a problem-specific description of manufacturing systems, of the skills and states of the operators, and of the sensors placed in the intelligent space for the simultaneous monitoring of cooperative work. The design of this intelligent collaborative manufacturing space requires the systematic analysis of the critical sets of interacting elements. The proposal is that hypergraphs can efficiently represent these sets; moreover, studying the centrality and modularity of the resultant hypergraphs can support the formation of collaboration and interaction schemes and the formation of manufacturing cells. A fully reproducible illustrative example presents the applicability of this concept.
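A minimal illustration of the hypergraph view: each hyperedge is one set of jointly interacting elements, and a simple degree centrality already ranks how critical each element is. The node and edge sets below are hypothetical, and the paper's richer centrality and modularity analysis is not reproduced here.

```python
def hypergraph_degree_centrality(hyperedges):
    """Degree centrality of each node in a hypergraph: the number of
    hyperedges (here: interacting sets of operators, machines and
    sensors) the node participates in. The edge sets below are
    hypothetical.
    """
    degree = {}
    for edge in hyperedges:
        for node in edge:
            degree[node] = degree.get(node, 0) + 1
    return degree

# Each hyperedge is one set of jointly interacting elements,
# e.g. a manufacturing cell with its operators, machine and sensors.
edges = [
    {"op1", "op2", "machineA"},
    {"op2", "machineA", "sensor1"},
    {"op3", "machineB", "sensor1"},
]
centrality = hypergraph_degree_centrality(edges)
```

Elements that appear in many interaction sets (here `op2`, `machineA` and `sensor1`) are the natural candidates around which to organize collaboration schemes and manufacturing cells.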


Post date: 06 September 2022 

Challenges of the Fourth Industrial Revolution in HRM

As a result of the changes caused by Industry 4.0 and Industry 5.0, unknown or less prominent challenges will come into focus in the operation of organizations and will essentially transform current human resource management (HRM), along with its framework and tools. This research aims to identify Industry 4.0 solutions and the expected changes in the field of human resources (HR) for organizations and employees, and to outline emerging trends of Industry 4.0 that impact HR, based on interviews with the surveyed companies and a review of the relevant literature. Structured interviews were conducted in this research. After processing the responses of each interviewee individually, overall conclusions were formulated by considering all interviews. This research points out that, in terms of HR, recruitment and training are the areas most affected by the fourth industrial revolution, and that changes in competencies and their development processes have begun. Hopefully, the discovered connections will inspire further research and provide useful information in the fields of Industry 4.0 and HR.


Post date: 30 August 2022 

Information sharing in supply chains – Interoperability in an era of circular economy

In order to realize the goals of Industry 5.0 (I5.0), which has data interoperability as one of its core principles, future research on Supply Chain (SC) visibility has to be aligned with socially, economically and environmentally sustainable objectives. Within the purview of the circular economy, this paper indicates various aspects and implications of data sharing in SCs in light of the published research. Taking into consideration the heterogeneity of data sources and standards, this article also catalogs the major data-sharing technologies employed in sharing data digitally across SCs.

Drawing on the published research from 2015 to 2021, following the PRISMA framework, this paper presents the state of research in the field of data sharing in SCs in terms of their standardization, optimization, simulation, automation, security and more notably sustainability. Using the co-occurrence metric, bibliometric analysis has been conducted such that the collected research is categorized under various keyword clusters and regional themes. This article brings together two major themes in reviewing the research in the field. Firstly, the bibliometric analysis of the published articles demonstrates the contours of the current state of research and the future possibilities in the field. Secondly, in synthesizing the research on the foundations of sustainability within the CRoss Industry Standard Process for Data Mining (CRISP-DM) framework, this article deals with various aspects and implications of information sharing in the SCs. By bringing these two themes together, this paper affords a prospective researcher with the research vis-à-vis the information sharing in SC, starting from the actual data standards in use to the modality and consequence of their application within the perspective of the circular economy. This article, in essence, indicates how all the aspects of data sharing in SCs may be brought together in service of the paradigm of I5.0.


Post date: 10 August 2022 

Simulation of Sustainable Manufacturing Solutions: Tools for Enabling Circular Economy 

At the current worrisome rate of global consumption, the linear economy model of producing goods, using them, and then disposing of them with no thought of the environmental, social, or economic consequences is unsustainable and points to a deeply flawed manufacturing framework. The circular economy (CE) is presented as an alternative framework to address the management of emissions, the scarcity of resources, and economic sustainability such that resources are kept 'in the loop'. In the context of manufacturing supply chains (SCs), the 6R's of rethink, refuse, reduce, reuse, repair, and recycle have been proposed in line with the achievement of targeted net-zero emissions. To bring that about, changes in the framework for assessing the state of manufacturing SCs with regard to sustainability are indispensable. Verifiable and empirical model-based approaches such as modeling and simulation (M&S) techniques find pronounced use in realizing the ideal of CE. Simulation models find extensive use across various aspects of SCs, including the analysis of impacts and support for optimal re-design and operation. Using the PRISMA framework to sift through published research gathered from SCOPUS, this review is based on 202 research papers spanning from 2015 to the present. It provides an overview of the simulation tools being put to use in the context of sustainability in manufacturing SCs and highlights the various aspects and contours of the collected research articles. This article focuses on the three major simulation techniques in the literature, namely Discrete Event Simulation (DES), Agent-Based Simulation (ABS), and System Dynamics (SD). With regard to their application in manufacturing SCs, each modeling technique has its pros and cons, which are evident in, among others, data requirements, model magnification, model resolution, and environment interaction.
These limitations are remedied through the use of hybrids, wherein two or more modeling techniques are applied to obtain the desired results. The article also indicates various open-source software solutions that are being employed in research and industry. This article, in essence, has three objectives: first, to present to prospective researchers the current state of research, the concerns that have been raised in the field of sustainability modeling, and how they have been resolved; secondly, to serve as a comprehensive bibliography of peer-reviewed research published from 2015 to 2022; and, finally, to indicate the limitations of the techniques with regard to sustainability assessment. The article also indicates the necessity of a new M&S framework and its prerequisites.


Post date: 08 August 2022

Retrofitting-based development of brownfield Industry 4.0 and Industry 5.0 solutions 

The ongoing Industry 4.0 is characterized by connectivity between the components of a manufacturing system. For modern machines, the Internet of Things (IoT) is a built-in function; in contrast, legacy machines are still deployed and functioning without digital communication, and the need to connect them has grown in order to improve overall production efficiency. As building a new smart factory as a greenfield investment is a capital-intensive choice, retrofitting the existing infrastructure with IoT capability is more reasonable than replacing it. However, this so-called brownfield development, or retrofitting, requires specific prerequisites, e.g., a digitization status assessment, technical and connectivity development, management requirements, and operational needs, and it has a significant disadvantage: lack of scalability. In the meantime, Industry 5.0 prioritizes human-centricity, which poses new challenges to the retrofitted system. Aware of this challenge, this paper provides a systematic overview of brownfield development regarding technical difficulties, supporting technologies, and possible applications for legacy systems. The research scope focuses on available Industry 4.0 advancements but also considers preparing for the forthcoming Industry 5.0. The proposed retrofitting project approach can be a guideline for manufacturers to transform their factories into intelligent spaces with minimal cost and effort while still gaining the most applicable solution for management needs. Future directions for other research in brownfield development for Industry 5.0 are also discussed.


Post date: 22 June 2022

Edge-Computing and Machine-Learning-Based Framework for Software Sensor Development

This research presents a framework that supports the development and operation of machine-learning (ML) algorithms to develop, maintain and manage the whole lifecycle of software sensor models for complex chemical processes. Our motivation is to take advantage of ML and edge computing and offer innovative solutions to the chemical industry for difficult-to-measure laboratory variables. The purpose of software sensor models is to continuously forecast the quality of products to achieve effective quality control, maintain the stable production condition of plants, and support efficient, environmentally friendly, and harmless laboratory work. As the literature review shows, quite a few ML models have been developed in recent years that support the quality assurance of different types of materials. However, the problems of continuous operation, maintenance and version control of these models have not yet been solved. The method uses ML algorithms and takes advantage of cloud services in an enterprise environment. Industry 4.0 technologies such as the Internet of Things (IoT), edge computing, cloud computing, ML, and artificial intelligence (AI) are the core techniques. The article outlines an information system structure and the related methodology based on data from a quality-assurance laboratory. During the development, we encountered several challenges resulting from the continuous development of ML models and the tuning of their parameters. The article discusses the development, version control, validation, lifecycle, and maintenance of ML models, together with a case study. The developed framework can continuously monitor the performance of the models and increase the amount of data from which the models are built. As a result, the most accurate, data-driven and up-to-date models are always available to quality-assurance engineers with this solution. 


Post date: 03 June 2022

Tamás Ruppert will be a presenter at the interdisciplinary DAAD Alumni & Friends Colloquium

Tamás Ruppert will present at the DAAD Alumni & Friends Colloquium on Industry 5.0 - Human Factors in Semi-automated Manufacturing. The event takes place on June 2 via Webex.

The DAAD Alumni & Friends Colloquium takes place on June 2. Since 2016, the regional group Ruhr of DAAD Alumni & Friends has been active, and since 2019 it has co-organized four colloquia per year, co-hosted by FH Dortmund. They invite DAAD fellows currently pursuing their research in Germany and other young researchers in NRW to present their ongoing projects. 

More detail about the event is available here.


Post date:  22 May 2022

Tamás Ruppert attended the Industry 4.0 Symposium  at the Warsaw University of Technology in Poland

Tamás Ruppert was a keynote speaker at the Industry 4.0 Symposium held at the Warsaw University of Technology in Poland on May 9-10. He presented the newest results of our research group on the Operator 4.0 topic and the Operator 4.0 research network.

More information about the Operator 4.0 research network is available here.

More information about the event is available here.


Post date:  10 May 2022

Multi-agent reinforcement learning-based exploration of optimal operation strategies of semi-batch reactors

The operation of semi-batch reactors requires caution because the fed reagents can accumulate, leading to hazardous situations due to the loss of control ability. This work aims to develop a method that explores the optimal operational strategy of semi-batch reactors. Since reinforcement learning (RL) is an efficient tool for finding optimal strategies, we tested the applicability of this concept. We developed a problem-specific RL-based solution for the optimal control of semi-batch reactors in different operation phases. The RL controller varies the feeding rate directly in the feeding phase, while in the mixing phase it works as the master in a cascade control structure. The RL controllers were trained with different neural network architectures to identify the most suitable one. The developed RL-based controllers worked very well and were able to keep the temperature at the desired setpoint in the investigated system. The results confirm the benefit of the proposed problem-specific RL controller. 


Post date:  05 May 2022

Ádám Ipkovich took second place at Scientific Student Conference

The University of Pannonia held the Engineering Faculty Scientific Student Conference on May 4, 2022, where Ádám Ipkovich took second place in the modeling section with a paper entitled "Neighborhood Ranking-based Model-free Feature Selection".


Post date:  04 May 2022

A multi-block clustering algorithm for high dimensional binarized sparse data

We introduce a multidimensional multi-block clustering (MDMBC) algorithm in this paper. MDMBC can generate overlapping clusters that have similar values along subsets of dimensions. The parsimonious binary vector representation of multidimensional clusters lends itself to the application of efficient meta-heuristic optimization algorithms. In this paper, a hill-climbing (HC) greedy search algorithm is presented that can be extended by several stochastic and population-based meta-heuristic frameworks. The benefits of the algorithm are demonstrated on a bi-clustering benchmark problem and in the analysis of the Leiden higher education ranking system, which measures the scientific performance of 903 institutions along four dimensions of 20 indicators representing publication output and collaboration in different scientific fields and time periods.
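The greedy search idea behind the algorithm can be sketched as follows. This is a minimal toy version, not the authors' implementation: the scoring function, the penalty weight, and the planted-block example are all assumptions made for illustration only.

```python
import random

def score(rows, cols, X, penalty=0.25):
    # Reward 1-entries inside the selected sub-matrix and penalise 0-entries,
    # so dense blocks of ones obtain high scores (an assumed toy objective).
    r = [i for i, b in enumerate(rows) if b]
    c = [j for j, b in enumerate(cols) if b]
    if not r or not c:
        return 0.0
    ones = sum(X[i][j] for i in r for j in c)
    return ones - penalty * (len(r) * len(c) - ones)

def hill_climb(X, iters=2000, seed=1):
    # Greedy hill climbing over the binary row/column membership vectors:
    # flip one bit, keep the flip if the score does not decrease.
    rng = random.Random(seed)
    rows = [rng.random() < 0.5 for _ in range(len(X))]
    cols = [rng.random() < 0.5 for _ in range(len(X[0]))]
    best = score(rows, cols, X)
    for _ in range(iters):
        mask = rows if rng.random() < 0.5 else cols
        k = rng.randrange(len(mask))
        mask[k] = not mask[k]
        s = score(rows, cols, X)
        if s >= best:
            best = s
        else:
            mask[k] = not mask[k]  # revert a worsening flip
    return rows, cols, best

# Toy data: a 4x4 block of ones planted in a 10x8 binary matrix.
X = [[1 if 2 <= i <= 5 and 1 <= j <= 4 else 0 for j in range(8)]
     for i in range(10)]
rows, cols, best = hill_climb(X)
```

Because the solution is just a pair of bit vectors, the same `score` function could be plugged into the stochastic or population-based meta-heuristics mentioned in the abstract without changing the representation.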


Post date:  01 April 2022

Honti Gergely Marcell, a former PhD graduate, took first place in the "Digitalization contexts of the data economy" call

Honti Gergely Marcell was awarded first place in the "Digitalization contexts of the data economy" call on March 30, 2022.

Digitális Jólét Nonprofit Kft. and Neumann Nonprofit Közhasznú Kft. announced a call for proposals entitled "Digitalization contexts of the data economy". The call for applications was addressed to students and recent graduates who have written their thesis, diploma thesis or doctoral dissertation on the digitalization context of the data economy.


Post date: 30 March 2022

János Abonyi was awarded the Knight's Cross of the Order of Merit of Hungary

The Parliamentary and Strategic State Secretary of the Ministry of Innovation and Technology, Tamás Schanda, presented awards on March 17, 2022. On the occasion of the event, Dr. János Abonyi was awarded the Knight's Cross of the Order of Merit of Hungary in recognition of his outstanding research and teaching work in the field of complex systems modelling and data mining, and his achievements in the development of the engineering education system of the University of Pannonia.


Post date: 17 March 2022

Processing indoor positioning data by goal-oriented supervised fuzzy clustering for tool management

Indoor positioning systems allow real-time tracking of tool locations. Tool utilization can be calculated based on positional data of the storage and manufacturing areas. Due to the uncertainty of the position measurements, estimation of the state of the tools is problematic when the distance between the examined zones is less than the estimation error. We propose a goal-oriented supervised fuzzy clustering algorithm that utilizes the activity state of the tool, as the algorithm simultaneously maximizes the spatial distribution probability and the probability of a specific activity state occurring in a cluster. By weighting data points according to the time spent in the related states and positions, the resulting cluster weights can be interpreted as tool utilizations. The applicability of the developed method is presented through the processing of position data from crimping tools used by a wire harness manufacturer.


Post date:  26 February 2022

Factor analysis, sparse PCA, and Sum of Ranking Differences-based improvements of the Promethee-GAIA multicriteria decision support technique

The Promethee-GAIA method is a multicriteria decision support technique that defines the aggregated ranks of multiple criteria and visualizes them based on Principal Component Analysis (PCA). In the case of numerous criteria, the PCA biplot-based visualization does not reveal how a criterion influences the decision problem. The central question is how the Promethee-GAIA-based decision-making process can be improved to gain more interpretable results that reveal more characteristic inner relationships between the criteria. To improve the Promethee-GAIA method, we suggest three techniques that eliminate redundant criteria, clearly outline which criterion belongs to which factor, and explore the similarities between criteria. These methods are the following: A) principal factoring with rotation and communality analysis (P-PFA), B) the integration of Sparse PCA into the Promethee II method (P-sPCA), and C) the Sum of Ranking Differences method (P-SRD). The suggested methods are presented through an I4.0+ dataset that measures the Industry 4.0 readiness of NUTS2-classified regions. The proposed methods are useful tools for handling multicriteria ranking problems when the number of criteria is large. 


Post date:  25 February 2022

3D Scanning and Model Error Distribution-Based Characterisation of Welding Defects

The inspection of welded structures requires particular attention due to the many aspects that define the quality of the product. Deciding on the suitability of welds is a complex process. This work aims to propose a method that can support this qualification. The paper presents a state-of-the-art data-driven evaluation method and its application in the quality assessment of welds. Image processing and CAD modelling software were applied, using the Iterative Closest Point algorithm, to generate a reference from which datasets representing the model errors can be derived. The results demonstrate that the distributions of these variables characterise the typical welding defects. Based on the automated analysis of these distributions, it is possible to reduce the turnaround time of testing, thereby improving the productivity of welding processes. 


Post date:  08 February 2022

János Abonyi and Tamás Ruppert are Special Issue Editors of "Industry 5.0 - the Human Factors in Semi-automated Manufacturing"

With the rapid development of innovative technologies, such as artificial intelligence methods, big data and cloud computing, the new concept of Industry 5.0 has been revolutionizing production and logistics systems by introducing collaborative processes and data-based operator support (so-called Operator 4.0). This Special Issue aims to disseminate advanced research in the theory and application of collaboration in the manufacturing industries (also known by some experts as Industry 5.0).

Topics of interest include, but are not limited to:

Link to the Special Issue "Industry 5.0 - the Human Factors in Semi-automated Manufacturing"


Post date:  07 February 2022

Network-Based Topological Exploration of the Impact of Pollution Sources on Surface Water Bodies

We developed a digital water management toolkit to evaluate the importance of the connections between water bodies and the impacts caused by pollution sources. By representing water bodies in a topological network, the relationship between point loads and basic water quality parameters is examined as a labelled network. The labels are defined based on the classification of the water bodies and pollution sources. The analysis of the topology of the network can provide information on how the possible paths of the surface water network influence the water quality. The extracted information can be used to develop a monitoring- and evidence-based decision support system. The methodological development is presented through the analysis of the physical-chemical parameters of all surface water bodies in Hungary, using the emissions of industrial plants and wastewater treatment plants. Changes in water quality are comprehensively assessed based on the water quality data recorded over the past 10 years. The results illustrate that the developed method can identify critical surface water bodies where the impact of local pollution sources is more significant. One hundred six critical water bodies have been identified, where special attention should be given to water quality improvement.


Post date:  13 January 2022

We are the Cover Story of the Data journal


The article "Learning Interpretable Mixture of Weibull Distributions - Exploratory Analysis of How Economic Development Influences the Incidence of COVID-19 Deaths" by R. Csalódi, Z. Birkner, and J. Abonyi is the cover story of the Data journal, Volume 6, Issue 12, Article 125 (2021).


The journal is available here.

The article is available here. 


Post date:  December 2021

Data-driven business process management-based development of Industry 4.0 solutions

Business process management (BPM) supports the management and transformation of organizational operations. This paper provides a structured guideline for improving data-based process development within the BPM life cycle. We show how Industry 4.0-induced tools and models can be integrated within the BPM life cycle to achieve more efficient process excellence and evidence-based decision-making. The paper demonstrates how standards of machine learning (CRISP-ML(Q)) and BPM, together with the tools of design science research, can support the redesign phases of Industry 4.0 development. The proposed methodology is carried out at an assembly company, where the proposed improvement steps are investigated by simulation and evaluated by relevant key performance indicators.


Post date: 10 December 2021

Comprehensible Visualization of Multidimensional Data: Ranking Differences-Based Parallel Coordinates

A novel visualization technique is proposed for the sum of ranking differences (SRD) method based on parallel coordinates. An axis is defined for each variable, on which the data are depicted row-wise. When the data points are connected, the lines may intersect. The fewer the intersections between the variables, the more similar they are and the clearer the figure becomes. Therefore, the visualization depends on the technique used to order the variables. The key idea is to employ the SRD method to measure the degree of similarity of the variables, establishing a distance-based order. The distances between the axes are not uniformly distributed in the proposed visualization; their closeness reflects similarity, according to their SRD values. The proposed algorithm identifies false similarities through an iterative approach, where the angles between the SRD values determine on which side a variable is plotted. Visualization of the algorithm is provided by MATLAB/Octave source codes. The proposed tool is applied to study how the sources of greenhouse gas emissions can be grouped based on the statistical data of the countries. A comparison to multidimensional scaling (MDS)-based ordering is also given. The use case demonstrates the applicability of the method and the synergies of incorporating the SRD method into parallel coordinates. 
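The SRD measure that drives the axis ordering can be sketched as follows. This is a simplified illustration with average ranks; using the row-wise mean as the reference column is one common choice and an assumption here, not necessarily the variant used in the paper.

```python
def ranks(values):
    # 1-based average ranks; tied values receive the mean of their positions.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def srd(column, reference):
    # Sum of absolute rank differences between a variable and the reference.
    return sum(abs(a - b) for a, b in zip(ranks(column), ranks(reference)))

# Three variables (columns) observed on four objects (rows);
# the reference column is taken as the row-wise mean.
data = [[0.1, 0.2, 0.9],
        [0.4, 0.5, 0.6],
        [0.6, 0.7, 0.3],
        [0.9, 1.0, 0.1]]
reference = [sum(row) / len(row) for row in data]
scores = {f"x{j}": srd([row[j] for row in data], reference) for j in range(3)}
```

Variables with small SRD values rank the objects almost like the reference and would be placed close together on the parallel-coordinate axes, while a reversed variable attains the maximum SRD.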


Post date:  11 December 2021

János Abonyi is the Special Issue Editor of "Soft Sensors 2021-2022"

The Special Issue solicits papers that cover the development, validation, application, and maintenance of software sensors. The potential topics include, but are not limited to: 

Link to the Special Issue "Soft Sensors 2021-2022"


Post date: December 2021

The Applicability of Reinforcement Learning Methods in the Development of Industry 4.0 Applications

Reinforcement learning (RL) methods can successfully solve complex optimization problems. Our article gives a systematic overview of the major types of RL methods and their applications in the field of Industry 4.0 solutions. It provides methodological guidelines to determine the approach best fitted to different problems and can serve as a point of reference for R&D projects and further research.


Post date:  30 November 2021

Learning Interpretable Mixture of Weibull Distributions - Exploratory Analysis of How Economic Development Influences the Incidence of COVID-19 Deaths

This paper presents an algorithm for learning local Weibull models, whose operating regions are represented by fuzzy rules. The applicability of the proposed method is demonstrated in estimating the mortality rate of the COVID-19 pandemic. The reproducible results show that there is a significant difference between the mortality rates of countries due to their economic situation, urbanization, and the state of the health sector. The proposed method is compared with the semi-parametric Cox proportional hazards regression method. The distribution functions of these two methods are close to each other, demonstrating that the proposed method provides efficient estimates. 


Post date: 26 November 2021

Ontology-Based Analysis of Manufacturing Processes: Lessons Learned from the Case Study of Wire Harness Production

Effective information management is critical for the development of manufacturing processes. This paper aims to provide an overview of ontologies that can be utilized in building Industry 4.0 applications. The main contributions of the work are that it highlights ontologies that are suitable for manufacturing management and recommends the multilayer-network-based interpretation and analysis of ontology-based databases. This article not only serves as a reference for engineers and researchers on ontologies but also presents a reproducible industrial case study that describes the ontology-based model of a wire harness assembly manufacturing process. 


Post date: 19 November 2021

Mixture of Survival Analysis Models - Cluster-Weighted Weibull Distributions

Survival analysis is a widely used method for establishing a connection between a time-to-event outcome and a set of variables. The goal of this work is to improve the accuracy of the widely applied parametric survival models. This work highlights that accurate and interpretable survival analysis models can be identified by clustering-based exploration of the operating regions of local survival models. The key idea is that when the operating regions of local Weibull distributions are represented by Gaussian mixture models, the parameters of the mixture-of-Weibull model can be identified by a clustering algorithm. The proposed method is utilised in three case studies. The examples cover studying the dropout rate of university students, calculating the remaining useful life of lithium-ion batteries, and determining the chances of survival of prostate cancer patients. The results demonstrate the wide applicability of the method and the benefits of clustering-based identification of local Weibull models. 
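The idea of fitting local Weibull models to clustered subsets of the data can be sketched as follows. This is a minimal illustration only: it uses hard cluster assignments and the standard fixed-point maximum-likelihood iteration for the Weibull shape, whereas the paper identifies the operating regions with Gaussian-mixture-based fuzzy memberships, which this toy omits.

```python
import math
import random

def fit_weibull(x, iters=200):
    # Maximum-likelihood Weibull fit via the classic fixed-point
    # iteration for the shape k (damped for stability):
    #   1/k = sum(x^k ln x) / sum(x^k) - mean(ln x)
    lx = [math.log(v) for v in x]
    mlx = sum(lx) / len(x)
    k = 1.0
    for _ in range(iters):
        xk = [v ** k for v in x]
        k_new = 1.0 / (sum(v * l for v, l in zip(xk, lx)) / sum(xk) - mlx)
        k = 0.5 * k + 0.5 * k_new
    scale = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, scale

def sample_weibull(rng, k, scale, n):
    # Inverse-CDF sampling: x = scale * (-ln(1 - u))^(1/k)
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / k)
            for _ in range(n)]

rng = random.Random(42)
# Two "operating regions" with different survival characteristics,
# standing in for two clusters found by the mixture model.
cluster_a = sample_weibull(rng, 2.0, 3.0, 4000)
cluster_b = sample_weibull(rng, 0.8, 1.0, 4000)
ka, la = fit_weibull(cluster_a)
kb, lb = fit_weibull(cluster_b)
```

With fuzzy memberships, the sums in `fit_weibull` would simply become membership-weighted sums, which is how a clustering algorithm can identify the mixture-of-Weibull parameters.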


Post date: 11 November 2021

Identification of sampling points for the detection of SARS-CoV-2 in the sewage system

A suitable tool for monitoring the spread of SARS-CoV-2 is to identify potential sampling points in the wastewater collection system that can be used to monitor the distribution of COVID-19-affected clusters within a city. The applicability of the developed methodology is presented through the description of the 72,837 population equivalent wastewater collection system of the city of Nagykanizsa, Hungary, and the results of the analytical and epidemiological measurements of the wastewater samples. The wastewater sampling was conducted during the third wave of the COVID-19 epidemic. It was found that the overlap between the road system and the wastewater network is high, at 82%. It was shown that the proposed methodological approach, using the tools of network science, confidently determines the zones of the wastewater collection system and provides ideal monitoring points that give the best sampling resolution in urban areas. The strength of the presented approach is that it estimates the network based on publicly available information. It was concluded that the number of zones or sampling points can be chosen based on relevant epidemiological intervention and mitigation strategies. The algorithm allows for continuous, effective monitoring of the population infected by SARS-CoV-2 in small-sized cities. 


Post date: 29 October 2021

Janos Abonyi was invited to join as a program committee member at the 8th International Conference on Control, Decision and Information Technologies (CoDIT), 2022

Janos Abonyi serves as a program committee member at the 8th International Conference on Control, Decision and Information Technologies (CoDIT).

The aim of CoDIT is to be a forum for technical exchange amongst scientists with interests in control, optimization, decision-making, all areas of engineering, computer science and information technologies. The conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions and discuss future research directions. 

The conference will be held in Istanbul, Turkey, on May 17-20, 2022.

Link to conference website


Post date: 26 October 2021

Janos Abonyi was invited to join as a program committee member at Innovations in Bio-Inspired Computing and Applications (IBICA) 2021 conference

Janos Abonyi serves as a program committee member at the 12th International Conference on Innovations in Bio-Inspired Computing and Applications (IBICA).

The aim of IBICA is to provide a platform for world research leaders and practitioners to discuss the "full spectrum" of current theoretical developments, emerging technologies and innovative applications of bio-inspired computing. Bio-inspired computing is currently one of the most exciting research areas, and it continuously demonstrates exceptional strength in solving complex real-life problems. 

The conference will be held online on December 16-18, 2021.

Link to conference website


Post date: 07 September 2021

Janos Abonyi was invited to join as a program committee member at Soft Computing and Pattern Recognition (SoCPaR) 2021 conference

Janos Abonyi serves as a program committee member at the 13th International Conference on Soft Computing and Pattern Recognition (SoCPaR).

The conference aims to bring together worldwide leading researchers and practitioners interested in advancing the state-of-the-art in Soft Computing and Pattern Recognition, for exchanging knowledge that encompasses a broad range of disciplines among various distinct communities. It is hoped that researchers and practitioners will bring new prospects for collaboration across disciplines and gain inspiration to facilitate novel breakthroughs. The themes for this conference are thus focused on "Innovating and Inspiring Soft Computing and Intelligent Pattern Recognition". 

The conference will be held online on December 15-17, 2021.

Link to conference website


Post date: 07 September 2021

Janos Abonyi was invited to join as a program committee member at Information Assurance and Security (IAS) 2021 conference

Janos Abonyi serves as a program committee member at the 17th International Conference on Information Assurance and Security (IAS). 

The conference theme: "Innovative Cyber Security: Protecting National Borders".

The conference aims to bring together researchers, practitioners, developers, and policy makers in multiple disciplines of information security and assurance to exchange ideas and to learn the latest development in this important field.

The conference is organized by Machine Intelligence Research Labs (MIR Labs) and will be held online on December 14-16.

Link to conference website


Post date: 07 September 2021

Janos Abonyi was invited to join as a program committee member at Nature and Biologically Inspired Computing (NaBIC) 2021 World Congress

Janos Abonyi serves as a program committee member at the 13th World Congress on Nature and Biologically Inspired Computing (NaBIC).

NaBIC 2021 is organized to provide a forum for researchers, engineers, and students from all over the world, to discuss the state-of-the-art in machine intelligence, and address various issues on building up human friendly machines by learning from nature. The conference theme is “Nurturing Intelligent Computing Towards Advancement of Machine Intelligence”. 

The conference will be held online on March 15-17, 2021.

Link to conference website


Post date: 07 September 2021

Janos Abonyi was invited to join as a program committee member at Hybrid Intelligent Systems (HIS) 2021 conference

Janos Abonyi serves as a program committee member at the 21st International Conference on Hybrid Intelligent Systems (HIS).

The objectives of HIS 2021 are: to increase the awareness of the research community of the broad spectrum of hybrid techniques, to bring together AI researchers from around the world to present their cutting-edge results, to discuss the current trends in HIS research, to develop a collective vision of future opportunities, to establish international collaborative opportunities, and as a result to advance the state of the art of the field. 

The conference will be held online on March 14-16, 2021.

Link to conference website


Post date: 07 September 2021

Data-driven comparative analysis of national adaptation pathways for Sustainable Development Goals

Since the declaration of the Sustainable Development Goals (SDGs) in 2015, countries have begun developing and strategizing their national pathways for the effective implementation of the 2030 Agenda. The sustainable development targets set out how the world's nations must move forward so that sustainable development is not an ideal vision but a workable, comprehensive environmental, economic, and social policy. This work aims to analyze each country's progress towards achieving the Sustainable Development Goals. In addition to a static presentation of countries' achievements, changes over time are also compared, allowing countries to be grouped according to their current states. A sophisticated SDG performance measurement tool has been developed to support this analysis, which automatically processes the entire UN Global SDG Indicators database with exploratory data analysis, frequent item mining, and network analysis. Based on the trend analysis of the percentiles, the values of the indicators achievable by 2030 are also derived. The analyses were performed based on the time-series data of 1319 disaggregated official SDG indicators.

Most of the world's countries have achieved the greatest success in SDG12 and SDG10 since the declaration of the 2030 Agenda. In the field of climate change (SDG13), 26 countries can count on significant achievements. However, SDG6, SDG2, and SDG1 face significant challenges globally, as they have typically seen only minor progress in recent years. Examined at the indicator level, indicators 1.4.1, 5.6.2, 6.b.1, 10.7.2, and 15.4.2 improved in all countries of the world, while indicators 2.a.1, 9.4.1, 2.1.1, 2.1. and 12.b.1 have predominantly deteriorated. According to the forecast for 2030, Australia and the United States can reduce their per capita CO2 emissions, while some countries in Africa, Asia, and the Middle East are expected to increase their emissions.


Post date: 20 August 2021

Event-Tree Based Sequence Mining Using LSTM Deep-Learning Model

During the operation of modern technical systems, the use of LSTM models for the prediction of process variable values and system states is widespread. The goal of this paper is to expand the application of LSTM-based models to obtaining information based on prediction. In this method, by predicting transition probabilities, the output layer is interpreted as a probability model, creating a prediction tree of sequences instead of just a single sequence. By further analyzing the prediction tree, we can take risk considerations into account, extract more complex predictions, and analyze which event trees are yielded by different input sequences; that is, given a state or input sequence, the upcoming events and the probabilities of their occurrence are considered. In the case of online application, by utilizing a series of input events and the probability trees, it is possible to predetermine subsequent event sequences. The applicability and performance of the approach are demonstrated via a dataset in which the occurrence of events is predetermined, and further datasets are generated with a higher-order decision tree-based model. The case studies simply and effectively validate the performance of the created tool, as the structure of the generated tree and the determined probabilities reflect the original dataset. 
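The prediction-tree construction can be sketched as follows. In this minimal illustration a toy Markov transition table stands in for the trained LSTM's output layer, and the function names, pruning threshold, and probabilities are all hypothetical.

```python
def expand_tree(history, predict, depth, min_prob=0.2, prob=1.0):
    # Enumerate event sequences whose path probability stays above
    # min_prob, branching on the predicted next-event distribution.
    if depth == 0:
        return [(history, prob)]
    leaves = []
    for event, p in predict(history).items():
        path_prob = prob * p
        if path_prob >= min_prob:  # prune unlikely branches
            leaves += expand_tree(history + [event], predict,
                                  depth - 1, min_prob, path_prob)
    return leaves

# Toy stand-in for the LSTM output layer: next-event probabilities
# conditioned only on the last event (a first-order Markov model).
TRANSITIONS = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.5, "B": 0.5}}
predict = lambda history: TRANSITIONS[history[-1]]

leaves = expand_tree(["A"], predict, depth=2)
# Two sequences survive the pruning: A->A->A (0.49) and A->A->B (0.21).
```

In the online setting described above, `predict` would be the network's softmax output conditioned on the whole history, and the surviving leaves give the probable upcoming event sequences together with their path probabilities.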


Post date: 16 August 2021

Contrast and brightness balance in image enhancement using Cuckoo Search-optimized image fusion

Many vision-based systems suffer from poor levels of contrast and brightness, mainly because of inadequate and improper illumination during the image acquisition process. As a result, the specified information required from the acquired image is not available for the particular application. In general, it is hard to achieve a balance between the improvement of contrast and brightness in image enhancement. By introducing nature-inspired optimization into image enhancement, the best features of the image are utilized, and the complexity related to the nonlinearity of images can be handled under various constraints, such as the balance between contrast and brightness. In this work, a novel automatic method for image enhancement that finds a balance between contrast and brightness is developed using Cuckoo Search-optimized image fusion. First, the Cuckoo Search-based optimization algorithm generates two sets of optimized parameters. These parameter sets are used to generate a pair of enhanced images: one with a high degree of sharpness and contrast, the other bright and improved without losing the level of detail. The two enhanced images are then fused to obtain an output image in which contrast and brightness are in balance. The effectiveness of the proposed method is verified by applying it to standard images (CVG-UGR image database) and lathe tool images. Experimental results demonstrated that the proposed method performs better with regard to both the quality of contrast and brightness, and yields better quality-evaluation metrics than the other conventional techniques. 


Post date: 15 July 2021

Quality vs. quantity of alarm messages - How to measure the performance of an alarm system

Despite significant efforts to measure and assess the performance of alarm systems, to this day, no silver bullet has been found. The majority of the existing standards and guidelines focus on the alarm load of the operators, either during normal or upset plant conditions, and only a small fraction takes into consideration the actions performed by the operators. In this study, an overview of the evolution of alarm system performance metrics is presented, and the current data-based approaches are grouped into seven categories based on the goals of and the methodologies associated with each metric. Drawing on this categorical overview, the terminological differences between the academic and industrial approaches to alarm system performance measurement are discussed. Moreover, we highlight how strongly the performance measurement of alarm systems is skewed towards quantitative metrics instead of qualitative assessment, invoking the threat of excessive alarm reductions resulting from such a unilateral approach. The critical aspects of the qualitative performance measurement of alarm systems are demonstrated through the comparison of the alarm system of an industrial hydrofluoric acid alkylation unit before and after the alarm rationalization process. The quality of the alarm messages is measured via their informativeness and actionability; in other words, how appropriate the parameter settings are for everyday work and how actionable the messages are for the operators of the process. 


Post date: 5 July 2021

Genetic programming-based symbolic regression for goal-oriented dimension reduction

The majority of dimension reduction techniques are built upon the optimization of an objective function aiming to retain certain characteristics of the projected datapoints: the variance of the original dataset, the distance between the datapoints or their neighbourhood characteristics, etc. Building upon the optimization-based formalization of dimension reduction techniques, the goal-oriented formulation of projection cost functions is proposed. For the optimization of the application-oriented data visualization cost function, a multi-gene genetic programming (GP)-based algorithm is introduced to optimize the structures of the equations used for mapping high-dimensional data into a two-dimensional space and to select the variables that are needed to explore the internal structure of the data for data-driven software sensor development or classifier design. The main benefit of the approach is that the evolved equations are interpretable and can be utilized in surrogate models. The applicability of the approach is demonstrated on the benchmark wine dataset and in the estimation of product quality in a diesel oil blending technology based on an online near-infrared (NIR) analyzer. The results illustrate that the algorithm is capable of generating goal-oriented and interpretable features, and the resultant simple algebraic equations can be directly implemented in applications when there is a need for computationally cost-effective projections of high-dimensional data, as they are computationally simpler than alternatives such as neural networks.


Post date: 08 June 2021

Indoor Positioning Systems Can Revolutionise Digital Lean

The powerful combination of lean principles and digital technologies identifies and mitigates waste faster than traditional lean methods. The new digital lean (also referred to as Lean 4.0) solutions incorporate sensors and digital equipment, yielding innovative solutions that extend the reach of traditional lean tools. The tracking of flexible and configurable production systems is not as straightforward as in a simple conveyor. This paper examines how the information provided by indoor positioning systems (IPS) can be utilised in the digital transformation of flexible manufacturing. The proposed IPS-based method enriches the information sources of value stream mapping and transforms positional data into key performance indicators used in lean manufacturing. The challenges of flexible and reconfigurable manufacturing require dynamic value stream mapping. To handle this problem, a process mining-based solution is proposed. A case study shows how the proposed method can be employed for monitoring and improving manufacturing efficiency.
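The idea of turning positional data into lean indicators can be sketched in a few lines. The toy function below derives two hypothetical KPIs, total distance travelled (a proxy for transport waste) and the share of time spent inside a value-adding zone, from timestamped IPS positions; the zone geometry, track and indicator choice are illustrative assumptions, not the paper's actual value stream mapping method.

```python
import math

def movement_kpis(track, value_zone):
    """Turn raw (t, x, y) positions into two simple lean indicators:
    distance travelled and the share of time in a value-adding zone.

    `value_zone` is an axis-aligned box (xmin, ymin, xmax, ymax); a real
    IPS deployment would use the plant layout instead of this toy zone.
    """
    distance = 0.0
    value_time = 0.0
    xmin, ymin, xmax, ymax = value_zone
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)
        if xmin <= x0 <= xmax and ymin <= y0 <= ymax:
            value_time += t1 - t0
    total_time = track[-1][0] - track[0][0]
    return distance, value_time / total_time

# A short hypothetical track: (timestamp, x, y) readings.
track = [(0, 0, 0), (1, 3, 4), (2, 3, 4), (3, 0, 0)]
dist, value_share = movement_kpis(track, value_zone=(2, 3, 4, 5))
```

Aggregating such indicators per product or operator is what allows the positional data stream to feed a dynamic value stream map.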


Post date:  07 June 2021

Sparse PCA supports exploration of process structures for decentralized fault detection

With the ever-increasing use of sensor technologies in industrial processes, and more data becoming available to engineers, fault detection and isolation activities in the context of process monitoring have gained significant momentum in recent years. A statistical procedure frequently used in this domain is Principal Component Analysis (PCA), which can reduce the dimensionality of large data sets without compromising their information content. While most process monitoring methods offer satisfactory detection capabilities, understanding the root cause of malfunctions and providing the physical basis for their occurrence have remained challenging. The relatively new sparse PCA techniques represent a further development of PCA in which not only is the data dimension reduced but the data is also made more interpretable, revealing clear correlation structures among variables. Hence, taking a step forward from classical fault detection methods, in this work, a decentralized monitoring approach is proposed based on a sparse algorithm. The resulting control charts reveal the correlation structures associated with the monitored process and facilitate a structural analysis of the faults that occur. The applicability of the proposed method is demonstrated using data generated from the simulation of the benchmark vinyl acetate process. It is shown that the sparse principal components, as a foundation for a decentralized multivariate monitoring framework, can provide physical insight into the origins of process faults.


Post date: 25 May 2021

Test Plan for the Verification of the Robustness of Sensors and Automotive Electronic Products Using Scenario-Based Noise Deployment (SND)

The targeted shortening of sensor development requires short and convincing verification tests. The goal of the development of novel verification methods is to avoid or reduce an excessive amount of testing and identify tests that guarantee that the assumed failure will not happen in practice. In this paper, a method is presented that results in the test loads of such a verification. The method starts with the identification of the requirements for the product related to robustness using the precise descriptions of those use case scenarios in which the product is assumed to be working. Based on the logic of the Quality Function Deployment (QFD) method, a step-by-step procedure has been developed to translate the robustness requirements through the change in design parameters, their causing phenomena, the physical quantities as causes of these phenomena, until the test loads of the verification. The developed method is applied to the test plan of an automotive sensor. The method is general and can be used for any parts of a vehicle, including mechanical, electrical and mechatronical ones, such as sensors and actuators. Nonetheless, the method is applicable in a much broader application area, even outside of the automotive industry.

 

Post date: 12 May 2021

Regional development potentials of Industry 4.0: Open data indicators of the Industry 4.0+ model

This paper aims to identify the regional potential of Industry 4.0 (I4.0). Although the regional background of a company significantly determines how the concept of I4.0 can be introduced, the regional aspects of digital transformation are often neglected in the analysis of I4.0 readiness. Based on the analysis of I4.0 readiness models, the external regional success factors of the implementation of I4.0 solutions are determined. An I4.0+ (regional Industry 4.0) readiness model, a specific indicator system, is developed to foster medium-term regional I4.0 readiness analysis and foresight planning. The indicator system is based on three types of data sources: (1) open governmental data; (2) alternative metrics such as the number of I4.0-related publications and patent applications; and (3) the number of news stories related to economic and industrial development. The indicators are aggregated to the statistical regions (NUTS 2), and their relationships are analyzed using the Sum of Ranking Differences (SRD) and Promethee II methods. The developed I4.0+ readiness index correlates with regional economic, innovation and competitiveness indexes, which indicates the importance of boosting regional I4.0 readiness.
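The core of Promethee II, which ranks alternatives by net outranking flows, fits in a short sketch. The version below uses the simplest (usual, 0/1) preference function; the paper's actual preference shapes, thresholds, indicator values and weights are not given here, so the three "regions" and their scores are purely hypothetical.

```python
def promethee_ii(scores, weights):
    """Rank alternatives (e.g., regions) by Promethee II net outranking flows.

    `scores[a][c]` is alternative a's value on criterion c (higher is
    better); the usual 0/1 preference function is used for simplicity.
    """
    n = len(scores)

    def pref(a, b):  # weighted preference of a over b across criteria
        return sum(w for w, sa, sb in zip(weights, scores[a], scores[b]) if sa > sb)

    flows = []
    for a in range(n):
        # Net flow: how strongly a outranks the others minus how strongly
        # it is outranked, averaged over the other alternatives.
        phi = sum(pref(a, b) - pref(b, a) for b in range(n) if b != a) / (n - 1)
        flows.append(phi)
    return flows

# Three hypothetical regions scored on two normalised indicators.
scores = [[0.9, 0.4], [0.5, 0.8], [0.2, 0.1]]
flows = promethee_ii(scores, weights=[0.6, 0.4])
best = max(range(len(flows)), key=flows.__getitem__)
```

A composite index like I4.0+ would then be read off from the net flows, with the weights reflecting the importance of each indicator group.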


Post date: 19 April 2021

Data describing the relationship between world news and sustainable development goals

The data article presents a dataset and a tool for news-based monitoring of the sustainable development goals defined by the United Nations. The presented dataset was created by structured queries of the GDELT database based on the categories of the World Bank taxonomy matched to sustainable development goals. The Google BigQuery SQL scripts and the results of the related network analysis are attached to the data to provide a toolset for the strategic management of sustainability issues. The article demonstrates the dataset on the 6th sustainable development goal (Clean Water and Sanitation). The network formed based on how countries appear in the same news can be used to explore potential international cooperation. The network formed based on how topics of the World Bank taxonomy appear in the same news can be used to explore how the problems and solutions of sustainability issues are interlinked.

Access to data


Post date: 24 March 2021

The Applicability of Big Data in Climate Change Research: The Importance of System of Systems Thinking 

The aim of this paper is to provide an overview of the interrelationship between data science and climate studies, and to describe how sustainability-related climate issues can be managed using Big Data tools. Climate-related Big Data articles are analyzed and categorized, revealing an increasing number of data-driven solutions in specific areas, while broad integrative analyses receive less focus. Our major objective is to highlight the potential of the System of Systems (SoS) theorem, as the synergies between diverse disciplines and research ideas must be explored to gain a comprehensive overview of the issue. Data and systems science enable a large amount of heterogeneous data to be integrated and simulation models to be developed, while considering socio-environmental interrelations in parallel. The improved knowledge integration offered by System of Systems thinking, or climate computing, is demonstrated by analysing the possible inter-linkages of the latest Big Data application papers. The analysis highlights how data and models focusing on specific areas of sustainability can be bridged to study the complex problems of climate change.


Post date: 17 March 2021

We are delivering a course on "Sensing the future of science"

The course helps participants develop the ability to generate research ideas with potential social and economic impact.

The course material covers the concepts of technology scouting and the exploration of information sources to find 'evidence of the future in the present'.

You can download the course material here.


Post date: 17 March 2021

Modelling for Digital Twins - Potential Role of Surrogate Models

The application of white box models in digital twins is often hindered by missing knowledge, uncertain information and computational difficulties. Our aim was to review the difficulties and challenges regarding the modelling aspects of digital twin applications and to explore the fields where surrogate models can be utilised advantageously. In this sense, the paper discusses what types of surrogate models are suitable for different practical problems and introduces the appropriate techniques for building and using these models. A number of examples of digital twin applications from both continuous processes and discrete manufacturing are presented to underline the potential of utilising surrogate models. The surrogate models and model-building methods are categorised according to the area of application. The importance of keeping these models up to date throughout their whole model life cycle is also highlighted. An industrial case study is also presented to demonstrate the applicability of the concept.


Post date: 7 March 2021

Integrated Survival Analysis and Frequent Pattern Mining for Course Failure-Based Prediction of Student Dropout

A data-driven method is proposed to identify frequent sets of course failures that students should avoid in order to minimize the likelihood of dropping out of their university training. The overall probability distribution of dropout is determined by survival analysis. This result can only describe the mean dropout rate of the undergraduates. However, due to failures in different courses, the chances of dropout can vary greatly, so the traditional survival model is extended with event analysis. The study paths of students are represented as events relating to the lack of completion of the required subjects in each semester. Frequent patterns of backlogs are discovered by mining frequent sets of these events. The prediction of dropout is personalised by classifying the success of the transitions between semesters. Based on the explored frequent item sets and classifiers, association rules are formed, providing estimates of the success of the continuation of the studies in the form of confidence metrics. The results can be used to identify critical study paths and courses. Furthermore, based on the patterns of individual uncompleted subjects, the method is suitable for predicting the chance of continuation in every semester. The analysis of the critical study paths can be used to design personalised actions minimizing the risk of dropout, or to redesign the curriculum with the aim of reducing the dropout rate. The applicability of the method is demonstrated by the analysis of the progress of chemical engineering students at the University of Pannonia in Hungary. The method is suitable for the examination of more general problems, assuming the occurrence of a set of events whose combinations may trigger a set of critical events.
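The survival-analysis starting point, the overall dropout curve, can be estimated with a plain Kaplan-Meier estimator. The sketch below uses invented toy data (eight hypothetical students); the paper's actual cohort, censoring scheme and the event-analysis extension are not reproduced here.

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier estimate of the survival (staying-enrolled) curve.

    `durations[i]` is the number of semesters student i was followed;
    `observed[i]` is True if a dropout happened then (False = censored,
    e.g. the student graduated or is still enrolled).
    """
    event_times = sorted({d for d, o in zip(durations, observed) if o})
    surv, curve = 1.0, {}
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        dropouts = sum(1 for d, o in zip(durations, observed) if d == t and o)
        surv *= 1 - dropouts / at_risk  # product-limit update
        curve[t] = surv
    return curve

# Toy data: semesters followed and whether a dropout occurred then.
durations = [2, 3, 3, 4, 5, 6, 6, 7]
observed = [True, True, False, True, False, True, False, False]
curve = kaplan_meier(durations, observed)
```

The curve gives the mean dropout behaviour; the paper's contribution is to refine this single curve with course-failure patterns so that the risk becomes path-specific.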


Post date: 24 February 2021

Frequent Itemset Mining and Multi-Layer Network-Based Analysis of RDF Databases

Triplestores or resource description framework (RDF) stores are purpose-built databases used to organise, store and share data with context. Knowledge extraction from a large amount of interconnected data requires effective tools and methods to address the complexity and the underlying structure of semantic information. We propose a method that generates an interpretable multilayered network from an RDF database. The method utilises frequent itemset mining (FIM) of the subjects, predicates and objects of the RDF data, and automatically extracts informative subsets of the database for the analysis. The results are used to form layers in an analysable multidimensional network. The methodology enables consistent, transparent, multi-aspect-oriented knowledge extraction from the linked dataset. To demonstrate the usability and effectiveness of the methodology, we analyse how the science of sustainability and climate change is structured using the Microsoft Academic Knowledge Graph. In the case study, the FIM forms networks of disciplines to reveal the significant interdisciplinary science communities in sustainability and climate change. The constructed multilayer network then enables an analysis of the significant disciplines and interdisciplinary scientific areas. To demonstrate the proposed knowledge extraction process, we search for interdisciplinary science communities and then measure and rank their multidisciplinary effects. The analysis identifies discipline similarities, pinpointing the similarity between atmospheric science and meteorology as well as between geomorphology and oceanography. The results confirm that frequent itemset mining provides informative sampled subsets of RDF databases which can be simultaneously analysed as layers of a multilayer network.
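The FIM step can be illustrated with a minimal level-wise Apriori. This is a generic sketch, not the paper's implementation: the paper mines RDF subjects, predicates and objects, while here plain string items and an invented four-transaction dataset stand in.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all itemsets appearing in at least `min_support` transactions.

    A plain level-wise Apriori: frequent k-itemsets are joined into
    (k+1)-candidates, which are then filtered by support.
    """
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = sorted({i for t in transactions for i in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    while k_sets:
        k_sets = [s for s in k_sets if support(s) >= min_support]
        frequent.update({s: support(s) for s in k_sets})
        # Candidate generation: unions of frequent k-sets one item larger.
        k_sets = list({a | b for a, b in combinations(k_sets, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

# Toy "transactions" (e.g., discipline tags co-occurring on papers).
transactions = [{"climate", "ocean"}, {"climate", "ocean", "policy"},
                {"climate", "policy"}, {"ocean"}]
freq = apriori(transactions, min_support=2)
```

Each frequent itemset found this way would seed one layer of the multilayer network, with the support values weighting the layer's edges.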


Post date: 23 February 2021

Industry 4.0-Driven Development of Optimization Algorithms: A Systematic Overview

The Fourth Industrial Revolution means the digital transformation of production systems. Cyber-physical systems allow for the horizontal and vertical integration of these production systems as well as the exploitation of the benefits via optimization tools. This article reviews the impact of Industry 4.0 solutions concerning optimization tasks and optimization algorithms, in addition to the identification of the new R&D directions driven by new application options. The basic organizing principle of this overview of the literature is to explore the requirements of optimization tasks, which are needed to perform horizontal and vertical integration. This systematic review presents content from 900 articles on Industry 4.0 and optimization as well as 388 articles on Industry 4.0 and scheduling. It is our hope that this work can serve as a starting point for researchers and developers in the field. 


Post date: 13 February 2021

The intertwining of world news with Sustainable Development Goals: An effective monitoring tool

This study brings a novel approach to the analysis of the Sustainable Development Goals (SDGs) based solely on the appearance of news. Our purpose is to provide a monitoring tool that enables world news to be detected in an SDG-oriented manner, with multilingual as well as wide geographic coverage. The association of the goals with news is based on the World Bank Group Topical Taxonomy, from which search words were selected to approximate the 17 development goals. News is extracted from the GDELT Project (Global Database of Events, Language and Tone), which gathers both printed and online news from around the world. 60 851 572 relevant news stories were identified in 2019. The intertwining of world news with the SDGs as well as the connections between countries are interpreted, highlighting that even in the most SDG-sensitive countries, only 2.5% of the news can be attributed to the goals. Most of the news about sustainability appears in Africa as well as East and Southeast Asia; moreover, the most negative tone of news is typically observed in Africa. In the case of climate change (SDG 13), the United States plays a key role in both the share of news and the negative tone. Using the tools of network science, it can be verified that the SDGs can be characterized on the basis of world news.

This news-centred network analysis of SDGs identifies global partnerships as well as national stages of implementation towards a sustainable socio-environmental ecosystem. In the field of sustainability, it is vital to form the attitudes and environmental awareness of people, which strategic plans cannot address but can be measured well through the news.


Post date: 05 February 2021

The impact of the fourth industrial revolution on the turnover of competencies

Continuous change can be observed in many areas of life, particularly in connection with the fourth industrial revolution currently unfolding in practice. Through its technological innovations, Industry 4.0 is significantly transforming the labour market and workplaces. Adapting to the ongoing and expected changes is therefore unavoidable, yet it is difficult to say which competencies will be needed for this in the future. The aim of the research is to determine, while identifying Industry 4.0 solutions, the change in competency needs, based on structured interviews conducted with the companies examined. The research highlights that the turnover of competencies and the related development processes have already begun. Hopefully, the relationships explored will inspire further research, provide guidance for developing and renewing training programmes for employees, and serve as useful information in the fields of HRM and Industry 4.0, supporting the design and implementation of competency- and HR-development strategies.


Post date: 12 January 2021

László Nagy took second place in the IEEE HS Student Paper Contest

László Nagy took second place in the IEEE HS Student Paper Contest with the article entitled: "Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing".


Post date: 12 January 2021

Estimation of machine setup and changeover times by survival analysis

The losses associated with changeovers are becoming more significant in manufacturing due to the high variance of products and requirements for just-in-time production. The study is based on the single minute exchange of die (SMED) philosophy, which aims to reduce changeover times. We introduced a method for the analysis of these losses based on models that estimate the product- and operator-dependent changeover times using survival analysis. The root causes of the losses are identified by significance tests of the utilized Cox regression models. The resulting models can be used to design a performance management system that considers the stochastic nature of the work of the operators. An anonymized manufacturing example related to the setup of crimping and wire cutting machines demonstrates the applicability of the method. 


Post date: 23 December 2020

János Abonyi presented on the applicability of data science and machine learning in water management at the National Conference on Water Value and Digital Water Management

The virtual meeting was held on 2-3 December 2020, organized by MaSzeSz (the Hungarian Water and Wastewater Association).

The conference discussed the real value of sustainable water utility services, knowledge-based management, reducing the large cost gap between small and large settlements, increasing the value and social recognition of water services, the international value of the domestic water industry, digital data and information management in municipal water management, and the professionals of the future.

More details about the conference are available here.


Post date: 11 December 2020

Machine Learning Based Analysis of Human Serum N-glycome Alterations to Follow up Lung Tumor Surgery

The human serum N-glycome is a valuable source of biomarkers for malignant diseases, already utilized in multiple studies. In this paper, the N-glycosylation changes in human serum proteins were analyzed after surgical lung tumor resection. Seventeen lung cancer patients were involved in this study, and the N-glycosylation pattern of their serum samples was analyzed before and after the surgery using capillary electrophoresis separation with laser-induced fluorescent detection. The relative peak areas of 21 N-glycans were evaluated from the acquired electropherograms using machine learning-based data analysis. Individual glycans as well as their subclasses were taken into account during the course of the evaluation. For the data analysis, both discrete (e.g., smoker or not) and continuous (e.g., age of the patient) clinical parameters were compared against the alterations in these 21 N-linked carbohydrate structures. The classification tree analysis resulted in a panel of N-glycans which could be used to follow up on the effects of surgical lung tumor resection.


Post date: 09 December 2020

Application possibilities of the data science toolset in identifying and managing the challenges of climate change

Walking through the most important questions in the field of sustainability science, we show what research and development activities will be needed in the future so that the data science toolset can effectively support the understanding and management of the complex problems of climate change. Currently successful data-driven applications were reviewed with the help of keyword analysis. Our analysis illustrated that Big Data is an increasingly widely used tool in climate science, yet there are few truly comprehensive, integrative analyses that actually exploit the advantages of this technology. With this study, we wish to draw attention to the System of Systems (SoS) principle, since the drivers and impacts of climate change can only be recognized, and the impacts adapted to and withstood, if the synergies between new research directions are identified and explored in time.

Abonyi János, Czvetkó Tímea, Sebestyén Viktor. "Az adattudomány eszköztárának alkalmazási lehetőségei a klímaváltozás kihívásainak azonosításában és kezelésében." MKL, 2020


Post date: December 2020

Real-Time Locating System in Production Management 

Real-time monitoring and optimization of production and logistics processes significantly improve the efficiency of production systems. Advanced production management solutions require real-time information about the status of products, production, and resources. As real-time locating systems (also referred to as indoor positioning systems) can enrich the available information, these systems started to gain attention in industrial environments in recent years. This paper provides a review of the possible technologies and applications related to production control and logistics, quality management, safety, and efficiency monitoring. This work also provides a workflow to clarify the steps of a typical real-time locating system project, including the cleaning, pre-processing, and analysis of the data, to provide a guideline and reference for the research and development of indoor positioning-based manufacturing solutions.


Post date: 26 November 2020

Interview with János Abonyi in the history journal Sic Itur ad Astra

Its topic is the network as metaphor and model: the expanded toolset of the social sciences.

A conversation with network researchers János Abonyi and Balázs Lengyel about current trends in network theory.

The interview is available at this link.


Post date: 22 November 2020

Data describing the regional Industry 4.0 readiness index

The data article presents a dataset suitable for measuring regional Industry 4.0 (I4.0+) readiness. The I4.0+ dataset includes 101 indicators with 248 958 observations, aggregated to the NUTS 2 statistical level, based on open data in the fields of education (ETER, Erasmus), science (USPTO, MA-Graph, GRID), government (Eurostat) and media coverage (GDELT). The indicators cover the I4.0-specific domains of higher education and lifelong learning, innovation, technological investment, the labour market and technological readiness. A composite indicator, the I4.0+ index, was constructed by the Promethee method to rank regions regarding their I4.0 performance. The index is validated against economic (GDP) and innovation indexes (Regional Innovation Index).

Data accessibility


Post date: 27 October 2020

Techtogether engineering competition

The Techtogether engineering competition was held at the Industry Days 2020 event, where the research group proposed a task for the students to solve. Students had to present exciting industrial solutions on the topic of 'Improving the digital twin of production systems and increasing efficiency based on data analysis'.


Post date: 26 October 2020

Technology meetup 06.10.2020 - 18:00

Why is it worth going back to school? What can those who apply for our latest trainings learn?

These questions were answered at the Technology Meetup held in Veszprém on 6 October. Gyula Dörgő and Tamás Ruppert presented the work of the research group and introduced the Industry 4.0 engineering training and the Automotive Quality Academy to the audience.


Post date: 26 October 2020

Decision trees for informative process alarm definition and alarm-based fault classification

Alarm messages in industrial processes are designed to draw attention to abnormalities that require timely assessment or intervention. However, in practice, alarms are arbitrarily and excessively defined by process operators, resulting in numerous nuisance and chattering alarms that are simply a source of distraction. Countless techniques are available for the retrospective filtering of alarm data, e.g., adding time delays and deadbands to existing alarm settings. As an alternative, in the present paper, instead of filtering or modifying existing alarms, a method is proposed for the design of alarm messages that are informative for fault detection, taking into consideration that the generated alarm messages should be optimal for fault detection and identification in the first place. This methodology utilizes a machine learning technique, the decision tree classifier, which provides linguistically well-interpretable models without the modification of the measured process variables. Furthermore, an online application of the defined alarm messages for fault identification is presented using a sliding window-based data preprocessing approach. The effectiveness of the proposed methodology is demonstrated through the analysis of a well-known benchmark simulator of a vinyl-acetate production technology, where the complexity of the simulator is considered sufficient for the testing of alarm systems.

Note to practitioners: Process-specific knowledge can be used to label historical process data as normal operating and fault-specific periods. Alarm generation should be designed to be able to detect and isolate faulty states. Using decision trees, optimal "cuts" or alarm limits for the purpose of fault classification can be defined utilizing a labelled dataset. The results apply to a variety of industries operating with online control systems and are especially timely in the chemical industry.
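The one-variable core of such a decision-tree "cut" can be sketched directly: given labelled data, find the single alarm limit that best separates normal from faulty samples. A full decision tree repeats this greedy split recursively over many variables; the temperature readings and fault labels below are toy values, not plant data.

```python
def best_alarm_limit(values, faulty):
    """Pick the alarm limit on one process variable that best separates
    labelled normal / faulty samples (alarm fires for values above it).

    Returns the threshold with the fewest misclassifications, i.e. the
    single-variable decision-tree cut on a labelled dataset.
    """
    candidates = sorted(set(values))
    best_t, best_err = None, float("inf")
    for lo, hi in zip(candidates, candidates[1:]):
        t = (lo + hi) / 2  # split midway between consecutive observations
        errors = sum((v > t) != f for v, f in zip(values, faulty))
        if errors < best_err:
            best_t, best_err = t, errors
    return best_t, best_err

# Toy temperature readings labelled by a known faulty period.
temps = [71, 72, 73, 74, 78, 79, 80, 81]
faulty = [False, False, False, False, True, True, True, True]
limit, errs = best_alarm_limit(temps, faulty)
```

Choosing the limit this way makes each alarm informative by construction, instead of filtering nuisance alarms after the fact.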


Post date: 22 October 2020

Local newspaper reported about our professional engineer training in Industry 4.0 and the Automotive Quality Academy 

Link to the Industry 4.0 Professional Engineer Training website

Link to the Automotive Quality Academy website


Post date: 22 October 2020

Directions of membrane separator development for microbial fuel cells: A retrospective analysis using frequent itemset mining and descriptive statistical approach

To increase the efficiency of microbial fuel cells (MFCs), the separator (which is mostly a membrane) placed between the electrodes or their compartments is considered of high importance, besides several other biotic and abiotic factors (e.g. configuration, mode of operation, types of inoculum and substrate). Nafion-based proton exchange membranes (PEMs) are the most widespread, although these materials are often criticized on various technological and economical grounds. Therefore, to find alternatives to Nafion, the synthesis, development and testing of novel and commercialized membrane separators with enhanced characteristics have been hot topics. In this study, the goals were to assess membrane-installed MFCs in a retrospective manner and reveal the trends, applied practices, frequent setups, etc. via Bayesian classification and frequent itemset mining algorithms. Thereafter, a separate discussion was devoted to examining the current standing of research related to the major membrane groups used in MFCs and to evaluating, in view of the big picture, how the various systems behave in comparison with each other, especially compared to those applying Nafion PEMs. It was concluded that some membrane types seem to be competitive with Nafion; however, standardization of the experiments would allow a more unambiguous comparison of studies.


Post date: 21 October 2020

Janos Abonyi was invited to join as a program committee member at Evolutionary Multi-Criterion Optimization (EMO) 2021 conference

Janos Abonyi serves as a program committee member at the 11th International Conference on Evolutionary Multi-Criterion Optimization (EMO).

The conference aims to bring together the EMO and Multiple Criteria Decision-Making (MCDM) communities and other related fields, with a focus on solving real-world problems in government, business and industry.

The conference will be held as a hybrid conference on March 28-31, 2021 in Shenzhen and on-line.

Link to conference website


Post date: 12 October 2020

Integration of real-time locating systems into digital twins

Cyber-physical model-based solutions should rely on digital twins in which simulations are integrated with real-time sensory and manufacturing data. This paper highlights the benefits of information fusion with real-time locating systems (RTLS) and demonstrates how position and acceleration data can be utilised for the simulation-based analysis of product-specific activity times. The proposed digital twin is capable of continuously predicting the production status and providing information for the monitoring of production performance, thanks to the real-time connection of the RTLS and adaptive simulation models. The presented industrial case study demonstrates how the resulting Simulation 4.0 concept supports the analysis of human resource effectiveness (HRE) in an assembly process.


Post date:  06 October 2020

Are Regions Prepared for Industry 4.0? The Industry 4.0+ Indicator System for Assessment

The concept of Industry 4.0 is spreading worldwide, and readiness models exist to determine organizational or national maturity. The regional perspective of the digital transformation, however, is yet to be widely researched, although it significantly determines how the concept of Industry 4.0 can be introduced to organisations. This book identifies the regional aspects of Industry 4.0 and provides a regional (NUTS 2-classified) Industry 4.0 indicator system model based on open data sources. The new model serves as a tool to evaluate regional economies and support governmental decisions. It also provides territorial councils with a decision-support tool for field investment decisions. Finally, it offers investors a heat map to evaluate how successfully regional economies implement Industry 4.0 solutions.

J.  Abonyi, T. Czvetko, G. Honti, Are Regions Prepared for Industry 4.0? The Industry 4.0+ Indicator System for Assessment, SpringerBriefs in Entrepreneurship and Innovation 


Post date: 23 September 2020

Development of manufacturing execution systems in accordance with Industry 4.0 requirements: A review of standard- and ontology-based methodologies and tools

This work presents how recent trends in Industry 4.0 (I4.0) solutions are influencing the development of manufacturing execution systems (MESs) and analyzes what kinds of trends will determine the development of the next generation of these technologies. This systematic and thematic review provides a detailed analysis of I4.0-related requirements in terms of MES functionalities and an overview of MES development methods and standards because these three aspects are essential in developing MESs. The analysis highlights that MESs should interconnect all components of cyber-physical systems in a seamless, secure, and trustworthy manner to enable high-level automated smart solutions and that semantic metadata can provide contextual information to support interoperability and modular development. The observed trends show that formal models and ontologies will play an even more essential role in I4.0 systems as interoperability becomes more of a focus and that the new generation of linkable data sources should be based on semantically enriched information. The presented overview can serve as a guide for engineers interested in the development of MESs as well as for researchers interested in finding worthwhile areas of research. 


Post date: 01 September 2020

Pairwise comparison based Failure Mode and Effects Analysis (FMEA)

The proposed method supports the determination of the severity (S), occurrence (O), and detection (D) indices of Failure Mode and Effects Analysis (FMEA). Previously evaluated and not yet studied risks are compared pairwise. The analysis of the resulting pairwise comparison matrix provides information about the consistency of the risk evaluations and allows the estimation of the indices of the risks not yet evaluated. The advantages of the method include:
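The consistency check at the heart of such pairwise comparisons can be illustrated with a standard AHP-style eigenvector calculation; the 3x3 matrix below is a hypothetical example, not taken from the paper:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the severity (S) index:
# A[i, j] states how many times risk i is judged more severe than risk j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

# The principal eigenvector gives the relative severity weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); the evaluation is
# usually accepted when the consistency ratio CR = CI / RI is below 0.1.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.58            # Saaty's random index for n = 3
CR = CI / RI
```

A high CR would signal that the expert's judgements contradict each other and should be revised before the missing S, O, or D indices are estimated from them.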


Post date: 01 August 2020

Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing

Assembly line balancing improves the efficiency of production systems by the optimal assignment of tasks to operators. The optimisation of this assignment requires models that provide information about the activity times, constraints and costs of the assignments. A multilayer network-based representation of the assembly line-balancing problem is proposed, in which the layers of the network represent the skills of the operators, the tools required for their activities and the precedence constraints of their activities. The activity–operator network layer is designed by a multi-objective optimisation algorithm in which the training and equipment costs as well as the precedence of the activities are also taken into account. As these costs are difficult to evaluate, the analytic hierarchy process (AHP) technique is used to quantify the importance of the criteria. The optimisation problem is solved by a multi-level simulated annealing algorithm (SA) that efficiently handles the precedence constraints. The efficiency of the method is demonstrated by a case study from wire harness manufacturing. 


Post date: 05 June 2020

We were featured in Innotéka magazine!

A portrait of Dr. János Abonyi, entitled "Vonzódni az ismeretlenhez" ("Drawn to the Unknown"), appeared in the May issue of Innotéka magazine.

The article is available at the following link:

Link to the article

The May issue is available at the following link:

Link to the May issue


Post date: 11 May 2020

Multilayer network based comparative document analysis (MUNCoDA)

The proposed multilayer network-based comparative document analysis (MUNCoDA) method supports the identification of the common points of a set of documents, which deal with the same subject area. As documents are transformed into networks of informative word-pairs, the collection of documents form a multilayer network that allows the comparative evaluation of the texts. The multilayer network can be visualized and analyzed to highlight how the texts are structured. The topics of the documents can be clustered based on the developed similarity measures. By exploring the network centralities, topic importance values can be assigned. The method is fully automated by KNIME preprocessing tools and MATLAB/Octave code.

•Networks can be formed based on informative word pairs of multiple documents

•The analysis of the proposed multilayer networks provides information for multi-document summarization

•Words and documents can be clustered based on node similarity and edge overlap measures
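The word-pair network and edge-overlap ideas can be sketched in a few lines; this is a simplified stand-in for the KNIME/MATLAB pipeline, and the windowing rule and toy documents are illustrative assumptions:

```python
from itertools import combinations

def word_pair_edges(text, window=2):
    """Return the set of co-occurring word pairs (edges) of a document.
    Words falling inside a sliding window form an undirected edge."""
    words = [w.lower().strip('.,') for w in text.split()]
    edges = set()
    for i in range(len(words) - window + 1):
        for a, b in combinations(words[i:i + window], 2):
            if a != b:
                edges.add(tuple(sorted((a, b))))
    return edges

def edge_overlap(e1, e2):
    """Jaccard similarity of two documents' edge sets."""
    return len(e1 & e2) / len(e1 | e2) if e1 | e2 else 0.0

doc_a = "sustainable development goals require sustainable policies"
doc_b = "sustainable development needs coherent policies"
sim = edge_overlap(word_pair_edges(doc_a), word_pair_edges(doc_b))
```

Stacking one such edge set per document yields the multilayer network, and the pairwise overlap values provide the similarity measure used for clustering the topics.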

V. Sebestyén, E. Domokos, J. Abonyi : Multilayer network based comparative document analysis (MUNCoDA), MethodsX, Volume 7, 2020


Post date: 29 April 2020

Focal points for sustainable development strategies—Text mining-based comparative analysis of voluntary national reviews has been published!

Post date: March 11, 2020 10:00:00 AM

Countries have to work out and follow tailored strategies for the achievement of their Sustainable Development Goals. At the end of 2018, more than 100 voluntary national reviews (VNRs) had been published. The reviews are transformed by text mining algorithms into networks of keywords to identify country-specific thematic areas of the strategies and to cluster countries that face similar problems and follow similar development strategies. The analysis of the 75 VNRs has shown that SDG5 (gender equality) is the most discussed goal worldwide, as it is discussed in 77% of the analysed reviews. SDG8 (decent work and economic growth) is the second most studied goal, with 76%, while SDG1 (no poverty) is the least focused goal, mentioned in only 48% of the documents, and SDG10 (reduced inequalities) in 49%. The results demonstrate that the proposed benchmark tool is capable of highlighting what kinds of activities can make significant contributions to achieving sustainable development.

Prof. Janos Abonyi was invited to join the program committee of 7th edition of the International conference on Time Series and Forecasting (ITISE 2020) 

Post date: Feb 22, 2020 09:00:00 AM

The ITISE 2020 (7th International Conference on Time Series and Forecasting) seeks to provide a discussion forum for scientists, engineers, educators and students about the latest ideas and realizations in the foundations, theory, models and applications of interdisciplinary and multidisciplinary research, encompassing computer science, mathematics, statistics, forecasting, econometrics, etc., in the field of time series analysis and forecasting.

The aim of ITISE 2020 is to create a friendly environment that could lead to the establishment or strengthening of scientific collaborations and exchanges among attendees. Therefore, ITISE 2020 solicits high-quality original research papers (including significant work in progress) on any aspect of time series analysis and forecasting, in order to motivate the generation and use of knowledge and of new computational techniques and methods for forecasting in a wide range of fields.

Link to the conference website

Post date: Feb 16, 2020 10:00:00 PM

The school is organized at the University of Catania, Italy, by the Department of Electrical, Electronics and Computer Science and the Cometa Consortium.

It consists of a series of lectures given by leading scientists in the field, aiming to provide a comprehensive treatment from background material to advanced results. The school is especially directed at PhD students and young researchers interested in the diverse aspects of the theory and applications of complex networks in science and engineering. It aims to encourage cross-disciplinary discussions between participants and speakers and to initiate new joint research.

The Assembly magazine reported on our methodology developed for activity time monitoring:

Post date: Feb 13, 2020 6:00:00 PM

Industry 4.0 and the digital manufacturing revolution are all about collecting—and, more importantly, acting on—data gathered from the assembly process in real time. That’s all well and good when data is coming from sensors, vision systems, fastening tools and other electronic devices. But, how can engineers gather real-time data on largely manual assembly processes, such as wire harness assembly?

To solve this problem, we developed a software- and sensor-based system to measure activity times and performance on a wire harness assembly line. To ensure a real-time connection between assembler performance and varying product complexity, our system relies on fixture sensors and an indoor positioning system (IPS). Our goal was to create a system that could continuously estimate the time consumed by the various elementary activities that make up wire harness assembly. Our system creates a model of a task, compares estimated activity times to the actual performance of assemblers, and generates early warnings when their productivity decreases. 

J. Abonyi, T. Ruppert, Monitoring Activity During Wire Harness Assembly, Assembly, 2020

Mixtures of QSAR Models – Learning Application Domains of pKa Predictors has been published!

Post date: Feb 11, 2020 5:00:00 PM

Quantitative structure-activity relationship (QSAR) models predict physical properties or biological effects based on physicochemical properties or molecular descriptors of chemical structures. Our work focuses on the construction of optimal linear and nonlinear weighted mixtures of individual QSAR models to achieve more accurate predictions. We highlight how splitting the application domain by a nonlinear gating network in a "mixture of experts" model structure is suitable for determining the optimal domain-specific QSAR model, and how the optimal QSAR model for certain chemical groups can be identified. The input of the gating network is formed by the various molecular structure descriptors and/or even the predictions of the individual QSAR models. The applicability of the method is demonstrated on the pKa values of the OASIS database (1912 chemicals) by combining four acidic pKa predictions of the OECD QSAR Toolbox. According to the results, the prediction performance (RMSE) was enhanced by more than 15% compared to the predictions of the best individual QSAR model.
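The gating idea can be sketched as a softmax network that weights the individual predictors per molecule; the descriptors, gating weights and pKa predictions below are made up for illustration and do not reproduce the paper's fitted model:

```python
import numpy as np

def gate_weights(x, W, b):
    """Softmax gating network: maps a descriptor vector x to expert weights."""
    z = W @ x + b
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def mixture_prediction(x, expert_preds, W, b):
    """Convex combination of the individual model predictions."""
    return float(gate_weights(x, W, b) @ expert_preds)

# Hypothetical setup: 4 pKa predictors, 3 molecular descriptors.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(4, 3)), np.zeros(4)
x = np.array([0.2, 1.5, -0.3])          # descriptors of one molecule
preds = np.array([4.1, 4.6, 3.9, 4.3])  # the four individual pKa predictions
p = mixture_prediction(x, preds, W, b)
```

Because the gating weights sum to one, the combined prediction always stays within the range spanned by the individual predictors, while different regions of descriptor space favour different experts.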

J. Abonyi, T. Varga, O. P. Hamadi, Gy. Dorgo, Mixtures of QSAR Models – Learning Application Domains of pKa Predictors, Journal of Chemometrics, 2020

Janos Abonyi invited to serve as a Committee member!

Post date: Feb 10, 2020 9:00:00 PM

Dr. Janos Abonyi has been invited to serve as a Technical Program Committee member of the IEEE Wireless Africa 2020 conference.

IEEE Wireless Africa 2020 is sponsored by the IEEE Vehicular Technology Society and will be hosted in South Africa on 29-30 November 2020.

The conference aims to provide a platform for wireless researchers to share their results, call for comments and collaborations, and exchange innovative ideas on leading edge research in wireless technologies.

Link to the conference website

Link to the precise track

A multilayer and spatial description of the Erasmus mobility network has been published!

Post date: Feb 6, 2020 4:00:00 PM

The Erasmus Programme is the biggest collaboration network consisting of European Higher Education Institutions (HEIs). The flows of students, teachers and staff form directed and weighted networks that connect institutions, regions and countries. Here, we present a linked and manually verified dataset of this multiplex, multipartite, multi-labelled, spatial network. We enriched the network with institutional socio-economic data from the European Tertiary Education Register (ETER) and the Global Research Identifier Database (GRID). We geocoded the headquarters of institutions and characterised the attractiveness and quality of their environments based on Points of Interest (POI) data. The linked datasets provide relevant information to grasp a more comprehensive understanding of the mobility patterns and attractiveness of the institutions. 

J. Abonyi, L. Gadár, Zs. T. Kosztyán, A. Telcs, A multilayer and spatial description of the Erasmus mobility network, Scientific Data 7, Article 41, 2020 

Conference presentation

Tamás Ruppert and Róbert Csalódi attended the 7th International Conference on Industrial Engineering and Applications (ICIEA) in Paris from 15 to 17 January 2020. Their presentations were rated by the judges as the best in their respective sessions.

Webpage of the conference

Network-Based Analysis of Dynamical Systems has been published!

Post date: Jan 15, 2020 6:00:00 PM

The key idea of this book is that the dynamical properties of complex systems can be determined by the effective calculation of specific structural features by network science-based analysis. Furthermore, certain dynamical behaviours can originate from the existence of specific motifs in the network representation or can be determined by network segmentation. Although the applicability of the methods and results was highlighted in the operability analysis of dynamical systems, the proposed methods can be utilised in various fields that will be mentioned at the end of each chapter.

J. Abonyi, Dániel Leitold, Ágnes Vathy-Fogarassy, Network-Based Analysis of Dynamical Systems, SpringerBriefs in Computer Science

Fuzzy activity time-based model predictive control of open station assembly lines is published!

Post date: Dec 14, 2019 3:00:00 PM

The sequencing and line balancing of manual mixed-model assembly lines are challenging tasks due to the complexity and uncertainty of operator activities. The control of cycle time and the sequencing of production can mitigate the losses due to non-optimal line balancing in the case of open-station production, where the operators can work ahead of schedule and try to reduce their backlog. The objective of this paper is to provide a cycle time control algorithm that can improve the efficiency of assembly lines in such situations based on a special mixed sequencing strategy. To handle the uncertainty of activity times, a fuzzy model-based solution has been developed. As the production process is modular, the fuzzy sets represent the uncertainty of the elementary activity times related to the processing of the modules. The optimistic and pessimistic estimates of the activity completion times extracted from the fuzzy model are incorporated into a model predictive control algorithm to ensure the constrained optimization of the cycle time. The applicability of the proposed method is demonstrated based on a wire-harness manufacturing process with a paced conveyor, but the proposed algorithm can handle continuous conveyors as well. The results confirm that the proposed algorithm is widely applicable in cases where a production line of a supply chain is not well balanced and the activity times are uncertain.
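The optimistic and pessimistic activity-time estimates can be read off a triangular fuzzy number via its alpha-cut; a minimal sketch, where the activity-time numbers are hypothetical:

```python
def alpha_cut(tfn, alpha):
    """Optimistic/pessimistic bounds of a triangular fuzzy activity time
    (a, m, b) = (minimum, most likely, maximum) at confidence level alpha."""
    a, m, b = tfn
    lo = a + alpha * (m - a)   # optimistic estimate
    hi = b - alpha * (b - m)   # pessimistic estimate
    return lo, hi

# Hypothetical module activity time: at least 40 s, most likely 50 s, at most 70 s.
lo, hi = alpha_cut((40.0, 50.0, 70.0), alpha=0.8)
```

Bounds of this kind, one pair per module, are the sort of interval estimates that a model predictive controller can use as constraints when optimizing the cycle time.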

J. Abonyi, Tamas Ruppert, Gyula Dorgo, Fuzzy activity time-based model predictive control of open-station assembly lines, Journal of Manufacturing Systems Volume 54, January 2020, Pages 12-23

Network analysis dataset of System Dynamics models is published!

Post date: Nov 01, 2019 3:00:00 PM

This paper presents a tool developed for the analysis of networks extracted from system dynamics models. The developed tool and the collected models were used and analyzed in the research paper "Review and structural analysis of system dynamics models in sustainability science". The models developed in Vensim, Stella, and InsightMaker are converted into networks of state variables, flows, and parameters by the developed Python program, which also performs model reduction and modularity analysis and calculates the structural properties of the models and their main variables. The dataset covers the results of the analysis of nine models in sustainability science used for policy testing, prediction and simulation.

Honti G., Dorgo Gy., Abonyi J.: "Network analysis dataset of System Dynamics models", Data in Brief, 2019, Paper 104723

Constrained Recursive Input Estimation of Blending and Mixing Systems is published!

Post date: Oct 23, 2019 6:00:00 PM

Blending and mixing processes are often supported by advanced process control systems to maximise margins from the available component and heat streams. Since these model-based solutions require accurate and reliable data, in weakly instrumented processes the unknown inlet concentrations and temperatures should be estimated based on the measured outflows. This work presents a method for the reliable estimation of multiple input variables of process units. The key idea is that the input estimation problem is formulated as a constrained recursive estimation task. The applicability of the method is illustrated on a benchmark model of a blending system, and its performance is compared to moving window and Kalman filter-based solutions. The results show the superior performance of the proposed method and confirm that the a priori knowledge-based constraints improve the robustness of the estimates.
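The constrained recursive estimation idea can be sketched as recursive least squares with a projection step that keeps the estimate within a priori bounds; this is a scalar toy problem with invented numbers, not the paper's benchmark model:

```python
import numpy as np

def constrained_rls(phis, ys, theta0, P0, bounds, lam=0.99):
    """Recursive least squares with a projection step enforcing
    a priori bounds on the estimated input (e.g. 0 <= concentration <= 1)."""
    theta, P = float(theta0), float(P0)
    lo, hi = bounds
    for phi, y in zip(phis, ys):
        k = P * phi / (lam + phi * P * phi)   # gain
        theta += k * (y - phi * theta)        # measurement update
        theta = min(max(theta, lo), hi)       # projection: keep estimate feasible
        P = (P - k * phi * P) / lam           # covariance update with forgetting
    return theta

# Hypothetical blending measurement model: y = phi * c_in + noise, true c_in = 0.7.
rng = np.random.default_rng(1)
phi = rng.uniform(0.5, 1.5, 200)
y = phi * 0.7 + rng.normal(0, 0.02, 200)
c_hat = constrained_rls(phi, y, theta0=0.0, P0=100.0, bounds=(0.0, 1.0))
```

The projection is what makes the estimate robust: even with noisy or sparse measurements, physically impossible values (negative concentrations, for example) are never propagated.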

Abonyi J.: „Constrained Recursive Input Estimation of Blending and Mixing Systems”, Chemical Engineering Transactions, 76, 727-732

Introduction to Data Analysis Course

Post date: Oct 18, 2019

We are delivering a data analysis course at MOL Value Chain Academy. 

The course material is available here

Data-driven multilayer complex networks of sustainable development goals is published!

Post date: Oct 08, 2019 6:15:00 PM

This data article presents the formulation of a multilayer network for modelling the interconnections among the sustainable development goals (SDGs) and targets, and includes the correlation-based linking of the sustainable development indicators with the available long-term datasets of The World Bank (2018). The spatial distribution of the time series data allows country-specific sustainability assessments to be created. In the related research article, "Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment", the similarities of the SDGs for ten regions have been modelled in order to improve the quality of strategic environmental assessments. The datasets of the multilayer networks are available on Mendeley.

Sebestyén V., Bulla M., Rédey Á., Abonyi J.: „Data-driven multilayer complex networks of sustainable development goals”, Data in Brief, Volume 25, 2019, 104049

Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox is published!

Post date: Oct 08, 2019 6:10:00 PM

The network science-based determination of driver nodes and sensor placement has become increasingly popular in the field of dynamical systems over the last decade. In this paper, the applicability of the methodology in the field of life sciences is introduced through the analysis of the neural network of Caenorhabditis elegans. Simultaneously, an Octave and MATLAB-compatible NOCAD toolbox is proposed that provides a set of methods to automatically generate the relevant structural controllability and observability associated measures for linear or linearised systems and compare the different sensor placement methods.
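For linear or linearised systems, observability can be checked with the classical Kalman rank condition, which the toolbox's structural measures build upon; a minimal NumPy sketch on an invented three-state chain, not an example from the toolbox itself:

```python
import numpy as np

def is_observable(A, C):
    """Kalman rank condition: the pair (A, C) is observable iff the
    observability matrix [C; CA; ...; CA^(n-1)] has full rank n."""
    n = A.shape[0]
    O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
    return np.linalg.matrix_rank(O) == n

# Chain x1 -> x2 -> x3: measuring the last state observes the whole chain,
# while measuring the first state (which receives no influence) does not.
A = np.array([[0., 0., 0.],
              [1., 0., 0.],
              [0., 1., 0.]])
C = np.array([[0., 0., 1.]])
```

Sensor placement questions of the kind the toolbox answers amount to choosing the rows of C so that conditions like this one hold with as few sensors as possible.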

Leitold D., Vathy-Fogarassy Á., Abonyi J.: „Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox”, [version 2; peer review: 2 approved], F1000Research 2019, 8:646

Genetic programming-based development of thermal runaway criteria is published!

Post date: Oct 08, 2019 6:10:00 PM

Common thermal runaway criteria (e.g. divergence criterion and the Maxi criterion) may predict a thermal runaway unreasonably as the Maximum Allowable Temperature (MAT) is not taken into account. This contribution proposes a method for the goal-oriented construction of reactor runaway criteria by Genetic Programming (GP). The runaway prediction problem is formulated as a critical equation-based classification task, and GP is used to identify the optimal structure of the equations that also take into account the MAT. To demonstrate the applicability of the method, tailored criteria were developed for batch and continuous stirred-tank reactors. The resultant critical equations outperform the well-known criteria in terms of the early and accurate indication of thermal runaways.

Kummer A., Varga T., Abonyi J.: „Genetic programming-based development of thermal runaway criteria”, Computers & Chemical Engineering, 2019, 106582

Review and structural analysis of system dynamics models in sustainability science is published!

Post date: Oct 08, 2019 6:00:00 PM

As the complexity of sustainability-related problems increases, it is more and more difficult to understand the related models. Although a tremendous number of models have been published recently, their automated structural analysis is still absent. This study provides a methodology to structure and visualise the information content of these models. The novelty of the present approach is the development of a network analysis-based tool for modellers to measure the importance of variables, identify structural modules in the models and measure the complexity of the created model, thus enabling the comparison of different models. An overview of 130 system dynamics models from the past five years is provided. The typical topics and complexity of these models highlight the need for tools that support the automated structural analysis of sustainability problems. For practising engineers and analysts, nine models from the field of sustainability science, including the World3 model, are studied in detail. The results highlight that, with the help of the developed method, experts can identify the most critical variables of sustainability problems (like arable land in the World3 model) and can determine how these variables are clustered and interconnected (e.g. population and fertility are key drivers of global processes). The developed software tools and the resulting networks are all available online.

Honti G., Dörgő Gy., Abonyi J.: „Review and structural analysis of system dynamics models in sustainability science”, Journal of Cleaner Production, Volume 240, 2019, 118015

Learning and predicting operation strategies by sequence mining and deep learning (full paper) is published!

Post date: Jun 15, 2019 7:55:00 PM

The operators of chemical technologies are frequently faced with the problem of determining optimal interventions. Our aim is to develop data-driven models by exploring the consequential relationships in the alarm and event-log databases of industrial systems. Our motivation is twofold: (1) to facilitate the work of the operators by predicting future events and (2) to analyse how consistent the event series is. The core idea is that machine learning algorithms can learn sequences of events by exploring connected events in databases. First, frequent sequence mining applications are utilised to determine how the event sequences evolve during operation. Second, a sequence-to-sequence deep learning model is proposed for their prediction. The long short-term memory (LSTM) unit-based model is capable of evaluating rare operating situations and their consequential events. The performance of this methodology is presented through the analysis of the alarm and event-log database of an industrial delayed coker unit.
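The first, frequent-sequence step can be illustrated with a simple transition-counting baseline; this is a toy stand-in for the paper's mining and LSTM models, and the event log below is invented:

```python
from collections import Counter, defaultdict

def mine_transitions(event_log):
    """Count consecutive event pairs (length-2 frequent sequences)."""
    pairs = Counter(zip(event_log, event_log[1:]))
    model = defaultdict(Counter)
    for (a, b), c in pairs.items():
        model[a][b] = c
    return model

def predict_next(model, event):
    """Predict the most frequent follow-up event (a simple baseline,
    standing in for the sequence-to-sequence deep learning model)."""
    return model[event].most_common(1)[0][0] if model[event] else None

log = ["alarm_A", "ack", "alarm_A", "ack", "alarm_B",
       "shutdown", "alarm_A", "ack"]
model = mine_transitions(log)
```

The more deterministic the operators' practice, the more peaked these transition counts become, which is exactly the "consistency" of the event series that the analysis quantifies.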

Dörgő Gy., Abonyi J.: "Learning and predicting operation strategies by sequence mining and deep learning", Computers & Chemical Engineering, Volume 128, 2 September 2019, Pages 174-187

A Review of Semantic Sensor Technologies in Internet of Things Architectures is published!

Post date: Jun 15, 2019 7:40:00 PM

Intelligent sensors should be interconnected seamlessly, securely, and in a trustworthy manner to enable automated high-level smart applications. Semantic metadata can provide contextual information to support the accessibility of these features, making it easier for machines and humans to process the sensory data and achieve interoperability. The unique overview of sensor ontologies according to the semantic needs of the layers of IoT solutions can serve as a guideline for engineers and researchers interested in the development of intelligent sensor-based solutions. The explored trends show that ontologies will play an even more essential role in interlinked IoT systems, as interoperability and the generation of controlled linkable data sources should be based on semantically enriched sensory data.

Honti G., Abonyi J.: "A Review of Semantic Sensor Technologies in Internet of Things Architectures", Complexity, Volume 2019, Article ID 6473160, 21 pages

Operating regime model based multi-objective sensor placement for data reconciliation is published!

Post date: Jun 15, 2019 7:40:00 PM

Although the number of sensors in chemical production plants is increasing thanks to the IoT revolution, it remains a crucial problem to decide what to measure and where to place the sensors so that the resulting sensor network is robust and cost-effectively provides the required information. This problem is especially relevant in flexible multi-purpose, multi-product production plants where there are significant differences among the operating regions. The present work aims at the development of a sensor placement methodology that utilizes the advantages of local linear models. Recognising the often conflicting nature of the key objectives of sensor placement, the problem is formulated as a multi-objective optimization task that takes into consideration the cost, estimation accuracy, observability and fault detection performance of the designed networks while simultaneously seeking the optimal solutions under multiple operating regimes. The effectiveness of the Non-dominated Sorting Genetic Algorithm II (NSGA-II)-based solution of the defined problem is demonstrated through benchmark examples.
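The core of the NSGA-II ranking, non-dominated sorting, can be sketched as follows; the candidate sensor networks and their two objectives (cost, estimation error) are hypothetical:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives are minimized here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """First non-dominated front: the core ranking step of NSGA-II."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidate sensor networks as (cost, estimation error) pairs.
designs = [(3, 0.9), (5, 0.4), (7, 0.2), (6, 0.5), (4, 0.8)]
front = pareto_front(designs)
```

Instead of a single "best" network, the method delivers this trade-off front, from which the engineer picks a design once the relative weight of cost versus accuracy is known.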

Dörgő Gy., Haragovics M., Abonyi J.: "Operating regime model based multi-objective sensor placement for data reconciliation", 29th European Symposium on Computer Aided Process Engineering, Netherlands, Eindhoven, 2019 June 16-19.

Soft Sensors Special Issue 

Post date: Jun 4, 2019 8:40:00 PM

We are editing a special issue related to software sensors. Please forward this link to researchers potentially interested in submitting a paper.

Deadline for manuscript submissions: 30 September 2019


https://www.mdpi.com/journal/sensors/special_issues/Soft_Sensors 

P-graph-based multi-objective risk analysis and redundancy allocation in safety-critical energy systems is published!

Post date: May 27, 2019 8:10:00 PM

As most energy production and transformation processes are safety-critical, it is vital to develop tools that support the analysis and minimisation of their reliability-related risks. The resulting optimisation problem should reflect the structure of the process, which requires the utilisation of flexible and problem-relevant models. This paper highlights that P-graphs extended by logical condition units can be transformed into reliability block diagrams, and that based on the cut and path sets of the graph a polynomial risk model can be extracted, which opens up new opportunities for the definition of optimisation problems related to reliability redundancy allocation. A novel multi-objective optimisation-based method has been developed to evaluate the criticality of the units and subsystems. The applicability of the proposed method is demonstrated using a real-life case study related to a reforming reaction system. The results highlight that P-graphs can serve as an interface between process flow diagrams and polynomial risk models, and that the developed tool can improve the reliability of energy systems in retrofitting projects.
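How path sets yield a reliability model can be shown with a tiny exact evaluation; the redundant layout and unit reliabilities below are illustrative and not taken from the reforming case study:

```python
from itertools import product

def system_reliability(path_sets, p):
    """The system works if at least one minimal path set has all of its
    units working. Exact evaluation by enumerating unit states
    (feasible for small systems)."""
    units = sorted({u for ps in path_sets for u in ps})
    total = 0.0
    for states in product([0, 1], repeat=len(units)):
        up = {u for u, s in zip(units, states) if s}
        if any(set(ps) <= up for ps in path_sets):
            prob = 1.0
            for u, s in zip(units, states):
                prob *= p[u] if s else 1 - p[u]
            total += prob
    return total

# Hypothetical redundant layout: two parallel feed units A, B and a
# common downstream unit C, giving path sets {A, C} and {B, C}.
p = {"A": 0.9, "B": 0.9, "C": 0.95}
R = system_reliability([("A", "C"), ("B", "C")], p)
```

Expanding the same enumeration symbolically in the unit reliabilities yields the polynomial risk model that the redundancy allocation is then optimised over.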

Süle Z., Baumgartner J., Dörgő Gy., Abonyi J.: "P-graph-based multi-objective risk analysis and redundancy allocation in safety-critical energy systems", Energy (2019), vol. 179, 989-1003.

Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox is published!

Post date: May 27, 2019 8:10:00 PM

Network science has become increasingly important in life science over the last decade. The proposed Octave- and MATLAB-compatible NOCAD toolbox provides a set of methods that enables the structural controllability and observability analysis of dynamical systems. In this paper, the functionality of the toolbox is presented and the implemented functions are demonstrated.

Leitold D., Vathy-Fogarassy Á., and Abonyi J.: "Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox [version 1; peer review: awaiting peer review]", F1000Research 2019, 8:646

A new version of our toolbox is available!

https://github.com/abonyilab/NOCAD 

Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment is published!

Post date: Mar 07, 2019 2:00:00 PM

Strategic environmental assessment is a decision support technique that evaluates policies, plans and programs in addition to identifying the most appropriate interventions in different scenarios. This work develops a network-based model to study interlinked ecological, economic, environmental and social problems in order to highlight the synergies between policies, plans, and programs in environmental strategic planning. Our primary goal is to propose a methodology for the data-driven verification and extension of expert knowledge concerning the interconnectedness of the sustainable development goals and their related targets. A multilayer network model based on the time-series indicators of the World Bank open data over the last 55 years was assembled. By providing an objective and data-driven view of the correlated variables of the World Bank, the proposed layered multipartite network model highlights previously undiscussed interconnections, node centrality measures evaluate the importance of the targets, and network community detection algorithms reveal their strongly connected groups. The results confirm that the proposed methodology can serve as a data-driven decision support tool for the preparation and monitoring of long-term environmental policies. The developed data-driven network model enables multi-level analysis of sustainability (goals, targets, indicators) and makes long-term environmental strategic planning possible. Through the relationships among indicators, the relationships among targets and goals can be modelled. The results show that the sustainable development goals are strongly interconnected, while the 5th goal (gender equality) is linked mostly to the 17th goal (partnerships for the goals). The analysis has also highlighted the importance of the 4th goal (quality education).
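The correlation-based linking of indicator time series can be sketched as follows; the indicator names, the short series and the threshold are invented for illustration:

```python
import numpy as np

def correlation_network(X, names, threshold=0.8):
    """Link two indicators when the absolute correlation of their time
    series exceeds a threshold; such edges approximate the target-level
    relationships of the multilayer model."""
    R = np.corrcoef(X)
    return [(names[i], names[j], R[i, j])
            for i in range(len(names)) for j in range(i + 1, len(names))
            if abs(R[i, j]) >= threshold]

# Hypothetical yearly indicator series (one row per indicator, 6 years).
t = np.arange(6, dtype=float)
X = np.vstack([t, 2 * t + 0.1, np.array([3., 1., 4., 1., 5., 9.])])
edges = correlation_network(X, ["school_enrollment", "gdp_per_capita", "noise"])
```

Applied to the full World Bank indicator set, edges of this kind form one layer of the multipartite network, on which the centrality and community detection analyses are then run.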

Sebestyén V., Bulla M., Rédey Á., Abonyi J.: "Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment", Journal of Environmental Management, 2019, 238, 126-135

Frequent pattern mining in multidimensional organizational networks is published!

Post date: Mar 05, 2019 1:00:00 PM

Network analysis can be applied to understand organizations based on patterns of communication, knowledge flows, trust, and the proximity of employees. A multidimensional organizational network was designed, and association rule mining of the edge labels was applied to reveal how relationships, motivations, and perceptions determine each other in different scopes of activities and types of organizations. Frequent itemset-based similarity analysis of the nodes provides the opportunity to characterize typical roles in organizations and clusters of co-workers. A survey was designed to define 15 layers of the organizational network and demonstrate the applicability of the method in three companies. The novelty of our approach resides in the evaluation of people in organizations as frequent multidimensional patterns of multilayer networks. The results illustrate that the overlapping edges of the proposed multilayer network can be used to highlight the motivation and managerial capabilities of the leaders and to find similarly perceived key persons.
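A minimal sketch of the frequent itemset-based characterization of nodes described above. The layer names, the four-person toy network and the support threshold are illustrative only (the actual study uses 15 survey-defined layers):

```python
from itertools import combinations

# Each person is described by the set of network layers (edge labels)
# they participate in -- hypothetical labels, not the survey's 15 layers.
people = {
    "A": {"advice", "trust", "communication"},
    "B": {"advice", "trust"},
    "C": {"advice", "communication"},
    "D": {"trust", "communication"},
}

def frequent_itemsets(transactions, min_support=0.5):
    """Naive Apriori-style enumeration of frequent label combinations."""
    items = sorted(set().union(*transactions.values()))
    n = len(transactions)
    frequent = {}
    for size in range(1, len(items) + 1):
        for combo in combinations(items, size):
            support = sum(set(combo) <= labels
                          for labels in transactions.values()) / n
            if support >= min_support:
                frequent[combo] = support
    return frequent

def jaccard(a, b):
    """Similarity of two people via their shared layer memberships."""
    return len(a & b) / len(a | b)

fi = frequent_itemsets(people)
```

Clustering the nodes on such a similarity (e.g. Jaccard) matrix is then a standard step for finding groups of similarly perceived co-workers.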

https://www.nature.com/articles/s41598-019-39705-1

Evaluation of the Complexity, Controllability and Observability of Heat Exchanger Networks Based on Structural Analysis of Network Representations is published!

Post date: Feb 15, 2019 7:45:00 PM

The design and retrofit of Heat Exchanger Networks (HENs) can be based on several objectives and optimisation algorithms. As each method results in an individual network topology that has a significant effect on the operability of the system, control-relevant HEN design and analysis are becoming more and more essential tasks. This work proposes a network science-based analysis tool for the qualification of controllability and observability of HENs. With the proposed methodology, the main characteristics of HEN design methods are determined, the effect of structural properties of HENs on their dynamical behaviour revealed, and the potentials of the network-based HEN representations discussed. Our findings are based on the systematic analysis of almost 50 benchmark problems related to 20 different design methodologies. 

https://www.mdpi.com/1996-1073/12/3/513

We are delivering a Python course. The course material is available here:

https://www.dropbox.com/sh/n22wec5qtdascn1/AADkMHiYFV26yQtcO6lJhWP8a?dl=0

https://github.com/abonyilab/SystemsEngineering 

We deliver a data analysis course (in Excel) at PIMS academy. The course material is available here: https://www.dropbox.com/s/n7myvhy68mvna0h/Data_excel.zip?dl=0

The Settlement Structure Is Reflected in Personal Investments: Distance-Dependent Network Modularity-Based Measurement of Regional Attractiveness is published!

Post date: Dec 12, 2018 10:30:00 PM

How are ownership relationships distributed in the geographical space? Is physical proximity a significant factor in investment decisions? What is the impact of the capital city? How can the structure of investment patterns characterize the attractiveness and development of economic regions? To explore these issues, we analyze the network of company ownership in Hungary and determine how connections are distributed in geographical space. Based on the calculation of the internal and external linking probabilities, we propose several measures to evaluate the attractiveness of towns and geographic regions. Community detection based on several null models indicates that modules of the network coincide with administrative regions, in which Budapest is the absolute centre and county centres function as hubs. Gravity model-based modularity analysis highlights that, besides the strong attraction of Budapest, geographical distance has a significant influence over the frequency of connections, and the target nodes play the most significant role in link formation, which confirms that the analysis of the directed company-ownership network gives a good indication of regional attractiveness.
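The internal and external linking probabilities mentioned above can be illustrated with a short sketch. The towns and ownership edges below are hypothetical, and the measure is simplified to the share of a town's outgoing links that stay within the town:

```python
from collections import Counter

# Hypothetical directed ownership edges: (owner's town, company's town).
edges = [("Budapest", "Veszprem"), ("Budapest", "Gyor"),
         ("Veszprem", "Veszprem"), ("Gyor", "Gyor"),
         ("Veszprem", "Budapest")]

def linking_probabilities(edges):
    """For each town, the share of outgoing ownership links that stay
    inside the town (internal) versus point elsewhere (external)."""
    out_deg, internal = Counter(), Counter()
    for src, dst in edges:
        out_deg[src] += 1
        if src == dst:
            internal[src] += 1
    return {t: (internal[t] / out_deg[t], 1 - internal[t] / out_deg[t])
            for t in out_deg}

probs = linking_probabilities(edges)
```

Comparing these ratios against the expectation of a null model (e.g. a gravity model) is what turns them into an attractiveness measure.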

Gadar Laszlo, Kosztyan Zsolt T., Abonyi Janos: "The Settlement Structure Is Reflected in Personal Investments: Distance-Dependent Network Modularity-Based Measurement of Regional Attractiveness", Complexity, 2018, Article ID 1306704, 16 pages

Evaluating the Interconnectedness of the Sustainable Development Goals Based on the Causality Analysis of Sustainability Indicators is published!

Post date: Oct 20, 2018 11:40:00 PM

Policymaking requires an in-depth understanding of the cause-and-effect relationships between the sustainable development goals. However, due to the complex nature of socio-economic and environmental systems, this is still a challenging task. In the present article, the interconnectedness of the United Nations (UN) sustainability goals is measured using the Granger causality analysis of their indicators. The applicability of the causality analysis is validated through the predictions of the World3 model. The causal relationships are represented as a network of sustainability indicators providing the opportunity for the application of network analysis techniques. Based on the analysis of 801 UN indicator types in 283 geographical regions, approximately 4000 causal relationships were identified and the most important global connections were represented in a causal loop network. The results highlight the drastic deficiency of the analysed datasets, the strong interconnectedness of the sustainability targets and the applicability of the extracted causal loop network. The analysis of the causal loop networks emphasised the problems of poverty, proper sanitation and economic support in sustainable development.
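The pairwise Granger tests behind the causal network can be sketched as a lag-regression F-test. The following is a minimal numpy re-implementation under simplifying assumptions (fixed lag order, no stationarity pre-tests), run on synthetic series rather than UN indicators:

```python
import numpy as np

def granger_f(x, y, lag=2):
    """Minimal Granger-style F statistic: does adding lags of y improve
    the prediction of x beyond x's own lags?"""
    n = len(x)
    target = x[lag:]
    own = np.column_stack([x[lag - k:n - k] for k in range(1, lag + 1)])
    other = np.column_stack([y[lag - k:n - k] for k in range(1, lag + 1)])
    ones = np.ones((n - lag, 1))

    def rss(A):  # residual sum of squares of a least-squares fit
        beta, *_ = np.linalg.lstsq(A, target, rcond=None)
        r = target - A @ beta
        return r @ r

    rss_restricted = rss(np.hstack([ones, own]))
    rss_full = rss(np.hstack([ones, own, other]))
    dof = (n - lag) - (2 * lag + 1)
    return ((rss_restricted - rss_full) / lag) / (rss_full / dof)

# Synthetic example: x is driven by lagged y, but not vice versa.
rng = np.random.default_rng(1)
y = rng.standard_normal(300)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.8 * y[t - 1] + 0.1 * rng.standard_normal()
```

A large F statistic for the (y, x) direction and a small one for (x, y) would yield a directed edge y -> x in the causal loop network.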


Dörgő Gy., Sebestyén V., Abonyi J.:"Evaluating the Interconnectedness of the Sustainable Development Goals Based on the Causality Analysis of Sustainability Indicators", Sustainability 2018, 10(10), 3766, doi:10.3390/su10103766

6th International Conference on Control, Decision and Information Technologies - Janos Abonyi became a member of the PC

Post date: Oct 10, 2018 7:00:00 PM

The CoDIT’19 conference is the sixth (6th) edition in the series of the International Conference on Control, Decision and Information Technologies, organized since 2013; the previous one, CoDIT'18, was held in Thessaloniki, Greece, in April 2018.

CoDIT’19 will be held April 23-26, 2019 in Paris, France. Its purpose is to be a forum for technical exchange among scientists with interests in Control, Optimization, Decision, all areas of Engineering, Computer Science and Information Technologies. This conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions and discuss future research directions. The technical program will include plenary lectures, regular technical sessions, and special sessions.



http://codit19.com/index.php/committees 

Our review about  operators and cyber-physical systems is published! 

Enabling Technologies for Operator 4.0: A Survey

The fast development of smart sensors and wearable devices has provided the opportunity to develop intelligent operator workspaces. The resultant Human-Cyber-Physical Systems (H-CPS) integrate the operators into flexible and multi-purpose manufacturing processes. The primary enabling factor of the resultant Operator 4.0 paradigm is the integration of advanced sensor and actuator technologies and communications solutions. This work provides an extensive overview of these technologies and highlights that the design of future workplaces should be based on the concept of intelligent space. 

Ruppert T., Jaskó Sz., Holczinger T., Abonyi J.: "Enabling Technologies for Operator 4.0: A Survey", Applied Sciences, Basel, 2018, 8 (9), 1650, 1-19

4th International Conference on Internet of Things, Big Data and Security - Janos Abonyi became a member of the PC

Post date: Aug 30, 2018 8:15:00 PM

The internet of things (IoT) is a platform that allows a network of devices (sensors, smart meters, etc.) to communicate, analyse data and process information collaboratively in the service of individuals or organisations. The IoT network can generate large amounts of data in a variety of formats and using different protocols which can be stored and processed in the cloud. The conference looks to address the issues surrounding IoT devices, their interconnectedness and services they may offer, including efficient, effective and secure analysis of the data IoT produces using machine learning and other advanced techniques, models and tools, and issues of security, privacy and trust that will emerge as IoT technologies mature and become part of our everyday lives.

CONFERENCE AREAS

1 . Big Data Research 

2 . Emerging Services and Analytics 

3 . Internet of Things (IoT) Fundamentals 

4 . Internet of Things (IoT) Applications 

5 . Big Data for Multi-discipline Services 

6 . Security, Privacy and Trust 

7 . IoT Technologies 

UPCOMING DEADLINES

Regular Paper Submission: December 10, 2018 

Regular Paper Authors Notification: February 7, 2019 

Regular Paper Camera Ready and Registration: February 21, 2019 

Late-Breaking Camera Ready and Registration: March 21, 2019

http://iotbds.org/ProgramCommittee.aspx

Soft Sensors Special Issue

Post date: Aug 30, 2018 8:01:00 PM


We are editing a special issue related to software sensors.

Please forward this link to researchers potentially interested in submitting a paper.

Slides to promote the special issue at conferences:

http://www.mdpi.com/journal/sensors/special_issues/Soft_Sensors

Graph configuration model based evaluation of the education-occupation match

Post date: Mar 6, 2018 7:47:06 PM

To study education-occupation matching, we developed a bipartite network model of the education-to-work transition and a graph configuration model-based metric. We studied the career paths of 15 thousand Hungarian students based on the integrated database of the National Tax Administration, the National Health Insurance Fund, and the higher education information system of the Hungarian Government. A brief analysis of the gender pay gap and the spatial distribution of over-education is presented to demonstrate the background of the research and the resulting open dataset. We highlighted the hierarchical and clustered structure of the career paths based on multi-resolution analysis of the graph modularity. The results of the cluster analysis can support policymakers in fine-tuning the fragmented program structure of higher education.

The details of this research are published in PLOS ONE:

http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0192427

All the files and the R code are available at:

https://github.com/abonyilab/Edu_Mine_Graph

Sequence Mining based Alarm Suppression

Post date: Feb 10, 2018 8:45:27 PM

To provide more insight into the process dynamics and represent the temporal relationships among faults, control actions and process variables, we propose a multi-temporal sequence mining-based algorithm. The methodology starts with the generation of frequent temporal patterns of the alarm signals. We transformed the multi-temporal sequences into Bayes classifiers. The obtained association rules can be used to define alarm suppression rules. We analyzed the dataset of a laboratory-scale water treatment testbed to illustrate that multi-temporal sequences are applicable for the description of operation patterns. We extended the benchmark simulator of a vinyl acetate production technology to generate easily reproducible results and stimulate the development of alarm management algorithms. The results of detailed sensitivity analyses confirm the benefits of the application of temporal alarm suppression rules, which reflect the dynamical behaviour of the process.
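The Bayes-classifier step above can be sketched in a few lines: estimate the probability of a real fault given that a mined alarm pattern has occurred, and flag low-probability patterns as suppression candidates. The alarm log below is entirely hypothetical:

```python
# Hypothetical mined log: each entry pairs a frequent alarm sequence
# with the operator-confirmed outcome ("fault" or "normal").
log = [
    (("A1", "A2"), "fault"),
    (("A1", "A2"), "fault"),
    (("A1", "A2"), "normal"),
    (("A3",), "normal"),
    (("A3",), "normal"),
]

def rule_confidence(log, pattern):
    """Bayes-style confidence P(fault | pattern): the share of the
    pattern's occurrences that coincide with a confirmed fault."""
    labels = [label for seq, label in log if seq == pattern]
    return labels.count("fault") / len(labels)

conf_a12 = rule_confidence(log, ("A1", "A2"))  # frequently faulty
conf_a3 = rule_confidence(log, ("A3",))        # suppression candidate
```

A pattern whose confidence stays near zero carries no fault information, so suppressing its alarms reduces operator load without hiding real faults.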

These files are the supplementary materials of our paper to be published in IEEE Access, 2018. For the extended simulator of the vinyl acetate production technology and the source codes of the Bayes’ theorem-based evaluation of sequences see: https://github.com/abonyilab/VACsimulator

The MATLAB implementation of the sequence mining algorithm is available at: https://github.com/abonyilab/Multi-temporal-sequence-mining

Visualization and interpretation of deep learning models

Post date: Feb 10, 2018 8:36:32 PM

We visualise LSTM deep learning models by principal component analysis. The similarity of the events in fault isolation can be evaluated based on the linear embedding layer of the network, which maps the input signals into a continuous-valued vector space. The method is demonstrated on a simulated vinyl acetate production technology. The results illustrate that with the application of RNN-based sequence learning, not only can accurate fault classification solutions be developed, but the visualisation of the model can also give useful hints for hazard analysis.
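The PCA step above can be sketched as follows: project the embedding vectors of the events onto their top two principal components for plotting. The 8-dimensional random embeddings stand in for the real LSTM embedding layer, which is an assumption of this sketch:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project row vectors onto their top principal components,
    computed via SVD of the mean-centred data."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # 2-D coordinates for plotting

# Hypothetical 8-dimensional event embeddings for five fault classes.
rng = np.random.default_rng(0)
emb = rng.standard_normal((5, 8))
coords = pca_project(emb)
```

Events whose 2-D coordinates fall close together correspond to faults the model represents similarly, which is what makes the plot useful for hazard analysis.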

The related paper will be published in the Journal of Chemometrics soon.

The algorithm was implemented in Python. The related code can be downloaded from our GitHub repository.

3rd International Conference on Internet of Things, Big Data and Security - Janos Abonyi became a member of the PC

Post date: Jun 8, 2017 5:09:52 AM

The internet of things (IoT) is a platform that allows a network of devices (sensors, smart meters, etc.) to communicate, analyse data and process information collaboratively in the service of individuals or organisations. The IoT network can generate large amounts of data in a variety of formats and using different protocols which can be stored and processed in the cloud. The conference looks to address the issues surrounding IoT devices, their interconnectedness and services they may offer, including efficient, effective and secure analysis of the data IoT produces using machine learning and other advanced techniques, models and tools, and issues of security, privacy and trust that will emerge as IoT technologies mature and become part of our everyday lives.

CONFERENCE AREAS

1 . Big Data Research 

2 . Emerging Services and Analytics 

3 . Internet of Things (IoT) Fundamentals 

4 . Internet of Things (IoT) Applications 

5 . Big Data for Multi-discipline Services 

6 . Security, Privacy and Trust 

7 . IoT Technologies 

UPCOMING DEADLINES

Regular Paper Submission: October 16, 2017 

Regular Paper Authors Notification: December 15, 2017 

Regular Paper Camera Ready and Registration: January 4, 2018 

International Conference on Communication, Computing & Internet of Things - (IC3IoT 2018)

Post date: Jun 1, 2017 3:53:43 AM

"Acceptance of global competitiveness and unlimited innovations are emerging as the most critical elements in wealth generation in the current world economy. Transition into a developed nation and empowered society shall not be a far-away dream but shall be a near-future reality. The need for linking science and technology to the growth of India shall be intensified and improved by conferences of this kind.

Broadband and Wireless Communication have brought massive changes to the world and continue to provide an array of new challenges, multi-domain applications and solutions such as IoT. The aim of IC3IoT is to provide an excellent forum for sharing knowledge and presenting innovative research and technologies, as well as developments and future demands related to Broadband Technologies, Computing Technologies, Human-Computer Interaction and Wireless Communication along with IoT.

An international conference of this nature will enhance and benefit human society at large, since it will bring together leading researchers, engineers and scientists in the domain of interest."

Prof. Abonyi is a member of the program committee of the conference. More details can be found on the website of the event.

HAS - UP "Momentum" Complex Systems Research Group!

Post date: May 19, 2017 5:11:37 PM

The objective of the Lendület (Momentum) Program of the Hungarian Academy of Sciences is the dynamic renewal of the research teams of the Academy and participating universities. With the help of this program, we are transforming and extending the group of Prof. Abonyi into a research group devoted to complex systems.

We will form a new school for rethinking and upgrading systems engineering and data science in light of the fourth industrial revolution. The overall goal of the project is the development of new algorithms and open-source tools to utilise the data collected by internetworking systems in monitoring, control, optimisation, scheduling, risk management, and product lifecycle management. This goal challenges present-day internet of things technology regarding the development of software agent and advanced sensor fusion functionalities.

We believe that algorithms tailored for (1) multivariate time series analysis, (2) software sensors and event analysis, (3) localisation and (4) model mining can result in significant progress in this field. The creative and integrated application of the resulting algorithms can bring a new perspective to the integrated monitoring and structural analysis of complex systems and the utilisation of open and linked data. The full integration of these four subprojects is of primary importance and ensures the strength and uniqueness of this proposal.

The proposed centre, therefore, aims to bring together the best technological expertise in systems-, data-, and network science, and industrial intelligence. As part of its mission, the Group will make the new and integrated solutions available to the research community and industry through its collaborations and training.

Network science and control theory - Our paper in Scientific Reports!

Post date: Mar 11, 2017 2:43:46 PM

Network theory-based controllability and observability analysis has become a widely used technique. We realized that most applications are not related to dynamical systems, and mainly the physical topologies of the systems are analysed without deeper consideration. Here, we draw attention to the importance of dynamics inside and between state variables by adding edges defined by functional relationships to the original topology. The resulting networks differ from the physical topologies of the systems and describe more accurately the dynamics of the conservation of mass, momentum and energy. We define the typical connection types and highlight how the reinterpreted topologies change the number of necessary sensors and actuators in benchmark networks widely studied in the literature. Additionally, we offer a workflow for network science-based dynamical system analysis, and we also introduce a method for generating the minimum number of necessary actuator and sensor points in the system.
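A standard building block of this kind of analysis, sketched below on a toy graph rather than the paper's benchmarks, is structural controllability via maximum matching: the minimum number of driver (actuator) nodes equals the number of nodes minus the size of a maximum matching of the directed network:

```python
def max_matching(nodes, edges):
    """Augmenting-path maximum matching on the bipartite
    (out-copy, in-copy) representation of a directed graph."""
    succ = {u: [] for u in nodes}
    for u, v in edges:
        succ[u].append(v)
    match = {}  # in-copy node -> matched out-copy node

    def augment(u, seen):
        for v in succ[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match or augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in nodes)

# Toy branching chain: x1 -> x2 -> {x3, x4}.
nodes = ["x1", "x2", "x3", "x4"]
edges = [("x1", "x2"), ("x2", "x3"), ("x2", "x4")]
drivers = len(nodes) - max_matching(nodes, edges)
```

Here x2 cannot drive both of its successors independently, so two driver nodes are needed; the reinterpreted topologies discussed above change exactly this count by adding or removing matchable edges.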

http://www.nature.com/articles/s41598-017-00160-5

https://github.com/abonyilab/NOCAD

PhD defense of Laszlo Dobos, 7th of June, 2016, 1pm

Post date: May 20, 2016 10:04:55 PM

Development of Experimental Design Techniques for Analyzing and Optimization of Operating Technologies

The aim of this thesis is to introduce the theoretical basics of different approaches that can further support production process development based on the knowledge extracted from process data. As the selection of a time frame with a certain operation is the starting point of a further process investigation, a Dynamic Principal Component Analysis (DPCA)-based time-series segmentation approach is introduced first. This new solution results from integrating DPCA tools into classical univariate time-series segmentation methodologies. It helps to detect changes in the linear relationships of process variables, which can be caused by faults or malfunctions. This step can be the first one in model-based process development, since it makes it possible to neglect the operating ranges that can ruin the prediction capability of the model. From another point of view, we can highlight problematic operating regimes and focus on finding their root causes. When fault-free, linear operation segments have been selected, further segregation of the data segments is needed to find data slices with high information content in terms of model parameter identification. As the tools of Optimal Experiment Design (OED) are appropriate for measuring the information content of process data, the goal-oriented integration of OED tools and classical time-series segmentation can handle the problem. The Fisher information matrix is one of the basic tools of OED. It contains the partial derivatives of the model output with respect to the model parameters for a particular input data sequence. A new, Fisher information matrix-based time-series segmentation methodology has been developed to evaluate the information content of an input data slice. Using this tool, it becomes possible to select the potentially most valuable and informative time-series segments. This leads to a reduction in the number of industrial experiments and their costs.
At the end of the thesis, a novel, economic objective function-oriented framework is introduced for tuning model predictive controllers so that all control potentials can be exploited while the physical and chemical limits of the process are respected.
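The Fisher information matrix-based evaluation of a data slice can be sketched numerically: approximate the output sensitivities with respect to the parameters by finite differences and form F = S^T S / sigma^2. The first-order model, parameter values and segments below are purely illustrative, not from the thesis:

```python
import numpy as np

def fisher_information(model, params, inputs, eps=1e-6, sigma=1.0):
    """Approximate the Fisher information matrix of a data slice from
    the numerical sensitivities of the model output w.r.t. the
    parameters: F = S^T S / sigma^2."""
    base = np.array([model(params, u) for u in inputs])
    S = np.zeros((len(inputs), len(params)))
    for j in range(len(params)):
        p = params.copy()
        p[j] += eps
        S[:, j] = (np.array([model(p, u) for u in inputs]) - base) / eps
    return S.T @ S / sigma**2

# Illustrative first-order response y = a * (1 - exp(-u / b)).
model = lambda p, u: p[0] * (1.0 - np.exp(-u / p[1]))
segment_a = np.linspace(0.1, 1.0, 20)  # transient: both parameters visible
segment_b = np.full(20, 50.0)          # steady state: b barely identifiable
Fa = fisher_information(model, np.array([2.0, 1.5]), segment_a)
Fb = fisher_information(model, np.array([2.0, 1.5]), segment_b)
```

A D-optimality-style comparison of det(F) then ranks the segments: the transient slice carries far more parameter information than the steady-state one, which is exactly the criterion used to pick informative segments.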

Media Cloud

Post date: May 20, 2016 9:58:19 PM

Media Cloud is a project of the Harvard Berkman Center for Internet & Society and the MIT Center for Civic Media. It is worth taking a look.

Book launch, 24 May 2016

Post date: May 20, 2016 9:55:07 PM


Early warning - risk management

Post date: May 27, 2012 12:41:56 PM

An early warning risk management system can support commercial banks in increasing the safety, profitability and liquidity of credit funds. The aim of the project is to develop novel algorithms and tools based on the early detection of trends and unusual business networking patterns. The project is funded by the GOP-1.1.1-11-2011-0045 program and will start in June 2012 in cooperation with Kürt Zrt.