News

Janos Abonyi was invited to join the program committee of the Innovations in Bio-Inspired Computing and Applications (IBICA) 2021 conference

Janos Abonyi serves as a program committee member at the 12th International Conference on Innovations in Bio-Inspired Computing and Applications (IBICA).

The aim of IBICA is to provide a platform for world research leaders and practitioners to discuss the "full spectrum" of current theoretical developments, emerging technologies and innovative applications of Bio-inspired Computing. Bio-inspired Computing is currently one of the most exciting research areas, and it continuously demonstrates exceptional strength in solving complex real-life problems.

The conference will be held online on December 16-18, 2021.

Link to conference website

Janos Abonyi was invited to join the program committee of the Soft Computing and Pattern Recognition (SoCPaR) 2021 conference

Janos Abonyi serves as a program committee member at the 13th International Conference on Soft Computing and Pattern Recognition (SoCPaR).

The conference aims to bring together worldwide leading researchers and practitioners interested in advancing the state-of-the-art in Soft Computing and Pattern Recognition, for exchanging knowledge that encompasses a broad range of disciplines among various distinct communities. It is hoped that researchers and practitioners will bring new prospects for collaboration across disciplines and gain inspiration to facilitate novel breakthroughs. The themes for this conference are thus focused on "Innovating and Inspiring Soft Computing and Intelligent Pattern Recognition".

The conference will be held online on December 15-17, 2021.

Link to conference website

Janos Abonyi was invited to join the program committee of the Information Assurance and Security (IAS) 2021 conference

Janos Abonyi serves as a program committee member at the 17th International Conference on Information Assurance and Security (IAS).

The conference theme is "Innovative Cyber Security: Protecting National Borders".

The conference aims to bring together researchers, practitioners, developers, and policy makers in multiple disciplines of information security and assurance to exchange ideas and to learn the latest development in this important field.

The conference is organized by Machine Intelligence Research Labs (MIR Labs) and will be held online on December 14-16, 2021.

Link to conference website

Janos Abonyi was invited to join the program committee of the Nature and Biologically Inspired Computing (NaBIC) 2021 World Congress

Janos Abonyi serves as a program committee member at the 13th World Congress on Nature and Biologically Inspired Computing (NaBIC).

NaBIC 2021 is organized to provide a forum for researchers, engineers, and students from all over the world, to discuss the state-of-the-art in machine intelligence, and address various issues on building up human friendly machines by learning from nature. The conference theme is “Nurturing Intelligent Computing Towards Advancement of Machine Intelligence”.

The conference will be held online on March 15-17, 2021.

Link to conference website

Janos Abonyi was invited to join the program committee of the Hybrid Intelligent Systems (HIS) 2021 conference

Janos Abonyi serves as a program committee member at the 21st International Conference on Hybrid Intelligent Systems (HIS).

The objectives of HIS 2021 are: to increase the awareness of the research community of the broad spectrum of hybrid techniques, to bring together AI researchers from around the world to present their cutting-edge results, to discuss the current trends in HIS research, to develop a collective vision of future opportunities, to establish international collaborative opportunities, and as a result to advance the state of the art of the field.

The conference will be held online on March 14-16, 2021.

Link to conference website

Data-driven comparative analysis of national adaptation pathways for Sustainable Development Goals

Since the declaration of the Sustainable Development Goals (SDGs) in 2015, countries have begun developing and strategizing their national pathways for the effective implementation of the 2030 Agenda. The sustainable development targets set out how the world's nations must move forward so that sustainable development is not an ideal vision but a workable, comprehensive environmental, economic, and social policy. This work aims to analyze each country's progress towards achieving the sustainable development goals. In addition to the static presentation of the achievements of each country, the changes over time are also compared, allowing countries to be grouped according to their current states. A sophisticated SDG performance measurement tool has been developed to support this analysis, which automatically processes the entire UN Global SDG Indicators database using exploratory data analysis, frequent item mining, and network analysis. Based on the trend analysis of the percentiles, the values of the indicators achievable by 2030 are also derived. The analyses were performed based on the time-series data of 1319 disaggregated official SDG indicators.

Most of the world's countries have achieved the greatest success in SDG12 and SDG10 since the declaration of the 2030 Agenda. In the field of climate change (SDG13), 26 countries can count on significant achievements. However, SDG6, SDG2, and SDG1 face significant challenges globally, as they have typically seen only minor progress in recent years. Examined at the indicator level, indicators 1.4.1, 5.6.2, 6.b.1, 10.7.2, and 15.4.2 improved in all countries of the world, while indicators 2.a.1, 9.4.1, 2.1.1, 2.1. and 12.b.1 have predominantly deteriorated. According to the forecast for 2030, Australia and the United States can reduce their per capita CO2 emissions, while some countries in Africa, Asia, and the Middle East are expected to increase their emissions.

Event-Tree Based Sequence Mining Using LSTM Deep-Learning Model

The use of LSTM models for the prediction of process variable values and system states is widespread in the operation of modern technical systems. The goal of this paper is to expand the application of LSTM-based models towards obtaining richer information from the predictions. In the proposed method, the output layer is interpreted as a probability model over transitions, and a prediction tree of sequences is created instead of a single predicted sequence. By further analyzing the prediction tree, risk considerations can be taken into account, more complex predictions can be extracted, and the event trees yielded by different input sequences can be analyzed; that is, for a given state or input sequence, the upcoming events and the probabilities of their occurrence are considered. In online applications, subsequent event sequences can be predetermined by utilizing a series of input events and the probability trees. The applicability and performance of the approach are demonstrated on a dataset in which the occurrence of events is predetermined, and on further datasets generated with a higher-order decision tree-based model. The case studies simply and effectively validate the performance of the created tool, as the structure of the generated tree and the determined probabilities reflect the original dataset.
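A minimal sketch of the core idea, with a toy transition table standing in for the trained LSTM's softmax output (the events, probabilities and threshold below are assumptions, not the published model): each branch of the prediction tree is expanded only while its cumulative probability stays above a threshold.

    # Sketch: expanding a next-event probability model into a prediction tree.
    TRANSITIONS = {  # P(next event | current event), purely illustrative
        "A": {"B": 0.7, "C": 0.3},
        "B": {"A": 0.2, "C": 0.8},
        "C": {"A": 0.5, "B": 0.5},
    }

    def predict_next(sequence):
        """Stand-in for the LSTM output layer: P(next event) given the sequence."""
        return TRANSITIONS[sequence[-1]]

    def build_prediction_tree(sequence, depth=3, min_prob=0.05, prob=1.0):
        """Recursively keep every branch whose cumulative probability exceeds min_prob."""
        if depth == 0:
            return []
        branches = []
        for event, p in predict_next(sequence).items():
            branch_prob = prob * p
            if branch_prob >= min_prob:
                branches.append({
                    "event": event,
                    "prob": branch_prob,
                    "children": build_prediction_tree(sequence + [event], depth - 1,
                                                      min_prob, branch_prob),
                })
        return branches

    tree = build_prediction_tree(["A"], depth=3)
    print(tree)

Each path from the root collects an event sequence together with its probability, which is the information used for the risk considerations mentioned above.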

Contrast and brightness balance in image enhancement using Cuckoo Search-optimized image fusion

Many vision-based systems suffer from poor levels of contrast and brightness, mainly because of inadequate and improper illumination during the image acquisition process. As a result, the required information from the acquired image is not available for the particular application. In general, it is hard to achieve a balance between the improvement of contrast and brightness in image enhancement. By introducing nature-inspired optimization in image enhancement, the best features of the image are utilized, and the complexity related to the nonlinearity of images can be handled under various constraints, like a balance between contrast and brightness. In this work, a novel automatic method for image enhancement that finds a balance between contrast and brightness is developed by using Cuckoo Search-optimized image fusion. First, the Cuckoo Search-based optimization algorithm generates two sets of optimized parameters. These parameter sets are used to generate a pair of enhanced images, one with a high degree of sharpness and contrast, the other bright and improved without losing the level of detail. Furthermore, the two enhanced images are fused to obtain an output image in which the contrast and brightness are in balance. The effectiveness of the proposed method is verified by applying it to standard images (CVG-UGR image database) and lathe tool images. Experimental results demonstrate that the proposed method performs better with regard to both contrast and brightness quality and, moreover, yields better quality evaluation metrics than other conventional techniques.
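A minimal sketch of the enhancement and fusion stages only, with fixed parameter sets standing in for the Cuckoo Search-optimized ones (the gamma/gain values and the fusion weight below are assumptions, not the published settings):

    import numpy as np

    def enhance(img, gamma, gain):
        """Simple gamma/gain enhancement; img is a float array scaled to [0, 1]."""
        return np.clip(gain * img ** gamma, 0.0, 1.0)

    rng = np.random.default_rng(0)
    image = rng.random((64, 64))                       # placeholder for an acquired image

    # Two parameter sets that Cuckoo Search would normally optimise:
    contrast_img = enhance(image, gamma=1.8, gain=1.2) # sharp, high-contrast version
    bright_img = enhance(image, gamma=0.6, gain=1.0)   # bright, detail-preserving version

    # Fusion: weighted average balancing contrast and brightness
    alpha = 0.5
    fused = np.clip(alpha * contrast_img + (1 - alpha) * bright_img, 0.0, 1.0)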

Quality vs. quantity of alarm messages - How to measure the performance of an alarm system

Despite significant efforts to measure and assess the performance of alarm systems, to this day, no silver bullet has been found. The majority of the existing standards and guidelines focus on the alarm load of the operators, either during normal or upset plant conditions, and only a small fraction takes into consideration the actions performed by the operators. In this study, an overview of the evolution of alarm system performance metrics is presented, and the current data-based approaches are grouped into seven categories based on the goals of and the methodologies associated with each metric. Building on this categorical overview, the terminological differences between the academic and industrial approaches to alarm system performance measurement are reflected upon. Moreover, we highlight how strongly the performance measurement of alarm systems is skewed towards quantitative metrics instead of qualitative assessment, invoking the threat of excessive alarm reductions resulting from such a unilateral approach. The critical aspects of the qualitative performance measurement of alarm systems are demonstrated through the comparison of the alarm system of an industrial hydrofluoric acid alkylation unit before and after the alarm rationalization process. The quality of the alarm messages is measured via their informativeness and actionability, in other words, how appropriate the parameter settings are for everyday work and how actionable the messages are for the operators of the process.

Genetic programming-based symbolic regression for goal-oriented dimension reduction

The majority of dimension reduction techniques are built upon the optimization of an objective function aiming to retain certain characteristics of the projected datapoints: the variance of the original dataset, the distance between the datapoints or their neighbourhood characteristics, etc. Building upon the optimization-based formalization of dimension reduction techniques, the goal-oriented formulation of projection cost functions is proposed. For the optimization of the application-oriented data visualization cost function, a multi-gene genetic programming (GP)-based algorithm is introduced to optimize the structures of the equations used for mapping high-dimensional data into a two-dimensional space and to select the variables that are needed to explore the internal structure of the data for data-driven software sensor development or classifier design. The main benefit of the approach is that the evolved equations are interpretable and can be utilized in surrogate models. The applicability of the approach is demonstrated on the benchmark wine dataset and in the estimation of the product quality in a diesel oil blending technology based on an online near-infrared (NIR) analyzer. The results illustrate that the algorithm is capable of generating goal-oriented and interpretable features, and the resultant simple algebraic equations can be directly implemented in applications when there is a need for computationally cost-effective projections of high-dimensional data, as the resultant algebraic equations are computationally simpler than other solutions such as neural networks.
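A minimal sketch, not the authors' multi-gene GP implementation, of the kind of goal-oriented projection cost such a search could minimise: a Sammon-like stress between the original and the projected pairwise distances, evaluated here for a hypothetical pair of evolved equations.

    import numpy as np
    from scipy.spatial.distance import pdist

    def projection_stress(X, Y):
        """X: n x d original data, Y: n x 2 candidate projection (from evolved equations)."""
        d_hi = pdist(X)
        d_lo = pdist(Y)
        d_hi = np.where(d_hi == 0, 1e-12, d_hi)
        return np.sum((d_hi - d_lo) ** 2 / d_hi) / np.sum(d_hi)

    def evolved_mapping(X):
        """Hypothetical algebraic equations a GP run might return for 4 features."""
        x1, x2, x3, x4 = X.T
        y1 = 0.8 * x1 - 0.3 * x2 * x3          # candidate equation for dimension 1
        y2 = np.log1p(np.abs(x4)) + 0.5 * x2   # candidate equation for dimension 2
        return np.column_stack([y1, y2])

    X = np.random.default_rng(1).random((50, 4))
    print(projection_stress(X, evolved_mapping(X)))

In the GP loop, each individual encodes a pair of such equations, and this cost (or an application-specific variant of it) is used as the fitness value.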

Indoor Positioning Systems Can Revolutionise Digital Lean

The powerful combination of lean principles and digital technologies accelerates waste identification and mitigation faster than traditional lean methods. The new digital lean (also referred to as Lean 4.0) solutions incorporate sensors and digital equipment, yielding innovative solutions that extend the reach of traditional lean tools. The tracking of flexible and configurable production systems is not as straightforward as in a simple conveyor. This paper examines how the information provided by indoor positioning systems (IPS) can be utilised in the digital transformation of flexible manufacturing. The proposed IPS-based method enriches the information sources of value stream mapping and transforms positional data into key-performance indicators used in Lean Manufacturing. The challenges of flexible and reconfigurable manufacturing require a dynamic value stream mapping. To handle this problem, a process mining-based solution has been proposed. A case study is provided to show how the proposed method can be employed for monitoring and improving manufacturing efficiency.

Sparse PCA supports exploration of process structures for decentralized fault detection

With the ever-increasing use of sensor technologies in industrial processes, and more data becoming available to engineers, the fault detection and isolation activities in the context of process monitoring have gained significant momentum in recent years. A statistical procedure frequently used in this domain is Principal Component Analysis (PCA), which can reduce the dimensionality of large data sets without compromising the information content. While most process monitoring methods offer satisfactory detection capabilities, understanding the root cause of malfunctions and providing the physical basis for their occurrence have been challenging. The relatively new sparse PCA techniques represent a further development of PCA in which not only the data dimension is reduced but the data is also made more interpretable, revealing clear correlation structures among variables. Hence, taking a step forward from classical fault detection methods, in this work, a decentralized monitoring approach is proposed based on a sparse algorithm. The resulting control charts reveal the correlation structures associated with the monitored process and facilitate a structural analysis of the occurred faults. The applicability of the proposed method is demonstrated using data generated from the simulation of the benchmark vinyl acetate process. It is shown that the sparse principal components, as the foundation of a decentralized multivariate monitoring framework, can provide physical insight into the origins of process faults.
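A minimal sketch, assuming scikit-learn's SparsePCA as a stand-in for the sparse algorithm of the paper and synthetic "normal operation" data: the nonzero loadings reveal groups of correlated variables, and each sparse component can be monitored with a simple decentralized chart statistic.

    import numpy as np
    from sklearn.decomposition import SparsePCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Synthetic normal-operation data with two correlated variable groups
    latent = rng.normal(size=(500, 2))
    block1 = latent[:, [0]] + 0.1 * rng.normal(size=(500, 4))
    block2 = latent[:, [1]] + 0.1 * rng.normal(size=(500, 4))
    X_normal = np.hstack([block1, block2, rng.normal(size=(500, 2))])

    scaler = StandardScaler().fit(X_normal)
    spca = SparsePCA(n_components=2, alpha=0.2, random_state=0)
    scores = spca.fit_transform(scaler.transform(X_normal))

    # The nonzero loadings show which variables form each monitored group
    for i, comp in enumerate(spca.components_):
        print(f"component {i}: variables {np.flatnonzero(np.abs(comp) > 1e-6)}")

    # Per-component chart statistic for a new sample (squared standardized score)
    x_new = rng.normal(size=(1, 10))
    stat = (spca.transform(scaler.transform(x_new)) ** 2
            / (scores.var(axis=0) + 1e-12)).ravel()
    print("control chart statistics:", stat)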


Test Plan for the Verification of the Robustness of Sensors and Automotive Electronic Products Using Scenario-Based Noise Deployment (SND)

The targeted shortening of sensor development requires short and convincing verification tests. The goal of the development of novel verification methods is to avoid or reduce an excessive amount of testing and to identify tests that guarantee that the assumed failure will not happen in practice. In this paper, a method is presented that results in the test loads of such a verification. The method starts with the identification of the robustness-related requirements for the product, using precise descriptions of the use case scenarios in which the product is assumed to be working. Based on the logic of the Quality Function Deployment (QFD) method, a step-by-step procedure has been developed to translate the robustness requirements, through the change in design parameters, the phenomena causing them and the physical quantities behind these phenomena, into the test loads of the verification. The developed method is applied to the test plan of an automotive sensor. The method is general and can be used for any part of a vehicle, including mechanical, electrical and mechatronic ones, such as sensors and actuators. Nonetheless, the method is applicable in a much broader application area, even outside of the automotive industry.

Regional development potentials of Industry 4.0: Open data indicators of the Industry 4.0+ model

This paper aims to identify the regional potential of Industry 4.0 (I4.0). Although the regional background of a company significantly determines how the concept of I4.0 can be introduced, the regional aspects of digital transformation are often neglected in the analysis of I4.0 readiness. Based on the analysis of I4.0 readiness models, the external regional success factors of the implementation of I4.0 solutions are determined. An I4.0+ (regional Industry 4.0) readiness model, i.e., a specific indicator system, is developed to foster medium-term regional I4.0 readiness analysis and foresight planning. The indicator system is based on three types of data sources: (1) open governmental data; (2) alternative metrics, like the number of I4.0-related publications and patent applications; and (3) the number of news stories related to economic and industrial development. The indicators are aggregated to the statistical regions (NUTS 2), and their relationships are analyzed using the Sum of Ranking Differences (SRD) and Promethee II methods. The developed I4.0+ readiness index correlates with regional economic, innovation and competitiveness indexes, which indicates the importance of boosting regional I4.0 readiness.

Data describing the relationship between world news and sustainable development goals

The data article presents a dataset and a tool for news-based monitoring of sustainable development goals defined by the United Nations. The presented dataset was created by structured queries of the GDELT database based on the categories of the World Bank taxonomy matched to sustainable development goals. The Google BigQuery SQL scripts and the results of the related network analysis are attached to the data to provide a toolset for the strategic management of sustainability issues. The article demonstrates the dataset on the 6th sustainable development goal (Clean Water and Sanitation). The network formed based on how countries appear in the same news can be used to explore potential international cooperation. The network formed based on how topics of the World Bank taxonomy appear in the same news can be used to explore how the problems and solutions of sustainability issues are interlinked.

Access to data

The Applicability of Big Data in Climate Change Research: The Importance of System of Systems Thinking

The aim of this paper is to provide an overview of the interrelationship between data science and climate studies and to describe how sustainability and climate issues can be managed using Big Data tools. Climate-related Big Data articles are analyzed and categorized, which reveals an increasing number of data-driven solutions in specific areas, while broad integrative analyses receive less focus. Our major objective is to highlight the potential of the System of Systems (SoS) theorem, as the synergies between diverse disciplines and research ideas must be explored to gain a comprehensive overview of the issue. Data and systems science enables a large amount of heterogeneous data to be integrated and simulation models to be developed while considering socio-environmental interrelations in parallel. The improved knowledge integration offered by System of Systems thinking, or climate computing, is demonstrated by analysing the possible interlinkages of the latest Big Data application papers. The analysis highlights how data and models focusing on specific areas of sustainability can be bridged to study the complex problems of climate change.

We are delivering a course about "Sensing the future of science"

The course helps participants develop the ability to generate research ideas with potential social and economic impact.

The course material covers the concepts of technology scouting and the exploration of information sources to find 'evidence of the future in the present'.

You can download the course material from: here

Modelling for Digital Twins - Potential Role of Surrogate Models

The application of white box models in digital twins is often hindered by missing knowledge, uncertain information and computational difficulties. Our aim was to overview the difficulties and challenges regarding the modelling aspects of digital twin applications and to explore the fields where surrogate models can be utilised advantageously. In this sense, the paper discusses what types of surrogate models are suitable for different practical problems as well as introduces the appropriate techniques for building and using these models. A number of examples of digital twin applications from both continuous processes and discrete manufacturing are presented to underline the potentials of utilising surrogate models. The surrogate models and model-building methods are categorised according to the area of applications. The importance of keeping these models up to date through their whole model life cycle is also highlighted. An industrial case study is also presented to demonstrate the applicability of the concept.

Integrated Survival Analysis and Frequent Pattern Mining for Course Failure-Based Prediction of Student Dropout

A data-driven method to identify frequent sets of course failures that students should avoid in order to minimize the likelihood of dropping out from their university training is proposed. The overall probability distribution of the dropout is determined by survival analysis. This result can only describe the mean dropout rate of the undergraduates. However, due to the failure of different courses, the chances of dropout can vary highly, so the traditional survival model should be extended with event analysis. The study paths of students are represented as events in relation to the lack of completing the required subjects for every semester. Frequent patterns of backlogs are discovered by the mining of frequent sets of these events. The prediction of dropout is personalised by classifying the success of the transitions between the semesters. Based on the explored frequent item sets and classifiers, association rules are formed providing the estimates of the success of the continuation of the studies in the form of confidence metrics. The results can be used to identify critical study paths and courses. Furthermore, based on the patterns of individual uncompleted subjects, the chance of continuation can be predicted for every semester. The analysis of the critical study paths can be used to design personalised actions minimizing the risk of dropout, or to redesign the curriculum aimed at reducing the dropout rate. The applicability of the method is demonstrated based on the analysis of the progress of chemical engineering students at the University of Pannonia in Hungary. The method is suitable for the examination of more general problems assuming the occurrence of a set of events whose combinations may trigger a set of critical events.
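A minimal sketch of the two building blocks, assuming the lifelines and mlxtend packages and purely illustrative student records: a Kaplan-Meier estimate of dropout over the semesters and Apriori mining of frequent sets of failed courses (the course names are hypothetical).

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from mlxtend.frequent_patterns import apriori

    # durations = semesters completed, dropout = 1 if the student dropped out
    students = pd.DataFrame({
        "semesters": [2, 6, 3, 7, 4, 7, 2, 5],
        "dropout":   [1, 0, 1, 0, 1, 0, 1, 1],
    })
    kmf = KaplanMeierFitter().fit(students["semesters"], students["dropout"])
    print(kmf.survival_function_)          # mean dropout curve over the semesters

    # One-hot table of failed courses per student (hypothetical course codes)
    failures = pd.DataFrame({
        "Calculus1": [1, 0, 1, 0, 1, 0, 1, 1],
        "Physics1":  [1, 0, 1, 0, 0, 0, 1, 0],
        "ChemLab":   [0, 0, 1, 0, 1, 0, 0, 1],
    }).astype(bool)
    print(apriori(failures, min_support=0.3, use_colnames=True))

The frequent failure sets found in the second step are the event patterns that personalise the survival estimate of the first step.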

Frequent Itemset Mining and Multi-Layer Network-Based Analysis of RDF Databases

Triplestores or resource description framework (RDF) stores are purpose-built databases used to organise, store and share data with context. Knowledge extraction from a large amount of interconnected data requires effective tools and methods to address the complexity and the underlying structure of semantic information. We propose a method that generates an interpretable multilayered network from an RDF database. The method utilises frequent itemset mining (FIM) of the subjects, predicates and objects of the RDF data, and automatically extracts informative subsets of the database for the analysis. The results are used to form layers in an analysable multidimensional network. The methodology enables a consistent, transparent, multi-aspect-oriented knowledge extraction from the linked dataset. To demonstrate the usability and effectiveness of the methodology, we analyse how the science of sustainability and climate change is structured using the Microsoft Academic Knowledge Graph. In the case study, the FIM forms networks of disciplines to reveal the significant interdisciplinary science communities in sustainability and climate change. The constructed multilayer network then enables an analysis of the significant disciplines and interdisciplinary scientific areas. To demonstrate the proposed knowledge extraction process, we search for interdisciplinary science communities and then measure and rank their multidisciplinary effects. The analysis identifies discipline similarities, pinpointing the similarity between atmospheric science and meteorology as well as between geomorphology and oceanography. The results confirm that frequent itemset mining provides informative sampled subsets of RDF databases which can be simultaneously analysed as layers of a multilayer network.
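A minimal sketch, not the published pipeline, assuming the mlxtend package and a handful of toy triples: the RDF triples are grouped into per-subject transactions and mined with Apriori, and each frequent itemset can then define a layer of the multilayer network.

    import pandas as pd
    from mlxtend.preprocessing import TransactionEncoder
    from mlxtend.frequent_patterns import apriori

    triples = [  # (subject, predicate, object), purely illustrative
        ("paper1", "hasDiscipline", "Meteorology"),
        ("paper1", "hasDiscipline", "AtmosphericScience"),
        ("paper2", "hasDiscipline", "Meteorology"),
        ("paper2", "hasDiscipline", "AtmosphericScience"),
        ("paper3", "hasDiscipline", "Oceanography"),
        ("paper3", "hasDiscipline", "Geomorphology"),
    ]

    # Transactions: the set of objects attached to each subject
    transactions = {}
    for s, p, o in triples:
        transactions.setdefault(s, set()).add(o)

    te = TransactionEncoder()
    onehot = pd.DataFrame(te.fit_transform([sorted(t) for t in transactions.values()]),
                          columns=te.columns_)
    frequent = apriori(onehot, min_support=0.3, use_colnames=True)
    print(frequent)   # each frequent itemset is a candidate layer of the network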

Industry 4.0-Driven Development of Optimization Algorithms: A Systematic Overview

The Fourth Industrial Revolution means the digital transformation of production systems. Cyber-physical systems allow for the horizontal and vertical integration of these production systems as well as the exploitation of the benefits via optimization tools. This article reviews the impact of Industry 4.0 solutions concerning optimization tasks and optimization algorithms, in addition to the identification of the new R&D directions driven by new application options. The basic organizing principle of this overview of the literature is to explore the requirements of optimization tasks, which are needed to perform horizontal and vertical integration. This systematic review presents content from 900 articles on Industry 4.0 and optimization as well as 388 articles on Industry 4.0 and scheduling. It is our hope that this work can serve as a starting point for researchers and developers in the field.

The intertwining of world news with Sustainable Development Goals: An effective monitoring tool

This study aims to bring about a novel approach to the analysis of the Sustainable Development Goals (SDGs) based solely on the appearance of news. Our purpose is to provide a monitoring tool that enables world news to be detected in an SDG-oriented manner, with multilingual as well as wide geographic coverage. The association of the goals with news is based on the World Bank Group Topical Taxonomy, from which the selection of search words approximates the 17 development goals. News is extracted from the GDELT Project (Global Database of Events, Language and Tone), which gathers both printed and online news from around the world. 60 851 572 relevant news stories were identified in 2019. The intertwining of world news with the SDGs as well as the connections between countries are interpreted, highlighting that even in the most SDG-sensitive countries, only 2.5% of the news can be attributed to the goals. Most of the news about sustainability appears in Africa as well as East and Southeast Asia; moreover, the most negative tone of news can typically be observed in Africa. In the case of climate change (SDG 13), the United States plays a key role in both the share of news and the negative tone. Using the tools of network science, it can be verified that the SDGs can be characterized on the basis of world news.

This news-centred network analysis of SDGs identifies global partnerships as well as national stages of implementation towards a sustainable socio-environmental ecosystem. In the field of sustainability, it is vital to form the attitudes and environmental awareness of people, which strategic plans cannot address but can be measured well through the news.

The impact of the fourth industrial revolution on the replacement of competences

Continuous change can be observed in many areas of life, especially in connection with the fourth industrial revolution currently unfolding in practice. Through its technological innovations, Industry 4.0 is significantly changing the labour market and workplaces. Adapting to the ongoing and expected changes is therefore unavoidable, yet it is difficult to say what competences will be needed for this in the future. The aim of the research is to determine, while identifying Industry 4.0 solutions, how the demand for competences is changing, based on structured interviews conducted with the examined companies. The research highlights that the replacement of competences and the related development processes have already begun. Hopefully, the explored relationships will inspire further research, provide guidance for the development and renewal of employee training programmes, and serve as useful information in the fields of HRM and Industry 4.0, supporting the elaboration and implementation of competence-development and HR-development strategies.

László Nagy took second place in the IEEE HS Student Paper Contest

László Nagy took second place in the IEEE HS Student Paper Contest with the article entitled: "Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing".


Estimation of machine setup and changeover times by survival analysis

The losses associated with changeovers are becoming more significant in manufacturing due to the high variance of products and requirements for just-in-time production. The study is based on the single minute exchange of die (SMED) philosophy, which aims to reduce changeover times. We introduced a method for the analysis of these losses based on models that estimate the product- and operator-dependent changeover times using survival analysis. The root causes of the losses are identified by significance tests of the utilized Cox regression models. The resulting models can be used to design a performance management system that considers the stochastic nature of the work of the operators. An anonymized manufacturing example related to the setup of crimping and wire cutting machines demonstrates the applicability of the method.
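A minimal sketch, assuming the lifelines package and synthetic changeover records (the covariates, effect sizes and column names below are assumptions): a Cox proportional hazards model of the changeover duration whose significance tests point to the covariates driving the losses.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 200
    operator_B = rng.integers(0, 2, n)              # dummy-coded operator
    complexity = rng.integers(1, 6, n)              # product complexity class
    # Assumed effect: duration grows with complexity and with operator B
    duration = rng.exponential(scale=10 + 4 * complexity + 6 * operator_B)

    changeovers = pd.DataFrame({
        "duration_min": duration,
        "finished": 1,                              # 1 = changeover completed (no censoring)
        "operator_B": operator_B,
        "complexity": complexity,
    })

    cph = CoxPHFitter()
    cph.fit(changeovers, duration_col="duration_min", event_col="finished")
    cph.print_summary()   # significance tests flag the root causes of long changeovers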

János Abonyi presented on the applicability of data science and machine learning in water management at the National Conference on Water Value and Digital Water Management

The virtual meeting was held on December 2-3, 2020, organized by MaSzeSz (Hungarian Water and Wastewater Association).

The conference discussed the real value of sustainable water utility services, knowledge-based management, reducing the large gap in costs between small and large settlements, increasing the value and social recognition of water services, the international value of the domestic water industry, digital data and information management of municipal water management, and the professionals of the future.

More details about the conference are available here.

Machine Learning Based Analysis of Human Serum N-glycome Alterations to Follow up Lung Tumor Surgery

The human serum N-glycome is a valuable source of biomarkers for malignant diseases, already utilized in multiple studies. In this paper, the N-glycosylation changes in human serum proteins were analyzed after surgical lung tumor resection. Seventeen lung cancer patients were involved in this study, and the N-glycosylation pattern of their serum samples was analyzed before and after the surgery using capillary electrophoresis separation with laser-induced fluorescent detection. The relative peak areas of 21 N-glycans were evaluated from the acquired electropherograms using machine learning-based data analysis. Individual glycans as well as their subclasses were taken into account during the course of evaluation. For the data analysis, both discrete (e.g., smoker or not) and continuous (e.g., age of the patient) clinical parameters were compared against the alterations in these 21 N-linked carbohydrate structures. The classification tree analysis resulted in a panel of N-glycans, which could be used to follow up on the effects of lung tumor surgical resection.

Application possibilities of the data science toolbox in identifying and managing the challenges of climate change

Going through the most important questions of sustainability science, we presented what research and development activities will be needed in the future so that the data science toolbox can provide effective support in understanding and managing the complex problems of climate change. Data-driven applications that have proven successful so far were reviewed with the help of keyword analysis. Our analysis illustrated that Big Data is an increasingly widely used tool of climate science; at the same time, there are few truly comprehensive, integrative analyses that actually exploit the advantages of this technology. With our study, we would like to draw attention to the system-of-systems (SoS) principle, since the drivers and impacts of climate change can only be recognized, and the impacts can only be adapted to and withstood, if the synergies between new research directions are recognized and explored in time.

Abonyi János, Czvetkó Tímea, Sebestyén Viktor. "Az adattudomány eszköztárának alkalmazási lehetőségei a klímaváltozás kihívásainak azonosításában és kezelésében." MKL, 2020

Real-Time Locating System in Production Management

Real-time monitoring and optimization of production and logistics processes significantly improve the efficiency of production systems. Advanced production management solutions require real-time information about the status of products, production, and resources. As real-time locating systems (also referred to as indoor positioning systems) can enrich the available information, these systems started to gain attention in industrial environments in recent years. This paper provides a review of the possible technologies and applications related to production control and logistics, quality management, safety, and efficiency monitoring. This work also provides a workflow to clarify the steps of a typical real-time locating system project, including the cleaning, pre-processing, and analysis of the data, to provide a guideline and reference for research and development of indoor positioning-based manufacturing solutions.

Interview with János Abonyi in the history journal Sic Itur ad Astra

Its topic is the network as metaphor and model: the toolbox of the social sciences has expanded.

A conversation with network researchers János Abonyi and Balázs Lengyel about current trends in network theory.

The interview is available at this link.

Data describing the regional Industry 4.0 readiness index

The data article presents a dataset suitable for measuring regional Industry 4.0 (I4.0+) readiness. The I4.0+ dataset includes 101 indicators with 248 958 observations, aggregated to the NUTS 2 statistical level, based on open data in the fields of education (ETER, Erasmus), science (USPTO, MA-Graph, GRID), government (Eurostat) and media coverage (GDELT). The indicators cover the I4.0-specific domains of higher education and lifelong learning, innovation, technological investment, the labour market and technological readiness. A composite indicator, the I4.0+ index, was constructed by the Promethee method to rank regions regarding their I4.0 performance. The index is validated against economic (GDP) and innovation indexes (Regional Innovation Index).

Data accessibility

Techtogether engineering competition

The Techtogether engineering competition was held at the Industry Days 2020 event, where the research group proposed a task for students to solve. Students had to present exciting industrial solutions on the topic of 'Improving the digital twin of production systems and increasing efficiency based on data analysis'.

Technology meetup 06.10.2020 - 18:00

Why is it worth going back to school? What can those who apply for our latest trainings learn?

These questions were answered at the Technology Meetup held in Veszprém on the 6th of October. Gyula Dörgő and Tamás Ruppert presented the work of the research group and introduced the Industry 4.0 engineering training and the Automotive Quality Academy to the audience.

Decision trees for informative process alarm definition and alarm-based fault classification

Alarm messages in industrial processes are designed to draw attention to abnormalities that require timely assessment or intervention. However, in practice, alarms are often arbitrarily and excessively defined by process operators, resulting in numerous nuisance and chattering alarms that are simply a source of distraction. Countless techniques are available for the retrospective filtering of alarm data, e.g., adding time delays and deadbands to existing alarm settings. As an alternative, in the present paper, instead of filtering or modifying existing alarms, a method for the design of alarm messages that are informative for fault detection is proposed, which takes into consideration that the generated alarm messages should be optimal for fault detection and identification in the first place. This methodology utilizes a machine learning technique, the decision tree classifier, which provides linguistically well-interpretable models without the modification of the measured process variables. Furthermore, an online application of the defined alarm messages for fault identification is presented using a sliding window-based data preprocessing approach. The effectiveness of the proposed methodology is demonstrated in terms of the analysis of a well-known benchmark simulator of a vinyl-acetate production technology, where the complexity of the simulator is considered sufficient for the testing of alarm systems.

Note to practitioners: Process-specific knowledge can be used to label historical process data as normal operating and fault-specific periods. Alarm generation should be designed to be able to detect and isolate faulty states. Using decision trees, optimal "cuts" or alarm limits for the purpose of fault classification can be defined utilizing a labelled dataset. The results apply to a variety of industries operating with online control systems, and are especially timely in the chemical industry.
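A minimal sketch of the core idea, assuming scikit-learn and synthetic labelled data (the variable names and values are hypothetical): a shallow decision tree is trained on normal and faulty samples and its split thresholds are read out as candidate alarm limits.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    normal = rng.normal(loc=[50.0, 1.2], scale=[2.0, 0.1], size=(200, 2))
    faulty = rng.normal(loc=[58.0, 1.5], scale=[2.0, 0.1], size=(50, 2))
    X = np.vstack([normal, faulty])
    y = np.array([0] * 200 + [1] * 50)            # 0 = normal, 1 = fault

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

    variables = ["temperature", "pressure"]        # hypothetical variable names
    for node in range(tree.tree_.node_count):
        feat = tree.tree_.feature[node]
        if feat >= 0:                              # internal node, i.e. a "cut"
            print(f"candidate alarm limit on {variables[feat]}: "
                  f"split at {tree.tree_.threshold[node]:.2f}")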


Local newspaper reported about our professional engineer training in Industry 4.0 and the Automotive Quality Academy

Link to the Industry 4.0 Professional Engineer Training website

Link to the Automotive Quality Academy website

Directions of membrane separator development for microbial fuel cells: A retrospective analysis using frequent itemset mining and descriptive statistical approach

To increase the efficiency of microbial fuel cells (MFCs), the separator (which is mostly a membrane) placed between the electrodes or their compartments is considered of high importance, besides several other biotic and abiotic factors (e.g. configuration, mode of operation, types of inoculum and substrate). Nafion-based proton exchange membranes (PEMs) are the most widespread, although these materials are often criticized on various technological and economical grounds. Therefore, to find alternatives to Nafion, the synthesis, development and testing of novel/commercialized membrane separators with enhanced characteristics have been hot topics. In this study, the goals were to assess membrane-installed MFCs in a retrospective manner and to reveal the trends, the applied practices, frequent setups, etc. via Bayesian classification and frequent itemset mining algorithms. Thereafter, a separate discussion was devoted to examining the current standing of research related to the major membrane groups used in MFCs and to evaluating, in view of the big picture, how the various systems behave in comparison with each other, especially compared to those applying Nafion PEM. It was concluded that some membrane types seem to be competitive with Nafion; however, the standardization of the experiments would enable a more unambiguous comparison of studies.


Janos Abonyi was invited to join the program committee of the Evolutionary Multi-Criterion Optimization (EMO) 2021 conference

Janos Abonyi serves as a program committee member at the 11th International Conference on Evolutionary Multi-Criterion Optimization (EMO).

The conference aims to bring together the EMO and Multiple Criteria Decision-Making (MCDM) communities as well as other related fields, with a focus on solving real-world problems in government, business and industry.

The conference will be held in hybrid form on March 28-31, 2021, in Shenzhen and online.

Link to conference website

Integration of real-time locating systems into digital twins

Cyber-physical, model-based solutions should rely on digital twins in which simulations are integrated with real-time sensory and manufacturing data. This paper highlights the benefits of information fusion with real-time locating systems (RTLS) and demonstrates how position and acceleration data can be utilised for the simulation-based analysis of product-specific activity times. Thanks to the real-time connection between the RTLS and the adaptive simulation models, the proposed digital twin is capable of continuously predicting the production status and providing information for the monitoring of production performance. The presented industrial case study demonstrates how the resulting Simulation 4.0 concept supports the analysis of human resource effectiveness (HRE) in an assembly process.

Are Regions Prepared for Industry 4.0? The Industry 4.0+ Indicator System for Assessment

The concept of Industry 4.0 is spreading worldwide, and readiness models exist to determine organizational or national maturity. On the other hand, the regional perspective of the digital transformation is yet to be widely researched, although it significantly determines how the concept of Industry 4.0 can be introduced to organisations. This book identifies the regional aspects of Industry 4.0 and provides a regional (NUTS 2 classified) Industry 4.0 indicator system model that is based on open data sources. This new model serves as a tool to evaluate regional economies to support governmental decisions. It also provides territorial councils with a decision-support tool for field investment decisions. Finally, the model offers investors a heat map to evaluate how successfully regional economies implement Industry 4.0 solutions.

J. Abonyi, T. Czvetko, G. Honti, Are Regions Prepared for Industry 4.0? The Industry 4.0+ Indicator System for Assessment, SpringerBriefs in Entrepreneurship and Innovation

Development of manufacturing execution systems in accordance with Industry 4.0 requirements: A review of standard- and ontology-based methodologies and tools

This work presents how recent trends in Industry 4.0 (I4.0) solutions are influencing the development of manufacturing execution systems (MESs) and analyzes what kinds of trends will determine the development of the next generation of these technologies. This systematic and thematic review provides a detailed analysis of I4.0-related requirements in terms of MES functionalities and an overview of MES development methods and standards because these three aspects are essential in developing MESs. The analysis highlights that MESs should interconnect all components of cyber-physical systems in a seamless, secure, and trustworthy manner to enable high-level automated smart solutions and that semantic metadata can provide contextual information to support interoperability and modular development. The observed trends show that formal models and ontologies will play an even more essential role in I4.0 systems as interoperability becomes more of a focus and that the new generation of linkable data sources should be based on semantically enriched information. The presented overview can serve as a guide for engineers interested in the development of MESs as well as for researchers interested in finding worthwhile areas of research.

Pairwise comparison based Failure Mode and Effects Analysis (FMEA)

The proposed method supports the determination of the severity (S), occurrence (O), and detection (D) indices of Failure Modes and Effects Analysis (FMEA). Previously evaluated and previously unstudied risks are compared pairwise. The analysis of the resulting pairwise comparison matrix provides information about the consistency of the risk evaluations and allows the estimation of the indices of the previously not evaluated risks. The advantages of the method, illustrated by the consistency-check sketch after the list below, include:

  • The pairwise comparison facilitates the identification of risks that are otherwise difficult to evaluate

  • The inconsistency of existing FMEA studies can be highlighted and systematically reduced

  • The method can be generalized to a wide range of grading problems
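A minimal sketch of the consistency check using standard eigenvector-based arithmetic (not necessarily the exact procedure of the paper); the comparison values below are purely illustrative.

    import numpy as np

    # A[i, j] = how much more severe risk i is judged than risk j (reciprocal matrix)
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                     # relative severity of the risks

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # random index (Saaty)
    print("weights:", weights, "consistency ratio:", ci / ri)

A low consistency ratio indicates that the expert judgments are coherent; a high value flags the evaluations that should be revisited.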

Analytic Hierarchy Process and Multilayer Network-Based Method for Assembly Line Balancing

Assembly line balancing improves the efficiency of production systems by the optimal assignment of tasks to operators. The optimisation of this assignment requires models that provide information about the activity times, constraints and costs of the assignments. A multilayer network-based representation of the assembly line-balancing problem is proposed, in which the layers of the network represent the skills of the operators, the tools required for their activities and the precedence constraints of their activities. The activity–operator network layer is designed by a multi-objective optimisation algorithm in which the training and equipment costs as well as the precedence of the activities are also taken into account. As these costs are difficult to evaluate, the analytic hierarchy process (AHP) technique is used to quantify the importance of the criteria. The optimisation problem is solved by a multi-level simulated annealing algorithm (SA) that efficiently handles the precedence constraints. The efficiency of the method is demonstrated by a case study from wire harness manufacturing.

We appeared in Innotéka magazine!

A portrait of Dr. János Abonyi, entitled "Vonzódni az ismeretlenhez" ("Attracted to the Unknown"), was published in the May issue of Innotéka magazine.

The article is available at the following link:

Link to the article

The May issue is available at the following link:

Link to the May issue

Multilayer network based comparative document analysis (MUNCoDA)

The proposed multilayer network-based comparative document analysis (MUNCoDA) method supports the identification of the common points of a set of documents that deal with the same subject area. As the documents are transformed into networks of informative word pairs, the collection of documents forms a multilayer network that allows the comparative evaluation of the texts. The multilayer network can be visualized and analyzed to highlight how the texts are structured. The topics of the documents can be clustered based on the developed similarity measures. By exploring the network centralities, topic importance values can be assigned. The method is fully automated by KNIME preprocessing tools and MATLAB/Octave code; a small word-pair network sketch is given below.

• Networks can be formed based on informative word pairs of multiple documents

• The analysis of the proposed multilayer networks provides information for multi-document summarization

• Words and documents can be clustered based on node similarity and edge overlap measures

V. Sebestyén, E. Domokos, J. Abonyi: Multilayer network based comparative document analysis (MUNCoDA), MethodsX, Volume 7, 2020
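A minimal word-pair network sketch with NetworkX (the published tool is automated with KNIME and MATLAB/Octave, so this is only an illustration); the two documents and the edge-overlap comparison are made up for the example.

    from itertools import combinations
    import networkx as nx

    documents = {
        "doc1": "sustainable water management improves water quality",
        "doc2": "water quality monitoring supports sustainable policy",
    }

    layers = {}
    for name, text in documents.items():
        words = set(text.lower().split())
        g = nx.Graph()
        g.add_edges_from(combinations(sorted(words), 2))   # word-pair edges of one layer
        layers[name] = g

    # Edge overlap between two layers as a simple document-similarity measure
    edges1 = {frozenset(e) for e in layers["doc1"].edges()}
    edges2 = {frozenset(e) for e in layers["doc2"].edges()}
    overlap = len(edges1 & edges2) / len(edges1 | edges2)
    print("edge overlap (Jaccard):", round(overlap, 3))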

Focal points for sustainable development strategies—Text mining-based comparative analysis of voluntary national reviews has been published!

Post date: March 11, 2020 10:00:00 AM

Countries have to work out and follow tailored strategies for the achievement of their Sustainable Development Goals. At the end of 2018, more than 100 voluntary national reviews had been published. The reviews are transformed by text mining algorithms into networks of keywords to identify country-specific thematic areas of the strategies and to cluster countries that face similar problems and follow similar development strategies. The analysis of the 75 VNRs has shown that SDG5 (gender equality) is the most discussed goal worldwide, as it is covered in 77% of the analysed Voluntary National Reviews. SDG8 (decent work and economic growth) is the second most studied goal with 76%, while SDG1 (no poverty) is the least focused goal, mentioned in only 48% of the documents, and SDG10 (reduced inequalities) in 49%. The results demonstrate that the proposed benchmark tool is capable of highlighting what kinds of activities can make significant contributions to achieving sustainable development.

Prof. Janos Abonyi was invited to join the program committee of the 7th edition of the International Conference on Time Series and Forecasting (ITISE 2020)

Post date: Feb 22, 2020 09:00:00 AM

The ITISE 2020 (7th International Conference on Time Series and Forecasting) seeks to provide a discussion forum for scientists, engineers, educators and students about the latest ideas and realizations in the foundations, theory, models and applications for interdisciplinary and multidisciplinary research, encompassing disciplines of computer science, mathematics, statistics, forecasting, econometrics, etc., in the field of time series analysis and forecasting.

The aim of ITISE 2020 is to create a friendly environment that could lead to the establishment or strengthening of scientific collaborations and exchanges among attendees. Therefore, ITISE 2020 solicits high-quality original research papers (including significant work-in-progress) on any aspect of time series analysis and forecasting, in order to motivate the generation and use of knowledge and new computational techniques and methods on forecasting in a wide range of fields.

Link to the conference website

Post date: Feb 16, 2020 10:00:00 PM

The school is organized at the University of Catania, Italy, by the Department of Electrical Electronics and Computer Science and the Cometa Consortium.

It consists of a series of lectures given by leading scientists in the field, aiming at providing a comprehensive treatment from background material to advanced results. The school is especially directed at PhD students and young researchers interested in the diverse aspects of the theory and applications of complex networks in science and engineering. The school aims at encouraging cross-disciplinary discussions between participants and speakers and at starting new joint research.

The Assembly magazine reported on our methodology developed for activity time monitoring:

Post date: Feb 13, 2020 6:00:00 PM

Industry 4.0 and the digital manufacturing revolution are all about collecting—and, more importantly, acting on—data gathered from the assembly process in real time. That’s all well and good when data is coming from sensors, vision systems, fastening tools and other electronic devices. But, how can engineers gather real-time data on largely manual assembly processes, such as wire harness assembly?

To solve this problem, we developed a software- and sensor-based system to measure activity times and performance on a wire harness assembly line. To ensure a real-time connection between assembler performance and varying product complexity, our system relies on fixture sensors and an indoor positioning system (IPS). Our goal was to create a system that could continuously estimate the time consumed by the various elementary activities that make up wire harness assembly. Our system creates a model of a task, compares estimated activity times to the actual performance of assemblers, and generates early warnings when their productivity decreases.

J. Abonyi, T. Ruppert, Monitoring Activity During Wire Harness Assembly, Assembly, 2020

Mixtures of QSAR Models – Learning Application Domains of pKa Predictors has been published!

Post date: Feb 11, 2020 5:00:00 PM

Quantitative structure-activity relationship (QSAR) models predict physical properties or biological effects based on physicochemical properties or molecular descriptors of chemical structures. Our work focuses on the construction of optimal linear and nonlinear weighted mixtures of individual QSAR models to achieve more accurate predictions. We highlight how splitting the application domain by a nonlinear gating network in a "mixture of experts" model structure is suitable for the determination of the optimal domain-specific QSAR model, and how the optimal QSAR model for certain chemical groups can be determined. The input of the gating network is formed arbitrarily by the various molecular structure descriptors and/or even the predictions of the individual QSAR models. The applicability of the method is demonstrated on the pKa values of the OASIS database (1912 chemicals) by the combination of four acidic pKa predictions of the OECD QSAR Toolbox. According to the results, the prediction performance was improved by more than 15% (RMSE value) compared to the predictions of the best individual QSAR model.

J. Abonyi, T. Varga, O. P. Hamadi, Gy. Dorgo, Mixtures of QSAR Models – Learning Application Domains of pKa Predictors, Journal of Chemometrics, 2020
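A minimal, purely illustrative sketch of the mixture-of-experts combination step: a softmax gating function over molecular descriptors weights the predictions of two individual pKa models. All descriptor values, expert predictions and gating parameters below are assumptions, not data from the paper.

    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    descriptors = np.array([[0.2, 1.5],            # e.g. two molecular descriptors
                            [1.1, 0.3]])
    expert_predictions = np.array([[4.8, 5.6],     # pKa predicted by model 1 and 2
                                   [7.2, 6.4]])

    W = np.array([[1.0, -0.5],                     # gating parameters (assumed; would
                  [-0.8, 0.9]])                    # be learned from training data)
    gates = softmax(descriptors @ W)               # per-molecule expert weights
    mixed_pka = np.sum(gates * expert_predictions, axis=1)
    print(mixed_pka)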

Janos Abonyi invited to serve as a Committee member!

Post date: Feb 10, 2020 9:00:00 PM

Dr. Janos Abonyi has been invited to serve as a Technical Program Committee member of the IEEE Wireless Africa 2020 conference.

IEEE Wireless Africa 2020 is sponsored by the IEEE Vehicular Technology Society and will be hosted in South Africa, from 29-30 November 2020.

The conference aims to provide a platform for wireless researchers to share their results, call for comments and collaborations, and exchange innovative ideas on leading edge research in wireless technologies.

Link to the conference website

Link to the precise track

A multilayer and spatial description of the Erasmus mobility network has been published!

Post date: Feb 6, 2020 4:00:00 PM

The Erasmus Programme is the biggest collaboration network consisting of European Higher Education Institutions (HEIs). The flows of students, teachers and staff form directed and weighted networks that connect institutions, regions and countries. Here, we present a linked and manually verified dataset of this multiplex, multipartite, multi-labelled, spatial network. We enriched the network with institutional socio-economic data from the European Tertiary Education Register (ETER) and the Global Research Identifier Database (GRID). We geocoded the headquarters of institutions and characterised the attractiveness and quality of their environments based on Points of Interest (POI) data. The linked datasets provide relevant information to grasp a more comprehensive understanding of the mobility patterns and attractiveness of the institutions.

J. Abonyi, L. Gadár, Zs. T. Kosztyán, A. Telcs, A multilayer and spatial description of the Erasmus mobility network, Scientific Data 7, Article 41, 2020

Conference presentation

Tamás Ruppert and Róbert Csalódi attended the 7th International Conference on Industrial Engineering and Applications (ICIEA) in Paris from 15 to 17 January 2020. Their presentations were rated by the judges as the best in their respective sections.

Webpage of the conference

Network-Based Analysis of Dynamical Systems has been published!

Post date: Jan 15, 2020 6:00:00 PM

The key idea of this book is that the dynamical properties of complex systems can be determined by the effective calculation of specific structural features by network science-based analysis. Furthermore, certain dynamical behaviours can originate from the existence of specific motifs in the network representation or can be determined by network segmentation. Although the applicability of the methods and results was highlighted in the operability analysis of dynamical systems, the proposed methods can be utilised in various fields that will be mentioned at the end of each chapter.

J. Abonyi, Dániel Leitold, Ágnes Vathy-Fogarassy, Network-Based Analysis of Dynamical Systems, SpringerBriefs in Computer Science

Fuzzy activity time-based model predictive control of open station assembly lines is published!

Post date: Dec 14, 2019 3:00:00 PM

The sequencing and line balancing of manual mixed-model assembly lines are challenging tasks due to the complexity and uncertainty of operator activities. The control of cycle time and the sequencing of production can mitigate the losses due to non-optimal line balancing in the case of open-station production where the operators can work ahead of schedule and try to reduce their backlog. The objective of this paper is to provide a cycle time control algorithm that can improve the efficiency of assembly lines in such situations based on a specially mixed sequencing strategy. To handle the uncertainty of activity times, a fuzzy model-based solution has been developed. As the production process is modular, the fuzzy sets represent the uncertainty of the elementary activity times related to the processing of the modules. The optimistic and pessimistic estimates of the completion of activity times extracted from the fuzzy model are incorporated into a model predictive control algorithm to ensure the constrained optimization of the cycle time. The applicability of the proposed method is demonstrated based on a wire-harness manufacturing process with a paced conveyor, but the proposed algorithm can handle continuous conveyors as well. The results confirm that the application of the proposed algorithm is widely applicable in cases where a production line of a supply chain is not well balanced and the activity times are uncertain.

J. Abonyi, Tamas Ruppert, Gyula Dorgo, Fuzzy activity time-based model predictive control of open-station assembly lines, Journal of Manufacturing Systems Volume 54, January 2020, Pages 12-23
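
To illustrate the core idea, the minimal Python sketch below shows how triangular fuzzy activity times yield optimistic and pessimistic completion-time bounds through an alpha-cut, and how the pessimistic bound can drive the simplest possible cycle-time decision. The station layout, the activity times and the alpha level are hypothetical, and the sketch is not the controller described in the paper.

```python
# Minimal sketch: triangular fuzzy activity times give optimistic/pessimistic
# completion-time bounds via an alpha-cut, and the cycle time is chosen as the
# smallest value that covers the pessimistic workload of every station.
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = tri
    return a + alpha * (b - a), c - alpha * (c - b)

# hypothetical elementary activity times per station and module: (min, mode, max) in seconds
stations = {
    "S1": [(8, 10, 14), (5, 6, 9)],
    "S2": [(12, 15, 20)],
    "S3": [(7, 9, 11), (4, 5, 6), (3, 4, 6)],
}
alpha = 0.5  # confidence level of the estimates

optimistic, pessimistic = {}, {}
for station, activities in stations.items():
    cuts = [alpha_cut(t, alpha) for t in activities]
    optimistic[station] = sum(lo for lo, _ in cuts)
    pessimistic[station] = sum(hi for _, hi in cuts)

# simplest possible "control" decision: the cycle time must cover the
# pessimistic workload of the bottleneck station
cycle_time = max(pessimistic.values())
print("optimistic station times:", optimistic)
print("pessimistic station times:", pessimistic)
print("chosen cycle time:", cycle_time)
```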

Network analysis dataset of System Dynamics models is published!

Post date: Nov 01, 2019 3:00:00 PM

This paper presents a tool developed for the analysis of networks extracted from system dynamics models. The developed tool and the collected models were used and analyzed in the research paper, Review and structural analysis of system dynamics models in sustainability science. The models developed in Vensim, Stella, and InsightMaker are converted into networks of state-variables, flows, and parameters by the developed Python program, which also performs model reduction and modularity analysis and calculates the structural properties of the models and their main variables. The dataset covers the results of the analysis of nine models in sustainability science used for policy testing, prediction and simulation.

Honti G., Dorgo Gy., Abonyi J.: "Network analysis dataset of System Dynamics models", Data in Brief, 2019, Paper 104723
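
As a rough illustration of the kind of analysis the dataset supports, the sketch below builds a toy influence network of state variables, flows and parameters with networkx and computes modularity-based modules and simple centrality values. The toy model and variable names are made up, and the code is not the published analysis tool.

```python
# Minimal sketch (hypothetical toy structure, not one of the collected models):
# a system dynamics structure as a directed graph, analysed with networkx.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.DiGraph()
# edges: influence relations "source -> target"
G.add_edges_from([
    ("birth_rate", "births"), ("population", "births"), ("births", "population"),
    ("death_rate", "deaths"), ("population", "deaths"), ("deaths", "population"),
    ("population", "resource_use"), ("resources", "resource_use"),
    ("resource_use", "resources"),
])

# modularity-based modules on the undirected projection of the influence graph
communities = greedy_modularity_communities(G.to_undirected())
print("modules:", [sorted(c) for c in communities])

# the most central variables by degree centrality
central = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:3]
print("most central variables:", central)
```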

Constrained Recursive Input Estimation of Blending and Mixing Systems is published!

Post date: Oct 23, 2019 6:00:00 PM

Blending and mixing processes are often supported by advanced process control systems to maximise margins from the available component and heat streams. Since these model-based solutions require accurate and reliable data, in weakly instrumented processes the unknown inlet concentrations and temperatures should be estimated based on the measured outflows. This work presents a method for the reliable estimation of multiple input variables of process units. The key idea is that the input estimation problem is formulated as a constrained recursive estimation task. The applicability of the method is illustrated based on a benchmark model of a blending system. The performance of the method is compared to moving window and Kalman filter-based solutions. The results show the superior performance of the proposed method and confirm that the a priori knowledge-based constraints improve the robustness of the estimates.

Abonyi J.: „Constrained Recursive Input Estimation of Blending and Mixing Systems”, Chemical Engineering Transactions, 76, 727-732
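
The following sketch shows a stripped-down version of the idea on a perfectly mixed tank: the unknown inlet concentration is estimated by recursive least squares from noisy outlet measurements, and each update is projected onto physical bounds. The tank parameters and noise level are hypothetical, and the code is only a minimal illustration, not the estimator of the paper.

```python
# Minimal sketch: constrained recursive input estimation for a mixed tank.
# The outlet obeys c[k+1] = c[k] + dt*(q/V)*(c_in - c[k]), so each sample gives
# a noisy linear "measurement" of the unknown constant inlet concentration c_in.
import numpy as np

rng = np.random.default_rng(0)
dt, q_per_V = 1.0, 0.1            # sampling time and q/V of the tank [1/min]
c_in_true, c_max = 7.5, 10.0      # unknown true inlet and an a priori upper bound

# simulate noisy outlet concentration measurements
c_out = [0.0]
for _ in range(200):
    c_out.append(c_out[-1] + dt * q_per_V * (c_in_true - c_out[-1]))
y_meas = np.array(c_out) + rng.normal(0, 0.05, len(c_out))

# recursive least squares for the scalar unknown input, with projection
theta, P, lam = 0.0, 100.0, 0.99  # initial estimate, covariance, forgetting factor
for k in range(len(y_meas) - 1):
    phi = dt * q_per_V                                        # regressor
    y = y_meas[k + 1] - y_meas[k] + dt * q_per_V * y_meas[k]  # "measurement" of phi*c_in
    K = P * phi / (lam + phi * P * phi)                       # gain
    theta = theta + K * (y - phi * theta)                     # RLS update
    P = (P - K * phi * P) / lam
    theta = min(max(theta, 0.0), c_max)                       # projection onto the constraints

print(f"estimated inlet concentration: {theta:.2f} (true: {c_in_true})")
```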

Introduction to Data Analysis Course

Post date: Oct 18, 2019

We are delivering a data analysis course at MOL Value Chain Academy.

The course material is available here

Data-driven multilayer complex networks of sustainable development goals is published!

Post date: Oct 08, 2019 6:15:00 PM

This data article presents the formulation of a multilayer network for modelling the interconnections among the sustainable development goals (SDGs) and their targets, and includes the correlation-based linking of the sustainable development indicators with the available long-term datasets of The World Bank (2018). The spatial distribution of the time series data allows creating country-specific sustainability assessments. In the related research article “Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment” the similarities of SDGs for ten regions have been modelled in order to improve the quality of strategic environmental assessments. The datasets of the multilayer networks are available on Mendeley.

Sebestyén V., Bulla M., Rédey Á., Abonyi J.: „Data-driven multilayer complex networks of sustainable development goals”, Data in Brief, Volume 25, 2019, 104049

Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox is published!

Post date: Oct 08, 2019 6:10:00 PM

The network science-based determination of driver nodes and sensor placement has become increasingly popular in the field of dynamical systems over the last decade. In this paper, the applicability of the methodology in the field of life sciences is introduced through the analysis of the neural network of Caenorhabditis elegans. Simultaneously, an Octave and MATLAB-compatible NOCAD toolbox is proposed that provides a set of methods to automatically generate the relevant structural controllability and observability associated measures for linear or linearised systems and compare the different sensor placement methods.

Leitold D., Vathy-Fogarassy Á., Abonyi J.: „Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox”, [version 2; peer review: 2 approved], F1000Research 2019, 8:646
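
Since NOCAD itself is an Octave/MATLAB toolbox, the Python sketch below only illustrates the underlying structural controllability argument: driver nodes are the nodes left unmatched by a maximum matching on the bipartite representation of the directed network. The small state-transition structure is hypothetical and the code is not part of the toolbox.

```python
# Minimal sketch of the maximum-matching argument for structural controllability:
# nodes whose "in copy" is unmatched in a maximum bipartite matching must be
# driven directly by an input (driver nodes).
import networkx as nx

def driver_nodes(edges, nodes):
    B = nx.Graph()
    B.add_nodes_from((f"out_{u}" for u in nodes), bipartite=0)
    B.add_nodes_from((f"in_{v}" for v in nodes), bipartite=1)
    B.add_edges_from((f"out_{u}", f"in_{v}") for u, v in edges)
    top = {f"out_{u}" for u in nodes}
    matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=top)
    matched_targets = {v for u, v in matching.items() if u in top}
    return [v for v in nodes if f"in_{v}" not in matched_targets]

# hypothetical small state-transition structure
nodes = ["x1", "x2", "x3", "x4"]
edges = [("x1", "x2"), ("x2", "x3"), ("x3", "x4"), ("x2", "x4")]
print("driver nodes:", driver_nodes(edges, nodes))
```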

Genetic programming-based development of thermal runaway criteria is published!

Post date: Oct 08, 2019 6:10:00 PM

Common thermal runaway criteria (e.g. divergence criterion and the Maxi criterion) may predict a thermal runaway unreasonably as the Maximum Allowable Temperature (MAT) is not taken into account. This contribution proposes a method for the goal-oriented construction of reactor runaway criteria by Genetic Programming (GP). The runaway prediction problem is formulated as a critical equation-based classification task, and GP is used to identify the optimal structure of the equations that also take into account the MAT. To demonstrate the applicability of the method, tailored criteria were developed for batch and continuous stirred-tank reactors. The resultant critical equations outperform the well-known criteria in terms of the early and accurate indication of thermal runaways.

Kummer A., Varga T., Abonyi J.: „Genetic programming-based development of thermal runaway criteria”, Computers & Chemical Engineering, 2019, 106582

Review and structural analysis of system dynamics models in sustainability science is published!

Post date: Oct 08, 2019 6:00:00 PM

As the complexity of sustainability-related problems increases, it is more and more difficult to understand the related models. Although a tremendous number of models have been published recently, their automated structural analysis is still absent. This study provides a methodology to structure and visualise the information content of these models. The novelty of the present approach is the development of a network analysis-based tool for modellers to measure the importance of variables, identify structural modules in the models and measure the complexity of the created model, thus enabling the comparison of different models. An overview of 130 system dynamics models from the past five years is provided. The typical topics and complexity of these models highlight the need for tools that support the automated structural analysis of sustainability problems. For practising engineers and analysts, nine models from the field of sustainability science, including the World3 model, are studied in detail. The results highlight that, with the help of the developed method, experts can identify the most critical variables of sustainability problems (like arable land in the World3 model) and can determine how these variables are clustered and interconnected (e.g. population and fertility are key drivers of global processes). The developed software tools and the resulting networks are all available online.

Honti G., Dörgő Gy., Abonyi J.: „Review and structural analysis of system dynamics models in sustainability science”, Journal of Cleaner Production, Volume 240, 2019, 118015

Learning and predicting operation strategies by sequence mining and deep learning (full paper) is published!

Post date: Jun 15, 2019 7:55:00 PM

The operators of chemical technologies are frequently faced with the problem of determining optimal interventions. Our aim is to develop data-driven models by exploring the consequential relationships in the alarm and event-log database of industrial systems. Our motivation is twofold: (1) to facilitate the work of the operators by predicting future events and (2) analyse how consequent the event series is. The core idea is that machine learning algorithms can learn sequences of events by exploring connected events in databases. First, frequent sequence mining applications are utilised to determine how the event sequences evolve during the operation. Second, a sequence-to-sequence deep learning model is proposed for their prediction. The long short-term memory unit-based model (LSTM) is capable of evaluating rare operation situations and their consequential events. The performance of this methodology is presented with regard to the analysis of the alarm and event-log database of an industrial delayed coker unit.

Dörgő Gy., Abonyi J.: "Learning and predicting operation strategies by sequence mining and deep learning", Computers & Chemical Engineering, Volume 128, 2 September 2019, Pages 174-187

A Review of Semantic Sensor Technologies in Internet of Things Architectures is published!

Post date: Jun 15, 2019 7:40:00 PM

Intelligent sensors should be seamlessly, securely, and in a trustworthy manner interconnected to enable automated high-level smart applications. Semantic metadata can provide contextual information to support the accessibility of these features, making it easier for machines and humans to process the sensory data and achieve interoperability. The unique overview of sensor ontologies according to the semantic needs of the layers of IoT solutions can serve as a guideline for engineers and researchers interested in the development of intelligent sensor-based solutions. The explored trends show that ontologies will play an even more essential role in interlinked IoT systems as interoperability and the generation of controlled linkable data sources should be based on semantically enriched sensory data.

Honti G., Abonyi J.: "A Review of Semantic Sensor Technologies in Internet of Things Architectures", Complexity, Volume 2019, Article ID 6473160, 21 pages

Operating regime model based multi-objective sensor placement for data reconciliation is published!

Post date: Jun 15, 2019 7:40:00 PM

Although the number of sensors in chemical production plants is increasing thanks to the IoT revolution, it is still a crucial problem to decide what to measure and how to place the sensors such that the resulting sensor network is robust and cost-effectively provides the required information. This problem is especially relevant in flexible multi-purpose, multi-product production plants where there are significant differences among the operating regions. The present work aims at the development of a sensor placement methodology that utilizes the advantages of local linear models. Recognising the often conflicting nature of the key objectives of sensor placement, the problem is formulated as a multi-objective optimization task that takes into consideration the cost, estimation accuracy, observability and fault detection performance of the designed networks while simultaneously seeking the optimal solutions under multiple operating regimes. The effectiveness of the Non-dominated Sorting Genetic Algorithm-II (NSGA-II)-based solution of the defined problem is demonstrated through benchmark examples.

Dörgő Gy., Haragovics M., Abonyi J.: "Operating regime model based multi-objective sensor placement for data reconciliation", 29th European Symposium on Computer Aided Process Engineering, Netherlands, Eindhoven, 2019 June 16-19.
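
To make the multi-objective formulation concrete, the toy sketch below enumerates the Pareto-optimal sensor sets of a tiny placement problem with two of the objectives named above (cost versus remaining estimation error). The sensor names, costs and information contents are invented, and brute-force enumeration stands in for the NSGA-II search used in the paper.

```python
# Minimal sketch: Pareto front of sensor subsets for a toy two-objective
# placement problem (minimise cost and remaining estimation error).
from itertools import combinations

costs = {"T1": 1.0, "T2": 1.0, "F1": 2.0, "F2": 2.0, "C1": 5.0}  # hypothetical sensor costs

def estimation_error(subset):
    # hypothetical "information content" of each measurement; more info, less error
    info = {"T1": 0.2, "T2": 0.2, "F1": 0.5, "F2": 0.4, "C1": 0.9}
    return 1.0 / (1.0 + sum(info[s] for s in subset))

candidates = []
for r in range(1, len(costs) + 1):
    for subset in combinations(costs, r):
        candidates.append((subset, sum(costs[s] for s in subset), estimation_error(subset)))

# keep only non-dominated designs (no other design is at least as good in both
# objectives and strictly better in one)
pareto = [c for c in candidates
          if not any(o[1] <= c[1] and o[2] <= c[2] and (o[1] < c[1] or o[2] < c[2])
                     for o in candidates)]
for subset, cost, err in sorted(pareto, key=lambda c: c[1]):
    print(f"sensors={subset}  cost={cost:.1f}  error={err:.2f}")
```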

Soft Sensors Special Issue

Post date: Jun 4, 2019 8:40:00 PM

We are editing a special issue related to software sensors. Please forward this link to researchers potentially interested in submitting a paper.

Deadline for manuscript submissions: 30 September 2019


https://www.mdpi.com/journal/sensors/special_issues/Soft_Sensors

P-graph-based multi-objective risk analysis and redundancy allocation in safety-critical energy systems is published!

Post date: May 27, 2019 8:10:00 PM

As most energy production and transformation processes are safety-critical, it is vital to develop tools that support the analysis and minimisation of their reliability-related risks. The resultant optimisation problem should reflect the structure of the process, which requires the utilisation of flexible and problem-relevant models. This paper highlights that P-graphs extended by logical condition units can be transformed into reliability block diagrams, and based on the cut and path sets of the graph a polynomial risk model can be extracted, which opens up new opportunities for the definition of optimisation problems related to reliability redundancy allocation. A novel multi-objective optimisation-based method has been developed to evaluate the criticality of the units and subsystems. The applicability of the proposed method is demonstrated using a real-life case study related to a reforming reaction system. The results highlight that P-graphs can serve as an interface between process flow diagrams and polynomial risk models and that the developed tool can improve the reliability of energy systems in retrofitting projects.

Süle Z., Baumgartner J., Dörgő Gy., Abonyi J.: "P-graph-based multi-objective risk analysis and redundancy allocation in safety-critical energy systems", Energy (2019), vol. 179, 989-1003.
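
The sketch below illustrates the simplest form of such a polynomial risk model: given the minimal cut sets of a reliability block diagram and independent component failure probabilities, the system failure probability follows by inclusion-exclusion, and a naive criticality score is obtained by forcing each component to fail. The component data and cut sets are hypothetical, and the sketch is not the paper's P-graph-based workflow.

```python
# Minimal sketch: system failure probability from minimal cut sets, assuming
# independent component failures, via inclusion-exclusion over the cut events.
from itertools import combinations

p_fail = {"pump_A": 0.02, "pump_B": 0.02, "valve": 0.01, "controller": 0.005}
# minimal cut sets: the system fails if every component of any one cut set fails
cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"controller"}]

def system_failure_probability(p):
    prob = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)
            term = 1.0
            for comp in union:
                term *= p[comp]
            prob += (-1) ** (k + 1) * term
    return prob

base = system_failure_probability(p_fail)
print(f"system failure probability: {base:.5f}")

# naive criticality: increase in system risk if a component is certain to fail
for comp in p_fail:
    crit = system_failure_probability({**p_fail, comp: 1.0}) - base
    print(f"criticality of {comp}: {crit:.5f}")
```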

Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox is published!

Post date: May 27, 2019 8:10:00 PM

Network science has become increasingly important in life science over the last decade. The proposed Octave and MATLAB-compatible NOCAD toolbox provides a set of methods which enables the structural controllability and observability analysis of dynamical systems. In this paper, the functionality of the toolbox is presented, and the implemented functions demonstrated.

Leitold D., Vathy-Fogarassy Á., and Abonyi J.: "Network-based Observability and Controllability Analysis of Dynamical Systems: the NOCAD toolbox [version 1; peer review: awaiting peer review]", F1000Research 2019, 8:646

A new version of our toolbox is available!

https://github.com/abonyilab/NOCAD

Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment is published!

Post date: Mar 07, 2019 2:00:00 PM

Strategic environmental assessment is a decision support technique that evaluates policies, plans and programs in addition to identifying the most appropriate interventions in different scenarios. This work develops a network-based model to study interlinked ecological, economic, environmental and social problems to highlight the synergies between policies, plans, and programs in environmental strategic planning. Our primary goal is to propose a methodology for the data-driven verification and extension of expert knowledge concerning the interconnectedness of the sustainable development goals and their related targets. A multilayer network model based on the time-series indicators of the World Bank open data over the last 55 years was assembled. The results illustrate that, by providing an objective and data-driven view of the correlated variables of the World Bank, the proposed layered multipartite network model highlights previously undiscussed interconnections, node centrality measures evaluate the importance of the targets, and network community detection algorithms reveal their strongly connected groups. The results confirm that the proposed methodology can serve as a data-driven decision support tool for the preparation and monitoring of long-term environmental policies. The developed data-driven network model enables multi-level analysis of sustainability (goals, targets, indicators) and supports long-term environmental strategic planning. Through the relationships among indicators, the relationships among targets and goals can be modelled. The results show that the sustainable development goals are strongly interconnected, while the 5th goal (gender equality) is linked mostly to the 17th goal (partnerships for the goals). The analysis has also highlighted the importance of the 4th goal (quality education).

Sebestyén V., Bulla M., Rédey Á., Abonyi J.: "Network Model-Based Analysis of the Goals, Targets and Indicators of Sustainable Development for Strategic Environmental Assessment", Journal of Environmental Management, 2019, 238, 126-135

Frequent pattern mining in multidimensional organizational networks is published!

Post date: Mar 05, 2019 1:00:00 PM

Network analysis can be applied to understand organizations based on patterns of communication, knowledge flows, trust, and the proximity of employees. A multidimensional organizational network was designed, and association rule mining of the edge labels applied to reveal how relationships, motivations, and perceptions determine each other in different scopes of activities and types of organizations. Frequent itemset-based similarity analysis of the nodes provides the opportunity to characterize typical roles in organizations and clusters of co-workers. A survey was designed to define 15 layers of the organizational network and demonstrate the applicability of the method in three companies. The novelty of our approach resides in the evaluation of people in organizations as frequent multidimensional patterns of multilayer networks. The results illustrate that the overlapping edges of the proposed multilayer network can be used to highlight the motivation and managerial capabilities of the leaders and to find similarly perceived key persons.

https://www.nature.com/articles/s41598-019-39705-1
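
A minimal sketch of the mining step is shown below: each edge of the multilayer network is treated as a transaction listing the layers in which it appears, and frequent layer combinations and simple association rules are extracted by a brute-force count that stands in for the paper's association rule mining. The survey layers, edge labels and thresholds are made up.

```python
# Minimal sketch: frequent itemsets and association rules over the edge labels
# (layers) of a multilayer organizational network, on invented data.
from itertools import combinations

# each transaction: the layers (relationship types) connecting one pair of employees
edges = [
    {"communication", "trust", "knowledge"},
    {"communication", "trust"},
    {"communication", "knowledge"},
    {"communication", "trust", "motivation"},
    {"trust", "motivation"},
]
min_support, min_confidence = 0.4, 0.7
n = len(edges)

def support(itemset):
    return sum(itemset <= e for e in edges) / n

# frequent layer combinations up to size 3 (brute force is fine at this scale)
items = sorted(set().union(*edges))
frequent = [frozenset(c) for r in (1, 2, 3) for c in combinations(items, r)
            if support(set(c)) >= min_support]
print("frequent layer combinations:", [sorted(f) for f in frequent])

# association rules "antecedent -> consequent" between single layers
for a, b in combinations(items, 2):
    for ante, cons in ((a, b), (b, a)):
        s = support({ante, cons})
        conf = s / support({ante}) if support({ante}) > 0 else 0.0
        if s >= min_support and conf >= min_confidence:
            print(f"{ante} -> {cons}  (support={s:.2f}, confidence={conf:.2f})")
```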

Evaluation of the Complexity, Controllability and Observability of Heat Exchanger Networks Based on Structural Analysis of Network Representations is published!

Post date: Feb 15, 2019 7:45:00 PM

The design and retrofit of Heat Exchanger Networks (HENs) can be based on several objectives and optimisation algorithms. As each method results in an individual network topology that has a significant effect on the operability of the system, control-relevant HEN design and analysis are becoming more and more essential tasks. This work proposes a network science-based analysis tool for the qualification of controllability and observability of HENs. With the proposed methodology, the main characteristics of HEN design methods are determined, the effect of structural properties of HENs on their dynamical behaviour revealed, and the potentials of the network-based HEN representations discussed. Our findings are based on the systematic analysis of almost 50 benchmark problems related to 20 different design methodologies.

https://www.mdpi.com/1996-1073/12/3/513

We are delivering a Python course. The course material is available from here:

https://www.dropbox.com/sh/n22wec5qtdascn1/AADkMHiYFV26yQtcO6lJhWP8a?dl=0

https://github.com/abonyilab/SystemsEngineering

We deliver a data analysis course (in Excel) at PIMS academy. The course material is available from here: https://www.dropbox.com/s/n7myvhy68mvna0h/Data_excel.zip?dl=0

The Settlement Structure Is Reflected in Personal Investments: Distance-Dependent Network Modularity-Based Measurement of Regional Attractiveness is published!

Post date: Dec 12, 2018 10:30:00 PM

How are ownership relationships distributed in geographical space? Is physical proximity a significant factor in investment decisions? What is the impact of the capital city? How can the structure of investment patterns characterize the attractiveness and development of economic regions? To explore these issues, we analyze the network of company ownership in Hungary and determine how connections are distributed in geographical space. Based on the calculation of the internal and external linking probabilities, we propose several measures to evaluate the attractiveness of towns and geographic regions. Community detection based on several null models indicates that modules of the network coincide with administrative regions, in which Budapest is the absolute centre and county centres function as hubs. Gravity model-based modularity analysis highlights that, besides the strong attraction of Budapest, geographical distance has a significant influence on the frequency of connections and the target nodes play the most significant role in link formation, which confirms that the analysis of the directed company-ownership network gives a good indication of regional attractiveness.

Gadar Laszlo, Kosztyan Zsolt T., Abonyi Janos: "The Settlement Structure Is Reflected in Personal Investments: Distance-Dependent Network Modularity-Based Measurement of Regional Attractiveness", Complexity, 2018, Article ID 1306704, 16 pages

Evaluating the Interconnectedness of the Sustainable Development Goals Based on the Causality Analysis of Sustainability Indicators is published!

Post date: Oct 20, 2018 11:40:00 PM

Policymaking requires an in-depth understanding of the cause-and-effect relationships between the sustainable development goals. However, due to the complex nature of socio-economic and environmental systems, this is still a challenging task. In the present article, the interconnectedness of the United Nations (UN) sustainability goals is measured using the Granger causality analysis of their indicators. The applicability of the causality analysis is validated through the predictions of the World3 model. The causal relationships are represented as a network of sustainability indicators providing the opportunity for the application of network analysis techniques. Based on the analysis of 801 UN indicator types in 283 geographical regions, approximately 4000 causal relationships were identified and the most important global connections were represented in a causal loop network. The results highlight the drastic deficiency of the analysed datasets, the strong interconnectedness of the sustainability targets and the applicability of the extracted causal loop network. The analysis of the causal loop networks emphasised the problems of poverty, proper sanitation and economic support in sustainable development.


Dörgő Gy., Sebestyén V., Abonyi J.: "Evaluating the Interconnectedness of the Sustainable Development Goals Based on the Causality Analysis of Sustainability Indicators", Sustainability 2018, 10(10), 3766, doi:10.3390/su10103766
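
The building block of the analysis, a pairwise Granger causality test between two indicator time series, can be sketched with statsmodels as follows. The synthetic series below stand in for the UN/World Bank indicators, and the lag order is arbitrary.

```python
# Minimal sketch: does indicator x Granger-cause indicator y?
# Synthetic yearly series are used instead of the real sustainability indicators.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 60                                              # roughly 60 yearly observations
x = rng.normal(size=n).cumsum()                     # "driver" indicator
y = np.empty(n)
y[:2] = rng.normal(size=2)
y[2:] = x[:-2] + rng.normal(scale=0.5, size=n - 2)  # follows x with a two-year lag

# the test asks whether the 2nd column Granger-causes the 1st column;
# differencing removes the stochastic trend of the synthetic series
data = np.column_stack([np.diff(y), np.diff(x)])
results = grangercausalitytests(data, maxlag=3)
for lag, res in results.items():
    print(f"lag {lag}: p-value of the F-test = {res[0]['ssr_ftest'][1]:.4f}")
```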

6th International Conference on Control, Decision and Information Technologies - Janos Abonyi became a member of the PC

Post date: Oct 10, 2018 7:00:00 PM

The CoDIT’19 conference is the sixth edition in the series of the International Conference on Control, Decision and Information Technologies, organized since 2013; the previous edition, CoDIT’18, was held in Thessaloniki, Greece, in April 2018.

CoDIT’19 will be held on April 23-26, 2019 in Paris, France. Its purpose is to be a forum for technical exchange amongst scientists with interests in Control, Optimization, Decision, all areas of Engineering, Computer Science and Information Technologies. This conference will provide a remarkable opportunity for the academic and industrial communities to address new challenges, share solutions and discuss future research directions. The technical program will include plenary lectures, regular technical sessions, and special sessions.

TOPICS

  • Control and Automation

  • Decision and Optimization

  • Information Technologies and Computer Science


Important dates and deadlines

  • Submission site opening: September 25, 2018

  • Special session Proposal: October 25, 2018

  • Paper submission deadline: December 5, 2018

  • Acceptance notification: February 8, 2019

  • Final version due: February 28, 2019

  • Registration deadline: February 28, 2019


http://codit19.com/index.php/committees

Our review about operators and cyber-physical systems is published!

Enabling Technologies for Operator 4.0: A Survey

The fast development of smart sensors and wearable devices has provided the opportunity to develop intelligent operator workspaces. The resultant Human-Cyber-Physical Systems (H-CPS) integrate the operators into flexible and multi-purpose manufacturing processes. The primary enabling factor of the resultant Operator 4.0 paradigm is the integration of advanced sensor and actuator technologies and communications solutions. This work provides an extensive overview of these technologies and highlights that the design of future workplaces should be based on the concept of intelligent space.

Ruppert T., Jaskó Sz., Holczinger T., Abonyi J.: "Enabling Technologies for Operator 4.0: A Survey", Applied Sciences, Basel, 2018, 8 (9), 1650, 1-19

4th International Conference on Internet of Things, Big Data and Security - Janos Abonyi became a member of the PC

Post date: Aug 30, 2018 8:15:00 PM

The internet of things (IoT) is a platform that allows a network of devices (sensors, smart meters, etc.) to communicate, analyse data and process information collaboratively in the service of individuals or organisations. The IoT network can generate large amounts of data in a variety of formats and using different protocols which can be stored and processed in the cloud. The conference looks to address the issues surrounding IoT devices, their interconnectedness and services they may offer, including efficient, effective and secure analysis of the data IoT produces using machine learning and other advanced techniques, models and tools, and issues of security, privacy and trust that will emerge as IoT technologies mature and become part of our everyday lives.

CONFERENCE AREAS

1. Big Data Research

2. Emerging Services and Analytics

3. Internet of Things (IoT) Fundamentals

4. Internet of Things (IoT) Applications

5. Big Data for Multi-discipline Services

6. Security, Privacy and Trust

7. IoT Technologies

UPCOMING DEADLINES

Regular Paper Submission: December 10, 2018

Regular Paper Authors Notification: February 7, 2019

Regular Paper Camera Ready and Registration: February 21, 2019

Late-Breaking Camera Ready and Registration: March 21, 2019

http://iotbds.org/ProgramCommittee.aspx

Soft Sensors Special Issue

Post date: Aug 30, 2018 8:01:00 PM


We are editing a special issue related to software sensors.

Please forward this link to researchers potentially interested in submitting a paper.

Slide to promote special issue in conference

http://www.mdpi.com/journal/sensors/special_issues/Soft_Sensors

Graph configuration model based evaluation of the education-occupation match

Post date: Mar 6, 2018 7:47:06 PM

To study education-occupation matching, we developed a bipartite network model of the education-to-work transition and a graph configuration model-based metric. We studied the career paths of 15 thousand Hungarian students based on the integrated database of the National Tax Administration, the National Health Insurance Fund, and the higher education information system of the Hungarian Government. A brief analysis of the gender pay gap and the spatial distribution of over-education is presented to demonstrate the background of the research and the resulting open dataset. We highlighted the hierarchical and clustered structure of the career paths based on the multi-resolution analysis of the graph modularity. The results of the cluster analysis can support policymakers in fine-tuning the fragmented program structure of higher education.

The details of this research are published in PLOS ONE:

http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0192427

All the files and the R code are available at:

https://github.com/abonyilab/Edu_Mine_Graph
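
The sketch below illustrates the idea behind a configuration model based match score, though not the exact metric of the paper: the observed flow between a degree programme and an occupation is compared with the flow expected from the node degrees alone, so ratios above one indicate an over-represented education-occupation match. All names and counts are invented.

```python
# Minimal sketch: in a bipartite education-to-work network the configuration
# model expects the flow between programme u and occupation v to be
# deg(u)*deg(v)/m, so the observed/expected ratio indicates match strength.
import numpy as np

programmes = ["economics", "mech_eng", "nursing"]
occupations = ["analyst", "engineer", "nurse", "manager"]
# hypothetical counts of graduates moving from programme (row) to occupation (column)
W = np.array([
    [40, 2, 1, 17],
    [3, 55, 0, 12],
    [1, 0, 48, 4],
])

m = W.sum()                                             # total number of transitions
expected = np.outer(W.sum(axis=1), W.sum(axis=0)) / m   # configuration model expectation
match = W / expected                                    # >1: stronger than expected by chance
for i, p in enumerate(programmes):
    for j, o in enumerate(occupations):
        print(f"{p:10s} -> {o:9s}: observed {W[i, j]:3d}, match score {match[i, j]:.2f}")
```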

Sequence Mining based Alarm Suppression

Post date: Feb 10, 2018 8:45:27 PM

To provide more insight into the process dynamics and represent the temporal relationships among faults, control actions and process variables, we propose a multi-temporal sequence mining-based algorithm. The methodology starts with the generation of frequent temporal patterns of the alarm signals. We transformed the multi-temporal sequences into Bayes classifiers. The obtained association rules can be used to define alarm suppression rules. We analyzed the dataset of a laboratory-scale water treatment testbed to illustrate that multi-temporal sequences are applicable for the description of operation patterns. We extended the benchmark simulator of a vinyl acetate production technology to generate easily reproducible results and stimulate the development of alarm management algorithms. The results of detailed sensitivity analyses confirm the benefits of applying temporal alarm suppression rules that reflect the dynamical behaviour of the process.

These files are the supplementary materials of our paper, which will be published in IEEE Access, 2018. For the extended simulator of the vinyl acetate production technology and the source codes of the Bayes’ theorem-based evaluation of sequences see: https://github.com/abonyilab/VACsimulator

The MATLAB implementation of the sequence mining algorithm is available at: https://github.com/abonyilab/Multi-temporal-sequence-mining
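
For readers who want to experiment without the repositories, the sketch below shows the simplest building block of the approach: counting how often one alarm is followed by another within a time window, which yields candidate temporal patterns for suppression rules. The alarm tags, timestamps and thresholds are made up, and the full multi-temporal algorithm is in the MATLAB repository linked above.

```python
# Minimal sketch: frequent "A followed by B within a time window" patterns
# mined from a made-up, timestamped alarm log.
from collections import Counter

# (timestamp in seconds, alarm tag); hypothetical log
log = [(0, "LAH101"), (12, "PAH205"), (15, "FAL310"), (60, "LAH101"),
       (70, "PAH205"), (130, "LAH101"), (141, "PAH205"), (300, "TAH400")]
window = 30       # seconds within which two alarms are considered a sequence
min_support = 2   # minimum number of occurrences to call a pattern frequent

pairs = Counter()
for i, (t1, a1) in enumerate(log):
    for t2, a2 in log[i + 1:]:
        if t2 - t1 > window:
            break
        pairs[(a1, a2)] += 1

frequent = {p: c for p, c in pairs.items() if c >= min_support}
print("frequent temporal pairs:", frequent)
# a candidate suppression rule: while LAH101 is active, suppress/delay PAH205
```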

Visualization and interpretation of deep learning models

Post date: Feb 10, 2018 8:36:32 PM

We visualise the LSTM deep learning models by principal component analysis. The similarity of the events in fault isolation can be evaluated based on the linear embedding layer of the network, which maps the input signals into a continuous-valued vector space. The method is demonstrated in a simulated vinyl acetate production technology. The results illustrate that with the application of RNN based sequence learning not only accurate fault classification solutions can be developed, but the visualisation of the model can give useful hints for hazard analysis.

The related paper will be published in the Journal of Chemometrics soon.

The algorithm was implemented in Python. The related code can be downloaded from our Github repository.
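
A minimal sketch of the visualisation step is given below: the rows of an embedding weight matrix are projected to two dimensions with PCA and plotted, so that events the network maps close to each other can be inspected. Random weights and invented event names stand in for a trained model, and the code is not the repository's implementation; with a trained Keras model the matrix would come from the embedding layer's weights.

```python
# Minimal sketch: PCA projection of an embedding weight matrix for visual
# inspection of event similarity (random weights stand in for a trained model).
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

events = ["LAH101", "PAH205", "FAL310", "TAH400", "open_valve", "stop_pump"]
embedding_weights = np.random.default_rng(3).normal(size=(len(events), 16))
# with a trained model this would be something like model.layers[0].get_weights()[0]

coords = PCA(n_components=2).fit_transform(embedding_weights)
plt.scatter(coords[:, 0], coords[:, 1])
for (px, py), name in zip(coords, events):
    plt.annotate(name, (px, py))
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.title("PCA of the event embedding")
plt.show()
```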

3rd International Conference on Internet of Things, Big Data and Security - Janos Abonyi became a member of the PC

Post date: Jun 8, 2017 5:09:52 AM

The internet of things (IoT) is a platform that allows a network of devices (sensors, smart meters, etc.) to communicate, analyse data and process information collaboratively in the service of individuals or organisations. The IoT network can generate large amounts of data in a variety of formats and using different protocols which can be stored and processed in the cloud. The conference looks to address the issues surrounding IoT devices, their interconnectedness and services they may offer, including efficient, effective and secure analysis of the data IoT produces using machine learning and other advanced techniques, models and tools, and issues of security, privacy and trust that will emerge as IoT technologies mature and become part of our everyday lives.

CONFERENCE AREAS

1. Big Data Research

2. Emerging Services and Analytics

3. Internet of Things (IoT) Fundamentals

4. Internet of Things (IoT) Applications

5. Big Data for Multi-discipline Services

6. Security, Privacy and Trust

7. IoT Technologies

UPCOMING DEADLINES

Regular Paper Submission: October 16, 2017

Regular Paper Authors Notification: December 15, 2017

Regular Paper Camera Ready and Registration: January 4, 2018

International Conference on Communication, Computing & Internet of Things - (IC3IoT 2018)

Post date: Jun 1, 2017 3:53:43 AM

"Acceptance of global competitiveness and unlimited innovations are emerging as the most critical elements in wealth generation in the current world economy. Transitions into a developed nation and empowered society shall not be a far away dream but shall be a near future reality. The need for linking science and technology to the growth of India shall be intensified and improved by conferences of this kind.

Broadband and Wireless Communication have brought massive changes to the world and continue to provide an array of new challenges, multi-domain applications and solutions such as IoT. The aim of IC3IoT is to provide an excellent forum for sharing knowledge and present the innovative researchers, and technologies as well as developments and future demands related to Broadband Technologies, Computing Technologies, Human-Computer Interaction and Wireless Communication along with IoT.

An International conference of this nature will enhance and benefit the human society at large since it will bring together leading researchers engineers and scientists in the domain of interest."

Prof. Abonyi is a member of the program committee of the conference. More details can be found at the website of the event

HAS - UP "Momentum" Complex Systems Research Group

Post date: May 19, 2017 5:11:37 PM

The objective of the Lendület (Momentum) Program of the Hungarian Academy of Sciences is the dynamic renewal of the research teams of the Academy and the participating universities. With the help of this program, we transform and extend the group of Prof. Abonyi into a research group devoted to complex systems.

We will form a new school for rethinking and upgrading systems engineering and data science in the light of the fourth industrial revolution. The overall goal of the project is the development of new algorithms and open source tools to utilise the data collected by internetworking systems in monitoring, control, optimisation, scheduling, risk management, and product lifecycle management. This goal challenges present-day internet of things technology regarding the development of software agent and advanced sensor fusion functionalities.

We believe that algorithms tailored for (1) multivariate time series analysis, (2) software sensors and event analysis, (3) localisation and (4) model mining can result in significant progress in this field. The creative and integrated application of the resulting algorithms can bring a new perspective to the integrated monitoring and structural analysis of complex systems and the utilisation of open and linked data. The full integration of these four subprojects is of primary importance and ensures the strength and uniqueness of this proposal.

The proposed centre, therefore, aims to bring together the best technological expertise in systems-, data-, and network science, and industrial intelligence. As part of its mission, the Group will make the new and integrated solutions available to the research community and industry through its collaborations and training.

Network science and control theory - Our paper in Scientific Reports!

Post date: Mar 11, 2017 2:43:46 PM

Network theory based controllability and observability analysis have become widely used techniques. We realized that most applications are not related to dynamical systems, and mainly the physical topologies of the systems are analysed without deeper considerations. Here, we draw attention to the importance of dynamics inside and between state variables by adding functional relationship defined edges to the original topology. The resulting networks differ from physical topologies of the systems and describe more accurately the dynamics of the conservation of mass, momentum and energy. We define the typical connection types and highlight how the reinterpreted topologies change the number of the necessary sensors and actuators in benchmark networks widely studied in the literature. Additionally, we offer a workflow for network science-based dynamical system analysis, and we also introduce a method for generating the minimum number of necessary actuator and sensor points in the system.

http://www.nature.com/articles/s41598-017-00160-5

https://github.com/abonyilab/NOCAD

PhD defense of Laszlo Dobos, 7th of June, 2016, 1pm

Post date: May 20, 2016 10:04:55 PM

Development of Experimental Design Techniques for Analyzing and Optimization of Operating Technologies

The aim of this thesis is to introduce the theoretical basics of different approaches which can further support production process development, based on knowledge extracted from process data. As the selection of a time frame with a certain operation is the starting point of further process investigation, a Dynamic Principal Component Analysis (DPCA) based time-series segmentation approach is introduced in this thesis first. This new solution results from integrating DPCA tools into the classical univariate time-series segmentation methodologies. It helps us to detect changes in the linear relationship of process variables, which can be caused by faults or malfunctions. This step can be the first one in model-based process development, since it is possible to neglect the operation ranges which can ruin the prediction capability of the model. From another point of view, we can highlight problematic operation regimes and focus on finding their root causes. When fault-free, linear operation segments have been selected, further segregation of the data segments is needed to find data slices with high information content in terms of model parameter identification. As tools of Optimal Experiment Design (OED) are appropriate for measuring the information content of process data, the goal-oriented integration of OED tools and classical time-series segmentation can handle the problem. The Fisher information matrix is one of the basic tools of OED. It contains the partial derivatives of the model output with respect to the model parameters when considering a particular input data sequence. A new, Fisher information matrix based time-series segmentation methodology has been developed to evaluate the information content of an input data slice. By using this tool, it becomes possible to select the potentially most valuable and informative time-series segments. This leads to a reduction of the number of industrial experiments and their costs. At the end of the thesis a novel, economic objective function-oriented framework is introduced for tuning model predictive controllers to be able to exploit all the control potentials while considering the physical and chemical limits of the process.

Media Cloud

Post date: May 20, 2016 9:58:19 PM

Media Cloud is a project of the Harvard Berkman Center for Internet & Society and the MIT Center for Civic Media. It is worth taking a look.

Book launch, 24 May 2016

Post date: May 20, 2016 9:55:07 PM


Early warning - risk management

Post date: May 27, 2012 12:41:56 PM

Early waring risk management system can support commercial banks to increase safety, profitability and fluidity of credit founds. The aim of the project is to develop novel algorithms and tools based on the early detection of trends and unusual business networking patterns. The project founded by GOP-1.1.1-11-2011-0045 program and will start in June 2012 with cooperation of Kürt Zrt .