Adaptive fault detection and diagnosis using parsimonious Gaussian mixture models trained with distributed computing techniques

Type

Journal article

Abstract

Despite great advances in industrial process automation, an important challenge remains: automation under abnormal situations. The first step towards addressing this challenge is Fault Detection and Diagnosis (FDD). This work proposes a batch-incremental adaptive methodology for fault detection and diagnosis based on mixture models trained in a distributed computing environment. The models belong to the family of Parsimonious Gaussian Mixture Models (PGMM), whose reduced number of parameters brings important advantages when little data is available, an expected scenario under faulty conditions. On the other hand, the large number of candidate models raises another challenge: selecting the best model for a given behaviour. To address it, a large number of models is trained using distributed computing techniques, and only then is the best model selected. This work proposes using the Spark framework, which is well suited to iterative computations. The proposed methodology was validated on a simulated process, the Tennessee Eastman Process (TEP), showing good results for both fault detection and diagnosis. Furthermore, numerical experiments show the viability of training a large number of models for a posteriori selection of the best one.
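
As a rough illustration of the train-many-then-select scheme described above, the minimal Python/PySpark sketch below fits several candidate Gaussian mixture models in parallel, keeps the one with the lowest BIC, and flags faults when the log-likelihood of new samples drops below a threshold. The synthetic data, the percentile threshold rule, and the use of scikit-learn's covariance types as a simplified stand-in for the full PGMM covariance structures are illustrative assumptions, not the paper's actual implementation.

# Hedged sketch: parallel training of candidate mixture models with Spark,
# a posteriori selection by BIC, and likelihood-based fault detection.
import numpy as np
from sklearn.mixture import GaussianMixture
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pgmm-fdd-sketch").getOrCreate()
sc = spark.sparkContext

# Normal-operation training data (placeholder: replace with real process measurements).
X_normal = np.random.randn(2000, 10)
bc_X = sc.broadcast(X_normal)

# Candidate configurations: number of components x covariance structure.
# scikit-learn's covariance types stand in for the parsimonious PGMM structures.
configs = [(k, cov) for k in range(1, 6)
           for cov in ("full", "tied", "diag", "spherical")]

def fit_candidate(cfg):
    """Fit one candidate mixture on the broadcast data and report its BIC."""
    k, cov = cfg
    gmm = GaussianMixture(n_components=k, covariance_type=cov, random_state=0)
    gmm.fit(bc_X.value)
    return (gmm.bic(bc_X.value), cfg, gmm)

# Train all candidates in parallel, then select the best (lowest BIC) a posteriori.
results = sc.parallelize(configs, numSlices=len(configs)).map(fit_candidate).collect()
best_bic, best_cfg, best_model = min(results, key=lambda r: r[0])
print("selected model:", best_cfg, "BIC:", best_bic)

# Fault detection: samples whose log-likelihood under the normal-operation model
# falls below a threshold (here the 1st percentile of training scores) are flagged.
train_scores = best_model.score_samples(X_normal)
threshold = np.percentile(train_scores, 1)

X_new = np.random.randn(100, 10) + 3.0   # placeholder for new process data
fault_mask = best_model.score_samples(X_new) < threshold
print("flagged samples:", int(fault_mask.sum()), "of", len(X_new))

spark.stop()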

Subject

Knowledge representation (Information theory), Machine learning

Keywords

Gaussian mixture models, fault detection and diagnosis, statistical models, Parsimonious Gaussian Mixture Models (PGMM)

External link

https://www.sciencedirect.com/science/article/pii/S0016003216304434
