Learning from imbalanced data sets with weighted cross-entropy function

Publisher

Universidade Federal de Minas Gerais

Type

Journal article

Abstract

The number of samples commonly differs from one class to another in classification problems. This problem, known as the imbalanced data set problem, arises in most real-world applications. Most current inductive learning principles rest on a sum of squared errors that does not take prior probabilities into account, which generally biases the resulting classifier towards the majority class. This paper presents a novel approach to the imbalanced data set problem in neural networks by incorporating prior probabilities into a cost-sensitive cross-entropy error function. Several classical benchmarks were tested for performance evaluation using different metrics, namely G-Mean, area under the ROC curve (AUC), adjusted G-Mean, accuracy, true positive rate, true negative rate and F1-score. The obtained results were compared to well-known algorithms and showed the effectiveness and robustness of the proposed approach, which yields well-balanced classifiers under different imbalance scenarios.

Subject

Neural networks (Computer science)
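As a rough illustration of the idea the abstract describes, the sketch below weights each class's cross-entropy term by the inverse of its estimated prior, so errors on the minority class cost more. The function name and the specific inverse-prior weighting are assumptions for illustration; the paper's exact cost-sensitive formulation may differ.

```python
import numpy as np

def weighted_cross_entropy(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy with each class's term weighted by the
    inverse of its prior (illustrative weighting, not necessarily
    the paper's exact scheme).

    y_true : 0/1 labels
    y_prob : predicted probabilities of class 1
    """
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)

    # Class priors estimated from the labels themselves.
    p1 = y_true.mean()      # prior of the positive (often minority) class
    p0 = 1.0 - p1           # prior of the negative class

    # Inverse-prior weights: the rarer a class, the larger its weight.
    w1, w0 = 1.0 / p1, 1.0 / p0

    terms = -(w1 * y_true * np.log(y_prob)
              + w0 * (1.0 - y_true) * np.log(1.0 - y_prob))
    return terms.mean()
```

With labels `[1, 0, 0, 0]` the positive prior is 0.25, so a mistake on the single minority sample contributes four times the unweighted penalty, while each majority mistake contributes only 4/3 of it, counteracting the bias towards the majority class.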

External link

https://link.springer.com/article/10.1007/s11063-018-09977-1
