Generalizations of the Choquet integral as a combination method in ensembles of classifiers

Bibliographic Details
Main Author: Batista, Thiago Vinicius Vieira
Other Authors: Bedregal, Benjamin Rene Callejas
Format: doctoralThesis
Language: pt_BR
Published: Universidade Federal do Rio Grande do Norte
Item URL: https://repositorio.ufrn.br/handle/123456789/48233
Description
Abstract: An ensemble of classifiers is a machine-learning method consisting of a collection of classifiers that process the same information and whose outputs are combined in some manner. Classification is carried out in two main steps: the classification step and the combination step. In the classification step, each classifier processes the information and produces an output; in the combination step, the outputs of all classifiers are combined into a single output. Although the combination step is extremely important, most works focus mainly on the classification step. In this work, therefore, generalizations of the Choquet integral are proposed for use as the combination method in ensembles of classifiers. The main idea is to allow greater freedom in the choice of the functions inside the integral, opening possibilities for optimization and for the use of functions suited to the data. Furthermore, a new notion of partial monotonicity is proposed and, consequently, an alternative to the notion of pre-aggregation functions. Preliminary results obtained with the generalizations of the Choquet integral in the ensemble show that they are capable of good performance, outperforming well-known methods from the literature such as XGBoost and Bagging, among others. Moreover, the generalizations that used the proposed aggregation functions performed well compared with other classes of functions, such as copulas and overlap functions.
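
As a point of reference for the combination step described above, the standard discrete Choquet integral of classifier scores x_1, ..., x_n with respect to a fuzzy measure mu is C_mu(x) = sum_i (x_(i) - x_(i-1)) * mu(A_(i)), where x_(1) <= ... <= x_(n) are the scores sorted increasingly, x_(0) = 0, and A_(i) is the set of classifiers whose score is at least x_(i). The Python sketch below illustrates this standard (non-generalized) form used as the combination step of an ensemble; the thesis's specific generalizations, proposed aggregation functions, and fuzzy measure are not given in this record, so the cardinality-based measure and the names choquet_integral and combine are illustrative assumptions only.

import numpy as np
from itertools import combinations

def choquet_integral(x, mu):
    # Standard discrete Choquet integral of the score vector x with respect
    # to the fuzzy measure mu (dict: frozenset of classifier indices -> value
    # in [0, 1], monotone, mu(empty set) = 0, mu(all classifiers) = 1).
    order = np.argsort(x)                                    # classifiers sorted by increasing score
    x_sorted = np.concatenate(([0.0], np.asarray(x, dtype=float)[order]))
    total = 0.0
    for i in range(1, len(x) + 1):
        coalition = frozenset(order[i - 1:].tolist())        # classifiers with score >= x_(i)
        total += (x_sorted[i] - x_sorted[i - 1]) * mu[coalition]
    return total

def combine(scores, mu):
    # Combination step: 'scores' has shape (n_classifiers, n_classes);
    # each class column is aggregated with the Choquet integral and the
    # class with the largest aggregated score is returned.
    aggregated = [choquet_integral(scores[:, c], mu) for c in range(scores.shape[1])]
    return int(np.argmax(aggregated))

# Illustrative run with 3 classifiers and 3 classes, using a simple
# cardinality-based fuzzy measure mu(A) = (|A| / n) ** 2 (an assumption,
# not the measure used in the thesis).
n = 3
mu = {frozenset(s): (len(s) / n) ** 2
      for r in range(n + 1) for s in combinations(range(n), r)}
scores = np.array([[0.7, 0.2, 0.1],
                   [0.6, 0.3, 0.1],
                   [0.2, 0.5, 0.3]])
print(combine(scores, mu))   # predicted class index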