Publications
2002
1.
Rodríguez, Rocío Alaiz; Sueiro, Jesús Cid
Neural minimax classifiers Proceedings Article
In: International Conference on Artificial Neural Networks, pp. 408–413, 2002, (Publisher: Springer Berlin Heidelberg).
Abstract | Links | BibTeX | Tags: class priors, minimax strategy, neural networks, robust learning, softmax
@inproceedings{alaiz_rodriguez_neural_2002,
title = {Neural minimax classifiers},
author = {Rocío Alaiz Rodríguez and Jesús Cid Sueiro},
url = {https://link.springer.com/chapter/10.1007/3-540-46084-5_66},
year = {2002},
date = {2002-01-01},
booktitle = {International Conference on Artificial Neural Networks},
pages = {408–413},
abstract = {This paper presents a method to train neural networks using a minimax strategy, which aims to minimize the error probability under the worst-case scenario where class priors in the training data differ from those in the test set. The approach is demonstrated using a softmax-based neural network, but it is also applicable to other neural network structures. Experimental results indicate that the proposed method outperforms other approaches in handling situations where the stationarity assumption (i.e., consistency between training and test data distributions) is violated.},
note = {Publisher: Springer Berlin Heidelberg},
keywords = {class priors, minimax strategy, neural networks, robust learning, softmax},
pubstate = {published},
tppubtype = {inproceedings}
}
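The minimax idea described in the abstract — training a softmax classifier against the worst-case class priors — can be sketched as an alternating scheme: gradient descent on the network weights under a prior-weighted loss, and an adversarial update that shifts prior mass toward the class with the higher conditional risk. The toy data, learning rates, and the exponentiated-gradient prior update below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class toy data: two Gaussian clusters.
X = np.vstack([rng.normal(-1.0, 1.0, size=(100, 2)),
               rng.normal(+1.0, 1.0, size=(100, 2))])
y = np.r_[np.zeros(100, dtype=int), np.ones(100, dtype=int)]

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

W = np.zeros((2, 2))           # linear softmax classifier weights
b = np.zeros(2)
pi = np.array([0.5, 0.5])      # adversarial class priors (worst case sought)
train_freq = np.bincount(y, minlength=2) / len(y)
lr_w, lr_pi = 0.1, 0.05        # assumed step sizes

for step in range(500):
    P = softmax(X @ W.T + b)
    ce = -np.log(P[np.arange(len(y)), y] + 1e-12)
    # Estimated class-conditional risks.
    class_risk = np.array([ce[y == c].mean() for c in (0, 1)])
    # Reweight samples so the empirical loss reflects priors pi
    # instead of the training-set class frequencies.
    w_sample = pi[y] / train_freq[y]
    # Descent on weights under the prior-weighted cross-entropy.
    G = (P - np.eye(2)[y]) * w_sample[:, None]
    W -= lr_w * (G.T @ X) / len(y)
    b -= lr_w * G.mean(axis=0)
    # Ascent on priors: move mass toward the harder class
    # (exponentiated-gradient update, kept on the simplex).
    pi = pi * np.exp(lr_pi * class_risk)
    pi /= pi.sum()
```

At convergence the two class-conditional risks are driven toward equality, which is the defining property of a minimax classifier: its error probability no longer depends on which priors the test set happens to have.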