Star algorithm for NN ensembling
Sergey Zinchenko, Dmitry Lishudi
Abstract
Neural network ensembling is a common and robust way to increase model efficiency. In this paper, we propose a new neural network ensemble algorithm based on Audibert's empirical star algorithm. We provide an optimal theoretical minimax bound on the excess squared risk. Additionally, we empirically study this algorithm on regression and classification tasks and compare it to the most popular ensembling methods.
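The paper does not reproduce the algorithm here, but Audibert's empirical star procedure has a simple two-step shape that can be sketched for the ensembling setting. The sketch below is an illustrative assumption, not the authors' implementation: given predictions of `k` pre-trained networks, step 1 picks the empirical risk minimizer (ERM) among them under squared loss, and step 2 re-minimizes empirical risk over the "star" of segments {α·f + (1−α)·f_ERM : f in the class, α ∈ [0, 1]} via a grid search over α.

```python
import numpy as np

def star_ensemble(preds, y, n_alphas=101):
    """Two-step empirical star procedure (illustrative sketch).

    preds: (k, n) array of predictions from k pre-trained models on n points
    y:     (n,) array of regression targets
    Returns a weight vector (k,) describing the convex combination found.
    """
    # Step 1: empirical risk minimizer over the base models (squared loss).
    risks = np.mean((preds - y) ** 2, axis=1)
    erm = int(np.argmin(risks))

    # Step 2: minimize empirical risk over the star of segments
    # {alpha * f_j + (1 - alpha) * f_erm : j in 1..k, alpha in [0, 1]},
    # here approximated by a uniform grid over alpha.
    best = (risks[erm], erm, 0.0)  # (risk, model index, alpha)
    for j in range(preds.shape[0]):
        for alpha in np.linspace(0.0, 1.0, n_alphas):
            mix = alpha * preds[j] + (1.0 - alpha) * preds[erm]
            r = np.mean((mix - y) ** 2)
            if r < best[0]:
                best = (r, j, alpha)

    _, j, alpha = best
    w = np.zeros(preds.shape[0])
    w[j] += alpha
    w[erm] += 1.0 - alpha
    return w
```

By construction the star output is never worse on the training sample than the plain ERM (α = 0 recovers it), which is the mechanism behind the improved excess-risk guarantees the abstract refers to.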
Results
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Image Classification | Fashion-MNIST | Accuracy | 92.3 | Star Algorithm on LeNet |
| Image Classification | Fashion-MNIST | Percentage error | 7.7 | Star Algorithm on LeNet |