Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Multi-Objective Optimisation of Multi-Output Neural Trees

Varun Ojha, Giuseppe Nicosia

2020-10-09 · General Classification

Paper · PDF · Code (official)

Abstract

We propose a new algorithm and method for tackling classification problems: the multi-output neural tree (MONT), an evolutionary learning algorithm trained by the non-dominated sorting genetic algorithm NSGA-III. Since evolutionary learning is stochastic, the hypothesis found in the form of a MONT is unique to each run of evolutionary learning, i.e., each generated hypothesis (tree) bears distinct properties, in both topological space and parameter space, compared with any other hypothesis. This leads to a challenging optimisation problem in which the aim is to minimise tree size and maximise classification accuracy; Pareto-optimality was therefore assessed through hypervolume indicator analysis. We used nine benchmark classification problems to evaluate the performance of MONT and, in our experiments, obtained MONTs that tackle these problems with high accuracy. Over the set of problems tackled in this study, MONT outperformed a set of well-known classifiers: multilayer perceptron, reduced-error pruning tree, naive Bayes classifier, decision tree, and support vector machine. Moreover, comparing three versions of MONT training, using genetic programming, NSGA-II, and NSGA-III, suggests that NSGA-III yields the best Pareto-optimal solutions.
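The abstract frames the search as a two-objective problem (minimise tree size, maximise accuracy) whose solution sets are compared via the hypervolume indicator. The sketch below, which is illustrative and not the paper's implementation (the candidate values and reference point are made up), shows the two core ideas: extracting the Pareto front of candidate trees by non-dominated filtering, and scoring a 2-D front by the area it dominates relative to a reference point.

```python
# Illustrative sketch (not the MONT implementation): each candidate tree
# is summarised as a (tree_size, accuracy) pair; size is minimised and
# accuracy is maximised.

def dominates(a, b):
    """a dominates b when a is no worse on both objectives and
    strictly better on at least one."""
    size_a, acc_a = a
    size_b, acc_b = b
    no_worse = size_a <= size_b and acc_a >= acc_b
    strictly_better = size_a < size_b or acc_a > acc_b
    return no_worse and strictly_better

def pareto_front(candidates):
    """Candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

def hypervolume_2d(front, ref):
    """Area dominated by the front relative to a reference point
    (worst acceptable size, worst acceptable accuracy); larger is
    better. Standard slicing sweep for the 2-D case."""
    ref_size, ref_acc = ref
    hv, prev_acc = 0.0, ref_acc
    for size, acc in sorted(front):  # size ascending, accuracy ascending
        hv += (ref_size - size) * (acc - prev_acc)
        prev_acc = acc
    return hv

# Hypothetical candidates produced by an evolutionary run.
trees = [(5, 0.90), (9, 0.97), (14, 0.99), (11, 0.92), (5, 0.85)]
front = pareto_front(trees)  # (11, 0.92) and (5, 0.85) are dominated
print(sorted(front))
print(hypervolume_2d(front, ref=(20, 0.80)))
```

This is the comparison the paper makes between the GP, NSGA-II, and NSGA-III variants: run each, take its Pareto front, and prefer the variant whose front has the larger hypervolume.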

Results

Task                   | Dataset | Metric   | Value | Model
-----------------------|---------|----------|-------|------
General Classification | iris    | Accuracy | 100%  | MONT3
General Classification | Wine    | Accuracy | 100%  | MONT3

Related Papers

Specialized text classification: an approach to classifying Open Banking transactions (2025-04-10)
Universal Training of Neural Networks to Achieve Bayes Optimal Classification Accuracy (2025-01-13)
Revisiting MLLMs: An In-Depth Analysis of Image Classification Abilities (2024-12-21)
Using Instruction-Tuned Large Language Models to Identify Indicators of Vulnerability in Police Incident Narratives (2024-12-16)
Ramsey Theorems for Trees and a General 'Private Learning Implies Online Learning' Theorem (2024-07-10)
Cross-Block Fine-Grained Semantic Cascade for Skeleton-Based Sports Action Recognition (2024-04-30)
DiffuseMix: Label-Preserving Data Augmentation with Diffusion Models (2024-04-05)
Large Stepsize Gradient Descent for Logistic Loss: Non-Monotonicity of the Loss Improves Optimization Efficiency (2024-02-24)