Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


KAF

Kernel Activation Function

General · Introduced 2017 · 9 papers
Source Paper

Description

A Kernel Activation Function is a non-parametric activation function defined as a one-dimensional kernel approximator:

f(s) = \sum_{i=1}^{D} \alpha_i \kappa(s, d_i)

where:

  1. The dictionary of kernel elements d_1, \ldots, d_D is fixed by sampling the x-axis with a uniform step around 0.
  2. The user selects the kernel function (e.g., Gaussian, ReLU, Softplus) and the number of kernel elements D as hyper-parameters. A larger dictionary yields a more expressive activation function at the cost of more trainable parameters.
  3. The linear coefficients \alpha_i are adapted independently at every neuron via standard back-propagation.
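The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the Gaussian kernel bandwidth `gamma`, the dictionary range [-3, 3], and the random initialization scale are illustrative choices.

```python
import numpy as np

def gaussian_kernel(s, d, gamma=1.0):
    # kappa(s, d_i) = exp(-gamma * (s - d_i)^2), broadcast over all
    # dictionary elements: shape (..., D)
    return np.exp(-gamma * (s[..., None] - d) ** 2)

def kaf_forward(s, d, alpha, gamma=1.0):
    # f(s) = sum_i alpha_i * kappa(s, d_i), applied element-wise to the
    # pre-activations s of one neuron
    return gaussian_kernel(s, d, gamma) @ alpha

# Step 1: dictionary of D elements sampled with a uniform step around 0
D = 20
d = np.linspace(-3.0, 3.0, D)

# Step 3: trainable coefficients (one vector per neuron); here randomly
# initialized for illustration
rng = np.random.default_rng(0)
alpha = 0.1 * rng.standard_normal(D)

s = np.array([-1.0, 0.0, 2.0])  # example pre-activations
out = kaf_forward(s, d, alpha)  # shape (3,): one output per pre-activation
```

In a network, each neuron carries its own `alpha` vector while `d` and `gamma` stay fixed, so the activation shape is learned jointly with the weights by back-propagation.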

In addition, the linear coefficients can be initialized via kernel ridge regression so that the KAF behaves like a known activation function at the beginning of the optimization process.
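This initialization amounts to solving a small ridge-regularized linear system per activation shape. Below is a hedged NumPy sketch, assuming a Gaussian kernel and tanh as the target function; the regularization constant `eps` is an illustrative choice.

```python
import numpy as np

def krr_init(d, target_fn, gamma=1.0, eps=1e-4):
    # Solve (K + eps*I) alpha = t, where K[i, j] = kappa(d_i, d_j) and
    # t[i] = target_fn(d_i), so that f(s) ~= target_fn(s) at initialization
    K = np.exp(-gamma * (d[:, None] - d[None, :]) ** 2)
    t = target_fn(d)
    return np.linalg.solve(K + eps * np.eye(len(d)), t)

d = np.linspace(-3.0, 3.0, 20)
alpha = krr_init(d, np.tanh)

# The initialized KAF now approximates tanh on the dictionary points
K_eval = np.exp(-1.0 * (d[:, None] - d[None, :]) ** 2)
approx = K_eval @ alpha
```

Since the system is only D x D and solved once per target shape, the cost of this initialization is negligible compared to training.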

Papers Using This Method

Knowledge Augmentation in Federation: Rethinking What Collaborative Learning Can Bring Back to Decentralized Data (2025-03-05)
Kolmogorov-Arnold Fourier Networks (2025-02-09)
An Analytic Solution for Kernel Adaptive Filtering (2024-02-05)
Association of stroke lesion distributions with atrial fibrillation detected after stroke (2023-05-24)
The Functional Wiener Filter (2022-12-31)
Learning to Forecast Dynamical Systems from Streaming Data (2021-09-20)
Multikernel activation functions: formulation and a case study (2019-01-29)
Kafnets: kernel-based non-parametric activation functions for neural networks (2017-07-13)
Improving Sparsity in Kernel Adaptive Filters Using a Unit-Norm Dictionary (2017-07-13)