Semiring Activation in Neural Networks

29 May 2024
Bart M.N. Smets
Peter D. Donker
Jim W. Portegies
Abstract

We introduce a class of trainable nonlinear operators based on semirings that are suitable for use in neural networks. These operators generalise the traditional alternation of linear operators with activation functions in neural networks. Semirings are algebraic structures that describe a generalised notion of linearity, greatly expanding the range of trainable operators that can be included in neural networks. In fact, max- or min-pooling operations are convolutions in the tropical semiring with a fixed kernel. We perform experiments in which we replace activation functions with trainable semiring-based operators, showing that these are viable operations to include in fully connected as well as convolutional neural networks (ConvNeXt). We discuss some of the challenges of replacing traditional activation functions with trainable semiring activations and the trade-offs of doing so.
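
To illustrate the tropical-semiring remark above, here is a minimal NumPy sketch (not the authors' implementation; the function name tropical_conv1d and the max-plus semiring choice are ours for illustration). In the max-plus semiring, semiring addition is max and semiring multiplication is ordinary +, so a "convolution" with an all-zero kernel reduces to ordinary max-pooling, while a trainable kernel generalises it:

import numpy as np

def tropical_conv1d(x, kernel, stride=1):
    # 1-D convolution in the max-plus (tropical) semiring:
    # semiring "addition" -> max, semiring "multiplication" -> +.
    k = len(kernel)
    out = np.empty((len(x) - k) // stride + 1)
    for i in range(len(out)):
        window = x[i * stride : i * stride + k]
        out[i] = np.max(window + kernel)  # tropical "dot product"
    return out

x = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 0.0])

# All-zero kernel: tropical convolution equals max-pooling (window 2, stride 2).
print(tropical_conv1d(x, np.zeros(2), stride=2))            # [3. 5. 4.]

# A trainable kernel adds learned offsets before taking the max,
# generalising fixed pooling to a trainable semiring operator.
print(tropical_conv1d(x, np.array([0.5, -0.5]), stride=2))  # [2.5 4.5 4.5]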

@article{smets2025_2405.18805,
  title={Semiring Activation in Neural Networks},
  author={Bart M.N. Smets and Peter D. Donker and Jim W. Portegies},
  journal={arXiv preprint arXiv:2405.18805},
  year={2025}
}