ResearchTrend.AI
SG-Blend: Learning an Interpolation Between Improved Swish and GELU for Robust Neural Representations

29 May 2025
Gaurav Sarkar
Jay Gala
Subarna Tripathi
arXiv (abs) · PDF · HTML
Main: 9 pages · 6 figures · 8 tables · Bibliography: 2 pages · Appendix: 2 pages
Abstract

The design of activation functions remains a pivotal component in optimizing deep neural networks. While prevailing choices like Swish and GELU demonstrate considerable efficacy, they often exhibit domain-specific optima. This work introduces SG-Blend, a novel activation function that blends our proposed SSwish, a first-order symmetric variant of Swish, and the established GELU through dynamic interpolation. By adaptively blending these constituent functions via learnable parameters, SG-Blend aims to harness their complementary strengths: SSwish's controlled non-monotonicity and symmetry, and GELU's smooth, probabilistic profile. The goal is a more universally robust balance between model expressivity and gradient stability. We conduct comprehensive empirical evaluations across diverse modalities and architectures, showing performance improvements across all considered natural language and computer vision tasks and models. These results, achieved with negligible computational overhead, underscore SG-Blend's potential as a versatile, drop-in replacement that consistently outperforms strong contemporary baselines. The code is available at this https URL.
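The blending idea described in the abstract can be sketched in a few lines. Note the hedges: the abstract does not give SSwish's exact formula (its "first-order symmetric" correction is defined in the paper itself), so plain Swish stands in for it below, and the scalar `alpha` is fixed here where the paper learns it as a parameter during training.

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # computed via the error function.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x). The paper's SSwish adds a symmetric
    # correction not specified in the abstract; this is a stand-in.
    return x / (1.0 + math.exp(-beta * x))

def sg_blend(x, alpha=0.5, beta=1.0):
    # Convex combination of the two activations. In the paper, alpha
    # (and the Swish slope beta) would be learnable parameters updated
    # by backpropagation; here they are fixed scalars for illustration.
    return alpha * swish(x, beta) + (1.0 - alpha) * gelu(x)
```

Because both constituents pass through the origin and approach the identity for large positive inputs, the blend does too, for any `alpha` in [0, 1]; the learnable weighting only reshapes the negative-input region and the curvature near zero.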

@article{sarkar2025_2505.23942,
  title={SG-Blend: Learning an Interpolation Between Improved Swish and GELU for Robust Neural Representations},
  author={Gaurav Sarkar and Jay Gala and Subarna Tripathi},
  journal={arXiv preprint arXiv:2505.23942},
  year={2025}
}