ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

MixFunn: A Neural Network for Differential Equations with Improved Generalization and Interpretability

28 March 2025
Tiago de Souza Farias
Gubio Gomes de Lima
Jonas Maziero
Celso Jorge Villas-Boas
Abstract

We introduce MixFunn, a novel neural network architecture designed to solve differential equations with enhanced precision, interpretability, and generalization capability. The architecture comprises two key components: the mixed-function neuron, which integrates multiple parameterized nonlinear functions to improve representational flexibility, and the second-order neuron, which combines a linear transformation of its inputs with a quadratic term to capture cross-combinations of input variables. These features significantly enhance the expressive power of the network, enabling it to achieve comparable or superior results with drastically fewer parameters, up to four orders of magnitude fewer than conventional approaches. We applied MixFunn in a physics-informed setting to solve differential equations in classical mechanics, quantum mechanics, and fluid dynamics, demonstrating its effectiveness in achieving higher accuracy and improved generalization to regions outside the training domain relative to standard machine learning models. Furthermore, the architecture facilitates the extraction of interpretable analytical expressions, offering valuable insights into the underlying solutions.
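The two building blocks described above can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the function names, the choice of nonlinear basis functions, and all shapes are assumptions made for the sketch. A second-order neuron adds a quadratic form x^T Q x to the usual linear map, so pairwise cross-combinations of inputs appear directly; a mixed-function neuron then mixes several parameterized nonlinearities of the resulting pre-activation.

```python
import numpy as np

def second_order_neuron(x, w, Q, b):
    """Linear transformation plus a quadratic term capturing
    cross-combinations of input variables: w.x + x^T Q x + b.
    (Hypothetical helper; name and signature are assumptions.)"""
    return w @ x + x @ Q @ x + b

def mixed_function_neuron(z, alphas, freqs=(1.0, 2.0)):
    """Weighted mix of parameterized nonlinear functions applied to a
    scalar pre-activation z. The basis set here (identity, sinusoids,
    Gaussian decay) is an assumed example, not the paper's exact set."""
    basis = np.array([z, *(np.sin(f * z) for f in freqs), np.exp(-z * z)])
    return alphas @ basis

# Toy forward pass: two inputs -> one second-order neuron -> mixed functions.
x = np.array([0.5, -1.0])
w = np.array([1.0, 0.5])
Q = np.array([[0.0, 0.3],
              [0.3, 0.0]])          # symmetric: contributes the x1*x2 cross term
z = second_order_neuron(x, w, Q, b=0.1)
y = mixed_function_neuron(z, alphas=np.array([0.5, 0.3, 0.1, 0.2]))
```

In a trained network the mixing weights (alphas here) and the function parameters would be learned; because each output is a closed-form combination of known functions, a fitted model can be read off as an analytical expression, which is the interpretability claim in the abstract.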

@article{farias2025_2503.22528,
  title={MixFunn: A Neural Network for Differential Equations with Improved Generalization and Interpretability},
  author={Tiago de Souza Farias and Gubio Gomes de Lima and Jonas Maziero and Celso Jorge Villas-Boas},
  journal={arXiv preprint arXiv:2503.22528},
  year={2025}
}