Optimize TSK Fuzzy Systems for Regression Problems: Mini-Batch Gradient Descent with Regularization, DropRule and AdaBound (MBGD-RDA)

26 March 2019
Dongrui Wu, Ye Yuan, Yihua Tan
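The title spells out the training recipe: a Takagi-Sugeno-Kang (TSK) fuzzy system is fit to regression data by mini-batch gradient descent, with regularization on the model parameters, DropRule (randomly dropping fuzzy rules during training, in the spirit of dropout), and the AdaBound optimizer. As a rough illustration of the DropRule idea, below is a minimal NumPy sketch of a first-order TSK forward pass with rule dropping; the function name, the Gaussian membership functions, and the masking details are assumptions for illustration, not the authors' exact algorithm.

    import numpy as np

    def tsk_forward(X, centers, sigmas, W, b, drop_rate=0.5, training=True):
        """Forward pass of a first-order TSK fuzzy system with DropRule (illustrative sketch).

        X: (n, d) inputs; centers, sigmas: (R, d) Gaussian membership parameters
        for R rules; W: (R, d), b: (R,) linear rule consequents. During training,
        each rule is kept with probability 1 - drop_rate (assumed dropout-style mask).
        """
        # Rule firing levels: product of Gaussian memberships across input dimensions.
        f = np.exp(-((X[:, None, :] - centers) ** 2 / (2 * sigmas ** 2)).sum(-1))  # (n, R)
        if training:
            mask = np.random.rand(f.shape[1]) >= drop_rate  # drop whole rules at random
            f = f * mask
        y_rule = X @ W.T + b                                # (n, R) rule consequent outputs
        return (f * y_rule).sum(1) / (f.sum(1) + 1e-12)     # firing-level weighted average

At test time, training=False is passed so all rules fire. Note that because the TSK output is normalized by the sum of the surviving firing levels, dropping rules does not change the output scale the way standard dropout does, so no inverted-dropout rescaling is needed in this sketch.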

Papers citing "Optimize TSK Fuzzy Systems for Regression Problems: Mini-Batch Gradient Descent with Regularization, DropRule and AdaBound (MBGD-RDA)"

4 papers shown
On the Functional Equivalence of TSK Fuzzy Systems to Neural Networks, Mixture of Experts, CART, and Stacking Ensemble Regression
Dongrui Wu, Chin-Teng Lin, Jian Huang, Zhigang Zeng
25 Mar 2019

Adaptive Gradient Methods with Dynamic Bound of Learning Rate
Liangchen Luo, Yuanhao Xiong, Yan Liu, Xu Sun
26 Feb 2019

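This is the AdaBound paper; AdaBound is the optimizer used in MBGD-RDA. It clips Adam's per-parameter step size between a lower and an upper bound that tighten over time toward a fixed final learning rate, so training behaves like Adam early on and approaches SGD late. Below is a minimal NumPy sketch of one bounded update, with bound schedules in the form used by the reference AdaBound implementation; the default final_lr and gamma values here are illustrative, not prescriptive.

    import numpy as np

    def adabound_step(p, g, m, v, t, lr=1e-3, final_lr=0.1, gamma=1e-3,
                      beta1=0.9, beta2=0.999, eps=1e-8):
        """One AdaBound-style update for parameter array p with gradient g (sketch).

        m, v are the Adam first/second moment estimates; t is the step count
        (starting at 1). Returns the updated (p, m, v).
        """
        m = beta1 * m + (1 - beta1) * g          # first moment, as in Adam
        v = beta2 * v + (1 - beta2) * g * g      # second moment, as in Adam
        m_hat = m / (1 - beta1 ** t)             # bias corrections
        v_hat = v / (1 - beta2 ** t)
        step = lr / (np.sqrt(v_hat) + eps)       # Adam's per-parameter step size
        lower = final_lr * (1 - 1 / (gamma * t + 1))  # both bounds converge to
        upper = final_lr * (1 + 1 / (gamma * t))      # final_lr as t grows
        step = np.clip(step, lower, upper)       # the dynamic bound of the title
        return p - step * m_hat, m, v

Early on (small gamma * t) the bounds are roughly (0, infinity) and the update is essentially Adam's; as t grows both bounds approach final_lr, so the update approaches SGD with that fixed rate.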
Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
22 Dec 2014

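Adam, which AdaBound extends, keeps exponential moving averages of the gradient and of its elementwise square, corrects their initialization bias, and scales each parameter's step by the inverse square root of the second-moment estimate. A minimal NumPy sketch of one update, with the default hyperparameters from the paper:

    import numpy as np

    def adam_step(p, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update for parameter array p with gradient g (step count t >= 1)."""
        m = beta1 * m + (1 - beta1) * g        # biased first moment estimate
        v = beta2 * v + (1 - beta2) * g * g    # biased second raw moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)           # bias-corrected second moment
        return p - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
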
Practical recommendations for gradient-based training of deep architectures
Yoshua Bengio
24 Jun 2012