A Letter on Convergence of In-Parameter-Linear Nonlinear Neural Architectures with Gradient Learnings

Abstract

This letter summarizes and proves the concept of bounded-input bounded-state (BIBS) stability for the weight convergence of a broad family of in-parameter-linear nonlinear neural architectures trained with incremental gradient learning algorithms. The derived proofs yield a practical BIBS convergence condition that can be evaluated at every individual learning point, or per batch, for real-time applications.
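To illustrate the setting, the following is a minimal sketch (not the paper's actual derivation) of an in-parameter-linear nonlinear model, i.e., a model that is nonlinear in its input through a fixed feature map but linear in its weights, trained with an incremental (per-sample) gradient rule. The feature map `phi`, the learning rate `mu`, and the target function are illustrative assumptions; the per-sample bound `0 < mu < 2 / ||phi(x)||^2` is the classical normalized-LMS-type stability condition for linear-in-parameter models, used here as a stand-in for the kind of per-learning-point BIBS check the abstract describes.

```python
import numpy as np

def phi(x):
    # Hypothetical fixed nonlinear feature map; any fixed nonlinearity
    # keeps the model linear in the weights w (in-parameter-linear).
    return np.array([1.0, x, x**2, np.tanh(x)])

def incremental_gradient_step(w, x, d, mu):
    """One LMS-style incremental gradient update on sample (x, d).

    For linear-in-parameter models, a classical per-sample condition
    for bounded weights is 0 < mu < 2 / ||phi(x)||^2 (normalized-LMS
    bound), checked here at every individual learning point.
    """
    f = phi(x)
    e = d - w @ f                 # instantaneous prediction error
    mu_max = 2.0 / (f @ f)        # per-sample stability bound
    assert 0.0 < mu < mu_max, "learning rate violates per-sample bound"
    return w + mu * e * f

# Usage: fit a toy bounded target d = sin(x) sample by sample.
rng = np.random.default_rng(0)
w = np.zeros(4)
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)
    w = incremental_gradient_step(w, x, np.sin(x), mu=0.1)
# Bounded inputs plus the per-sample learning-rate check keep w bounded.
```

The design point mirrored here is that the condition is local: it depends only on the current sample's feature vector, so it can be verified online at each learning point rather than requiring a global, offline analysis.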
