Multiple general sigmoids based Banach space valued neural network multivariate approximation

DOI: https://doi.org/10.56754/0719-0646.2503.411

Abstract

Here we present multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or on \(\mathbb{R}^{N}\), \(N\in \mathbb{N}\), by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat approximation by iterates of these four types of operators. The approximations are derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or of its high order Fréchet derivatives. Our multivariate operators are defined by means of a multidimensional density function induced by several general sigmoid functions that differ from one another; this is done in order to activate as many neurons as possible. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer. We finish with related \(L_{p}\) approximations.
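
To fix ideas, here is a schematic sketch of this type of construction (the notation \(\varphi_{i}\), \(\Phi\), \(A_{n}\), \(\beta\) below is generic and assumed for illustration; it is not quoted from the paper): each general sigmoid \(\sigma_{i}\) induces a bell-shaped density, typically of the form \(\varphi_{i}(x)=\frac{1}{2}\left(\sigma_{i}(x+1)-\sigma_{i}(x-1)\right)\), and the multidimensional density is the product \(\Phi(x_{1},\ldots,x_{N})=\prod_{i=1}^{N}\varphi_{i}(x_{i})\), so that different sigmoids are engaged along the different coordinates. In the \(\mathbb{R}^{N}\) case the normalized operator then takes, schematically, the form
\[
A_{n}(f,x)=\frac{\sum_{k\in\mathbb{Z}^{N}} f\left(\frac{k}{n}\right)\Phi(nx-k)}{\sum_{k\in\mathbb{Z}^{N}}\Phi(nx-k)},\qquad x\in\mathbb{R}^{N},
\]
and the Jackson type inequalities bound \(\left\Vert A_{n}(f,x)-f(x)\right\Vert\) in terms of the multivariate modulus of continuity \(\omega_{1}\left(f,\frac{1}{n^{\beta}}\right)\), \(0<\beta<1\), plus a remainder decaying rapidly in \(n\). The quasi-interpolation, Kantorovich type and quadrature type operators are companion constructions built from the same density \(\Phi\), with the point samples \(f\left(\frac{k}{n}\right)\) replaced by local averages (Kantorovich case) or by quadrature sums (quadrature case).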

Keywords

General sigmoid functions, multivariate neural network approximation, quasi-interpolation operator, Kantorovich type operator, quadrature type operator, multivariate modulus of continuity, abstract approximation, iterated approximation, \(L_{p}\) approximation

Mathematics Subject Classification:

41A17, 41A25, 41A30, 41A36
  • Pages: 411–439
  • Date Published: 2023-12-22
  • Vol. 25 No. 3 (2023)




How to Cite

[1] G. A. Anastassiou, “Multiple general sigmoids based Banach space valued neural network multivariate approximation”, CUBO, vol. 25, no. 3, pp. 411–439, Dec. 2023.
