Document Type: Original Article
Authors
1 Department of Mathematics and Computer Sciences, Faculty of Mathematical Sciences, Lahijan Branch, Islamic Azad University, Lahijan, Iran.
2 Department of Applied Mathematics, Ayandegan Institute of Higher Education, Tonekabon, Iran.
Abstract
In this paper, we propose a new neurodynamic model with a recurrent learning process for solving the ill-conditioned generalized eigenvalue problem (GEP) Ax = λBx. Our method is based on recurrent neural networks with a customized energy function for finding the smallest (largest) eigenpair or all eigenpairs. We evaluate our method on structural engineering data from the Harwell-Boeing collection, which features a high-dimensional parameter space and ill-conditioned sparse matrices. The experiments demonstrate that our algorithm, using the Adam optimizer, works well in practice compared with other stochastic optimization methods such as gradient descent, and improves both the convergence speed and the accuracy of the computed eigenpairs.
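To illustrate the general idea (not the authors' exact network), the sketch below minimizes the generalized Rayleigh quotient x^T A x / x^T B x with hand-rolled Adam updates, which drives x toward the eigenvector of the smallest generalized eigenvalue of the pair (A, B). The function name, step counts, and learning-rate choices are illustrative assumptions, and the example uses a small symmetric-definite pair rather than a Harwell-Boeing matrix.

```python
import numpy as np

def smallest_geneig_adam(A, B, steps=5000, lr=0.05,
                         beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Approximate the smallest eigenpair of A x = lambda B x by
    minimizing the generalized Rayleigh quotient with Adam updates.
    Assumes A, B symmetric and B positive definite."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    m = np.zeros(n)          # first-moment (mean) estimate
    v = np.zeros(n)          # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        Ax, Bx = A @ x, B @ x
        lam = (x @ Ax) / (x @ Bx)             # current Rayleigh quotient
        g = 2.0 * (Ax - lam * Bx) / (x @ Bx)  # its gradient w.r.t. x
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        mhat = m / (1 - beta1 ** t)           # bias-corrected moments
        vhat = v / (1 - beta2 ** t)
        x = x - lr * mhat / (np.sqrt(vhat) + eps)
        x = x / np.linalg.norm(x)             # keep the iterate bounded
    lam = (x @ A @ x) / (x @ B @ x)
    return lam, x

# Example: for this diagonal pair the smallest generalized eigenvalue is 1
A = np.diag([4.0, 2.0, 1.0])
B = np.eye(3)
lam, x = smallest_geneig_adam(A, B)
```

Replacing the minimization with maximization of the same quotient (or negating the gradient) yields the largest eigenpair, which is how "smallest (largest)" variants of such energy functions are typically obtained.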
References
- Erkan, U. (2021). A precise and stable machine learning algorithm: eigenvalue classification (EigenClass). Neural computing and applications, 33(10), 5381-5392.
- Datta, B. N. (2010). Numerical linear algebra and applications (Vol. 116). SIAM.
- Najafi, H. S., & Refahi, A. (2007). FOM-inverse vector iteration method for computing a few smallest (largest) eigenvalues of pair (A, B). Applied mathematics and computation, 188(1), 641-647.
- Cichocki, A., & Unbehauen, R. (1992). Neural networks for computing eigenvalues and eigenvectors. Biological cybernetics, 68(2), 155-164.
- Feng, J., Chai, Y., & Xu, C. (2021). A novel neural network to nonlinear complex-variable constrained nonconvex optimization. Journal of the Franklin institute, 358(8), 4435-4457.
- Xu, C., Chai, Y., Qin, S., Wang, Z., & Feng, J. (2020). A neurodynamic approach to nonsmooth constrained pseudoconvex optimization problem. Neural networks, 124, 180-192.
- Qin, S., Feng, J., Song, J., Wen, X., & Xu, C. (2016). A one-layer recurrent neural network for constrained complex-variable convex optimization. IEEE transactions on neural networks and learning systems, 29(3), 534-544.
- Qin, S., Fan, D., Wu, G., & Zhao, L. (2015). Neural network for constrained nonsmooth optimization using Tikhonov regularization. Neural networks, 63, 272-281.
- Liao, L. Z., Qi, H., & Qi, L. (2004). Neurodynamical optimization. Journal of global optimization, 28(2), 175-195.
- Liu, L., Shao, H., & Nan, D. (2008). Recurrent neural network model for computing largest and smallest generalized eigenvalue. Neurocomputing, 71(16-18), 3589-3594.
- Feng, J., Yan, S., Qin, S., & Han, W. (2019). A neurodynamic approach to compute the generalized eigenvalues of symmetric positive matrix pair. Neurocomputing, 359, 420-426.
- Yi, Z., Fu, Y., & Tang, H. J. (2004). Neural networks-based approach for computing eigenvectors and eigenvalues of symmetric matrix. Computers & mathematics with applications, 47(8-9), 1155-1164.
- Wang, X., Che, M., & Wei, Y. (2016). Recurrent neural network for computation of generalized eigenvalue problem with real diagonalizable matrix pair and its applications. Neurocomputing, 216, 230-241.
- Najafi, H. S., & Khaleghi, E. (2004). A new restarting method in the Arnoldi algorithm for computing the eigenvalues of a nonsymmetric matrix. Applied mathematics and computation, 156(1), 59-71.
- Boisvert, R. F., Pozo, R., Remington, K., Barrett, R. F., & Dongarra, J. J. (1997). Matrix Market: a web resource for test matrix collections. Quality of numerical software (pp. 125-137). Springer, Boston, MA.
- Kingma, D. P., & Ba, J. (2014). Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980. https://arxiv.org/abs/1412.6980
- Ghojogh, B., Karray, F., & Crowley, M. (2019). Eigenvalue and generalized eigenvalue problems: Tutorial. arXiv preprint arXiv:1903.11240. https://arxiv.org/abs/1903.11240