
Abstract: Consider a linear system $ Ax = b $ where the coefficient matrix $ A $ is rectangular and of full column rank. We propose an iterative algorithm for solving this linear system, based on a gradient-descent optimization technique, which produces a sequence of approximate least-squares solutions. We treat least-squares solutions in full generality: every related error is measured in the vector norm induced by an arbitrary positive definite weight matrix $ W $. When the system has a unique solution, the proposed algorithm produces approximate solutions converging to that solution. When the system is inconsistent, the sequence of residual norms converges to the weighted least-squares error. The usual least-squares solution is recovered as the special case $ W = I $. Numerical experiments validate the capability of the algorithm; its performance surpasses that of recent gradient-based iterative algorithms in both iteration count and computational time.
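The abstract describes minimizing the weighted residual $ (Ax - b)^{T} W (Ax - b) $ by gradient descent. The sketch below illustrates the general idea only; the function name `weighted_ls_gd` and the fixed step size $ 1/\lambda_{\max}(A^{T}WA) $ are assumptions for illustration and need not match the paper's own update rule or step-size choice.

```python
import numpy as np

def weighted_ls_gd(A, b, W, x0=None, tol=1e-10, max_iter=10000):
    """Generic gradient-descent sketch for min_x (Ax - b)^T W (Ax - b),
    with A of full column rank and W symmetric positive definite.
    Not the paper's exact algorithm: the step size here is the fixed
    value 1 / lambda_max(A^T W A), which is enough to guarantee
    convergence but may differ from the authors' choice.
    """
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float)
    AtW = A.T @ W                       # precompute A^T W
    H = AtW @ A                         # Hessian of the objective, up to a factor 2
    step = 1.0 / np.linalg.eigvalsh(H).max()
    for _ in range(max_iter):
        r = A @ x - b                   # residual Ax - b
        g = AtW @ r                     # half the gradient: A^T W (Ax - b)
        if np.linalg.norm(g) < tol:     # stop when the (scaled) gradient is tiny
            break
        x = x - step * g
    return x

# Small consistent example: the iterates converge to the unique solution.
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
x_true = np.array([1.0, -2.0])
b = A @ x_true
W = np.diag([1.0, 2.0, 3.0])            # any SPD weight matrix; W = I gives ordinary least squares
x_hat = weighted_ls_gd(A, b, W)
print(np.allclose(x_hat, x_true, atol=1e-6))
```

For an inconsistent right-hand side $ b $, the same iteration drives the weighted residual norm down to the weighted least-squares error rather than to zero, consistent with the convergence behaviour stated in the abstract.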
Keywords: least-squares solution; weighted norm; gradient-descent; iterative method; convergence analysis
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
