
doi: 10.1007/bfb0120952
This paper considers the problem of minimizing a nonlinear function subject to linear constraints. The method adopted is the reduced gradient method as described by Murtagh and Saunders, with a conjugate gradient method due to Shanno used for unconstrained minimization on manifolds. It is shown how to preserve past information on search directions when a basis change occurs and when a superbasic variable is dropped. Numerical results show a substantial improvement over the reported results of Murtagh and Saunders when conjugate gradient methods are used.
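The abstract describes a reduced-gradient framework in which a conjugate gradient recurrence accumulates search-direction information that one would prefer not to discard when a basis change occurs or a superbasic variable is dropped. The sketch below is only a generic illustration of that trade-off on an unconstrained quadratic, not Murtagh and Saunders' reduced gradient method or Shanno's conjugate gradient update: the function `cg_quadratic`, the `restart_at` parameter, and the PR+ direction formula are assumptions chosen for the example.

```python
import numpy as np

def cg_quadratic(A, b, x0, restart_at=None, tol=1e-10, max_iter=200):
    """Nonlinear conjugate gradient (Polak-Ribiere+) on f(x) = 0.5*x'Ax - b'x.

    Illustrative only: shows the direction recurrence d <- -g + beta*d and
    what is lost if the accumulated direction is reset to steepest descent
    at iteration `restart_at`, loosely analogous to discarding past search
    directions at a basis change in the reduced-gradient setting.
    """
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                              # gradient of the quadratic
    d = -g                                     # start with steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if restart_at is not None and k == restart_at:
            d = -g                             # discard direction history
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)            # exact line search (quadratic)
        x += alpha * d
        g_new = A @ x - b
        beta = (g_new @ (g_new - g)) / (g @ g) # Polak-Ribiere coefficient
        d = -g_new + max(beta, 0.0) * d        # PR+ direction update
        g = g_new
    return x, k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((30, 30))
    A = M @ M.T + 30 * np.eye(30)              # symmetric positive definite
    b = rng.standard_normal(30)
    _, it_keep = cg_quadratic(A, b, np.zeros(30))
    _, it_drop = cg_quadratic(A, b, np.zeros(30), restart_at=5)
    print("iterations, direction history kept:     ", it_keep)
    print("iterations, direction history discarded:", it_drop)
```

With the direction history kept, the quadratic is typically minimized in fewer iterations than when the recurrence is reset to steepest descent partway through; this loosely mirrors the loss that the paper's direction-preserving updates are designed to avoid in the constrained setting.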
Large-scale problems in mathematical programming, sparse matrix, reduced gradient method, numerical mathematical programming methods, nonlinear programming, search direction determination, methods of reduced gradient type, linearly constrained nonlinear programming, large-scale systems, superbasic variable, basis change, simplex method, computational results
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 12 |
| Popularity | Current impact/attention (the "hype") of the article in the research community, based on the underlying citation network. | Average |
| Influence | Overall/total impact of the article in the research community, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Initial momentum of the article directly after its publication, based on the underlying citation network. | Top 10% |
