Ovtchinnikov, Evgueni (2003) Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems II: the subspace acceleration. SIAM Journal on Numerical Analysis, 41 (1). pp. 272-286. ISSN 0036-1429
Official URL: http://dx.doi.org/10.1137/S0036142902411756
The generalized Davidson (GD) method can be viewed as a generalization of the preconditioned steepest descent (PSD) method for solving symmetric eigenvalue problems. There are two aspects of this generalization. The most obvious one is that in the GD method the new approximation is sought in a larger subspace, namely the one spanned by all the previous approximate eigenvectors, in addition to the current one and its preconditioned residual. Another aspect relates to the preconditioning. Most of the available results for the PSD method take the same view of preconditioning as in the case of linear systems. Consequently, they fail to detect the superlinear convergence for certain "ideal" preconditioners, such as the one corresponding to the "exact" version of the Jacobi-Davidson method, one of the most familiar instances of the GD method. Focusing on the preconditioning aspect, this paper advocates an alternative approach to measuring the quality of preconditioning for eigenvalue problems and presents corresponding non-asymptotic convergence estimates for the GD method in general, and the Jacobi-Davidson method in particular, that correctly detect known cases of superlinear convergence.
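To make the subspace-acceleration idea concrete, the following is a minimal sketch (not the paper's own code) of a generalized Davidson iteration for the smallest eigenpair of a symmetric matrix: at each step a Rayleigh-Ritz extraction is performed on the accumulated subspace, and the basis is then extended by the preconditioned residual of the current Ritz pair. The function name, the simple diagonal preconditioner, and all parameter choices are illustrative assumptions, not from the source.

```python
import numpy as np

def generalized_davidson(A, M_inv, v0, tol=1e-8, max_iter=100):
    """Sketch of the GD method for the smallest eigenpair of symmetric A.

    M_inv : callable applying an (assumed) preconditioner to the residual.
    """
    V = (v0 / np.linalg.norm(v0)).reshape(-1, 1)
    for _ in range(max_iter):
        # Rayleigh-Ritz on the accumulated subspace span(V),
        # which contains all previous approximate eigenvectors.
        H = V.T @ A @ V
        theta, S = np.linalg.eigh(H)
        lam, y = theta[0], V @ S[:, 0]
        r = A @ y - lam * y              # residual of the current Ritz pair
        if np.linalg.norm(r) < tol:
            return lam, y
        t = M_inv(r)                     # preconditioned residual
        # Orthogonalize against the current basis and extend the subspace.
        t = t - V @ (V.T @ t)
        nt = np.linalg.norm(t)
        if nt < 1e-14:                   # subspace is (numerically) invariant
            return lam, y
        V = np.hstack([V, (t / nt).reshape(-1, 1)])
    return lam, y
```

The PSD method corresponds to keeping only the current approximation and the preconditioned residual; GD instead retains the whole history in `V`, which is the subspace acceleration the paper analyzes.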
Additional Information: Online ISSN 1095-7170
Research Community: University of Westminster > Electronics and Computer Science, School of
Deposited On: 26 Sep 2005
Last Modified: 15 Oct 2009 15:05