Restricted Maximum Likelihood (REML) estimation of variance-covariance matrices is an optimization procedure that behaves well in the presence of sampling bias and that has both scientific and industrial applications. Parallel algorithms are presented and compared for computing the REML gradient when the covariance matrix is not assumed block diagonal. The implementations presented are based on the Portable Extensible Toolkit for Scientific Computing (PETSc) and can run on any parallel computer supporting MPI.
Revised: September 4, 2002 | Published: February 1, 2002
Citation
Malard, J.M. 2002. Parallel Restricted Maximum Likelihood Estimation for Linear Models with a Dense Exogenous Matrix. Parallel Computing 28, no. 2: 343-353. PNNL-SA-34215.