[R] SVD of a variance matrix

From: Giovanni Petris <GPetris_at_uark.edu>
Date: Tue, 15 Apr 2008 16:43:07 -0500 (CDT)


I suppose this is more a matrix theory question than a question on R, but I will give it a try...

I am using La.svd to compute the singular value decomposition (SVD) of a variance matrix, i.e., a symmetric nonnegative definite square matrix. Let S be my variance matrix, and S = U D V' its SVD. In my numerical experiments I always got U = V. Is this necessarily the case? Or might I eventually run into an SVD with U != V?
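[Not part of the original post: a minimal sketch of the experiment described above, using a randomly generated positive definite matrix. For a symmetric positive definite S, any SVD must have U = V, so La.svd's u and t(vt) should agree up to rounding error; only for a singular S could the columns paired with zero singular values differ.]

```r
# Build a symmetric positive definite "variance" matrix S = A'A
set.seed(1)
A <- matrix(rnorm(25), 5, 5)
S <- crossprod(A)

# La.svd returns a list with components d (singular values),
# u (left singular vectors), and vt (transpose of right singular vectors)
sv <- La.svd(S)

# For positive definite S, u and t(vt) should agree to machine precision
max(abs(sv$u - t(sv$vt)))   # expected to be on the order of 1e-15
```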

Thank you in advance for your insights and pointers.



Giovanni Petris  <GPetris_at_uark.edu>
Associate Professor
Department of Mathematical Sciences
University of Arkansas - Fayetteville, AR 72701
Ph: (479) 575-6324, 575-8630 (fax)

R-help_at_r-project.org mailing list
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
Received on Tue 15 Apr 2008 - 21:45:36 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Tue 15 Apr 2008 - 22:30:28 GMT.

