```
Date:         Wed, 2 Jul 1997 20:03:57 GMT
Reply-To:     David Nichols
Sender:       "SPSSX(r) Discussion"
From:         David Nichols
Organization: SPSS, Inc.
Subject:      Re: reliability inquiry (determinant approaching zero)

In article <97Jul1.143111pdt.10375@gateway.sharp.com>, Dale Glaser wrote:

>Quick question with probably an easy answer: I ran a reliability analysis
>with 16 dichotomous items (coded 0,1). I had already run a prior analysis
>by hand to confirm that the SPSS result approximates the KR-20, and it
>does.
>
>My conundrum is that even though sample size is obviously problematic
>(n = 100), I attain a reliability of .7293, but before I get too
>satisfied (!) the following warning crops up:
>
>***Warning*** Determinant of matrix is close to zero: 5.155E-16
>    Statistics based on inverse matrix for scale ALPHA are meaningless
>    and printed as .
>
>My understanding of matrix algebra was that one cause generally
>associated with problems in matrix inversion is singularity. I know that
>as the determinant approaches zero the matrix becomes singular,
>precluding inversion. So this is where I am stuck: for the 16 items, the
>average inter-item correlation is only .1411, so wouldn't that be some
>evidence that singularity is not the culprit? Or is this just unique to
>calculating reliability with dichotomous variables within the SPSS
>context?
>
>Any help would be greatly appreciated. Thanks!
>
>Dale Glaser, Ph.D.
>Clinical Outcomes Research
>Sharp HealthCare
>San Diego, CA

This message basically just means that certain statistics aren't being
computed because of concerns about numerical precision. There's a
different message that's issued when the matrix is actually singular (a
truly zero determinant). As I noted recently in a response to a question
on FACTOR, the determinant of a matrix (particularly a correlation
matrix) can be very close to 0 without there being any problems in
inverting the matrix or computing things like eigenvalues.

There is no need for any items to be highly intercorrelated (or, for a
covariance matrix, strongly covarying) in order for a correlation (or
covariance) matrix to be singular. Consider a set of random or
pseudo-random deviates computed so as to have only random
intercorrelations (or, in the population, 0 intercorrelations). Compute a
new variable that is the sum of these variables and add it to the set.
The resulting correlation or covariance matrix will be singular, though
the average intercorrelation will be quite low. And regardless of the
values of the intercorrelations, if you form a matrix from fewer cases
than variables, it will be singular.

--
-----------------------------------------------------------------------------
David Nichols              Senior Support Statistician              SPSS, Inc.
Phone: (312) 329-3684      Internet: nichols@spss.com      Fax: (312) 329-3668
-----------------------------------------------------------------------------
```
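The construction Nichols describes is easy to verify numerically. Below is a minimal sketch in NumPy (not part of the original post; the variable names and sample sizes are illustrative): 15 uncorrelated random variables plus their sum yield a correlation matrix whose determinant is numerically zero even though the average intercorrelation stays small, and a matrix built from fewer cases than variables is singular regardless of the correlations.

```python
import numpy as np

rng = np.random.default_rng(12345)

# 15 independent pseudo-random variables, 100 cases
X = rng.normal(size=(100, 15))
# Add a 16th variable that is the sum of the first 15
X = np.column_stack([X, X.sum(axis=1)])

R = np.corrcoef(X, rowvar=False)   # 16 x 16 correlation matrix
det = np.linalg.det(R)             # numerically indistinguishable from 0

# Average absolute off-diagonal correlation is nonetheless small
mask = ~np.eye(16, dtype=bool)
mean_r = np.abs(R[mask]).mean()

# Second point: fewer cases than variables also forces singularity
Y = rng.normal(size=(10, 20))      # 10 cases, 20 variables
det2 = np.linalg.det(np.corrcoef(Y, rowvar=False))

print(det, mean_r, det2)
```

Running this shows a determinant on the order of machine precision alongside a modest mean intercorrelation, which is exactly the pattern in the original question: a low average inter-item correlation is not evidence against (near-)singularity.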
