Date: Thu, 1 Dec 2005 22:30:28 -0800
Reply-To: David L Cassell <davidlcassell@MSN.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: David L Cassell <davidlcassell@MSN.COM>
Subject: Re: Influential observations
Content-Type: text/plain; format=flowed
>Thanks, all of you, for the suggestions. Some of you have suggested
>removing variables. It is not possible for me to do that, as I have only
>8 variables and all of them play an important role. Removing variables
>would make life easier, but at the same time I would be losing important
>information, and since I want to make predictions I don't want to base
>my model on only a few variables. So I wanted to know which method is
>best suited for dealing with multicollinearity when predictions have to
>be made, preferably in a business setting.
 If all you want is a predictive model, then just run your data through
PROC PLS. Choose partial least squares regression, principal components
regression, or reduced rank regression. Just be aware that the process is
implicitly doing model reduction for you under the hood.
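 As a rough sketch (assuming a data set named MYDATA with response Y and
predictors X1-X8 -- your own names will differ), it might look something
like this:

   proc pls data=mydata method=pls cv=one;
      * method=pcr or method=rrr would give principal components ;
      * regression or reduced rank regression instead of PLS     ;
      model y = x1-x8;
      output out=preds predicted=yhat;
   run;

The cross-validation option lets the procedure pick the number of
extracted factors, which is exactly where that implicit model reduction
happens.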
 I'm going to look into my crystal ball and speculate that you don't need
all 8 variables in your model. If you really did need all 8 in order to
predict your Y, you wouldn't have this collinearity problem. Perhaps there
is a subject-matter rationale for keeping all of them in the model, as in
Paige's post. But I can't tell that. My crystal ball is getting really
fuzzy now.
David L. Cassell
3115 NW Norwood Pl.
Corvallis OR 97330