Date: Fri, 27 Sep 2002 08:52:14 -0700
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: "David L. Cassell" <Cassell.David@EPAMAIL.EPA.GOV>
Subject: Re: Regression with Class Variables and Stepwise Selection
Content-type: text/plain; charset=us-ascii
"Femminella, Oliver" <Oliver.Femminella@HALIFAXCETELEM.COM> replied:
> you mention Ridge Regression...does ANYBODY know
> of (or has) any SAS implementation (macro) of this
> relatively old regression model...
PROC REG does it. There's an example in the docs.
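It looks something like this (the data set and variable names below are
placeholders, not anything from the docs):

```sas
/* RIDGE= requests ridge regression for the listed values of the    */
/* ridge constant k; the ridge coefficient estimates are written    */
/* to the OUTEST= data set.  PLOT / RIDGEPLOT draws a ridge trace,  */
/* which is the usual graphical aid for picking k after the fact.   */
proc reg data=mydata outest=ridge_est ridge=0 to 0.1 by 0.01;
   model y = x1 x2 x3;
   plot / ridgeplot nomodel nostat;
run;
quit;
```

Then you examine the ridge trace and the estimates in RIDGE_EST to
decide where the coefficients have stabilized.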
> even though there
> are more 'practical' modelling approaches used nowadays
> for dealing with multicollienearity - (e.g. neural networks,
> CART, logistic, support vector machines, etc.) -
I don't find these "more practical" at *all*. Some of them do not
even answer the same questions! Others have been over-hyped by
their developers and users, to an embarrassing extent. Neural
networks, for example, are merely a class of functionals which are
identical to an array of standard statistical procedures, even though
their relationships have unfortunately been well hidden. "Logistic"
is merely a jargon word, as it doesn't describe anything specific, and
the topics it could be applied to do *not* handle multicollinearity.
Neither do CART or neural networks, except by ignoring the problem and
sweeping the pieces under a nice, well-camouflaged rug.
> (in Ridge Regression one has to choose Lambda, the
> 'smoothing' or 'regularisation' parameter(s)).
Not quite. In ridge regression, one usually makes an _a posteriori_
choice about lambda, just as one needs to make similar choices when
using any flexible tool, like CART, SIR, etc. Using 'default'
settings in something like CART without checking is just asking for
trouble, or at least for suboptimal solution sets. But if you
really don't want to do ridge regression, consider PROC PLS instead.
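A minimal PROC PLS call would look like this (again, made-up data set
and variable names):

```sas
/* METHOD=PLS (partial least squares) is the default.  CV=ONE asks  */
/* for one-at-a-time cross validation, so the procedure picks the   */
/* number of extracted factors for you instead of your guessing it. */
proc pls data=mydata cv=one;
   model y = x1-x10;
run;
```

PLS handles collinear predictors by extracting a small number of
factors, so you sidestep the lambda question entirely.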
David Cassell, CSC
Senior computing specialist