Date:         Fri, 17 Apr 2009 11:33:32 -0400
Reply-To:     Peter Flom <firstname.lastname@example.org>
Sender:       "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From:         Peter Flom <peterflomconsulting@MINDSPRING.COM>
Subject:      Re: Multivariate Linear Regression Models
Content-Type: text/plain; charset=UTF-8
>On Apr 17, 3:57 am, Pact.Capacity.Managem...@UK.FUJITSU.COM (Anthony ...) wrote:
>> Good-day all,
>> I was hoping someone may have a solution to a minor modelling issue we
>> are encountering.
>> Our aim is to produce "best-fit" analysis of response-variables against
>> multiple regressors (for which PROC REG is currently used), but as we are
>> dealing with sets of non-negative real numbers (>= 0), we wish to restrict
>> the range of possible parameter estimates to lie in the same range
>> (i.e. >= 0).
>> So, in summary: is there any way of ensuring the multivariate linear
>> regression analysis provides the "best fit" achievable with the
>> calculated parameters constrained to be greater than or equal to zero?
>I can't understand the reasoning here. Your predictor variables are
>non-negative real numbers. Why then do you require the parameter
>estimates also to be non-negative? That makes no sense to me.
>There could easily be an inverse (i.e. negative slope) relationship
>between a specific predictor and your response, which would result in
>a negative parameter estimate.
>Seems to me you need to rethink this.
Now I realize that I misinterpreted the original post, and I agree with
Paige: I don't see why you would want only non-negative parameter estimates.
I had thought you wanted only non-negative predicted values.
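For what it's worth, if the original poster really does need the coefficient
estimates themselves bounded below by zero: as far as I know PROC REG's
RESTRICT statement handles only equality restrictions, but the same linear
model can be fit with PROC NLIN, whose BOUNDS statement accepts inequality
constraints. A minimal sketch, using hypothetical data set and variable
names (work.fit, y, x1, x2):

```sas
proc nlin data=work.fit;
   /* starting values for the intercept and the two slopes */
   parms b0=0 b1=0 b2=0;
   /* keep the slope estimates non-negative */
   bounds b1 >= 0, b2 >= 0;
   /* the same linear model PROC REG would fit */
   model y = b0 + b1*x1 + b2*x2;
run;
```

Note that a bound which is active at the solution (an estimate pinned at 0)
means the unconstrained fit wanted a negative slope there, which is Paige's
point about rethinking whether the constraint is sensible.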
Peter L. Flom, PhD
www DOT peterflomconsulting DOT com