problem with logistic regression - linearity to the logit

23 messages
Re: problem with logistic regression - linearity to the logit

joesturgis
Hello, I'm at a similar place in my research using logistic regression. I have four parameters that fail the linearity test, and I was able to use the methods in Menard (1995) to discover possible parametric forms. One is a cubic and the other three are quadratic; each provides a significant change in -2LL, so I believe all of the forms are correct.

I have one small issue: when I refit the logistic regression with the transformed parameters, one of the quadratic parameters was no longer significant. I could just remove the squared term to get acceptable results, but it doesn't seem right--the other two quadratic parameters are significant, and each variable has the same physical meaning. Any suggestions? Could this be a problem with collinearity?

Re: problem with logistic regression - linearity to the logit

Bruce Weaver
Administrator
There is no need for every variable in the model to be statistically significant.  In fact, the Manuscript Checklist at the link below says that a model with no non-significant explanatory variables is suspicious.

   http://biostat.mc.vanderbilt.edu/wiki/Main/ManuscriptChecklist

Here's the relevant section:

Lack of insignificant variables in the final model

Unless the sample size is huge, this is usually the result of the authors using a stepwise variable selection or some other approach for filtering out "insignificant" variables. Hence the presence of a table of variables in which every variable is significant is usually the sign of a serious problem.

Authors frequently use strategies involving removing insignificant terms from the model without making an attempt to derive valid confidence intervals or P-values that account for uncertainty in which terms were selected (using, for example, the bootstrap or penalized maximum likelihood estimation). A paper in J Clin Epi March 2009 cited Ockham's razor as a principle to be followed when building a model, not realizing that parsimony resulting from using the data at hand to make modeling decisions only seems to result in parsimony. Removing insignificant terms causes bias, inaccurate (too narrow) confidence intervals, and failure to preserve type I error in the resulting model's P-values, which are calculated as though the model was completely pre-specified.

HTH.


--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).

Re: problem with logistic regression - linearity to the logit

Rich Ulrich
In reply to this post by joesturgis
Yes, when a univariate relation no longer shows up once you control
for other variables, that is collinearity.

And you are right (though you don't ask) that you should use one
form for the variables with the same physical meaning.
 - I always hope that the transformation makes sense as a natural
transformation of the data and of how it is collected. So I wonder
about the one that was improved by the cubic, because I haven't seen
that happen in my own data.
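On the collinearity point: a raw quadratic term is often highly correlated with its linear term when the variable is far from zero, and centering before squaring usually removes most of that correlation without changing the fitted curve. A quick sketch with simulated data:

```python
# Collinearity between a predictor and its square, and the effect of
# centering. Data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(5, 15, 1000)        # strictly positive, so x and x^2 correlate strongly

r_raw = np.corrcoef(x, x ** 2)[0, 1]

xc = x - x.mean()                   # center first, then square
r_centered = np.corrcoef(xc, xc ** 2)[0, 1]

print(f"corr(x,  x^2)  = {r_raw:.3f}")
print(f"corr(xc, xc^2) = {r_centered:.3f}")
```

If the squared term that "lost" significance regains it after centering, the earlier non-significance was likely a collinearity artifact rather than an absent curvature.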

--
Rich Ulrich


> Date: Wed, 10 Aug 2011 15:09:34 -0700
> From: [hidden email]
> Subject: Re: problem with logistic regression - linearity to the logit
> To: [hidden email]