Part correlations and multicollinearity?

Part correlations and multicollinearity?

Anter
Hello to all,

 

I have a regression model in which two predictors are highly correlated (~ .80); their tolerance is low (.2-.3) and their VIF is high (~4-5).

Does this mean that the two variables do not make a large independent contribution to explaining the outcome? Even though there appears to be multicollinearity between them, these same two variables have the largest part (semipartial) correlation coefficients of all 10 variables in the model (5 of which are controls).

Can I say, based on these part correlations alone, that despite the multicollinearity each variable explains a unique proportion of variance in the outcome? I have read that "the more 'tolerant' a variable is (i.e. the less highly correlated it is with the other IVs), the greater its unique contribution to R2 will be", but my case seems to be the reverse: these variables have low tolerance yet a high contribution to explaining the criterion.
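For illustration, here is a minimal sketch of how these diagnostics can be computed in Python with statsmodels; the names X (a DataFrame of predictors) and y (the criterion) are placeholders, not the poster's actual variables:

import numpy as np
import pandas as pd
import statsmodels.api as sm

def collinearity_and_part_correlations(X, y):
    # X: DataFrame of predictors, y: Series holding the criterion.
    rows = {}
    for col in X.columns:
        others = sm.add_constant(X.drop(columns=col))
        aux = sm.OLS(X[col], others).fit()          # regress predictor j on the rest
        tolerance = 1.0 - aux.rsquared              # tolerance = 1 - R^2_j
        part_r = np.corrcoef(y, aux.resid)[0, 1]    # part (semipartial) r = r(y, residualized predictor)
        rows[col] = {"tolerance": tolerance,
                     "VIF": 1.0 / tolerance,
                     "part_r": part_r}
    return pd.DataFrame(rows).T

The part correlation here is just the correlation of the criterion with the piece of each predictor that is independent of the other predictors, which is why it can stay sizeable even when tolerance is low.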

Is this interpretation correct?
 
Thank you in advance.
 
Andra

Andra-Florina Toader
MA Student, www.fpse.ro
E-mail: [hidden email]
Tel.:+40747314389/+40736478936
Re: Part correlations and multicollinearity?

Rich Ulrich
The trick here is "unique variance", and
what you need to read up on is "suppressor variables".

If you look, you will see that one of these variables
has a partial correlation that is in the opposite direction
from its zero-order correlation.  This happens whenever
the *difference* (or ratio) of the two correlated predictors
is itself predictive.  An easy clue that this is happening is
a standardized beta greater than 1.0.
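A minimal sketch of that check in Python (placeholder names X and y; statsmodels assumed available): compare each predictor's zero-order correlation with its standardized beta and flag sign reversals or |beta| > 1:

import numpy as np
import pandas as pd
import statsmodels.api as sm

def suppression_clues(X, y):
    Xz = (X - X.mean()) / X.std(ddof=1)    # z-score predictors
    yz = (y - y.mean()) / y.std(ddof=1)    # z-score criterion
    betas = sm.OLS(yz, Xz).fit().params    # coefficients are standardized betas
    zero_order = X.apply(lambda col: np.corrcoef(col, y)[0, 1])
    out = pd.DataFrame({"zero_order_r": zero_order, "std_beta": betas})
    # The standardized beta has the same sign as the partial correlation, so a
    # sign flip relative to the zero-order r suggests suppression, as does |beta| > 1.
    out["sign_flip"] = np.sign(out.zero_order_r) != np.sign(out.std_beta)
    out["beta_gt_1"] = out.std_beta.abs() > 1.0
    return out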

The best solution that I know of is to create a new variable,
a composite score that is some (weighted) difference or ratio
(or log-ratio) of the two correlated variables.  If there is
such a composite that makes particularly good sense for the
variables, use that version.
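A minimal sketch of that idea, with x1 and x2 standing in for the two collinear predictors (placeholder names): replace the pair with a "level" (mean) and a "contrast" (difference or log-ratio), which are typically much less correlated, and enter those in the regression instead:

import numpy as np

def level_and_contrast(x1, x2, log_ratio=False):
    level = (x1 + x2) / 2.0            # shared component of the two predictors
    if log_ratio:
        contrast = np.log(x1 / x2)     # log-ratio; both variables must be positive
    else:
        contrast = x1 - x2             # simple (unweighted) difference
    return level, contrast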

--
Rich Ulrich


Re: Part correlations and multicollinearity?

Anter
 
Thank you for the response. I understand the explanation, but when I checked, the partial and zero-order correlations are in the same direction and no standardized beta weights are greater than 1.
I've tried centering the variables and creating a composite score, but that would not be predictive. The variables refer to different constructs, and I don't think a combination of the two would keep the same meaning.
