Re: Multicollinearity in SPSS
Posted by Almost Done on Aug 06, 2012; 9:24pm
URL: http://spssx-discussion.165.s1.nabble.com/Multicollinearity-in-SPSS-tp1070043p5714600.html
Hey, guys! I have a problem that might seem easy to you, but it isn't for me. I'm doing research on creative advertising and need to check, for example, whether divergence (rated on a seven-point Likert scale), relevance (rated the same way), and the interaction between the two (divergence*relevance) have an effect on attention, which the respondents also rated on a seven-point Likert scale. When I run the regression, this is what I get:
                        B        t      sig.
Constant              .529     .649    .518
Divergence            .666    4.215    .000
Relevance             .573    2.275    .024
Divergence*Relevance -.091   -2.012    .046
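In case it helps, the syntax I'm running looks roughly like this (the variable names are just placeholders for the ones in my data file):

* Compute the interaction term from the two ratings.
COMPUTE div_x_rel = divergence * relevance.
EXECUTE.

* Regress attention on divergence, relevance and the interaction, with collinearity statistics (Tolerance/VIF) requested.
REGRESSION
  /STATISTICS COEFF OUTS R ANOVA COLLIN TOL
  /DEPENDENT attention
  /METHOD=ENTER divergence relevance div_x_rel.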
This seemed weird to me, because divergence*relevance has a negative effect on the dependent variable, attention. How can that be?
So I removed the Divergence*Relevance interaction, and this is what I got:
             B       t      sig.
Constant   1.892   4.113    .000
Divergence  .398   4.622    .000
Relevance   .090   1.167    .245
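For that second run I simply dropped the interaction term, roughly:

* Same regression without the interaction term.
REGRESSION
  /STATISTICS COEFF OUTS R ANOVA COLLIN TOL
  /DEPENDENT attention
  /METHOD=ENTER divergence relevance.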
So the B coefficients and significance levels changed drastically. I've tested for multicollinearity using the VIF. The combination where Divergence was the dependent variable (with Relevance and Divergence*Relevance as the independents) was the one where the VIF was greater than 5. All the other combinations were fine (VIF was either 1 or only slightly greater than 1).
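The VIF check was done with auxiliary regressions among the predictors, roughly like the following (again with placeholder names); this is the run that gave the high VIF:

* Auxiliary regression among the predictors: Divergence as the dependent, the other two terms as independents, with Tolerance/VIF requested.
REGRESSION
  /STATISTICS COEFF R TOL
  /DEPENDENT divergence
  /METHOD=ENTER relevance div_x_rel.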
So my question is: what does this mean, and how do I proceed? What do I have to do, and how should I explain it?