Suppressor variables in moderated multiple regression


Suppressor variables in moderated multiple regression

Kathryn Gardner
Dear List,

I've conducted moderated multiple regression analysis with the main effects
on steps 1 and 2 and product (interaction) terms on step 3.  After recently
reading & learning about suppressor variables, I examined the zero-order
correlations between all predictors (including interaction terms) and the
criterion variable & compared them to the regression model beta
coefficients for inconsistencies in sizes and signs.  I found that some
bivariate correlations between predictors & the criterion are non-
significant, but they are significant predictors in the regression
analysis. I have read that this may be a sign of classical suppression & I
was wondering if anyone could advise on:

a) whether this is a sign of suppression, & even if it is, what else
could these results suggest other than suppression?
b) the literature on suppressor variables suggests looking for
inconsistencies in signs and sizes between the bivariate correlations and
standardized regression coefficients (beta). However, I have read that only
the unstandardized coefficients (B) should be interpreted when interpreting
interaction effects.  Is it therefore OK to examine inconsistencies between
bivariate correlations and unstandardized coefficients and are there any
issues I need to be aware of?

Many thanks.

Kathryn

Re: Suppressor variables in moderated multiple regression

Keith McCormick
Hi All,

I think there is evidence of suppression here, but there are a number
of things I would check that you don't mention.  I don't know which
you have tried, so I will list them in response to 'a'.

If you have not centered the variables, you might want to do that.
That is, you subtract a variable's average from each of its values so
that zero becomes the average.  This is important when creating the
interactions and polynomials.
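For what it's worth, the centering step is easy to sketch in Python (toy numbers below, not from any real dataset):

```python
# Hypothetical toy data standing in for two predictors.
x1 = [2.0, 4.0, 6.0, 8.0]
x2 = [1.0, 3.0, 5.0, 7.0]

def center(values):
    """Subtract the variable's mean so that zero is the average."""
    m = sum(values) / len(values)
    return [v - m for v in values]

x1_c = center(x1)
x2_c = center(x2)

# Build the product (interaction) term from the *centered* variables.
interaction = [a * b for a, b in zip(x1_c, x2_c)]
```

The centered columns each sum to zero, which is what breaks most of the artificial correlation between the main effects and their product.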

Request the collinearity diagnostics.  Small tolerance values (below
.1) would indicate a problem and add to the evidence that suppression
is present.  Since you ran 3 steps, it would be interesting to see when
(if ever) the tolerance drops sharply.

VIF would also be a sign.  If the Variance Inflation Factor becomes
large (5+ or so), you might have suppression.  If you have not
centered and the VIF jumps on step three, then I would center and run
it again.  It might help a lot.
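As a back-of-the-envelope version of those numbers: tolerance is 1 minus the R-squared from regressing a predictor on the other predictors, and VIF is its reciprocal (hypothetical R-squared below):

```python
# Tolerance is 1 - R^2 from regressing a predictor on the other
# predictors; VIF is the reciprocal of tolerance.
def tolerance(r_squared):
    return 1.0 - r_squared

def vif(r_squared):
    return 1.0 / tolerance(r_squared)

# A hypothetical predictor sharing 80% of its variance with the others:
print(round(tolerance(0.80), 4))  # 0.2 -- low enough to worry about
print(round(vif(0.80), 4))        # 5.0 -- right at the "5+ or so" flag
```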

In answer to 'b', I don't see any harm in looking at the standardized
beta on the interactions to check for one detail.  See if the value
falls outside its normal range - that is, it shouldn't be above 1 or
below -1.  If it is, that would also indicate suppression.

HTH.  Good Luck.

Keith
keithmccormick.com


Python jump start suggestions

King Douglas
Folks,

  I’ve just upgraded to SPSS 14 and must put Python to good use as soon as possible—the purpose of the upgrade from SPSS 13 is to accomplish a pressing task that SPSS 14 and Python make eminently possible.

  I’m looking for your advice on tutorials, textbooks, manuals or encouraging words regarding the best way to learn Python.  I promise to follow all (good) advice.

  Regards,

  King Douglas
  American Airlines Customer Research

SPSS experts--interviews and photos: follow-up announcement

King Douglas
In reply to this post by Keith McCormick
Folks,

  For those who didn’t see the announcement I made last November, you may be interested in seeing the photos I took and reading the interviews I conducted with six SPSS experts as a project for SPSS Directions 2006 in Las Vegas, titled:

  Diverse Cultures/Uncommon Skills: Profiles of Six SPSS Experts

  Five of the six experts are regular contributors to this list.  The interviewees are (in alphabetical order by last name):

  Marta Garcia-Granero, Pamplona, Spain
  Art Kendall, Washington, D.C., U.S.A.
  Raynald Levesque, Montreal, Quebec, Canada
  Hector Maletta, Buenos Aires, Argentina
  Kirill Orlov (Russian SPSS savant), Moscow, Russia
  Richard Ristow, Providence, Rhode Island, U.S.A.

  The entire presentation, including video clips, examples of solutions, SPSS humor and more, is online at

  http://www.kingdouglas.com/spss.htm

  I think you'll enjoy it.

  Regards,

  King Douglas
  American Airlines Customer Research

Re: Suppressor variables in moderated multiple regression

statisticsdoc
In reply to this post by Kathryn Gardner
Stephen Brand
www.statisticsdoc.com

Kathryn,

I would suggest that you center the variables around their mean (by subtracting their mean) before you compute the cross-product interaction term.  Enter the centered variables and their cross-product into the regression.  This procedure will greatly reduce collinearity between the main effects and the cross-product interaction term.

It would be advisable to request diagnostic statistics for the regression and pay particular attention to the results that are reported when the interaction term is entered.  Include the keyword TOL on the requested statistics.  (If you have a number of additional predictors in the regression equation, you might also include COLLIN, but that is another discussion.)  TOL will show you tolerance - i.e., the proportion of variance in each predictor that is not predicted by the combination of the other predictors.  Small values indicate that the predictor is redundant.  I would be very concerned if the entry of the interaction term results in tolerance values below .1, and somewhat concerned if the tolerance falls below .2.  VIF is the reciprocal of tolerance.

You may have a scenario in which the correlations for the main effects are not significant, but their beta weights are when the interaction term is present in the equation, for reasons that are quite meaningful (and not due to collinearity).   The main effects of one independent variable may not be clear until the interaction between independent variables is taken into consideration.
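The two-predictor case makes this concrete: with standardized variables, the betas follow directly from the correlations, so you can see how a predictor with essentially zero correlation with the criterion can still earn a sizable beta.  A sketch with hypothetical correlations (the classical-suppression pattern):

```python
# Standardized betas for two predictors, written out from the
# correlations:  beta1 = (r_y1 - r_y2 * r_12) / (1 - r_12**2)
def betas(r_y1, r_y2, r_12):
    d = 1.0 - r_12 ** 2
    return (r_y1 - r_y2 * r_12) / d, (r_y2 - r_y1 * r_12) / d

# x1 has a zero correlation with the criterion, yet because it is
# correlated with x2 (r_12 = .6) it receives a sizable negative beta:
b1, b2 = betas(0.0, 0.5, 0.6)
print(round(b1, 3), round(b2, 3))  # -0.469 0.781
```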

HTH,

SB



--
For personalized and experienced consulting in statistics and research design, visit www.statisticsdoc.com

Re: Python jump start suggestions

Keith McCormick
In reply to this post by King Douglas
Hello All,

There is a lot I could suggest, but I will start with a couple of
brief suggestions:

a) The tutorial on python.org has a good reputation. I took a class,
so by the time I looked into it, I knew the basics.

b) http://www.spss.com/devcentral/ has just been revised and updated
with lots of links.

c) http://rmi.net/~lutz/  I took this Python expert's class and found
it very helpful, although the only things he knows about SPSS's new
features are what he has discussed with me.  His book is quite good, and
his class at $1000 for three days is a great deal.  The only problem is
that he does mostly onsite trainings, so his next public class is in
October.  I found it helpful for putting Python into context, since no
one else there is likely to know SPSS.  Hint: go ahead and buy Learning
Python, but if you want the more complete Programming Python, wait
until the new edition comes out in a month.  I have read the first, but
am waiting on the latter myself.

d) http://www.spss.com/spss/SPSS_programming_data_mgmt.pdf has Python
material, but only if you get the third edition.

Good luck!

Keith
keithmccormick.com


Re: Python jump start suggestions

King Douglas
Keith,

  I'll wait on the 3rd edition of Lutz's book, but will follow your other suggestions right away.

  Thanks very much,

  King Douglas




Re: Python jump start suggestions

Laura Berry, Dr.
King, Keith & others:

Okay, for those of us who are non-herpetologists, what is Python and why
will I almost certainly want it?

Thanks,
Laura

Laura Berry, Ed.D
Director of Student Success & Institutional Research
North Arkansas College
870.391-3280



Re: Python jump start suggestions

King Douglas
Laura,

  You'll find an excellent justification for learning and using Python in the Python section of the third edition of Raynald Levesque's SPSS Data Management book.  You can download it free in pdf from

http://www.spss.com/spss/data_management_book.htm


  King Douglas

"Laura Berry, Dr." <[hidden email]> wrote:

King, Keith & others:

Okay, for those of us who are non-herpetologists, what is Python and why
will I almost certainly want it?

Thanks,
Laura

Laura Berry, Ed.D
Director of Student Success & Institutional Research
North Arkansas College
870.391-3280


On 6/26/06, King Douglas wrote:
> Folks,
>
> I've just upgraded to SPSS 14 and must put Python to good use as soon
as possible-the purpose of the upgrade from SPSS 13 is to accomplish a
pressing task that SPSS 14 and Python make eminently possible.
>
> I'm looking for your advice on tutorials, textbooks, manuals or
encouraging words regarding the best way to learn Python. I promise to
follow all (good) advice.
>
> Regards,
>
> King Douglas
> American Airlines Customer Research
>

Re: Suppressor variables in moderated multiple regression

Kathryn Gardner
In reply to this post by Kathryn Gardner
Hi Keith,
 
Thanks a lot for your help. I have centered the variables that are used to create the product terms and have also examined the collinearity diagnostics. I ended up deleting one variable with a VIF of 11.90 and I now have more acceptable VIF levels. I still have many variables with non-significant zero-order correlations but significant B coefficients in the regression model, though, so perhaps they are suppressors. In fact, some of my VIFs are still over 5, so you may be right that they are suppressors.
 
Thank you for your advice regarding my second question which was also helpful.
Best wishes,
Kathryn




Re: Suppressor variables in moderated multiple regression

Kathryn Gardner
In reply to this post by Kathryn Gardner
Hi Stephen,
 
Thanks for your e-mail and suggestions. I have centered the variables that are used to create the product terms and have also examined the collinearity diagnostics. I ended up deleting one variable with a VIF of 11.90 and I now have more acceptable VIF levels. I'm going to go back and make a closer, more detailed inspection though, seeing as both you and someone else have advised examining VIF and tolerance to address this issue.
 
Thank you also for the following advice, which was useful: "The main effects of one independent variable may not be clear until the interaction between independent variables is taken into consideration".
 
Thanks again.
Kathryn




Re: Suppressor variables in moderated multiple regression

Keith McCormick
In reply to this post by Kathryn Gardner
HI Kathryn,

I am glad you made progress.  VIF and tolerance are often enough to do
the trick, but as Stephen mentioned there is also the Variance
Proportions table in the Collinearity Diagnostics.

Look for a row with a low eigenvalue (near zero), and then look across
that row for the columns that have high proportions (near one).  That
will tell you which variables are collinear with each other.  Deleting
variables is not the only choice: you could combine them in some way
(add or average them when it makes sense) or create factors using
factor analysis.
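That scan of the table can be spelled out in Python (the eigenvalues and proportions below are made up, just to show the shape of the check):

```python
# Scan a variance-proportions table: for each dimension with a
# near-zero eigenvalue, flag the variables whose variance proportion
# on that dimension is high (.50 or more).
eigenvalues = [2.8, 0.15, 0.05]
names = ["x1", "x2", "x1x2"]
proportions = [            # rows = dimensions, columns = x1, x2, x1x2
    [0.01, 0.02, 0.01],
    [0.10, 0.08, 0.04],
    [0.89, 0.90, 0.95],
]

collinear_sets = []
for eig, row in zip(eigenvalues, proportions):
    if eig < 0.1:  # near-zero eigenvalue
        collinear_sets.append([n for n, p in zip(names, row) if p >= 0.50])

print(collinear_sets)  # [['x1', 'x2', 'x1x2']]
```

Here all three variables load on the degenerate dimension, which is the pattern you would expect from an uncentered product term.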

Good luck,  Keith


Re: Suppressor variables in moderated multiple regression

Kathryn Gardner
In reply to this post by Kathryn Gardner
Hi Keith,
 
Thanks for the further advice. I've been looking at the collinearity diagnostics and was wondering how the condition index is used. You said "Look for rows with a low eigenvalue (near zero), and then across the row for the columns that have high proportions (near one).  That will tell you which variables are collinear with each other", but do I need to look at the condition index as well? I read that a condition index over 30 suggests serious collinearity problems and an index over 15 indicates possible collinearity problems, and that if a factor (component) has a high condition index, one looks in the variance proportions column to see if two or more variables have a variance proportion of .50 or higher. But that text makes no reference to the eigenvalues. Can you shed any light on how eigenvalues, condition indices and variance proportions are used together, or how they relate?
 
Thanks
Kathryn



> Date: Tue, 27 Jun 2006 11:04:36 -0400
> From: [hidden email]
> To: [hidden email]
> Subject: Re: Re: Suppressor variables in moderated multiple regression
> CC: [hidden email]
>
> Hi Kathryn,
>
> I am glad you made progress.  VIF and tolerance are often enough to do
> the trick, but as Stephen mentioned there is also the Variance
> Proportions table in the Collinearity Diagnostics.
>
> Look for the row with a low eigenvalue (near zero), and then look across
> the row for the columns that have high proportions (near one).  That
> will tell you which variables are collinear with each other.  Deleting
> variables is not the only choice; you could combine them in some way
> (add or average them when that makes sense) or create factors using
> factor analysis.
>
> Good luck,  Keith
>
> On 6/27/06, Kathryn Gardner <[hidden email]> wrote:
> >
> > Hi Keith,
> >
> > Thanks a lot for your help. I have centered my variables that are to
> > be used to create product terms and have also examined the
> > collinearity diagnostics. I ended up deleting one variable with a VIF
> > of 11.90 and I now have more acceptable VIF levels. I still have many
> > variables which have non-significant zero-order correlations but
> > significant B coefficients in the regression model, though, so perhaps
> > they are suppressors. Actually, some of my VIFs are still over 5, so
> > you may be right that they are suppressors.
> >
> > Thank you for your advice regarding my second question, which was
> > also helpful.
> >
> > Best wishes,
> > Kathryn
> >
> > > Date: Mon, 26 Jun 2006 13:01:57 -0400
> > > From: [hidden email]
> > > Subject: Re: Suppressor variables in moderated multiple regression
> > > To: [hidden email]
> > >
> > > Hi All,
> > >
> > > I think there is evidence of suppression here, but there are a
> > > number of things I would check that you don't mention.  I don't
> > > know which you have tried, so I will list them in response to 'a'.
> > >
> > > If you have not centered the variables, you might want to do that.
> > > That is, you subtract the average of a variable from itself so that
> > > zero is the average.  This is important when creating the
> > > interactions and polynomials.
> > >
> > > Request the collinearity diagnostics.  Small tolerance values
> > > (below .1) would indicate a problem and add to the evidence that
> > > suppression is present.  Since you ran 3 steps, it would be
> > > interesting to see when (if) the tolerance radically lowers.
> > >
> > > VIF would also be a sign.  If the Variance Inflation Factor becomes
> > > large (5+ or so), you might have suppression.  If you have not
> > > centered and the VIF jumps on step three, then I would center and
> > > run it again.  It might help a lot.
> > >
> > > In answer to 'b', I don't see any harm in looking at the
> > > standardized beta on the interactions to check for one detail.  See
> > > if the value falls outside its normal range - that is, it shouldn't
> > > be above 1 or below -1.  If it is, that would also indicate
> > > suppression.
> > >
> > > HTH.  Good luck.
> > >
> > > Keith
> > > keithmccormick.com
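The two checks discussed above, centering the predictors before forming product terms and then watching the VIF, can be sketched outside SPSS. A minimal Python illustration (my own invented data and variable names, not SPSS output):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(50, 10, n)   # made-up raw predictor
x2 = rng.normal(30, 5, n)    # made-up raw predictor

# Centering: subtract the mean of a variable from itself so zero is the average
x1c, x2c = x1 - x1.mean(), x2 - x2.mean()

def vif(target, others):
    """Variance inflation factor of `target` given the other predictors:
    regress target on the others (plus an intercept) and return 1/(1 - R^2)."""
    X = np.column_stack([np.ones(len(target))] + list(others))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return target.var() / resid.var()   # algebraically equal to 1 / (1 - R^2)

vif_raw = vif(x1 * x2, [x1, x2])           # uncentered product term: inflated
vif_centered = vif(x1c * x2c, [x1c, x2c])  # centered product term: near 1
print(f"uncentered product VIF: {vif_raw:.1f}, centered: {vif_centered:.2f}")
```

With raw scores the product term is nearly a linear function of its components, so its VIF is large; after centering it is close to 1, which is why the step-3 VIF jump Keith mentions often disappears.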
Reply | Threaded
Open this post in threaded view
|

Re: Suppressor variables in moderated multiple regression

Keith McCormick
Hi Kathryn,

Sorry, I have not been on the list for the last several days.

The condition index will be high for the same rows where the
eigenvalues are low.  You could focus on either one.  In those same
rows, where the variance proportions are high, those are the variable
or variables that are potential problems.

For instance, I just ran MPG as the dependent and engine displacement,
horsepower, vehicle weight, and time to accelerate as predictors, using
the CARS.sav data set in your installation directory.  Two rows have low
eigenvalues.  One indicates a possible problem pair: engine displacement
and time to accelerate; the other shows engine displacement and weight
as a possible problem pair.  The former has a condition index of 27, the
latter 36.  I would hate to think anyone would treat these differently
because they fall on either side of 30.  There are no hard and fast
rules.  In this case, it is pretty clear that they are versions of the
same thing, so I might drop some.  That would not always be the best route.

Searching for a good phrase to "cut and paste" for the condition index,
I took the following from the Results Coach: "Condition indices are
the square roots of the ratios of the largest eigenvalue to each
successive eigenvalue."  The help is brief, and sometimes basic, but
it is a resource worth remembering.  Just put any pivot table in
editing mode, and right-click on it to get the Results Coach.

Hope that helps despite the delay,

Best, Keith
keithmccormick.com


On 6/29/06, Kathryn Gardner <[hidden email]> wrote:

> Hi Keith,
>
> Thanks for the further advice. I've been looking at the collinearity diagnostics and I was wondering how the condition index is used. You said, "Look for the row with a low eigenvalue (near zero), and then across the row for the columns that have high proportions (near one). That will tell you which variables are collinear with each other." But do I need to look at the condition index as well? I have read that a condition index over 30 suggests serious collinearity problems, and an index over 15 indicates possible collinearity problems. If a factor (component) has a high condition index, one looks in the variance proportions columns to see whether two or more variables have a variance proportion of .50 or higher. But this text makes no reference to the eigenvalues. Can you shed any light on how eigenvalues, condition indices and variance proportions are used together, or how they relate?
>
> Thanks
> Kathryn
>
>
>
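The Results Coach definition Keith quotes, together with the variance-proportions reading he describes, can be computed directly. A Python sketch (my own illustration on invented data; it assumes the Belsley-Kuh-Welsch convention of diagnosing the scaled, uncentered cross-products matrix, including the constant, which is my understanding of what SPSS reports):

```python
import numpy as np

def collinearity_diagnostics(X):
    """Eigenvalues, condition indices, and variance-decomposition
    proportions of the column-scaled, uncentered cross-products matrix."""
    Z = X / np.linalg.norm(X, axis=0)         # scale each column to unit length
    eigvals, V = np.linalg.eigh(Z.T @ Z)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]         # reorder: largest eigenvalue first
    eigvals, V = eigvals[order], V[:, order]
    ci = np.sqrt(eigvals[0] / eigvals)        # sqrt(lambda_max / lambda_k)
    # phi[j, k]: contribution of eigenvalue k to var(b_j); normalizing each
    # row gives the variance proportions shown in the SPSS-style table
    phi = (V ** 2) / eigvals
    props = phi / phi.sum(axis=1, keepdims=True)
    return eigvals, ci, props

rng = np.random.default_rng(0)
n = 100
a = rng.normal(size=n)
b = a + rng.normal(scale=0.05, size=n)        # nearly collinear with a
X = np.column_stack([np.ones(n), a, b])       # constant, a, b
eigvals, ci, props = collinearity_diagnostics(X)
```

Reading the output follows Keith's recipe: the row with the near-zero eigenvalue has the largest condition index, and in that same row both `a` and `b` carry variance proportions near one, flagging them as the problem pair.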
Reply | Threaded
Open this post in threaded view
|

Re: Suppressor variables in moderated multiple regression

statisticsdoc
In reply to this post by Kathryn Gardner
Stephen Brand
www.statisticsdoc.com

Kathryn,

Keith's advice here is very good.  If you have pairs of predictors with high condition indices / low eigenvalues, one thing that I might consider is combining both of the variables in the problem pair into an additive composite (if that makes sense in the context of your research).

HTH,

Stephen Brand
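One way to read Stephen's suggestion is to average the standardized members of the problem pair. A hedged Python sketch (my own convention; z-scoring before averaging is an assumption on my part, and the variable names are invented):

```python
import numpy as np

def additive_composite(*columns):
    """Average of z-scored variables: one simple way to fold a collinear
    pair into a single predictor. Equal weighting is an assumption;
    unit-weighted sums of raw scores are another common choice."""
    zs = [(c - c.mean()) / c.std(ddof=1) for c in columns]
    return np.mean(zs, axis=0)

rng = np.random.default_rng(1)
n = 150
engine = rng.normal(200, 40, n)               # hypothetical predictor
weight = 10 * engine + rng.normal(0, 100, n)  # nearly collinear twin
combo = additive_composite(engine, weight)    # single composite predictor
```

The composite carries essentially the same information as either member of the pair but enters the regression as one column, so the near-zero eigenvalue (and its huge condition index) disappears.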


--
For personalized and experienced consulting in statistics and research design, visit www.statisticsdoc.com
Reply | Threaded
Open this post in threaded view
|

Re: Suppressor variables in moderated multiple regression

Kathryn Gardner
In reply to this post by Kathryn Gardner
Hi Keith,
 
Thanks for getting back to me and explaining how condition indices, eigenvalues and variance proportions work. I think I've got it now :-) Your example was really helpful too. You're correct to say that "There are no hard and fast rules." I've read in some books that condition indices over 30 are problematic, whilst others suggest that condition indices up to 50 are only moderately problematic, with condition indices up to 100 being serious.
 
Thanks again Keith for your much appreciated time and help.
Kathryn



Reply | Threaded
Open this post in threaded view
|

Re: Suppressor variables in moderated multiple regression

Kathryn Gardner
In reply to this post by Kathryn Gardner
Hi Stephen,
 
Thanks a lot for your e-mail and advice.
 
I have considered making composites of variables as opposed to deleting variables, but it's often difficult to know which technique is most appropriate. Also, with interaction/product terms I wasn't sure whether these should simply be summed, or whether the two interaction terms would have to be combined into one variable using some other technique. I was also unsure about summing scales that have negative relationships with the DV with those that have positive relationships, i.e., is this OK, or are there issues I need to be aware of? Finally, I found that when I did sum two interaction terms, they no longer predicted the criterion variable in my regression models, so I decided to delete one variable instead.
 
Thanks again
Kathryn
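The pattern running through this thread, predictors with non-significant zero-order correlations that are nonetheless significant in the joint model, is the signature of classical suppression, and it is easy to reproduce with simulated data. A Python sketch (entirely made-up data; the suppressor is constructed to share only nuisance variance with the other predictor):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
t = rng.normal(size=n)                 # what the criterion actually measures
e = rng.normal(size=n)                 # nuisance variance
x1 = t + e                             # valid predictor contaminated by nuisance
x2 = e + 0.3 * rng.normal(size=n)      # suppressor: taps only the nuisance
y = t + rng.normal(scale=0.5, size=n)  # criterion

# Zero-order correlation of the suppressor with the criterion is near zero...
r_x2y = np.corrcoef(x2, y)[0, 1]

# ...yet in the joint model it earns a clearly non-zero (negative) coefficient,
# because it strips nuisance variance out of x1
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]   # [intercept, b_x1, b_x2]
```

Including the suppressor also pushes the coefficient on `x1` above its zero-order correlation with `y`, which is the size inconsistency the suppression literature says to look for.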



> Date: Wed, 5 Jul 2006 12:07:25 -0400> From: [hidden email]> Subject: Re: Suppressor variables in moderated multiple regression> To: [hidden email]> > Stephen Brand> www.statisticsdoc.com> > Kathryn,> > Keith's advice here is very good.  If you have pairs of predictors with high condition indices / low eigenvalues /  - one thing that I might consider in your research is combined both of the items in the problem pair into an additive composite (if that makes sense in the context of your research).> > HTH,> > Stephen Brand> > ---- Keith McCormick <[hidden email]> wrote:> > Hi Kathryn,> >> > Sorry, I have not been on the list for the last several days.> >> > The condition index will be high for the same rows that the> > eigenvalues are low.  You could focus on either one.  In those same> > rows, when the variance proportions are high those are the variables> > or variables that are potential problems.> >> > For instance, I just ran MPG as the dependent and engine displacement,> > horsepower, vehicle weight, and time to accelerate using the CARS.sav> > data set which is in your installation directory.  Two rows have low> > eigenvalues.  One indicates a possible problem pair engine, and time> > to acc.; the other shows engine displacement and weight as a possible> > problem pair. The former has an condition index of 27, the later 36.> > I would hate to think anyone would treat these differently because> > they are on either side of 30.  There are no hard and fast rules.  In> > this case, it is pretty clear that they are versions of the same> > thing, so I might drop some.  That would not always be the best route.> >> > Seaching for a good phrase to "cut and paste" for condition index, I> > cut the following from the "result's coach". "Condition indices are> > the square roots of the ratios of the largest eigenvalue to each> > successive eigenvalue."  The help is brief, and sometimes basic, but> > it is a resource worth remembering.  
Just put any pivot table in> > editiing mode, and right click on it to get the result's coach.> >> > Hope that helps despite the delay,> >> > Best, Keith> > keithmccormick.com> >> >> > On 6/29/06, Kathryn Gardner <[hidden email]> wrote:> > > Hi Keith,> > >> > > Thanks for the further advice. I've been looking at collinearity diagnostics and I was wondering how the condition index is used? You said "Look for rows with a low eigenvalue (near zero), and then across the row for the columns that have high proportions (near one).  That will tell you which variables are collinear with each other". But I was wondering whether I need to look at the condition index as well? I read that a condition index over 30 suggests serious collinearity problems and an index over 15 indicates possible collinearity problems. If a factor (component) has a high condition index, one looks in the variance proportions column to see if two or more variables have a variance partition of .50 or higher. But this text makes no reference to the eigenvalues. Can you shed any light on how eigenvalues, condition indices and variance portions are used all together or relate?> > >> > > Thanks> > > Kathryn> > >> > >> > >> > > > Date: Tue, 27 Jun 2006 11:04:36 -0400> From: [hidden email]> To: [hidden email]> Subject: Re: Re: Suppressor variables in moderated multiple regression> CC: [hidden email]> > HI Kathryn,> > I am glad you made progress.  VIF and tolerance are often enough to do> the trick, but as Stephen mentioned there is also the Variance> Proportions table in the Collinearity Diagnostics.> > Look for row with a low eigenvalue (near zero), and then across the> row for the columns that have high proportions (near one).  That will> tell you which variables are collinear with each other.  
Deleting> variables is not the only choice, you could combine them in some way> (add them or average them when it makes sense) or create factors using> factor analysis.> > Good luck,  Keith> > On 6/27/06, Kathryn Gardner <[hidden email]> wrote:> >> >> > Hi Keith,> >> > Thanks a lot for your help. I have centered my variables that are to be used> > to create product term!>  s !> >  and have also examined the collinearity diagnostics.> > I ended up deleting one variable with a VIF of 11.90 and I now have more> > acceptable VIF levels. I still have many variables which have non-sig zero> > order correlations but sig B coefficients in the regression model though, so> > perhaps they are suppressors. Actually some of my VIFs are still over 5 so> > you may be right in that they are suppressors.> >> > Thank you for your advice regarding my second question which was also> > helpful.> > Best wishes,> > Kathryn> >> >> > ________________________________> >> > > Date: Mon, 26 Jun 2006 13:01:57 -0400> > > From: [hidden email]> > > Subject: Re: Suppressor variables in moderated multiple regression> > > To: [hidden email]> >> > >> > > Hi All,> > >> > > I think there is evidence of suppression here, but> > there are a number> > > of things I would check that you don't mention.  I> > don't know which> > > you have tried, so I will list them in respon!>  se!> >   to 'a'.> > >> > > If you have not centered the variables, you might w> > ant> > to do that.> > > That is, you subtract the average of a variable> > from itself so that> > > zero is the average.  This is important when creating> > the interactions> > > and polynomials.> > >> > > Request the collinearity diagnostics.  Small> > tolerance values (below> > > .1) would indicate a problem and add to the evidence> > that suppression> > > is present.  Since you ran 3 steps it would be> > interesting to see when> > > (if) the tolerance radically lowers.> > >> > > VIF would also be a sign.  
> > > > If the Variance Inflation Factor becomes large, you might have suppression (5+ or so). If you have not centered and the VIF jumps on step three, then I would center and run it again. It might help a lot.
> > > >
> > > > In answer to 'b', I don't see any harm in looking at the standardized beta on the interactions to check for one detail. See if the value falls outside its normal range - that is, it shouldn't be above 1 or below -1. If it is, that would also indicate suppression.
> > > >
> > > > HTH. Good luck.
> > > >
> > > > Keith
> > > > keithmccormick.com
> > > >
> > > > On 6/26/06, Kathryn Gardner <[hidden email]> wrote:
> > > > > Dear List,
> > > > >
> > > > > I've conducted moderated multiple regression analysis with the main effects on steps 1 and 2 and product (interaction) terms on step 3. After recently reading & learning about suppressor variables, I examined the zero-order correlations between all predictors (including interaction terms) and the criterion variable & compared them to the regression model beta coefficients for inconsistencies in sizes and signs. I found that some bivariate correlations between predictors & the criterion are non-significant, but they are significant predictors in the regression analysis. I have read that this may be a sign of classical suppression & I was wondering if anyone could advise on:
> > > > >
> > > > > a) whether this is a sign of suppression, & even if it is, what else could these results suggest other than suppression?
> > > > > b) the literature on suppressor variables suggests looking for inconsistencies in signs and sizes between the bivariate correlations and standardized regression coefficients (beta). However, I have read that only the unstandardized coefficients (B) should be interpreted when interpreting interaction effects.
> > > > > Is it therefore OK to examine inconsistencies between bivariate correlations and unstandardized coefficients, and are there any issues I need to be aware of?
> > > > >
> > > > > Many thanks.
> > > > >
> > > > > Kathryn
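The relationship Kathryn asks about between eigenvalues, condition indices, and variance proportions can be sketched numerically. The following is a minimal illustration on simulated data (the variables and seed are invented for illustration): the condition index of each dimension is the largest singular value divided by that dimension's singular value, and the variance proportions decompose each coefficient's variance across those dimensions, following the Belsley-Kuh-Welsch construction that SPSS's collinearity diagnostics are based on.

```python
# Sketch of Belsley-Kuh-Welsch collinearity diagnostics on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)

# Design matrix with intercept; each column scaled to unit length
# (scaled but not centered, per the Belsley approach).
X = np.column_stack([np.ones(n), x1, x2, x3])
Xs = X / np.linalg.norm(X, axis=0)

U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
eigenvalues = s**2                 # eigenvalues of Xs'Xs, largest first
cond_index = s.max() / s           # condition indices (15+ possible, 30+ serious)

# Variance-decomposition proportions: phi[j, k] = (v_jk / s_k)^2, then
# normalize each coefficient's row so the proportions sum to one.
phi = (Vt.T / s) ** 2
props = phi / phi.sum(axis=1, keepdims=True)

print("eigenvalues:", np.round(eigenvalues, 4))
print("condition indices:", np.round(cond_index, 1))
print("variance proportions:\n", np.round(props, 2))
```

Reading it the way Keith describes: the near-dependency between x1 and x2 produces one small eigenvalue, whose dimension carries a high condition index, and the x1 and x2 coefficients both load high variance proportions on that same dimension. Eigenvalues, condition indices, and variance proportions are thus three views of the same decomposition, not separate tests.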
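Keith's advice to mean-center before forming product terms can also be illustrated. A minimal sketch on simulated data (the `vif` helper and the predictors are assumptions for illustration): the VIF of a raw product term built from variables with nonzero means is typically large, and centering first usually deflates it.

```python
# Effect of mean-centering on the VIF of an interaction (product) term.
import numpy as np

def vif(X, j):
    """VIF of column j: 1 / (1 - R^2) from regressing column j
    on the remaining columns plus an intercept."""
    y = X[:, j]
    A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    return 1 / (1 - r2)

rng = np.random.default_rng(1)
n = 300
a = rng.normal(loc=5, size=n)   # nonzero means make the raw product collinear
b = rng.normal(loc=5, size=n)

raw = np.column_stack([a, b, a * b])
ac, bc = a - a.mean(), b - b.mean()
centered = np.column_stack([ac, bc, ac * bc])

print("VIF of raw product:     ", round(vif(raw, 2), 1))
print("VIF of centered product:", round(vif(centered, 2), 1))
```

Centering does not change the interaction's unstandardized coefficient or its test in the full model; it only removes the artificial correlation between the product term and its components, which is why the VIF drops.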
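Finally, the suppression pattern itself, a predictor whose zero-order correlation with the criterion is near zero yet whose beta is substantial, with the other predictor's standardized beta pushed above 1, can be reproduced on simulated data. This is a hedged sketch: the data-generating setup (a suppressor that taps only the error part of the other predictor) is invented to produce classical suppression, not taken from the thread.

```python
# Classical suppression on simulated data: x2 correlates with x1's error
# but not with the criterion, so r(x2, y) is near zero while its beta is not.
import numpy as np

rng = np.random.default_rng(2)
n = 500
signal = rng.normal(size=n)
noise = rng.normal(size=n)
x1 = signal + noise                      # criterion-relevant, but noisy
x2 = noise + 0.3 * rng.normal(size=n)    # suppressor: taps x1's noise only
y = signal + 0.5 * rng.normal(size=n)

def zscore(v):
    return (v - v.mean()) / v.std()

# Standardized regression: betas from z-scored variables, no intercept needed.
Z = np.column_stack([zscore(x1), zscore(x2)])
beta, *_ = np.linalg.lstsq(Z, zscore(y), rcond=None)

r_zero = np.array([np.corrcoef(x1, y)[0, 1], np.corrcoef(x2, y)[0, 1]])
print("zero-order r with y:", np.round(r_zero, 2))
print("standardized betas: ", np.round(beta, 2))
```

The pattern matches both signs Keith mentions: x2's zero-order correlation is near zero but its beta is strongly negative (it suppresses x1's error variance), and x1's standardized beta exceeds 1, which a correlation never can.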