comparing two partial regression slopes within the same equation


comparing two partial regression slopes within the same equation

msherman

Dear List:  I have tried to find sources that provide a means of testing two partial regression coefficients within the same regression equation.  The best I have been able to find is the following, but I cannot find a reference for it.  This is what I think can be used to test the two partial slopes:

   t = (B1 - B2) / Sqrt(SEb1^2 + SEb2^2)

Does anyone know where I could find a reference documenting this? Thanks,

Martin F. Sherman, Ph.D.

Professor of Psychology

Director of Masters Education: Thesis Track
Loyola College of Arts and Sciences

 


Re: comparing two partial regression slopes within the same equation

Mike
If memory serves, I believe one of the editions of Kleinbaum's regression
texts has something on this, at least for testing correlated regression
coefficients.  Consider looking at:

Applied regression analysis and other multivariable methods
 By David G. Kleinbaum, Lawrence L. Kupper, Keith E. Muller

I think that the 4th edition is the latest.  Here is the entry on WorldCat:
 
 
-Mike Palij
New York University




----- Original Message -----
From: Martin Sherman
To: [hidden email]
Sent: Friday, March 26, 2010 3:32 PM
Subject: comparing two partial regression slopes within the same equation



Re: comparing two partial regression slopes within the same equation

Bruce Weaver
Administrator
In reply to this post by msherman
See posts 1 and 2 here:

http://groups.google.ca/group/sci.stat.edu/browse_frm/thread/766fbb9b4acd80f1/5707c3f7f9fb8615?hl=en&q=Ray+Koopman+compare+coefficients#5707c3f7f9fb8615


--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).

Re: comparing two partial regression slopes within the same equation

Kornbrot, Diana
In reply to this post by msherman
Create a model with the continuous variable, c; the group factor, f; and an f*c interaction.
A significant f*c interaction then implies that the slopes are different [at the chosen alpha].
If conducted with SPSS, the f*c parameter value is the difference in slope.
Alternatively, my preferred option: report the slope and intercept for each group separately.

Your solution is, in my view, equivalent to the above.
Try the SPSS manual [or another stats package] for refs. The procedure is known as ANCOVA and is a GLM (general linear model).
Google suggested Wikipedia, which in turn suggests STATSOFT http://www.statsoft.com/textbook/general-linear-models/
- in my view one of the best on-line stats resources.
http://udel.edu/~mcdonald/statancova.html is also good, with excellent biological examples, a useful example of graphic presentation, and the following suggested refs:
Sokal, R.R., and F.J. Rohlf. 1995. Biometry: The principles and practice of statistics in biological research. 3rd edition. W.H. Freeman, New York.
Zar, J.H. 1999. Biostatistical analysis. 4th edition. Prentice Hall, Upper Saddle River, NJ.

Best
diana
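
The interaction model described above can be sketched as follows. This is my own illustration with simulated data in Python rather than SPSS syntax (variable names are hypothetical); the t test on the f*c coefficient is the test that the two group slopes differ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
c = rng.normal(size=n)                         # continuous covariate
f = rng.integers(0, 2, size=n).astype(float)   # group factor as 0/1 dummy
# True slopes: 1.0 in group 0, 1.5 in group 1, so the interaction is 0.5
y = 2.0 + 1.0*c + 0.3*f + 0.5*f*c + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), c, f, f*c])   # intercept, c, f, f*c
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])      # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)          # coefficient covariances
t_interaction = beta[3] / np.sqrt(cov[3, 3])   # tests the slope difference
```

The f*c coefficient (beta[3]) estimates the difference in slope between the two groups, as noted above.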


Professor Diana Kornbrot
  email:  d.e.kornbrot@...
  web:    http://web.mac.com/kornbrot/iweb/KornbrotHome.html
Work
School of Psychology
University of Hertfordshire
College Lane, Hatfield, Hertfordshire AL10 9AB, UK
    voice:     +44 (0) 170 728 4626
    mobile:   +44 (0) 796 890 2102
    fax          +44 (0) 170 728 5073
Home
19 Elmhurst Avenue
London N2 0LT, UK
   landline: +44 (0) 208 883 3657
   mobile:   +44 (0) 796 890 2102
   fax:         +44 (0) 870 706 4997






Re: comparing two partial regression slopes within the same equation

Bruce Weaver
Administrator
Martin, Diana's solution below assumes you are looking at the linear relationship between X and Y in two independent groups, and want to know if the slopes differ significantly for those two groups.  The solution I pointed you to, on the other hand, assumed you have this equation:

   Y = b0 + b1X1 + b2X2 + error

And that you want to test the null hypothesis that b1 = b2.  Please clarify which it is.



Re: comparing two partial regression slopes within the same equation

Kornbrot, Diana
Apologies to all.
My response did NOT answer Martin's question.
As I now understand it, he has two continuous predictors of a single continuous outcome. He can fit a linear model with partial regression coefficients for each predictor. He also has PARTIAL correlations for each one separately and wants to know whether a numerical difference is statistically significant.

My approach would be to test whether the model with two variables accounts for SIGNIFICANTLY more variance than the best model with only one variable. If yes, then that's the model to go with. If no, then use the simple model with the best independent predictor, since it will include the effect of the other correlated predictor.

One might also try an SEM with a hidden variable contributing to both observed predictor variables, but that's getting complicated.
Best
Diana
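
The model-comparison suggestion above amounts to a hierarchical (partial) F test: compare the residual sums of squares of the one-predictor and two-predictor models. A minimal sketch with my own simulated data and hypothetical variable names:

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

rng = np.random.default_rng(1)
n = 150
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8*x1 + 0.5*x2 + rng.normal(scale=0.6, size=n)

ones = np.ones(n)
rss_one = rss(np.column_stack([ones, x1]), y)      # single best predictor
rss_two = rss(np.column_stack([ones, x1, x2]), y)  # both predictors

# Partial F with 1 and n-3 df: extra variance explained by adding x2
F = (rss_one - rss_two) / (rss_two / (n - 3))
```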




Re: comparing two partial regression slopes within the same equation

msherman
In reply to this post by Bruce Weaver
Bruce:  Yes, you are correct.  I have a single regression with two predictor variables.  Both predictors are significant, and I want to know if they are significantly different from one another.  I am looking for an explicit test to show that they are in fact statistically different.  Thanks, mfs
Martin F. Sherman, Ph.D.
Professor of Psychology
Director of Masters Education: Thesis Track
Loyola College of Arts and Sciences

Loyola University Maryland
4501 North Charles Street
222 B Beatty Hall
Baltimore, MD 21210-2601

410-617-2417 office
410-617-5341 fax

[hidden email]

www.loyola.edu



=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD

Re: comparing two partial regression slopes within the same equation

Bruce Weaver
Administrator
In reply to this post by Kornbrot, Diana
Diana, what don't you like about Ray Koopman's suggestion (post # 2 at the link given below)?

http://groups.google.ca/group/sci.stat.edu/browse_frm/thread/766fbb9b4acd80f1/5707c3f7f9fb8615?hl=en&q=Ray+Koopman+compare+coefficients#5707c3f7f9fb8615

I've not tried Koopman's method, but think it would give the same result as the following:

Var(b1) = SE(b1)^2
Var(b2) = SE(b2)^2
Var(b1-b2) = Var(b1) + Var(b2) - 2*Cov(b1,b2)
           = Var(b1) + Var(b2) - 2*Corr(b1,b2)*SE(b1)*SE(b2)

SE(b1-b2) = SQRT[Var(b1-b2)]

t = (b1-b2) / SE(b1-b2)

Note that the covariance term was omitted from the SE formula in the original post.

Bruce
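
The formulas above can be checked numerically. In the sketch below (my own illustration with simulated, correlated predictors), Cov(b1,b2) comes from the usual OLS coefficient covariance matrix, sigma^2 * inv(X'X):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
x1 = rng.normal(size=n)
x2 = 0.6*x1 + rng.normal(scale=0.8, size=n)   # correlated with x1
y = 1.0 + 0.9*x1 + 0.4*x2 + rng.normal(scale=0.7, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - 3)
cov = sigma2 * np.linalg.inv(X.T @ X)   # covariance matrix of (b0, b1, b2)

var_diff = cov[1, 1] + cov[2, 2] - 2*cov[1, 2]   # Var(b1 - b2)
t = (beta[1] - beta[2]) / np.sqrt(var_diff)
```

The same variance can be obtained as a contrast, c = (0, 1, -1), via c' * cov * c.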


Re: comparing two partial regression slopes within the same equation

statisticsdoc
In reply to this post by msherman
Martin,
You may find the following paper helpful for the purpose of testing the significance of the difference between two beta weights in the same equation:
Httner et al. (1995). Educational and Psychological Measurement, 55(5), 777-784.
Best,
Steve Brand
www.StatisticsDoc.com


Re: comparing two partial regression slopes within the same equation

Swank, Paul R
In reply to this post by Bruce Weaver
There are two ways to do this. First, you fit the full model y = b1*x1 + b2*x2 + b0. Under the null hypothesis b1 = b2, this reduces to y = b1*x1 + b1*x2 + b0 = b1*(x1 + x2) + b0.

Fit that reduced model too, and compare the error sums of squares (or R-squareds) of the two models with a partial F test. Or you use Koopman's method: fit y = c1*(x1 + x2) + c2*(x1 - x2) + b0, which expands to y = (c1 + c2)*x1 + (c1 - c2)*x2 + b0. So c1 + c2 = b1, c1 - c2 = b2, and hence c2 = (b1 - b2)/2; testing c2 = 0 therefore tests b1 = b2.

Both methods lead to the same F test between coefficients.

Paul

Dr. Paul R. Swank,
Professor and Director of Research
Children's Learning Institute
University of Texas Health Science Center-Houston
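Paul's first approach (the partial F test) can be sketched in a few lines. This is a minimal numpy illustration on simulated data, not SPSS syntax; the variable names, sample size, and coefficient values are illustrative, not from the thread:

```python
# Partial F test for H0: b1 = b2 in y = b0 + b1*x1 + b2*x2 + e.
# Fit the full model and the reduced model with the constraint imposed,
# then compare their residual sums of squares.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(0, 2, n)
x2 = rng.normal(0, 2, n)
y = 2 * x1 + 4 * x2 + rng.normal(0, 2, n)   # true b1 = 2, b2 = 4

def sse(cols, y):
    """Residual sum of squares from an OLS fit of y on the given columns
    (an intercept is added automatically)."""
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

sse_full = sse([x1, x2], y)        # y = b0 + b1*x1 + b2*x2
sse_reduced = sse([x1 + x2], y)    # y = b0 + b1*(x1 + x2), i.e. b1 = b2 imposed

df_full = n - 3                    # n minus 3 estimated coefficients
F = (sse_reduced - sse_full) / (sse_full / df_full)   # 1 numerator df
print(F)
```

With 1 numerator degree of freedom this F is the square of the t statistic for the difference term in Koopman's reparameterised model, which is why, as Paul says, both methods lead to the same test.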


-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of Bruce Weaver
Sent: Saturday, March 27, 2010 3:18 PM
To: [hidden email]
Subject: Re: comparing two partial regression slopes within the same equation

Diana, what don't you like about Ray Koopman's suggestion (post # 2 at the
link given below)?

http://groups.google.ca/group/sci.stat.edu/browse_frm/thread/766fbb9b4acd80f1/5707c3f7f9fb8615?hl=en&q=Ray+Koopman+compare+coefficients#5707c3f7f9fb8615

I've not tried Koopman's method, but think it would give the same result as
the following:

Var(b1) = SE(b1)^2
Var(b2) = SE(b2)^2
Var(b1-b2) = Var(b1) + Var(b2) - 2*Cov(b1,b2)
           = Var(b1) + Var(b2) - 2*Corr(b1,b2)*SE(b1)*SE(b2)

SE(b1-b2) = SQRT[Var(b1-b2)]

t = (b1-b2) / SE(b1-b2)

Note that the covariance term was omitted from the SE formula in the
original post.

Bruce
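Bruce's covariance-based t can be checked numerically against Koopman's sum-and-difference model. A pure-numpy sketch on simulated data (correlated predictors, so the covariance term matters; all names and numbers are illustrative):

```python
# Compare t = (b1 - b2) / SE(b1 - b2), with the covariance term included,
# against the t for the (x1 - x2) coefficient in Koopman's reparameterisation.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(0, 2, n)
x2 = 0.5 * x1 + rng.normal(0, 2, n)   # correlated predictors: Cov(b1,b2) != 0
y = 2 * x1 + 4 * x2 + rng.normal(0, 2, n)

def ols(cols, y):
    """Return OLS coefficients and their covariance matrix (intercept first)."""
    X = np.column_stack([np.ones(len(y)), *cols])
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    return beta, s2 * XtX_inv

# Direct approach: difference of coefficients over its standard error.
beta, cov = ols([x1, x2], y)
b1, b2 = beta[1], beta[2]
se_diff = np.sqrt(cov[1, 1] + cov[2, 2] - 2 * cov[1, 2])
t_direct = (b1 - b2) / se_diff

# Koopman: refit on (x1+x2, x1-x2); the t for the difference term is the test.
beta_k, cov_k = ols([x1 + x2, x1 - x2], y)
t_koopman = beta_k[2] / np.sqrt(cov_k[2, 2])

print(t_direct, t_koopman)
```

Because the sum-and-difference model is just an invertible linear reparameterisation of the original design matrix, the two t statistics agree exactly (up to floating point), confirming Bruce's conjecture.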


kornbrot wrote:

>
> Apologies to all
> MY response did NOT answer Martin’s question
> As I now understand it, he has two continuous predictors of a single
> continuous outcome. He can get a linear model with simple regression
> coefficients for each predictor. He also has PARTIAL correlations for each
> one separately and wants to know if a numerical difference is statistically
> significant.
>
> My approach would be to test whether the model with 2 vars accounts for
> SIGNIFICANTLY more variance than the best model with only one var. If yes,
> then that’s the model to go with. If no, then use the simple model with the
> best independent predictor, since it will include the effect of the other
> correlated predictor.
>
> One might also try a SEM with a hidden variable contributing to both the
> observed predictor variables, but that’s getting complicated
> Best
> Diana
>
>
> On 27/03/2010 14:59, "Bruce Weaver" <[hidden email]> wrote:
>
>> Martin, Diana's solution below assumes you are looking at the linear
>> relationship between X and Y in two independent groups, and want to know
>> if
>> the slopes differ significantly for those two groups.  The solution I
>> pointed you to, on the other hand, assumed you have this equation:
>>
>>    Y = b0 + b1X1 + b2X2 + error
>>
>> And that you want to test the null hypothesis that b1 = b2.  Please
>> clarify
>> which it is.
>>
>>
>>
>> kornbrot wrote:
>>> >
>>> > create a model with the continuous variable, c; the group factor, f; and
>>> > an f*c interaction
>>> > Then a significant f*c interaction implies that slopes are different [at
>>> > chosen alpha]
>>> > If conducted with SPSS software, the f*c parameter value is the
>>> > difference in slope
>>> > Alternatively, my preferred option: give slope and intercept for each
>>> > group separately.
>>> >
>>> > Your solution is, in my view, equivalent to the above.
>>> > Try the SPSS manual [or other stats package] for refs. The procedure is
>>> > known as ANCOVA and is a GLM, general linear model.
>>> > Google suggested Wikipedia, which in turn suggests STATSOFT,
>>> > http://www.statsoft.com/textbook/general-linear-models/
>>> > - in my view one of the best on-line stats resources.
>>> > http://udel.edu/~mcdonald/statancova.html is also good, with excellent
>>> > biology examples, e.g. a useful example of graphic presentation, and the
>>> > following suggested refs:
>>> > Sokal, R.R., and F.J. Rohlf. 1995. Biometry: The principles and practice
>>> > of statistics in biological research. 3rd edition. W.H. Freeman, New York.
>>> > Zar, J.H. 1999. Biostatistical analysis. 4th edition. Prentice Hall,
>>> > Upper Saddle River, NJ.
>>> >
>>> > Best
>>> > diana
>>> >

Re: comparing two partial regression slopes within the same equation

Garry Gelade
In reply to this post by msherman

Martin

 

In addition to the answers already posted, this is easy to do with SEM. Just constrain the two regression paths to equality, and see if the model fit becomes significantly worse.  If it does, the coefficients are unequal.

 

Garry Gelade

Business Analytic Ltd

 


Re: comparing two partial regression slopes within the same equation

Ryan
In reply to this post by Bruce Weaver
Hi all,

I decided to take this opportunity [for selfish reasons] to run a little simulation study. I usually use SAS to simulate data, so I was interested in seeing whether I could do this in SPSS; apologies if I've made an error. Certainly, any suggestions for improving my simulation code below are welcome. The other reason was to show Koopman's suggested method. Also, for those interested, Koopman showed how he arrived at the sum-and-difference method here:

http://groups.google.com/group/sci.stat.math/browse_thread/thread/8a9a0712b6f96f5b/c8d95886557d6fb4?hl=en&lnk=gst&q=Testing+the+difference+between+betas+within+a+regression+model+#c8d95886557d6fb4

HTH,

Ryan
--

set seed 98765432.
new file.

inp pro.

loop ID= 1 to 1000.

comp b0 = 0.
comp b1 = 2.
comp b2 = 4.

comp x1 = normal(2).
comp x2 = normal(2).
comp e0 = normal(2).

comp y  = b0 + b1*x1 + b2*x2 + e0.
end case.
end loop.
end file.
end inp pro.
exe.

delete variables b0 b1 b2 e0.

REGRESSION
  /STATISTICS COEFF OUTS R ANOVA
  /DEPENDENT y
  /METHOD=ENTER x1 x2.

COMPUTE x_sum = x1 + x2.
COMPUTE x_diff = x1 - x2.
EXECUTE.

REGRESSION
  /STATISTICS COEFF OUTS R ANOVA
  /DEPENDENT y
  /METHOD=ENTER x_sum x_diff.


Re: comparing two partial regression slopes within the same equation

Kornbrot, Diana
In reply to this post by Bruce Weaver
Koopman’s method replaces the 2 independent vars x1, x2 by the 2 new independent vars u1 = x1 + x2 and u2 = x1 - x2, and tests whether the coefficient of u2 differs significantly from zero. Seems to me an excellent, direct and easy to interpret approach.
What’s not to like?
My methods may be equivalent, but I’d go for Koopman. It is also suggested by several others later in the thread.
Best
Diana
