
follow-up question to: ANOVA and HOV

Posted by Deborah Pearce on Jun 28, 2006; 12:22am
URL: http://spssx-discussion.165.s1.nabble.com/SPSS-MVA-tp1069196p1069201.html

Dear All,

Thank you Marta and Dale for your detailed responses in relation to my
question about the lack of HOV and the effect on P values.  This is a
follow-up question.

My designs are not Oneway ANOVAs but balanced two-way ANOVAs with
replication (e.g. 3 sites, 4 depths and 4 replicates).  P was generally
less than 0.00001 for the interaction, the site effect and the depth
effect.  I followed your advice and transformed my data (following your
recommended criteria/justification); this did improve the homogeneity of
variances for many of my experiments, and the residuals followed a
normal distribution.
Thanks.
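
For concreteness, this is roughly what I am doing, sketched here in
Python with scipy/statsmodels rather than in SPSS; the sites, depths and
the response variable y below are invented, purely to illustrate the
checks:

  import numpy as np
  import pandas as pd
  from scipy import stats
  import statsmodels.formula.api as smf
  from statsmodels.stats.anova import anova_lm

  # Invented balanced data: 3 sites x 4 depths x 4 replicates (48 rows).
  rng = np.random.default_rng(0)
  rows = [(s, d, rng.normal(10 + 3 * si + d, 1.0))
          for si, s in enumerate(["A", "B", "C"])
          for d in [1, 2, 5, 10] for _ in range(4)]
  df = pd.DataFrame(rows, columns=["site", "depth", "y"])

  # Two-way ANOVA with interaction, as in the design above.
  model = smf.ols("y ~ C(site) * C(depth)", data=df).fit()
  print(anova_lm(model, typ=2))

  # Normality of the residuals.
  print(stats.shapiro(model.resid))

  # Homogeneity of variances across the 12 site-by-depth cells.
  cells = [g["y"].to_numpy() for _, g in df.groupby(["site", "depth"])]
  print(stats.levene(*cells))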

For other experiments, no transformation gave equality of variances.  In
these cases, if the sites are looked at separately, i.e. just looking at
the change with depth within each site, then I do get equality of
variances.  For these experiments, is it better to just look at each
site separately using a Oneway ANOVA?  (The sites are actually very
different for the variable I am looking at.)
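
Purely as an illustration of the per-site approach (again in Python
rather than SPSS, and with invented numbers arranged so that the spread
differs between sites but not between depths within a site):

  import numpy as np
  import pandas as pd
  from scipy import stats

  # Invented data: within each site the depths share a similar spread,
  # but the spread differs a lot from site to site.
  rng = np.random.default_rng(1)
  rows = [(s, d, rng.normal(d, sd))
          for s, sd in [("A", 0.5), ("B", 2.0), ("C", 8.0)]
          for d in [1, 2, 5, 10] for _ in range(4)]
  df = pd.DataFrame(rows, columns=["site", "depth", "y"])

  # One site at a time: Levene across depths, then a one-way ANOVA on depth.
  for site, sub in df.groupby("site"):
      depths = [g["y"].to_numpy() for _, g in sub.groupby("depth")]
      print(site,
            "Levene p =", round(stats.levene(*depths).pvalue, 3),
            "ANOVA p =", round(stats.f_oneway(*depths).pvalue, 5))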

When is Tukey better to use than the Tamhane test?  I mean, what if
P=0.06 for Levene's HOV test?

For ANOVA, does one need to check that the data as well as the residuals
follow a normal distribution, or are the residuals alone sufficient?

I will be very grateful for any help.

Best wishes,
Deborah


> Hi
>
> DG> Deborah, I'll be interested to see responses from others, as
> DG> I don't think there will be an ironclad truism, but here is my
> DG> two-pence
>
> Here's mine.
>
> DG>   (1) even though often we hear that ANOVA is "robust" to
> DG> moderate violations of the assumptions, there are some papers that
> DG> show this is not necessarily the case when sample size markedly
> DG> varies across groups (e.g., if the larger variance is associated
> DG> with the larger group, tests of significance tend to be
> DG> conservative); so, to the extent that your design is relatively
> DG> balanced, that will be of less concern than if it is not.
>
> Normality is in general considered less important than homogeneity of
> variances. As you point out, unbalanced designs are more affected by
> lack of HOV. I read (but I don't have the reference here right now)
> that lack of HOV will severely affect the ANOVA p-value if the
> smallest sample has fewer than 10 cases and the biggest sample is more
> than four times the smallest. If the biggest samples have the smallest
> variances, the true significance level increases, and if the biggest
> samples have the biggest variances, the true significance level
> diminishes (I hope my explanation is clear; even in Spanish it was a
> bit difficult to explain, and the translation hasn't improved it).
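>
> Purely for illustration, here is a small Monte Carlo sketch of this
> point (in Python with numpy/scipy, not SPSS; the group sizes and SDs
> are invented). All population means are equal, so the reported rates
> are empirical significance levels of the usual F test at nominal 0.05:
>
>   import numpy as np
>   from scipy import stats
>
>   rng = np.random.default_rng(42)
>
>   def empirical_alpha(ns, sds, n_sims=10_000):
>       """Share of 'significant' F tests when all group means are equal."""
>       hits = 0
>       for _ in range(n_sims):
>           samples = [rng.normal(0.0, sd, n) for n, sd in zip(ns, sds)]
>           if stats.f_oneway(*samples).pvalue < 0.05:
>               hits += 1
>       return hits / n_sims
>
>   # Biggest group paired with the biggest variance: conservative (< 0.05).
>   print(empirical_alpha(ns=(5, 5, 40), sds=(1.0, 1.0, 3.0)))
>   # Biggest group paired with the smallest variance: liberal (> 0.05).
>   print(empirical_alpha(ns=(40, 5, 5), sds=(1.0, 3.0, 3.0)))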
>
> For Oneway ANOVA, SPSS has included (since version 9, I believe)
> robust tests: Brown-Forsythe and Welch (the latter is better suited
> to heavily unbalanced designs).
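>
> Outside SPSS, if it helps to cross-check, and assuming a reasonably
> recent version of statsmodels, both tests are available through
> anova_oneway; a minimal sketch with invented data:
>
>   import numpy as np
>   from statsmodels.stats.oneway import anova_oneway
>
>   # Invented unbalanced samples with unequal variances.
>   rng = np.random.default_rng(0)
>   samples = [rng.normal(m, s, n)
>              for m, s, n in [(0, 1, 8), (1, 2, 8), (2, 4, 20)]]
>
>   # Welch's ANOVA (does not assume equal variances).
>   welch = anova_oneway(samples, use_var="unequal")
>   print(welch.statistic, welch.pvalue)
>
>   # Brown-Forsythe adjusted F test for means.
>   bf = anova_oneway(samples, use_var="bf")
>   print(bf.statistic, bf.pvalue)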
>
> Even if lack of HOV doesn't have much impact on the overall p-value
> (in balanced or moderately unbalanced designs), it can have a lot of
> effect on multiple comparison methods. Replace Tukey (or whatever
> post-hoc method you use) with the Tamhane test. Also, contrasts
> (orthogonal or not) are adjusted for lack of HOV (I'm talking about
> SPSS's ONEWAY procedure).
>
> DG>   (2) Though there are myriad opinions about transformations
> DG> (e.g., log, reciprocal, etc.), if the normality assumption is not
> DG> tenable, attempt a transformation (ideally one that can be
> DG> justified) and see whether your general results/conclusions hold
>
> As I mentioned before, ANOVA is quite robust to departures from
> normality (as a matter of fact, the Levene test is an ANOVA on the
> absolute values of the residuals, which have a highly skewed
> distribution). Besides, transformations that fix lack of HOV usually
> improve normality as well, so I recommend focusing on the
> transformations that stabilize variances and then checking their
> effect on normality.
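>
> If you want to see this for yourself, the following sketch
> (Python/scipy, invented data) runs an ordinary one-way ANOVA on the
> absolute deviations from the group means and reproduces the
> mean-centred Levene statistic:
>
>   import numpy as np
>   from scipy import stats
>
>   rng = np.random.default_rng(1)
>   groups = [rng.normal(0.0, s, 20) for s in (1.0, 1.5, 3.0)]
>
>   # Absolute deviations of each observation from its own group mean...
>   abs_dev = [np.abs(g - g.mean()) for g in groups]
>
>   # ...fed to an ordinary one-way ANOVA...
>   print(stats.f_oneway(*abs_dev))
>   # ...give the same statistic as Levene's test with mean centring.
>   print(stats.levene(*groups, center="mean"))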
>
> A list of the most popular transformations:
>
> - If the SD is proportional to the mean, then a log transformation
> will improve both HOV and normality (such distributions are typically
> log-normal, positively skewed). This transformation has the advantage
> of being "reversible": you can back-transform the data and obtain
> geometric means, or ratios of geometric means when you back-transform
> differences of log means. Use x'=log(1+x) if there are zeroes
> (problems back-transforming the data can arise in this particular
> case).
>
> - If the variance is proportional to the mean, then the data
> typically follow a Poisson distribution (or an overdispersed Poisson,
> i.e. a negative binomial) and a square-root transformation can help.
> Again, add 1 before taking the square root if zeroes are present.
> This transformation can't be back-transformed for mean differences.
>
> - For binomial proportions with constant denominators, you can use the
> angular transformation: x'=arcsin(sqrt(p)). Again, it can't be back
> transformed for mean differences.
>
> - Reciprocal transformation: x'=1/x. I can't remember right now when
> it is indicated.
>
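> As a purely illustrative sketch (Python/scipy, with invented,
> positively skewed data whose SD grows with the mean), this is how one
> might compare Levene's p-value before and after some of these
> transformations:
>
>   import numpy as np
>   from scipy import stats
>
>   rng = np.random.default_rng(2)
>   # Invented log-normal-ish groups: SD roughly proportional to the mean.
>   groups = [rng.lognormal(mean=m, sigma=0.6, size=16)
>             for m in (0.5, 1.0, 2.0)]
>
>   def levene_p(gs):
>       return round(stats.levene(*gs).pvalue, 4)
>
>   print("raw        ", levene_p(groups))
>   print("log(1+x)   ", levene_p([np.log1p(g) for g in groups]))
>   print("sqrt(1+x)  ", levene_p([np.sqrt(1.0 + g) for g in groups]))
>   print("reciprocal ", levene_p([1.0 / g for g in groups]))
>   # (the angular transform applies only to proportions, so it is omitted)
>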
> See: http://bmj.bmjjournals.com/cgi/content/full/312/7039/1153 for an
> interesting Statistics Note on the problems of back transforming CI
> for mean differences.
>
> Also, this note
> http://bmj.bmjjournals.com/cgi/content/full/312/7038/1079 focuses on
> the problem of trying to back transform the SD after a log transform.
>
> DG>   (3) You can always resort to a nonparametric analogue
> DG> (e.g., Kruskal-Wallis) and again check if the general results
> DG> obtained...
>
> This could be OK if the only problem is lack of normality, but if you
> also have lack of HOV you should not use the Kruskal-Wallis test.
> Citing a previous message of mine (from the tutorial series on
> non-parametric tests that I started in April):
>
> "Data requirements for Kruskal-Wallis test: distributions similar in
> shape (this means that dispersion is something to be considered too;
> see: "Statistical Significance Levels of Nonparametric Tests Biased by
> Heterogeneous Variances of Treatment Groups" Journal of General
> Psychology,  Oct, 2000 by Donald W. Zimmerman. Available at:
> http://www.findarticles.com/p/articles/mi_m2405/is_4_127/ai_68025177 )"
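>
> If the shapes and spreads do look comparable, the test itself is a
> one-liner in most packages; a minimal sketch in Python/scipy with
> invented, similar-spread groups:
>
>   import numpy as np
>   from scipy import stats
>
>   rng = np.random.default_rng(3)
>   # Invented groups with similar spread, differing only in location.
>   groups = [rng.normal(loc, 1.0, 16) for loc in (0.0, 0.5, 1.5)]
>
>   print(stats.kruskal(*groups))   # Kruskal-Wallis H statistic and p-value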
>
> HTH,
>
> Marta
>