Communalities Question


Salbod
Happy Friday to Everyone,

            I performed a principal component analysis on 40 items. The
items were participants' responses on a 4-point scale. The SPSS default,
eigenvalue > 1.0, gave me 13 components explaining 57% of the variance. A
subsequent parallel analysis indicated a 5-component solution, accounting for
34% of the variance, which I plan to use. When I examined the communalities
of the 5-component solution, I discovered items for which the components
explained less than 20% of the item's variance. I dropped these items and
re-ran the analyses; again, I obtained a 5-component solution.
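
(For reference, a minimal sketch of the kind of FACTOR syntax I mean, with
placeholder item names standing in for the 40 items:)

    * Sketch only: item01 TO item40 are placeholder names for the 40 items.
    FACTOR
      /VARIABLES item01 TO item40
      /MISSING LISTWISE
      /PRINT INITIAL EXTRACTION
      /CRITERIA FACTORS(5)
      /EXTRACTION PC.

The communalities I mention are the values in the Extraction column of the
Communalities table that this produces.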

            I am pleased with the solution, but I am not sure how to justify
removing the items whose communalities were below .20 (i.e., the components
explained less than 20% of the item's variance). Does anyone know of a
reference?



TIA



Stephen Salbod, Pace University, NYC

Re: Communalities Question

Art Kendall
Since you seem to be creating scales from items, I would not use the
criterion of total variance explained to decide which items to retain
for the scales. For a scale, it is customary to be interested in the
variance of the item on the single scale to which it is assigned.

I don't have references at hand, but have been using factor analysis
since the early 70s.

For a scale it is also more usual to be interested in the common
variance that the items share, so I suggest using principal axis
factoring (principal factors) rather than principal components, which
assumes the items are perfectly reliable and tries to account for their
total variance.
In order to maximize discriminant validity, I would suggest using the
varimax-rotated loadings to determine which items "go together"
cleanly. In scale construction it is the correlation of the item with
the single scale to which it is assigned that is important.
I find it helpful to use /FORMAT=SORT BLANK(.3); sometimes I use .35
or .4.
[For disclosure: I suggested that these options be implemented in the
mid-70s.]
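
A minimal sketch of that combination, assuming placeholder item names
item01 TO item40 and a 5-factor solution:

    * Sketch only: placeholder item names; adjust FACTORS(5) as needed.
    FACTOR
      /VARIABLES item01 TO item40
      /PRINT INITIAL EXTRACTION ROTATION
      /FORMAT SORT BLANK(.3)
      /CRITERIA FACTORS(5) ITERATE(50)
      /EXTRACTION PAF
      /ROTATION VARIMAX.

The rotated factor matrix then suppresses loadings below .3 in absolute
value and sorts the items by loading size, which makes it easy to see
which items load cleanly on a single factor.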

I applaud the use of parallel analysis to get a ballpark number of
factors to retain.
Usually I have retained that number, or one fewer factor, depending on
the interpretability and cleanliness of the rotated factor loadings.

If you are just trying to reduce the data without emphasis on
interpreting the meaning of the factors, the overall solution may still
be somewhat interpretable without getting into which dimension each
item defines.

Art Kendall
Social Research Consultants

Re: Communalities Question

statisticsdoc
In reply to this post by Salbod
Stephen,

One way to justify the removal of these items would be to examine the factor
loadings of the items that you removed. With such low communalities, the
factor loadings were probably quite small, since most of the variance on
these items would have consisted of unique variance (see Harman, Modern
Factor Analysis). If you are using the analysis for the purpose of scale
construction, then examine the item-total correlations for the deleted items
(probably also quite low, since the items do not share much variance with
the rest of the battery).
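
For example, something along these lines gives the corrected item-total
correlation of each item with the rest of the battery (placeholder item
names again; run it on the full item set, including the items you dropped):

    * Sketch only: item01 TO item40 are placeholders for the full battery.
    RELIABILITY
      /VARIABLES=item01 TO item40
      /SCALE('Full battery') ALL
      /MODEL=ALPHA
      /SUMMARY=TOTAL.

The Item-Total Statistics table will show each item's corrected correlation
with the rest of the battery, along with Cronbach's alpha if the item is
deleted; the dropped items will probably show quite low values.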

HTH,

Stephen Brand

For personalized and professional consultation in statistics and research
design, visit
www.statisticsdoc.com

