Dear list,
I have a query regarding factor analysis that I'm sure people can answer. I'm used to working with PCA, where in the "Total Variance Explained" box of the output (pre-rotation) the values in the left-hand panel under "Initial Eigenvalues" are identical to those in the right-hand panel under "Extraction Sums of Squared Loadings". I have now run a factor analysis using principal axis factoring, and the values in the right-hand panel are slightly smaller, which I understand is because they now represent common variance only.

My question is this: how do I decide on the number of factors with eigenvalues greater than 1? In the left-hand panel under "Initial Eigenvalues" I have four factors with eigenvalues greater than 1 (in the Total column these are 10.08, 1.79, 1.36 and 1.23). In the right-hand panel under "Extraction Sums of Squared Loadings" only two of the four exceed 1 in the Total column: 9.64, 1.45, .86 and .81. I'm probably missing the obvious here in terms of which values I should be looking at, but if someone could clarify I'd appreciate the advice.

Regards,

Kathryn
Kathryn,
1. There is nothing sacred about eigenvalues greater than 1. It is one of the rules of thumb frequently applied, but you may use other rules (such as retaining only one factor, or even all of them). It all depends on the theory behind your analysis, the statistical significance of the results (especially for the smaller eigenvalues and their associated factors), and the interpretability of the factors in terms of the observed variables.

2. You do not say why you used principal axis factoring this time instead of principal components. I suppose you had good reasons. You are surely aware that PCA was introduced with a view to extracting as much variance as possible in the first factor, because Spearman was interested in showing the importance of a general factor overshadowing all the rest (his "general intelligence" factor, g). PAF, by contrast, was introduced (by Thurstone) with exactly the opposite purpose: showing that there are several dimensions of intelligence in the mind, so the whole procedure points towards a more balanced extraction of several factors. In principle, PCA should be applied whenever your theory leads you to believe the observed variables chiefly reflect one large or dominating underlying factor (plus other idiosyncratic factors of lesser import), whereas PAF should be used whenever your theory suggests that the observed variables reflect a latent multidimensional construct whose dimensions are relatively independent of each other (though they could later be obliquely rotated to reveal the degree of their mutual correlation).

3. Always remember that factors in factor analysis are not real things: they are figments of statistical imagination, ways of representing K variables in H dimensions with a clearer meaning than the original observed variables, where H is usually smaller than (or at least no greater than) K, just as you can represent a 3D cube (or landscape) on a 2D sheet of paper (or a painter's canvas). There are several ways of achieving that purpose, all valid in principle, all representing some kind of theory or hypothesis, and none proving anything by themselves. Several different factor-analytic solutions may fit the same set of data, so neither Spearman proved the existence of a single intelligence nor Thurstone proved the existence of several. They simply showed how the results of several cognitive ability tests can be represented by either one or several factors by means of some fancy mathematical procedure, just as Leonardo, van Gogh and Picasso might have painted the same landscape in different styles, and photographers might have pictured it from various angles with different techniques, proving little in the end about the landscape itself.

Hector
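P.S. For illustration only, here is a minimal FACTOR syntax sketch contrasting the two extraction methods discussed above. The variable list is a placeholder for your own items, the criteria are simply the defaults made explicit, and rotation is switched off to match the pre-rotation table you are reading.

* Principal components extraction; v1 TO v30 is a placeholder item list.
FACTOR
  /VARIABLES v1 TO v30
  /PRINT=INITIAL EXTRACTION
  /CRITERIA=MINEIGEN(1) ITERATE(25)
  /EXTRACTION=PC
  /ROTATION=NOROTATE.

* Principal axis factoring of the same items.
FACTOR
  /VARIABLES v1 TO v30
  /PRINT=INITIAL EXTRACTION
  /CRITERIA=MINEIGEN(1) ITERATE(25)
  /EXTRACTION=PAF
  /ROTATION=NOROTATE.

Only the EXTRACTION keyword differs; everything else, including the eigenvalue criterion, is identical in both runs.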
Hi Hector,
Thank you for your detailed and useful response. I am aware that the eigenvalue rule isn't the only option for deciding on the number of factors to retain, but I would still like to know which are the correct values to look at in the SPSS output. Something I've just read seems to indicate it is the values in the left-hand panel under Initial Eigenvalues, although some of my other reading suggests otherwise.

I initially used PCA on my personality disorder scale to replicate a past study, but a reviewer of my paper commented that I should use PAF because it is more appropriate for investigations of latent variables.

Kathryn
Kathryn,
The "Initial Eigenvalues" column reflects the eigenvalues of all factors. The other column refers only to the subset of extracted factors (which by default are only those with an initial eigenvalue greater than 1, unless you specify otherwise). Thus, if you wish to use the eigenvalues-greater-than-one rule of thumb to decide how many factors to extract, you should look at the initial eigenvalues.

On the other hand, if you are dealing with a unidimensional scale of personality disorder, i.e. with a scalar measure resulting from various questions in a test, or from several test scores combined into a single number, PAF is NOT adequate, in my humble opinion, because PAF is designed for situations where you want to elicit SEVERAL factors underlying a set of questions in a test (or a set of several test scores). If, instead, you want to break a personality disorder score down into SEVERAL scales, using the original variables as input, then PAF would be more adequate.

Hector
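P.S. In syntax terms (again only a sketch, with a placeholder variable list), the default criterion and the explicit override look like this:

* Default behaviour: retain factors whose initial eigenvalue exceeds 1.
FACTOR
  /VARIABLES v1 TO v30
  /PRINT=INITIAL EXTRACTION
  /CRITERIA=MINEIGEN(1)
  /EXTRACTION=PAF
  /ROTATION=NOROTATE.

* To set the number of factors yourself, replace the criterion.
* For example, /CRITERIA=FACTORS(4) would force a four-factor solution.

Either way, the Initial Eigenvalues panel always lists the eigenvalues of the full correlation matrix; only the set of factors reported in the extraction panel changes.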
I concur with Hector.
I reiterate: it is important to know why you are doing a factor analysis in the first place. For example:

If you intend to create a psychometric scale for future use corresponding to your interpretation of the single factor you are finding, you may still want to keep the second factor in the extraction; items that turn out to be splitters, or are somehow different in concept, would then be excluded from the list of items that you sum to get a score.

If you are just verifying the keying of the items from a previously published scale, and you have a sizable set of cases, then the items that load on a second factor would be the ones to check as to whether they belong on another scale.
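To make the scoring point concrete (a purely hypothetical sketch, not Kathryn's actual items): if items 1 to 8 held together on the intended factor and items 9 and 10 turned out to be splitters, the summed score would simply leave them out:

* Hypothetical scoring example: items 9 and 10 are left out as splitters.
COMPUTE pdscale = SUM(item1 TO item8).
EXECUTE.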
Art Kendall
Social Research Consultants
