Re: Pretest to Posttest: A question of reliability
Posted by Art Kendall on Dec 22, 2012; 11:45am
URL: http://spssx-discussion.165.s1.nabble.com/Pretest-to-Posttest-A-question-of-reliability-tp5717078p5717108.html
Ryan
Do you have an example set of
syntax to do this? Did you use OMS?
Art Kendall
Social Research Consultants
On 12/21/2012 11:38 PM, R B wrote:
Eins,
Reliability = true score variance / observed score variance

where

observed score variance = true score variance + error score variance

Within a one-factor confirmatory factor analytic modeling framework, you can estimate true score variance and error score variance as follows:

estimated true score variance = [sum(factor loadings)]^2
estimated error score variance = sum(error variances) + 2*[sum(error covariances)]
estimated reliability = estimated true score variance / (estimated true score variance + estimated error score variance)
The formula above, applied to data from a single testing occasion, will yield a more accurate estimate of composite score reliability than Cronbach's alpha.
Reference: Brown, T. A. (2006). Confirmatory factor analysis for applied research (D. A. Kenny, Ed.). New York: The Guilford Press.
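To make the arithmetic concrete, here is a minimal Python sketch of that calculation. The loading values are invented purely for illustration, no error covariances are modeled, and with real data you would plug in the estimates from your fitted one-factor CFA.

# Composite reliability from a one-factor CFA, per the formulas above.
# The standardized loadings below are made-up illustrative values.
loadings = [0.71, 0.65, 0.80, 0.58, 0.74, 0.69, 0.62, 0.77, 0.55, 0.70]
error_variances = [1 - l ** 2 for l in loadings]  # uniquenesses under a standardized solution
error_covariances = []                            # none specified in this sketch

true_var = sum(loadings) ** 2
error_var = sum(error_variances) + 2 * sum(error_covariances)
reliability = true_var / (true_var + error_var)

print(round(reliability, 3))  # estimated composite reliability

With these made-up loadings the estimate works out to roughly .90.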
Estimating test-retest reliability using a structural
equation model is another matter for another time.
Ryan
On Fri, Dec 21, 2012 at 9:35 AM, E. Bernardo <[hidden email]> wrote:
Dear RB, Ulrich, et al.
Sorry for the poor English.

Let me rephrase the scenario. Our variable is a latent factor measured by 10 items. The ten-item, five-point Likert-type questionnaire was administered to the same sample on two different occasions, giving F1 and then F2. So we have two correlated factors, F1 and F2, and we want to treat them as latent variables using AMOS 20. We expected all 10 items to load significantly (p < .05) on F1 and then on F2. However, the actual data showed that some items have nonsignificant (p > .05) factor loadings on F1 and F2; thus, the set of items that load on F1 differs from the set of items that load on F2. Our question is: can we proceed to correlate F1 and F2? Is this not a measurement problem?
Eins
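For concreteness, here is a minimal sketch of the two-factor model Eins describes, written for the open-source Python package semopy rather than AMOS (using semopy is my own assumption, as are the file name and the q1_pre ... q10_post column names):

# Two-factor (pretest/posttest) model with correlated latent factors.
# The data file and item column names are hypothetical.
import pandas as pd
import semopy

# F1: the 10 items at the first occasion; F2: the same items at the second
# occasion; "F1 ~~ F2" lets the two latent factors covary.
model_desc = """
F1 =~ q1_pre + q2_pre + q3_pre + q4_pre + q5_pre + q6_pre + q7_pre + q8_pre + q9_pre + q10_pre
F2 =~ q1_post + q2_post + q3_post + q4_post + q5_post + q6_post + q7_post + q8_post + q9_post + q10_post
F1 ~~ F2
"""

data = pd.read_csv("pre_post_items.csv")  # one row per respondent, wide format
model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())  # loadings, error variances, and the F1-F2 covariance

The F1 ~~ F2 line is what yields the pre-post factor covariance; whether that covariance is interpretable when the two occasions end up with different sets of significant loadings is exactly the measurement question raised above.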
To: [hidden email]
Sent: Thursday, December 20, 2012 9:16 PM
Subject: Re: Pretest to Posttest: A question of reliability
My responses are interspersed below.
On Thu, Dec 20, 2012 at 10:12 PM, E. Bernardo <[hidden email]> wrote:
Dear Everyone,
We ran a pretest-posttest analysis for a unidimensional scale with 10 items (say, Q1, Q2, ..., Q10). Using the pretest data, only four items (Q2, Q3, Q4, Q5) were significant, while using the posttest data six items (Q1, Q2, Q6, Q8, Q9, Q10) were significant.
You are providing conflicting information above. You state that you ran a pretest-posttest analysis for a "unidimensional scale with 10 items," which suggests to me that you derived a composite score to use as the dependent variable (e.g., you computed a sum or mean across all items for each subject). However, you go on to suggest that you performed pre-post analyses per item. Please clarify.
We noticed that the scale has a different set of items in the pretest and posttest scenarios.
What do you mean that the scale has a
different set of items during both
measurement periods? Why?
Our
question is, is the scale not
reliable?
The reliability of composite scores on a measuring instrument and/or the reliability of change scores are unrelated to what you've been discussing thus far, in my estimation.
Suppose we extend the scenario above to more than two measures (say, Pretest, Posttest1, Posttest2, Posttest3, Posttest4), so that we can use latent growth modeling (LGM) to model the latent change. Is it a requirement in LGM that all the scale indicators/items are significant across tests?
Technically, just because you have more than two measurement points does not make the analysis a latent growth curve (LGC) model. You can have an LGC model with only two points at which each subject was measured. Anyway, the answer to your final question is no.
Thank
you for your inputs.
Eins