Posted by Rcarlstedt on Oct 27, 2006; 2:21am
URL: http://spssx-discussion.165.s1.nabble.com/Re-small-sample-repeated-predictors-more-tp1071739p1071743.html
See the response to a similar question that I posed in May. Scroll all the way
down to read how I described the sample size issue then. Thanks!
PS: If anybody remembers having commented on the matter below relative to
PANEL analysis, please let me know. Thanks!
In a message dated 10/26/2006 8:48:08 P.M. Eastern Standard Time, Rcarlstedt
writes:
I'll try to find the post and response about PANEL analysis that I received
previously, which implied that one could enter trait constants each time the
other, more variable predictors are entered.
Yes, that is by definition a time-invariant variable and that is how it
is handled in a mixed models approach.
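For illustration only, here is a minimal sketch of how a model along these lines might be specified with the SPSS MIXED command. The variable names (player, match, hrv, cognition, outcome) are hypothetical, and a continuous performance outcome is assumed; the time-invariant trait score simply enters as an ordinary covariate alongside the time-varying HRV measure, while the random intercept accounts for the repeated observations within each athlete:

* Sketch with hypothetical variable names; continuous outcome assumed.
* The stable trait score (cognition) enters as an ordinary covariate.
MIXED outcome WITH hrv cognition
  /FIXED=hrv cognition
  /RANDOM=INTERCEPT | SUBJECT(player) COVTYPE(VC)
  /METHOD=REML
  /PRINT=SOLUTION TESTCOV.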
Paul R. Swank, Ph.D.
Professor, Developmental Pediatrics
Director of Research, Center for Improving the Readiness of Children for
Learning and Education (C.I.R.C.L.E.)
Medical School
UT Health Science Center at Houston
-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of [hidden email]
Sent: Monday, May 15, 2006 1:09 PM
To: [hidden email]
Subject: Sample Size Issues
I have a methodological question pertaining to sample size.
If one has a small sample in which specific measures are considered
TRAITS, that is, stable longitudinal mediators of certain behaviors and
outcomes, can those measures be treated as repeated measures in a study
that is interested in their influence on other outcome measures?
For example, I have longitudinal data spanning nine months (a small
sample of athletes). I have repeated measures (81; ca. 10 per subject)
on heart rate variability (HRV) and numerous statistical outcome
measures (e.g., games won or lost) over ten measurement occasions
(matches), with pre-post HRV measurements associated with these matches.
In addition, I have neuropsychological/cognition measures that are also
considered stable for the same sample. I also have intervention
efficacy data obtained in an ecologically valid rather than a controlled
design.
Both cognition and personality/behavioral measures were found to explain
varying amounts of variance in the outcome measures, and vice versa, as
well as among and between the variables themselves.
The sample size was only 8-12. However, the data points or repeated
measures for the outcome measures ranged from 52-81. Thus, although I
only had a sample of around 10, I had up to 81 outcome measurements.
My question: if my cognition and personality/behavioral measures are
considered stable, can they be entered as predictor variables once per
measurement occasion, i.e., multiple times? For example, if player A
played 10 matches and 10 HRV measurements were taken, can one
justifiably enter his or her cognition/personality scores ten times to
match the outcome measurements, under the assumption that these stable
traits are enduring and will indeed influence HRV and performance
outcome measures at different points in time (the predictor measures
have very high test-retest reliability)?
This would increase the number of predictor data points from 8 to 81,
although both the predictor and outcome measures would still come from a
limited sample.
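As an illustration of the data layout this implies (hypothetical variable names and values), each row would be one match, and the stable trait score would simply be repeated within a player:

* Hypothetical long-format layout: one row per match.
* The stable cognition score is repeated within each player.
DATA LIST LIST / player match hrv cognition outcome.
BEGIN DATA
1 1 62.1 105 1
1 2 58.4 105 0
2 1 71.3  98 1
2 2 69.8  98 0
END DATA.

Worth noting, though, that in a mixed-models analysis the repeated trait values are not treated as 81 independent observations; the subject-level random intercept keeps the effective sample for the between-subject trait effect tied to the number of players.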
Is this more a theoretical or a methodological issue, or can one justify
such an approach because stable predictor variables will "always"
influence certain performance (at the intra- and inter-individual
level), which my results demonstrated?
What about the reverse direction, looking at how HRV and the outcome
measures are associated with the cognition/personality measures (only 8
measures from 8 subjects), when the HRV/outcome measurements involve
52-81 measurement occasions?
Any feedback would be appreciated, including statistical considerations,
limitations, alternative data-analysis suggestions, etc.
Thanks!
RC
____________________________________________
Roland A. Carlstedt, Ph.D.
Licensed Clinical Psychologist/Licensed Applied Psychologist
Chair, American Board of Sport Psychology
Clinical and Research Director: Integrative Psychological Services of NYC
Research Fellow in Applied Neuroscience: Brain Resource Company
http://www.americanboardofsportpsychology.org/
[hidden email]
917-680-3994