
Re: Negative Adjusted R Square is a "good" thing?

Posted by Rich Ulrich on Feb 10, 2014; 4:49am
URL: http://spssx-discussion.165.s1.nabble.com/Negative-Adjusted-R-Square-is-a-good-thing-tp5724399p5724413.html

Yep, "no prediction".  Reason: a bad model; too many useless
predictors eating up the degrees of freedom.
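For reference, adjusted R-square is computed as 1 - (1 - R²)(n - 1)/(n - k - 1), so even a large raw R² goes negative once the predictor count k eats up most of the sample size n. A minimal sketch with illustrative numbers (not the poster's actual data):

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Same raw R^2; only the number of predictors changes.
print(adjusted_r2(0.727, 46, 5))    # few predictors: stays positive
print(adjusted_r2(0.727, 46, 34))   # many predictors: goes negative
```

With k anywhere near n, the denominator n - k - 1 collapses and the penalty overwhelms the fit, which is exactly the symptom seen here.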

If I had 37 indicators for Types of authority, I would
create composite scores.  With N of 46, factoring is not
very robust, but I'd look at it.

Probably, I would retreat to combining the most correlated
items, in order to create a few tentative composite
scores, and then look at the other items' correlations with
those composites to see which others should be added. There
should be no overlap among the items chosen for different scores.

In the end, I would hope for maybe two or three composites.
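That item-combining step can be sketched roughly as follows. This is only an illustration on fabricated data (8 stand-in items instead of 37, and the 0.5 cutoff is arbitrary), not a substitute for factoring or expert judgement:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 46 cases x 8 authority items (stand-ins for the 37)
items = rng.normal(size=(46, 8))
items[:, 1] = items[:, 0] + 0.3 * rng.normal(size=46)  # items 0-2 form a cluster
items[:, 2] = items[:, 0] + 0.4 * rng.normal(size=46)

corr = np.corrcoef(items, rowvar=False)
np.fill_diagonal(corr, 0)

# Seed a composite with the most correlated pair of items
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
members = {i, j}
composite = items[:, sorted(members)].mean(axis=1)

# Add any remaining item whose correlation with the composite exceeds 0.5
for k in range(items.shape[1]):
    if k not in members and abs(np.corrcoef(items[:, k], composite)[0, 1]) > 0.5:
        members.add(k)
        composite = items[:, sorted(members)].mean(axis=1)

print(sorted(members))
```

Items earmarked for one composite would then be excluded before seeding the next, so no item appears in two scores.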

I don't think I have ever seen 37 items that would all be of
equal importance; so they shouldn't be tested as if they were.
Among all the variables, I would use "expert judgement"
(based on literature, logic, etc.) as to which of these items,
with their ranges of endorsement as observed in this sample,
are apt to be most salient. That would give me two to five items.
These might already have been included in the composite scores.

Then I would test my original hypotheses by carrying out two OLS
regressions on the average graduation rate -- one using the
composite scores, the other using the salient items.
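A rough sketch of those two regressions, using placeholder data and numpy's least squares in place of SPSS REGRESSION (all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 46
grad_rate = rng.uniform(0.6, 0.95, size=n)   # hypothetical DV
composites = rng.normal(size=(n, 3))         # the two or three composite scores
salient = rng.normal(size=(n, 4))            # the handful of salient items

def ols_r2(y, X):
    """Fit y = intercept + Xb by least squares and return R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

print(ols_r2(grad_rate, composites))   # regression 1: composite scores
print(ols_r2(grad_rate, salient))      # regression 2: salient items
```

With only 3-5 predictors against n = 46, the adjusted R² from these fits would be interpretable, unlike the 37-predictor model.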

I suspect that there are regional disparities in graduation
rates, so I might include some control for those, as a
nuisance parameter, if it doesn't "confound" the original IVs.

--
Rich Ulrich

________________________________

> Date: Sun, 9 Feb 2014 12:18:50 -0800
> From: [hidden email]
> Subject: Negative Adjusted R Square is a "good" thing?
> To: [hidden email]
>
> The theory: in K-12 education, putting more administrative authority in
> the state board of education is "better" than leaving it to the local
> boards.
>
> DVs: I have two ways to measure "better" from 26 states: % kids
> graduating high school within 4 years and scores by school district on
> a standardized test. I'll be examining them separately.
>
> IVs: I have 37 different measures for the types of administrative
> authority: 3 (state has complete control), 2 (shared/split) and 1
> (locality has complete control).
>
> So I fire up SPSS, plunk in the % kids graduating high school within 4
> years by state in 26 states as my DV, plunk in the 37 IVs, use "Enter"
> as my method (I've been told stepwise is evil, evil, evil) and...
>
> Model Summary
>
> Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
> 1       .853a   .727       -.041               .148600607125323
>
> This might be "good" if it means the predictors are useless. It is
> "bad" if I am getting this because my model stinks. How can I determine
> which?
> ________________________________
> View this message in context: Negative Adjusted R Square is a "good"
> thing?<http://spssx-discussion.1045642.n5.nabble.com/Negative-Adjusted-R-Square-is-a-good-thing-tp5724399.html>
> Sent from the SPSSX Discussion mailing list
> archive<http://spssx-discussion.1045642.n5.nabble.com/> at Nabble.com.
