Stats course geared towards Program Evaluation


Ryan
OT:

Is anyone familiar with a grad-level stats course that is geared towards program evaluation?

Thanks,

Ryan
Re: Stats course geared towards Program Evaluation

bdates

Ryan,

 

The Evaluators’ Institute at George Washington University has a full slate of courses for evaluators. Here’s the current listing:

 

Analytic Approaches
- Applied Regression Analysis for Evaluators
- Applied Statistics for Evaluators
- Hierarchical Linear Modeling
- Intermediate Cost-Benefit and Cost-Effectiveness Analysis
- Intermediate Qualitative Analysis
- Introduction to Cost-Benefit and Cost-Effectiveness Analysis
- Needs Assessment
- Practical Meta-Analysis: Summarizing Results Across Studies
- Qualitative Data Analysis

The link to the Evaluators’ Institute is: http://tei.gwu.edu/course-listing-category

 

I’d also recommend the session by Stephanie Evergreen, Presenting Data Effectively: Practical Methods for Improving Evaluation Communication. She’s done amazing work with data visualization. We’ve had her at my organization for a full day, and her stuff is really good.

 

Hope this all helps.

 

Brian

 

Brian Dates, M.A.
Director of Evaluation and Research | Evaluation & Research | Southwest Counseling Solutions
Southwest Solutions
1700 Waterman, Detroit, MI 48209
313-841-8900 (x7442) office | 313-849-2702 fax
[hidden email] | www.swsol.org

 

Re: Stats course geared towards Program Evaluation

Mike

Just to add a little to what Brian Dates provides above:
there is no single design for program evaluation; consequently, as implied
above, a large variety of different statistical analyses can be conducted
depending upon (a) the design of the program or the logic model assumed,
(b) how program participants were assigned to different program components
(e.g., random assignment, whether there is an appropriate "control" group
that was not in the program, etc.), and (c) what data were collected at
different times of a participant's involvement in the program. (NOTE: I am
implicitly assuming the unit of analysis is a person/dyad/family, but the
unit can be larger, such as schools, geographical regions, social groups,
etc.) An analysis can be as simple as an independent-groups t-test on some
outcome measure, or a process model with empirical indicators (a) prior to
program involvement, (b) at specific times in the program, and (c) at
immediate exit from the program, as well as longer-term outcome measures.
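
For illustration, the simplest case just mentioned might look like the
following in SPSS syntax. This is a minimal sketch, not from the original
post; the variable names group and outcome are hypothetical.

* Hypothetical minimal example: independent-groups t-test comparing
* program participants (group = 1) with controls (group = 2) on a
* single outcome measure.
T-TEST GROUPS=group(1 2)
  /VARIABLES=outcome
  /CRITERIA=CI(.95).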
 
The simplest way to think of the analysis of a program is that it conforms
to an experimental/quasi-experimental design, but, as examination of a
textbook on program evaluation (e.g., Rossi, Lipsey, & Freeman,
"Evaluation") will show, this is an oversimplification that may lead one
to conduct analyses that are essentially meaningless. Cost-benefit
analysis, the degree of participants' and program administrators'
"investment in the program" (i.e., the degree to which they believe in the
program and will work to achieve its goals), and other issues need to be
taken into account. Positive outcomes may not be due to the program, nor
do negative results necessarily indicate program failure (especially if
the program was not properly implemented or was sabotaged by some
"stakeholder"). It is also useful to keep in mind that some programs start
out with a well-defined design/logic model, which would suggest how to
analyze them, while other programs developed piecemeal over time, with
different logic models at different times. Which time/model does one want
to analyze?
 
There is a lot to chew on in program evaluation.
 
-Mike Palij
New York University
 
 
Re: Stats course geared towards Program Evaluation

bdates

Mike makes excellent points, to which I’ll add that program evaluation frequently looks at program process (activities, outputs), which is not the same as outcomes for the persons, organizations, and/or communities involved. Statistical analysis of process usually differs substantially from statistical analysis of outcomes.
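
To make the contrast concrete, a process analysis is often purely
descriptive where an outcome analysis is inferential. A minimal sketch in
SPSS syntax, assuming a hypothetical service-event dataset with variables
site and sessions (illustrative names only, not from this thread):

* Hypothetical process summary: service sessions delivered, by site.
* Descriptive output about program activity, not a test of outcomes.
MEANS TABLES=sessions BY site
  /CELLS=COUNT SUM MEAN.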

 

B

 

Brian Dates, M.A.
Director of Evaluation and Research | Evaluation & Research | Southwest Counseling Solutions
Southwest Solutions
1700 Waterman, Detroit, MI 48209
313-841-8900 (x7442) office | 313-849-2702 fax
[hidden email] | www.swsol.org

 

Re: Stats course geared towards Program Evaluation

Art Kendall
In reply to this post by Mike
It is true that there is no single design for program evaluation. In earlier years the field was called "applied social science". In fact, if you look at the Handbook of Evaluation Research, it was sponsored by SPSSI (the Society for the Psychological Study of Social Issues).

The whole panoply of social science methods can be used in evaluating a vast array of programs.
The series of books authored by Cook and Campbell and later Shadish can be used as a framework to conceptualize the designs that can be used in program evaluation.  

In an ideal world we would be able to randomly assign individuals or larger aggregations of individuals to treatments. When we cannot (which is very often), we need to supplement the rhetoric used in our statistical arguments with further information to rule out plausible rival hypotheses for observed differences.

In one sense, in basic research we are looking for differences in summary observations relevant to theory, and in program evaluation we are looking for differences in summary observations relevant to practice and/or policy.

Even in true experiments, our reasoning can be strengthened by some form of repeated measures.

With respect to teaching:

One point that some miss is that the main hypotheses are interaction hypotheses. The simplest example of this kind of interaction is the interaction of a variable related to time (repeats) with a variable representing (hopefully randomly assigned) treatment. A difference in change is a simple example: we hope that a treated group shows a larger improvement than a control (or at least comparison) group. In my experience - YMMV - it is helpful for clients to sketch a graph of what they would like to see. The idealized sketch, with the DV on the Y axis and an X axis with two values (pre and post) and two line segments connecting pre and post separately for each group, would show a definite non-parallelism between the line segments.

In the example syntax below, the same SD is assumed for all four means; ideally these would be informed guesses from prior research. For the treatment group, the "effect size that would make a difference" is 5 points on the DV scale.

Using such a demonstration a few times can help clients understand that what they hope for is an interaction, that they need to consider prior research, and that they need to consider what size difference in a DV would be meaningful for theory/practice/policy.

set seed 20101802.
* Simulate 25 treatment cases (pre M=22, post M=27, a 5-point gain)
* and 25 control cases (pre M=22, post M=22); SD=5 for all four means.
input program.
   loop #i = 1 to 25.
      compute group = 1.
      compute DV_pre = rv.normal(22,5).
      compute DV_post = rv.normal(27,5).
      end case.
   end loop.
   loop #i = 1 to 25.
      compute group = 2.
      compute DV_pre = rv.normal(22,5).
      compute DV_post = rv.normal(22,5).
      end case.
   end loop.
   end file.
end input program.
value labels group 1 'treatment' 2 'control'.
execute.
* Mixed-design GLM: the time*group interaction tests the difference in
* change; the PROFILE plot displays the (non)parallel line segments.
GLM DV_pre DV_post BY group
  /WSFACTOR=time 2 Polynomial
  /METHOD=SSTYPE(3)
  /PLOT=PROFILE(time*group)
  /CRITERIA=ALPHA(.05)
  /WSDESIGN=time
  /DESIGN=group.





Art Kendall
Social Research Consultants
Re: Stats course geared towards Program Evaluation

Art Kendall
For the last few decades, I have asserted that although design, measurement, and statistics are taught as separate course subjects, they are intrinsically intertwined.

One of my soapboxes has long been opposition to the "lone ranger" view of research - the idea that one person can grasp all the necessary reasoning to do research with only a few grad courses. Whenever I have taught classes or workshops, I have emphasized that one goal is to facilitate students' interaction with consultants who specialize in these topics.
Art Kendall
Social Research Consultants
Re: Stats course geared towards Program Evaluation

bdates
Very good point, Art. I teach in the grad programs in Public Administration and Public Policy at the University of Michigan-Dearborn. They've put together a nice little triad of courses I teach - Research Methods and Stats, Program Evaluation, and Performance Management for Govt. and Nonprofits. By the time students have completed each, they understand the overlap and the uniqueness. They may not be prepared at graduation to do any of the three without guidance, but they're at least trainable.

B

Brian Dates, M.A.
Director of Evaluation and Research | Evaluation & Research | Southwest Counseling Solutions
Southwest Solutions
1700 Waterman, Detroit, MI 48209
313-841-8900 (x7442) office | 313-849-2702 fax
[hidden email] | www.swsol.org


Re: Stats course geared towards Program Evaluation

Ryan
Thanks for the comments and resources!

Best,

Ryan

Sent from my iPhone

Re: Stats course geared towards Program Evaluation

Mike
In reply to this post by bdates

Just one more addition to this thread, a cautionary tale.
Susan Reverby (a researcher best known for her work on the Tuskegee
syphilis nontreatment study; see:
http://books.google.com/books?hl=en&lr=&id=qm5X5gW7qNIC&oi=fnd&pg=PR7&dq=%22susan+reverby%22+tuskegee&ots=b5Q3bpmUif&sig=GasL1Eq2uegPmHuaZD9uCyNX1pM#v=onepage&q=%22susan%20reverby%22%20tuskegee&f=false ),
who is on another mailing list that I'm a member of, provided a link to an
analysis by HIV researcher Ida Susser of a program intervention study that
was published in the New England Journal of Medicine. The program was to
promote preventive measures to stop the spread of HIV but, as implemented,
appears to have failed to do so. Susser identifies some possible factors
for why the intervention failed, which highlights some of the problems
that exist between the people implementing a program and the recipients of
the program. Susser had her analysis published on the Aljazeera America
website (just a news website, not a jihadi outlet), and it shows that when
doing program evaluation one cannot simply do statistical analysis; one
should also analyze how the program was implemented.
NOTE: even if the program was implemented as planned, the plan that was
used may not have been the best approach, so it should not come as a
surprise that there were negative results. The program failed because the
wrong approach was implemented, not because such an intervention cannot
work.
Susser's article can be read here:
http://america.aljazeera.com/opinions/2015/3/blame-research-design-for-failed-hiv-study.html#

-Mike Palij
New York University
 
Re: Stats course geared towards Program Evaluation

Bruce Weaver
In a slightly different (but I think related) vein, I once heard an education researcher talking about a study that involved some intervention to improve student performance in a particular course.  Somewhat surprisingly, the results showed no improvement in the target course.  However, there was a substantial improvement in another course.  It was a situation where each course was populated by the same students.  So the researchers speculated that students were able to maintain their grades in the target course with less time and effort, and that they gave that extra time and effort to the other course, where it was apparently more important to them to improve their performance.  But if the researchers had not been paying attention to what was going on in the other non-target course(s), they might easily have concluded that the intervention was not effective.  

I can't remember who the researcher was.  But if anyone is dying to know, I can probably track it down.

Cheers,
Bruce

--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).
Re: Stats course geared towards Program Evaluation

Zdaniuk, Bozena-3
I am dying to know! It would be such a great example to use for advanced research courses...
bozena

Re: Stats course geared towards Program Evaluation

Bruce Weaver
I contacted someone who I thought might know the source of that story.  I copy his response below.  HTH.

-------------------------------------------------------------------

It was Richard Tiberius at Toronto. You've pretty well retold the story accurately. I think he wrote it up, but a Pubmed search turned up nothing. I know he presented it in a few places.  Might still be able to reach him; he's in Florida somewhere.

http://edo.med.miami.edu/contact-staff-fellows-advisory-council/subsect-staff/richard-g-tiberius



--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).
Re: Stats course geared towards Program Evaluation

Zdaniuk, Bozena-3
I just wrote an email to Richard T. asking if the story was published anywhere. I will share with the list if I hear from him.
Thanks so much for digging it up, Bruce.
Bozena

Re: Stats course geared towards Program Evaluation

Mike
In reply to this post by Bruce Weaver
Bruce, take a look at the following:

Tiberius, R. G., Sackin, H. D., & Cappe, L. (1987).
A comparison of two methods for evaluating teaching.
Studies in Higher Education, 12(3), 287-297.

Especially pages 292-293 at the beginning of the
Discussion.  Does this describe what you remember?

-Mike Palij
New York University
[hidden email]

P.S. Found via Google Scholar (scholar.google.com)

----- Original Message -----
From: "Bruce Weaver" <[hidden email]>
To: <[hidden email]>
Sent: Friday, March 06, 2015 9:20 PM
Subject: Re: Stats course geared towards Program Evaluation


I contacted someone who I thought might know the source of that story.
I
copy his response below.  HTH.

-------------------------------------------------------------------

It was Richard Tiberius at Toronto. You've pretty well retold the story
accurately. I think he wrote it up, but a Pubmed search turned up
nothing. I
know he presented it in a few places.  Might still be able to reach him;
he's in Florida somewhere.

http://edo.med.miami.edu/contact-staff-fellows-advisory-council/subsect-staff/richard-g-tiberius




Zdaniuk, Bozena-3 wrote
> I am dying to know! It would be such a great example to use for
> advanced
> research courses...
> bozena
>
> -----Original Message-----
> From: SPSSX(r) Discussion [mailto:

> SPSSX-L@.UGA

> ] On Behalf Of Bruce Weaver
> Sent: March 6, 2015 3:30 PM
> To:

> SPSSX-L@.UGA

> Subject: Re: Stats course geared towards Program Evaluation
>
> In a slightly different (but I think related) vein, I once heard an
> education researcher talking about a study that involved some
> intervention to improve student performance in a particular course.
> Somewhat surprisingly, the results showed no improvement in the target
> course.  However, there was a substantial improvement in another course.
> It was a situation where each course was populated by the same students.
> So the researchers speculated that students were able to maintain their
> grades in the target course with less time and effort, and that they
> gave that extra time and effort to the other course, where it was
> apparently more important to them to improve their performance.  But if
> the researchers had not been paying attention to what was going on in
> the other non-target course(s), they might easily have concluded that
> the intervention was not effective.
>
> I can't remember who the researcher was.  But if anyone is dying to know,
> I can probably track it down.
>
> Cheers,
> Bruce
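
A minimal sketch of the comparison that story implies, in SPSS syntax.
The 0/1 condition indicator and the two grade variables are hypothetical
stand-ins, not variables from the actual study:

  * Hypothetical sketch: test the intervention against BOTH the target
    and the non-target course, so effort spillover shows up in the output.
  T-TEST GROUPS=condition(0 1)
    /VARIABLES=target_grade nontarget_grade
    /CRITERIA=CI(.95)
    /MISSING=ANALYSIS.

If students reallocate effort rather than learn more overall, the
contrast on nontarget_grade can be substantial while target_grade shows
no change, which is exactly the pattern described above.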
>
>
> Mike Palij wrote
>> Just one more addition to this thread, a cautionary tale.
>> Susan Reverby (a researcher best known for her work on the Tuskegee
>> syphilis nontreatment study; see:
>> http://books.google.com/books?hl=en&lr=&id=qm5X5gW7qNIC&oi=fnd&pg=PR7&dq=%22susan+reverby%22+tuskegee&ots=b5Q3bpmUif&sig=GasL1Eq2uegPmHuaZD9uCyNX1pM#v=onepage&q=%22susan%20reverby%22%20tuskegee&f=false ),
>> who is on another mailing list that I'm a member of, provided a link
>> to an analysis by HIV researcher Ida Susser of a program intervention
>> study that was published in the New England Journal of Medicine.  The
>> program was meant to promote preventive measures to stop the spread of
>> HIV but, as implemented, appears to have failed to do so.  Susser
>> identifies some possible factors for why the intervention failed, which
>> highlight some of the problems that exist between the people
>> implementing a program and the recipients of the program.  Susser
>> published her analysis on the Aljazeera America news website, and it
>> shows that when doing program evaluation one cannot simply do
>> statistical analysis; one should also analyze how the program was
>> implemented.
>> NOTE: even if the program was implemented as planned, the plan itself
>> may not have been the best approach, so negative results should not
>> come as a surprise; in that case the program failed because the wrong
>> plan was implemented, not because the implementation was carried out
>> poorly.
>> Susser's article can be read here:
>> http://america.aljazeera.com/opinions/2015/3/blame-research-design-for-failed-hiv-study.html#
>>
>> -Mike Palij
>> New York University
>> [hidden email]
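
One minimal way to make that implementation check concrete, again in
SPSS syntax. The fidelity rating and outcome variable here are
hypothetical illustrations, not variables from the NEJM study:

  * Hypothetical sketch: cross a delivered-as-planned fidelity rating
    against the outcome before concluding the program model itself failed.
  CROSSTABS
    /TABLES=fidelity BY outcome
    /CELLS=COUNT ROW
    /STATISTICS=CHISQ.

A null overall effect combined with better outcomes where delivery was
faithful points at implementation problems rather than a flawed program
model.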
>>
>>   ----- Original Message -----
>>   From: Dates, Brian
>>   To: [hidden email]
>>   Sent: Monday, March 02, 2015 10:19 AM
>>   Subject: Re: Stats course geared towards Program Evaluation
>>
>>   Analytic Approaches
>>     a. Applied Regression Analysis for Evaluators
>>     b. Applied Statistics for Evaluators
>>     c. Hierarchical Linear Modeling
>>     d. Intermediate Cost-Benefit and Cost-Effectiveness Analysis
>>     e. Intermediate Qualitative Analysis
>>     f. Introduction to Cost-Benefit and Cost-Effectiveness Analysis
>>     g. Needs Assessment
>>     h. Practical Meta-Analysis: Summarizing Results Across Studies
>>     i. Qualitative Data Analysis
>>   The link to the Evaluators' Institute is:
>>   http://tei.gwu.edu/course-listing-category

Re: Stats course geared towards Program Evaluation

Bruce Weaver
Administrator
Hi Mike.  I had to wait until I was back on campus to get access to the article.  Based on a quick skim, I don't think that is the same story I remember hearing.  With any luck, Bozena will get a response from Dr. Tiberius, and all will be clarified.

Cheers,
Bruce


--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).