Genlin question


Genlin question

Maguin, Eugene
I'm having trouble with a GENLIN analysis; this is my first time using the procedure. The syntax is

GENLIN VALPRE(REFERENCE=FIRST) BY ORFLevelDbJ09 MALEDM09/MODEL=ORFLevelDbJ09
   MALEDM09 ORFLevelDbJ09*MALEDM09 DISTRIBUTION=BINOMIAL.

The warning is

Warnings
Variable name = specified for subcommand MODEL is not a valid variable name.
This command is not executed.

However, this works fine.

crosstabs VALPRE BY ORFLevelDbJ09 by MALEDM09.

It must be something in the MODEL subcommand, but I don't see it. I want an interaction term, and I think I have it specified correctly, yet something is not right.

Thanks, Gene Maguin


Re: Genlin question

Maguin, Eugene
Apologies. I figured it out.

MODEL=ORFLevelDbJ09.

is not the same as

MODEL ORFLevelDbJ09.

Even though I'm pretty sure it is in other procedures.
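
In other words, the MODEL subcommand takes no equals sign, so the command should presumably read something like this (LINK=LOGIT is the default for a binomial distribution and is shown only to be explicit):

GENLIN VALPRE (REFERENCE=FIRST) BY ORFLevelDbJ09 MALEDM09
   /MODEL ORFLevelDbJ09 MALEDM09 ORFLevelDbJ09*MALEDM09
    DISTRIBUTION=BINOMIAL LINK=LOGIT.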

Gene Maguin


Literature on treating 'Not Applicable' responses

Mark A Davenport MADAVENP
In reply to this post by Maguin, Eugene

All,

I have scoured our library resources and my old texts and notes but can find nothing on treating 'Not applicable' as an item response. The original analyst changed them all to 'missing'. I don't like that idea at all. Has anybody run across articles or book chapters on the subject?

Ultimately, I will be analyzing the data using AMOS 18's ordinal data modelling capabilities.

Thanks,

Mark



***************************************************************************************************************************************************************
Mark A. Davenport Ph.D.
Senior Research Analyst
Office of Institutional Research
The University of North Carolina at Greensboro
336.256.0395
[hidden email]

'An approximate answer to the right question is worth a good deal more than an exact answer to an approximate question.' --a paraphrase of J. W. Tukey (1962)



Re: Literature on treating 'Not Applicable' responses

J P-6
Mark,
 

Depending on the exact context, what I usually do is label the “Not applicable” value as such and then declare that value as a missing value code. That way it is removed from the denominator for frequency calculations, and I still know how many respondents selected that option.

 

Another thing I do is make the value for NA something well out of range, like -99 or 999; that way, if it somehow does not get declared as missing, I can easily catch it in any summary statistic. I have seen folks get into real trouble by assigning NA the next value in the response sequence.
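
As a minimal sketch, using a hypothetical item name q1 and -99 as the out-of-range NA code, that amounts to something like:

* q1 is a hypothetical item name and -99 is the code for Not applicable.
VALUE LABELS q1 -99 'Not applicable'.
MISSING VALUES q1 (-99).
FREQUENCIES VARIABLES=q1.

FREQUENCIES then reports the -99 count in the Missing section of the table, so the NA respondents stay visible without entering the valid percentages.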

 

This web site has some info (scroll to near bottom):

http://www.hcup-us.ahrq.gov/db/coding.jsp

 

If you are doing SEM, though, NA might be a meaningful subgroup. But whatever value NA is assigned, I cannot envision it as a legitimate value for inclusion in a variance-covariance calculation.

 

But the short answer is: I cannot find a reference either!

 

HTH,

John





Re: Literature on treating 'Not Applicable' responses

Bruce Weaver
Administrator
I do something similar, but reserve 9, 99, 999, etc. for "Missing" and 8, 88, 888, etc. for "Not applicable". Both sets of values are user-defined missing values for most purposes.
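
For a hypothetical item q1, the declarations would look something like:

* q1 is a hypothetical item name.
VALUE LABELS q1 8 'Not applicable' 9 'Missing'.
MISSING VALUES q1 (8, 9).

Keeping the two codes distinct means the NA cases can always be pulled back out as a separate group later (e.g. by revising the MISSING VALUES declaration) if the analysis calls for it.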



--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).

Re: Literature on treating 'Not Applicable' responses

John F Hall

This was standard practice in most surveys I was involved in. As Jon says, this throws up most errors in descriptive stats. I'm not sure about one or two cases in a very large data set, but they'll certainly show up in range checks (e.g. freq <varlist> /for not /sta min max.). In the old punched-card days this would have used more columns (and more cards), so we'd stick to 8, 9, etc., but we always recoded the 8s and 9s to higher values inside SPSS. To keep data prep costs down we mainly used the upper and lower zones ('+' and '-'), read them in as alpha, and then used RECODE with CONVERT to get them into numeric format. Some earlier surveys from the 1970s have been archived with -1 as missing (risky?!).
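
Written out in full, that range check is just a FREQUENCIES run with the tables suppressed (q1, q2 and q3 stand in for whatever variable list applies):

FREQUENCIES VARIABLES=q1 q2 q3
   /FORMAT=NOTABLE
   /STATISTICS=MINIMUM MAXIMUM.

Any stray NA code that has not been declared missing shows up straight away in the minimum or maximum.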
 
Some major surveys (e.g. British Social Attitudes) use code 7 for "Other" (semi-missing?), which doesn't leave many codes for valid responses and makes for some complex recoding in multiple-response questions when codes are repeated across longer response lists.

=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD