Re: SPSSX-L Digest - 11 May 2010 to 12 May 2010 (#2010-135)


Christine Unson, PhD
Associate Professor
Department of Public Health
phone: (203) 392 7029
email: [hidden email]
________________________________________
From: SPSSX(r) Discussion [[hidden email]] On Behalf Of Automatic digest processor [[hidden email]]
Sent: Thursday, May 13, 2010 12:03 AM
To: Recipients of SPSSX-L digests
Subject: SPSSX-L Digest - 11 May 2010 to 12 May 2010 (#2010-135)

There are 29 messages totalling 3641 lines in this issue.

Topics of the day:

  1. IF sentence (2)
  2. SV: Reading old SPSS-X files from tapes into SPSS for Windows
  3. Size or significance of correlations (3)
  4. Fw: Size or significance of correlations
  5. deviation contrast for linear mixed model?
  6. Overlapping Periods (4)
  7. computing new variable based on percentiles (4)
  8. Help on correlations for multiple observation (3)
  9. Converting C# UTC Date/Time value (4)
 10. Dichotomous Variables & Correlation (3)
 11. Code not properly working on PASW DM book's example
 12. Stats question - multiple regression/controls (2)

=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD

----------------------------------------------------------------------

Date:    Wed, 12 May 2010 09:55:47 +0200
From:    Spousta Jan <[hidden email]>
Subject: Re: IF sentence

Yes, Elina, it is possible to compute maxima from variables. Perhaps your complicated logic can be reduced to a single line like this:

COMPUTE mt = DATEDIFF(MAX(Pvm.1, Pvm.2, Pvm.3, ...), Pvm.1, "days").
/* Update the list of MAX(.) arguments; you can perhaps use the easier form MAX(Pvm.1 TO Pvm.11). */
EXECUTE.

You don't need to test for missing values here, because the maximum is always valid in this case: you have at least Pvm.1 valid.

Best regards,

Jan

-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of Elina Tenhunen
Sent: Tuesday, May 11, 2010 5:04 PM
To: [hidden email]
Subject: Re: IF sentence

Hi,
Thank you for helping me!

This is how I actually got it to work.

DO IF (not missing(Pvm.11)).
- COMPUTE mt = DATEDIFF(Pvm.11,Pvm.1,"days").
ELSE IF  (not missing(Pvm.10)).
- COMPUTE mt = DATEDIFF(Pvm.10,Pvm.1,"days").
ELSE IF  (not missing(Pvm.9)).
- COMPUTE mt = DATEDIFF(Pvm.9,Pvm.1,"days").
ELSE IF  (not missing(Pvm.8)).
- COMPUTE mt = DATEDIFF(Pvm.8,Pvm.1,"days").
ELSE IF  (not missing(Pvm.7)).
- COMPUTE mt = DATEDIFF(Pvm.7,Pvm.1,"days").
ELSE IF  (not missing(Pvm.6)).
- COMPUTE mt = DATEDIFF(Pvm.6,Pvm.1,"days").
ELSE IF (not missing(Pvm.5)).
- COMPUTE mt = DATEDIFF(Pvm.5,Pvm.1,"days").
ELSE IF (not missing(Pvm.4)).
- COMPUTE mt = DATEDIFF(Pvm.4,Pvm.1,"days").
ELSE IF (not missing(Pvm.3)).
- COMPUTE mt = DATEDIFF(Pvm.3,Pvm.1,"days").
ELSE IF (not missing(Pvm.2)).
- COMPUTE mt = DATEDIFF(Pvm.2,Pvm.1,"days").
ELSE .
  COMPUTE mt = 0.
END IF.
EXECUTE.

(In Finnish, pvm means date and mt stands for follow-up time.) I wanted to count how many days these people have been in our follow-up.

And actually I still have a problem. Some of them are still coming to checkups, so there will be Pvm.12, Pvm.13, etc. Is it possible to make some kind of Max(Pvm) variable, so I don't have to write new code every time I want to see how long they have been in follow-up?

And this is what the data could look like:

ID   Pvm.1        Pvm.2        Pvm.3        Pvm.4      etc.
 1   01.07.2007   12.10.2007   12.11.2007   12.02.2008
 2   09.10.2008
 3   14.02.2007
 4   16.11.2008
 .   12.03.2008
 .   09.05.2008   22.09.2008
 .   31.12.2008   24.04.2009   03.09.2009
 .   12.10.2009
 .   13.02.2010   13.02.2010
 .   10.02.2007   12.05.2007   12.09.2007   12.03.2008
 .   20.09.2002   03.01.2003   11.02.2003   01.04.2003
 .   26.05.2006   12.11.2006   01.02.2007   02.03.2007
 .   30.11.2004   11.03.2005   11.04.2005   11.05.2005
 .   21.08.2007   11.10.2007   12.12.2007

Sorry about my terrible English :)
Best,
Elina





This message and any attached files are confidential and intended solely for the addressee(s). Any publication, transmission or other use of the information by a person or entity other than the intended addressee is prohibited. If you receive this in error please contact the sender and delete the message as well as all attached documents. The sender does not accept liability for any errors or omissions as a result of the transmission.

Are you sure that you really need a print version of this message and/or its attachments? Think about nature.


------------------------------

Date:    Wed, 12 May 2010 10:43:03 +0200
From:    John F Hall <[hidden email]>
Subject: Re: SV: Reading old SPSS-X files from tapes into SPSS for Windows


Staffan

Here's the outfit I found.

http://www.altirium.com/

http://www.altirium.co.uk/Altirium/about-altirium.html


Mark Sear wrote:

"Sounds like a 9-track tape to me. Hopefully not a 7-track as they are not pleasant to deal with.
Restoring the files as they are stored should not be an issue so long as the condition of the tapes is good; what we will have to think about is if they need converting in any way. The first thing we would do is to read the raw data from the tapes and then secure it by backing it up to other tapes; we'd make 2 copies. Then we could look at what work was required."



Being retired on a fractional pension I couldn't afford to get the conversion done, so there are still one or two important SPSS *.sys files on the tapes (e.g. from pilot work and main surveys of European Values) which I can't access.



John



PS  Sorry about the typeface: the computer took over.





----- Original Message -----

  From: Staffan Lindberg
  To: 'John F Hall'
  Sent: Wednesday, May 12, 2010 6:59 AM
  Subject: SV: SV: Reading old SPSS-X files from tapes into SPSS for Windows

  Thanks Jon!

  You've given me some clues and I will test them out.

  best

  Staffan

  From: John F Hall [mailto:[hidden email]]
  Sent: 11 May 2010 19:46
  To: Staffan Lindberg; [hidden email]
  Subject: Re: SV: Reading old SPSS-X files from tapes into SPSS for Windows

  I had a similar problem when I retired from the Polytechnic of North London in 1992. The computer service dumped everything of mine from the Vax cluster on to 7 magnetic tapes, which I then left at the Data Archive at Essex. They no longer had a Vax mainframe, and two years later they returned the box of tapes to me with 2 CDs with everything on.

  As far as I recall, none of the *.sys files copied across, but some of the *.exp and *.imp did. All raw data and SPSS setup files were there, but had extensions added to denote edition, i.e. test.sps would be saved as test.sps_1. Once I worked out what was happening, by manually deleting all the _n extensions, the files became readable by SPSS for Windows. All the SPSS *.sav files and the (very few) *.por files worked, but I had to reconstruct all the rest of my stuff from scratch, including modifying 1972 syntax! My documentation, including full user manuals for major surveys, was invaluable, plus I found a few errors.

  Everything initially deposited at Essex is still available and working: the problems are with the smaller studies which Essex turned away after they moved towards being a data archive rather than the survey archive they were originally set up as. They now get all government data, including surveys: whilst they still get all the big non-government surveys, they're not interested in the smaller ad-hoc surveys, however professionally conducted and regardless of their importance for theory or policy. Some of us take great care with our data, but PNL managed to lose everything of mine from before 1986, except material already deposited at Essex.


  There seemed to be nowhere able to read the (?9-track or 7-track?) magnetic tapes, but I eventually found an outfit in Aylesbury (UK) which could do some conversions, but only at a price beyond my means.


  I've checked the CDs again to see what exactly worked and what didn't. I found a file initially exported by SPSS-X and displayed as sr501exm.por, but which Windows lists as an EXP file, so there must be a hidden extension on it: I probably created it with save out 'sr501exm.exp' or similar and then tried to change it. Using right click and open with, I can open it with SPSS 15, so I presume 18 as well.

  If you can burn the files from tape to CD, does this help?

    ----- Original Message -----

    From: Staffan Lindberg
    To: [hidden email]
    Sent: Tuesday, May 11, 2010 3:58 PM
    Subject: SV: Reading old SPSS-X files from tapes into SPSS for Windows



    Thank you Mark and Richard!

    Sorry, I should have mentioned that they are stored as SPSS system
    files, not EBCDIC raw files. The coding schemata with variable names,
    positions, labels etc. have long since been lost. They now lie
    embedded in the system files. As I understand you, the future looks
    bleak. Richard mentions that there may be some documentation on the
    detailed structure of an SPSS system file from that era. Does anyone
    know something about this?

    Richard also mentions the possibility of separating the data from the
    dictionary by a "low level parse". I must confess my ignorance as to
    what a "low level parse" is. How do you do that? I know this looks
    bleak, but I am not ready to give up just yet.

    best

    Staffan Lindberg
    Sweden

    -----Ursprungligt meddelande-----
    Fr=E5n: Richard Ristow [mailto:[hidden email]]
    Skickat: den 11 maj 2010 14:06
    Till: Staffan Lindberg; [hidden email]
    =C4mne: Re: Reading old SPSS-X files from tapes into SPSS for =
Windows

    At 06:36 AM 5/11/2010, Staffan Lindberg wrote:

    >I have some old SPSS-X files on tapes from the late 70's and early
    >80's. Originally they were run on an IBM (360, I think) mainframe
    >before being stored on tape. The OS was MVS. I have them converted
    >from EBCDIC to ASCII and tried with several different editors, but
    >they are completely unreadable.

    Whatever else will work, that probably won't. Remember, SPSS stores
    numbers in the floating-point format native to the machine; on IBM
    mainframes of that age, that's 32-bit hexfloat (as it's known -- the
    360/370 specific format). Bytewise conversion to ASCII will scramble
    those, probably beyond recovery.

    >I have the files now both converted and unconverted on my hard drive.

    Thank goodness for the unconverted, where you have a chance.

    >My question is if there are other ways of getting these files into
    >SPSS for windows? I have a vague recollection of a FILE HANDLE
    >command, but cannot find any information on how to use it. I would
    >be thankful for any input on this even a finger pointing in a
    >possible direction. Hopefully there are still some data
    >archaeologists out there?

    Is there public documentation on the detailed structure of an SPSS
    system file of that era? If so, at worst one could do a low-level
    parse, extracting the data dictionary and data separately, noting
    which were string variables and should be converted to ASCII strings,
    and which were numeric variables and should be converted to (probably)
    text-represented numbers.

    Sounds like fun, in an adventurous sort of way.
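
Richard's hexfloat point is easy to demonstrate. The sketch below (Python, purely illustrative; the byte layout is the documented System/360 single-precision format, with a sign bit, a 7-bit base-16 exponent biased by 64, and a 24-bit fraction, but nothing here is SPSS-specific recovery code) decodes one 4-byte word:

```python
import struct

def ibm32_to_float(word: bytes) -> float:
    """Decode a 4-byte IBM System/360 single-precision hexfloat."""
    (u,) = struct.unpack(">I", word)          # big-endian 32-bit word
    sign = -1.0 if u >> 31 else 1.0           # top bit is the sign
    exponent = (u >> 24) & 0x7F               # 7-bit base-16 exponent, bias 64
    fraction = (u & 0x00FFFFFF) / (1 << 24)   # 24-bit fraction, 0 <= f < 1
    return sign * fraction * 16.0 ** (exponent - 64)

# 0x41100000 encodes 1.0: fraction 1/16, exponent 65, so (1/16) * 16**1
print(ibm32_to_float(b"\x41\x10\x00\x00"))    # -> 1.0
```

This is also why a bytewise EBCDIC-to-ASCII conversion destroys the numbers: it rewrites the very bytes in which the exponent and fraction are stored.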


input on=20
    this even a finger pointing in a<BR>&gt;possible direction. =
Hopefully there=20
    are still some data<BR>&gt;archaeologists out there?<BR><BR>Is there =
public=20
    documentation on the detailed structure of an SPSS<BR>system file of =
that=20
    era? If so, at worst one could do a low-level<BR>parse, extracting =
the data=20
    dictionary and data separately, noting<BR>which were string =
variables and=20
    should be converted to ASCII strings,<BR>which were numeric variable =
and=20
    should be converted to (probably)<BR>text-represented =
numbers.<BR><BR>Sounds=20
    like fun, in an adventurous sort of =
way.<BR><BR>=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=
<BR>To=20
    manage your subscription to SPSSX-L, send a message to<BR><A=20
    =
href=3D"mailto:[hidden email]">[hidden email]</A> =
(not=20
    to SPSSX-L), with no body text except the<BR>command. To leave the =
list,=20
    send the command<BR>SIGNOFF SPSSX-L<BR>For a list of commands to =
manage=20
    subscriptions, send the command<BR>INFO=20
  REFCARD<o:p></o:p></P></BLOCKQUOTE></DIV></BLOCKQUOTE></BODY></HTML>
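Richard's "hexfloat" point is the crux: numbers in those files are in the IBM System/360 base-16 floating-point format, not text and not IEEE, so a bytewise EBCDIC-to-ASCII pass destroys them. As a hedged illustration (not part of any SPSS tool), the 32-bit IBM format itself is straightforward to decode once you have the raw bytes; a minimal Python sketch:

```python
import struct

def ibm32_to_float(raw):
    """Decode 4 big-endian bytes of IBM System/360 'hexfloat':
    1 sign bit, 7-bit excess-64 base-16 exponent, 24-bit fraction."""
    (word,) = struct.unpack(">I", raw)
    sign = -1.0 if word >> 31 else 1.0
    exponent = (word >> 24) & 0x7F            # excess-64, base 16
    fraction = (word & 0xFFFFFF) / (1 << 24)  # 6 hex digits
    return sign * fraction * 16.0 ** (exponent - 64)

# 0x41100000 encodes 1.0 (exponent 65, fraction 1/16); 0xC1200000 is -2.0.
print(ibm32_to_float(bytes.fromhex("41100000")))  # -> 1.0
print(ibm32_to_float(bytes.fromhex("C1200000")))  # -> -2.0
```

Locating where such numbers sit inside a 1970s system file is the hard part; that is exactly the "low-level parse" of the dictionary and data blocks that Richard describes, and it requires format documentation this sketch does not assume.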


------------------------------

Date:    Wed, 12 May 2010 03:58:31 -0700
From:    Humphrey Paulie <[hidden email]>
Subject: Size or significance of correlations



Dear all,
I have correlated a number of variables (Pearson) and have found
correlations as low as .15 to be significant at the 0.01 level
(two-tailed). My sample size is 280.
I don't know how to interpret this.
The coefficients are very small but statistically significant.
Which one should I trust, the size of the coefficients or the
statistical significance?
I will be thankful for comments.
Cheers
Humphrey





------------------------------

Date:    Wed, 12 May 2010 07:56:19 -0400
From:    David Marso <[hidden email]>
Subject: Re: IF sentence

VERY NICE Jan !!!!
On Wed, 12 May 2010 09:55:47 +0200, Spousta Jan <[hidden email]> wrote:

>Yes, Elina, it is possible to compute maxima from variables. Perhaps your
>complicated logic can be reduced to one row in this way:
>
>COMPUTE mt = DATEDIFF(MAX(Pvm.1, Pvm.2, Pvm.3, ...), Pvm.1, "days").
>/* Update the list of arguments of MAX(.) - you can perhaps use the
>easier form "MAX(Pvm.1 TO Pvm.11)". */
>EXECUTE.
>
>You need no testing of missings here, because the maximum is always valid
>in this case, since you have at least Pvm.1 valid.
>
>Best regards,
>
>Jan
>
>-----Original Message-----
>From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of
>Elina Tenhunen

>Sent: Tuesday, May 11, 2010 5:04 PM
>To: [hidden email]
>Subject: Re: IF sentence
>
>Hi,
>Thank you for helping me!
>
>This is how I actually got it to work.
>
>DO IF (not missing(Pvm.11)).
>- COMPUTE mt = DATEDIFF(Pvm.11,Pvm.1,"days").
>ELSE IF  (not missing(Pvm.10)).
>- COMPUTE mt = DATEDIFF(Pvm.10,Pvm.1,"days").
>ELSE IF  (not missing(Pvm.9)).
>- COMPUTE mt = DATEDIFF(Pvm.9,Pvm.1,"days").
>ELSE IF  (not missing(Pvm.8)).
>- COMPUTE mt = DATEDIFF(Pvm.8,Pvm.1,"days").
>ELSE IF  (not missing(Pvm.7)).
>- COMPUTE mt = DATEDIFF(Pvm.7,Pvm.1,"days").
>ELSE IF  (not missing(Pvm.6)).
>- COMPUTE mt = DATEDIFF(Pvm.6,Pvm.1,"days").
>ELSE IF (not missing(Pvm.5)).
>- COMPUTE mt = DATEDIFF(Pvm.5,Pvm.1,"days").
>ELSE IF (not missing(Pvm.4)).
>- COMPUTE mt = DATEDIFF(Pvm.4,Pvm.1,"days").
>ELSE IF (not missing(Pvm.3)).
>- COMPUTE mt = DATEDIFF(Pvm.3,Pvm.1,"days").
>ELSE IF (not missing(Pvm.2)).
>- COMPUTE mt = DATEDIFF(Pvm.2,Pvm.1,"days").
>ELSE .
>  COMPUTE mt = 0.
>END IF.
>EXECUTE.
>
>(In Finnish, pvm means date, and mt is for follow-up time.) I wanted to
>count how many days these people have been in our "follow-up".
>
>And actually I still have a problem. Some of them are still coming to
>checkups, so there will be Pvm.12, Pvm.13, etc. Is it possible to make
>some kind of Max(Pvm) variable, so I don't have to write new code every
>time I want to see how long they have been in follow-up?

>
>And this is how the data could look like:
>
>ID    Pvm.1          Pvm.2           Pvm.3            Pvm.4      etc.
> 1  01.07.2007 12.10.2007 12.11.2007 12.02.2008
> 2 09.10.2008
> 3 14.02.2007
> 4 16.11.2008
> . 12.03.2008
> . 09.05.2008 22.09.2008
> 31.12.2008 24.04.2009 03.09.2009
> 12.10.2009
> 13.02.2010 13.02.2010
> 10.02.2007 12.05.2007 12.09.2007 12.03.2008
> 20.09.2002 03.01.2003 11.02.2003 01.04.2003
> 26.05.2006 12.11.2006 01.02.2007 02.03.2007
> 30.11.2004 11.03.2005 11.04.2005 11.05.2005
> 21.08.2007 11.10.2007 12.12.2007
>
>Sorry about my terrible English :)
>Best,
>Elina
>
>
>
>
>_____________
>
>
>This message and any attached files are confidential and intended solely
for the addressee(s). Any publication, transmission or other use of the
information by a person or entity other than the intended addressee is
prohibited. If you receive this in error please contact the sender and
delete the message as well as all attached documents. The sender does not
accept liability for any errors or omissions as a result of the transmission.
>
>Are you sure that you really need a print version of this message and/or
its attachments? Think about nature.

>
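For readers outside SPSS, Jan's MAX-of-dates idea (take the latest valid Pvm date and subtract Pvm.1) can be sketched in Python; the list layout and the None-for-missing convention are illustrative assumptions, not anything from the thread:

```python
from datetime import date

def followup_days(pvm):
    """Days from the first visit to the latest recorded visit.

    pvm: [Pvm.1, Pvm.2, ...] with None for checkups that have not
    happened yet. Pvm.1 is assumed valid, so the max always exists
    and no explicit missing-value test is needed -- Jan's point.
    """
    latest = max(d for d in pvm if d is not None)
    return (latest - pvm[0]).days

# ID 1 from Elina's example: 01.07.2007, 12.10.2007, 12.11.2007, 12.02.2008
visits = [date(2007, 7, 1), date(2007, 10, 12),
          date(2007, 11, 12), date(2008, 2, 12), None]
print(followup_days(visits))  # -> 226
```

When Pvm.12, Pvm.13, ... arrive, only the list grows; the computation itself does not change, which is why a MAX over the whole variable list beats the eleven-branch DO IF ladder.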

------------------------------

Date:    Wed, 12 May 2010 08:51:04 -0400
From:    Mike Palij <[hidden email]>
Subject: Fw: Size or significance of correlations


----- Original Message -----
From: Mike Palij
To: Humphrey Paulie
Cc: Mike Palij
Sent: Wednesday, May 12, 2010 8:43 AM
Subject: Re: Size or significance of correlations


First, compute the overall Type I error rate with the following formula:

overall alpha = 1 - (1 - per-comparison alpha)**K

where the per-comparison alpha is the Type I error rate you are using
for each correlation (presumably you claim that a correlation is
statistically significant at the 0.05 level), and K is the number of
correlations you are evaluating.

If you had only 3 correlations, each using a .05 per-comparison alpha,
the formula would look like this:

overall alpha = 1 - (1 - .05)**3
  = 1 - (.95)**3 = 1 - .8574 = .1426

That is, you have about a 14% chance of having committed a Type I
error after evaluating the three correlations.

Many people would consider an overall Type I error rate of 14% too
high (if you have more correlations, expect the overall Type I error
rate to skyrocket).  Some would suggest fixing the overall Type I
error rate at 0.05 and then using an adjusted per-comparison alpha for
each correlation.  The Bonferroni correction allows you to set the
overall alpha to .05 and then calculate the per-comparison alpha as:

per-comparison alpha = .05/K

where K, as above, is the number of correlations you are evaluating.
For K = 3, the per-comparison alpha = .05/3 = .01667.  This means that
correlations with a p-value less than .01667 are considered
significant.  For a large number of correlations, the per-comparison
alpha can be quite small.
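Mike's two formulas are easy to check numerically; a minimal Python sketch of the same arithmetic (nothing SPSS-specific is assumed):

```python
def overall_alpha(per_comparison_alpha, k):
    """Chance of at least one Type I error across k independent tests."""
    return 1 - (1 - per_comparison_alpha) ** k

def bonferroni_alpha(target_overall_alpha, k):
    """Per-comparison alpha that keeps the overall rate near the target."""
    return target_overall_alpha / k

print(round(overall_alpha(0.05, 3), 4))     # -> 0.1426
print(round(bonferroni_alpha(0.05, 3), 5))  # -> 0.01667
```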

There are more things you can do (e.g., adjust the per-comparison
alphas so that they differ across correlations according to
importance, etc.), but it appears that you have limited knowledge
about the nature of the correlations and the phenomena they relate
to.  That is, if you had previous research and/or good theory about
the nature of the correlations, you would not be asking how to
interpret your obtained correlations, because these would serve as
your guide.  Without such guides, you have to use purely mechanical
guides such as the Bonferroni correction.

In addition, regarding which correlations you should "trust", I would
recommend trusting only those correlations that are shown, through
replications of the research they come from, to continue to be
statistically significant.  Otherwise, the mere size of a correlation
coefficient or its probability is at best a tentative source of
information.  Even small correlations can be theoretically meaningful,
and a non-significant correlation may be non-significant because of a
lack of statistical power.

-Mike Palij
New York University
[hidden email]


  ----- Original Message -----
  From: Humphrey Paulie
  To: [hidden email]
  Sent: Wednesday, May 12, 2010 6:58 AM
  Subject: Size or significance of correlations

        Dear all,
        I have correlated a number of variables (Pearson) and have
        found correlations as low as .15 to be significant at the 0.01
        level (two-tailed). My sample size is 280.
        I don't know how to interpret this.
        The coefficients are very small but statistically significant.
        Which one should I trust, the size of the coefficients or the
        statistical significance?
        I will be thankful for comments.
        Cheers
        Humphrey



------------------------------

Date:    Wed, 12 May 2010 07:47:33 -0500
From:    ANDRES ALBERTO BURGA LEON <[hidden email]>
Subject: Re: Size or significance of correlations


Hello Humphrey:

In my opinion, we need to distinguish between statistical significance
and practical significance.
Statistical significance only means that the correlation is expected
to be different from 0 in the population. You should also look at the
size of the correlation, considering the field in which you are
working. Is your correlation similar to those found in other studies
using these variables? Does the correlation have any practical
meaning? With a correlation as low as 0.15, one variable explains only
2.25% of the variance of the other variable, which is very low.
Kindly
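The 2.25% figure is simply the squared correlation; a one-line Python check of that arithmetic:

```python
r = 0.15
shared_variance = r ** 2  # proportion of variance in one variable
                          # accounted for by the other
print(f"{shared_variance:.2%}")  # -> 2.25%
```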


Mg. Andrés Burga León
Coordinador de Análisis e Informática
Unidad de Medición de la Calidad Educativa
Ministerio de Educación del Perú
Calle El Comercio s/n (espalda del Museo de la Nación)
Lima 41
Perú
Teléfono 615-5840



Humphrey Paulie <[hidden email]>
Sent by: "SPSSX(r) Discussion" <[hidden email]>
12/05/2010 06:01 a.m.
Please reply to
Humphrey Paulie <[hidden email]>

To
[hidden email]
cc

Subject
Size or significance of correlations








Dear all,
I have correlated a number of variables (Pearson) and have found
correlations as low as .15 to be significant at the 0.01 level
(two-tailed). My sample size is 280.
I don't know how to interpret this.
The coefficients are very small but statistically significant.
Which one should I trust, the size of the coefficients or the
statistical significance?
I will be thankful for comments.
Cheers
Humphrey



------------------------------

Date:    Wed, 12 May 2010 09:16:54 -0400
From:    Diane Putnick <[hidden email]>
Subject: deviation contrast for linear mixed model?

Hi there,

I am trying to fit a repeated-measures ANOVA-type model with 3 repeated
covariates using MIXED.  Parent gender (mother vs. father) is the repeated
fixed factor, country (9 groups) is the between-subjects fixed factor, and
the covariates are mother-father age, education, and social desirability
bias.  The problem I'm having is that I want to get a deviation contrast
(deviation from the grand mean) for the main effect of country.  I don't
see any easy way to do this in the syntax.  My options seem to include only
using a single group as the contrast group.  Any suggestions?

Thanks,
Diane

MIXED dv BY Parent NewCountryID
  /CRITERIA=CIN(95) MXITER(100) MXSTEP(5) SCORING(1) SINGULAR
(0.000000000001) HCONVERGE(0,
    ABSOLUTE) LCONVERGE(0, ABSOLUTE) PCONVERGE(0.000001, ABSOLUTE)
  /FIXED=Parent NewCountryID Parent*NewCountryID | SSTYPE(3)
  /METHOD=REML
  /PRINT=SOLUTION TESTCOV
  /REPEATED=Parent | SUBJECT(idnew) COVTYPE(CSH)
  /EMMEANS=TABLES(Parent) COMPARE ADJ(LSD)
  /EMMEANS=TABLES(NewCountryID) COMPARE REFCAT(??) ADJ(LSD)
  /EMMEANS=TABLES(Parent*NewCountryID) .

------------------------------

Date:    Wed, 12 May 2010 13:18:07 +0000
From:    Statisticsdoc Consulting <[hidden email]>
Subject: Re: Size or significance of correlations

Humphrey,
The significance test addresses the question of whether you can reject the null hypothesis that your sample was drawn from a population in which the correlation is zero.  As sample size increases, even very small correlations can be statistically significant in the sense of being non-zero.  Nonetheless, the effect size (the square of the correlation) can be minuscule.  Which do you trust? Trust both, but recognize that statistical significance and effect size are different issues.
Best,
Steve Brand
www.StatisticsDoc.com
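Steve's distinction can be illustrated numerically. The t statistic for testing a Pearson r against zero is t = r*sqrt(n-2)/sqrt(1-r^2): as n grows, t (and with it significance) grows, while the effect size r^2 stays fixed. A Python sketch of that arithmetic (the smaller sample size is made up for illustration):

```python
import math

def t_for_r(r, n):
    """t statistic for testing H0: population correlation = 0."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

r = 0.15                          # effect size r**2 = 0.0225 at any n
print(round(t_for_r(r, 50), 3))   # small sample: modest t
print(round(t_for_r(r, 280), 3))  # Humphrey's n: larger t, smaller p
```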

------------------------------

Date:    Wed, 12 May 2010 13:20:05 -0400
From:    Asil Ozdogru <[hidden email]>
Subject: Overlapping Periods


Hello,

I would like to get the earliest in and latest out dates among a set of
overlapping periods using syntax.
An example is provided below.

Could anyone help me with that?

Thanks,

Asil

EXAMPLE:
I would like to go from this set

StudyID  InDate      OutDate
30006    9/20/2002   4/30/2007
30006    12/29/2006  5/31/2012
30006    11/25/2009  5/31/2012
30014    4/16/1975   3/3/1988
30014    4/21/1980   5/28/1982
30014    2/16/2001   11/24/2001
30022    2/5/1992    7/22/1995
30057    2/1/1988    11/20/1988
30057    12/15/1989  10/17/1990
30057    1/23/1992   10/18/1993

to this set

StudyID  InDate      OutDate
30006    9/20/2002   5/31/2012
30014    4/16/1975   3/3/1988
30014    2/16/2001   11/24/2001
30022    2/5/1992    7/22/1995
30057    2/1/1988    11/20/1988
30057    12/15/1989  10/17/1990
30057    1/23/1992   10/18/1993


bri","sans-serif";
  color:#1F497D'>InDate<o:p></o:p></span></b></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border:solid bla=
ck 1.0pt;
  border-left:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><b><span style=3D'font-size:11.0pt;font-family:"Cali=
bri","sans-serif";
  color:#1F497D'>OutDate<o:p></o:p></span></b></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30006<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>9/20/2002<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>4/30/2007<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30006<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>12/29/2006<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>5/31/2012<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30006<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>11/25/2009<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>5/31/2012<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30014<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>4/16/1975<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>3/3/1988<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30014<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>4/21/1980<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>5/28/1982<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30014<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/16/2001<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>11/24/2001<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30022<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/5/1992<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>7/22/1995<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/1/1988<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>11/20/1988<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>12/15/1989<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>10/17/1990<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>1/23/1992<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>10/18/1993<o:p></o:p></span></p>
  </td>
 </tr>
</table>

<p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri",=
"sans-serif";
color:#1F497D'><o:p>&nbsp;</o:p></span></p>

<p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri",=
"sans-serif";
color:#1F497D'>To this set<o:p></o:p></span></p>

<table class=3DMsoTableGrid border=3D1 cellspacing=3D0 cellpadding=3D0
 style=3D'border-collapse:collapse;border:none'>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><b><span style=3D'font-size:11.0pt;font-family:"Cali=
bri","sans-serif";
  color:#1F497D'>StudyID<o:p></o:p></span></b></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border:solid bla=
ck 1.0pt;
  border-left:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><b><span style=3D'font-size:11.0pt;font-family:"Cali=
bri","sans-serif";
  color:#1F497D'>InDate<o:p></o:p></span></b></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border:solid bla=
ck 1.0pt;
  border-left:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><b><span style=3D'font-size:11.0pt;font-family:"Cali=
bri","sans-serif";
  color:#1F497D'>OutDate<o:p></o:p></span></b></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30006<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>9/20/2002<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>5/31/2012<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30014<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>4/16/1975<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>3/3/1988<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30014<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/16/2001<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>11/24/2001<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30022<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/5/1992<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>7/22/1995<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>2/1/1988<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>11/20/1988<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>12/15/1989<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>10/17/1990<o:p></o:p></span></p>
  </td>
 </tr>
 <tr style=3D'height:15.0pt'>
  <td width=3D62 nowrap valign=3Dtop style=3D'width:.65in;border:solid blac=
k 1.0pt;
  border-top:none;padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal><span style=3D'font-size:11.0pt;font-family:"Calibri=
","sans-serif";
  color:#1F497D'>30057<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>1/23/1992<o:p></o:p></span></p>
  </td>
  <td width=3D82 nowrap valign=3Dtop style=3D'width:61.8pt;border-top:none;
  border-left:none;border-bottom:solid black 1.0pt;border-right:solid black=
 1.0pt;
  padding:0in 5.4pt 0in 5.4pt;height:15.0pt'>
  <p class=3DMsoNormal align=3Dright style=3D'text-align:right'><span
  style=3D'font-size:11.0pt;font-family:"Calibri","sans-serif";color:#1F497=
D'>10/18/1993<o:p></o:p></span></p>
  </td>
 </tr>
</table>

<p class=3DMsoNormal><span style=3D'font-size:10.0pt;font-family:"Calibri",=
"sans-serif"'><o:p>&nbsp;</o:p></span></p>

</div>

</body>

</html>

--_000_B7635ACC498A8A40A2715399FB88A4F35CAE1C3522p01789prainci_--

------------------------------

Date:    Wed, 12 May 2010 14:32:49 -0400
From:    Gene Maguin <[hidden email]>
Subject: Re: Overlapping Periods

Asil,

I might be misunderstanding, but this looks like a straightforward
application of AGGREGATE, where you select the minimum value of InDate and
the maximum value of OutDate.

Gene Maguin


>>I would like to get the earliest in and latest out dates among a set of
overlapping periods using syntax.
An example is provided below.
Could anyone help me with that?

Thanks,
Asil



EXAMPLE:
I would like to go from this set

StudyID InDate  OutDate
30006           9/20/2002       4/30/2007
30006           12/29/2006      5/31/2012
30006           11/25/2009      5/31/2012
30014           4/16/1975       3/3/1988
30014           4/21/1980       5/28/1982
30014           2/16/2001       11/24/2001
30022           2/5/1992        7/22/1995
30057           2/1/1988        11/20/1988
30057           12/15/1989      10/17/1990
30057           1/23/1992       10/18/1993

To this set
StudyID InDate  OutDate
30006           9/20/2002       5/31/2012
30014           4/16/1975       3/3/1988
30014           2/16/2001       11/24/2001
30022           2/5/1992        7/22/1995
30057           2/1/1988        11/20/1988
30057           12/15/1989      10/17/1990
30057           1/23/1992       10/18/1993
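
A minimal syntax sketch of this approach (untested; it assumes the variable
names StudyID, InDate, and OutDate from the example). One caution: breaking
on StudyID alone would also merge spells that do not overlap (for instance,
the three separate periods for 30014), so a LAG-based group counter is used
to chain only overlapping rows before aggregating:

```spss
* Number the overlap groups within each StudyID.
SORT CASES BY StudyID InDate.
DO IF $CASENUM = 1 OR StudyID <> LAG(StudyID) OR InDate > LAG(maxout).
* Start a new group: SUM ignores the missing LAG on the first case.
+  COMPUTE grp = SUM(LAG(grp), 1).
+  COMPUTE maxout = OutDate.
ELSE.
* Still inside the current group; carry the running latest OutDate.
+  COMPUTE grp = LAG(grp).
+  COMPUTE maxout = MAX(OutDate, LAG(maxout)).
END IF.
AGGREGATE OUTFILE=*
  /BREAK=StudyID grp
  /InDate=MIN(InDate)
  /OutDate=MAX(OutDate).
```

On the example data this should keep the three non-overlapping 30057 spells
separate while collapsing the three chained 30006 rows into one.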



------------------------------

Date:    Wed, 12 May 2010 11:34:18 -0700
From:    Bruce Weaver <[hidden email]>
Subject: Re: Overlapping Periods

Asil Ozdogru wrote:

>
> Hello,
>
> I would like to get the earliest in and latest out dates among a set of
> overlapping periods using syntax.
> An example is provided below.
>
> Could anyone help me with that?
>
> Thanks,
>
> Asil
>
> EXAMPLE:
>
> --- snip tables, because formatting was lost ---
>
>
>


Dates are numeric variables, so you can use the MIN and MAX functions of
AGGREGATE to pull out the earliest INDATE and latest OUTDATE.  If you want
the same variable names as in the original data, you'll have to set the
OVERWRITE option to yes.



-----
--
Bruce Weaver
[hidden email]
http://sites.google.com/a/lakeheadu.ca/bweaver/
"When all else fails, RTFM."

NOTE:  My Hotmail account is not monitored regularly.
To send me an e-mail, please use the address shown above.
--
View this message in context: http://old.nabble.com/Missing-dates--tp17760957p28539564.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.
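
As a concrete illustration of Bruce's suggestion (an untested sketch, using
the variable names from Asil's example): with MODE=ADDVARIABLES the keyword
is OVERWRITEVARS=YES, and the aggregated values replace InDate and OutDate
on every row rather than collapsing the file:

```spss
AGGREGATE OUTFILE=* MODE=ADDVARIABLES OVERWRITEVARS=YES
  /BREAK=StudyID
  /InDate=MIN(InDate)
  /OutDate=MAX(OutDate).
```

Dropping MODE=ADDVARIABLES (and the OVERWRITEVARS keyword) instead collapses
the data to one row per StudyID.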

------------------------------

Date:    Wed, 12 May 2010 14:42:46 -0400
From:    "FernandezLanier, Adriana (DCJS)"
         <[hidden email]>
Subject: computing new variable based on percentiles


Good afternoon. I hope my question makes sense. I was told that there is a
function in SPSS that will compute a new variable based on designated
percentiles (e.g., 25th, 75th). For example, I have a variable called "test
score" and I want to collapse/recode it into a variable that reflects low,
medium, and high (based on percentiles). I have searched but have not come
across any SPSS function that will compute a new variable. Any thoughts? Thanks



Please consider the environment before printing this e-mail.


________________________________
This e-mail, including any attachments, may be confidential, privileged or
otherwise legally protected. It is intended only for the addressee. If you
received this e-mail in error or from someone who was not authorized to send
it to you, do not disseminate, copy or otherwise use this e-mail or its
attachments. Please notify the sender immediately by reply e-mail and delete
the e-mail from your system.


------------------------------

Date:    Wed, 12 May 2010 14:53:32 -0400
From:    Gene Maguin <[hidden email]>
Subject: Re: computing new variable based on percentiles

Fernandez, look at the RANK command and the NTILES subcommand. I haven't
used NTILES, but it seems to me that it would do what you want.
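
A minimal sketch of that approach, with an assumed variable name "testscore" (RANK with the NTILES subcommand writes the group number into a new variable):

```
RANK VARIABLES=testscore (A)
  /NTILES(3) INTO score3
  /PRINT=NO
  /TIES=MEAN.
VALUE LABELS score3 1 'Low' 2 'Medium' 3 'High'.
```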

Gene Maguin


>>Good afternoon. I hope my question makes sense. I was told that there is a
function in SPSS that will compute a new variable based on designated
percentiles (e.g., 25th, 75th). For example, I have a variable called "test
score" and I want to collapse/recode it into a variable that reflects low,
medium, and high (based on percentiles). I have searched but have not come
across any SPSS function that will compute a new variable. Any thoughts?
Thanks

------------------------------

Date:    Wed, 12 May 2010 15:10:59 -0400
From:    James Paul <[hidden email]>
Subject: Help on correlations for multiple observation

Hi all,

Appreciate your help in solving this problem using SPSS.

I have a data set that looks similar to this table. Each firm can adopt 1
strategy. There are 100 such firms. Tobin's Q for a firm changes every year.
On average there are about 5 observations of Tobin's Q for a firm.

Firm Strategy Year Tobin's Q
ABC 1 2008 1.2
ABC 1 2007 1.1
Abc 1 2006 1.3
def 2 2008 1.1
def 2 2007 1.0
deF 2 2006 1.2

My goal is to find any correlations between Strategy and Tobin's Q. If I run
correlations on the data set as is (N = 5*100), it will project statistical
significance that is actually not there.
Supposedly there is some way to define clusters and run the procedure with
N = 100. How do I do this in SPSS (PASW 18)?

Another person who uses Stata uses the command
xi:reg Y x1 x2 x3, cluster(firm)

------------------------------

Date:    Wed, 12 May 2010 15:12:55 -0400
From:    "FernandezLanier, Adriana (DCJS)"
         <[hidden email]>
Subject: Re: computing new variable based on percentiles

Yes, that did the job. Thanks! Adriana


-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of Gene Maguin
Sent: Wednesday, May 12, 2010 2:54 PM
To: [hidden email]
Subject: Re: computing new variable based on percentiles

Fernandez, look at the RANK command and the NTILES subcommand. I haven't
used NTILES, but it seems to me that it would do what you want.

Gene Maguin


>>Good afternoon. I hope my question makes sense. I was told that there is a
function in SPSS that will compute a new variable based on designated
percentiles (e.g., 25th, 75th). For example, I have a variable called "test
score" and I want to collapse/recode it into a variable that reflects low,
medium, and high (based on percentiles). I have searched but have not come
across any SPSS function that will compute a new variable. Any thoughts?
Thanks

=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD



------------------------------

Date:    Wed, 12 May 2010 14:23:08 -0500
From:    "Swank, Paul R" <[hidden email]>
Subject: Re: Help on correlations for multiple observation

It depends on whether some firms use different strategies across the years. In that case, you would have a multilevel model and should use MIXED.
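
A sketch of such a mixed model in syntax form, assuming the variable names from the example table (TobinsQ, Strategy, Firm) and a random intercept for each firm:

```
MIXED TobinsQ BY Strategy
  /FIXED=Strategy
  /RANDOM=INTERCEPT | SUBJECT(Firm)
  /PRINT=SOLUTION TESTCOV.
```

With only the intercept varying by firm, the Strategy effect is evaluated against between-firm rather than pooled variation.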

Dr. Paul R. Swank,
Professor and Director of Research
Children's Learning Institute
University of Texas Health Science Center-Houston


-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of James Paul
Sent: Wednesday, May 12, 2010 2:11 PM
To: [hidden email]
Subject: Help on correlations for multiple observation

Hi all,

Appreciate your help in solving this problem using SPSS

I have a data set that looks similar to this table. Each firm can adopt 1
strategy. There are 100 such firms. Tobin's Q for a firm changes every year.
On average there are about 5 observations of Tobin's Q for a firm.

Firm Strategy Year Tobin's Q
ABC 1 2008 1.2
ABC 1 2007 1.1
Abc 1 2006 1.3
def 2 2008 1.1
def 2 2007 1.0
deF 2 2006 1.2

My goal is to find any correlations between Strategy and Tobin's Q. If I run
correlations in the data set as is (N=5*100), it will project statistical
significance that is actually not there.
Supposedly there is some way to define clusters and run the procedure with
N=100. How do I do this in SPSS (PASW 18)

Another person who uses Stata uses the command
xi:reg Y x1 x2 x3, cluster(firm)


------------------------------

Date:    Wed, 12 May 2010 12:31:30 -0700
From:    Bruce Weaver <[hidden email]>
Subject: Re: computing new variable based on percentiles

FernandezLanier, Adriana (DCJS) wrote:

>
> Good afternoon. I hope my question makes sense. I was told that there is a
> function in SPSS that will compute a new variable based on designated
> percentiles (e.g., 25th, 75th). For example, I have a variable called
> "test score" and I want to collapse/recode it into a variable that
> reflects low, medium, and high (based on percentiles). I have searched but
> have not come across any SPSS function that will compute a new variable.
> Any thoughts? Thanks
>
>

Converting to categories is often not a very good idea.  See Dave Streiner's
very readable article, for example.  It addresses dichotomization
specifically, but his points apply to categorization generally.

http://server03.cpa-apc.org:8080/Publications/archives/CJP/2002/april/researchMethodsDichotomizingData.asp



-----
--
Bruce Weaver
[hidden email]
http://sites.google.com/a/lakeheadu.ca/bweaver/
"When all else fails, RTFM."

NOTE:  My Hotmail account is not monitored regularly.
To send me an e-mail, please use the address shown above.
--
View this message in context: http://old.nabble.com/computing-new-variable-based-on-percentiles-tp28539667p28540174.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.

------------------------------

Date:    Wed, 12 May 2010 11:26:57 -0700
From:    Mark Vande Kamp <[hidden email]>
Subject: Converting C# UTC Date/Time value

I have some data where some time stamps were logged in a date/time format that I don't know how to translate. They are 19-digit binary values that are readily translated by a C# function.

An example: In C# code, DateTime.FromBinary(5245773030495241607) returns 5/6/2010 12:06:46 AM.

Can anyone help me with writing SPSS syntax that will translate the 19-digit values I have into an SPSS date/time format?

Thanks,

Mark

------------------------------

Date:    Wed, 12 May 2010 14:54:18 -0400
From:    Lee <[hidden email]>
Subject: Dichotomous Variables & Correlation

I have 2 dichotomous variables and a scale variable, which correlation is
the most appropriate?

Tks
L

------------------------------

Date:    Wed, 12 May 2010 13:30:00 -0700
From:    SR Millis <[hidden email]>
Subject: Re: Dichotomous Variables & Correlation

For the continuous and dichotomous variables: point-biserial correlation, which is a special case of Pearson's r.

For two dichotomous variables: phi correlation, which is another special case of Pearson's r.
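
Both are available through standard procedures; a sketch with assumed variable names dich1, dich2, and scale1:

```
* Point-biserial: ordinary Pearson r between the dichotomy and the scale.
CORRELATIONS /VARIABLES=dich1 scale1.

* Phi for the two dichotomies (numerically the same as Pearson r here).
CROSSTABS /TABLES=dich1 BY dich2 /STATISTICS=PHI.
```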
~~~~~~~~~~~
Scott R Millis, PhD, ABPP, CStat, CSci
Professor & Director of Research
Dept of Physical Medicine & Rehabilitation
Dept of Emergency Medicine
Wayne State University School of Medicine
261 Mack Blvd
Detroit, MI 48201
Email:  [hidden email]
Email:  [hidden email]
Tel: 313-993-8085
Fax: 313-966-7682


--- On Wed, 5/12/10, Lee <[hidden email]> wrote:

> From: Lee <[hidden email]>
> Subject: Dichotomous Variables & Correlation
> To: [hidden email]
> Date: Wednesday, May 12, 2010, 2:54 PM
> I have 2 dichotomous variables and a
> scale variable, which correlation is
> the most appropriate?
>
> Tks
> L
>

------------------------------

Date:    Wed, 12 May 2010 20:57:12 +0000
From:    Statisticsdoc Consulting <[hidden email]>
Subject: Re: Dichotomous Variables & Correlation

Lee,
Phi for dichotomous by dichotomous. Point-biserial for dichotomous by scale (unless your scale is badly skewed, or is actually an ordinal measure, in which case you would want to look into nonparametric alternatives like the U test).
Best,
Steve
------Original Message------
From: Lee
Sender: SPSSX(r) Discussion
To: [hidden email]
ReplyTo: Lee
Subject: Dichotomous Variables & Correlation
Sent: May 12, 2010 2:54 PM

I have 2 dichotomous variables and a scale variable, which correlation is
the most appropriate?

Tks
L



www.StatisticsDoc.com

------------------------------

Date:    Wed, 12 May 2010 16:12:41 -0500
From:    Rick Oliver <[hidden email]>
Subject: Re: Code not properly working on PASW DM book's example


Oops. Instead of:

COMPUTE #count=0.

try:

NUMERIC #count (f8).

I think the former was resetting #count to 0 for each case. You need to
declare the scratch variable for use in the subsequent COMPUTE
#count=#count+1, but you don't need to initialize it to 0, because scratch
variables are automatically initialized to 0 -- and in this case explicitly
setting it to 0 was causing the condition to never be met. (For testing
purposes, you might want to remove the random UNIFORM condition.)
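
With that change, the opening of the book's input program would read (only the second line differs; the rest is unchanged):

```
INPUT PROGRAM.
NUMERIC #count (F8).
- DATA LIST FIXED END=#eof
  /#yr 1 (A) #reg 3 (A) #person 25 (A).
- DO IF #eof OR #count = 1000.
-  END FILE.
- END IF.
```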



From:
Luca Meyer <[hidden email]>
To:
[hidden email]
Date:
05/11/2010 03:17 AM
Subject:
Code not properly working on PASW DM book's example
Sent by:
"SPSSX(r) Discussion" <[hidden email]>



I am practicing with PASW and have found non-working code in an INPUT
PROGRAM example for reading a nested file (pp. 47-48) of the "Programming
and Data Management for PASW(R) Statistics 18" book.

The code is the following:

INPUT PROGRAM.
COMPUTE #count=0.
- DATA LIST FIXED END=#eof
/#yr 1 (A) #reg 3(A) #person 25 (A).
- DO IF #eof OR #count = 1000.
- END FILE.
- END IF.
- DO IF #yr='Y'.
- REREAD.
- DATA LIST /Year 3-6.
- LEAVE Year.
- ELSE IF #reg='R'.
- REREAD.
- DATA LIST / Region 5-15 (A).
- LEAVE Region.
- ELSE IF #person='P' AND UNIFORM(1000) < 500.
- REREAD.
- DATA LIST / SalesRep 7-17 (A) Sales 20-23.
- END CASE.
- COMPUTE #count=#count+1.
- END IF.
END INPUT PROGRAM.

BEGIN DATA
Y 2002
  R Chicago
      Jones        900  P
      Gregory      400  P
  R Baton Rouge
      Rodriguez    300  P
      Smith        333  P
      Grau         100  P
END DATA.

The claim is that the reading will stop at the end of file or when 1000
records are read, but I believe the #count is updated within another DO IF -
END IF structure, so the 1,000-record limit does not apply.

I have tested my assumption by setting #count=3 and removing the random
selection, but the software reads all 5 cases. I have tried inserting
"AND #count<=3" into every DO IF and ELSE IF statement within the second
DO IF ... END IF structure, but I always get 5 cases.

How would you correct the syntax to have only 3 cases imported?

Thanks,
Luca

Luca Meyer
www.lucameyer.com
PASW Statistics v. 18.0.2 (2-apr-2010)
R version 2.9.2 (2009-08-24)
Mac OS X 10.6.3 (10D573) - kernel Darwin 10.3.0









------------------------------

Date:    Wed, 12 May 2010 16:27:41 -0500
From:    "Marks, Jim" <[hidden email]>
Subject: Re: Overlapping Periods

Gene:

The problem set includes overlapping and non-overlapping intervals:
case 30006 goes from three records to one, while case 30057 has three
non-overlapping intervals.

Asil:

Here is a solution.

** Sample data.
NEW FILE.
DATA LIST FREE/StudyID (f8.0) InDate (ADATE10) OutDate (ADATE10).
BEGIN DATA
30006           09/20/2002       04/30/2007
30006           12/29/2006      05/31/2012
30006           11/25/2009      05/31/2012
30014           04/16/1975       03/03/1988
30014           04/21/1980       05/28/1982
30014           02/16/2001       11/24/2001
30022           02/05/1992        07/22/1995
30057           02/01/1988        11/20/1988
30057           12/15/1989      10/17/1990
30057           01/23/1992       10/18/1993
END DATA.
DATASET NAME intervals WINDOW = FRONT.

** optional if live data is presorted.
SORT CASES BY studyID indate (A).

** Identify overlapping periods.
COMPUTE sequence  = 1.
IF studyid = LAG(studyid) AND indate LE LAG(outdate) sequence =
LAG(sequence) + 1.


** create a variable to identify the start of a new overlapping period.
COMPUTE seq_start = sequence = 1.

** create a variable to number each new set of overlapping periods--
** needed to aggregate data.
FILTER BY seq_start.
RANK indate BY studyID / RANK INTO intervals.
FILTER  OFF.

** transfer the number for each new overlapping period
** down to each case in the overlapping period.
NUMERIC   #seq  (F8.0).
DO IF seq_start.
.  COMPUTE #seq = intervals.
ELSE .
.         COMPUTE intervals = #seq.
END IF .
EXECUTE .

DATASET DECLARE final_data.
AGGREGATE OUTFILE = final_data /BREAK = studyID intervals
  /indate = MIN(indate) /outdate = MAX(outdate).
DATASET ACTIVATE final_data.
LIST.

Jim Marks
Director, Market Research
x1616


-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of
Gene Maguin
Sent: Wednesday, May 12, 2010 1:33 PM
To: [hidden email]
Subject: Re: Overlapping Periods

Asil,

I might be misunderstanding, but this looks like a straight forward
application of Aggregate where you select the min value of Indate and
the
max value of OutDate.

Gene Maguin


>>I would like to get the earliest in and latest out dates among a set
of
overlapping periods using syntax.
An example is provided below.
Could anyone help me with that?

Thanks,
Asil



EXAMPLE:
I would like to go from this set

StudyID InDate  OutDate
30006           9/20/2002       4/30/2007
30006           12/29/2006      5/31/2012
30006           11/25/2009      5/31/2012
30014           4/16/1975       3/3/1988
30014           4/21/1980       5/28/1982
30014           2/16/2001       11/24/2001
30022           2/5/1992        7/22/1995
30057           2/1/1988        11/20/1988
30057           12/15/1989      10/17/1990
30057           1/23/1992       10/18/1993

To this set
StudyID InDate  OutDate
30006           9/20/2002       5/31/2012
30014           4/16/1975       3/3/1988
30014           2/16/2001       11/24/2001
30022           2/5/1992        7/22/1995
30057           2/1/1988        11/20/1988
30057           12/15/1989      10/17/1990
30057           1/23/1992       10/18/1993




------------------------------

Date:    Wed, 12 May 2010 17:39:43 -0400
From:    Richard Ristow <[hidden email]>
Subject: Re: Converting C# UTC Date/Time value

At 02:26 PM 5/12/2010, Mark Vande Kamp wrote:

> I have some data where some time stamps were logged as 19-digit binary
> values that are readily translated by a C# function.
>
> An example: In C# code, DateTime.FromBinary(5245773030495241607) returns
> 5/6/2010 12:06:46 AM.
>
> Can anyone help me with writing SPSS syntax that will translate the
> 19-digit values I have into an SPSS date/time format?

It's a pretty fair guess that you have an epoch-and-offset date format:
that is, the number you have represents the time, in some unit, since a
chosen time in the past (the 'epoch').

(These are common now. SPSS uses seconds since midnight, October 14, 1582;
Excel uses days starting with 1 = 1 January 1900; Unix uses seconds since
midnight proleptic Coordinated Universal Time (UTC) of January 1, 1970.)

It looks like you have access to a system that will generate such time
stamps for 'right now', and convert them into readable dates and times.
If so, you should be able to find the time unit and the epoch: calculate
the difference in, say, seconds between two time-stamps, and see what the
numeric difference is between the binary values.

(The unit, whatever it is, is SHORT. The HP48 calculator's unit is 1/8192
seconds, and it counts from something like 0/00/0000; and ITS count is
about a ten-thousandth of the number you give.)

The unit is the important thing. You don't need the epoch per se; you just
need the timestamp for a convenient time not too far in the past. Take any
other timestamp, subtract, convert the result to seconds or whatever, and
add to your 'epoch' as represented in, say, SPSS format.

CAREFUL, though: SPSS cannot represent 19-digit numbers exactly. (SPSS
carries 53 bits, or marginally less than 16 digits.) You'll have to read
your numbers in as strings and break them apart. I'll guess that the epoch
was a long time ago and there are many leading digits that won't change
over the times you're interested in; you can strip those, read the rest as
a number, and proceed as above, having stripped the same digits from the
time-stamp representation of your 'epoch'.

Or, perhaps you can read your data in C# and convert your stamps into,
say, seconds since something reasonable.

------------------------------

Date:    Thu, 13 May 2010 09:11:25 +0930
From:    Kylie Lange <[hidden email]>
Subject: Re: Help on correlations for multiple observation

Hi James,

Take a look at these two articles:

Bland, MJ & Altman, DG. Calculating correlation coefficients with repeated
observations: Part 1 - correlation within subjects. BMJ 1995; 310: 446
(http://www.bmj.com/cgi/content/full/310/6977/446)

Bland, MJ & Altman, DG. Calculating correlation coefficients with repeated
observations: Part 2 - correlation between subjects. BMJ 1995; 310: 633
(http://www.bmj.com/cgi/content/full/310/6980/633)

Cheers,
Kylie.


Quoting James Paul <[hidden email]>:

> Hi all,
>
> Appreciate your help in solving this problem using SPSS
>
> I have a data set that looks similar to this table. Each firm can adopt 1
> strategy. There are 100 such firms. Tobin's Q for a firm changes every
> year.
> On average there are about 5 observations of Tobin's Q for a firm.
>
> Firm Strategy Year Tobin's Q
> ABC 1 2008 1.2
> ABC 1 2007 1.1
> Abc 1 2006 1.3
> def 2 2008 1.1
> def 2 2007 1.0
> deF 2 2006 1.2
>
> My goal is to find any correlations between Strategy and Tobin's Q. If I
> run
> correlations in the data set as is (N=5*100), it will project statistical
> significance that is actually not there.
> Supposedly there is some way to define clusters and run the procedure with
> N=100. How do I do this in SPSS (PASW 18)
>
> Another person who uses Stata uses the command
> xi:reg Y x1 x2 x3, cluster(firm)
>

------------------------------

Date:    Wed, 12 May 2010 17:59:31 -0400
From:    Nancy Rusinak <[hidden email]>
Subject: Stats question - multiple regression/controls


I've run a multiple regression in SPSS, using various independent variables
to predict income.  I have included gender and race in my model as
controls.  Now, I need to predict salary and I am unclear on if I include
the partial slopes for gender and race in my prediction or exclude them from
the formula.  I was under the impression that putting them in the model
controlled for them and then they were excluded.  A co-worker says I need to
include them in my formula.  Please advise.  And thank you.  N.


------------------------------

Date:    Wed, 12 May 2010 15:36:04 -0700
From:    Mark Vande Kamp <[hidden email]>
Subject: Re: Converting C# UTC Date/Time value

Yes, it is an epoch-and-offset. According to Wikipedia, the epoch is 1 January, year 1, and the offset is 100 nanoseconds (a "tick"). However, I'm having trouble figuring out how to use that information.

Here's one false start -- 2000 years * 86400 sec/year * 10000000 ticks/sec = 1.728E+15.

But that's only a 16-digit number. Is the problem that my math is decimal and the 19-digit string is binary?

I don't write C# code and the example was generated for me by someone else. What a stupid problem.

Mark
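
Combining Richard's reference-pair suggestion with this unit, a possible SPSS sketch -- assuming the stamps arrive as 19-character strings in a (hypothetical) variable tickstr, and using the known pair 5245773030495241607 = 5/6/2010 12:06:46 AM as the reference. Only the last 15 digits are kept, which stays within SPSS's exact-integer range, cancels the constant high-order digits (including any flag bits), and covers stamps within roughly three years of the reference date:

```
* Low-order 15 digits of each stamp (exactly representable in a double).
COMPUTE #low = NUMBER(SUBSTR(tickstr, 5, 15), F15.0).
* The same 15 digits of the reference stamp 5245773030495241607.
COMPUTE #reflow = 773030495241607.
* Tick difference (100-ns units) -> seconds, added to the reference moment.
COMPUTE stamp = DATE.MDY(5, 6, 2010) + TIME.HMS(0, 6, 46)
              + (#low - #reflow) / 1E7.
FORMATS stamp (DATETIME22).
```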


On Wed, 12 May 2010, Richard Ristow wrote:

> At 02:26 PM 5/12/2010, Mark Vande Kamp wrote:
>
>       I have some data where some time stamps were logged as 19-digit
>       binary values that are readily translated by a C# function.
>
>       An example: In C# code, DateTime.FromBinary(5245773030495241607)
>       returns 5/6/2010 12:06:46 AM.
>
>       Can anyone help me with writing SPSS syntax that will translate
>       the 19-digit values I have into an SPSS date/time format?
>
>
> It's a pretty fair guess that you have an epoch-and-offset date format: that
> is, the number you have represents the time, in a some unit, since a chosen
> time in the past (the 'epoch').
>
> (These are common now. SPSS uses seconds since midnight, October 14, 1582;
> Excel uses days starting with 1=1 January 1900; Unix uses seconds since
> midnight proleptic Coordinated Universal Time (UTC) of January 1, 1970.)
>
> It looks like you have access to a system that will generate such time
> stamps for 'right now', and convert them into readable dates and times. If
> so, you should be able to find the time unit and the epoch: calculate
> difference in, say, seconds between two time-stamps, and see what the
> numeric difference is between the binary values.
>
> (The unit, whatever it is, is SHORT. The HP48 calculator's unit is 1/8192
> seconds, and it counts from something like 0/00/0000; and ITS count is about
> a ten-thousandth of that number you give.)
>
> The unit is the important thing. You don't need the epoch per se; you just
> need the timestamp for a convenient time not too far past. Take any other
> timestamp, subtract, convert the result to second or whatever, and add to
> your 'epoch' as represented in, say, SPSS format.
>
> CAREFUL, though: SPSS cannot represent 19-digit numbers exactly. (SPSS
> carries 53 bits, or marginally less than 16 digits.) You'll have to read in
> your numbers as strings and break them apart. I'll guess that the epoch was
> a long time ago and there are many leading digits that won't change over
> times you're interested in; you can strip those, read the rest as a number,
> and proceed as above, having stripped the same digits from the time-stamp
> representation of your 'epoch'.
>
> Or, perhaps you can read your data in C#, and convert your stamps into, say,
> seconds since something reasonable.

------------------------------

Date:    Wed, 12 May 2010 19:08:57 -0700
From:    Bruce Weaver <[hidden email]>
Subject: Re: Stats question - multiple regression/controls

Nancy Rusinak wrote:

>
> I've run a multiple regression in SPSS, using various independent variables
> to predict income.  I have included gender and race in my model as
> controls.  Now I need to predict salary, and I am unclear on whether to
> include the partial slopes for gender and race in my prediction or to
> exclude them from the formula.  I was under the impression that putting
> them in the model controlled for them and that they were then excluded.
> A co-worker says I need to include them in my formula.  Please advise.
> And thank you.  N.
>
>


Is this a wind-up?  Any variable you want to control for has to be in the
model.
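To make the point concrete, here is a toy sketch (mine, not from the thread; all coefficient values are invented for illustration): once gender and race are in the model, their estimated partial slopes are part of the fitted prediction equation, so any prediction must use them.

```python
# A toy illustration: "controlling for" gender and race means their
# coefficients are IN the fitted model, so they stay in the prediction
# formula. All numbers below are made up.
b0, b_educ, b_gender, b_race = 20000.0, 3000.0, -2500.0, -1800.0

def predict_income(educ_years, gender, race):
    """Predict from the full model, controls included."""
    return b0 + b_educ * educ_years + b_gender * gender + b_race * race
```

Holding the controls at reference values (e.g., gender = 0, race = 0) is a legitimate way to report an "adjusted" prediction, but that is choosing values for the control terms, not deleting them from the formula.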



--
Bruce Weaver
[hidden email]
http://sites.google.com/a/lakeheadu.ca/bweaver/
"When all else fails, RTFM."

NOTE:  My Hotmail account is not monitored regularly.
To send me an e-mail, please use the address shown above.
--
View this message in context: http://old.nabble.com/Stats-question---multiple-regression-controls-tp28542431p28542940.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.

------------------------------

Date:    Wed, 12 May 2010 23:59:03 -0400
From:    Richard Ristow <[hidden email]>
Subject: Re: Converting C# UTC Date/Time value

At 06:36 PM 5/12/2010, Mark Vande Kamp wrote:

> Yes, it is an epoch-and-offset. According to Wikipedia, the epoch is
> 1 January, year 1, and the offset is 100 nanoseconds (a "tick").

Bully for me then (buff, buff) for having essentially got there with just
one number and my little HP48 calculator. 8-)

> However, I'm having trouble figuring out how to use that information.
> Here's one false start --
> 2000 years * 86400 sec/year * 10,000,000 ticks/sec = 1.728E+15.

Part of your problem is the 86,400 seconds; there are 86,400 seconds in a
DAY. A year is about 3.15E7 seconds; so, for 2010 years:

2010 years * 3.15E7 sec/year * 10,000,000 ticks/sec = 6.343E17 ticks

But that's still smaller than the number you posted, by a factor of 8.3.
I don't understand it -- does anybody see what I'm missing?

But calculation does seem to show that a 'tick' is 100 nanoseconds, so
what SHOULD work is to take any epoch you please, for example

5/6/2010 12:06:46 AM = 5245773030495241607 ticks

Store that epoch as an integral number of ticks, AND as an SPSS date-time
value (not tested):

COMPUTE  EpochTicks = 5245773030495241607        /* won't work */.

NUMERIC  My_Epoch (DATETIME20).

COMPUTE  My_Epoch = NUMBER('5/6/2010',ADATE10)
                  + NUMBER('12:06:46',TIME8)
                  - TIME.HMS(12).

(That '-TIME.HMS(12)' is the correction needed, IN THIS INSTANCE, to
convert your AM time to 24-hour time.)

Then, if you have another time measured in ticks,

COMPUTE SecondsDiff = (NowTicks - EpochTicks)/1E7   /* won't work */.
NUMERIC NowTime (DATETIME20).
COMPUTE NowTime = My_Epoch + SecondsDiff.

The calculations marked "won't work" fail because they require more
precision than SPSS provides.

Briefly, you're going to have to drop, say, six trailing digits from your
'ticks' counts (assuming you have them as decimal strings), and, say, two
leading digits.

All computations WILL work with values truncated that way, once converted
to SPSS numbers.

Sorry -- that isn't complete, but it's too late at night.

-Best of luck,
 Richard
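For reference, the epoch-and-offset conversion Richard sketches can be written out and checked in Python (not SPSS), using exact integer arithmetic so that no digit-stripping is needed. The .NET conventions assumed here are an epoch of 1 January, year 1 (proleptic Gregorian) and 100-nanosecond ticks. The arithmetic also suggests an answer to the factor-of-8.3 puzzle: the posted value 5245773030495241607 looks like the output of .NET's DateTime.ToBinary(), which sets a Kind flag in bit 62; subtracting 2^62 leaves roughly 6.34E17 ticks, matching the hand estimate.

```python
# A Python sketch (not SPSS) of the epoch-and-offset conversion, in exact
# integer arithmetic. Assumes .NET conventions: epoch 1 January, year 1
# (proleptic Gregorian); one tick = 100 nanoseconds.
from datetime import datetime, timedelta

DOTNET_EPOCH = datetime(1, 1, 1)
TICKS_PER_SECOND = 10_000_000  # 100-ns ticks

def datetime_to_ticks(dt):
    """Whole-second tick count for dt under the assumed conventions."""
    delta = dt - DOTNET_EPOCH
    return (delta.days * 86400 + delta.seconds) * TICKS_PER_SECOND

def ticks_to_datetime(ticks):
    """Inverse conversion, truncated to whole seconds."""
    return DOTNET_EPOCH + timedelta(seconds=ticks // TICKS_PER_SECOND)

# 5/6/2010 12:06:46 AM (i.e., 00:06:46) comes out near 6.34E17 ticks --
# about 8.3 times smaller than the value quoted in the thread.
t = datetime_to_ticks(datetime(2010, 5, 6, 0, 6, 46))
```

Subtracting 2**62 from 5245773030495241607 gives a tick count within one second of t, so the posted number appears to be a ToBinary() encoding (raw ticks plus a UTC flag bit) rather than a plain Ticks value, which would account for the discrepancy Richard notes.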

------------------------------

End of SPSSX-L Digest - 11 May 2010 to 12 May 2010 (#2010-135)
**************************************************************
