The goal of this study was to ask teens aged 13 to 16 key questions about their sexual health, sexual activity, and the sexual pressures facing teens today. Parents were asked a similar but shorter set of questions. We asked parents how they thought teens today view sex, the sexual pressures teens face, and about the flow of communication between parents and teens.
Princeton Data Source, LLC, a subsidiary of Princeton Survey Research Associates located in Fredericksburg, Virginia, conducted interviews with parents and teenagers aged 13 to 16 during the period of September 4 through November 7. Details on the design, execution, and analysis of the survey are discussed below.

Design and Data Collection Procedures

Sample Design

The goal of this study was to conduct a nationwide survey of young teenagers aged 13 to 16 on issues related to sexual health and activity.
A companion survey was conducted among parents, in part to acquaint parents with the survey topic so they could give informed consent for their teen to participate, and in part to provide a point of comparison against which to view teen responses. The sample was designed to be generalizable to the population of young teens in the continental U.S. The PSRAI Demographic Tracking Survey is a short, 10-minute, nationally representative survey that asks about household composition (including the number of children), the age and sex of adult household members, the race and ethnicity of the respondent, the religion of the respondent, and total household income.
This method guarantees coverage of every assigned phone number, regardless of whether that number is directory-listed, purposely unlisted, or too new to be listed. After selection, the numbers are compared against business directories, and matching numbers are purged. The questionnaire was pretested with a small number of parent and young teen respondents.
Pretest interviews were conducted by experienced female interviewers who had worked on similar types of studies. These interviewers are particularly well suited to judge the quality of the answers given and the degree to which respondents understood the questions. After the pretest, a couple of questions were added to the instrument and a few were deleted. Additionally, some final changes were made to question wording and order based on the monitored pretest interviews.
All interviews were conducted using a fully programmed CATI (computer-assisted telephone interviewing) instrument.

Contact Procedures

Interviews were conducted during the period September 4 through November 7. Only experienced female interviewers were used for this study. Interviews were first conducted with a parent at the sampled household. After completing the parent interview, interviewers asked for permission to speak with either the oldest or the youngest child in the household aged 13 to 16 to conduct a survey about the same issues.
After obtaining consent from the parent, interviews were conducted with the teen respondent. If the teen was not available, interviewers arranged a time to call back when the teen was likely to be at home. As many as 10 attempts were made to contact a parent at every sampled telephone number. The sample was released for interviewing in replicates, which are representative subsamples of the larger sample. Using replicates to control the release of the sample ensures that complete call procedures are followed for the entire sample and that the geographic distribution of the numbers called is appropriate.
Calls were staggered over different times of the day and days of the week to maximize the chance of making contact with potential respondents. Each household received at least one daytime call in an attempt to find someone at home.
Questionnaire Monitoring

During the first few weeks of the project, both the management of PDS and project staff at PSRAI listened to tapes of the interviews to make sure that the instrument was working as designed and to hear how teens were responding to the questions. In addition to the regular daily monitoring of interviewers by PDS supervisors, the tapes provided another layer of interviewer quality control.
Project staff at PSRAI continued to listen to tapes of interviews of both sexually active and inactive teens throughout the field period in an effort to gauge how comfortable teens were talking about their experiences and viewpoints. After listening to interviews with approximately 25 sexually active teens and 25 teens who were not sexually active, PSRAI can say with confidence that teens were open and engaged in telling us about their experiences and viewpoints.
While some teens were understandably a bit shy, they did not seem evasive or deceptive in their responses. When teens hesitated to respond to a question, the interviewer reminded them that they could skip any question that made them feel uncomfortable or that they did not want to answer.
These young teenagers appeared to have no difficulty telling interviewers if they did not want to answer a question. At the same time, most teens completed the entire questionnaire, usually in an open, frank, and matter-of-fact manner.

Weighting and Analysis

Weighting is generally used in survey analysis to compensate for patterns of nonresponse that might bias results.
Weighting was accomplished using Sample Balancing, a special iterative sample weighting program that simultaneously balances the distributions of all variables using a statistical technique called the Deming Algorithm.
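The balancing step described above can be sketched as generic iterative proportional fitting, the technique behind the Deming algorithm: the weights are repeatedly rescaled so that each demographic variable's weighted distribution matches its population target. The sample rows, variables, and margin targets below are hypothetical illustrations, not the survey's actual parameters.

```python
# Minimal sketch of iterative sample balancing ("raking"). Each pass
# rescales the weights so one variable's weighted distribution matches
# its target; cycling over all variables converges toward weights that
# satisfy every margin simultaneously.

def rake(rows, margins, n_iter=100):
    """rows: dicts with a 'weight' key plus categorical variables.
    margins: {variable: {category: target_proportion}}."""
    total = sum(r["weight"] for r in rows)
    for _ in range(n_iter):
        for var, targets in margins.items():
            # Current weighted total of each category for this variable.
            current = {}
            for r in rows:
                current[r[var]] = current.get(r[var], 0.0) + r["weight"]
            # Scale each row so this variable hits its target distribution.
            for r in rows:
                r["weight"] *= targets[r[var]] * total / current[r[var]]
    return rows

# Hypothetical unweighted sample: boys and younger teens overrepresented.
sample = [
    {"sex": "M", "age": "13-14", "weight": 1.0},
    {"sex": "M", "age": "13-14", "weight": 1.0},
    {"sex": "M", "age": "15-16", "weight": 1.0},
    {"sex": "F", "age": "13-14", "weight": 1.0},
    {"sex": "F", "age": "15-16", "weight": 1.0},
]
margins = {
    "sex": {"M": 0.5, "F": 0.5},
    "age": {"13-14": 0.5, "15-16": 0.5},
}
rake(sample, margins)
print([round(r["weight"], 3) for r in sample])
```

After raking, the weighted sex and age distributions both match the 50/50 targets even though no single uniform rescaling could achieve that.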
Weights were trimmed to prevent individual interviews from having too much influence on the final results. The use of these weights in statistical analysis ensures that the demographic characteristics of the sample closely approximate the demographic characteristics of the national population. Table 1 compares weighted and unweighted sample distributions to population parameters.

Effects of Sample Design on Statistical Inference

Post-data collection statistical adjustments require analysis procedures that reflect departures from simple random sampling.
PSRAI calculates the effects of these design features so that an appropriate adjustment can be incorporated into tests of statistical significance when using these data. The so-called "design effect," or deff, represents the loss in statistical efficiency that results from systematic nonresponse.
The total sample design effect for this survey is 1. PSRAI calculates the composite design effect for a sample of size n, with each case having a weight wi, as:

    deff = n * Σ(wi²) / (Σ wi)²

In a wide range of situations, the adjusted standard error of a statistic should be calculated by multiplying the usual formula by the square root of the design effect (√deff).
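The composite design effect formula above can be sketched directly; the weights in this example are hypothetical, not the survey's actual weights.

```python
# Kish's composite design effect for a weighted sample:
#     deff = n * sum(wi^2) / (sum(wi))^2
# Equal weights give deff = 1; the more the weights vary, the further
# deff rises above 1 and the less efficient the sample becomes.

def design_effect(weights):
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

print(design_effect([1.0, 1.0, 1.0, 1.0]))       # equal weights -> 1.0
print(design_effect([0.6, 0.9, 1.0, 1.1, 1.4]))  # unequal weights -> above 1.0
```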
This means that in 95 out of every 100 samples drawn using the same methodology, estimated proportions based on the entire sample will be no more than 3 percentage points from their true values. It is important to remember that sampling fluctuations are only one possible source of error in a survey estimate. Other sources, such as respondent selection bias, questionnaire wording, and reporting inaccuracy, may contribute additional error of greater or lesser magnitude.
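The "95 out of every 100 samples" statement corresponds to a 95 percent confidence interval whose half-width is the design-adjusted margin of error. A sketch of that calculation follows; the sample size and design effect used here are hypothetical values for illustration, not this survey's actual figures.

```python
import math

def margin_of_error(p, n, deff, z=1.96):
    """Half-width of a 95% confidence interval for a proportion p,
    inflated by sqrt(deff) to account for the sample design."""
    return z * math.sqrt(deff) * math.sqrt(p * (1 - p) / n)

# Hypothetical: a 50% estimate from 1,000 interviews with deff = 1.3.
moe = margin_of_error(p=0.5, n=1000, deff=1.3)
print(round(100 * moe, 1))  # margin of error in percentage points
```

Note that a design effect of 1 recovers the familiar simple-random-sampling formula; any deff above 1 widens the interval.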
Response Rate

Table 2 reports the disposition of all sampled telephone numbers ever dialed from the original telephone number sample. The response rate estimates the fraction of all eligible respondents in the sample that were ultimately interviewed.

Analysis of Respondent Refusals

Surveys conducted on potentially sensitive topics such as sexual health and activity always raise the concern that respondents who participate differ in some systematic way from those who refuse to participate.
There are, however, a few small differences. Although parents of the youngest teens are a little more likely than parents of 15 to 16 year-olds to refuse their teens' participation, the difference is not large. There were 59 teens who declined to participate after parental permission was granted.