Contents
1. Introduction 8
1.1. Background of the Study 8
1.2. Questionnaire Development 9
1.3. Survey Topics 18
1.3.1. School Practices and Programs 18
1.3.2. Parent and Community Involvement at School 18
1.3.3. School Security Staff 18
1.3.4. School Mental Health Services 19
1.3.5. Staff Training 19
1.3.6. Limitations on Crime Prevention 19
1.3.7. Frequency of Crime and Violence at School 19
1.3.8. Number of Incidents 19
1.3.9. Disciplinary Problems and Actions 20
1.3.10. School Characteristics 20
2. Sample Design and Weighting 21
2.1. Sampling Frame 21
2.2. Sample Design 21
2.3. Sample Size 22
2.4. Stratification, Sample Selection, and Final Sample 22
2.5. Weighting 26
3. Data Collection Methods and Response Rates 28
3.1. Data Collection Activities 28
3.2. Interviewer Training 30
3.2.1. Training on Basic Interviewer Skills 30
3.2.2. Training on Questionnaire Follow-up 30
3.2.3. Training on Refusal Conversion 31
3.2.4. Training on Data Retrieval 31
3.3. Data Retrieval 31
3.4. Efforts to Increase Response Rates 32
3.5. Unit Response Rate 33
3.6. Analysis of Unit Nonresponse Bias 35
3.7. Item Response Rates 36
3.8. Analysis of Item Nonresponse Bias 36
3.9. Nonsampling Error 37
4. Data Preparation 38
4.1. Analysis of Disclosure Risk 38
4.2. Editing Specifications 38
4.2.1. Range Specifications 38
4.2.2. Consistency Checks (Logic Edits) 39
4.3. Review and Coding of Text Items 39
4.4. Imputation 40
4.4.1. Imputation Methods 40
4.4.2. Imputation Order 41
4.4.3. Imputation Flags 41
5. Guide to the Public-Use Data File and Codebook 42
5.1. Content and Organization of the Data File 42
5.2. Public-Use Data File 44
5.3. Unique School Identifier 44
5.4. Questionnaire Item Variables 45
5.5. Variables Recoded for Public-Use File to Preserve Confidentiality 45
5.6. Composite Variables 46
5.7. Sampling Frame Variables 50
5.8. Weighting and Variance Estimation Variables 51
5.9. Imputation Flag Variables 51
6. Applying the Weight and Computing Standard Errors 52
6.1. Applying the Weight 52
6.2. Computing Standard Errors 52
7. Data Considerations and Anomalies 56
7.1. Law Enforcement Officers: Items 11 (C0610) Through 18b (C0242) 56
7.2. Number of Incidents: Subitems 26a_1 (C0310) Through 26l_2 (C0364) 56
7.3. Use of Disciplinary Actions: Subitems 34a_1 (C0390) Through 34o_2 (C0456) 56
7.4. Number of Students Involved in Recorded Offenses of Use/Possession of a Firearm/Explosive Device: Subitem 35a_1 (C0458) 56
7.5. Disciplinary Actions Taken: Subitems 35a_1 (C0458) Through 35e_5 (C0506) 57
7.6. Total Removals and Transfers: Subitems 36a (C0518) and 36b (C0520) 57
7.7. Classroom Changes: Item 40 (C0538) 57
7.8. Average Daily Attendance: Item 44 (C0568) 57
7.9. Outliers in Count Variables 58
8. References 59
Appendix A: 2015-16 School Survey on Crime and Safety Questionnaire 60
Appendix B: List of Variables and Record Layout of the Fixed-Format ASCII File for the Public-Use Data 85
Appendix C: List of Variables in the Restricted-Use Data File and Not in the Public-Use Data File 100
Appendix D: 2015-16 School Survey on Crime and Safety Public-Use Codebook 106
Appendix E: Advance Letter to Principals 273
Appendix F: Principal Cover Letter 275
Appendix G: Chief State School Officer Letter 277
Appendix H: Superintendent Letter 279
Appendix I: Interviewer Self-Study Guide 281
I. INTRODUCTION 284
Purpose of the School Survey on Crime & Safety (SSOCS) 284
Survey Design and Sample Size 284
SSOCS Telephone Operations 284
II. CONCEPTS 285
Challenges Collecting Data from Schools 285
Special Permission Districts 285
Late Mail Returns (LMRs) 286
III. NO CHILD LEFT BEHIND AND EVERY STUDENT SUCCEEDS ACT 286
SSOCS, NCLB, and ESSA 286
IV. REFUSAL AVERSION AND CONVERSION 287
Aversion vs. Conversion 287
Keys to Success 288
V. MATERIALS 291
VI. FREQUENTLY ASKED QUESTIONS & REFUSAL RESPONSES 292
Appendix J: Reminder and Nonresponse Follow-Up Operation Interviewer Self-Study Guide 298
I. INTRODUCTION 301
Purpose of the School Survey on Crime & Safety (SSOCS) 301
Survey Design and Sample Size 301
SSOCS Telephone Operations 301
II. CONCEPTS 303
Challenges Collecting Data from Schools 303
Special Permission Districts 303
Late Mail Returns (LMRs) 303
III. NO CHILD LEFT BEHIND AND EVERY STUDENT SUCCEEDS ACT 304
SSOCS, NCLB, and ESSA 304
Confidentiality 304
Benefits of Participating 305
IV. REFUSAL AVERSION AND CONVERSION 305
Aversion vs. Conversion 305
Keys to Success 305
V. COMPLETING THE SSOCS-26 308
Materials 308
Form Overview 309
Call Guidelines 311
Making the Call 311
VI. THE SSOCS-1 INTERVIEW COMPLETED VIA THE TELEPHONE 314
VII. SETTING OUTCOME CODES 314
VIII. CALL OUTCOMES 316
IX. FREQUENTLY ASKED QUESTIONS & REFUSAL RESPONSES 318
X. FINAL REVIEW EXERCISES 324
Appendix K: Failed Edit Follow-Up Operation Interviewer Self-Study Guide 328
Introduction to Failed Edit Follow-up 330
Completing the SSOCS-27 330
A. Materials 330
B. Form Overview 330
C. Call Guidelines 331
D. Before You Call 332
E. Completing the Call Log 332
F. Making the Call 333
G. Understanding the List of Items 334
Conducting the SSOCS-1 Interview 335
Call Outcome Codes 337
Final Review Exercise 338
Appendix L: Reminder E-mails to Principals 341
Appendix M: Detailed Analysis of Unit Nonresponse Bias 349
Appendix N: Base-Weighted Item Response Rates 369
Appendix O: Detailed Analysis of Item Nonresponse Bias 377
Appendix P: Detailed Editing Procedures, by Item 385
Appendix Q: Detailed Imputation Procedures, by Item 408
Table 1. Unweighted and weighted unit response rates, by selected school characteristics: School year 2015-16 25
Table 2. Characteristics of the SSOCS:2016 final analysis weight (FINALWGT) 27
Table 3. Schedule of data collection activities: SSOCS:2016 29
Table 4. Number of public schools, by interview status: SSOCS:2016 34
Table 5. Created text item for public-use file: SSOCS:2016 39
Table 6. Summary of weighting and sample variance estimation variables: SSOCS:2000 to SSOCS:2016 53
Table B-1. Variable list, SSOCS:2016 86
Table C-1. SSOCS:2016 variables in the restricted-use file that were omitted from the public-use file 101
Table M-1. Comparison of sample and target population, by school characteristics, SSOCS:2016 352
Table M-2. Response rates by school characteristics: SSOCS:2016 354
Table M-3. Summary of chi-square test of independence between school characteristics and 76 key survey variables: SSOCS:2016 355
Table M-4. Summary of frame characteristics for which key survey variable distributions differed significantly: SSOCS:2016 355
Table M-5. Comparison of respondents and nonrespondents, by school characteristics: SSOCS:2016 357
Table M-6. Comparison of odds ratios, by school characteristics: SSOCS:2016 359
Table M-7. Comparison of key estimates for low-propensity quintile and balance of interviewed sample 361
Table M-8. Nonresponse adjustment cells, weighted and unweighted response rates of cells, and number of respondents: SSOCS:2016 365
Table M-9. Summary of unit nonresponse bias before and after noninterview adjustment 366
Table M-10. Effects of nonresponse adjustment on bias reduction in frame characteristics 366
Table N-1. Detailed base-weighted item response rates: School year 2015-16 370
Table O-1. Item details for items with response rates less than 85 percent: SSOCS:2016 379
Table O-2. Comparison of original and extreme imputed value item mean estimates for items with low and high extreme imputed value estimates 379
Table O-3. Comparison of item respondents and nonrespondents for the variable number of attacks with a weapon (C0326): SSOCS:2016 381
Table O-4. Comparison of item respondents and nonrespondents for the variable number of attacks without a weapon (C0330): SSOCS:2016 383