
Supporting Statement – Part B

DoD-wide Data Collection and Analysis for Department of Defense Qualitative Data Collection in Support of the Independent Review Commission on Sexual Assault Recommendations (OMB Control Number 0704-0644, expiration 1/31/2026)



Title of Collection: ETAC T2: Evaluation of The U.S. Marine Corps Pre-Command Sexual Assault Prevention and Response and Integrated Prevention Training



Expected Fielding Dates: 01 OCTOBER 2024 – 31 JULY 2026


  1. Description of the Activity

NORC at the University of Chicago (NORC) has been contracted by the Department of Defense, Office of Force Resiliency, Violence Prevention Cell (DoD VPC) and the Sexual Assault Prevention and Response Office (DoD SAPRO) to address IRC recommendations related to evaluating the impact of revised prevention training requirements (2.1c, 2.4, 3.2, 3.6, and 4.4c).

The goal of this study is to evaluate the effectiveness of the United States Marine Corps (USMC) Pre-Command Sexual Assault Prevention and Response and Integrated Prevention Training (Pre-Command SAPR IP Training) program in changing trainees’ beliefs about the importance of preventing sexual assault (SA) as a leader; knowledge, skills, attitudes, subjective norms, and perceived behavioral control related to SA prevention; and the trainees’ behavioral intentions and behavior to prevent SA as a leader. This evaluation will include 800 commanders and Sergeants Major and 40 training instructors and facilitators (USMC HQ staff and civilian employee subject matter experts) from four training session cohorts over the course of 2024, 2025, and 2026. There are three data collection components: (1) baseline and 6-month follow-up surveys, (2) brief training feedback forms collected immediately after training from instructors and trainees, and (3) qualitative interviews.


                       Baseline   Training         Follow-Up   Qualitative
                       Surveys    Feedback Forms   Surveys     Interviews
Training Cohorts       N~400      N~400            N~400       N~20
Comparison Cohorts     N~400      N/A              N~400       N/A
Training Instructors   N/A        N~40             N/A         N/A

TABLE 1. SAMPLE SIZE ESTIMATES FOR DATA COLLECTION

The overarching research question of this evaluation is whether the Pre-Command SAPR IP Training is effective in achieving its intended goals. The working hypothesis to be tested is that the training will result in improvements in short-term and intermediate outcomes (e.g., changes in behavioral, normative, and control beliefs about the importance of preventing SA as a leader; changes in attitudes, subjective norms, and perceived behavioral control of preventing SA; and changes in behavioral intentions and behavior to implement SA prevention programs) for Marine Corps leadership. There are four main process and outcome research questions:

  1. What is the impact of the Pre-Command SAPR IP Training on slated O5/O6 commanders and E9 Sergeants Major, relative to the comparison condition not receiving the training, on trainees’ knowledge about SA prevention and response, and beliefs about the importance of preventing and responding to SA as a leader?

  2. What is the impact of the Pre-Command SAPR IP Training on slated O5/O6 commanders and E9 Sergeants Major, relative to the comparison condition not receiving the training, on trainees’ attitudes, subjective norms, and perceived behavioral control related to SA prevention and response?

  3. What is the impact of the Pre-Command SAPR IP Training on slated O5/O6 commanders and E9 Sergeants Major, relative to the comparison condition not receiving the training, on trainees’ preparedness, behavioral intentions and (should command assignment durations allow) behavior to prevent and respond to SA as a leader?

  4. During the study period, to what extent was the Pre-Command SAPR IP Training implemented with fidelity and integrity to the standardized curriculum?

  2. Procedures for Information Collection

To answer the guiding research questions, this evaluation will employ multiple methods and phases, which are described below. The respondents will be O5 Lt Colonels/commanders, O6 Colonels/commanders, and E9 Sergeants Major slated for participation in the Commandant’s Cornerstone Training Program sessions in Fall 2024-Spring 2026, and the Pre-Command SAPR IP Training instructors and facilitators (USMC HQ staff and civilian employee subject matter experts).

All commander/Sergeant Major participants will complete a baseline survey prior to their time in Cornerstone. Immediately following the Pre-Command SAPR IP Training, trainees will complete trainee feedback forms, and training instructors will complete instructor feedback forms to assess program fidelity and immediate post-training outcomes. Roughly 5-6 months after the completion of the Pre-Command SAPR IP Training, commander/Sergeant Major participants will complete a follow-up survey. After the follow-up surveys, NORC will conduct qualitative interviews with a small sample of commander/Sergeant Major participants to gather additional, nuanced information about their impressions and experience of the Pre-Command SAPR IP Training. There will be two cohorts of participants: the Training Cohort will complete the baseline survey, attend Cornerstone, and then take the follow-up survey; the Comparison Cohort will complete both the baseline and follow-up surveys before attending Cornerstone. See Figure 1 for the evaluation timeline, outlining the timing of data collection methods.

FIGURE 1: TIMELINE FOR DATA COLLECTION

NORC expects to collect up to 800 surveys with USMC commanders and Sergeants Major who are slated for Cornerstone training across our baseline and follow-up surveys for the training and comparison cohorts. The target sample size at follow-up, after accounting for anticipated attrition, is 400 (see Figure 2 below).

FIGURE 2: QUASI-EXPERIMENTAL DESIGN

BASELINE AND FOLLOW-UP SURVEY DATA COLLECTION

Measures

Given that the study instrumentation is tailored to the unique training program and outcomes of interest, there are no validated metrics from past empirical research. As such, NORC developed new metrics for the survey instrument, guided by the well-tested Theory of Planned Behavior and published guidance on developing new metrics to evaluate its tenets.1 Survey items reflect the following constructs related to implementing SA prevention and response as a leader: demographic information; attitudes about their role; knowledge of information from the training; general, behavioral, and control beliefs; subjective norms; perceived behavioral control; preparedness; behavioral intentions; and behaviors related to SA prevention and response.

The baseline and follow-up surveys will each take no more than 15 minutes to complete.

Methods

NORC plans to collect up to 800 surveys over two years with USMC commanders and Sergeants Major who have or will participate in the Pre-Command SAPR IP Training across the baseline and follow-up surveys for the training and comparison cohorts. As seen in Figure 2 above, the target sample size at follow-up in a given year, after accounting for anticipated attrition, is 200 training and 200 comparison participants (n=400 total for the study).

The training cohorts (planned for Cornerstone offerings in fall 2024 and fall 2025) will be asked to complete an online baseline survey prior to their arrival at the Commandant’s Cornerstone Training Program. Participants in the training cohort will be invited to complete the online baseline survey via a unique link included in the Cornerstone Training Program Orientation and Registration email sent to their military email address. As a secondary measure to secure additional completed baseline surveys, NORC will provide a QR code to the online baseline questionnaire that will be shown to commanders and Sergeants Major before the USMC Pre-Command SAPR IP Training starts. Participants would need to enter their military email address to access the survey via the QR code (at which time a unique researcher ID will be generated).

The surveys will be programmed in Voxco and hosted by NORC and will be accessible via a mobile or laptop device with an internet connection. Respondents will navigate through and indicate their answer choices on the survey questionnaire. A question will also ask respondents to provide a personal email address that may be used to send the follow-up survey at a later date (email addresses will not be stored with responses to the survey). Participants may skip any question they do not wish to answer. The survey will take no more than 15 minutes to complete. After data collection is complete and the baseline and follow-up surveys are linked, NORC will remove all personally identifiable data (e.g., email addresses) from the analytic datasets to further ensure confidentiality. Only the unique researcher-generated participant IDs will remain.

The training cohorts will complete an online follow-up survey (link sent via email) approximately 5 to 6 months after the Pre-Command SAPR IP Training. Participants for the follow-up survey will be invited to complete the survey by receiving an email invitation to their military email address and/or their personal email address (if provided at baseline survey) with a unique link to the survey questionnaire. The follow-up survey for the training cohort will have the same specifications as the baseline survey (e.g., hosted by NORC, accessible via mobile and laptop computers, confidential using researcher-generated IDs, allowed to skip any questions they do not wish to answer, etc.).

The comparison cohorts (planned for Cornerstone offerings in spring 2025 and spring 2026) will complete an online baseline survey (link sent via email) around the same time as the training cohort, or approximately 6 months prior to their enrollment in the spring Commandant’s Cornerstone Training Program. Participants for the baseline survey will be invited to complete the survey by receiving an email invitation to their military email address with a link to the survey. The baseline survey for the comparison cohort will have the same specifications as the baseline survey for the training cohort (e.g., hosted by NORC, accessible via mobile and laptop computers, confidential using researcher-generated IDs, asking for a personal email address, allowed to skip any questions they do not wish to answer, etc.).

The comparison cohorts will be asked to complete an online follow-up survey 6 months after completing their baseline survey and immediately prior to their arrival at the Commandant’s Cornerstone Training Program. Participants in the comparison cohort will be invited to complete the online follow-up survey via a unique link included in the Cornerstone Training Program Orientation and Registration email sent to their military email address. The follow-up survey for the comparison cohort will have the same specifications as the baseline survey (e.g., hosted by NORC, accessible via mobile and laptop computers, confidential using researcher-generated IDs, allowed to skip any questions they do not wish to answer, etc.). As a secondary measure to secure additional completed follow-up surveys, NORC will provide a QR code to the online follow-up questionnaire that will be shown to commanders and Sergeants Major before the USMC Pre-Command SAPR IP Training starts. Participants would need to enter their military email address to access the survey via the QR code (at which time a unique researcher ID will be generated).

NORC will analyze aggregate data for differences in pre/post-training outcomes between the training and comparison cohorts.
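The cohort design above implies a pre/post by training/comparison contrast. As an illustrative sketch only (the document does not specify the exact estimator, and all values below are hypothetical placeholders, not study data), the basic difference-in-differences logic on cohort means looks like this:

```python
# Hedged sketch: training-group change minus comparison-group change.
# The outcome scale and the numeric inputs are hypothetical.

def diff_in_diff(train_pre: float, train_post: float,
                 comp_pre: float, comp_post: float) -> float:
    """Change in the training cohort net of the change in the
    comparison cohort over the same baseline-to-follow-up window."""
    return (train_post - train_pre) - (comp_post - comp_pre)

# Hypothetical cohort means on a 1-5 outcome scale.
effect = diff_in_diff(train_pre=3.1, train_post=3.8,
                      comp_pre=3.0, comp_post=3.2)
print(effect)
```

In practice such a contrast would be estimated within a regression model so that covariates (e.g., rank, years of service) can be included, as the nonresponse section later describes.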

QUALITATIVE INTERVIEWS

Measures

NORC will conduct qualitative interviews with commanders and Sergeants Major to collect in-depth feedback on their experience of the Pre-Command SAPR IP Training as it relates to constructs within the Theory of Planned Behavior and the outcomes of interest. Qualitative interviews are only planned for the training cohorts (not the comparison cohorts). NORC has created open-ended questions to document attitudes and experiences prior to the training, general attitudes about the Pre-Command SAPR IP Training, post-training intentions and behaviors, and suggested future modifications to the Pre-Command SAPR IP Training. The interviews will take 30 minutes.

Methods

The qualitative interviews will take place with commanders/Sergeants Major of the training cohort after the follow-up survey is completed. On follow-up surveys, respondents will be asked if they would like to participate in an interview with NORC researchers to share their personal impressions of the Pre-Command SAPR IP Training. Respondents who select “yes” will be asked to provide their personal and/or military email address for NORC to follow up. NORC will reassure survey respondents that their email address will not be linked to their survey or qualitative interview responses. NORC will program the survey data export to list those who volunteer for the interviews, along with their email addresses, separately from the remainder of the survey responses. NORC will email individuals who provided valid email addresses a summary of the purpose of the interview and schedule a 30-minute video conference, for which an internet connection will be required (e.g., NORC-hosted Zoom or Microsoft Teams platforms). NORC will conduct 16-20 interviews with commanders and Sergeants Major who have participated in the Pre-Command SAPR IP Training. Participants will be able to complete the interview via a mobile phone, tablet, or laptop computer. NORC staff will lead the interviews and keep detailed notes on the content of the interview. The questions will be presented verbally to the participant. NORC will code the interview notes and analyze aggregate responses.

TRAINING FEEDBACK FORMS

Measures

Training feedback forms will be collected from both training instructors and trainees to assess a mixture of process measures (i.e., program fidelity of the Pre-Command SAPR IP Training) and outcome measures (i.e., immediate post-training constructs measurement). Feedback forms will only be collected from the training cohorts (not the comparison cohorts).

The instructor feedback forms will provide insights to the fidelity of the Pre-Command SAPR IP Training implementation. While parts of the Pre-Command SAPR IP Training are scripted and the instructors are trained on delivery, other parts are unscripted, discussion-based scenarios, for which the instructors (and small group facilitators) are provided a guide with learning objectives and discussion points. The instructor feedback form questions probe for the respondents’ role in the Pre-Command SAPR IP Training; their level of preparedness to lead sessions; whether they were able to cover all the curriculum content and why/why not; their reflections on how well the training sessions went; perceptions of trainees’ level of participation and engagement; an assessment of their trainees’ understanding of the content; and open-ended feedback for training improvement. The feedback form requires 2-4 minutes to complete. The instructor feedback forms will be administered confidentially.

The trainee feedback forms will provide participants’ perspectives and insights into training relevance, applicability, effectiveness, satisfaction, and immediate post-training outcomes for key constructs related to the evaluation theory of change model. The trainee feedback form questions cover the respondent’s rank/position; the extent to which the training and materials are informative and useful; perceived barriers to using training content; an assessment of their pre/post knowledge and understanding of the content; their beliefs, preparedness, and behavioral intentions; content knowledge checks; and open-ended feedback for training improvement. This feedback form also requires 2-4 minutes to complete. The trainee feedback form will be administered anonymously.

Methods

Training feedback forms will be collected from as many instructors and trainees of the Pre-Command SAPR IP Training as possible from the fall 2024 and fall 2025 Cornerstone cohorts. The feedback forms will be available for completion immediately following the conclusion of the third small group discussion before individuals leave for the next session within Cornerstone. Instructors and trainees will be invited to complete the feedback form via a QR code displayed at the conclusion of the Pre-Command SAPR IP Training. Instructors and small group facilitators will prompt trainees to complete the feedback form and provide trainees with a few moments to complete the questionnaire prior to departing for the next Cornerstone session. The feedback forms will be programmed in Voxco and hosted by NORC and accessible on a mobile or laptop device with internet connection. Respondents will click through their answer choices and will be allowed to skip any question they do not wish to answer. NORC will code the responses and analyze the aggregate data.

  a. Statistical methodologies for stratification and sample selection.

The evaluation will take the form of a natural experiment, with all commanders and Sergeants Major attending the Pre-Command SAPR IP Training over the course of 2024-2026 eligible to be in the study; all fall Cornerstone cases are assigned to the training group and all spring Cornerstone cases to the comparison group. These commanders and Sergeants Major will be surveyed at baseline (on a rolling basis based on their Cornerstone entry date over the period October 2024 - Spring 2026) and once more via a 6-month follow-up survey. There is no stratification in the sample design; all eligible commanders and Sergeants Major over the study intake period are included in the sample.

b. Estimation procedures.

No sample estimation procedures are included in this program evaluation design. That is, all eligible cases during the study period will be entered into the study if the participants agree to participate in the surveys.

c. Degree of accuracy needed for the Purpose discussed in the justification.

To ensure the credibility of evaluation findings, NORC has conducted statistical power calculations to determine the probability of detecting a significant program effect at specific sample sizes. Statistical power provides an estimate of the probability of identifying a relationship through a significant statistical test when, in fact, such an impact exists.2 G*Power 3.1 was used to calculate power estimates. NORC prepared power analyses based on the projected number of trainees who participate in each Cornerstone training offered over the course of two years (roughly 200 trainees in the spring trainings and 200 trainees in the fall trainings, for 400 trainees annually and 800 trainees over two years).

While NORC will attempt to enroll 800 participants in the study, it is expected that roughly 50% of baseline participants will also complete the follow-up survey, accounting for losses over the 6-month period before administration of the follow-up survey. The power analysis was undertaken with the following assumptions: N = 800, 80% power, p < 0.05, and a 50% retention rate at 6-month follow-up. NORC expects to retain a minimum of n = 400 at the conclusion of the study, which will allow small effect sizes (f = 0.14) to be detected. Cohen’s f is the standardized effect size measure for ANOVA and similar models, where small is 0.10, medium is 0.25, and large is 0.40.3 NORC also ran an additional analysis to account for higher attrition rates (i.e., a worst-case scenario). These power calculations for this QED design, assuming only n = 200 at the conclusion of the study, with 80% power and p < 0.05, result in a minimum detectable effect size of f = 0.20 (small-to-medium effect). NORC is confident that a small-to-medium effect can be detected with the target sample size (n = 400) and the worst-case-scenario sample size (n = 200).
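The minimum detectable effect sizes above can be approximated without G*Power. The sketch below uses a normal approximation to the two-group (training vs. comparison) contrast, assuming equal group sizes and a two-sided test; because the document's figures come from G*Power 3.1's exact F-test routines, small numerical differences are expected.

```python
# Minimal sketch of the minimum-detectable-effect calculation, using only
# the Python standard library. Assumes two equal groups, so Cohen's f = d/2.
from math import sqrt
from statistics import NormalDist

def minimum_detectable_f(n_total: int, power: float = 0.80,
                         alpha: float = 0.05) -> float:
    """Smallest Cohen's f detectable with n_total participants split
    evenly across two groups, at the given power and two-sided alpha."""
    z = NormalDist()
    n_per_group = n_total / 2
    # Required standardized difference: z for power plus two-sided critical z.
    delta = z.inv_cdf(power) + z.inv_cdf(1 - alpha / 2)
    d = delta / sqrt(n_per_group / 2)   # Cohen's d for a two-sample test
    return d / 2                        # with k = 2 groups, f = d / 2

print(round(minimum_detectable_f(400), 2))  # target sample: f ~ 0.14
print(round(minimum_detectable_f(200), 2))  # worst case:    f ~ 0.20
```

The approximation reproduces the document's stated thresholds (f = 0.14 at n = 400 and f = 0.20 at n = 200) to two decimal places.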

d. Unusual problems requiring specialized sampling procedures.

NORC does not anticipate a need for specialized sampling procedures given the study design. All eligible cases during the study period will be entered into the sample.

e.  Use of periodic or cyclical data collections to reduce respondent burden.

Two data points from each participant are necessary to identify change over time. Surveys will be administered at baseline and 6 months later. There will be two full years of data collection, with each year consisting of a baseline survey and a follow-up survey administration period (see Figures 1 and 2). Without a follow-up data collection, NORC would be unable to assess the outcomes associated with the Pre-Command SAPR IP Training program. In other words, given the need to identify change over time, two surveys (baseline/follow-up) are the minimum number possible.

  3. Maximization of Response Rates, Non-Response, and Reliability

First, survey participation in the military is generally low, as evidenced by response rates below 15% for the WGRA surveys; USMC response to the most recent WGRA is even lower, at 8%.4 Low response rates (because those who respond to the survey may be systematically different on key variables from those who did not respond5) can lead to erroneous conclusions and limited generalizability of findings.6 Incentives are often used to encourage and increase survey participation; however, USMC has indicated that incentives would not be approved for this evaluation, since the eligible population will be completing surveys and interviews while on active duty. Addressing low response rates proactively will help ensure that NORC obtains program evaluation data that are trustworthy and useful in showing whether the Pre-Command SAPR IP Training is having the intended impact. The surveys were designed to be brief (less than 15 minutes), and survey links will be included in Cornerstone Orientation and Registration materials sent by USMC staff. In addition, the onsite QR code option as a backup measure for one survey (at baseline for the training group and at follow-up for the comparison group) will encourage survey participation. Additionally, NORC, in partnership with USMC, will work to encourage survey participation through reminder emails (to a military address and a personal email address for those who voluntarily provide that information) and digital communication encouragement from USMC/Marine Corps University (MCU). NORC may also use endorsement emails and a video from USMC leadership to encourage participation, an approach that has worked in prior military studies.

Second, in addition to low survey participation, NORC also faces the potential of low participation for qualitative interviews. Tactics would be similar to those proposed to increase survey participation (NORC email reminders; encouragement announcements/email/video from USMC/MCU).

Nonresponse and Missingness Bias

To address unit-level missing data on the surveys, NORC will compare responders with non-responders at baseline and follow-up using basic aggregated demographic and background information (e.g., years of service, rank, and position in the Marine Corps) available on all eligible personnel receiving Pre-Command SAPR IP Training during the study intake period. Any statistically significant differences by demographic or background variables will be addressed as covariates in later outcome models.

To address item-level missing data (i.e., if respondents skip some questions), NORC will first assess the amount of missing data and whether data are missing at random. If there is little missing data (e.g., under 5%-10%), NORC will assess whether it is statistically appropriate to use listwise deletion of these cases. If necessary, NORC will compare the impact of employing various methods to handle missing data (e.g., Full Information Maximum Likelihood and Multiple Imputation procedures)7 to fill in missing values for surveys that are only partially completed. NORC is experienced in various imputation methods (e.g., nearest neighbor “hot deck”), including multiple imputation (e.g., Rubin’s multiple imputation strategy,8 which replaces each missing value with a set of plausible values that represent the uncertainty about the correct value).
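To illustrate the multiple-imputation strategy cited above: after an estimate is computed on each of the m imputed datasets, the results are combined with Rubin's rules. The sketch below shows that pooling step only (not the imputation itself); the numeric inputs are hypothetical placeholders, not study data.

```python
# Hedged sketch of Rubin's rules for pooling an estimate across m
# imputed datasets (Rubin, 1987). Inputs are hypothetical examples.
from statistics import mean, variance

def pool_rubin(estimates: list[float], variances: list[float]):
    """Pool point estimates and their sampling variances across
    m imputations, returning (pooled estimate, total variance)."""
    m = len(estimates)
    q_bar = mean(estimates)          # pooled point estimate
    w = mean(variances)              # average within-imputation variance
    b = variance(estimates)          # between-imputation variance
    t = w + (1 + 1 / m) * b          # total variance per Rubin's rules
    return q_bar, t

# Hypothetical estimates and variances from m = 5 imputed datasets.
est, total_var = pool_rubin([0.42, 0.45, 0.40, 0.47, 0.44],
                            [0.010, 0.011, 0.009, 0.012, 0.010])
```

The total variance exceeds the average within-imputation variance by a term reflecting disagreement among the imputations, which is how the approach represents uncertainty about the correct value.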

  4. Tests of Procedures

NORC has collected feedback from Marine Officers (Captains, Sergeants Major, Lt. Colonels, Colonels) on the survey questions. NORC will also beta test the survey instrument on the online platform Voxco internally to ensure that respondents are able to access and complete the survey properly. The email recruitment connectivity and anonymous online survey connectivity (i.e., the technological aspects of the data collection) will be tested with the respective USMC staff involved in planning the data collection. The feedback forms were reviewed with USMC Cornerstone staff with extensive knowledge of the operation of the program.

  5. Statistical Consultation and Information Analysis

  a. Provide names and telephone number of individual(s) consulted on statistical aspects of the design.

  • Bruce Taylor: (312) 759-4000

  • Elizabeth Mumford: (312) 759-4000

  • Dani Heide: (312) 759-4000

  • Jaclyn Siegel: (312) 759-4000

  • Caroline Lancaster: (312) 759-4000

  • Malina Papanikolaou: (312) 759-4000

  • Wisdom Ibikunle: (312) 759-4000

  b. Provide name and organization of person(s) who will actually collect and analyze the collected information.

  • Bruce Taylor: (312) 759-4000

  • Elizabeth Mumford: (312) 759-4000

  • Cynthia Simko: (312) 759-4000

  • Dani Heide: (312) 759-4000

  • Jaclyn Siegel: (312) 759-4000

  • Caroline Lancaster: (312) 759-4000

  • Malina Papanikolaou: (312) 759-4000

  • Wisdom Ibikunle: (312) 759-4000



1 I. Ajzen et al., “The Theory of Planned Behavior,” in Handbook of Theories of Social Psychology, vol. 1 (England: Sage, 2012), 438–59.

2 J. Cohen, Statistical Power Analysis for the Behavioral Sciences, 2nd ed. (New York, NY: Routledge, 1988), https://www.taylorfrancis.com/books/mono/10.4324/9780203771587/statistical-power-analysis-behavioral-sciences-jacob-cohen.

3 J. Cohen, Statistical Power Analysis for the Behavioral Sciences, 2nd ed. (Hillsdale, NJ: Lawrence Erlbaum Associates, 1988).

4 RA Breslin et al., “2021 Workplace and Gender Relations Survey of Military Members: Overview Report” (Office of People Analytics, 2022), https://apps.dtic.mil/sti/citations/AD1178339.

5 Richard Hendra and Aaron Hill, “Rethinking Response Rates: New Evidence of Little Relationship Between Survey Response Rates and Nonresponse Bias,” Evaluation Review 43, no. 5 (2019): 307–30, https://doi.org/10.1177/0193841x18807719.

6 Yimeng Guo et al., “Population Survey Features and Response Rates: A Randomized Experiment,” American Journal of Public Health 106, no. 8 (2016): 1422–26, https://doi.org/10.2105/AJPH.2016.303198.

7 Melissa J. Azur et al., “Multiple Imputation by Chained Equations: What Is It and How Does It Work?,” International Journal of Methods in Psychiatric Research 20, no. 1 (2011): 40–49, https://doi.org/10.1002/mpr.329.

8 D.B. Rubin, Multiple Imputation for Nonresponse in Surveys (New York: John Wiley & Sons, 1987); D. B. Rubin, “Inference and Missing Data,” Biometrika 63, no. 3 (1976): 581–90.
