Form 1190-0021 Facilitator Interview

Program Impact Evaluations (Level 3 Evaluations)

STAFF AND PROGRAM FACILITATOR SEMI-STRUCTURED INTERVIEWS

OMB # 1190-0021

Expiration Date: XX/XX/XXXX

Community Relations Service (CRS) Process Evaluation

CRS Staff and Program Facilitator Semi-Structured Interviews

Interview protocol for CRS staff, Subject Matter Experts, and program facilitator(s)

The evaluation team will use this protocol to conduct individual interviews with CRS staff and program facilitators within one week before program or service delivery and within one week after program or service delivery. We will conduct interviews with CRS staff, including conciliators, Subject Matter Experts (SMEs), and program facilitators. Not all interviewees will be asked all questions. We will seek staff permission to record all interviews.

Introduction

Thank you for joining today’s interview on the [PROGRAM]. As we mentioned via email, the Mathematica evaluation team is conducting a process evaluation to gain an in-depth understanding of the implementation of the [PROGRAM]. As a reminder, a process evaluation looks at whether program activities have been implemented as intended, including challenges, successes, and lessons from the implementation.

During today’s interview, we will pose a number of questions. Our focus is on hearing directly from you, the experts and facilitators of this work. So, we plan to listen far more than we talk and hope that you will share your candid feedback throughout the discussion. We have planned 60 minutes for today’s conversation. An additional 60-minute conversation will take place after the program ends. We welcome any additional thoughts you would like to share over email. Are there any questions before we get started?

We are hoping to record this discussion so that we can fill in any gaps in our notes after the conversation. We will not share the recording with anyone outside of the evaluation team. You can use a pseudonym today if you would prefer to be anonymous. Do we have your permission to record the conversation? [Ask each participant to consent to recording. If any participant declines to be recorded, do not record.]

Great! Let’s get started.

Before program implementation

  1. To start, can you introduce yourself, your role, and your experience with the [PROGRAM]?



  2. Think back to how we all got here. To the best of your knowledge, what incident(s) led to the need for the [PROGRAM]?

  3. What type of preparation work have you done (if any), or are you planning to do, before the implementation of the program?

  4. Now, tell me about the [PROGRAM] at [LOCATION OR SITE]. Can you share details of the plans for program delivery?

Ask questions a-b for interviews with CRS conciliators and SMEs only.

    a. Does this align with how the [PROGRAM] is intended to be delivered?

    b. What changes to program delivery are you anticipating with this implementation?

      i. Why? What factors might drive those changes in this case?

      ii. Sometimes, changing how you implement a program means that it might not achieve the same goals or outcomes every time. Do you plan to employ any specific strategies to mitigate the effects of these changes?

      iii. In general, what changes are typical with [PROGRAM]? How and when do you know that a change will be needed while planning for program delivery?

  5. Who will participate in the program?

    a. How did or will you recruit participants for the program?


  6. What types of activities are you planning for after program implementation?

    a. Will you debrief any partners?

    b. [For SPIRIT programs]: Can you describe the process for creating a SPIRIT council?

    c. What other post-program activities (for example, check-ins) might occur?



  7. What is the process of survey administration and data collection?

    a. What is the plan for survey administration and data collection for this site?

  8. As you know, we are planning to observe the program. What do you suggest that we should pay particular attention to? Why?

After program implementation

  1. In your opinion, was the program implemented as intended? [Probe further.]

Ask questions a-b for interviews with CRS conciliators and SMEs only.

    a. What changes to the [PROGRAM] emerged during the program?

      i. Why were these changes needed? [Ask for every variation.]

      ii. How typical was the implementation of this program (for example, similarities or differences) compared to other times that the program has been implemented?

    b. [Ask about any specific variations in implementation from your observations.]

  2. What were the successes of and challenges to program implementation?

    a. What contributed to the successes and challenges?

    b. How did you manage or overcome the challenges?

  3. How prepared did you feel to deliver this [PROGRAM]?

    a. What contributed to your level of preparedness?

Ask question 4 of interviews with CRS conciliators and SMEs only.

  4. [For SPCP and SPIRIT programs only.] This program relied on [SMEs, VOLUNTEER SMALL GROUP FACILITATORS, CONCILIATORS, OTHER]. How prepared do you think these partners were for program delivery?

    a. What contributed to their preparation?

      i. How could it have been better?



  5. What are the top three lessons learned from this [PROGRAM] that can improve or strengthen future implementations of it?

    a. What, if anything, would you do differently next time?


  6. How did the administration of the surveys go?

    a. What worked well, and what did not?

    b. What could be improved to make the survey administration more effective?


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. Public reporting burden for this collection of information is estimated to average 10 minutes per response, including time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this collection is voluntary. Send comments regarding the burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Melody Diegor Caprio, CRS/DOJ, 145 N St. NE, Washington, DC 20002, 202-353-1806 or melody.caprio@usdoj.gov, and reference OMB Control No. 1190-0021.

File type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File title: 1-column report template
Author: Brandon Hollie
File created: 2025-05-19