The National Endowment for the Arts
Creative Forces®: NEA Military Healing Arts Network
Community Engagement Program Evaluation
OMB Information Collection Request - New Collection (OMB Control Number 3135-0146)
Justification – Part B Supporting Statement
Last updated: May 12, 2025

Table of Contents

B.1 Respondent universe and sampling methods
B.2 Procedures for the collection of information
B.3 Methods to maximize the response rates and to deal with nonresponse
B.4 Test of procedures or methods to be undertaken
B.5 Individuals consulted on statistical aspects & individuals collecting and/or analyzing data

Table of Attachments
Attachment A: Creative Forces Community Engagement Grant Program Logic Models
Attachment B: Instruments
Instrument 1 – Participant Survey
Instrument 2 – Arts Engagement Facilitator Survey
Instrument 3 – Grantee Interview Protocol
Instrument 4 – Partner Interview Protocol
Instrument 5 – Participant Interview Protocol
Instrument 6 – Grant Leadership Interview Protocol
Attachment C: Outreach Communication
Template 1 – Grantee Initial Outreach Email
Template 2 – Grantee Post-Webinar Email
Template 3 – Grantee Interview Invitation
Template 4 – Case Study Invitation
Attachment D: IRB Notice of Approval
Attachment E: Cognitive Testing Report: Arts Engagement Facilitator Survey
Attachment F: Cognitive Testing Report: Participant Survey


B.1 Respondent universe and sampling methods
Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection method to be used. Data on the number of entities
(e.g., establishments, State and local government units, households, or persons) in the
universe covered by the collection and in the corresponding sample are to be provided in
tabular form for the universe as a whole and for each of the strata in the proposed sample.
Indicate expected response rates for the collection as a whole. If the collection had been
conducted previously, include the actual response rate achieved during the last collection.
This outcomes evaluation of the Creative Forces Community Engagement Grant Program will collect information between July 2025 and July 2027 from two cohorts of grantees: the 2025-2027 cohort (50 grantees, two years of data collection) and the 2026-2028 cohort (50 grantees, one year of data collection). These grantees comprise the respondent universe.

The study’s sample will be selected through a competitive RFP process administered by Mid-America Arts Alliance (M-AAA), acting as a cooperator for the National Endowment for the Arts. In the context of this information collection, the sampling universe consists of all organizations funded through the Creative Forces Community Engagement Grant Program, as they represent the full population eligible to implement the program and to engage in evaluation activities.

The target populations for the evaluation include four groups of stakeholders: grantees,
grantee partners, arts engagement facilitators, and program participants. A sample of eight
grantees will also participate in site visits involving additional interviews.

Grantee interview (Attachment B). Participation in a virtual 50-minute interview is required for
all grantees as part of their award.


Partner interview (Attachment B). Partnerships are a requirement of the grant program. Each
grantee will identify a partner to participate in a virtual 30-minute interview.

Case Study Site Visit interviews (Attachment B). Eight grantees from the 2025-2027 cohort will be selected to participate in a one-day site visit, involving observation of program activities and up to eight additional 30-minute interviews with grantee staff, partners, and/or program participants. The sample will be representative of the full grantee cohort, with selection criteria determined by the evaluation’s Technical Working Group based on characteristics of the cohort (e.g., program model, arts discipline, military population served, geography, demographics). Three alternate programs will be identified in case a selected grantee declines. All site visits will be conducted during the first year of the 2025-2027 cohort.

Arts Engagement Facilitator Survey (Attachment B). Each Community Engagement program will have one or more arts engagement facilitators who lead activities for participants. All arts engagement facilitators across the entire grant program will receive the survey.

Pre/Post Individually-Matched Participant Survey (Attachment B). The participant survey will be administered to a subset of grantees, based on two criteria. First, the program implementation model must provide a minimum of eight hours or three sessions of programming in order to qualify for a pre/post survey. This minimum is required so that participants have sufficient exposure to the programming for measurable change to occur, and to allow time for pre- and post-survey administration. Second, grantees may decline participation for their program participants. Using the Grantee Application Form, the evaluator will identify grants that support participant engagement (as opposed to networking or capacity building) and contact grantees to determine the number of sessions and hours their program provides for participants. All participants in programs that meet the minimum criteria will be asked to complete the survey at the beginning (pre) and end (post) of the program or engagement period. Data on the number of program sessions and hours of engagement for the CFCE program will be collected.

A pilot study of the Participant Survey tested several administration modes and achieved a response rate of 67% (136 out of 204) for the pre-survey, across modalities. There was attrition for the post-survey, with a response rate of 46% (93 out of 204; 68% of the pre-survey respondents). The pilot study (OMB Control Number 3135-0146) made recommendations to improve response rates, and these will be incorporated into the survey administration for the evaluation, along with an eGift card incentive. Estimated response rates for the pre and post surveys during the evaluation are 70% and 60%, respectively. Exhibit 1 summarizes the data collection respondents and anticipated response rates.
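
The response-rate arithmetic above can be reproduced directly from the reported pilot counts. The following is a minimal illustrative sketch in Python; the counts are those cited above, and no other data are assumed.

```python
# Pilot Participant Survey response-rate arithmetic (counts from the pilot study).
invited = 204        # participants invited to the pre-survey
pre_complete = 136   # completed pre-surveys
post_complete = 93   # completed post-surveys matched to a pre-survey

pre_rate = pre_complete / invited          # ~0.67 -> 67%
post_rate = post_complete / invited        # ~0.46 -> 46%
retention = post_complete / pre_complete   # ~0.68 -> 68% of pre respondents

print(f"Pre-survey response rate:  {pre_rate:.0%}")
print(f"Post-survey response rate: {post_rate:.0%}")
print(f"Pre-to-post retention:     {retention:.0%}")
```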

Exhibit 1. Data collection respondents and anticipated response rates

| Instrument | Timing in a grantee’s program | Estimated pool of respondents | Pool: 2025-2026 | Pool: 2026-2027 | Target response rate |
|---|---|---|---|---|---|
| Grantee interview | Post | 1 interview/grantee | 50 | 50 | 100% |
| Partner interview | Post | 1 interview/grantee | 50 | 50 | 90% |
| Case Study Site Visit interviews | During | 8 of 50 grantees, up to 8 interviews/site (2025-2027 cohort only) | 64 | 0 | 100% |
| Arts Engagement Facilitator Survey+ | Post | 1 survey/facilitator | 100 | 100 | 80% |
| Participant Pre Survey* | Program beginning | 1 survey/participant | 300-500 | 300-500 | 75% |
| Participant Post Survey* | Program end | 1 survey/participant | 300-500 | 300-500 | 65% |

+ Estimates based on an average of 2 arts engagement facilitators per grantee.
* M-AAA estimates 50 new grantees per cohort, with 24-30 grantees eligible for the pre/post survey, yielding a pool of 300-500 respondents per cohort.

B.2 Procedures for the collection of information
Describe the procedures for the collection of information, including
• statistical methodology for stratification and sample selection,
• estimation procedure,
• degree of accuracy needed for the purpose described in the justification,
• unusual problems requiring specialized sampling procedures, and
• any use of periodic (less frequent than annual) data collection cycles to reduce burden.

This outcomes evaluation of the Creative Forces Community Engagement Grant Program is a one-time data collection using interviews, surveys, and site visits. No sampling is required for grantee interviews, partner interviews, or the Arts Engagement Facilitator Survey, as these instruments are intended for all potential respondents in those groups.

The Participant Survey will be provided to all programs that meet two criteria: 1) a minimum of eight hours or three sessions of programming, and 2) the grantee opts into the survey. The sample consists of all programs meeting these criteria. Post hoc analysis will compare characteristics of the grantees/programs that opt in with those that do not in order to identify any systematic bias.
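
As an illustration only, a post hoc comparison of this kind could be implemented as sketched below. The file and column names (grantee_characteristics.csv, opted_in, and the characteristic columns) are hypothetical stand-ins, not elements of the evaluation plan.

```python
# Illustrative sketch: compare characteristics of grantees that opt into the
# Participant Survey with those that decline, to flag systematic differences.
import pandas as pd
from scipy.stats import chi2_contingency

grantees = pd.read_csv("grantee_characteristics.csv")  # hypothetical file

# Cross-tabulate each characteristic against opt-in status. With roughly
# 24-30 eligible grantees per cohort, treat test statistics as descriptive.
for col in ["program_model", "arts_discipline", "rural_urban", "grant_tier"]:
    table = pd.crosstab(grantees[col], grantees["opted_in"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{col}: chi2={chi2:.2f}, p={p:.3f}")
    print(table, "\n")
```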

For the selection of eight grantees and three alternates for case study site visits, the evaluator will use a purposive sampling strategy with advisement from the evaluation’s Technical Working Group. To create a representative sample, the evaluator will conduct an initial descriptive analysis of the 2025-2027 cohort and use this information to frame the sample. Variables that will be considered include service delivery model, rural/urban location, region, type of organization and/or partners, grant tier, arts discipline, and population served. If other important variables emerge during the descriptive analysis, they will be considered as well. Grantees participating in the site visits will be asked to include program staff, partners, and arts engagement facilitators in the interviews. They may also invite program participants at their discretion; this allows participants to maintain anonymity, which is carefully protected by some programs.
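
The descriptive analysis that frames this purposive sample could look like the sketch below. All file and variable names are hypothetical, and the actual selection rests with the evaluator and the Technical Working Group, not with code.

```python
# Illustrative sketch: profile the grantee cohort on the framing variables,
# then inspect how a candidate set of eight site-visit grantees spans them.
import pandas as pd

cohort = pd.read_csv("cohort_roster.csv")  # hypothetical grantee roster

framing_vars = ["service_delivery_model", "rural_urban", "region",
                "organization_type", "grant_tier", "arts_discipline",
                "population_served"]

# Distribution of each framing variable across the full cohort.
for var in framing_vars:
    print(cohort[var].value_counts(normalize=True).round(2), "\n")

# Placeholder for the Technical Working Group's purposive picks: check that
# a candidate set of eight grantees covers the cohort's profile.
candidates = cohort.sample(n=8, random_state=1)
print(candidates[framing_vars])
```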

B.3 Methods to maximize the response rates and to deal with nonresponse
Describe methods to maximize response rates and to deal with issues of non-response. The
accuracy and reliability of information collected must be shown to be adequate for intended
uses. For collections based on sampling, a special justification must be provided for any
collection that will not yield "reliable" data that can be generalized to the universe studied.
Enhancing Response Rates by Engaging Grantees
At the beginning of each grant period, the evaluation team will provide webinars for grantees to introduce the evaluation. Grantee engagement and buy-in will be cultivated through personal contact and opportunities for grantees to learn and ask questions about the evaluation. Each grantee will be assigned one evaluator who will serve as their evaluation contact for the entire grant period. In addition, two webinars will be provided at the beginning of each cohort.

Evaluation Orientation Webinar. At the beginning of each grant year (July/August of 2025
and 2026), a brief, optional webinar will introduce the evaluation team, provide an
overview of evaluation activities, and explain how the information will be used. This is also
an opportunity to explain the purpose of the Participant Survey, the program eligibility
requirements for participating in the survey, and the selection process for the case studies.
The informational section of the webinar will last approximately 20 minutes and will be
followed by Q&A. The webinar will be recorded for grantees to access later.

Technical Assistance Webinar for the Participant Survey. For grantees who administer the
Participant Survey, there will be a required technical assistance webinar. The webinar will
cover the Participant Survey, modes of implementation and technical requirements, the
role of grantees in administering the survey, and how the outcomes will be used. It will also
cover information for participants and questions they may have. The webinar will be
recorded for grantees to access later via the M-AAA website.

Enhancing Response Rates for Required Grantee Interviews (50-minute, virtual)
Grantees are required to participate in one interview for the evaluation as part of their grant award. Therefore, they have advance notice of the interview and encouragement from the grant program’s leadership to participate. In addition, several other techniques will be used to ensure high response rates.
ensure high response rates.
• Relationship with one evaluator consistently throughout the grant period
• Personal contact and direct outreach
• Advance notice and flexible scheduling
• Follow-up reminders

Enhancing Response Rates for Partner Interviews (30-minute, virtual)
As part of their award, each grantee will be requested to identify one partner for an interview
and to provide an introduction. Partners are not required to participate. Several techniques will
be used to encourage high response rates.
• Introduction made by grantee
• Personal contact and direct outreach
• Advance notice and flexible scheduling
• Follow-up reminders
• Incentive: $35 eGift card

Enhancing Response Rates for Case Study Site Visit Interviews (30-minute, virtual)
Once grantees commit to the case study, the evaluator will coordinate with their contact person to identify a date that is convenient for the grantee and their program. The evaluator will collaborate with the contact to develop a site visit schedule that minimizes disruption to their organization and program. The evaluator will manage all logistics related to travel, removing this burden from the grantee. The following measures will also be used to enhance response rates.
• Introduction made by grantee
• Personal contact and direct outreach
• Advance notice and flexible scheduling
• Follow-up reminders
• Incentive: $35 gift card/interview

Enhancing Response Rates for Surveys
Arts Engagement Facilitator Survey (anonymous, web-based). Arts engagement facilitators
across all programs in both cohorts will be invited to complete the Arts Engagement Facilitator
Survey. The results of cognitive testing of this survey with current arts engagement facilitators
suggest a high level of interest in and support for this survey, which is promising for the
response rate. The following measures will be taken to enhance the response rate.
• Introduction made by grantee
• Personal contact and direct outreach
• Show the association with Creative Forces and the National Endowment for the Arts (e.g., logos) to legitimize the survey
• Explanation of the value of the survey and how the information will be used
• Follow-up reminders
• Incentive: $30 eGift card


Participant Survey. Grantees’ community engagement programs will vary considerably on multiple dimensions: implementation model, frequency and duration of each session, overall duration of the program, arts discipline, specific target population, and number of people served, among others. The pilot study of the participant survey recommended offering several different administration modes and tailoring them to the individual grantee. Based on these findings, as well as input from the evaluation’s Technical Working Group and feedback from the survey’s cognitive testing participants, three administration modes will be offered (see Exhibit 2). Each grantee will select the mode that is most appropriate for their program and participants, with guidance from the evaluation team. Each grantee will use the same mode at the pre and post time points. The evaluation team will provide technical assistance and communication materials to the grantee aligned with their chosen mode. Data analyses will compare outcomes by mode to determine whether mode is a confounding factor.
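
One way the mode comparison might be implemented is sketched below, assuming matched pre/post scores in a long-format dataset with one row per participant. The file, column names, and model specification are illustrative assumptions, not the evaluator’s specified analysis.

```python
# Illustrative sketch: test whether administration mode (email, embedded,
# kiosk) is associated with pre-to-post change, adjusting for baseline score.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("matched_participant_surveys.csv")  # hypothetical file
df["change"] = df["post_score"] - df["pre_score"]

# A significant mode coefficient would flag mode as a potential confounder.
model = smf.ols("change ~ C(mode) + pre_score", data=df).fit()
print(model.summary())
```
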
Exhibit 2. Participant Survey administration modes

E-Mail. Surveys are administered through email invitations using SurveyMonkey. Evaluators work with each grantee to create a contact list of participants' email addresses and send an initial email with a unique link to the pre-program survey. This will occur just before the program begins or immediately after it first convenes, if contact information is not available prior to the program. At the end of the program, evaluators send a follow-up email with a link to the post-program survey. This method ensures direct delivery to participants, allowing for easy tracking of responses and automated reminders for non-responders.

Embed Surveys in Program. This approach integrates pre- and post-program surveys directly into the workflow of the program using paper copies or an embedded link or QR code. For example, in-person programs can request paper copies with unique identification numbers for each participant, which can be completed at the onset of the program and at the end of the program. If the program involves a series of online modules or webinars, the survey will be integrated into the program during the first and final sessions. Each participant will have a unique identification number to complete the survey online. This method minimizes the chance of losing participants between survey administrations.

Kiosk for In-Person Programs. An alternative for in-person programs, this approach uses SurveyMonkey's kiosk mode to administer surveys on-site. Each participant will have a unique identification number. Using the organization’s tablets or computers, participants can complete the pre-survey as they arrive and the post-survey at the end of the program. This ensures high response rates and immediate data collection at the onset and end of the program.

During the survey introduction provided by the grantee, and in the introductory language
within the survey instrument, respondents will be informed that their data will be matched
pre/post and how confidentiality will be maintained. To match surveys using email in
SurveyMonkey, each participant's email address will be used as a unique identifier. When
participants receive their pre-survey link via email, their responses will automatically be
associated with their email address in the system. After the program, the post-survey will be
sent to the same email, allowing SurveyMonkey to link the responses for analysis. For paper
and kiosk-administered surveys, participants will be asked to create and record a unique ID
number, such as a combination of initials and birthdate, on both the pre and post surveys. This
unique identifier ensures anonymity while enabling the evaluator to accurately match the pre
and post responses for comparison.
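
As an illustration of the matching step, the sketch below assumes pre and post exports that share an identifier column holding either the email address or the self-generated ID. All file, column, and outcome-scale names are hypothetical.

```python
# Illustrative sketch: match pre/post survey records on a shared identifier
# and run a paired comparison on an example outcome scale.
import pandas as pd
from scipy.stats import ttest_rel

pre = pd.read_csv("pre_survey.csv")    # hypothetical survey exports
post = pd.read_csv("post_survey.csv")

# Normalize identifiers before matching (case, stray whitespace).
for frame in (pre, post):
    frame["id"] = frame["id"].str.strip().str.lower()

# An inner join keeps only individually matched pre/post pairs.
matched = pre.merge(post, on="id", suffixes=("_pre", "_post"))
print(f"Matched {len(matched)} of {len(pre)} pre-survey respondents")

t, p = ttest_rel(matched["wellbeing_pre"], matched["wellbeing_post"])
print(f"Paired t-test: t={t:.2f}, p={p:.3f}")
```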


Additional techniques will be used to enhance participants’ response rates.
• Technical assistance for grantees to administer the survey, including information on how to discuss the survey with participants and explain how the data will be used
• Participants introduced to the survey by the grantee, emphasizing the importance and legitimacy of the study
• Personal contact and direct outreach
• Show the association with Creative Forces and the National Endowment for the Arts (e.g., logos) to legitimize the survey
• Follow-up reminders
• Incentive: $30 eGift card

Dealing with Issues of Non-Response
In the analysis phase of the surveys, non-response bias will be assessed by comparing known
characteristics of respondents and non-respondents, when such data are available (e.g.,
program type, location, demographics). If notable differences are identified, weighting
adjustments or imputation methods may be used to mitigate bias and ensure the findings are
as representative as possible of the full population.
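
A minimal sketch of one such weighting-class adjustment appears below, assuming response status and one known characteristic are available for the full invited pool; the file and column names are illustrative.

```python
# Illustrative sketch: weighting-class adjustment for survey nonresponse.
# Respondents in classes with lower response rates receive larger weights.
import pandas as pd

frame = pd.read_csv("invited_participants.csv")  # hypothetical invited pool
# Expected columns: "program_type" (class) and "responded" (0/1 indicator).

rates = frame.groupby("program_type")["responded"].mean()  # class response rates
frame["nr_weight"] = frame["program_type"].map(1.0 / rates)

respondents = frame[frame["responded"] == 1]
print(respondents.groupby("program_type")["nr_weight"].first())
```
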
For the pool of grantees who are eligible to administer the Participant Survey, comparisons will be made between the organizations that opt into the survey and those that decline. This will be important for understanding the degree to which participant survey findings can be generalized to the Creative Forces Community Engagement Program as a whole and potentially to other community arts engagement programs serving military-connected populations.
B.4 Test of procedures or methods to be undertaken
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an
effective means of refining collections of information to minimize burden and improve utility.
Tests must be approved if they call for answers to identical questions from 10 or more
respondents. A proposed test or set of tests may be submitted for approval separately or in
combination with the main collection of information.
Arts Engagement Facilitator Survey. The evaluator tested this survey in April 2025 with six arts engagement facilitators from multiple artistic disciplines and community engagement programs. The objectives of the testing were to detect issues of usability, clarity, and readability in the survey instrument. Minor changes were made to the survey instrument following the completion of cognitive testing. The Cognitive Testing Report can be found in Attachment E.
Participant Survey. In 2021, the evaluator tested this survey with nine members of the military-connected population who were also involved in community arts engagement programs as staff members. The objectives of the testing were to detect issues of usability, clarity, and readability in the survey instrument. Changes were made to the survey instrument following the completion of cognitive testing. The survey was then pilot tested in 2023. The results from that study inform the plans for administration of the survey in the upcoming evaluation. They also resulted in a targeted revision of the survey. In March 2025, the revised version was tested with eight members of the military-connected population who were involved in community arts engagement programs as participants. Minor changes were made to the survey instrument following the completion of cognitive testing. The Cognitive Testing Report can be found in Attachment F.
Interview protocols. The Technical Working Group that advises this evaluation reviewed the interview protocols (Attachment B) and provided feedback. No other testing was performed.
B.5 Individuals consulted on statistical aspects & individuals collecting and/or analyzing data
Provide the name and telephone number of individuals consulted on statistical aspects of the
design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will
actually collect and/or analyze the information for the agency.
The Arts Endowment contracted ProgramWorks to develop the Participant Outcomes Survey
and to conduct cognitive testing of the survey. See Exhibit 3.
Exhibit 3.

Parties doing the data collection and analysis

| Name | Organizational Affiliation and Address | Title (Project Role) | Contact Information |
|---|---|---|---|
| Shawn Bachtler | ProgramWorks; 8155 13th Ave SW, Seattle, WA 98106 | Project manager | 206-595-5878; shawnbachtler@gmail.com |
| Candace Gratama | ProgramWorks; 8155 13th Ave SW, Seattle, WA 98106 | Co-project manager | 206-229-8530; candace@illuminateevaluation.com |

National Endowment for the Arts staff consulted

| Name | Organizational Affiliation and Address | Title (Project Role) | Contact Information |
|---|---|---|---|
| Kathryn Zickuhr | National Endowment for the Arts; 400 7th Street SW, Washington DC 20506 | Social Science Analyst, Office of Research & Analysis | (202) 682-5563; zickuhrk@arts.gov |