National Center for Education Statistics
National Assessment of Educational Progress
National Assessment of Educational Progress (NAEP) 2026
Supporting Statement
Part B
OMB# 1850-0928 v.36
May 2025
Table of Contents
Part B. Collection of Information Employing Statistical Methods
B.1. Potential Respondent Universe and Sample Design
B.2. Procedures for Collection of Information
B.2.a. Recruitment of Schools
B.2.b. School Staff Assessment Responsibilities
B.2.c. Administration Procedures
B.3. Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse
B.3.a. Methods to Maximize Response Rate
B.3.b. Statistical Approaches to Nonresponse
B.1. Potential Respondent Universe and Sample Design
The possible universe of student respondents for NAEP 2026 is estimated to be 12 million at grades 4, 8, and 12, attending the approximately 154,000 public and private elementary and secondary schools in 50 states and the District of Columbia, including Bureau of Indian Education and Department of Defense Education Activity (DoDEA) Schools, and fourth- and eighth-grade public schools in Puerto Rico.
Respondents are selected according to student sampling procedures with these possible exclusions:
The student is identified as an English learner (EL) and cannot participate in NAEP, even with the accommodations NAEP allows.
The student is identified as having a disability (SD) that prevents participation in NAEP, even with the accommodations NAEP allows, and has an Individualized Education Plan (IEP) or equivalent classification, such as a Section 504 plan.
Additional information regarding the classification of students is provided in section B.1.a.
To assess a representative sample of students, the process begins by identifying a sample of schools with student populations that reflect the varying demographics of a specific jurisdiction, be it the nation, a state, or a district. Within each selected school, students are chosen at random to participate and each has the same chance of being chosen, regardless of socioeconomic status, disability, status as an English learner, or any other factors. Selecting schools that are representative helps ensure that the student sample is representative.
The following are characteristic features of NAEP sample designs:
for state-level assessments, approximately equal sample sizes (2,000–3,000 assessed students) from each participating state’s1 public schools;
for district-level assessments, sample sizes of approximately 1,000–2,000 from each participating district’s public schools;
sample sizes of approximately 6,000–20,000 for national-only operational subjects, depending on the size of the item pool;2
sample sizes of approximately 3,000–12,000 for pilot assessments, depending on the size of the item pool;3 and
in each school, some students to be assessed in each subject.
Additional information about the sampling procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/sample_design/. Note that while the latest published documentation for main NAEP (as of the drafting of this document) is from 2022, the procedures have remained essentially the same. A summary of the sampling procedures follows. Additional details (taken from the main NAEP 2022 procedures on the technical documentation website) can be found in Appendix G (NAEP 2022 Sample Design).
As in the past, NAEP samples are based on multistage designs. For the national samples, a two- or three-stage design is used. If a three-stage design is used, the first stage is the selection of primary sampling units (PSUs), which are individual counties or groups of contiguous counties. The next stage is the selection of schools (within PSUs, when a three-stage design is used), and the final stage is the selection of students within schools. The national samples have sufficient schools and students to yield results for public schools, private schools, charter schools, and each of the four Census Regions of the country, as well as by sex, race, and degree of urbanization of school location.
The following steps are used to select a sample of public schools and students in a year when NAEP reports state-level results. Private schools are not included in a state-level sample, which focuses solely on public schools.
Generate a sampling frame.
For sampling frames, NAEP uses the most current versions of the NCES Common Core of Data (CCD; public schools) and Private School Universe Survey (PSS; private schools) files. In addition, to address the fact that the CCD file does not necessarily include the most recent changes to schools by the time of the assessment, NAEP also conducts a survey of NAEP State Coordinators to check for additional new schools in a sample of public-school districts. A similar process on the PSS is done for Catholic schools in a sample of dioceses.
Classify schools into groups.
Using the list, schools are classified into groups, first by type of location and then by the race/ethnicity classification within those locations.
Within each group, order schools by a measure related to student achievement.
Within each group, schools are sorted by student achievement to ensure that schools with varying levels of student achievement are represented in the NAEP sample. This is done using school-level results on state achievement tests. In a few cases where recent achievement data are not available, schools are sorted by the median household income for the area where the school is located.
Assign a measure of size to all schools.
All schools on the list are assigned a measure of size. A school’s measure of size is based on the size of its enrollment in relation to the size of the state’s student population at the selected grade level. Larger schools have a larger measure of size as they represent a larger proportion of the state’s student population. This step ensures that students from schools of different sizes are appropriately represented in the sample.
Select the school sample.
After schools are assigned a measure of size and grouped on an ordered list based on the characteristics referred to in the previous steps, the sample is selected using stratified systematic sampling with probability proportional to the measure of size, using a sampling interval. This procedure ensures that each school has the required selection probability. By proceeding systematically throughout the entire list, schools of different sizes and varying demographics are selected, and a representative sample of students will be chosen for the assessment. (A simplified sketch of this selection mechanism appears after these steps.) Additional details regarding the selection of the school sample are included in the technical documentation (2022 Sample Design).
Confirm school eligibility.
The list of schools selected to participate is sent to each state to verify that each school is eligible for participation. A school may be ineligible because, for example, it has closed or its grade span has changed so that a grade level or age assessed by NAEP is no longer in the school. Eligibility counts are included in the technical documentation (NAEP Assessment Sample Design). Information on response rates can be found in section B.3.
Select students to participate in NAEP.
School principals are notified that their schools have been chosen to participate in NAEP. Within each sampled school, a systematic sample of students is selected with equal probability from a complete list of students at the grade or age to be assessed.
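As referenced in the selection steps above, the sketch below illustrates the two sampling mechanics described there: stratified systematic selection of schools with probability proportional to a measure of size, followed by an equal-probability systematic sample of students within a selected school. It is a simplified illustration under assumed inputs (the frame, measure-of-size values, roster, and sample sizes are invented), not NAEP's production procedure; certainty selections, new-school supplementation, and other refinements are omitted.

```python
import math
import random

def pps_systematic_sample(frame, n_schools, seed=12345):
    """Systematic selection of schools with probability proportional to size (PPS).

    `frame` is a list of (school_id, measure_of_size) pairs, assumed to be
    already sorted by stratum and, within stratum, by the achievement-related
    measure described in the steps above.
    """
    total_mos = sum(mos for _, mos in frame)
    interval = total_mos / n_schools                  # sampling interval
    start = random.Random(seed).uniform(0, interval)  # random start within the first interval
    hits = [start + k * interval for k in range(n_schools)]

    selected, cumulative, i = [], 0.0, 0
    for school_id, mos in frame:
        cumulative += mos
        # a school is selected when a hit point falls within its cumulative range,
        # so its selection probability is proportional to its measure of size
        while i < len(hits) and hits[i] <= cumulative:
            selected.append(school_id)
            i += 1
    return selected

def systematic_student_sample(roster, n_students, seed=2026):
    """Equal-probability systematic sample of students from a school roster."""
    if n_students >= len(roster):
        return list(roster)                           # small schools: take everyone
    interval = len(roster) / n_students
    start = random.Random(seed).uniform(0, interval)
    picks = [math.floor(start + k * interval) for k in range(n_students)]
    return [roster[p] for p in picks]

# Hypothetical frame of 200 schools with enrollment-based measures of size
frame = [("school_%03d" % k, random.Random(k).randint(20, 400)) for k in range(200)]
schools = pps_systematic_sample(frame, n_schools=25)

# Hypothetical roster for one selected school
roster = ["student_%04d" % k for k in range(312)]
students = systematic_student_sample(roster, n_students=50)
print(len(schools), len(students))   # 25 schools, 50 students
```

Because the hit points are spaced one sampling interval apart, a school's chance of being hit is proportional to its measure of size, which is the property described in the school-selection step above.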
NAEP alternates between national-level administration years and state-level administration years that include one or more assessments that support national, state-by-state, and certain urban districts’ reporting. For assessments where results are reported at the national, state, and urban district (TUDA) levels, a single sample of public-school students is selected and used for reporting at each level. That is, a student who is sampled from a school located in a TUDA district contributes to the estimates at each of the district, state, and national levels. Similarly, a student who is sampled from a school in a particular state contributes to the estimates both for that state and the nation. For assessments where results are reported at the national level, but not for states and districts, schools are sampled from across the United States, without any oversampling of particular states or districts.
The process for private school selection is similar to the public-school selection process but depends on the U.S. Department of Education’s private education system databases to create the initial list of all known private schools. Private schools are sampled to be representative of private schools nationwide. The results for private schools are not included in state-level results which are solely focused on public schools.
NAEP yearly sample design plans are not available until the spring of the year preceding the assessments. The purpose of the sample design memorandum is to detail the specific sampling procedures used for the 2026 assessments. The 2024 draft sampling memorandum is included in this package (Appendix C) only as a placeholder; it will be replaced in a later amendment with the initial and then final 2026 version.
Since each selected school that participates in the assessment effort and each student assessed constitutes only a portion of the full population of interest, weights are applied to both schools and students. The weights permit valid inferences to be drawn from the student samples about the respective populations from which they were drawn and, most importantly, ensure that the results of the assessments are fully representative of the target populations.
Additional information about the 2022 weighting procedures used in NAEP can be found in the technical documentation at 2022 Weighting Procedures.
Note that while the latest documentation that has been published (as of the drafting of this document) is from 2022, the procedures have remained essentially the same. A summary of the weighting procedures is included below.
The final weights assigned to each student as a result of the estimation procedures are the product of the following steps (which are described in additional detail below):
assignment of a “base” weight, the reciprocal of the overall initial probability of selection;
adjustment of the school base weights to reduce extreme variability arising from special circumstances;
adjustments for school and student nonresponse;
adjustment (if needed) to reflect assignment to a specified assessment subject; and
adjustment of the student weights in state samples so that estimates for key student-level characteristics are in agreement across assessments in different subjects.
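Read as a product, the steps above amount to multiplying a base weight by a sequence of adjustment factors. The expression below is a compact illustration of that structure; the symbols are illustrative and are not NAEP's own notation.

```latex
W_i \;=\; \underbrace{\frac{1}{\pi_i}}_{\text{base weight}}
    \times F_i^{\text{trim}}
    \times F_i^{\text{school NR}}
    \times F_i^{\text{student NR}}
    \times F_i^{\text{subject}}
    \times F_i^{\text{rake}}
```

Here π_i is student i's overall initial probability of selection, and each F term corresponds to one of the adjustments listed above.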
School base weights are assigned separately by grade and, as noted, are the reciprocal of the school’s probability of selection for that grade level.
Each sampled student receives a student base weight, whether or not the student participated in the assessment. The base weight reflects the number of students that the sampled student represents in the population of interest. The sum of the student base weights for a given subgroup provides an estimate of the total number of students in that subgroup.
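As a simple numerical illustration of this paragraph (with invented selection probabilities and subgroup labels), the snippet below computes base weights as reciprocals of overall selection probabilities and sums them to estimate a subgroup total.

```python
# Hypothetical records: (student_id, overall selection probability, subgroup)
students = [
    ("s1", 0.0025, "EL"),
    ("s2", 0.0040, "non-EL"),
    ("s3", 0.0025, "EL"),
    ("s4", 0.0100, "non-EL"),
]

# Base weight = 1 / overall initial probability of selection
base_weights = {sid: 1.0 / p for sid, p, _ in students}

# The sum of base weights within a subgroup estimates that subgroup's population total
el_total = sum(base_weights[sid] for sid, _, grp in students if grp == "EL")
print(round(el_total))   # estimated number of EL students represented: 800
```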
Since nonresponse is unavoidable in any survey of a human population, a weighting adjustment is introduced to compensate for the loss of sample data and to improve the precision of the assessment estimates. Nonresponse adjustments are applied at both the school and the student levels; the weights of responding schools are adjusted to reflect the nonresponding schools, and the weights of responding students, in turn, receive an adjustment to account for nonresponding students. School nonresponse adjustment cells are formed in part by geography (state or TUDA for state samples and census division for national samples), urbanicity, and race/ethnicity. Student nonresponse adjustment cells are formed in part by SD/EL status, school nonresponse cell, age relative to grade, sex, and race/ethnicity.
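A minimal sketch of a cell-based nonresponse adjustment is shown below: within each adjustment cell, the weights of respondents are inflated so that they also account for the weight of nonrespondents in the same cell. The cell definitions and values are illustrative, not NAEP's actual adjustment cells.

```python
from collections import defaultdict

def nonresponse_adjust(records):
    """records: list of dicts with 'id', 'weight', 'cell', and 'responded' keys.
    Returns adjusted weights for respondents; nonrespondents receive a weight of 0."""
    cell_all = defaultdict(float)
    cell_resp = defaultdict(float)
    for r in records:
        cell_all[r["cell"]] += r["weight"]
        if r["responded"]:
            cell_resp[r["cell"]] += r["weight"]

    adjusted = {}
    for r in records:
        if r["responded"]:
            # inflate each respondent's weight by (total cell weight / responding cell weight)
            factor = cell_all[r["cell"]] / cell_resp[r["cell"]]
            adjusted[r["id"]] = r["weight"] * factor
        else:
            adjusted[r["id"]] = 0.0
    return adjusted

records = [
    {"id": "s1", "weight": 400.0, "cell": "EL/female", "responded": True},
    {"id": "s2", "weight": 400.0, "cell": "EL/female", "responded": False},
    {"id": "s3", "weight": 250.0, "cell": "non-EL/male", "responded": True},
]
print(nonresponse_adjust(records))   # s1 now carries the weight of s2 as well
```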
The complexity of the sample selection process as well as the variations in school enrollment can result in extremely large weights for both schools and students. Since unusually large weights are likely to produce large sampling variances for statistics of interest, and especially so when the large weights are associated with sample cases reflective of rare or atypical characteristics, such weights usually undergo an adjustment procedure that “trims” or reduces extreme weights. Again, the motivation is to improve the precision of the survey estimates. The student weight trimming procedure uses a multiple median rule to detect excessively large student weights.
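The sketch below illustrates the general idea of median-based trimming: any weight above a fixed multiple of the median weight within a trimming group is capped at that multiple. The multiplier of 3.5 and the single trimming group are assumptions for illustration; they are not NAEP's actual trimming parameters.

```python
import statistics

def trim_weights(weights, multiplier=3.5):
    """Cap weights that exceed `multiplier` times the median weight."""
    cap = multiplier * statistics.median(weights)
    return [min(w, cap) for w in weights]

weights = [220.0, 250.0, 240.0, 260.0, 2400.0]   # one extreme weight
print(trim_weights(weights))   # the 2400.0 is reduced to 3.5 * 250.0 = 875.0
```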
Weighted estimates of population totals for student-level subgroups for a given grade or age will vary across subjects even though the student samples for each subject generally come from the same schools. These differences are the result of sampling error associated with the random assignment of subjects to students through a process known as spiraling. For state assessments, in particular, any difference in demographic estimates between subjects, no matter how small, may raise concerns about data quality. To remove these random differences and potential data quality concerns, a new step was added to the NAEP weighting procedure starting in 2009. This step adjusts the student weights in such a way that the weighted sums of population totals for specific subgroups are the same across all subjects and was implemented using a raking procedure.
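Raking can be pictured as iterative proportional fitting: weights are repeatedly rescaled, one dimension at a time, until weighted subgroup totals match a common set of control totals across subjects. The sketch below shows that mechanic for two made-up dimensions and control totals; NAEP's actual raking dimensions and totals differ.

```python
def rake(records, controls, dims, iterations=10):
    """Iteratively adjust weights so weighted subgroup totals match control totals.

    records: list of dicts with a 'weight' key plus one key per raking dimension.
    controls: {dimension: {category: target_total}}.
    dims: list of dimension names to rake over.
    """
    for _ in range(iterations):
        for dim in dims:
            # current weighted total per category of this dimension
            totals = {}
            for r in records:
                totals[r[dim]] = totals.get(r[dim], 0.0) + r["weight"]
            # scale each record's weight toward the control total for its category
            for r in records:
                r["weight"] *= controls[dim][r[dim]] / totals[r[dim]]
    return records

records = [
    {"weight": 300.0, "sex": "F", "race": "A"},
    {"weight": 280.0, "sex": "M", "race": "A"},
    {"weight": 320.0, "sex": "F", "race": "B"},
    {"weight": 350.0, "sex": "M", "race": "B"},
]
controls = {"sex": {"F": 600.0, "M": 650.0}, "race": {"A": 580.0, "B": 670.0}}
rake(records, controls, dims=["sex", "race"])
print(round(sum(r["weight"] for r in records)))   # matches the control grand total: 1250
```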
Estimates of the sampling variance of statistics derived through the assessment effort are developed through a replication method known as “jackknife.” This process of replication involves the repeated selection of portions of the sample (replicates). A separate set of weights is produced for each replicate, using the same weighting procedures as for the full sample. The replicate weights, in turn, are used to produce estimates for each replicate (replicate estimates). The variability among the calculated replicate estimates is then used to obtain the variance of the full-sample estimate.
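A stripped-down sketch of jackknife variance estimation follows: each replicate removes one portion of the sample, the statistic is recomputed with the replicate weights, and the variability of the replicate estimates around the full-sample estimate yields the variance. The delete-one-group form, the group assignment by index, and the data are all illustrative; NAEP's production scheme uses paired replicates and repeats the full set of weighting adjustments for each replicate.

```python
def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def jackknife_variance(values, weights, n_groups=5):
    """Delete-one-group jackknife: drop one group at a time and recompute the estimate."""
    full = weighted_mean(values, weights)
    replicates = []
    for g in range(n_groups):
        # zero out the weights of units assigned to group g (assignment by index here)
        rep_w = [0.0 if i % n_groups == g else w for i, w in enumerate(weights)]
        replicates.append(weighted_mean(values, rep_w))
    # scale factor (G - 1) / G for the delete-one-group jackknife
    g_factor = (n_groups - 1) / n_groups
    return g_factor * sum((r - full) ** 2 for r in replicates)

scores = [212, 230, 198, 245, 221, 236, 204, 228, 219, 240]   # hypothetical scale scores
weights = [400.0] * 10
print(jackknife_variance(scores, weights))
```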
Additional information about the weighting procedures used in NAEP can be found in the technical documentation at http://nces.ed.gov/nationsreportcard/tdw/weighting/.
Once the sample of schools is selected for the 2026 NAEP administration, the NAEP State Coordinator and NAEP field staff typically follow a standard set of procedures for securing the participation of public and private schools. The procedures below are similar to those used for the 2024 operational assessment and the 2025 Field Test, but they are continuing to be refined for the 2026 operational assessment and will be updated as necessary in future amendments. These procedures include:
sending a notification to district superintendents identifying which and how many schools were selected for NAEP from their district (see Appendix D131 for the letter and Appendix D129 for the included information)
sending a notification to complete the School Technology Survey (see Appendix D59)
sending a notification of each school’s selection for NAEP to the principal or other administrative official, along with an assessment information packet containing introductory information and materials (see Appendix D132 for the letter and Appendix D130 for the included information)
sending a notification with each school’s NAEP assessment date to the principal or other administrative official (see Appendix D88 for 2025 sample), along with additional assessment information (see Appendix D82-85 for 2025 sample)
sending a notification to each school’s principal with instructions for assigning a school coordinator (see letter Appendix D67-70 for 2025 sample)
sending a notification to each school’s identified staff who will handle coordination with NAEP, application installation if needed, and technology support with instructions (see letter Appendix D90)
The responsibilities for school staff are determined by the NAEP Administration Model under which the school is assessed: either school devices or NAEP devices. At the time of this Clearance Package, it is assumed that 60 percent of the sample will be assessed using school devices, and 40 percent of the sample will be assessed using NAEP devices.
The School Coordinator role is essential for both the school device and NAEP device models. School Coordinators are responsible for preparing for the NAEP assessment in the school using the Assessment Management System (AMS), an online secure site that provides participating schools with a convenient way to prepare for the upcoming assessment. AMS serves as the primary resource and action center throughout the assessment process and offers School Coordinators an electronic way to prepare for the assessment at their own pace. Initial 2026 AMS content is included in this package; the remainder will be provided in future amendments. In addition to the School Coordinator role, a Technology Coordinator will also be identified for the school device model to assist with installing the NAEP Assessment Application on school devices and to assist with technical tasks on assessment day, including troubleshooting technology issues and ensuring that the NAEP Assessment Application is uninstalled from student devices after the assessment. During the assessment, school staff are to remain in the assessment location to provide support with classroom management and help ensure an optimal assessment experience; this can be the Technology Coordinator or another staff person that the school identifies to assist with classroom management. If the school requests to assess all 50 students at the same time, school staff are required to remain in the assessment location.
The AMS school summary page has activities that School Coordinators will need to complete. The following describes the different sections and activities that need to be completed, including the purpose and timeframe for each.
Receive initial communication.
Tasks: The District Superintendent, District Assessment Coordinator, and School Administrators receive initial communication that their schools have been sampled for NAEP, ensure that the School Technology Survey is completed, make sure that registration for the Assessment Management System (AMS) is complete, and determine roles for the assessment. Schools that are eligible and qualify for the school device model will be able to deploy the NAEP Assessment Application on student devices.
Purpose: Aids in identifying the administration model assignment and school staff support.
Timeline for 2026: Summer 2025
Deploy NAEP Assessment Application on school devices
Tasks: Identified Technology Coordinator will utilize the eNAEP Download Center to deploy the NAEP Assessment Application on school devices.
Purpose: Prepare school devices for the assessment.
Timeline for 2026: May–September 2025
Register and Provide School Information
Tasks: School Coordinators will register for the AMS website and provide school contact information and school characteristics, including student enrollment for the selected grade and charter school status.
Purpose: Gain access to the secure AMS website as the designated school coordinator and ensure that NAEP has the most up-to-date information about the school.
Timeline for 2026: June–December 2025
Import Student List/Sample
Tasks: NAEP collects a list of all students in the selected grade for each school. The school submits an Excel file with all students and their demographic data (see sample Appendix I). Note that, as described in section A.12, the School Coordinator is only responsible for this task if the State Coordinator has not previously submitted the student list for sampling; as such, only a portion of School Coordinators are responsible for this task. School device model schools will complete the installation and deployment of the NAEP Assessment Application.
Purpose: Draw a representative sample of students from the school to participate in the NAEP assessments. Ensure all students have an opportunity to be sampled.
Timeline for 2026: October–November 2025
Review Student Information
Tasks: Review demographic data to confirm accuracy and add any missing demographic data. School Coordinators will be asked to review and verify student information and to indicate whether students were displaced by a natural disaster.
Purpose: Demographic data are used for reporting results of student groups in The Nation’s Report Card.
Timeline for 2026: December 2025–January 2026
Complete SD/EL Student Information
Tasks: Determine how students participate in NAEP (i.e., without accommodations, with accommodations, or do not test). Provide the Individuals with Disabilities Education Act (IDEA) disability status, English proficiency, primary language, grade- or age-level performance, and accommodations (see Appendix I section B-8).
Purpose: Confirm students have appropriate support to access the NAEP assessment.
Timeline for 2026: December 2025–January 2026
Notify Parents
Tasks: Download the parent notification letters (see Appendix D61-63 for 2025 sample) and certify the date and method by which parents were notified. A translation notice is available to accompany the parent notification letter in instances where parents do not speak English or Spanish (see Appendix D64 for 2025 sample).
Purpose: Ensure that parents/guardians are notified of their student’s selection to participate in NAEP, which is a requirement of the Reauthorized Elementary and Secondary Education Act (ESEA).4
Timeline for 2026: December 2025–January 2026
Manage Questionnaires
Tasks: Identify respondents for the school questionnaire and the relevant teacher(s) for each student, send respondents links to online questionnaires, and monitor completion of questionnaires. Distribute information about NAEP to teachers (see Appendix D118 for 2025 sample).
Purpose: Results are used to provide contextual data from schools and teachers in The Nation’s Report Card.
Timeline for 2026: December 2025–January 2026
Add New Students
Tasks: Identify any newly enrolled students since the original list of students was provided.
Purpose: Ensure all students have an opportunity to be sampled so NAEP can assess a representative sample of students.
Schedule Students, Assessment Logistics, and Encourage Participation
Tasks: Determine assessment session times and locations, share cell phone policy to ensure security of NAEP items, and make a plan to encourage student participation.
Purpose: Ensure that the school is prepared for a successful administration of NAEP.
Timeline for 2026: December 2025–January 2026
Complete Technical Logistics (School Device schools only)
Tasks: School Coordinator will complete the technical logistics in the AMS system, including verifying the app works on student devices.
Purpose: Provides necessary information about schools’ devices and internet connections to alert field staff on what technical considerations need to be addressed prior to assessment day.
Timeline for 2026: One month prior to scheduled assessment day.
Print Appointment Cards and Teacher Notification Letters
Tasks: Print resources to notify students and teachers.
Purpose: Ensure students arrive at assessment location prepared and on time.
Timeline for 2026: One week prior to scheduled assessment date.
Remove NAEP Assessment Application from student devices (School Device schools only)
Tasks: Technology Coordinator will coordinate the uninstall of the NAEP Assessment Application from managed Chromebook devices or from Windows devices.
Purpose: Maintain security of the assessment
Timeline for 2026: Post scheduled assessment.
Destroy any documents with student identifying information
Tasks: School Coordinator will destroy any documents with student identifying information.
Purpose: Maintain security of the student information.
Timeline for 2026: End of the school year.
Before the assessment, the NAEP field representative will host a virtual meeting with the School Coordinator to review the completion of the tasks, answer any questions, and review assessment day procedures.
As part of the ongoing quality control of the assessment process, schools will be asked to complete an additional follow-up survey. Survey questions solicit feedback on pre-assessment, assessment, and procedural processes. The Assessment Feedback Survey from 2024 is included in Appendix E; the 2026 version will be submitted in Amendment #1 in summer 2025.
Trained NAEP representatives will administer the assessment and provide significant support to schools. Some schools will be assessed on NAEP devices, and some schools will be assessed on school devices. In schools assessed on NAEP devices, NAEP representatives will set up and administer the assessment and provide all necessary equipment and assessment materials to the school, including devices with attached keyboards and earbuds. In schools assessed on school devices, an assigned staff member will be responsible for identifying school devices (e.g., desktops, laptops, tablets with keyboards) for the assessment and preparing devices in advance of assessment day. For both administration models, NAEP representatives will pack up any NAEP-owned equipment or materials and leave the area as they found it.
The traditional NAEP design assesses each student in 60 minutes for one cognitive subject. Assessments are typically administered in groups of approximately 25 students, with two groups conducted sequentially during the school day, although additional concurrent groups may be required. Schools that are assessed under the school device model may choose to assess all students simultaneously in one or two locations if additional school staff are available to support the assessment and remain in the testing room. The assessments given in Puerto Rico are translated into Spanish; to account for the language complexities, additional time is provided for the cognitive blocks (for a total of 80 minutes).
Schools within each state will be selected, and the chief state school officer and the NAEP State Coordinator will be asked to solicit their cooperation. Since states and school districts receiving Title I funds are required to participate in the main NAEP reading and mathematics assessments (grades 4 and 8) under the National Assessment of Educational Progress Authorization Act, NAEP response rates have improved for these assessments. High schools and private schools have typically had lower response rates in NAEP. As such, NCES has created specialized materials targeted at these audiences:
The Best Practices Guide provides resources and strategies to increase twelfth-grade student motivation and participation (see Appendix D-124).
Videos and additional information on the NAEP website for schools, students, parents, and teachers (see http://nces.ed.gov/nationsreportcard/about/schools.aspx).
Additional brochures and resources targeting private schools, including NAEP in Your Private School (see Appendix D3, sample from 2024), and a webpage dedicated to private schools (http://nces.ed.gov/nationsreportcard/about/nonpublicschools.aspx).
There are four main areas that can be focused on to maximize completion rates: (1) early distribution of information and materials; (2) effective communication with school personnel; (3) efforts to encourage student participation; and (4) efforts made by field staff to avoid refusals and to convert initial refusals to cooperating schools.
Early Distribution of Information and Materials
Over the years, feedback from schools and states has indicated that notifying a school of its selection in the NAEP sample earlier rather than later is beneficial to the school for planning purposes and improves school response rates. NAEP generally notifies schools of selection in the early summer of the year prior to the assessment. In addition, the School Technology Survey and the eNAEP Download Center will be available at the time of school notification, to provide schools that qualify for the school device model with more information about the NAEP Assessment Application.
Effective Communication with School Staff
The participation of schools can be increased by effectively communicating information about NAEP, including what NAEP measures, the various assessment components, why it is important that schools, students, and teachers participate, and the role of the school staff. Effective communication materials from the State Coordinator and the field staff (as described in section B.2.a.) help maximize the participation of schools. In addition, an intuitive and easy-to-use AMS system, School Technology Survey, and eNAEP Download Center (as described in section B.2.b.) help ensure that the School and Technology Coordinators’ experiences are positive.
Encouraging Student Participation
Previous feedback from school administrators has shown that students respond more positively to the assessment when they know the assessment has the support of the school administration. Therefore, the field staff will encourage the School Coordinator to make efforts to encourage students to do their best, including having the principal introduce the assessment. In addition, field staff will suggest to the School Coordinator that grades 8 and 12 schools may want to issue community service credits for participating. Given that grade 12 student participation can be particularly challenging, NAEP has developed a Best Practices Guide to encourage grade 12 participation (to be updated in Amendment #1), which is shared with sampled high schools.
Avoiding Refusals and Converting Initial School Refusals
NAEP representatives will be trained in methods to maximize school participation, which will include being flexible in the assessment scheduling, following up with the School Coordinators, and scheduling in-person preparation meetings, at the School Coordinator’s request.
Not all of the students in the main NAEP sample will respond. Some will be unavailable during the sample time period because of absenteeism or other reasons. If a student decides not to participate, the action will be recorded, but no steps will be taken to obtain participation. The NAEP response rates follow AAPOR (American Association for Public Opinion Research) guidelines. Response rates, in percentages, from the 2022 and 2024 NAEP DBA assessments are shown below.
                                  2022                    2024
                           Grade 4    Grade 8      Grade 4    Grade 8
Student response rates
  Public schools              92         89           92         89
  Private schools             94         94           93         91
School response rates
  Public schools              99        100          100        100
  Private schools             38         50           45         34
Note: The numbers in the table above are rounded.
We are working to increase engagement of private school organization leaders in recruitment efforts and requesting customized endorsement letters from these organizations (see Appendix D136). We have also expanded outreach efforts to schools to promote the use of NAEP data tools to highlight the value of NAEP data to private schools. Furthermore, a customized dashboard for private schools is available on The Nation’s Report Card site.
NCES and the Governing Board have established participation rate standards that states and jurisdictions are required to meet to have their results published. Beginning in 2003, if a state’s school response rate is below 85 percent, the results will not be published by NAEP, regardless of the response rate after substitution (see https://nces.ed.gov/nationsreportcard/about/participrates.aspx and https://www.nagb.org/content/nagb/assets/documents/policies/samplingpolicy1.pdf).
Pilot testing of cognitive and non-cognitive items is carried out in all subject areas. The purpose of pilot testing is to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because the items are administered to a small, nationally representative sample of students and data are gathered about performance that crosses the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration.
Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB# 1850-0803). All non-cognitive items undergo one-on-one cognitive interviews, which is useful for identifying questionnaire and procedural problems before larger scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, to test out new item types or formats, or challenging content. In addition, usability testing is conducted on new technologies and technology-based platforms and instruments.
In 2024, NAEP transitioned to the eNAEP test delivery software, the platform on which the assessment is delivered to students. NAEP will also be changing the operational assessment delivery model for 2026. While NAEP previously administered assessments with the assistance of numerous NAEP field staff, who would all enter schools bringing NAEP Surface Pros and Chromebooks, the program is transitioning to a model that is ultimately less expensive and more aligned with the administration model used in state assessments. NAEP will administer the assessment using school devices and the internet. For schools that cannot meet the eligibility requirements for the use of school devices, NAEP will provide an alternate delivery model using less expensive NAEP Chromebooks.
To transition successfully to this model, a staged approach is being undertaken so that trends can be measured across time. Namely, NAEP conducted a School-based Equipment study in 2024 (OMB# 1850-0803 v.347) as well as a Field Test in 2025 (OMB# 1850-0803 v.353) to provide more information about student and school interactions with the eNAEP system on school devices and to prepare for the use of school devices in operational NAEP assessments moving forward. NAEP Chromebooks were also used in some schools to allow comparison with schools using school devices. In preparation for the 2026 NAEP administration, a Field Trial will be conducted with students in a live classroom environment in November 2025 by NAEP field administration staff. The Field Trial will fully replicate the NAEP operational administration testing conditions in a small number of schools. Since 2018, the NAEP program has utilized Field Trials prior to large-scale digitally based assessments to inform the upcoming administration.
NCES, ETS, and Westat staff have collaborated on aspects of the NAEP design. The primary persons responsible are, from NCES, Enis Dogan and Gina Broxterman; from ETS, Jay Campbell, Amy Dresher, Robert Finnegan, Yue Jia, and Ranu Palta-Upreti; and from Westat, Tom Krenzke, Lloyd Hicks, Lisa Rodriguez, and Marcie Hickman. In addition, the NAEP Design and Analysis Committee (see Appendix A-1) has also contributed to NAEP designs on an ongoing basis.
1 Participating states vary depending on the subject and grade assessed, but may include the 50 states, the District of Columbia, the Department of Defense Education Activity, and (for mathematics assessments only) Puerto Rico.
2 NAEP IRT scaling requires a minimum sample size of 1,500-2,000 students per item in order to estimate stable item parameters. Therefore, national assessments with larger item pools have larger samples.
3 NAEP IRT scaling is conducted for most pilot assessments, requiring a minimum of 1,500-2,000 students per item in order to estimate stable item parameters. Therefore, pilot assessments with larger item pools have larger samples.
4 Please note that parents/legal guardians are required to receive notification of student participation, but NAEP does not require explicit parental consent (by law, parents/guardians of students selected to participate in NAEP must be notified in writing of their child’s selection prior to the administration of the assessment).