NAEP 2026 Appendix B Weighting Procedures




 

NATIONAL CENTER FOR EDUCATION STATISTICS 

NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS 

 

 

National Assessment of Educational Progress (NAEP) 2026 

 

 

 

 

Appendix B

NAEP 2022 Weighting Procedures 

 

 

 

OMB# 1850-0928 v.36 

 

 

 

 

 

 

 

 

May 2025



The 2022 Weighting Procedures documentation is the most current version available to the public. At this time, there is no timeline for when the details for later assessment years will be publicly available.


NAEP Technical Documentation Website

NAEP Technical Documentation Weighting Procedures for the 2022 Assessment


NAEP assessments use complex sample designs to create student samples that generate population and subpopulation estimates with reasonably high precision. School and student sampling weights ensure valid inferences from the student samples to their respective populations. In 2022, weights were developed for schools and students sampled at grades 4 and 8 for assessments in mathematics and reading, schools and students sampled at grade 8 for assessments in civics and U.S. history, and schools and students sampled at ages 9 and 13 for long-term trend (LTT) assessments in mathematics and reading. The grade-based assessments were administered using tablets, and the LTT assessments were administered using paper and pencil.

Student Weights

Each student was assigned a weight to be used for making inferences about students in the target population. This weight is known as the final full-sample student weight and contains the following major components:

the student base weight,

school nonresponse adjustments,

student nonresponse adjustments,

school weight trimming adjustments,

student weight trimming adjustments, and

student raking adjustment.

Computation of Full-Sample Student Weights

Computation of Replicate Student Weights for Variance Estimation

Computation of Full-Sample School Weights

Computation of Replicate School Weights for Variance Estimation

Quality Control on Weighting Procedures

The student base weight is the inverse of the overall probability of selecting a student and assigning that student to a particular assessment. The sample design that determines the base weights is discussed in the NAEP 2022 Sample Design section.

The student base weight is adjusted for two sources of nonparticipation: at the school level and at the student level. These weighting adjustments seek to reduce the potential for bias from such nonparticipation. Responding schools receive a weighting adjustment to compensate for nonresponding schools, and responding students receive a weighting adjustment to compensate for nonresponding students.

Furthermore, the final weights reflect the trimming of extremely large weights at both the school and student level. These weighting adjustments seek to reduce variances of survey estimates.

An additional weighting adjustment was implemented in the state and Trial Urban District Assessment (TUDA) samples so that estimates for key student-level characteristics were in agreement across assessments in reading and mathematics. This adjustment was implemented using a raking procedure. A similar but separate adjustment was also implemented for the national public school civics and U.S. history samples at grade 8. The raking procedure implemented for civics and U.S. history brought estimates for key student-level characteristics into agreement with those from mathematics and reading at the national level. Similar to previous years, raking was not performed for any of the private school student samples or for student samples in the LTT assessments.

In addition to the final full-sample weight, a set of replicate weights was provided for each student. These replicate weights are used to calculate the variances of survey estimates using the jackknife repeated replication method. The methods used to derive these weights were aimed at reflecting the features of the sample design, so that when the jackknife variance estimation procedure is implemented, approximately unbiased estimates of sampling variance are obtained. In addition, the various weighting procedures were repeated on each set of replicate weights to appropriately reflect the impact of the weighting adjustments on the sampling variance of a survey estimate. A finite population correction (fpc) factor was incorporated into the replication scheme so that it could be reflected in the variance estimates for the grade-based assessments. Similar to previous years, the replication scheme for LTT does not incorporate a finite population correction factor. See Computation of Replicate Student Weights for Variance Estimation for details.

School Weights

In addition to student weights, school weights were calculated to provide secondary users the means to analyze data at the school level. The school weights are subject specific and represent the schools that contained at least one student who participated in the NAEP assessment for that subject.

Each school was assigned a weight to be used for making inferences about schools in the target population. This weight is known as the final full-sample school weight, and it contains five major components:

the school base weight,

school nonresponse adjustment,

school weight trimming adjustment,

school session assignment weight, and

small-school subject adjustment.

The school base weight is the inverse of the probability of selecting a school for a particular assessment. The school nonresponse adjustment increases the weights of participating schools to account for similar schools that did not participate, and the school trimming adjustment reduces extremely large weights to decrease variances of survey estimates. These two adjustments are the same school-level adjustments used in the student full-sample weight described above.

The school session assignment weight reflects the probability that the particular session type was assigned to the school.

The small-school subject adjustment accounts for very small schools that did not have enough participating students for every subject associated with the school. School weights for subjects that had at least one eligible student are inflated by this factor to compensate for subject(s) that did not have any eligible students in that school and, thus, are not represented otherwise.

In addition to the full-sample weight, a set of replicate weights was provided for each school. The school replicate weights are used to calculate the variances of school-level survey estimates using the jackknife repeated replication method.

Quality Control Procedures

Quality control checks were carried out throughout the weighting process to ensure the accuracy of the full-sample and replicate weights. See Quality Control on Weighting Procedures for the various checks implemented and main findings of interest.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/weighting_procedures_for_the_2022_assessment.aspx




NAEP Technical Documentation Computation of Full-Sample School Weights

The full-sample or final school weight is the sampling weight used to derive NAEP school estimates of population and subpopulation characteristics for a specified grade (4 and 8) or age (9) and assessment subject (civics, mathematics, reading, and U.S. history). The full-sample school weight reflects the number of schools that the sampled school represents in the population for purposes of estimation.

The full-sample weight, which is used to produce survey estimates, is distinct from a replicate weight that is used to estimate variances of survey estimates. The full-sample weight is assigned to participating schools and reflects the school base weight after the application of the various weighting adjustments. The full-sample weight \(SCH\_WGT_{js}\) for school \(s\) in stratum \(j\) can be expressed as follows:

\begin{equation} SCH\_WGT_{js} = SCH\_BWT_{js} \times SCH\_NRAF_{js} \times SCH\_TRIM_{js} \times SCHSESWT_{js} \times SCH\_SUBJ\_AF_{js} \end{equation}

where

\(SCH\_BWT_{js}\) is the school base weight;

\(SCH\_NRAF_{js}\) is the school-level nonresponse adjustment factor;

\(SCH\_TRIM_{js}\) is the school-level weight trimming adjustment factor;

\(SCHSESWT_{js}\) is the school-level session assignment weight that reflects the conditional probability, given the school, that the particular session type was assigned to the school; and

\(SCH\_SUBJ\_AF_{js}\) is the small-school subject adjustment factor.

For 2022, the school-level session assignment weight is always one because schools were only assigned to one session type.

The small-school subject adjustment accounts for very small schools that did not have enough participating students for every subject intended for the school. School weights for subjects that had at least one eligible student are inflated by this factor to compensate for schools of the same size that did not have any eligible students for those subjects and would not be represented otherwise.

The factor is equal to the inverse of the probability that a school of a given size had at least one eligible sampled student in a given subject:

\begin{equation} SCH\_SUBJ\_AF_{js} = \max \biggl(\dfrac{SF_{js}}{n_{s}},1 \biggr) \end{equation}



where





\(SF_{js}\) is the spiraling factor for the given subject; and

\(n_{s}\) is the within-school student sample size.

For example, if a school was to assess students in two subjects with a spiraling ratio of 1:1 (i.e., a spiraling factor of 2) but had only one eligible student, then the small-school subject adjustment would be equal to 2. The factor for schools not needing this adjustment was set equal to 1.
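To make the calculation concrete, here is a minimal sketch (hypothetical function and argument names, not NAEP production code) of the small-school subject adjustment factor, reproducing the 1:1 spiral example above.

```python
# Sketch of the small-school subject adjustment factor, SCH_SUBJ_AF = max(SF / n, 1).
# Hypothetical helper; illustrates the formula above, not NAEP production code.

def small_school_subject_adjustment(spiral_factor: float, n_students: int) -> float:
    """Inverse of the probability that a school of this size yielded at least
    one sampled student in the subject, bounded below by 1."""
    return max(spiral_factor / n_students, 1.0)

# Example from the text: a 1:1 spiral (spiraling factor of 2) with one eligible student.
print(small_school_subject_adjustment(spiral_factor=2, n_students=1))   # 2.0
# A school with enough sampled students for every subject needs no adjustment.
print(small_school_subject_adjustment(spiral_factor=2, n_students=10))  # 1.0
```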

For the 2022 operational assessments, schools could be assigned to one of four sample types:

  1. Grades 4 and 8 mathematics and reading except Puerto Rico,

  2. Grade 8 civics and U.S. history,

  3. Grades 4 and 8 mathematics (Puerto Rico),

  4. Age 9 mathematics and reading long-term trend (LTT).

Students in schools participating in the grades 4 and 8 mathematics and reading assessments were assigned to mathematics and reading at the rates of 52 percent and 48 percent, respectively, at grade 4, and 50 percent for each subject at grade 8. Students in schools participating in the grade 8 civics and U.S. history assessments were assigned to civics and U.S. history at the rates of 49 percent and 51 percent, respectively. Students in schools participating in the age 9 mathematics and reading assessments were assigned to mathematics and reading at rates of 50 percent for each subject. Puerto Rico had only one operational assessment, so all students in grades 4 and 8 assigned to the operational assessment were assigned to mathematics.

Overall, the school weights of 27 of the approximately 5,200 schools participating in the grade 4 mathematics and reading assessment sample were adjusted to compensate for schools that were too small to take part only in mathematics or only in reading. The small-school adjustment factors ranged from 1.03 to 2.07. For the grade 8 mathematics and reading assessment sample, seven of 5,200 schools had their school weights adjusted to compensate for their size. The small-school adjustment factor was 2.00. Only one out of 400 schools had its school weight adjusted for the LTT assessments in mathematics and reading to account for schools that were too small to participate in both subjects. The small-school adjustment factor of 2 was used. For the assessment sample in civics and U.S. history at grade 8, the adjustment factor was set equal to 1 for all schools.





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computation_of_full_sample_school_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation Computation of Full-Sample Student Weights


The full-sample or final student weight is the sampling weight used to derive NAEP student estimates of population and subpopulation characteristics for a specified grade (4 or 8) or age (9 or 13) and assessment subject (civics, mathematics, reading, or U.S. history). The full-sample student weight reflects the number of students in the population that the sampled student represents for purposes of estimation. The summation of the final student weights over a particular student group provides an estimate of the total number of students in that group within the population.

The full-sample weight, which is used to produce survey estimates, is distinct from a replicate weight that is used to estimate variances of survey estimates. The full-sample weight is assigned to participating students and reflects the student base weight after the application of the various weighting adjustments.

Computation of Base Weights

School and Student Nonresponse Weight Adjustments

School and Student Weight Trimming Adjustments

Student Weight Raking Adjustment

The full-sample weight \(FSTUWGT_{jsk}\) for student \(k\) from school \(s\) in stratum \(j\) can be expressed as

\begin{equation} FSTUWGT_{jsk} = STU\_BWT_{jsk} \times SCH\_NRAF_{js} \times STU\_NRAF_{jsk} \times \\ SCH\_TRIM_{js} \times STU\_TRIM_{jsk} \times STU\_RAKE_{jsk} , \end{equation}

where

\(STU\_BWT_{jsk}\) is the student base weight;

\(SCH\_NRAF_{js}\) is the school-level nonresponse adjustment factor;

\(STU\_NRAF_{jsk}\) is the student-level nonresponse adjustment factor;

\(SCH\_TRIM_{js}\) is the school-level weight trimming adjustment factor;

\(STU\_TRIM_{jsk}\) is the student-level weight trimming adjustment factor; and

\(STU\_RAKE_{jsk}\) is the student-level raking adjustment factor.
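As a concrete illustration of how these components combine, the sketch below (illustrative field names, not an actual NAEP data layout) multiplies the six factors together for one student.

```python
# Sketch of the full-sample student weight as the product of its components,
# following the equation above. Field names are illustrative only.
from math import prod

def full_sample_student_weight(components: dict) -> float:
    """FSTUWGT = STU_BWT x SCH_NRAF x STU_NRAF x SCH_TRIM x STU_TRIM x STU_RAKE."""
    keys = ["STU_BWT", "SCH_NRAF", "STU_NRAF", "SCH_TRIM", "STU_TRIM", "STU_RAKE"]
    return prod(components[k] for k in keys)

# Hypothetical student: base weight of 250, modest nonresponse and raking
# adjustments, and no trimming (trimming factors of 1).
student = {"STU_BWT": 250.0, "SCH_NRAF": 1.10, "STU_NRAF": 1.05,
           "SCH_TRIM": 1.0, "STU_TRIM": 1.0, "STU_RAKE": 0.98}
print(full_sample_student_weight(student))  # product of all six factors
```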

School sampling strata for a given assessment vary by school type (public or private), assessment subject (civics, mathematics, reading, or U.S. history), and grade (4 or 8) or age (9 or 13). See the links below for descriptions of the school strata for the various assessments.

State public school samples for mathematics and reading at grades 4 and 8

National private school samples for mathematics and reading at grades 4 and 8

National public school samples for civics and U.S. history at grade 8

National private school samples for civics and U.S. history at grade 8

National public school samples for mathematics and reading at ages 9 and 13

National private school samples for mathematics and reading at ages 9 and 13




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computation_of_full_sample_student_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation Computation of Base Weights


Every sampled school and student received a base weight equal to the reciprocal of its probability of selection.

Computation of a school base weight varies by

type of sampled school (original or substitute); and

sampling frame (new school frame or not).

Computation of a student base weight reflects

the student's overall probability of selection accounting for school and student sampling;

assignment to session type at the school and student levels; and

the student's assignment to a particular subject.

School Base Weights

Student Base Weights





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computation_of_base_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation School Base Weights

The school base weight for a sampled school is equal to the inverse of its overall probability of selection. The overall selection probability of a sampled school differs by the type of sampled school (original or substitute) and by the type of sampling frame (new school frame or not).

The overall selection probability of an originally selected school in a civics, mathematics, reading, or U.S. history sample is equal to its probability of selection from the NAEP public/private school frame.

The overall selection probability of a school from the new school frame in a civics, mathematics, reading, or U.S. history sample is the product of two quantities:

the probability of selection of the school's district into the new-school district sample or the Catholic diocese into the new-school Catholic diocese sample, and

the probability of selection of the school into the new school sample.

The new-school district sampling procedures for the 2022 national public school samples for the civics and U.S. history assessments at grade 8 are very similar to the new-school district sampling procedures for the 2022 state public school assessments in mathematics and reading.

New-school Catholic diocese sampling procedures for the 2022 national private school assessments for mathematics and reading at grades 4 and 8 and for civics and U.S. history at grade 8 are similar as well.

For the mathematics and reading long-term trend (LTT) assessments at ages 9 and 13, the new-school district and Catholic diocese sampling procedures took advantage of the work already being done for the grade-based assessments.

Substitute schools are preassigned to original schools and take the place of original schools if they refuse to participate. For weighting purposes, substitute schools are treated as if they were the original schools they replaced, so substitute schools are assigned the school base weight of their corresponding original schools.

Learn more about substitute schools for the 2022 national public school assessments for civics and U.S. history at grade 8 and for mathematics and reading LTT assessments at age 9. The 2022 state public school assessments in mathematics and reading do not use substitute schools.

Learn more about substitute schools for the 2022 national private school assessments in mathematics and reading at grades 4 and 8, in civics and U.S. history at grade 8, and in mathematics and reading LTT assessments at ages 9 and 13.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/school_base_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation Student Base Weights

Every sampled student received a student base weight, whether or not the student participated in the assessment. The student base weight is the reciprocal of the probability that the student was sampled to participate in the assessment for a specified subject. The student base weight \(STU\_BWT_{jsk}\) for student \(k\) from school \(s\) in stratum \(j\) is the product of seven weighting components and can be expressed as

\begin{equation} STU\_BWT_{jsk} = SCH\_BWT_{js} \times SCHSESWT_{js} \times WINSCHWT_{js} \times \\ STUSESWT_{jsk} \times SUBJFAC_{jsk} \times SUBADJ_{js} \times YRRND\_AF_{js}, \end{equation}

where

\(SCH\_BWT_{js}\) is the school base weight;

\(SCHSESWT_{js}\) is the school-level session assignment weight that reflects the conditional probability, given the school, that the particular session type was assigned to the school;

\(WINSCHWT_{js}\) is the within-school student weight that reflects the conditional probability, given the school, that the student was selected for the NAEP assessment;

\(STUSESWT_{jsk}\) is the student-level session assignment weight that reflects the conditional probability, given that the particular session type was assigned to the school, that the student was assigned to the session type;

\(SUBJFAC_{jsk}\) is the subject spiral adjustment factor that reflects the conditional probability, given that the student was assigned to a particular session type, that the student was assigned the specified subject;

\(SUBADJ_{js}\) is the substitution adjustment factor to account for the difference in enrollment size between the substitute and original school; and

\(YRRND\_AF_{js}\) is the year-round adjustment factor to account for students in year-round schools on scheduled break at the time of the NAEP assessment and thus not available to be included in the sample.

The within-school student weight \((WINSCHWT_{js})\) is the inverse of the student sampling rate in the school. For long-term trend (LTT), due to the oversampling of certain race/ethnicity student groups, some schools have two student sampling rates.

The subject spiral adjustment factor \((SUBJFAC_{jsk})\) adjusts the student weight to account for the spiral pattern used in

distributing civics, mathematics, reading, or U.S. history booklets to the students. The subject factor varies by grade (or age, for LTT) and subject; it is equal to the inverse of the booklet proportions (civics, mathematics, reading, or U.S. history) in the overall spiral for a specific sample.

For cooperating substitutes of nonresponding original sampled schools, the substitution adjustment factor \((SUBADJ_{js})\) is equal to the ratio of the estimated grade (or age-specific) enrollment for the original sampled school to the estimated grade (or age-specific) enrollment for the substitute school. The student sample from the substitute school then "represents" the set of grade-eligible (or age-eligible) students from the original sampled school.

The year-round adjustment factor \((YRRND\_AF_{js})\) adjusts the student weight for students in year-round schools who do not attend school during the time of the assessment. This situation typically arises in overcrowded schools. School administrators in year-round schools randomly assign students to portions of the year in which they attend school and portions of the year in which they do not attend. At the time of assessment, a certain percentage of students (designated as \(OFF_{js}\)) do not attend school and thus cannot be assessed. The \(YRRND\_AF_{js}\) for a school is calculated as \(1/(1 - OFF_{js}/100)\).
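The sketch below (hypothetical names and values, not drawn from NAEP files) combines the seven components above, including the year-round adjustment \(1/(1 - OFF_{js}/100)\), into a student base weight.

```python
# Sketch of the student base weight as the product of the seven components above.
# Names and values are illustrative; they are not taken from NAEP data files.
from math import prod

def year_round_adjustment(off_pct: float) -> float:
    """YRRND_AF = 1 / (1 - OFF/100), where OFF is the percentage of students
    on scheduled break at the time of the assessment."""
    return 1.0 / (1.0 - off_pct / 100.0)

def student_base_weight(sch_bwt, schseswt, winschwt, stuseswt, subjfac, subadj, yrrnd_af):
    """STU_BWT = SCH_BWT x SCHSESWT x WINSCHWT x STUSESWT x SUBJFAC x SUBADJ x YRRND_AF."""
    return prod([sch_bwt, schseswt, winschwt, stuseswt, subjfac, subadj, yrrnd_af])

# Hypothetical example: school base weight of 50, session weights of 1, a 1-in-4
# within-school student sampling rate, a 52 percent assignment rate to mathematics
# (SUBJFAC = 1/0.52), an original (not substitute) school, and 25 percent of
# students on scheduled break.
stu_bwt = student_base_weight(50.0, 1.0, 4.0, 1.0, 1 / 0.52, 1.0, year_round_adjustment(25.0))
print(round(stu_bwt, 2))
```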




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/student_base_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation School and Student Nonresponse Weight Adjustments


Nonresponse is unavoidable in any voluntary survey of a human population. Nonresponse leads to the loss of sample data that must be compensated for in the weights of the responding sample members. This differs from ineligibility, for which no adjustments are necessary. The purpose of the nonresponse adjustments is to reduce the mean square error of survey estimates. While the nonresponse adjustment reduces the bias from the loss of sample, it also increases variability among the survey weights leading to increased variances of the sample estimates.

However, it is presumed that the reduction in bias more than compensates for the increase in the variance, thereby reducing the mean square error and thus improving the accuracy of survey estimates. Nonresponse adjustments are made in the NAEP surveys at both the school and the student levels: the responding (original and substitute) schools receive a weighting adjustment to compensate for nonresponding schools, and responding students receive a weighting adjustment to compensate for nonresponding students.

School Nonresponse Weight Adjustment

Student Nonresponse Weight Adjustment

The paradigm used for nonresponse adjustment in NAEP is the quasi-randomization approach (Oh and Scheuren, 1983). In this approach, school response cells are based on characteristics of schools known to be related to both response propensity and achievement level, such as the locale type (e.g., large principal city of a metropolitan area) of the school. Likewise, student response cells are based on characteristics of the schools containing the students and student characteristics that are known to be related to both response propensity and achievement level, such as student race/ethnicity, gender, and age.

Under this approach, sample members are assigned to mutually exclusive and exhaustive response cells based on predetermined characteristics. A nonresponse adjustment factor is calculated for each cell as the ratio of the sum of adjusted base weights for all eligible units to the sum of adjusted base weights for all responding units. The nonresponse adjustment factor is then applied to the base weight of each responding unit. In this way, the weights of responding units in the cell are "weighted up" to represent the full set of responding and nonresponding units in the response cell.

The quasi-randomization paradigm views nonresponse as another stage of sampling. Within each nonresponse cell, the paradigm assumes that the responding sample units are a simple random sample from the total set of all sample units. If this model is valid, then the use of the quasi- randomization weighting adjustment will eliminate any nonresponse bias. Even if this model is not valid, the weighting adjustments can eliminate bias if the achievement scores are homogeneous within the response cells. See, for example, chapter 4 of Little and Rubin (1987).




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/school_and_student_nonresponse_weight_adjustments_for_the_2022_assessment.aspx




NAEP Technical Documentation School Nonresponse Weight Adjustment


The school nonresponse adjustment procedure inflates the weights of cooperating schools to account for eligible noncooperating schools for which no substitute schools participated. The adjustments are computed within nonresponse cells and are based on the assumption that the cooperating and noncooperating schools within the same cell are more similar to each other than to schools from different cells. School nonresponse adjustments were carried out separately by sample; that is, by

sample level (state, national),

school type (public, private),

grade (4, 8) or age (9, 13), and

assessment subject (civics, mathematics, reading, U.S. history).

Development of Initial School Nonresponse Cells

Development of Final School Nonresponse Cells

School Nonresponse Adjustment Factor Calculation







http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/school_nonresponse_weight_adjustment_for_the_2022_assessment.aspx




NAEP Technical Documentation Development of Final School Nonresponse Cells

Limits were placed on the magnitude of cell sizes and adjustment factors to prevent unstable nonresponse adjustments and unacceptably large nonresponse factors. All initial weighting cells with fewer than six cooperating schools or adjustment factors greater than 3.0 (or 4.0 for long-term trend [LTT]) for the full sample weight were collapsed with suitable adjacent cells. Simultaneously, all initial weighting cells for any replicate with fewer than four cooperating schools or adjustment factors greater than the maximum of 3.0 or two times the full sample nonresponse adjustment factor were collapsed with suitable adjacent cells. Initial weighting cells were generally collapsed in reverse order of the cell structure; that is, starting at the bottom of the nesting structure and working up toward the top level of the nesting structure.
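As an illustration of the full-sample collapsing rule just described, the sketch below (hypothetical function; the replicate-level checks follow the same pattern with their own thresholds) flags an initial school nonresponse cell for collapsing.

```python
# Sketch of the full-sample collapsing check for an initial school nonresponse cell:
# collapse if the cell has fewer than 6 cooperating schools or an adjustment factor
# above 3.0 (4.0 for LTT). Illustrative only, not NAEP production code.
def needs_collapsing(n_cooperating: int, adjustment_factor: float, is_ltt: bool = False) -> bool:
    max_factor = 4.0 if is_ltt else 3.0
    return n_cooperating < 6 or adjustment_factor > max_factor

print(needs_collapsing(n_cooperating=5, adjustment_factor=1.8))                 # True: too few schools
print(needs_collapsing(n_cooperating=12, adjustment_factor=3.4))                # True: factor too large
print(needs_collapsing(n_cooperating=12, adjustment_factor=3.4, is_ltt=True))   # False: within LTT limit
```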

State Public School Samples for Mathematics and Reading Assessments at Grades 4 and 8

For the grade 4 and 8 public school samples for mathematics and reading, cells with the most similar Black/Hispanic, achievement level, median income, or enrollment composition stratum within a given jurisdiction/Trial Urban District Assessment (TUDA) district and urbanicity (urban-centric locale) stratum were collapsed first. If further collapsing was required after all levels of the first variable were collapsed, cells with the most similar urbanicity strata were combined next. Cells were never permitted to be collapsed across jurisdictions or TUDA districts.

National Public School Samples for Civics and U.S. History Assessments at Grade 8

For the grade 8 public school civics and U.S. history sample, Black/Hispanic composition stratum cells within a given census division stratum and urbanicity stratum were collapsed first. If further collapsing was required after all levels of race/ethnicity classification were collapsed, cells with the most similar urbanicity strata were combined next. Any further collapsing occurred across census division strata but never across census regions.

National Public School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

For the LTT public school samples for mathematics and reading, race/ethnicity classification cells within a given census region stratum and urbanicity stratum were collapsed first. Any further collapsing occurred across urbanicity strata but never across census regions.

National Private School Samples for Mathematics and Reading Assessments at Grades 4 and 8

For the grade 4 and 8 private school samples for mathematics and reading, cells with the most similar race/ethnicity classification within a given affiliation, census region, and urbanicity stratum were collapsed first. If further collapsing was required after all levels of race/ethnicity strata were collapsed, cells with the most similar urbanicity classification were combined. Any further collapsing occurred across census region strata but never across affiliations.

National Private School Samples for Civics and U.S. History Assessments at Grade 8

For the grade 8 private school civics and U.S. history samples, cells with the most similar race/ethnicity classification within a given affiliation, census region, and urbanicity stratum were collapsed first. If further collapsing was required after all levels of race/ethnicity strata were collapsed, cells with the most similar urbanicity classification were combined. Any further collapsing occurred across census region strata but never across affiliations.

National Private School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

For the LTT private school samples for mathematics and reading, urbanicity strata within a given affiliation and census region were collapsed first. Any further collapsing occurred across census region strata but never across affiliations.

http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/development_of_final_school_nonresponse_cells_for_the_2022_assessment.aspx




NAEP Technical Documentation Development of Initial School Nonresponse Cells

The cells for nonresponse adjustments are generally functions of the school sampling strata for the individual samples. School sampling strata usually differ by assessment subject, grade (or age for long-term trend [LTT]), and school type (public or private). Assessment subjects that are administered together by way of spiraling have the same school samples and stratification schemes. Subjects that are not spiraled with any other subjects have their own separate school sample. In NAEP 2022, the following assessments were spiraled together:

mathematics and reading assessments at grades 4 and 8;

civics and U.S. history assessments at grade 8; and

mathematics and reading LTT assessments at ages 9 and 13.

The initial nonresponse cells for the various NAEP 2022 samples are described below.

State Public School Samples for Mathematics and Reading Assessments at Grades 4 and 8

For these samples, initial weighting cells were formed within each jurisdiction and grade using the following nesting cell structure:

Trial Urban District Assessment (TUDA) district vs. the balance of the state for states with TUDA districts;

urbanicity (urban-centric locale) stratum; and

race/ethnicity classification stratum, achievement level, median income, or grade enrollment.

In general, the nonresponse cell structure used race/ethnicity classification stratum as the lowest level variable. However, where there was only one race/ethnicity classification stratum within a particular urbanicity stratum, then categorized achievement, median income, or enrollment data was used instead.


National Public School Samples for Civics and U.S. History Assessments at Grade 8

The initial weighting cells for these samples were formed using the following nesting cell structure:

census division stratum;

urbanicity stratum (urban-centric locale); and

Black/Hispanic composition stratum.

National Public School Sample for Mathematics and Reading LTT Assessments at Ages 9 and 13

The initial weighting cells for these samples were formed using the following nesting cell structure:

census division stratum;

urbanicity stratum (four categories based on urban-centric locale); and

race/ethnicity classification (categories based on the total percentage of Black, Hispanic, and American Indian/Alaska Native students).

National Private School Samples for Mathematics and Reading Assessments at Grades 4 and 8

The initial weighting cells for these samples were formed within each grade using the following nesting cell structure:

affiliation;

census region stratum;

urbanicity stratum (urban-centric locale); and

race/ethnicity classification stratum.

National Private School Samples for Civics and U.S. History Assessments at Grade 8

The initial weighting cells for these samples were formed using the following nesting cell structure:

affiliation;

census region stratum;

urbanicity stratum (urban-centric locale); and

race/ethnicity classification stratum.

National Private School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

The initial weighting cells for these samples were formed using the following nesting cell structure:

affiliation;

census region stratum; and

urbanicity stratum (four categories based on urban-centric locale).




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/development_of_initial_school_nonresponse_cells_for_the_2022_assessment.aspx




NAEP Technical Documentation School Nonresponse Adjustment Factor Calculation

In each final school nonresponse adjustment cell \(c\), the school nonresponse adjustment factor \(SCH\_NRAF_{c}\) was computed as

\begin{equation} SCH\_NRAF_{c} = \dfrac { \sum_{ s \in S_{c}} { SCH\_BWT_{s} \times SCH\_TRIM_{s} \times SCHSESWT_{s} \times X_{s}} } { \sum_{ s \in R_{c}} { SCH\_BWT_{s} \times SCH\_TRIM_{s} \times SCHSESWT_{s} \times X_{s}} }, \end{equation}

where

\(S_{c}\) is the set of all eligible sampled schools (cooperating original and substitute schools and refusing original schools with noncooperating or no assigned substitute) in cell \(c\),

\(R_{c}\) is the set of all cooperating schools within \(S_{c}\),

\(SCH\_BWT_{s}\) is the school base weight,

\(SCH\_TRIM_{s}\) is the school-level weight trimming factor,

\(SCHSESWT_{s}\) is the school-level session assignment weight that reflects the conditional probability, given the school, that the particular assessment type was assigned to the school, and

\(X_{s}\) is the estimated grade enrollment (or age-specific enrollment for long-term trend [LTT]) corresponding to the original sampled school.
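The sketch below (illustrative record layout, not NAEP production code) evaluates this ratio for a single nonresponse cell containing both cooperating and noncooperating schools.

```python
# Sketch of the school nonresponse adjustment factor for one cell: the ratio of
# the enrollment-weighted sum over all eligible schools (S_c) to the same sum
# over cooperating schools (R_c). Record layout is illustrative only.
def school_nonresponse_factor(schools: list[dict]) -> float:
    def term(s):
        # X is the estimated grade (or age-specific) enrollment for the school
        return s["SCH_BWT"] * s["SCH_TRIM"] * s["SCHSESWT"] * s["X"]
    eligible = sum(term(s) for s in schools)                         # S_c
    responding = sum(term(s) for s in schools if s["cooperating"])   # R_c
    return eligible / responding

cell = [
    {"SCH_BWT": 20.0, "SCH_TRIM": 1.0, "SCHSESWT": 1.0, "X": 120, "cooperating": True},
    {"SCH_BWT": 18.0, "SCH_TRIM": 1.0, "SCHSESWT": 1.0, "X": 95,  "cooperating": True},
    {"SCH_BWT": 25.0, "SCH_TRIM": 1.0, "SCHSESWT": 1.0, "X": 60,  "cooperating": False},
]
print(round(school_nonresponse_factor(cell), 3))  # factor applied to cooperating schools in the cell
```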




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/school_nonresponse_adjustment_factor_calculation_for_the_2022_assessment.aspx




NAEP Technical Documentation Student Nonresponse Weight Adjustment


The student nonresponse adjustment procedure inflates the weights of assessed students to account for eligible sampled students who did not participate in the assessment. These inflation factors offset the loss of data associated with absent students. The adjustments are computed within nonresponse cells and are based on the assumption that the assessed and absent students within the same cell are more similar to one another than to students from different cells. Like its counterpart at the school level, the student nonresponse adjustment is intended to reduce the mean square error and thus improve the accuracy of NAEP assessment estimates. Also, like their counterparts at the school level, student nonresponse adjustments were carried out separately by sample; that is, by

grade (4, 8) or age (9, 13),

school type (public, private), and

assessment subject (civics, mathematics, reading, U.S. history).

Development of Initial Student Nonresponse Cells

Development of Final Student Nonresponse Cells

Student Nonresponse Adjustment Factor Calculation




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/student_nonresponse_weight_adjustment_for_the_2022_assessment.aspx




NAEP Technical Documentation Development of Final Student Nonresponse Cells

Similar to the school nonresponse adjustment, cell and adjustment factor size constraints are in place to prevent unstable nonresponse adjustments or unacceptably large adjustment factors. All initial weighting cells with either fewer than 20 participating students or adjustment factors greater than 2.0 for the full sample weight were collapsed with suitable adjacent cells. Simultaneously, all initial weighting cells for any replicate with either fewer than 15 participating students or an adjustment factor greater than the maximum of 2.0 or 1.5 times the full sample nonresponse adjustment factor were collapsed with suitable adjacent cells.

Initial weighting cells were generally collapsed in reverse order of the cell structure; that is, starting at the bottom of the nesting structure and working up toward the top level of the nesting structure. Race/ethnicity cells within students with disabilities (SD) and English learners (EL) groups, school nonresponse cell, age for grade-based assessments or grade for long-term trend (LTT) age-based assessments, and gender classes were collapsed first. If further collapsing was required after collapsing all race/ethnicity classes, cells were next combined across gender, then age for grade-based or grade for age-based assessments, and finally school nonresponse cells. Cells are never collapsed across SD and EL groups for any sample.





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/development_of_final_student_nonresponse_cells_for_the_2022_assessment.aspx




NAEP Technical Documentation Development of Initial Student Nonresponse Cells

Initial student nonresponse cells are generally created within each sample as defined by grade (or age), school type (public or private), and assessment subject (civics, mathematics, reading, or U.S. history). However, when subjects are administered together by way of spiraling, the initial student nonresponse cells are created across the subjects in the same spiral. The rationale behind this decision is that spiraled subjects are in the same schools, and the likelihood that an eligible student participates in an assessment is related more to the student's school than to the assessment subject. Nonresponse adjustment procedures are not applied to excluded students or full-time remote students because they are not required to complete an assessment. Full-time remote students are enrolled in brick-and-mortar schools but do not attend school in person.

The initial student nonresponse cells for the various NAEP 2022 samples are described below.

State Public School Samples for Mathematics and Reading Assessments at Grades 4 and 8

The initial student nonresponse cells for these samples were defined within grade, jurisdiction, and Trial Urban District Assessment (TUDA) district hierarchically as follows:

Students with disabilities (SD)/English learners (EL) by subject;

school nonresponse cell;

age (classified into "older"1 student and "modal age or younger" student);

gender; and

race/ethnicity.

The highest level variable in the cell structure separates students who were classified either as SD or EL from those who are neither, since SD and EL students tend to score lower on assessment tests than non-SD/non-EL students. In addition, the students in the SD or EL groups are further broken down by subject, since rules for excluding students from the assessment generally differ by subject. Non-SD and non-EL students are not broken down by subject, since the exclusion rules do not apply to them.

National Public School Samples for Civics and U.S. History Assessments at Grade 8

The initial student nonresponse cells for these samples were defined using the following nesting structure:

SD/EL by subject;

school nonresponse cell;

age (classified into "older" student and "modal age or younger" student);

gender; and

race/ethnicity.

National Public School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

The initial student nonresponse cells for these samples were defined using the following nesting structure:

SD/EL by subject;

school nonresponse cell;

categorized grade (classified into "lower" and "upper" grade);

gender; and

race/ethnicity.

National Private School Samples for Mathematics and Reading Assessments at Grades 4 and 8

The initial weighting cells for these private school samples were formed using the following nesting structure within grade:

SD/EL;

school nonresponse cell;

age (classified into "older" student and "modal age or younger" student);

gender; and

race/ethnicity.

Although exclusion rules differ by subject, there were not enough SD or EL private school students to break out by subject as was done for the public schools.

National Private School Samples for Civics and U.S. History Assessments at Grade 8

The initial weighting cells for these private school samples were formed using the following nesting structure:

SD/EL;

school nonresponse cell;

age (classified into "older" student and "modal age or younger" student);

gender; and

race/ethnicity.

National Private School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

The initial weighting cells for these private school samples were formed using the following nesting structure:

school nonresponse cell;

categorized grade (classified into "lower" and "upper" grade);

gender; and

race/ethnicity.



1 Older students are those born before October 1, 2011, for grade 4 and before October 1, 2007, for grade 8.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/development_of_initial_student_nonresponse_cells_for_the_2022_assessment.aspx


NAEP Technical Documentation Student Nonresponse Adjustment Factor Calculation

In each final student nonresponse adjustment cell \(c\) for a given sample, the student nonresponse adjustment factor \(STU\_NRAF_{c}\) was computed as

\begin{equation} STU\_NRAF_{c} = \dfrac { \sum_{ k \in S_{c}} \dfrac { STU\_BWT_{k} \times SCH\_TRIM_{k} \times SCH\_NRAF_{k} } {SUBJFAC_{k}} } { \sum_{ k \in R_{c}} \dfrac { STU\_BWT_{k} \times SCH\_TRIM_{k} \times SCH\_NRAF_{k} } {SUBJFAC_{k}} }, \end{equation}

where

\(S_{c}\) is the set of all eligible sampled students in cell \(c\) for a given sample;

\(R_{c}\) is the set of all assessed students within \(S_{c}\);

\(STU\_BWT_{k}\) is the student base weight for a given student \(k\);

\(SCH\_TRIM_{k}\) is the school-level weight trimming factor for the school associated with student \(k\);

\(SCH\_NRAF_{k}\) is the school-level nonresponse adjustment factor for the school associated with student \(k\); and

\(SUBJFAC_{k}\) is the subject factor for student \(k\).

The student weight used in the calculation above is the adjusted student base weight, without regard to subject, adjusted for school weight trimming and school nonresponse.
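The sketch below (illustrative record layout, not NAEP production code) evaluates this ratio for one student nonresponse cell, dividing out the subject factor as in the formula above.

```python
# Sketch of the student nonresponse adjustment factor for one cell: the ratio of
# summed adjusted base weights (with the subject factor divided out) over all
# eligible students (S_c) to the same sum over assessed students (R_c).
def student_nonresponse_factor(students: list[dict]) -> float:
    def term(k):
        return (k["STU_BWT"] * k["SCH_TRIM"] * k["SCH_NRAF"]) / k["SUBJFAC"]
    eligible = sum(term(k) for k in students)                   # S_c
    assessed = sum(term(k) for k in students if k["assessed"])  # R_c
    return eligible / assessed

cell = [
    {"STU_BWT": 300.0, "SCH_TRIM": 1.0, "SCH_NRAF": 1.1, "SUBJFAC": 2.0, "assessed": True},
    {"STU_BWT": 280.0, "SCH_TRIM": 1.0, "SCH_NRAF": 1.1, "SUBJFAC": 2.0, "assessed": True},
    {"STU_BWT": 310.0, "SCH_TRIM": 1.0, "SCH_NRAF": 1.1, "SUBJFAC": 2.0, "assessed": False},
]
print(round(student_nonresponse_factor(cell), 3))  # factor applied to assessed students in the cell
```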

Nonresponse adjustment procedures are not applied to excluded students or full-time remote students because these students are not required to complete an assessment. In effect, these students were placed in a separate nonresponse cell by themselves, and all received an adjustment factor of 1. While these students are not included in the analysis of the NAEP scores, weights are provided for them in order to estimate the sizes of these groups and their population characteristics.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/student_nonresponse_adjustment_factor_calculation_for_the_2022_assessment.aspx


NAEP Technical Documentation School and Student Weight Trimming Adjustments


Weight trimming is an adjustment procedure that involves detecting and reducing extremely large weights. "Extremely large weights" generally refer to large sampling weights that were not anticipated in the design of the sample. Unusually large weights are likely to produce large sampling variances for statistics of interest, especially when the large weights are associated with sample cases reflective of rare or atypical characteristics. To reduce the impact of these large weights on variances, weight reduction methods are typically employed. The goal of employing weight reduction methods is to reduce the mean square error of survey estimates. While the trimming of large weights reduces variances, it also introduces some bias. However, it is presumed that the reduction in the variances more than compensates for the increase in the bias, thereby reducing the mean square error and thus improving the accuracy of survey estimates (Potter, 1988). NAEP employs weight trimming at both the school and student levels.

Trimming of School Base Weights

Trimming of Student Weights




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/school_and_student_weight_trimming_adjustments_for_the_2022_assessment.aspx




NAEP Technical Documentation Trimming of School Base Weights

Unusually large school weights can occur under three circumstances:

  1. New Schools: When a school selected from the NAEP new-school sampling frame has an enrollment that is disproportionately large relative to the enrollment of its corresponding school district or Catholic diocese. In other words, when a large new school is selected from a small school district or Catholic diocese.

  2. Private Schools: When a school from the private school frame participates in NAEP but did not participate in the Private School Universe Survey (PSS), the source of the NAEP private school frame. Schools that fall into this category are referred to as PSS nonrespondents and have small probabilities of selection.

  3. Schools with Large Enrollment Increases: When the actual grade enrollment of a school, determined at the time of student sampling, is grossly larger than its enrollment used for school sampling.

If a school's base weight was determined to be too large, the school weight was trimmed. Recall that schools were sampled for NAEP with probability proportional to size, where size was based on student grade enrollment. If a sampled school had a small grade enrollment, its school base weight was large. To determine if a school's base weight was too large, a comparison was made between a school's base weight and its ideal weight (described below). If a school's base weight was more than three times its ideal weight, the school's base weight was scaled back or trimmed to three times the ideal weight. The trimming was accomplished by way of a trimming factor. The trimming factor for school \(s\) was calculated using the formula

\begin{equation} SCH\_TRIM_{s} = \left\{\begin{array}{llll} \dfrac{3 \times EXP\_WT_{s}} {SCH\_BWT_{s}} & \text{if } \dfrac{ SCH\_BWT_{s}} { EXP\_WT_{s}} >3 \\ 1 & \text{otherwise } \\ \end{array}\right. , \end{equation}

where

\(EXP\_WT_{s}\) is the ideal base weight for school \(s\); and

\(SCH\_BWT_{s}\) is the actual school base weight for school \(s\).

The ideal weight for a school depends on the type of circumstance: whether it was a new school, private school, or school with large grade enrollment increase. Details of the trimming procedure by type of circumstance are described below.
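The trimming rule can be illustrated with a short sketch (hypothetical function name, not NAEP production code) that returns the trimming factor for a school given its actual and ideal base weights.

```python
# Sketch of the school weight trimming factor: if the base weight exceeds three
# times the ideal weight EXP_WT, scale it back to exactly three times EXP_WT.
def school_trim_factor(sch_bwt: float, exp_wt: float) -> float:
    if sch_bwt / exp_wt > 3.0:
        return (3.0 * exp_wt) / sch_bwt
    return 1.0

print(school_trim_factor(sch_bwt=120.0, exp_wt=30.0))  # 0.75: trimmed weight is 90 (= 3 x 30)
print(school_trim_factor(sch_bwt=80.0, exp_wt=30.0))   # 1.0: no trimming needed
```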

New Schools

New schools with a disproportionately large student enrollment in a particular grade from a school district (or Catholic diocese) that was selected with a small probability of selection were likely candidates to have their school weights trimmed. The school base weights for such schools may be large relative to what they would have been if they had been selected from the NAEP public or private school sampling frame. The ideal weight for a new school was as follows:

\(EXP\_WT_{s}\) is the ideal base weight the school would have received if it had been on the NAEP public or private school sampling frame.

For the 2022 NAEP assessment, two grade 8 schools out of 73 participating schools selected from the new-school sampling frame had their weights trimmed.

Private Schools

Private school PSS nonrespondents who participated in NAEP and were found subsequently to have either larger enrollments than assumed at the time of school sampling or an atypical probability of selection given their affiliation, the latter being unknown at the time of sampling, were also likely candidates to have their school weights trimmed. The ideal weight for a PSS nonresponding private school was as follows:

\(EXP\_WT_{s}\) is the ideal base weight the school would have received if it had been on the NAEP private school sampling frame with accurate enrollment and known affiliation.

For the 2022 NAEP assessment, there were three private school PSS nonrespondents that participated in NAEP, and none had their weights trimmed.

Schools with Large Enrollment Increases

Schools, other than the PSS nonrespondents described above, whose enrollments determined at the time of student sampling were much larger than those assumed at the time of school sampling were also candidates to have their school weights trimmed. These schools have large relative school weights because their school probabilities of selection were artificially low. The ideal weight for a school with a large grade enrollment increase was as follows:

\(EXP\_WT_{s}\) is the ideal base weight the school would have received if it had been on the relevant NAEP public or private school sampling frame with the updated enrollment figure from student sampling.

For the 2022 NAEP assessment, one school at grade 8 with a large grade enrollment increase had its weight trimmed.

Note that for the long-term trend (LTT) assessments, age-specific enrollment was used in the trimming procedure instead of grade enrollment. No LTT schools had their weights trimmed.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/trimming_of_school_base_weights_for_the_2022_assessment.aspx




NAEP Technical Documentation Trimming of Student Weights

Large student weights generally come from compounding nonresponse adjustments at the school and student levels with artificially low school selection probabilities, which can result from inaccurate enrollment data on the school frame used to define the school size measure. Even though measures are in place to limit the number and size of excessively large weights—such as the implementation of adjustment factor size constraints in both the school and student nonresponse procedures and the use of the school trimming procedure—large student weights can occur due to compounding effects of the various weighting components.

The student weight trimming procedure uses a multiple median rule to detect excessively large student weights. Any student weight within a given trimming group greater than a specified multiple of the median weight value of the given trimming group has its weight scaled back to that threshold. Student weight trimming was implemented separately by grade (or age, in the case of long-term trend [LTT]), school type (public or private), and subject. Initially, the multiple was set to 3.5. If too many student weights were being trimmed for a particular sample, the multiple was increased to reduce the number of records trimmed. The multiples and the trimming groups are defined for each sample below. Because too many records had their weights trimmed in the initial runs of the national private school samples for mathematics and reading at grades 4 and 8, the multiple for those samples was increased to 4.5.

State Public School Samples for Mathematics and Reading at Grades 4 and 8

For these samples, the initial multiple used was 3.5, and the trimming groups were formed within each jurisdiction by Trial Urban District Assessment (TUDA) district vs. the balance of the state for states with TUDA districts.

National Private School Samples for Mathematics and Reading at Grades 4 and 8

For these samples, the initial multiple used was 4.5, and the trimming groups were formed by affiliation (Catholic, Non-Catholic).

National Public School Samples for Civics and U.S. History at Grade 8

For these samples, the initial multiple used was 3.5, and the trimming groups were formed by dichotomies of low/high percentage of American Indian/Alaska Native students (5 percent and below, above 5 percent) and Black and Hispanic students (15 percent and below, above 15 percent).

National Private School Samples for Civics and U.S. History at Grade 8

For these samples, the initial multiple used was 3.5, and the trimming groups were formed by affiliation (Catholic, Non-Catholic).

National Public School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

For these samples, the initial multiple used was 3.5, and the trimming groups were defined by region and school oversampling factor for public schools. The school oversampling factor separated, into different trimming groups, schools that had different probabilities of selection by design due to the desire to increase the numbers of Black, Hispanic, and American Indian/Alaska Native students in the sample.

National Private School Samples for Mathematics and Reading LTT Assessments at Ages 9 and 13

For these samples, the initial multiple used was 3.5, and the trimming groups were formed by affiliation (Catholic, Non-Catholic).

The procedure computes the median of the nonresponse-adjusted student weights in the trimming group \(g\) for a given grade (or age) and subject sample. Any student \(k\) with a weight more than \(M\) times the median received a trimming factor calculated as

\begin{equation} STU\_TRIM_{gk} = \left\{\begin{array}{llll} \dfrac{M \times MEDIAN_{g}} {STUWGT_{gk}} & \text{if } STUWGT_{gk} > M \times MEDIAN_{g} \\ 1 & \text{otherwise } \\ \end{array}\right. , \end{equation}

where

\(M\) is the trimming multiple,

\(MEDIAN_{g}\) is the median of nonresponse-adjusted student weights in trimming group \(g\), and

\(STUWGT_{gk}\) is the weight after student nonresponse adjustment for student \(k\) in trimming group \(g\).

In the 2022 assessment, very few students had weights considered excessively large. Out of the approximately 483,700 students included in the combined grade-based 2022 assessment samples, 35 students had their weights trimmed. None of the approximately 33,500 LTT students had their weights trimmed.
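The multiple-median rule can be illustrated with a short sketch (illustrative weights, not NAEP data) that computes trimming factors for one trimming group.

```python
# Sketch of the multiple-median trimming rule within one trimming group: any
# nonresponse-adjusted student weight above M times the group median is scaled
# back to that threshold; all other weights keep a factor of 1.
from statistics import median

def student_trim_factors(weights: list[float], m: float = 3.5) -> list[float]:
    threshold = m * median(weights)
    return [threshold / w if w > threshold else 1.0 for w in weights]

group = [210.0, 230.0, 250.0, 260.0, 1200.0]    # one outlying weight
factors = student_trim_factors(group, m=3.5)
print(factors)                                   # last factor < 1, all others 1.0
print([w * f for w, f in zip(group, factors)])   # trimmed weights; outlier pulled to 3.5 x median
```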



http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/trimming_of_student_weights_for_the_2022_assessment.aspx



NAEP Technical Documentation Student Weight Raking Adjustment


Weighted estimates of population totals for student-level subgroups for a given grade will vary across subjects even though the student samples for each subject generally come from the same schools. These differences are the result of sampling error associated with the random assignment of subjects to students through a process known as spiraling. For state assessments in particular, any difference in demographic estimates between subjects, no matter how small, may raise concerns about data quality. To remove these random differences and potential data quality concerns, a step was added to the NAEP weighting procedure in 2009. This step adjusts the student weights in such a way that the weighted sums of population totals for specific student groups are the same across all subjects. It was implemented using a raking procedure and applied only to public school assessments.

Development of Final Raking Dimensions

Raking Adjustment Control Totals

Raking Adjustment Factor Calculation

Raking is a weighting procedure based on the iterative proportional fitting process developed by Deming and Stephan (1940) and involves simultaneous ratio adjustments to two or more marginal distributions of population totals. Each set of marginal population totals is known as a dimension, and each population total in a dimension is referred to as a control total. Raking is carried out in a sequence of adjustments. Sampling weights are adjusted to one marginal distribution and then to the second marginal distribution, and so on. One cycle of sequential adjustments to the marginal distributions is called an iteration. The procedure is repeated until convergence is achieved. The criterion for convergence can be specified either as the maximum number of iterations or an absolute difference (or relative absolute difference) from the marginal population totals. More discussion on raking can be found in Oh and Scheuren (1987).
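The mechanics of raking can be illustrated with a small two-dimension example. The sketch below is illustrative only (simplified record and control-total structures; it is not the NAEP production raking program): it cycles proportional adjustments over the dimensions until the weighted margins match the control totals or a maximum number of iterations is reached.

```python
# Minimal two-dimension raking (iterative proportional fitting) sketch.
# Each record carries a weight and a category on each dimension; the controls
# give the target weighted count for every category of every dimension.
def rake(records, controls, dims, max_iter=50, tol=1e-6):
    for _ in range(max_iter):
        max_diff = 0.0
        for dim in dims:                           # one cycle over all dimensions = one iteration
            for cat, target in controls[dim].items():
                current = sum(r["w"] for r in records if r[dim] == cat)
                factor = target / current
                for r in records:
                    if r[dim] == cat:
                        r["w"] *= factor           # ratio-adjust this margin to its control total
                max_diff = max(max_diff, abs(target - current))
        if max_diff < tol:                         # convergence: margins match the control totals
            break
    return records

# Hypothetical data: 20 records, equal starting weights, two dimensions.
records = [{"w": 1.0, "gender": g, "sdel": s}
           for g in ("M", "F") for s in ("SD/EL", "neither") for _ in range(5)]
controls = {"gender": {"M": 11.0, "F": 9.0},
            "sdel": {"SD/EL": 4.0, "neither": 16.0}}
raked = rake(records, controls, dims=["gender", "sdel"])
print(round(sum(r["w"] for r in raked if r["gender"] == "M"), 3))    # matches 11.0
print(round(sum(r["w"] for r in raked if r["sdel"] == "SD/EL"), 3))  # matches 4.0
```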

For NAEP 2022, the student raking adjustment was carried out for each public student sample. Similar to previous years, raking was not performed for any of the private school student samples or for student samples in the long-term trend (LTT) assessments at age 9. The dimensions used in the raking process for each public school student sample were race/ethnicity, gender, and student disability (SD) and English learner (EL) status. (Since 2013, National School Lunch Program [NSLP] eligibility has not been used as a raking dimension because of the instability of these data in many states.)

For the public school student samples in mathematics and reading at grades 4 and 8, the student raking adjustment was carried out separately in each state and TUDA district. The control totals for the raking dimensions for these student samples were obtained from the NAEP student sample weights of the mathematics and reading public samples combined.

For the public school student samples in civics and U.S. history at grade 8, the student raking adjustment was carried out at the national level. The control totals for the raking dimensions for these samples were obtained by summing the NAEP grade 8 student sample weights of the mathematics and reading public samples combined.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/student_weight_raking_adjustment_for_the_2022_assessment.aspx



NAEP Technical Documentation Development of Final Raking Dimensions

The raking procedure involved three dimensions. The variables used to define the dimensions are listed below along with the categories making up the initial raking cells for each dimension.

Race/Ethnicity

  1. White, not Hispanic

  2. Black, not Hispanic

  3. Hispanic

  4. Asian

  5. American Indian/Alaska Native

  6. Native Hawaiian/Pacific Islander

  7. Two or More Races

Student disability (SD)/English learner (EL) status

  1. SD, but not EL

  2. EL, but not SD

  3. SD and EL

  4. Neither SD nor EL

Gender

  1. Male

  2. Female



For the reading and mathematics samples, in states containing districts that participated in Trial Urban District Assessments (TUDA) at grades 4 and 8, the initial cells were created separately for each TUDA district and the balance of the state. For the civics and U.S. history samples at grade 8, the initial cells were created at the national level. Similar to the procedure used for school and student nonresponse adjustments, limits were placed on the magnitude of the cell sizes and adjustment factors to prevent unstable raking adjustments that could have resulted in unacceptably large or small adjustment factors. Levels of a dimension were combined whenever

  1. there were fewer than 30 assessed, excluded, or full-time remote students in a category (20 for any of the replicates),

  2. the smallest adjustment was less than 0.5, or

  3. the largest adjustment was greater than 2 for the full sample or for any replicate.


If collapsing was necessary for the race/ethnicity dimension, individual groups with similar student achievement levels were combined first. If further collapsing was necessary, the next closest race/ethnicity group was combined as well, and so on until all collapsing rules were satisfied. In some instances, all seven categories had to be collapsed.

If collapsing was necessary for the SD/EL dimension, the SD/not EL and SD/EL categories were combined first, followed by EL/not SD if further collapsing was necessary. In some instances, all four categories had to be collapsed.

Collapsing gender is generally not expected. However, in the rare event that it is necessary, male and female categories would be collapsed.
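As an illustration of the collapsing rules above, the sketch below flags a single raking category for collapsing. The function name, cell counts, and adjustment factors are hypothetical; only the thresholds (30 students in the full sample, 20 in any replicate, and adjustment factors between 0.5 and 2) follow the text.

```python
def needs_collapsing(full_count, replicate_counts, adjustment_factors):
    """Flag a raking category for collapsing with a neighboring category, using
    the limits described above: fewer than 30 assessed, excluded, or full-time
    remote students in the full sample (20 in any replicate), or any adjustment
    factor below 0.5 or above 2."""
    if full_count < 30:
        return True
    if any(count < 20 for count in replicate_counts):
        return True
    if min(adjustment_factors) < 0.5 or max(adjustment_factors) > 2.0:
        return True
    return False

# Illustrative category: enough students, but one adjustment factor exceeds 2.
print(needs_collapsing(full_count=140,
                       replicate_counts=[45, 38, 52],
                       adjustment_factors=[0.9, 1.1, 2.3]))  # True
```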




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/development_of_final_raking_dimensions_for_the_2022_assessment.aspx




NAEP Technical Documentation Raking Adjustment Control Totals

The control totals used in the raking procedure for NAEP 2022 at grades 4 and 8 were estimates of the student population derived from the set of assessed, excluded, and full-time remote students pooled across subjects (mathematics and reading). The control totals for category \(c\) within dimension \(d\) were computed as

\begin{equation} TOTAL_{c(d)} = \sum_{ R_{c(d)} \cup E_{c(d)}} \dfrac { STU\_BWT_{k} \times SCH\_TRIM_{k} \times SCH\_NRAF_{k} \times STU\_NRAF_{k} } {SUBJFAC_{k}}, \end{equation} where

\(R_{c(d)}\) is the set of all assessed students in category \(c\) of dimension \(d\);

\(E_{c(d)}\) is the set of all excluded or full-time remote students in category \(c\) of dimension \(d\);

\(STU\_BWT_{k}\) is the student base weight for a given student \(k\);

\(SCH\_TRIM_{k}\) is the school-level weight trimming factor for the school associated with student \(k\);

\(SCH\_NRAF_{k}\) is the school-level nonresponse adjustment factor for the school associated with student \(k\);

\(STU\_NRAF_{k}\) is the student-level nonresponse adjustment factor for student \(k\); and

\(SUBJFAC_{k}\) is the subject factor for student \(k\).

The student weight used in the calculation of the control totals above is the student base weight, without regard to subject, adjusted for school weight trimming, school nonresponse, and student nonresponse. Control totals were computed for the full sample and for each replicate independently.
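The sketch below illustrates the control-total calculation for one raking cell, using a couple of invented student records whose field names mirror the symbols above (STU_BWT, SCH_TRIM, SCH_NRAF, STU_NRAF, SUBJFAC). It is a minimal illustration under those assumptions, not NAEP production code.

```python
def control_total(students):
    """Sum the subject-free weights for one raking cell (category c of dimension d),
    following the formula above: the base weight times the school trimming, school
    nonresponse, and student nonresponse factors, divided by the subject factor."""
    return sum(
        s["STU_BWT"] * s["SCH_TRIM"] * s["SCH_NRAF"] * s["STU_NRAF"] / s["SUBJFAC"]
        for s in students
    )

# Two hypothetical assessed/excluded students in the same cell (not NAEP data).
cell = [
    {"STU_BWT": 210.0, "SCH_TRIM": 1.0, "SCH_NRAF": 1.08, "STU_NRAF": 1.05, "SUBJFAC": 0.5},
    {"STU_BWT": 185.0, "SCH_TRIM": 0.9, "SCH_NRAF": 1.02, "STU_NRAF": 1.10, "SUBJFAC": 0.5},
]
print(round(control_total(cell), 1))  # pooled estimate of the cell's population total
```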




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/raking_adjustment_control_totals_for_the_2022_assessment.aspx




NAEP Technical Documentation Raking Adjustment Factor Calculation

For assessed, excluded, and full-time remote students in a given subject, the raking adjustment factor \(STU\_RAKE_{k}\) was computed as below. First, the weight for student \(k\) was initialized as

\begin{equation} STUSAWT_{k}^{adj(0)} = STU\_BWT_{k} \times SCH\_TRIM_{k} \times SCH\_NRAF_{k} \times STU\_NRAF_{k} \times SUBJFAC_{k} , \end{equation}

where

\(STU\_BWT_{k}\) is the student base weight for a given student \(k\);

\(SCH\_TRIM_{k}\) is the school-level weight trimming factor for the school associated with student \(k\);

\(SCH\_NRAF_{k}\) is the school-level nonresponse adjustment factor for the school associated with student \(k\);

\(STU\_NRAF_{k}\) is the student-level nonresponse adjustment factor for student \(k\); and

\(SUBJFAC_{k}\) is the subject factor for student \(k\).

Then, the sequence of weights for the first iteration was calculated as follows for student \(k\) in category \(c\) of dimension \(d\):

for dimension 1: \begin{equation} STUSAWT_{k}^{adj(1)} = \dfrac {TOTAL_{c(1)}} { \sum_{ R_{c(1)} \cup E_{c(1)}} {STUSAWT_{k}^{adj(0)} } } \times STUSAWT_{k}^{adj(0)} , \end{equation}

for dimension 2: \begin{equation} STUSAWT_{k}^{adj(2)} = \dfrac {TOTAL_{c(2)}} { \sum_{ R_{c(2)} \cup E_{c(2)}} {STUSAWT_{k}^{adj(1)} } } \times STUSAWT_{k}^{adj(1)} , \end{equation}

for dimension 3: \begin{equation} STUSAWT_{k}^{adj(3)} = \dfrac {TOTAL_{c(3)}} { \sum_{ R_{c(3)} \cup E_{c(3)}} {STUSAWT_{k}^{adj(2)} } } \times STUSAWT_{k}^{adj(2)} , \end{equation}

where






\(R_{c(d)}\) is the set of all assessed students in category \(c\) of dimension \(d\);

\(E_{c(d)}\) is the set of all excluded or full-time remote students in category \(c\) of dimension \(d\); and

\(TOTAL_{c(d)}\) is the control total for category \(c\) of dimension \(d\).

The process is said to converge if the maximum difference between the sum of adjusted weights and the control totals is no more than 1.0 for each category in each dimension. If, after the sequence of adjustments, the maximum difference was greater than 1.0, the process continued to the next iteration, cycling back to the first dimension with the initial weight for student \(k\) equal to \(STUSAWT_{k}^{adj(3)}\) from the previous iteration. The process continued until convergence was reached.

Once the process converged, the adjustment factor was computed as

\begin{equation} STU\_RAKE_{k} = \dfrac {STUSAWT_{k}} { STU\_BWT_{k} \times SCH\_TRIM_{k} \times SCH\_NRAF_{k} \times STU\_NRAF_{k} \times SUBJFAC_{k} } , \end{equation}

where

\(STUSAWT_{k}\) is the weight for student \(k\) after convergence.

The process was done independently for the full sample and for each replicate.
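The sketch below illustrates the iterative proportional fitting loop described above on a small invented example with two dimensions: the weights are scaled to each dimension's control totals in turn, the cycle repeats until every category total is within the 1.0 tolerance, and the raking factor for each student is the ratio of the converged weight to the initial weight (the analogue of \(STU\_RAKE_{k}\)). The data, category labels, and function names are hypothetical.

```python
def rake(weights, memberships, control_totals, tol=1.0, max_iter=100):
    """Iterative proportional fitting: scale the weights so that, within each
    category of each dimension, they sum to the control total.
    memberships[d][k] is student k's category in dimension d;
    control_totals[d] maps each category of dimension d to its control total."""
    w = list(weights)

    def category_sums(d):
        sums = {c: 0.0 for c in control_totals[d]}
        for k, wk in enumerate(w):
            sums[memberships[d][k]] += wk
        return sums

    for _ in range(max_iter):
        for d in range(len(control_totals)):   # one iteration = one pass over all dimensions
            sums = category_sums(d)
            for k in range(len(w)):
                c = memberships[d][k]
                w[k] *= control_totals[d][c] / sums[c]
        # converged when every category total is within tol of its control total
        worst = max(abs(category_sums(d)[c] - control_totals[d][c])
                    for d in range(len(control_totals)) for c in control_totals[d])
        if worst <= tol:
            break
    return w

# Hypothetical initial weights and two raking dimensions (not NAEP data).
init = [100.0, 120.0, 80.0, 150.0]
membership = [["A", "A", "B", "B"],   # dimension 1 categories
              ["M", "F", "M", "F"]]   # dimension 2 categories
totals = [{"A": 250.0, "B": 200.0}, {"M": 210.0, "F": 240.0}]
raked = rake(init, membership, totals)
rake_factors = [rw / w0 for rw, w0 in zip(raked, init)]  # analogue of STU_RAKE
print([round(x, 1) for x in raked])
```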






http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/raking_adjustment_factor_calculation_for_the_2022_assessment.aspx




NAEP Technical Documentation Computation of Replicate School Weights for Variance Estimation


In addition to the full-sample weight, a set of 62 replicate weights was provided for each school. These replicate weights are used in calculating the sampling variance of estimates obtained from the data, using the jackknife repeated replication method. The method of deriving these weights was aimed at reflecting the features of the sample design appropriately for each sample, so that when the jackknife variance estimation procedure is implemented, approximately unbiased estimates of sampling variance are obtained. This section gives the specifics for generating the replicate weights for the 2022 assessment samples. The theory that underlies the jackknife variance estimators used in NAEP studies is discussed in the section Replicate Variance Estimation.

Defining Variance Strata and Forming Replicates (age-based samples)

Defining Variance Strata and Forming Replicates (grade-based samples)

Replicate Variance Estimation

For each sample, replicates were formed in two steps. First, each school was assigned to one or more of 62 replicate strata. This step differed for the age-based long-term trend (LTT) samples and the grade-based samples, as described in the separate "Defining Variance Strata and Forming Replicates" links above. In the next step, a random subset of schools in each replicate stratum was excluded. The remaining subset and all schools in the other replicate strata then constituted one of the 62 replicates.

For the 2022 LTT assessments, the same PSUs were sampled in 2022 and 2020. In fact, any comparison of the 2022 and 2020 estimates is a comparison of the same schools, so each school must be in the same variance stratum and variance unit in the two years so that the jackknife variance estimation will correctly reflect this dependence. To ensure that standard errors for trend would be calculated appropriately, each noncertainty PSU was assigned the same variance stratum and variance unit as in 2020. Likewise, in certainty PSUs, schools that were retained in 2022 from the 2020 sample were assigned the same variance stratum and variance unit as in 2020.

A replicate weight was calculated for each of the 62 replicates using weighting procedures similar to those used for the full-sample weight. Each replicate base weight contains an additional component, known as a replicate factor, to account for the subsetting of the sample to form the replicate. By repeating the various weighting procedures on each set of replicate base weights, the impact of these procedures on the sampling variance of an estimate is appropriately reflected in the variance estimate.

Each of the 62 replicate weights for school \(s\) in stratum \(j\) can be expressed as follows:


\begin{equation} \begin{aligned} SCH\_WGT_{js}(r)= {} & SCH\_BWT_{js}(r) \times SCH\_NRAF_{js}(r) \times\\ &SCH\_TRIM_{js} \times SCHSESWT_{js} \times SCH\_SUBJ\_AF_{js} \end{aligned} \end{equation}

where

\(SCH\_BWT_{js}(r)\) is the replicate school base weight for replicate \(r\);

\(SCH\_NRAF_{js}(r)\) is the school-level nonresponse adjustment factor for replicate \(r\);

\(SCH\_TRIM_{js}\) is the school-level weight trimming adjustment factor;

\(SCHSESWT_{js}\) is the school-level session assignment weight; and

\(SCH\_SUBJ\_AF_{js}\) is the small-school subject adjustment factor.

Specific school nonresponse adjustment factors were calculated separately for each replicate, as indicated by the index (r) in the formula, and applied to the replicate school base weights. Computing separate nonresponse adjustment factors for each replicate allows resulting variances from the use of the final school replicate weights to reflect components of variance due to this weight adjustment.

School weight trimming adjustments were not replicated, that is, not calculated separately for each replicate. Instead, each replicate used the school trimming adjustment factors derived for the full sample. Statistical theory for replicating trimming adjustments under the jackknife approach has not been developed in the literature. Due to the absence of a statistical framework, and since relatively few school weights in NAEP require trimming, the weight trimming adjustments were not replicated.

In addition, the school-level session assignment weight and the small-school subject adjustment factor also used the same factors derived for the full sample.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computation_of_replicate_school_weights_for_variance_estimation_for_the_2022_assessment.aspx




NAEP Technical Documentation Computation of Replicate Student Weights for Variance Estimation


In addition to the full-sample weight, a set of 62 replicate weights was provided for each student. These replicate weights are used in calculating the sampling variance of estimates obtained from the data, using the jackknife repeated replication method. The method of deriving these weights was aimed at reflecting the features of the sample design appropriately for each sample, so that when the jackknife variance estimation procedure is implemented, approximately unbiased estimates of sampling variance are obtained. This section gives the specifics for generating the replicate weights for the 2022 assessment samples. The theory that underlies the jackknife variance estimators used in NAEP studies is discussed in the section Replicate Variance Estimation.

In general, the process of creating jackknife replicate weights takes place at both the school and student level. The precise implementation differs between those samples that involve the selection of Primary Sampling Units (PSUs) and those where the school is the first stage of sampling. The procedure for this second kind of sample also differed starting in 2011 from all previous NAEP assessments. The change that was implemented permitted the introduction of a finite population correction factor at the school sampling stage, developed by Rizzo and Rust (2011). In assessments prior to 2011, this adjustment factor had always been implicitly assumed equal to 1.0, resulting in some overestimation of the sampling variance.

Defining Variance Strata and Forming Replicates

Computing School-Level Replicate Factors

Computing Student-Level Replicate Factors

Replicate Variance Estimation

PSU-Based (i.e., Age-Based) Samples

For the 2022 long-term trend (LTT) samples, which involve the selection of PSUs, the process for computing replicate student weights for variance estimation is very similar to the one that was used in 2020. The same PSUs were sampled in 2022 and 2020. In fact, any comparison of the 2022 and 2020 estimates is a comparison of the same schools, so each school must be in the same variance stratum and variance unit in the two years so that the jackknife variance estimation will correctly reflect this dependence. To ensure that standard errors for trend would be calculated appropriately, each noncertainty PSU was assigned the same variance stratum and variance unit as in 2020. Likewise, in certainty PSUs, schools that were retained in 2022 from the 2020 sample were assigned the same variance stratum and variance unit as in 2020. For more information about computing replicate student weights for the LTT samples, see here.

Grade-Based Samples


The process for computing replicate student weights for variance estimation for the 2022 grade-based samples is as follows:

For each sample, the calculation of replicate weighting factors at the school level was conducted in a series of steps. First, each school was assigned to one of 62 variance estimation strata. Then, a random subset of schools in each variance estimation stratum was assigned a replicate factor of between 0 and 1. Next, the remaining subset of schools in the same variance stratum was assigned a complementary replicate factor greater than 1. All schools in the other variance estimation strata were assigned a replicate factor of exactly 1. This process was repeated for each of the 62 variance estimation strata so that 62 distinct replicate factors were assigned to each school in the sample.

This process was then repeated at the student level. Here, each individual sampled student was assigned to one of 62 variance estimation strata, and 62 replicate factors with values either between 0 and 1, greater than 1, or exactly equal to 1 were assigned to each student.

For example, consider a single hypothetical student. For replicate 37, that student’s student replicate factor might be 0.8, while for the school to which the student belongs, for replicate 37, the school replicate factor might be 1.6. Of course, for a given student, for most replicates, either the student replicate factor, the school replicate factor, or (usually) both, is equal to 1.0.

A replicate weight was calculated for each student, for each of the 62 replicates, using weighting procedures similar to those used for the full-sample weight. Each replicate weight contains the school and student replicate factors described above. By repeating the various weighting procedures on each set of replicates, the impact of these procedures on the sampling variance of an estimate is appropriately reflected in the variance estimate.

Each of the 62 replicate weights for student \(k\) in school \(s\) in stratum \(j\) can be expressed as

\begin{equation} \begin{aligned} FSTUWGT_{jsk}(r) = {} & STU\_BWT_{jsk} \times SCH\_REPFAC_{js}(r) \times SCH\_NRAF_{js}(r) \times \\ & STU\_REPFAC_{jsk}(r) \times STU\_NRAF_{jsk}(r) \times \\ & SCH\_TRIM_{js} \times STU\_TRIM_{jsk} \times STU\_RAKE_{jsk}(r) \end{aligned}, \end{equation} where

\(STU\_BWT_{jsk}\) is the student base weight;

\(SCH\_REPFAC_{js}(r)\) is the school-level replicate factor for replicate \(r\);

\(SCH\_NRAF_{js}(r)\) is the school-level nonresponse adjustment factor for replicate \(r\);

\(STU\_REPFAC_{jsk}(r)\) is the student-level replicate factor for replicate \(r\);

\(STU\_NRAF_{jsk}(r)\) is the student-level nonresponse adjustment factor for replicate \(r\);

\(SCH\_TRIM_{js}\) is the school-level weight trimming adjustment factor;

\(STU\_TRIM_{jsk}\) is the student-level weight trimming adjustment factor; and

\(STU\_RAKE_{jsk}(r)\) is the student-level raking adjustment factor for replicate \(r\).

Specific school and student nonresponse and student-level raking adjustment factors were calculated separately for each replicate, as indicated by the index \(r\) in the formula, and applied to the replicate student base weights. Computing separate nonresponse and raking adjustment factors for each replicate allows resulting variances from the use of the final student replicate weights to reflect components of variance due to these various weight adjustments.

School and student weight trimming adjustments were not replicated, that is, not calculated separately for each replicate. Instead, each replicate used the school and student trimming adjustment factors derived for the full sample. Statistical theory for replicating trimming adjustments under the jackknife approach has not been developed in the literature. Due to the absence of a statistical framework, and since relatively few school and student weights in NAEP require trimming, the weight trimming adjustments were not replicated.
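As a minimal illustration of how the components above combine, the sketch below multiplies a set of invented factor values for one student and one replicate; it simply mirrors the formula for \(FSTUWGT_{jsk}(r)\) and uses no NAEP data.

```python
def replicate_student_weight(stu_bwt, sch_repfac, sch_nraf, stu_repfac,
                             stu_nraf, sch_trim, stu_trim, stu_rake):
    """Combine the components of the replicate student weight for one student and
    one replicate r. The replicate-specific factors (the *_REPFAC, *_NRAF, and
    STU_RAKE arguments) change from replicate to replicate, while the trimming
    factors come from the full sample."""
    return (stu_bwt * sch_repfac * sch_nraf * stu_repfac *
            stu_nraf * sch_trim * stu_trim * stu_rake)

# Illustrative factor values for one student and one replicate (not NAEP values).
print(round(replicate_student_weight(
    stu_bwt=240.0, sch_repfac=1.6, sch_nraf=1.05, stu_repfac=1.0,
    stu_nraf=1.08, sch_trim=1.0, stu_trim=1.0, stu_rake=0.97), 2))
```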




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computation_of_replicate_student_weights_for_variance_estimation_for_the_2022_assessment.aspx




NAEP Technical Documentation Computing School-Level Replicate Factors

The school-level replication procedures differed for the age-based samples and the grade-based samples because the latter incorporate finite population corrections.

Age-Based Samples

For the NAEP 2022 age-based long-term trend (LTT) assessments, the school-level replication was carried out using the same procedures used for 2020 LTT. Those procedures are described here.



Grade-Based Samples

The replicate variance estimation approach for the grade-based civics, mathematics, reading, and U.S. history assessments involved finite population corrections at the school level. The calculation of school-level replicate factors for these assessments depended upon whether or not a school was selected with certainty. For certainty schools, the school-level replicate factors for all replicates are set to unity (this is true regardless of whether or not the variance replication method uses finite population corrections), since certainty schools are not subject to sampling variability. Alternatively, one can view the finite population correction factor for such schools as being equal to zero. Thus, for each certainty school in a given assessment, the school-level replicate factor for each of the 62 replicates (\(r=1, ..., 62\)) was assigned as

\begin{equation} SCH\_REPFAC_{js}(r)=1 , \displaystyle \end{equation}

where \(SCH\_REPFAC_{js}(r)\) is the school-level replicate factor for school \(s\) in primary stratum \(j\) for the \(r\)-th replicate.

For noncertainty schools, where preliminary variance strata were formed by grouping schools into pairs or triplets, school-level replicate factors were calculated for each of the 62 replicates based on this grouping. For schools in variance strata comprising pairs of schools, the school-level replicate factors, \(SCH\_REPFAC_{js}(r)\), \(r = 1,..., 62\), were calculated as

\begin{equation} SCH\_REPFAC_{js}(r) = \left\{\begin{array}{lll} 1 + \sqrt{(1-min(\pi_{j1}, \pi_{j2}))}, & \text{for } js \in R_{jr}, U_{js} = 1 \\ 1 - \sqrt{(1-min(\pi_{j1}, \pi_{j2}))}, & \text{for } js \in R_{jr}, U_{js} = 2 \\ 1, & \text{for } js \notin R_{jr} \end{array}\right. , \end{equation} where

\(min(\pi_{j1}, \pi_{j2})\) is the smallest school probability between the two schools comprising \(R_{jr}\);

\(R_{jr}\) is the set of schools within the \(r\)-th variance stratum for primary stratum \(j\); and

\(U_{js}\) is the variance unit (1 or 2) for school \(s\) in primary stratum \(j\).



For triples (i.e., variance strata comprising 3 schools), the replicate factors are perturbed to something other than 1.0 for two different variance strata, rather than just for one stratum as in the case of pairs (i.e., variance strata comprising 2 schools). The replicate factors are perturbed in variance stratum \(r\) and variance stratum \(r'\), where \(r'\) is furthest away from variance stratum \(r\) in either direction (i.e., before or after stratum \(r\)). Because there are 62 replicates, the stratum furthest away from stratum \(r\) is the stratum whose number is the number of stratum \(r\) plus or minus half of 62: \(r' = r + 31\) if \(r \leq 31\), and \(r' = r - 31\) if \(r > 31\). For example, if variance stratum 40 has three schools, replicate factors are perturbed in variance stratum 40 (\(r\)) and variance stratum 9 (\(r'\)). The school-level replicate factors \(SCH\_REPFAC_{js}(r)\), \(r = 1,..., 62\), were calculated as follows:

For school \(s\) from primary stratum \(j\) in variance stratum \(r\),

\begin{equation} SCH\_REPFAC_{js}(r) = \left\{\begin{array}{lll} 1 + \dfrac {\sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}} {2}, & \text{for } js \in R_{jr}, U_{js} = 1 \\ 1 + \dfrac {\sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}} {2}, & \text{for } js \in R_{jr}, U_{js} = 2 \\ 1 - \sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}, & \text{for } js \in R_{jr}, U_{js} = 3 \end{array}\right. , \end{equation}

while for variance stratum \(r'\),

\begin{equation} SCH\_REPFAC_{js}(r') = \left\{\begin{array}{lll} 1 + \dfrac {\sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}} {2}, & \text{for } js \in R_{jr}, U_{js} = 1 \\ 1 - \sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}, & \text{for } js \in R_{jr}, U_{js} = 2 \\ 1 + \dfrac {\sqrt{(1-min(\pi_{j1}, \pi_{j2}, \pi_{j3}))}} {2}, & \text{for } js \in R_{jr}, U_{js} = 3 \end{array}\right. , \end{equation}

and for all other variance strata \(r^*\) (that is, strata other than variance strata \(r\) and \(r'\)),

\begin{equation} SCH\_REPFAC_{js}(r^*) = 1 , \end{equation} where

\(min(\pi_{j1}, \pi_{j2}, \pi_{j3})\) is the smallest school probability among the three schools comprising \(R_{jr}\);

\(R_{jr}\) is the set of schools within the \(r\)-th variance stratum for primary stratum \(j\); and

\(U_{js}\) is the variance unit (1, 2, or 3) for school \(s\) in primary stratum \(j\).

In primary strata with fewer than 62 variance strata, the replicate weights for the “unused” variance strata (the remaining ones up to 62) for these schools were set equal to the school base weight (so that those replicates contribute nothing to the variance estimate).
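The sketch below computes the pair and triple school-level replicate factors defined above from assumed school selection probabilities. The probabilities are invented and the helper functions are illustrative only.

```python
import math

def pair_replicate_factors(pi_1, pi_2):
    """Pair stratum: one school gets 1 + delta and the other 1 - delta, where
    delta is the square root of (1 minus the smaller selection probability)."""
    delta = math.sqrt(1.0 - min(pi_1, pi_2))
    return 1.0 + delta, 1.0 - delta            # variance units 1 and 2

def triple_replicate_factors(pi_1, pi_2, pi_3):
    """Triple stratum: factors for the two perturbed replicates r and r'.
    In replicate r, units 1 and 2 get 1 + delta/2 and unit 3 gets 1 - delta;
    in replicate r', the roles of units 2 and 3 are swapped."""
    delta = math.sqrt(1.0 - min(pi_1, pi_2, pi_3))
    factors_r = (1.0 + delta / 2, 1.0 + delta / 2, 1.0 - delta)
    factors_r_prime = (1.0 + delta / 2, 1.0 - delta, 1.0 + delta / 2)
    return factors_r, factors_r_prime

# Illustrative school selection probabilities (not NAEP values).
print(pair_replicate_factors(0.40, 0.25))
print(triple_replicate_factors(0.40, 0.25, 0.10))
```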




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computing_school_level_replicate_factors_for_the_2022_assessment.aspx




NAEP Technical Documentation Computing Student-Level Replicate Factors

The student-level replication procedures differed for the age-based samples and the grade-based samples because the latter incorporate finite population corrections.

Age-Based Samples

For the NAEP 2022 age-based long-term trend (LTT) assessments, the student-level replication was carried out using the same procedures used for 2020 LTT. Those procedures are described here.

Grade-Based Samples

For the grade-based civics, mathematics, reading, and U.S. history assessment samples, which involved school-level finite population corrections, the student-level replication factors were calculated the same way regardless of whether or not the student was in a certainty school.

For students in student-level variance strata comprising pairs of students, the student-level replicate factors, \(STU\_REPFAC_{jsk}(r)\), \(r = 1,..., 62\), were calculated as

\begin{equation} STU\_REPFAC_{jsk}(r) = \left\{\begin{array}{lll} 1 + \sqrt {\pi_{s}}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 1 \\ 1 - \sqrt {\pi_{s}}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 2 \\ 1, & \text{for } jsk \notin R_{jsr} \end{array}\right. , \end{equation} where

\(\pi_{s}\) is the probability of selection for school \(s\);

\(R_{jsr}\) is the set of students within the \(r\)-th variance stratum for school \(s\) in primary stratum \(j\); and

\(U_{jsk}\) is the variance unit (1 or 2) for student \(k\) in school \(s\) in stratum \(j\).

For triples (i.e., variance strata comprising three students), the replicate factors are perturbed to something other than 1.0 for two different variance strata, rather than just for one stratum as in the case of pairs (i.e., variance strata comprising 2 students). The replicate factors are perturbed in variance stratum \(r\) and variance stratum \(r'\), where \(r'\) is furthest away from variance stratum \(r\) in either direction (i.e., before or after stratum \(r\)).

Because there are 62 replicates, the stratum furthest away from stratum \(r\) is the stratum whose number is the number of stratum \(r\) plus or minus half of 62: \(r' = r + 31\) if \(r \leq 31\), and \(r' = r - 31\) if \(r > 31\). For example, if variance stratum 1 has three students, replicate factors are perturbed in variance stratum 1 (\(r\)) and variance stratum 32 (\(r'\)). The student-level replicate factors \(STU\_REPFAC_{jsk}(r)\), \(r = 1,..., 62\), were calculated as follows:

\begin{equation} STU\_REPFAC_{jsk}(r) = \left\{\begin{array}{lll} 1 + \dfrac {\sqrt {\pi_{s}}} {2}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 1 \\ 1 + \dfrac {\sqrt {\pi_{s}}} {2}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 2 \\ 1 - \sqrt {\pi_{s}}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 3 \end{array}\right. , \end{equation}

while for variance stratum \(r'\),

\begin{equation} STU\_REPFAC_{jsk}(r') = \left\{\begin{array}{lll} 1 + \dfrac {\sqrt {\pi_{s}}} {2}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 1 \\ 1 - \sqrt {\pi_{s}}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 2 \\ 1 + \dfrac {\sqrt {\pi_{s}}} {2}, & \text{for } jsk \in R_{jsr}, U_{jsk} = 3 \end{array}\right. , \end{equation}

and for all other variance strata \(r^*\) (that is, variance strata other than strata \(r\) and \(r'\)),

\begin{equation} STU\_REPFAC_{jsk}(r^*) = 1 , \end{equation} where

\(\pi_{s}\) is the probability of selection for school \(s\);

\(R_{jsr}\) is the set of students within the \(r\)-th variance stratum for school \(s\) in stratum \(j\); and

\(U_{jsk}\) is the variance unit (1, 2, or 3) for student \(k\) in school \(s\) in stratum \(j\).

Note, for students in certainty schools, where \(\pi_{s}=1\), the student replicate factors are 2 and 0 in the case of pairs, and 1.5, 1.5, and 0 in the case of triples.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/computing_student_level_replicate_factors_for_the_2022_assessment.aspx




NAEP Technical Documentation Defining Variance Strata and Forming Replicates

For NAEP 2022, the procedure used to define variance strata and form replicates differed for the age-based samples and the grade-based samples.

Age-Based Samples

In the NAEP 2022 age-based assessments for long-term trend (LTT), the procedure used to define variance strata and form replicates was the same one used for the 2020 LTT assessments. That procedure is described here.

Grade-Based Samples

In the NAEP 2022 grade-based assessments, replicates were formed separately for each sample indicated by grade (4 or 8), school type (public or private), and assessment subject (civics, mathematics, reading, and U.S. history). To reflect the school-level finite population corrections in the variance estimators for these two-stage samples, replication was carried out at both the school and student levels.

The first step in forming replicates was to create preliminary variance strata in each primary stratum. This was done by sorting the appropriate sampling units (school or student) in their order of selection within the primary stratum and then pairing off adjacent sampling units into preliminary variance strata. Sorting sample units by their order of sample selection reflects the implicit stratification and systematic sampling features of the sample design. Within each primary stratum with an even number of sampling units, all of the preliminary variance strata consisted of pairs of sampling units. However, within primary strata with an odd number of sampling units, all but one of the preliminary variance strata consisted of pairs of sampling units, while the last one consisted of three sampling units.

The next step is to form the final variance strata by combining preliminary strata if appropriate. If there were more than 62 preliminary variance strata within a primary stratum, the preliminary variance strata were grouped to form 62 final variance strata. This grouping effectively maximized the distance in the sort order between grouped preliminary variance strata. The first 62 preliminary variance strata, for example, were assigned to 62 different final variance strata in order (1 through 62), with the next 62 preliminary variance strata assigned to final variance strata 1 through 62, so that, for example, preliminary variance stratum 1, preliminary variance stratum 63, preliminary variance stratum 125 (if in fact there were that many), etc., were all assigned to the first final variance stratum.

If, on the other hand, there were fewer than 62 preliminary variance strata within a primary stratum, then the number of final variance strata was set equal to the number of preliminary variance strata. For example, consider a primary stratum with 111 sampled units sorted in their order of selection.

The first two units were in the first preliminary variance stratum; the next two units were in the second preliminary variance stratum, and so on, resulting in 54 preliminary variance strata with two sample units each (doublets). The last three sample units were in the 55th preliminary variance stratum (triplet). Since there are no more than 62 preliminary variance strata, these were also the final variance strata.

Within each preliminary variance stratum containing a pair of sampling units, one sampling unit was randomly assigned as the first variance unit and the other as the second variance unit. Within each preliminary variance stratum containing three sampling units, the three first-stage units were randomly assigned variance units 1 through 3.
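The sketch below illustrates the formation of variance strata described above for a single primary stratum: sampling units are paired in their order of selection (with a final triplet when the count is odd), and preliminary strata are then folded into at most 62 final strata. It is a simplified illustration with invented inputs, not the NAEP production algorithm.

```python
def assign_variance_strata(n_units, max_strata=62):
    """For one primary stratum, assign units (indexed in selection order) to
    preliminary variance strata of size 2, letting a final odd unit join the last
    pair to form a triplet, then fold preliminary strata into at most max_strata
    final strata so that preliminary strata 1, 63, 125, ... share final stratum 1."""
    n_prelim = max(n_units // 2, 1)
    prelim = [min(i // 2, n_prelim - 1) for i in range(n_units)]
    final = [p % max_strata for p in prelim]
    return prelim, final

# 111 units in selection order -> 54 pairs plus one triplet = 55 strata (as in the text).
prelim, final = assign_variance_strata(111)
print(max(prelim) + 1, max(final) + 1)   # 55 preliminary strata, 55 final strata

# 150 units -> 75 preliminary strata folded into 62 final strata.
prelim, final = assign_variance_strata(150)
print(max(prelim) + 1, len(set(final)))  # 75 preliminary strata, 62 final strata
```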

Mathematics and Reading Assessments (Grades 4 and 8)

At the school level for these samples, formation of preliminary variance strata did not pertain to certainty schools, since they are not subject to sampling variability, but only to noncertainty schools. The primary stratum for noncertainty schools was the highest school-level sampling stratum variable listed below, and the order of selection was defined by sort order on the school sampling frame.

Trial Urban District Assessment (TUDA) districts, remainder of states (for states with TUDAs), or entire states for the public school samples at grades 4 and 8; and

Private school affiliation (Catholic, non-Catholic) for the private school samples at grades 4 and 8.

At the student level, all students were assigned to variance strata. The primary stratum was school, and the order of selection was defined by session number and position on the administration schedule.

Within each pair of preliminary variance strata, one first-stage unit, designated at random, was assigned as the first variance unit and the other first-stage unit as the second variance unit. Within each triplet preliminary variance stratum, the three schools were randomly assigned variance units 1 through 3.

Civics and U.S. History Assessments (Grade 8)

At the school level for these samples, formation of preliminary variance strata did not pertain to certainty schools, since they are not subject to sampling variability, but only to noncertainty schools. The primary stratum for noncertainty schools was the highest school-level sampling stratum variable listed below, and the order of selection was defined by sort order on the school sampling frame.

The nation (50 states and the District of Columbia) for the public school samples at grade 8; and

Private school affiliation (Catholic, non-Catholic) for the private school samples at grade 8.

At the student level, all students were assigned to variance strata. The primary stratum was school, and the order of selection was defined by session number and position on the administration schedule.

Within each pair of preliminary variance strata, one first-stage unit, designated at random, was assigned as the first variance unit and the other first-stage unit as the second variance unit. Within each triplet preliminary variance stratum, the three schools were randomly assigned variance units 1 through 3.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/defining_variance_strata_and_forming_replicates_for_the_2022_assessment.aspx




NAEP Technical Documentation Replicate Variance Estimation

Variances for NAEP assessment estimates are computed using the paired jackknife replicate variance procedure. This technique is applicable for common statistics, such as means and ratios, and differences between these for different subgroups, as well as for more complex statistics such as linear or logistic regression coefficients.

In general, the paired jackknife replicate variance procedure involves initially pairing clusters of first-stage sampling units to form \(H\) variance strata \(h = 1, 2, 3, ..., H\) with two units per stratum. The first replicate is formed by assigning, to one unit at random from the first variance stratum, a replicate weighting factor of less than 1.0, while assigning the remaining unit a complementary replicate factor greater than 1.0, and assigning all other units from the other \(H - 1\) strata a replicate factor of 1.0. This procedure is carried out for each variance stratum, resulting in \(H\) replicates, each of which provides an estimate of the population total.

In general, this process is repeated for subsequent levels of sampling. In practice, this is not practicable for a design with three or more stages of sampling, and the marginal improvement in precision of the variance estimates would be negligible in all such cases in the NAEP setting. Thus in NAEP, when a two-stage design is used (sampling schools and then students), replication has been carried out at both stages since 2011 for the purpose of computing replicate student weights. The change implemented in 2011 permitted the introduction of a finite population correction factor at the school sampling stage. Prior to 2011, replication was only carried out at the first stage of selection. See Rizzo and Rust (2011) for a description of the methodology.

When a three-stage design is used, involving the selection of geographic Primary Sampling Units (PSUs), then schools, and then students, the replication procedure is only carried out at the first stage of sampling (the PSU stage for noncertainty PSUs, and the school stage within certainty PSUs). In this situation, the school and student variance components are correctly estimated, and the overstatement of the between-PSU variance component is relatively very small.

The jackknife estimate of the variance for any given statistic is given by the following formula:

\begin{equation} \nu(\hat{t}) =\sum_{h=1}^{H} {(\hat{t}_{h}-\hat{t})^2}, \end{equation} where

\(\hat{t}\) represents the full sample estimate of the given statistic; and

\(\hat{t}_{h}\) represents the corresponding estimate for replicate \(h\).

Each replicate undergoes the same weighting procedure as the full sample so that the jackknife variance estimator reflects the contributions to or reductions in variance resulting from the various weighting adjustments.

The NAEP jackknife variance estimator is based on 62 variance strata resulting in a set of 62 replicate weights assigned to each school and student.
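The sketch below applies the jackknife formula above to a weighted mean, using a handful of invented student values, a full-sample weight, and two illustrative replicate weight sets; an operational NAEP file would carry 62 replicate weights per record. All names and numbers are hypothetical.

```python
def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def jackknife_variance(values, full_weights, replicate_weights):
    """Sum, over replicates, of the squared difference between the replicate
    estimate and the full-sample estimate (the formula above)."""
    full_estimate = weighted_mean(values, full_weights)
    return sum((weighted_mean(values, rep) - full_estimate) ** 2
               for rep in replicate_weights)

# Four hypothetical students with invented scores and weights (not NAEP data).
y = [230.0, 251.0, 244.0, 268.0]
w = [100.0, 120.0, 110.0, 95.0]
replicates = [
    [160.0, 48.0, 110.0, 95.0],   # perturbs the first pair (factors 1.6 and 0.4)
    [100.0, 120.0, 165.0, 47.5],  # perturbs the second pair (factors 1.5 and 0.5)
]
print(round(jackknife_variance(y, w, replicates), 3))
```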

The basic idea of the paired jackknife variance estimator is to create the replicate weights so that use of the jackknife procedure results in an unbiased variance estimator for totals and means, which is also reasonably efficient (i.e., has a low variance as a variance estimator). The jackknife variance estimator will then produce a consistent (but not fully unbiased) estimate of variance for (sufficiently smooth) nonlinear functions of total and mean estimates such as ratios, regression coefficients, and so forth (Shao and Tu 1995).

The development below shows why the NAEP jackknife variance estimator returns an unbiased variance estimator for totals and means, which is the cornerstone to the asymptotic results for nonlinear estimators. See for example Rust (1985). This paper also discusses why this variance estimator is generally efficient (i.e., more reliable than alternative approaches requiring similar computational resources).

The development is done for an estimate of a mean based on a simplified sample design that closely approximates the sample design for first-stage units used in the NAEP studies. The sample design is a stratified random sample with \(H\) strata with population weights \(W_{h}\), stratum sample sizes \(n_{h}\), and stratum sample means \(\overline{y}_{h}\). The population estimator \(\hat{\overline{Y}}\) and the standard unbiased variance estimator \(\nu(\hat{\overline{Y}})\) are

\begin{equation} \hat{\overline{Y}} =\sum_{h=1}^{H} W_{h}\overline{y}_{h}, \end{equation}

\begin{equation} \nu \left(\hat{\overline{Y}} \right) = \sum_{h=1}^{H} W_{h}^2 \frac{s_h^2}{n_{h}}, \end{equation}

with

\begin{equation} s^2_h=\frac{1}{n_{h}-1} \sum_{i=1}^{n_{h}} {(y_{h_{i}}-\overline{y}_{h})^2}. \end{equation}

The paired jackknife replicate variance estimator assigns one replicate \(h=1,...,H\) to each stratum, so that the number of replicates equals \(H\). In NAEP, the replicates correspond generally to pairs and triplets (with the latter only being used if there are an odd number of sample units within a particular primary stratum generating replicate strata). For pairs, the process of generating replicates can be viewed as taking a simple random sample \(J\) of size \(\frac{n_{h}}{2}\) within the replicate stratum, and assigning an increased weight to the sampled elements, and a decreased weight to the unsampled elements. In certain applications, the increased weight is double the full sample weight, while the decreased weight is in fact equal to zero. In this simplified case, this assignment reduces to replacing \(\overline{y}_{h}\) with \(\overline{y}_{h}(J)\), the latter being the sample mean of the sampled \(\frac{n_{h}}{2}\) units. Then the replicate estimator corresponding to stratum \(r\) is

\begin{equation} \hat{\overline{Y}}(r)=\sum_{h \ne r}^{H} W_{h} \overline{y}_h + W_r \overline{y}_r(J). \end{equation} The \(r\)-th term in the sum of squares for \(\nu_{j} \left( \hat{\overline{Y}}\right)\) is thus

\begin{equation} \left( \hat{\overline{Y}}(r)- \hat{\overline{Y}} \right)^2 = W_r^2 \left( \overline{y}_r(J)- \overline{y}_r \right)^2. \end{equation}

In stratified random sampling, when a sample of size \(\frac{n_r}{2}\) is drawn without replacement from a population of size \(n_r\), the sampling variance is

\begin{equation} \begin{aligned} E \left( \overline{y}_{r}(J)-\overline{y}_r \right)^2 = \frac {1} {\frac{n_r}{2} } \frac{ n_r - \frac{n_r}{2}} {n_r} \frac {1}{n_r-1} \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 \\ = \frac {1} {n_r \left( n_r-1 \right) } \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 = \frac {s^2_r}{n_r}. \end{aligned} \end{equation}

See, for example, Cochran (1977), Theorem 5.3, using \(n_r\) as the “population size,” \(\frac{n_r}{2}\) as the “sample size,” and \(s^2_r\) as the “population variance” in the given formula. Thus,

\begin{equation} E \left\{ W_r^2 \left( \overline{y}_{r}(J)- \overline{y}_r \right)^2 \right\} = W_r^2 \frac{s_r^2}{n_r}. \end{equation} Taking the expectation over all of these stratified samples of size \(\frac{n_r}{2}\), it is found that

\begin{equation} E \left( \nu_j \left( \hat{\overline{Y}} \right) \right) =\nu \left( \hat{\overline{Y}}\right). \end{equation}

In this sense, the jackknife variance estimator "gives back" the sample variance estimator for means and totals as desired under the theory.

In cases where, rather than doubling the weight of one half of one variance stratum and assigning a zero weight to the other, the weight of one unit is multiplied by a replicate factor of \((1+\delta)\), while the other is multiplied by \((1-\delta)\), the result is that

\begin{equation} E \left( \hat{\overline{Y}}(r)- \hat{\overline{Y}} \right)^2 = W^2_r \delta^2 \frac{s^2_r}{n_r}. \end{equation}

In this way, by setting \(\delta\) equal to the square root of the finite population correction factor, the jackknife variance estimator is able to incorporate a finite population correction factor into the variance estimator.

In practice, variance strata are also grouped to make sure that the number of replicates is not too large (the total number of variance strata is usually 62 for NAEP). The randomization from the original sample distribution guarantees that the sum of squares contributed by each replicate will be close to the target expected value.

For triples, the replicate factors are perturbed to something other than 1.0 in two different replicates, rather than just one as in the case of pairs. Again, in the simple case where replicate factors that are less than 1 are all set to 0, the replicate weight factors are calculated as follows.

For unit \(i\) in variance stratum \(r\)

\begin{equation} w_i(r) = \left\{\begin{array}{lll} 1.5w_i & i= \text{variance unit 1}\\ 1.5w_i & i= \text{variance unit 2}\\ 0 & i= \text{variance unit 3} \end{array}\right. \end{equation}

where weight \(w_i\) is the full sample base weight. Furthermore, for \(r'=r+31\) \(mod\) \(62\)

\begin{equation} w_i(r') = \left\{ \begin{array}{llll} 1.5w_i & i= \text{variance unit 1}\\ 0 & i= \text{variance unit 2}\\ 1.5w_i & i= \text{variance unit 3} \end{array}\right. \end{equation}

And for all other values \(r^*\), other than \(r\) and \(r'\), \(w_i \left(r^*\right)=1\).

In the case of stratified random sampling, this formula reduces to replacing \(\overline{y}_r\) with \(\overline{y}_r(J)\) for replicate \(r\), where \(\overline{y}_r(J)\) is the sample mean from a "\(2/3\)" sample of \(\frac{2n_r}{3}\) units from the \(n_r\) sample units in the replicate stratum, and replacing \(\overline{y}_r\) with \(\overline{y}_{r'}(J)\) for replicate \(r'\), where \(\overline{y}_{r'}(J)\) is the sample mean from another overlapping "\(2/3\)" sample of \(\frac{2n_r}{3}\) units from the \(n_r\) sample units in the replicate stratum.

The \(r\)-th and \(r'\)-th replicates can be written as

\begin{equation} \hat{\overline{Y}}(r)=\sum_{h \ne r}^{H} W_{h} \overline{y}_h + W_r \overline{y}_r(J), \end{equation}

\begin{equation} \hat{\overline{Y}}(r')=\sum_{h \ne r}^{H} W_{h} \overline{y}_h + W_r \overline{y}_{r'}(J). \end{equation}

From these formulas, expressions for the \(r\)-th and \(r'\)-th components of the jackknife variance estimator are obtained (ignoring other sums of squares from other grouped components attached to those replicates):

\begin{equation} \left( \hat{\overline{Y}}(r)- \hat{\overline{Y}}\right)^2= W^2_r \left( \overline{y}_r(J)- \overline{y}_{r}\right)^2, \end{equation}

\begin{equation} \left( \hat{\overline{Y}}(r')- \hat{\overline{Y}}\right)^2= W^2_r \left( \overline{y}_{r'}(J)- \overline{y}_{r}\right)^2. \end{equation}

These sums of squares have expectations as follows, using the general formula for sampling variances:

\begin{equation} \begin{aligned} E\left( \overline{y}_r(J)- \overline{y}_r\right)^2 = \frac {1}{\frac{2n_r}{3}} \frac {n_r- \frac{2n_r}{3} }{n_r} \frac {1}{n_r-1} \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 \\ =\frac{1}{2n_r \left( n_r-1\right)} \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 \\ =\frac {s^2_r}{2n_r}, \end{aligned} \end{equation}

\begin{equation} \begin{aligned} E\left( \overline{y}_{r'}(J)- \overline{y}_r\right)^2 = \frac {1}{\frac{2n_r}{3}} \frac {n_r- \frac{2n_r}{3} }{n_r} \frac {1}{n_r-1} \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 \\ =\frac{1}{2n_r \left( n_r-1\right)} \sum_{i=1}^{n_r} \left( y_{r_{i}} - \overline{y}_r \right)^2 \\ =\frac {s^2_r}{2n_r}. \end{aligned} \end{equation} Thus,

\begin{equation} \begin{aligned} E \left\{ W_r^2 \left( \overline{y}_r(J)- \overline{y}_r \right)^2 + W_r^2 \left( \overline{y}_{r'}(J)- \overline{y}_r \right)^2 \right\} \\ = W_r^2 \left( \frac {s^2_r}{2n_r} + \frac {s^2_r}{2n_r} \right) \\ = W_r^2 \frac {s^2_r}{n_r}, \end{aligned} \end{equation} as desired again.





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/replicate_variance_estimation_for_the_2022_assessment.aspx




NAEP Technical Documentation Quality Control on Weighting Procedures


Given the complexity of the weighting procedures utilized in NAEP, a range of quality control (QC) checks was conducted throughout the weighting process to identify potential problems with collected student-level demographic data or with specific weighting procedures. The QC processes included:

checks performed within each step of the weighting process;

checks performed across adjacent steps of the weighting process;

review of participation, exclusion, and accommodation rates;

checks of demographic data of individual schools and students;

comparisons with 2019 demographic data (or 2020 demographic data in the case of long-term trend [LTT]); and

nonresponse bias analyses.

Final Participation, Exclusion, and Accommodation Rates

Nonresponse Bias Analyses

To validate the weighting process, extensive tabulations of various school and student characteristics at different stages of the process were conducted. The school-level characteristics included in the tabulations were racial/ethnic enrollment, median income (based on the school ZIP code area), and urban-centric locale. At the student level, the tabulations included race/ethnicity, gender, relative age, student disability (SD) status, English learner (EL) status, and participation status in the National School Lunch Program (NSLP).




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/quality_control_on_weighting_procedures_for_the_2022_assessment.aspx


NAEP Technical Documentation Final Participation, Exclusion, and Accommodation Rates


Final participation, exclusion, and accommodation rates are presented in quality control tables for each grade (or age) and subject by geographic domain and school type. School- and student-level participation rates have been calculated according to National Center for Education Statistics (NCES) standards as they have been for previous assessments.

At the school level, private schools had participation rates below 85 percent in all grades (or ages) and subjects. At the student level, response rates at grade 8 fell below 85 percent for mathematics, reading, or both for the following state domains: Alaska, District of Columbia, Hawaii, New Hampshire, and New York; and the following TUDA domains: District of Columbia Public Schools, New York City, and Milwaukee. As required by NCES standards, nonresponse bias analyses were conducted on each reporting group falling below the 85 percent participation threshold.

Grade 4 Mathematics

Grade 4 Reading

Grade 8 Mathematics

Grade 8 Reading

Grade 8 Civics

Grade 8 U.S. History

Age 9 Mathematics

Age 9 Reading

Age 13 Mathematics

Age 13 Reading




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/final_participation_exclusion_and_accommodation_rates_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Age 13 Mathematics

The following table displays the school-level participation rates and student-level participation, exclusion, and accommodation rates for the age 13 long-term trend mathematics assessment. Various weights were used in the calculation of the school rates, as indicated in the column headings of the table. For the student participation rates, student base weights were used. For the student exclusion rates and accommodation rates, student base weights with adjustment for school nonresponse were used. Different weights were used at the student level because the student participation rates are conditional on (i.e., computed within) the participating schools, whereas the exclusion and accommodation rates are population estimates.

The school participation rates reflect the participation of the original sampled schools only and do not reflect any effect of substitution. The rates weighted by the school base weight and enrollment show the approximate proportion of the student population in the domain that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, age 13 long-term trend mathematics assessment, by school type and geographic region: 2022









School type and geographic region | Number of schools in original sample, rounded | School participation rates (percent) before substitution (weighted by base weight and enrollment) | School participation rates (percent) before substitution (weighted by base weight only) | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rates (percent) after makeups | Weighted percent of students accommodated

National all1 | 660 | 85.98 | 70.79 | 10,500 | 2.31 | 89.11 | 14.23
Northeast all | 110 | 79.16 | 55.37 | 1,300 | 2.28 | 84.99 | 18.15
Midwest all | 130 | 85.27 | 74.53 | 1,900 | 2.33 | 89.53 | 12.97
South all | 260 | 87.71 | 72.78 | 4,500 | 1.79 | 90.17 | 16.62
West all | 160 | 88.04 | 78.49 | 2,700 | 3.20 | 89.31 | 8.67
National public | 480 | 89.81 | 91.10 | 9,700 | 2.48 | 89.25 | 15.08
National private | 180 | 40.35 | 33.24 | 800 | 0.29 | 85.77 | 4.21
Catholic | 60 | 82.98 | 80.17 | 700 | 0.73 | 85.77 | 6.34
Non-Catholic | 120 | 12.54 | 17.27 | 100 | 0.00 | 85.76 | 2.83

1 National all includes national public, national private, and Department of Defense Education Activity (DoDEA) schools that are located in the United States.

NOTE: School counts are rounded to nearest ten and student counts are rounded to nearest hundred. Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Long-Term Trend Mathematics Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_age_13_mathematics.aspx


NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Age 13 Reading

The following table displays the school-level participation rates and student-level participation, exclusion, and accommodation rates for the age 13 long-term trend reading assessment. Various weights were used in the calculation of the school rates, as indicated in the column headings of the table. For the student participation rates, student base weights were used. For the student exclusion rates and accommodation rates, student base weights with adjustment for school nonresponse were used. Different weights were used at the student level because the student participation rates are conditional on (i.e., computed within) the participating schools, whereas the exclusion and accommodation rates are population estimates.

The school participation rates reflect the participation of the original sampled schools only and do not reflect any effect of substitution. The rates weighted by the school base weight and enrollment show the approximate proportion of the student population in the domain that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, age 13 long-term trend reading assessment, by school type and geographic region: 2022









School type and geographic region | Number of schools in original sample, rounded | School participation rates (percent) before substitution (weighted by base weight and enrollment) | School participation rates (percent) before substitution (weighted by base weight only) | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rates (percent) after makeups | Weighted percent of students accommodated

National all1 | 660 | 85.98 | 70.79 | 10,500 | 3.08 | 89.22 | 13.08
Northeast all | 110 | 79.16 | 55.37 | 1,300 | 4.64 | 83.90 | 14.83
Midwest all | 130 | 85.27 | 74.53 | 1,900 | 1.61 | 89.71 | 13.16
South all | 260 | 87.71 | 72.78 | 4,600 | 2.85 | 91.05 | 15.39
West all | 160 | 88.04 | 78.49 | 2,700 | 3.91 | 88.66 | 7.65
National public | 480 | 89.81 | 91.10 | 9,700 | 3.29 | 89.28 | 13.53
National private | 180 | 40.35 | 33.24 | 800 | 0.66 | 87.68 | 7.92
Catholic | 60 | 82.98 | 80.17 | 700 | 0.42 | 87.16 | 4.50
Non-Catholic | 120 | 12.54 | 17.27 | 100 | 0.83 | 89.97 | 10.18

1 National all includes national public, national private, and Department of Defense Education Activity (DoDEA) schools that are located in the United States.

NOTE: School counts are rounded to nearest ten and student counts are rounded to nearest hundred. Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Long-Term Trend Reading Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_age_13_reading.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Age 9 Mathematics

The following table displays the school-level participation rates and student-level participation, exclusion, and accommodation rates for the age 9 long-term trend mathematics assessment. Various weights were used in the calculation of the school rates, as indicated in the column headings of the table. For the student participation rates, student base weights were used. For the student exclusion rates and accommodation rates, student base weights with adjustment for school nonresponse were used. Different weights were used at the student level because the student participation rates are conditional on (i.e., computed within) the participating schools, whereas the exclusion and accommodation rates are population estimates.

The school participation rates reflect the participation of the original sampled schools only and do not reflect any effect of substitution. The rates weighted by the school base weight and enrollment show the approximate proportion of the student population in the domain that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, age 9 long-term trend mathematics assessment, by school type and geographic region: 2022

School type and geographic region | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
National all1 | 580 | 85.93 | 72.56 | 9,200 | 1.87 | 87.08 | 14.95
Northeast all | 90 | 86.73 | 63.95 | 1,300 | 2.93 | 83.10 | 16.41
Midwest all | 110 | 74.94 | 72.78 | 1,500 | 1.37 | 87.78 | 13.55
South all | 240 | 93.22 | 79.44 | 4,300 | 1.59 | 88.44 | 19.57
West all | 140 | 82.42 | 68.64 | 2,200 | 2.11 | 86.69 | 7.81
National public | 410 | 90.45 | 88.79 | 8,700 | 2.02 | 86.96 | 16.00
National private | 160 | 32.02 | 28.98 | 500 | 0.12 | 90.42 | 2.82
Catholic | 50 | 62.73 | 60.99 | 400 | 0.31 | 93.25 | 2.37
Non-Catholic | 120 | 13.88 | 19.13 | 100 | 0.00 | 84.02 | 3.09

1 National all includes national public, national private, and Department of Defense Education Activity (DoDEA) schools that are located in the United States.
NOTE: School counts are rounded to nearest ten and student counts are rounded to nearest hundred. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Long-Term Trend Mathematics Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_age_9_mathematics.aspx



NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Age 9 Reading

The following table displays the school-level participation rates and student-level participation, exclusion, and accommodation rates for the age 9 long-term trend reading assessment. Various weights were used in the calculation of the school rates, as indicated in the column headings of the table. For the student participation rates, student base weights were used. For the student exclusion rates and accommodation rates, student base weights with adjustment for school nonresponse were used. Different weights were used at the student level because the student participation rates are conditional on (i.e., computed within) the participating schools, whereas the exclusion and accommodation rates are population estimates.

The school participation rates reflect the participation of the original sampled schools only and do not reflect any effect of substitution. The rates weighted by the school base weight and enrollment show the approximate proportion of the student population in the domain that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, age 9 long-term trend reading assessment, by school type and geographic region: 2022

School type and geographic region | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
National all1 | 580 | 85.93 | 72.56 | 9,200 | 2.34 | 87.13 | 14.18
Northeast all | 90 | 86.73 | 63.95 | 1,300 | 1.90 | 82.35 | 18.66
Midwest all | 110 | 74.94 | 72.78 | 1,500 | 1.20 | 89.58 | 14.13
South all | 240 | 93.22 | 79.44 | 4,300 | 2.71 | 87.94 | 16.53
West all | 140 | 82.42 | 68.64 | 2,200 | 3.05 | 86.99 | 7.52
National public | 410 | 90.45 | 88.79 | 8,700 | 2.52 | 87.00 | 15.13
National private | 160 | 32.02 | 28.98 | 500 | 0.22 | 90.89 | 3.20
Catholic | 50 | 62.73 | 60.99 | 400 | 0.59 | 92.30 | 3.29
Non-Catholic | 120 | 13.88 | 19.13 | 100 | 0.00 | 87.70 | 3.15

1 National all includes national public, national private, and Department of Defense Education Activity (DoDEA) schools that are located in the United States.
NOTE: School counts are rounded to nearest ten and student counts are rounded to nearest hundred. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Long-Term Trend Reading Assessment.



http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_age_9_reading.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 4 Mathematics

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 4 mathematics assessment by school type and jurisdiction. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the jurisdiction that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 4 mathematics combined national and state assessment, by school type and jurisdiction: 2022

School type and jurisdiction | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 6,410 | 94.48 | 82.93 | 139,400 | 1.81 | 91.86 | 14.27
National all1 | 6,260 | 94.45 | 82.79 | 135,800 | 1.81 | 91.85 | 14.19
Northeast all | 1,060 | 91.11 | 76.06 | 21,700 | 1.74 | 90.47 | 17.74
Midwest all | 1,460 | 95.32 | 85.44 | 29,400 | 1.33 | 92.18 | 12.95
South all | 2,170 | 94.70 | 82.26 | 50,600 | 2.12 | 92.51 | 16.69
West all | 1,500 | 95.54 | 85.60 | 31,900 | 1.78 | 91.39 | 8.73
National public | 5,750 | 99.53 | 99.54 | 130,900 | 1.94 | 91.79 | 14.93
Alabama | 90 | 100.00 | 100.00 | 2,200 | 1.27 | 94.68 | 10.42
Alaska | 130 | 99.20 | 93.50 | 2,100 | 1.01 | 88.61 | 16.83
Arizona | 90 | 100.00 | 100.00 | 2,200 | 1.28 | 92.83 | 11.27
Arkansas | 90 | 100.00 | 100.00 | 2,000 | 0.97 | 92.58 | 21.24
California | 190 | 100.00 | 100.00 | 4,500 | 2.22 | 91.91 | 7.38
Colorado | 120 | 99.04 | 98.43 | 2,900 | 1.76 | 91.10 | 11.74
Connecticut | 90 | 100.00 | 100.00 | 2,100 | 2.46 | 91.79 | 17.50
Delaware | 80 | 100.00 | 100.00 | 2,200 | 1.74 | 91.03 | 17.15
District of Columbia | 90 | 100.00 | 100.00 | 2,100 | 2.25 | 88.24 | 25.53
Florida | 210 | 100.00 | 100.00 | 5,400 | 2.69 | 91.80 | 21.68
Georgia | 120 | 96.17 | 96.07 | 3,100 | 1.40 | 92.84 | 15.03
Hawaii | 90 | 100.00 | 100.00 | 2,200 | 1.62 | 88.53 | 6.55
Idaho | 90 | 100.00 | 100.00 | 2,000 | 0.92 | 93.22 | 10.07
Illinois | 150 | 100.00 | 100.00 | 3,200 | 1.32 | 91.19 | 19.24
Indiana | 90 | 98.63 | 99.22 | 2,000 | 0.45 | 92.77 | 19.67
Iowa | 90 | 98.67 | 99.41 | 2,100 | 1.42 | 93.08 | 13.54
Kansas | 100 | 100.00 | 100.00 | 2,100 | 1.43 | 92.88 | 10.00
Kentucky | 120 | 100.00 | 100.00 | 2,700 | 1.93 | 94.50 | 16.61
Louisiana | 90 | 100.00 | 100.00 | 2,100 | 1.60 | 92.36 | 19.32
Maine | 110 | 100.00 | 100.00 | 2,000 | 1.51 | 90.46 | 15.87
Maryland | 120 | 100.00 | 100.00 | 3,000 | 1.37 | 92.09 | 21.26
Massachusetts | 130 | 100.00 | 100.00 | 3,100 | 1.91 | 92.84 | 18.33
Michigan | 140 | 100.00 | 100.00 | 3,100 | 2.93 | 91.18 | 7.97
Minnesota | 90 | 100.00 | 100.00 | 2,300 | 2.47 | 90.79 | 10.38
Mississippi | 90 | 100.00 | 100.00 | 2,200 | 0.76 | 92.78 | 13.96
Missouri | 100 | 100.00 | 100.00 | 2,000 | 0.92 | 94.35 | 12.21
Montana | 130 | 99.95 | 98.56 | 2,100 | 1.00 | 89.77 | 10.64
Nebraska | 100 | 100.00 | 100.00 | 2,200 | 1.23 | 94.76 | 14.14
Nevada | 100 | 100.00 | 100.00 | 2,400 | 1.70 | 92.26 | 6.13
New Hampshire | 100 | 99.15 | 98.79 | 2,200 | 1.28 | 86.89 | 15.94
New Jersey | 90 | 98.72 | 98.91 | 2,000 | 2.03 | 92.15 | 20.06
New Mexico | 120 | 100.00 | 100.00 | 2,600 | 1.58 | 90.58 | 15.72
New York | 120 | 95.76 | 95.82 | 2,900 | 1.22 | 86.46 | 21.33
North Carolina | 160 | 100.00 | 100.00 | 4,100 | 1.98 | 90.96 | 13.47
North Dakota | 120 | 99.28 | 97.26 | 2,200 | 1.28 | 90.24 | 11.59
Ohio | 140 | 100.00 | 100.00 | 2,800 | 1.21 | 92.78 | 16.60
Oklahoma | 100 | 100.00 | 100.00 | 2,100 | 2.21 | 93.65 | 16.15
Oregon | 90 | 100.00 | 100.00 | 2,200 | 1.55 | 87.89 | 10.17
Pennsylvania | 120 | 99.86 | 99.93 | 3,000 | 2.02 | 92.53 | 14.36
Rhode Island | 90 | 100.00 | 100.00 | 2,100 | 1.61 | 94.25 | 17.33
South Carolina | 90 | 100.00 | 100.00 | 2,100 | 1.11 | 93.01 | 12.09
South Dakota | 120 | 100.00 | 100.00 | 2,100 | 1.13 | 93.78 | 9.42
Tennessee | 120 | 100.00 | 100.00 | 2,800 | 2.39 | 92.04 | 14.05
Texas | 270 | 100.00 | 100.00 | 6,800 | 3.09 | 92.75 | 19.43
Utah | 90 | 100.00 | 100.00 | 2,200 | 1.10 | 92.24 | 11.54
Vermont | 130 | 100.00 | 100.00 | 2,100 | 1.41 | 88.80 | 16.41
Virginia | 90 | 100.00 | 100.00 | 2,200 | 2.79 | 91.98 | 14.14
Washington | 90 | 100.00 | 100.00 | 2,200 | 2.15 | 89.21 | 10.33
West Virginia | 100 | 100.00 | 100.00 | 2,100 | 1.59 | 92.64 | 11.28
Wisconsin | 130 | 100.00 | 100.00 | 2,700 | 1.29 | 90.27 | 12.14
Wyoming | 100 | 98.78 | 99.16 | 2,100 | 1.27 | 90.11 | 13.57
Trial Urban (TUDA) Districts
Albuquerque | 40 | 100.00 | 100.00 | 1,100 | 0.56 | 91.29 | 18.75
Atlanta | 40 | 100.00 | 100.00 | 1,100 | 0.87 | 93.69 | 12.51
Austin | 40 | 100.00 | 100.00 | 1,200 | 3.08 | 87.96 | 30.02
Baltimore City | 50 | 100.00 | 100.00 | 1,000 | 1.39 | 89.82 | 26.15
Boston | 50 | 100.00 | 100.00 | 1,100 | 5.73 | 90.83 | 19.29
Charlotte-Mecklenburg | 40 | 100.00 | 100.00 | 1,100 | 2.23 | 92.44 | 12.53
Chicago | 70 | 100.00 | 100.00 | 1,500 | 2.85 | 90.39 | 23.41
Clark County (NV) | 60 | 100.00 | 100.00 | 1,600 | 1.16 | 92.17 | 5.76
Cleveland | 50 | 100.00 | 100.00 | 900 | 2.55 | 88.77 | 24.26
Dallas | 40 | 100.00 | 100.00 | 1,000 | 4.26 | 91.71 | 38.46
Denver | 40 | 100.00 | 100.00 | 1,100 | 2.16 | 88.86 | 15.39
Detroit | 40 | 100.00 | 100.00 | 1,100 | 4.11 | 89.97 | 7.55
Duval County (FL) | 40 | 100.00 | 100.00 | 1,100 | 2.07 | 91.75 | 23.97
Fort Worth | 40 | 100.00 | 100.00 | 1,100 | 2.26 | 93.11 | 18.16
Guilford County (NC) | 40 | 100.00 | 100.00 | 1,100 | 1.46 | 92.62 | 15.31
Hillsborough County (FL) | 40 | 100.00 | 100.00 | 1,100 | 3.14 | 92.13 | 23.08
Houston | 60 | 100.00 | 100.00 | 1,600 | 3.22 | 93.43 | 22.76
Jefferson County (KY) | 40 | 100.00 | 100.00 | 1,000 | 3.58 | 93.68 | 22.10
Los Angeles | 60 | 100.00 | 100.00 | 1,600 | 1.98 | 91.97 | 10.61
Miami | 60 | 100.00 | 100.00 | 1,600 | 3.08 | 94.64 | 25.04
Milwaukee | 50 | 100.00 | 100.00 | 1,000 | 1.45 | 86.42 | 22.28
New York City | 70 | 98.73 | 98.87 | 1,600 | 1.26 | 87.37 | 27.52
Philadelphia | 40 | 98.11 | 99.29 | 1,000 | 4.32 | 93.65 | 21.07
San Diego | 40 | 100.00 | 100.00 | 1,000 | 2.92 | 88.61 | 12.46
Shelby County (TN) | 40 | 100.00 | 100.00 | 1,000 | 3.73 | 94.05 | 14.79
District of Columbia (DCPS) | 50 | 100.00 | 100.00 | 1,300 | 3.30 | 89.57 | 30.10
National private | 390 | 37.50 | 33.92 | 1,800 | 0.48 | 93.71 | 5.98
Catholic | 120 | 66.61 | 68.59 | 1,100 | 0.37 | 93.67 | 7.03
Non-Catholic | 270 | 20.01 | 20.37 | 700 | 0.54 | 93.77 | 5.36
Other jurisdictions
DoDEA2 | 100 | 94.55 | 92.13 | 3,000 | 1.68 | 88.71 | 17.62
Puerto Rico | 150 | 100.00 | 100.00 | 3,500 | 0.16 | 92.19 | 31.52

1 Includes national public and national private schools located in the United States and all Department of Defense Education Activity schools, but not schools in Puerto Rico.
2 Department of Defense Education Activity schools.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Mathematics Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_4_mathematics_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 4 Reading

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 4 reading assessment by school type and jurisdiction. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the jurisdiction that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 4 reading combined national and state assessment, by school type and jurisdiction: 2022

School type and jurisdiction | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 6,260 | 94.45 | 82.79 | 127,000 | 1.96 | 91.70 | 14.09
National all1 | 6,260 | 94.45 | 82.79 | 127,000 | 1.96 | 91.70 | 14.09
Northeast all | 1,060 | 91.11 | 76.06 | 20,400 | 2.14 | 90.18 | 16.94
Midwest all | 1,460 | 95.32 | 85.44 | 27,600 | 1.44 | 92.11 | 12.57
South all | 2,170 | 94.70 | 82.26 | 47,200 | 2.22 | 92.36 | 16.86
West all | 1,500 | 95.54 | 85.60 | 29,900 | 1.85 | 91.23 | 8.89
National public | 5,750 | 99.53 | 99.54 | 122,400 | 2.11 | 91.61 | 14.86
Alabama | 90 | 100.00 | 100.00 | 2,000 | 1.14 | 93.53 | 11.45
Alaska | 130 | 99.20 | 93.50 | 2,000 | 0.61 | 88.73 | 18.28
Arizona | 90 | 100.00 | 100.00 | 2,100 | 1.21 | 92.25 | 11.39
Arkansas | 90 | 100.00 | 100.00 | 1,900 | 1.69 | 93.87 | 19.84
California | 190 | 100.00 | 100.00 | 4,200 | 2.30 | 91.45 | 7.91
Colorado | 120 | 99.04 | 98.43 | 2,700 | 2.67 | 91.37 | 10.85
Connecticut | 90 | 100.00 | 100.00 | 2,000 | 2.51 | 88.93 | 17.72
Delaware | 80 | 100.00 | 100.00 | 2,000 | 1.33 | 89.69 | 18.24
District of Columbia | 90 | 100.00 | 100.00 | 1,900 | 4.17 | 87.75 | 23.07
Florida | 210 | 100.00 | 100.00 | 5,000 | 2.37 | 93.05 | 22.65
Georgia | 120 | 96.17 | 96.07 | 3,000 | 1.82 | 92.18 | 16.21
Hawaii | 90 | 100.00 | 100.00 | 2,000 | 1.24 | 88.58 | 6.03
Idaho | 90 | 100.00 | 100.00 | 1,900 | 1.72 | 92.45 | 10.09
Illinois | 150 | 100.00 | 100.00 | 3,000 | 0.92 | 90.91 | 18.44
Indiana | 90 | 98.63 | 99.22 | 1,900 | 0.68 | 93.09 | 19.87
Iowa | 90 | 98.67 | 99.41 | 2,000 | 1.18 | 92.99 | 14.86
Kansas | 100 | 100.00 | 100.00 | 1,900 | 0.97 | 93.24 | 10.40
Kentucky | 120 | 100.00 | 100.00 | 2,600 | 3.07 | 93.45 | 16.12
Louisiana | 90 | 100.00 | 100.00 | 1,900 | 2.39 | 92.12 | 17.91
Maine | 110 | 100.00 | 100.00 | 1,900 | 1.02 | 92.02 | 16.34
Maryland | 120 | 100.00 | 100.00 | 2,800 | 1.93 | 91.81 | 21.34
Massachusetts | 130 | 100.00 | 100.00 | 2,900 | 2.48 | 92.98 | 16.39
Michigan | 140 | 100.00 | 100.00 | 2,900 | 2.56 | 90.98 | 8.10
Minnesota | 90 | 100.00 | 100.00 | 2,100 | 3.54 | 91.18 | 10.29
Mississippi | 90 | 100.00 | 100.00 | 2,000 | 1.31 | 92.70 | 13.47
Missouri | 100 | 100.00 | 100.00 | 2,000 | 0.84 | 93.38 | 12.72
Montana | 130 | 99.95 | 98.56 | 2,000 | 1.32 | 89.73 | 11.56
Nebraska | 100 | 100.00 | 100.00 | 2,000 | 1.29 | 94.33 | 14.01
Nevada | 100 | 100.00 | 100.00 | 2,300 | 1.52 | 91.47 | 7.40
New Hampshire | 100 | 99.15 | 98.79 | 2,000 | 1.15 | 87.70 | 16.10
New Jersey | 90 | 98.72 | 98.91 | 1,900 | 2.84 | 91.90 | 19.03
New Mexico | 120 | 100.00 | 100.00 | 2,400 | 1.38 | 91.03 | 14.56
New York | 120 | 95.76 | 95.82 | 2,700 | 2.23 | 86.57 | 19.98
North Carolina | 160 | 100.00 | 100.00 | 3,800 | 1.86 | 91.11 | 13.85
North Dakota | 120 | 99.28 | 97.26 | 2,100 | 1.70 | 91.16 | 11.76
Ohio | 140 | 100.00 | 100.00 | 2,700 | 2.40 | 92.35 | 14.63
Oklahoma | 100 | 100.00 | 100.00 | 1,900 | 1.67 | 92.40 | 16.62
Oregon | 90 | 100.00 | 100.00 | 2,100 | 1.85 | 89.66 | 9.65
Pennsylvania | 120 | 99.86 | 99.93 | 2,800 | 2.12 | 91.83 | 14.50
Rhode Island | 90 | 100.00 | 100.00 | 2,000 | 1.19 | 93.82 | 17.68
South Carolina | 90 | 100.00 | 100.00 | 2,000 | 1.67 | 92.09 | 12.06
South Dakota | 120 | 100.00 | 100.00 | 2,000 | 1.04 | 93.95 | 9.33
Tennessee | 120 | 100.00 | 100.00 | 2,700 | 2.14 | 91.68 | 14.13
Texas | 270 | 100.00 | 100.00 | 6,400 | 3.28 | 92.44 | 19.82
Utah | 90 | 100.00 | 100.00 | 2,000 | 1.03 | 92.30 | 10.55
Vermont | 130 | 100.00 | 100.00 | 2,000 | 1.27 | 89.00 | 16.55
Virginia | 90 | 100.00 | 100.00 | 2,000 | 2.25 | 91.86 | 12.33
Washington | 90 | 100.00 | 100.00 | 2,000 | 1.72 | 88.85 | 9.78
West Virginia | 100 | 100.00 | 100.00 | 1,900 | 1.66 | 90.27 | 10.12
Wisconsin | 130 | 100.00 | 100.00 | 2,600 | 0.99 | 90.73 | 12.33
Wyoming | 100 | 98.78 | 99.16 | 2,000 | 1.72 | 91.74 | 14.36
Trial Urban (TUDA) Districts
Albuquerque | 40 | 100.00 | 100.00 | 1,000 | 1.34 | 91.29 | 17.90
Atlanta | 40 | 100.00 | 100.00 | 1,000 | 2.76 | 92.78 | 11.45
Austin | 40 | 100.00 | 100.00 | 1,100 | 5.14 | 89.14 | 28.32
Baltimore City | 50 | 100.00 | 100.00 | 1,000 | 3.23 | 90.67 | 25.26
Boston | 50 | 100.00 | 100.00 | 1,000 | 6.09 | 90.96 | 16.80
Charlotte-Mecklenburg | 40 | 100.00 | 100.00 | 1,000 | 1.70 | 91.77 | 10.12
Chicago | 70 | 100.00 | 100.00 | 1,400 | 2.26 | 89.02 | 23.88
Clark County (NV) | 60 | 100.00 | 100.00 | 1,500 | 1.71 | 91.94 | 7.07
Cleveland | 50 | 100.00 | 100.00 | 900 | 2.11 | 87.53 | 25.20
Dallas | 40 | 100.00 | 100.00 | 1,000 | 4.18 | 92.41 | 38.31
Denver | 40 | 100.00 | 100.00 | 1,000 | 3.17 | 90.68 | 13.12
Detroit | 40 | 100.00 | 100.00 | 1,000 | 4.21 | 89.44 | 6.40
Duval County (FL) | 40 | 100.00 | 100.00 | 1,000 | 2.08 | 93.11 | 24.16
Fort Worth | 40 | 100.00 | 100.00 | 1,000 | 3.22 | 91.01 | 17.14
Guilford County (NC) | 40 | 100.00 | 100.00 | 1,000 | 1.71 | 91.77 | 13.07
Hillsborough County (FL) | 40 | 100.00 | 100.00 | 1,000 | 3.03 | 93.93 | 22.59
Houston | 60 | 100.00 | 100.00 | 1,500 | 2.28 | 92.05 | 23.97
Jefferson County (KY) | 40 | 100.00 | 100.00 | 1,000 | 6.31 | 92.34 | 17.78
Los Angeles | 60 | 100.00 | 100.00 | 1,500 | 2.18 | 92.38 | 12.00
Miami | 60 | 100.00 | 100.00 | 1,500 | 2.92 | 92.83 | 24.82
Milwaukee | 50 | 100.00 | 100.00 | 900 | 2.32 | 85.33 | 21.67
New York City | 70 | 98.73 | 98.87 | 1,500 | 2.32 | 87.42 | 26.42
Philadelphia | 40 | 98.11 | 99.29 | 1,000 | 6.60 | 93.38 | 18.23
San Diego | 40 | 100.00 | 100.00 | 900 | 2.75 | 88.59 | 12.38
Shelby County (TN) | 40 | 100.00 | 100.00 | 1,000 | 3.82 | 91.22 | 14.68
District of Columbia (DCPS) | 50 | 100.00 | 100.00 | 1,200 | 5.69 | 88.64 | 26.20
National private | 390 | 37.50 | 33.92 | 1,600 | 0.25 | 94.12 | 5.49
Catholic | 120 | 66.61 | 68.59 | 1,000 | 0.38 | 95.19 | 6.15
Non-Catholic | 270 | 20.01 | 20.37 | 600 | 0.17 | 92.19 | 5.09
Other jurisdictions
DoDEA2 | 100 | 94.55 | 92.13 | 2,900 | 1.72 | 89.71 | 17.96

1 Includes national public and national private schools located in the United States and all Department of Defense Education Activity schools.
2 Department of Defense Education Activity schools.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Reading Assessment.





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_4_reading_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 8 Civics

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 8 civics assessment. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the school type and geographic region that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 8 civics national assessment, by school type and geographic region: 2022

School type and geographic region | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 570 | 86.62 | 69.97 | 9,400 | 1.52 | 90.05 | 13.25
National all1 | 570 | 86.62 | 69.97 | 9,400 | 1.52 | 90.05 | 13.25
Northeast all | 90 | 82.11 | 60.26 | 1,200 | 1.36 | 88.37 | 17.65
Midwest all | 110 | 87.34 | 71.53 | 1,800 | 1.39 | 91.26 | 13.01
South all | 230 | 91.46 | 73.39 | 4,200 | 1.51 | 90.52 | 14.91
West all | 140 | 80.38 | 69.74 | 2,100 | 1.77 | 88.99 | 7.73
National public | 400 | 91.00 | 91.88 | 8,800 | 1.65 | 89.96 | 14.04
National private | 170 | 33.59 | 33.24 | 600 | 0.00 | 92.30 | 4.12
Catholic | 40 | 61.74 | 74.36 | 400 | 0.00 | 91.89 | 5.34
Non-Catholic | 130 | 15.03 | 17.71 | 200 | 0.00 | 93.55 | 3.32

1 Includes national public, national private, and Department of Defense Education Activity schools located in the United States.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Civics Assessment.





http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_8_civics_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 8 Mathematics

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 8 mathematics assessment by school type and jurisdiction. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the jurisdiction that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 8 mathematics combined national and state assessment, by school type and jurisdiction: 2022

School type and jurisdiction | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 5,870 | 94.69 | 74.55 | 138,700 | 1.53 | 88.87 | 13.75
National all1 | 5,730 | 94.67 | 74.35 | 135,100 | 1.54 | 88.86 | 13.67
Northeast all | 950 | 91.21 | 62.39 | 21,700 | 1.57 | 86.65 | 17.75
Midwest all | 1,370 | 95.37 | 77.07 | 29,700 | 1.16 | 89.24 | 12.52
South all | 2,000 | 95.51 | 75.16 | 50,100 | 1.65 | 90.00 | 15.17
West all | 1,350 | 94.96 | 80.42 | 32,000 | 1.69 | 88.01 | 9.45
National public | 5,280 | 99.61 | 99.50 | 131,300 | 1.67 | 88.68 | 14.42
Alabama | 90 | 100.00 | 100.00 | 2,100 | 1.64 | 91.21 | 9.15
Alaska | 110 | 98.71 | 93.92 | 2,000 | 1.05 | 83.79 | 14.47
Arizona | 90 | 100.00 | 100.00 | 2,100 | 1.67 | 90.30 | 9.94
Arkansas | 90 | 100.00 | 100.00 | 2,200 | 0.98 | 91.18 | 19.69
California | 180 | 100.00 | 100.00 | 4,400 | 2.16 | 88.05 | 8.73
Colorado | 120 | 96.76 | 95.47 | 2,800 | 1.36 | 86.46 | 10.41
Connecticut | 90 | 98.77 | 97.80 | 2,000 | 1.69 | 87.84 | 17.79
Delaware | 50 | 100.00 | 100.00 | 2,100 | 1.90 | 87.27 | 17.75
District of Columbia | 80 | 100.00 | 100.00 | 2,000 | 2.74 | 82.67 | 25.65
Florida | 210 | 100.00 | 100.00 | 5,400 | 2.56 | 89.47 | 22.03
Georgia | 110 | 100.00 | 100.00 | 3,100 | 1.75 | 90.25 | 16.63
Hawaii | 50 | 100.00 | 100.00 | 2,200 | 1.99 | 85.29 | 4.37
Idaho | 90 | 100.00 | 100.00 | 2,200 | 1.22 | 90.48 | 11.37
Illinois | 140 | 100.00 | 100.00 | 3,300 | 1.12 | 87.95 | 16.25
Indiana | 90 | 98.83 | 99.41 | 2,000 | 0.73 | 90.54 | 18.17
Iowa | 90 | 100.00 | 100.00 | 2,100 | 1.48 | 90.13 | 15.07
Kansas | 90 | 100.00 | 100.00 | 2,200 | 1.32 | 91.03 | 10.74
Kentucky | 110 | 100.00 | 100.00 | 2,800 | 2.32 | 89.43 | 14.25
Louisiana | 90 | 100.00 | 100.00 | 2,100 | 2.28 | 89.70 | 20.14
Maine | 90 | 97.56 | 96.39 | 2,100 | 0.99 | 86.84 | 18.58
Maryland | 130 | 100.00 | 100.00 | 3,000 | 1.82 | 89.06 | 18.06
Massachusetts | 130 | 100.00 | 100.00 | 3,000 | 2.62 | 87.89 | 17.83
Michigan | 130 | 100.00 | 100.00 | 3,000 | 1.80 | 86.82 | 10.49
Minnesota | 100 | 98.62 | 99.16 | 1,900 | 2.08 | 85.74 | 8.79
Mississippi | 90 | 100.00 | 100.00 | 2,200 | 0.84 | 90.09 | 12.25
Missouri | 100 | 100.00 | 100.00 | 2,200 | 0.99 | 91.73 | 11.54
Montana | 100 | 99.95 | 98.06 | 2,100 | 1.17 | 85.81 | 12.23
Nebraska | 100 | 100.00 | 100.00 | 2,200 | 1.76 | 92.46 | 12.69
Nevada | 90 | 100.00 | 100.00 | 2,400 | 1.10 | 87.92 | 6.89
New Hampshire | 80 | 98.97 | 98.63 | 2,100 | 1.48 | 82.00 | 12.45
New Jersey | 90 | 98.85 | 99.47 | 2,100 | 1.61 | 91.20 | 19.77
New Mexico | 100 | 100.00 | 100.00 | 2,700 | 1.67 | 88.36 | 14.40
New York | 130 | 97.65 | 97.92 | 2,900 | 1.65 | 81.09 | 20.63
North Carolina | 140 | 100.00 | 100.00 | 3,900 | 1.14 | 90.32 | 13.01
North Dakota | 90 | 100.00 | 100.00 | 2,100 | 1.38 | 88.51 | 12.28
Ohio | 140 | 100.00 | 100.00 | 2,900 | 1.11 | 89.59 | 16.08
Oklahoma | 90 | 100.00 | 100.00 | 2,100 | 1.61 | 92.11 | 14.99
Oregon | 90 | 100.00 | 100.00 | 2,100 | 1.51 | 84.92 | 10.91
Pennsylvania | 120 | 99.42 | 99.75 | 2,900 | 1.35 | 89.04 | 16.94
Rhode Island | 60 | 100.00 | 100.00 | 2,100 | 2.22 | 90.48 | 16.63
South Carolina | 90 | 100.00 | 100.00 | 2,100 | 1.47 | 91.36 | 10.12
South Dakota | 100 | 98.95 | 98.85 | 2,200 | 1.62 | 91.16 | 6.87
Tennessee | 120 | 97.51 | 96.30 | 3,000 | 2.30 | 90.98 | 12.11
Texas | 220 | 100.00 | 100.00 | 6,600 | 1.56 | 89.72 | 15.94
Utah | 90 | 100.00 | 100.00 | 2,200 | 1.67 | 87.70 | 12.50
Vermont | 90 | 100.00 | 100.00 | 2,200 | 1.72 | 87.05 | 16.37
Virginia | 90 | 98.75 | 99.52 | 2,100 | 1.70 | 88.42 | 12.39
Washington | 90 | 100.00 | 100.00 | 2,200 | 1.37 | 86.92 | 10.88
West Virginia | 90 | 100.00 | 100.00 | 2,200 | 1.52 | 90.99 | 11.68
Wisconsin | 130 | 100.00 | 100.00 | 3,100 | 1.26 | 87.83 | 12.25
Wyoming | 70 | 100.00 | 100.00 | 2,200 | 1.41 | 87.33 | 11.99
Trial Urban (TUDA) Districts
Albuquerque | 30 | 100.00 | 100.00 | 1,100 | 1.92 | 85.50 | 15.81
Atlanta | 30 | 100.00 | 100.00 | 1,000 | 0.90 | 90.39 | 19.32
Austin | 20 | 100.00 | 100.00 | 1,100 | 1.94 | 84.80 | 25.07
Baltimore City | 50 | 100.00 | 100.00 | 1,000 | 2.86 | 90.04 | 21.19
Boston | 50 | 100.00 | 100.00 | 1,000 | 5.94 | 88.86 | 19.68
Charlotte-Mecklenburg | 30 | 100.00 | 100.00 | 1,100 | 3.11 | 89.89 | 11.80
Chicago | 70 | 100.00 | 100.00 | 1,600 | 0.95 | 88.15 | 25.00
Clark County (NV) | 50 | 100.00 | 100.00 | 1,600 | 1.29 | 85.93 | 7.37
Cleveland | 50 | 100.00 | 100.00 | 900 | 3.82 | 87.12 | 21.92
Dallas | 40 | 100.00 | 100.00 | 1,100 | 2.21 | 91.02 | 22.29
Denver | 40 | 92.34 | 98.48 | 1,000 | 2.70 | 87.68 | 13.03
Detroit | 40 | 100.00 | 100.00 | 1,000 | 5.02 | 88.79 | 11.55
Duval County (FL) | 40 | 100.00 | 100.00 | 1,100 | 1.53 | 91.37 | 21.43
Fort Worth | 30 | 100.00 | 100.00 | 1,000 | 2.24 | 91.81 | 14.09
Guilford County (NC) | 30 | 100.00 | 100.00 | 1,000 | 1.77 | 89.25 | 14.89
Hillsborough County (FL) | 40 | 100.00 | 100.00 | 1,100 | 2.25 | 90.53 | 21.08
Houston | 40 | 100.00 | 100.00 | 1,500 | 3.03 | 88.66 | 15.70
Jefferson County (KY) | 30 | 100.00 | 100.00 | 1,100 | 1.46 | 91.05 | 15.00
Los Angeles | 60 | 100.00 | 100.00 | 1,600 | 2.33 | 88.71 | 9.72
Miami | 70 | 100.00 | 100.00 | 1,600 | 3.57 | 91.25 | 18.20
Milwaukee | 40 | 100.00 | 100.00 | 1,100 | 2.12 | 80.38 | 23.55
New York City | 70 | 96.75 | 95.00 | 1,600 | 0.69 | 83.59 | 25.12
Philadelphia | 40 | 91.14 | 98.36 | 1,000 | 4.47 | 86.81 | 20.70
San Diego | 40 | 100.00 | 100.00 | 1,000 | 1.80 | 85.52 | 12.45
Shelby County (TN) | 40 | 100.00 | 100.00 | 1,100 | 2.77 | 90.34 | 9.39
District of Columbia (DCPS) | 30 | 100.00 | 100.00 | 1,000 | 3.94 | 81.92 | 27.63
National private | 380 | 35.49 | 32.62 | 1,600 | 0.00 | 93.97 | 4.84
Catholic | 110 | 60.98 | 65.94 | 1,000 | 0.00 | 94.09 | 6.83
Non-Catholic | 270 | 19.80 | 20.04 | 600 | 0.00 | 93.74 | 3.61
Other jurisdictions
DoDEA2 | 60 | 94.14 | 86.27 | 2,100 | 1.12 | 89.55 | 13.29
Puerto Rico | 150 | 100.00 | 100.00 | 3,600 | 0.06 | 91.07 | 30.04

1 Includes national public and national private schools located in the United States and all Department of Defense Education Activity schools, but not schools in Puerto Rico.
2 Department of Defense Education Activity schools.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Mathematics Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_8_mathematics_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 8 Reading

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 8 reading assessment by school type and jurisdiction. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the jurisdiction that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 8 reading combined national and state assessment, by school type and jurisdiction: 2022

School type and jurisdiction | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 5,730 | 94.67 | 74.35 | 135,200 | 1.75 | 89.02 | 13.19
National all1 | 5,730 | 94.67 | 74.35 | 135,200 | 1.75 | 89.02 | 13.19
Northeast all | 950 | 91.21 | 62.39 | 21,700 | 1.95 | 86.87 | 17.51
Midwest all | 1,370 | 95.37 | 77.07 | 29,800 | 1.13 | 89.30 | 11.91
South all | 2,000 | 95.51 | 75.16 | 50,100 | 1.95 | 90.39 | 14.72
West all | 1,350 | 94.96 | 80.42 | 32,000 | 1.86 | 87.84 | 8.90
National public | 5,280 | 99.61 | 99.50 | 131,400 | 1.89 | 88.82 | 13.86
Alabama | 90 | 100.00 | 100.00 | 2,100 | 0.98 | 92.41 | 9.08
Alaska | 110 | 98.71 | 93.92 | 2,100 | 0.48 | 82.03 | 15.08
Arizona | 90 | 100.00 | 100.00 | 2,100 | 1.77 | 89.78 | 9.41
Arkansas | 90 | 100.00 | 100.00 | 2,200 | 1.63 | 91.43 | 18.33
California | 180 | 100.00 | 100.00 | 4,400 | 2.48 | 88.09 | 7.74
Colorado | 120 | 96.76 | 95.47 | 2,800 | 1.95 | 86.94 | 10.67
Connecticut | 90 | 98.77 | 97.80 | 2,100 | 1.77 | 88.43 | 16.40
Delaware | 50 | 100.00 | 100.00 | 2,100 | 1.49 | 87.58 | 18.91
District of Columbia | 80 | 100.00 | 100.00 | 2,000 | 3.07 | 83.75 | 24.58
Florida | 210 | 100.00 | 100.00 | 5,400 | 2.34 | 87.36 | 22.00
Georgia | 110 | 100.00 | 100.00 | 3,100 | 1.97 | 92.79 | 16.63
Hawaii | 50 | 100.00 | 100.00 | 2,200 | 1.56 | 83.42 | 4.96
Idaho | 90 | 100.00 | 100.00 | 2,200 | 1.83 | 91.03 | 10.68
Illinois | 140 | 100.00 | 100.00 | 3,300 | 1.28 | 88.38 | 15.29
Indiana | 90 | 98.83 | 99.41 | 2,000 | 0.51 | 90.34 | 16.35
Iowa | 90 | 100.00 | 100.00 | 2,100 | 1.18 | 90.16 | 15.87
Kansas | 90 | 100.00 | 100.00 | 2,200 | 1.37 | 92.93 | 9.85
Kentucky | 110 | 100.00 | 100.00 | 2,800 | 2.05 | 91.10 | 14.91
Louisiana | 90 | 100.00 | 100.00 | 2,100 | 2.85 | 89.24 | 18.44
Maine | 90 | 97.56 | 96.39 | 2,100 | 1.32 | 89.52 | 18.39
Maryland | 130 | 100.00 | 100.00 | 3,000 | 1.87 | 90.36 | 18.17
Massachusetts | 130 | 100.00 | 100.00 | 3,000 | 2.80 | 88.83 | 16.89
Michigan | 130 | 100.00 | 100.00 | 3,000 | 1.54 | 86.24 | 11.11
Minnesota | 100 | 98.62 | 99.16 | 1,900 | 1.95 | 84.69 | 8.52
Mississippi | 90 | 100.00 | 100.00 | 2,200 | 0.68 | 91.75 | 12.96
Missouri | 100 | 100.00 | 100.00 | 2,200 | 1.20 | 92.49 | 10.61
Montana | 100 | 99.95 | 98.06 | 2,100 | 0.84 | 87.24 | 11.73
Nebraska | 100 | 100.00 | 100.00 | 2,200 | 1.43 | 92.69 | 11.41
Nevada | 90 | 100.00 | 100.00 | 2,400 | 1.13 | 88.06 | 5.09
New Hampshire | 80 | 98.97 | 98.63 | 2,100 | 1.04 | 84.57 | 12.87
New Jersey | 90 | 98.85 | 99.47 | 2,100 | 2.24 | 89.50 | 19.35
New Mexico | 100 | 100.00 | 100.00 | 2,700 | 1.69 | 87.17 | 12.68
New York | 130 | 97.65 | 97.92 | 2,900 | 2.21 | 81.82 | 20.55
North Carolina | 140 | 100.00 | 100.00 | 4,000 | 1.90 | 89.15 | 12.43
North Dakota | 90 | 100.00 | 100.00 | 2,100 | 1.53 | 88.82 | 11.96
Ohio | 140 | 100.00 | 100.00 | 2,900 | 1.39 | 89.39 | 15.95
Oklahoma | 90 | 100.00 | 100.00 | 2,100 | 2.35 | 92.72 | 12.65
Oregon | 90 | 100.00 | 100.00 | 2,100 | 0.88 | 85.28 | 10.92
Pennsylvania | 120 | 99.42 | 99.75 | 2,900 | 1.72 | 89.13 | 17.47
Rhode Island | 60 | 100.00 | 100.00 | 2,100 | 1.69 | 89.57 | 17.35
South Carolina | 90 | 100.00 | 100.00 | 2,100 | 1.35 | 92.35 | 11.20
South Dakota | 100 | 98.95 | 98.85 | 2,200 | 1.75 | 91.64 | 5.95
Tennessee | 120 | 97.51 | 96.30 | 2,900 | 2.69 | 89.14 | 11.95
Texas | 220 | 100.00 | 100.00 | 6,600 | 2.19 | 90.81 | 14.66
Utah | 90 | 100.00 | 100.00 | 2,200 | 1.31 | 87.55 | 12.96
Vermont | 90 | 100.00 | 100.00 | 2,200 | 1.67 | 86.98 | 16.15
Virginia | 90 | 98.75 | 99.52 | 2,100 | 2.45 | 89.02 | 10.22
Washington | 90 | 100.00 | 100.00 | 2,200 | 1.54 | 85.49 | 9.42
West Virginia | 90 | 100.00 | 100.00 | 2,200 | 1.73 | 90.88 | 10.09
Wisconsin | 130 | 100.00 | 100.00 | 3,100 | 0.82 | 88.02 | 12.48
Wyoming | 70 | 100.00 | 100.00 | 2,200 | 1.68 | 87.19 | 12.90
Trial Urban (TUDA) Districts
Albuquerque | 30 | 100.00 | 100.00 | 1,100 | 1.10 | 86.92 | 14.47
Atlanta | 30 | 100.00 | 100.00 | 1,000 | 2.75 | 90.57 | 17.20
Austin | 20 | 100.00 | 100.00 | 1,100 | 2.02 | 86.68 | 24.02
Baltimore City | 50 | 100.00 | 100.00 | 1,000 | 2.85 | 90.61 | 21.72
Boston | 50 | 100.00 | 100.00 | 1,000 | 5.72 | 87.12 | 17.28
Charlotte-Mecklenburg | 30 | 100.00 | 100.00 | 1,100 | 2.68 | 89.31 | 10.90
Chicago | 70 | 100.00 | 100.00 | 1,600 | 1.51 | 88.67 | 22.36
Clark County (NV) | 50 | 100.00 | 100.00 | 1,600 | 1.24 | 86.43 | 5.57
Cleveland | 50 | 100.00 | 100.00 | 900 | 3.91 | 89.60 | 23.11
Dallas | 40 | 100.00 | 100.00 | 1,100 | 3.12 | 93.27 | 22.83
Denver | 40 | 92.34 | 98.48 | 1,000 | 2.59 | 88.48 | 13.41
Detroit | 40 | 100.00 | 100.00 | 1,000 | 5.10 | 87.61 | 11.23
Duval County (FL) | 40 | 100.00 | 100.00 | 1,100 | 1.77 | 92.24 | 20.80
Fort Worth | 30 | 100.00 | 100.00 | 1,000 | 1.41 | 91.91 | 14.01
Guilford County (NC) | 30 | 100.00 | 100.00 | 1,000 | 1.02 | 89.39 | 15.27
Hillsborough County (FL) | 40 | 100.00 | 100.00 | 1,100 | 3.18 | 88.96 | 20.16
Houston | 40 | 100.00 | 100.00 | 1,500 | 3.65 | 89.49 | 14.45
Jefferson County (KY) | 30 | 100.00 | 100.00 | 1,100 | 1.90 | 91.78 | 16.16
Los Angeles | 60 | 100.00 | 100.00 | 1,600 | 2.42 | 89.78 | 9.27
Miami | 70 | 100.00 | 100.00 | 1,600 | 3.34 | 90.11 | 18.85
Milwaukee | 40 | 100.00 | 100.00 | 1,100 | 1.22 | 83.36 | 25.27
New York City | 70 | 96.75 | 95.00 | 1,600 | 1.12 | 84.41 | 25.99
Philadelphia | 40 | 91.14 | 98.36 | 1,000 | 5.13 | 88.26 | 19.72
San Diego | 40 | 100.00 | 100.00 | 1,000 | 2.26 | 88.48 | 9.35
Shelby County (TN) | 40 | 100.00 | 100.00 | 1,100 | 2.63 | 89.30 | 9.79
District of Columbia (DCPS) | 30 | 100.00 | 100.00 | 1,000 | 4.32 | 83.17 | 26.65
National private | 380 | 35.49 | 32.62 | 1,600 | 0.14 | 94.86 | 5.36
Catholic | 110 | 60.98 | 65.94 | 1,000 | 0.00 | 95.42 | 7.43
Non-Catholic | 270 | 19.80 | 20.04 | 600 | 0.22 | 93.72 | 4.07
Other jurisdictions
DoDEA2 | 60 | 94.14 | 86.27 | 2,100 | 1.78 | 89.68 | 12.42

1 Includes national public and national private schools located in the United States and all Department of Defense Education Activity schools.
2 Department of Defense Education Activity schools.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 Reading Assessment.






http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_8_reading_for_the_2022_assessment.aspx




NAEP Technical Documentation Participation, Exclusion, and Accommodation Rates for Grade 8 U.S. History

The following table displays the school- and student-level response, exclusion, and accommodation rates for the grade 8 U.S. history assessment. Various weights were used in the calculation of the rates, as indicated in the column headings of the table.

The participation rates reflect the participation of the original sample schools only and do not reflect any effect of substitution. The rates weighted by the base weight and enrollment show the approximate proportion of the student population in the school type and geographic region that is represented by the responding schools in the sample. The rates weighted by just the base weight show the proportion of the school population that is represented by the responding schools in the sample. These rates differ because schools differ in size.


Participation, exclusion, and accommodation rates, grade 8 U.S. history national assessment, by school type and geographic region: 2022

School type and geographic region | Number of schools in original sample, rounded | School participation rate (percent) before substitution, weighted by base weight and enrollment | School participation rate (percent) before substitution, weighted by base weight only | Number of students sampled, rounded | Weighted percent of students excluded | Weighted student participation rate (percent) after makeups | Weighted percent of students accommodated
All | 570 | 86.62 | 69.97 | 9,600 | 1.66 | 89.73 | 12.98
National all1 | 570 | 86.62 | 69.97 | 9,600 | 1.66 | 89.73 | 12.98
Northeast all | 90 | 82.11 | 60.26 | 1,300 | 1.75 | 88.70 | 16.95
Midwest all | 110 | 87.34 | 71.53 | 1,900 | 1.20 | 90.95 | 11.47
South all | 230 | 91.46 | 73.39 | 4,300 | 1.80 | 89.71 | 15.50
West all | 140 | 80.38 | 69.74 | 2,200 | 1.76 | 89.20 | 7.50
National public | 400 | 91.00 | 91.88 | 9,000 | 1.80 | 89.58 | 13.80
National private | 170 | 33.59 | 33.24 | 600 | 0.00 | 93.57 | 3.40
Catholic | 40 | 61.74 | 74.36 | 400 | 0.00 | 94.26 | 5.65
Non-Catholic | 130 | 15.03 | 17.71 | 200 | 0.00 | 91.51 | 1.92

1 Includes national public, national private, and Department of Defense Education Activity schools located in the United States.
NOTE: Numbers of schools are rounded to nearest ten, and numbers of students are rounded to nearest hundred. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2022 U.S. History Assessment.




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/participation_exclusion_and_accommodation_rates_for_grade_8_u_s_history_for_the_2022_assessment.aspx




NAEP Technical Documentation Nonresponse Bias Analyses

NCES statistical standards call for a nonresponse bias analysis for NAEP when response rates at the school or student level fall below 85 percent. To meet this requirement, separate nonresponse bias analysis (NRBA) reports were written in 2022 for each of the following NAEP samples: mathematics and reading at grades 4 and 8, civics and U.S. history at grade 8, mathematics and reading at age 9, and mathematics and reading at age 13. In addition to these reports, due to special interest in Catholic schools, a separate NRBA was conducted for this subgroup for the mathematics and reading sample at grades 4 and 8.

For the 2022 mathematics and reading assessments at grades 4 and 8, school-level response rates for private schools fell below the 85 percent threshold at both grades, while the response rates for all public school domains were above 85 percent at both grades. At the student level, response rates at grade 8 fell below 85 percent for at least one subject for the following state domains: Alaska, District of Columbia, Hawaii, New Hampshire, and New York; and the following TUDA domains: District of Columbia Public Schools, New York City, and Milwaukee. However, at grade 4, response rates for all domains were above 85 percent.

For the 2022 civics and U.S. history assessments at grade 8, response rates for private schools fell below the 85 percent threshold. At the student level, response rates for all reporting groups in this sample were above 85 percent. Similarly, for the 2022 mathematics and reading assessments at ages 9 and 13, only response rates for private schools fell below 85 percent. Response rates for students across all reporting groups in these samples exceeded the 85 percent threshold.

The procedures and results from these analyses are summarized briefly below. The analyses consider only certain characteristics of schools and students; they do not directly consider the effects of the nonresponse on student achievement, the primary focus of NAEP. Thus, these analyses cannot conclusively establish either the presence or the absence of nonresponse bias in student achievement results. For more details on these analyses, please see the full reports listed below:

NAEP 2022 NRBA Report for Math and Reading at Grades 4 and 8

NAEP 2022 NRBA Report for Civics and U.S. History at Grade 8

NAEP 2022 NRBA Report for LTT at Age 9

NAEP 2022 NRBA Report for LTT at Age 13

NAEP 2022 NRBA Report for Math and Reading at Grades 4 and 8 for Catholic Schools

School-level Nonresponse Bias Analyses

Each school-level analysis is typically conducted in three parts. The first part of the analysis looks for potential nonresponse bias introduced through school nonresponse. The second part examines the remaining potential for nonresponse bias after accounting for the effects of substitution. The third part examines the remaining potential for nonresponse bias after accounting for the effects of both school substitution and school-level nonresponse weight adjustments. The characteristics examined were census region, private school reporting group (Catholic/non-Catholic), urban-centric locale, school grade size category, and race/ethnicity percentages. In addition, two measures of the mean size of enrollment in the respective grades were considered: mean grade enrollment size (the mean size of school attended by an average student), which is estimated using the enrollment-size-adjusted school weight; and mean estimated grade enrollment, which is estimated using the school weight without the enrollment size adjustment. For each of the three samples, the NRBA results are summarized below, following a sketch of how bias in a school characteristic can be quantified at each stage.
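As a rough sketch of how such comparisons are typically quantified (the notation here is illustrative and is not taken from the NAEP reports), the estimated bias in a school characteristic y at a given stage is the difference between the weighted estimate based on the participating schools and the weighted estimate based on the full eligible sample, with relative bias expressed as a percentage of the full-sample estimate:

\[
\bar{y}_{S} = \frac{\sum_{i \in S} w_i^{\mathrm{base}}\, y_i}{\sum_{i \in S} w_i^{\mathrm{base}}},
\qquad
\bar{y}_{R} = \frac{\sum_{i \in R} w_i\, y_i}{\sum_{i \in R} w_i},
\]
\[
\widehat{B}(\bar{y}_{R}) = \bar{y}_{R} - \bar{y}_{S},
\qquad
\widehat{\mathrm{RB}}(\bar{y}_{R}) = 100 \cdot \frac{\bar{y}_{R} - \bar{y}_{S}}{\bar{y}_{S}},
\]

where S is the full eligible school sample with base weights, and R and the weights w_i correspond to the stage being evaluated: original participating schools with base weights (part one), participants plus substitutes with base weights (part two), or participants with nonresponse-adjusted weights (part three). Statistical tests then assess whether the estimated bias differs significantly from zero for each characteristic.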

Mathematics and Reading for Private Schools at Grades 4 and 8

The NRBA showed that substitution and nonresponse adjustments decreased the number of variables with significant differences. As in prior years, nonresponse adjustments reduced nonresponse bias in each sample: the key variable "private school reporting group" became nonsignificant in each sample after substitution and nonresponse adjustments. The biases for some other variables, however, remained significant, or became newly significant, after nonresponse adjustments.

For grade 4, the results for census region and mean grade enrollment averaged across students remained significant after substitution and nonresponse adjustments.

For grade 8, the results for census region, school size class, mean grade enrollment averaged across students, and percent Black (non-Hispanic) still showed significant bias after nonresponse adjustments.

These results suggest that, even after making nonresponse adjustments, there is possibly significant nonresponse bias in the NAEP achievement results for private schools because non-trivial statistically significant differences remain between the responding and original samples for census region and mean grade enrollment averaged across students for grades 4 and 8, and for school size class and percentage Black (non-Hispanic) for grade 8.

Compared with the 2019 NAEP assessment, private school response rates for NAEP 2022 were approximately 14 to 15 percentage points lower for each grade.

Civics and U.S. History at Grade 8

The NRBA demonstrated that, for private schools, substitution had little effect on reducing nonresponse bias. In contrast, after the nonresponse adjustments, neither Catholic nor non-Catholic schools showed significant nonresponse bias. Still, significant bias remained for school size and mean grade enrollment. These two remaining biases may be explained as follows. School size is not one of the variables used to adjust for school nonresponse; thus, using the nonresponse-adjusted weights would not be expected to reduce bias for school size. The increase in bias for mean grade enrollment averaged across students could be because the nonresponse adjustments removed substantial bias from other groups, such as Catholic/non-Catholic, which limited the ability to fully adjust for other school characteristics.

These results suggest that, even after nonresponse adjustments, there is possibly significant nonresponse bias in the NAEP achievement results for private schools because non-trivial statistically significant differences remain between the responding and original samples for school size and mean grade enrollment averaged across students.

Mathematics and Reading at Ages 9 and 13

As expected, because very few substitute schools participated at either age, substitution had little effect on reducing nonresponse bias for private schools. Nonresponse adjustments were more effective: for both ages, the adjustments decreased the number of characteristics with significant biases and removed the significant bias for Catholic and non-Catholic schools. For age 9, however, the nonresponse adjustments did not eliminate significant bias across all characteristics of the sample: although the bias decreased for the Midwest and South census regions, it increased for the Northeast and West regions and remained significant for census region overall. For age 13, the nonresponse adjustments eliminated the significant bias across the characteristics that had exhibited bias after substitution, but significant bias was introduced for mean enrollment averaged across students. Private school samples are small, which could explain these increases for both ages. The bias may also reflect a trade-off: nonresponse adjustments made some important variables less biased at the cost of increased bias for other variables.

These results suggest that, even after nonresponse adjustments, there is possibly significant nonresponse bias in the NAEP achievement results for private schools because non-trivial statistically significant differences remain between the responding and original samples for census region at age 9 and mean enrollment averaged across students at age 13.

Mathematics and Reading at Grades 4 and 8 for Catholic Schools

For grade 4 Catholic schools, nonresponse adjustment and substitution reduced the absolute bias for census region to 0 percent, since census region is explicitly used to form nonresponse adjustment cells. Based on the results of the nonresponse bias analysis, there is no evidence that the responding Catholic school sample is biased relative to the original eligible Catholic school sample. For grade 8 Catholic schools, the absolute bias for percent Black increased after nonresponse adjustment and substitution. Because the nonresponse adjustments removed substantial bias from other groups, including census region, the ability to fully adjust for other school characteristics was very limited. After nonresponse adjustment and substitution, a new significant characteristic, school size class, was introduced. Because school size is not one of the variables used in the nonresponse weighting adjustment, use of the nonresponse-adjusted weights may not reduce bias for school size categories.

These results suggest that, even after nonresponse adjustments, there is possibly significant nonresponse bias in the NAEP achievement results for Catholic schools because non-trivial statistically significant differences remain between the responding and original samples for school size class and percentage Black (Non-Hispanic) for grade 8.

Student-level Nonresponse Bias Analyses

For the 2022 mathematics and reading assessments at grades 4 and 8, student-level response rates fell below the critical 85 percent threshold for fourteen reporting domain and subject combinations, all at grade 8: New York, New York City TUDA, Alaska, District of Columbia, District of Columbia Public Schools (TUDA), and Milwaukee TUDA in both mathematics and reading; Hawaii in reading only; and New Hampshire in mathematics only. After student nonresponse adjustments, there is no evidence of substantial bias in these jurisdictions as a result of student nonresponse.

Each student-level analysis was conducted in two parts. The first part of the analysis examined the potential for nonresponse bias that was introduced through student nonresponse. The second part examined the potential for bias after accounting for the effects of nonresponse weighting adjustments. The characteristics examined were gender, race/ethnicity, relative age, National School Lunch Program eligibility, student disability (SD) status, and English learner (EL) status.
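As a rough illustration of these two parts (a minimal sketch under assumed variable names, not the code used to produce the NAEP reports), the comparison amounts to computing the weighted percentage distribution of each characteristic for the full eligible student sample using base weights, and for the responding students using base weights (part one) and then nonresponse-adjusted weights (part two); the differences are the estimated biases before and after adjustment:

```python
# Illustrative sketch only (not NAEP production code): compare the weighted
# percentage distribution of a student characteristic in the full eligible
# sample with the distribution among responding students, before and after
# the student nonresponse weighting adjustment. All column names
# ("base_wt", "nr_adj_wt", "responded", "race_ethnicity") are hypothetical.
import pandas as pd


def weighted_distribution(df: pd.DataFrame, characteristic: str, weight: str) -> pd.Series:
    """Weighted percentage distribution of a categorical characteristic."""
    totals = df.groupby(characteristic)[weight].sum()
    return 100 * totals / totals.sum()


def estimated_bias(students: pd.DataFrame, characteristic: str) -> pd.DataFrame:
    """Estimated bias (percentage points) per category, before and after adjustment."""
    full_sample = weighted_distribution(students, characteristic, "base_wt")
    respondents = students[students["responded"]]
    before_adj = weighted_distribution(respondents, characteristic, "base_wt")
    after_adj = weighted_distribution(respondents, characteristic, "nr_adj_wt")
    return pd.DataFrame({
        "full_sample_pct": full_sample,
        "respondent_pct_base_wt": before_adj,
        "bias_before_adj": before_adj - full_sample,
        "respondent_pct_adj_wt": after_adj,
        "bias_after_adj": after_adj - full_sample,
    })


if __name__ == "__main__":
    # Toy data for illustration only.
    students = pd.DataFrame({
        "race_ethnicity": ["White", "Black", "Hispanic", "White", "Black", "Hispanic"],
        "responded": [True, True, False, True, False, True],
        "base_wt": [120.0, 95.0, 110.0, 130.0, 90.0, 105.0],
        "nr_adj_wt": [150.0, 118.0, 0.0, 160.0, 0.0, 140.0],
    })
    print(estimated_bias(students, "race_ethnicity").round(2))
```

The statistical significance of the estimated biases would then be evaluated with methods that account for the complex sample design (for example, using replicate weights), which this sketch does not attempt.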




http://nces.ed.gov/nationsreportcard/tdw/weighting/2022/nonresponse_bias_analyses_for_the_2022_assessment.aspx


