Only eight states currently test their students on American government or civics (usually as part of a much broader social studies test), so relatively little is known about young people’s civic knowledge, skills, behaviors, and values. Given the paucity of state data, the federal National Assessment of Educational Progress (NAEP) in Civics receives most of the attention. The fact that only about one quarter of students typically reach the “proficient” level on the NAEP Civics assessment is probably cited more often than any other statistic about civic education, and it is often used to support proposals for adding civics requirements.
Indeed, civic education deserves increased attention, and students’ knowledge may be problematic, but such interpretations of the NAEP rest on misunderstandings. Overall proficiency scores on the NAEP Civics assessment do not tell us objectively how well students perform, but the assessment is highly informative if interpreted correctly. The results can be used to learn which students perform better or worse than the norm for their grade, how students’ knowledge has changed over time, which educational practices are associated with higher scores, and how well students understand various specific topics.
A fact sheet funded by the S.D. Bechtel, Jr. Foundation and released today by CIRCLE explains how to interpret the NAEP results. Suggested citation: Peter Levine, “What the NAEP Civics Assessment Measures and How Students Perform,” CIRCLE Fact Sheet (Medford, MA: Center for Information and Research on Civic Learning and Engagement, 2013), via http://www.civicyouth.org/?p=5219.
A subsequent fact sheet will explore the relationship between teaching practices and NAEP scores. A CIRCLE survey also released today provides a more current assessment of young adults’ political knowledge.