This #AskExcelinEd series features our analysis of the nuts and bolts of the first 17 ESSA plans (16 states, plus Washington, D.C.) submitted to the U.S. Department of Education. Each week we will answer a different question about these plans to help the next 34 states learn from the strengths and weaknesses of the first round of plans.
Stay tuned for the next #AskExcelinEd series featuring innovation and next generation student learning.
Note: To date, 14 of the 17 plans have been approved by USED; the remaining 3 plans continue to evolve as those states incorporate feedback from the Department. This entry is accurate as of September 12.
ESSA State Plans
What are states using as School Quality and Student Success Indicators?
The Every Student Succeeds Act (ESSA) requires that each state meaningfully differentiate its schools based on at least the following indicators:
- Academic achievement;
- Another academic indicator (growth and/or graduation rates);
- English language proficiency (for English learners); and
- An indicator of school quality or student success.
The indicator of school quality or student success (SQ/SS) may be academic or non-academic, but it must be weighted less than the academic indicators. (For more on the weighting of academic indicators, check out this #AskExcelinEd post.) ESSA gives states significant freedom to select their own SQ/SS indicator or indicators, as long as the indicators are valid, reliable, disaggregated by subgroup and comparable statewide.
When ESSA was first passed, there was intense interest in the SQ/SS indicator. Many education experts speculated on the measures states would include in their accountability systems. Meanwhile, some organizations dedicated to the implementation of rigorous accountability systems focused on student outcomes, like ExcelinEd, were concerned that states would use the SQ/SS indicator to water down or complicate their accountability systems. Fortunately, this has largely not been the case so far.
Here are three takeaways from the first round of ESSA plans submitted to the U.S. Department of Education:
- States were restrained. We did not find the wild experimentation with indicators that some had predicted. Most likely, states concluded that new indicators, like school discipline measures or student engagement surveys, were expensive, easy to game and likely to carry unintended consequences. As a result, most states—14 of 17—selected measures of student attendance and absence as their SQ/SS indicator. Fortunately, most states are limiting the weight given to non-academic indicators like attendance to 5-10 percent or less of a school's overall rating.
- There are some interesting indicators to watch. Seven of 17 states included measures that monitor the progress of ninth graders. Under Illinois' "9th grade on-track" indicator, ninth-grade students who earn at least five full-year credits and no more than one semester F are deemed on track for graduation. Illinois cites research that supports this expectation, but the jury is still out. It will be interesting to see how this measure affects school behavior.
- States are committed to college and career readiness. Ten of 17 states selected at least one indicator that measures a student’s preparedness to attend college, join the workforce or enlist in the military. ExcelinEd supports the selection of college- and career-readiness indicators since they focus on student outcomes. However, not all such indicators are equally desirable. For example, indicators related to Advanced Placement courses should measure student performance, not simply enrollment in those courses. Prioritizing success over participation ensures schools focus on helping each student succeed in these advanced courses, rather than assigning students to classes for which they are not prepared.
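To make Illinois' on-track criterion concrete, here is a minimal sketch of the rule as described above. The function name and inputs are our own illustration, not part of the state's plan; it simply assumes each ninth grader's full-year credits earned and count of semester Fs are available:

```python
def is_on_track(full_year_credits: float, semester_f_count: int) -> bool:
    """Hypothetical check mirroring Illinois' "9th grade on-track" rule:
    at least five full-year credits and no more than one semester F."""
    return full_year_credits >= 5 and semester_f_count <= 1

# A student with 5.5 credits and one semester F is on track;
# one with 4 credits, or two semester Fs, is not.
print(is_on_track(5.5, 1))  # True
print(is_on_track(4.0, 0))  # False
print(is_on_track(6.0, 2))  # False
```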
ExcelinEd maintains that all indicators used in the accountability calculation should focus on student learning outcomes; those that do not should be reported separately. State policymakers in the process of finalizing their ESSA plans have the opportunity to prioritize student learning outcomes when selecting their school quality and student success indicators.
Have another question or need ESSA-related resources? Let us know!
Previous Posts in This Series:
- #AskExcelinEd: How Many States Are Using Summative Ratings in Their ESSA Plans?
- #AskExcelinEd: How Much Weight Do States’ School Accountability Systems Give to Academic Outcomes?
- #AskExcelinEd: How Are States Incorporating Student Growth into Their Accountability Systems?
- ESSA: Resources and Information
- ESSA: Frequently Asked Questions
- The EdFly Blog: Should Attendance, Discipline & School Safety Influence School Report Cards?
- The EdFly Blog: High School Accountability & Advanced Coursework
About the author
Liya Amelga serves as the Associate Director of K-12 Reform, supporting ExcelinEd's K-12 reform agenda with a focus on the Every Student Succeeds Act. Prior to joining the Foundation, she served as the Special Assistant to the Board of School Commissioners for Baltimore City Public Schools, where she drafted and maintained district policies and oversaw appeals and ethics complaints to the Board. A native of Orange County, California, Liya earned her B.A. in Psychology from the University of Maryland, College Park and her J.D. from the University of Maryland Carey School of Law. Liya currently resides in Maryland and is based out of ExcelinEd's D.C. office.