This post is the third in our series (see earlier posts June 27 and Sept. 21) on the progress that the Minnesota Department of Education (MDE) has made in developing the state’s new accountability plan, as required under the year-old federal Every Student Succeeds Act (ESSA).
In the three months since our last post, the two ESSA Accountability Committees—the Technical Committee, composed of 16 experts in data and measurement, and the Advisory Committee, composed of 70 school and community stakeholders—have continued to meet and develop recommendations for a new state accountability plan to submit to Commissioner Cassellius.
As described below, the two topics that have dominated these meetings are the new “fifth indicator” of school quality and student success, and the weightings of the five indicators in a state’s overall score.
(Four other committees—Assessment, Educator Quality, English Learner, and School Improvement—have also continued to meet to advise the Commissioner on implementing other components of ESSA, but the work of those committees is beyond the scope of this post.)
Final Regulations Mean Possible Significant Changes to the Implementation Timeline
On November 29th, the US Department of Education (USDE) published the final ESSA regulations, which extended the timeline for state plan submission and implementation. At the December 6th meeting, the Advisory Committee voted by a large majority to recommend to Commissioner Cassellius that the State submit the state plan to USDE on September 18, 2017 instead of March 6, 2017, and that the plan be implemented during the 2018-19 academic year instead of the 2017-18 academic year. Given this recently extended timeline, some of the recommendations the committees previously made to the Commissioner have been reopened for further discussion.
The School Quality/Student Success (SQ/SS) “Fifth Indicator”
As we’ve previously written, one of the most substantial changes from NCLB to ESSA was the requirement that states include a new “fifth indicator” of school quality and student success (SQ/SS) in their accountability plans. Due to issues surrounding the current lack of data and the lag time associated with collecting new types of data, the field of possible measures Minnesota could use for its SQ/SS indicator is quite small. This fall, two measures emerged from Advisory Committee discussions: “chronic absenteeism” and “college and career readiness”.
Chronic Absenteeism Measure: At the Technical Committee meeting on November 14th, the members voted to recommend chronic absenteeism as an SQ/SS measure, and to use 90 percent attendance as the benchmark. However, with the release of the final regulations, the Commissioner indicated at the December 6th Advisory Committee meeting that she thought the Committee could come up with something more comprehensive than chronic absenteeism. “We need something fundamentally different that creates better opportunity for all kids,” she said.
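To make the benchmark concrete: a 90 percent attendance threshold typically flags a student as chronically absent when attendance falls below that share of enrolled days. A minimal sketch (the day counts and the below-threshold interpretation are illustrative assumptions, not MDE specifications):

```python
# Sketch only: assumes "chronically absent" means attendance below the
# 90 percent benchmark the Technical Committee recommended.
def is_chronically_absent(days_attended, days_enrolled, benchmark=0.90):
    """Return True if the student's attendance rate falls below the benchmark."""
    return days_attended / days_enrolled < benchmark

print(is_chronically_absent(150, 170))  # 150/170 ≈ 0.88 → True
print(is_chronically_absent(160, 170))  # 160/170 ≈ 0.94 → False
```

A school-level measure would then report the share of its students flagged this way.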
College and Career Readiness Measure: The Advisory Committee originally envisioned defining this measure as a school’s student participation rates in Post-Secondary Enrollment Options (PSEO), Advanced Placement (AP), International Baccalaureate (IB), Career and Technical Education (CTE), and Concurrent Enrollment (CE) programs. At the October 25th Technical Committee meeting, however, MDE revealed that it had discovered limitations in linking to the AP/IB exam data system, making it impossible to include AP/IB participation. Therefore, the Committee recommended delaying inclusion of the college and career readiness measure altogether.
At the December 6th Advisory meeting, the Commissioner noted that, if districts finished their “Common Course Catalogues” (as required under 2007 and 2009 state laws, among others, which seek to standardize district course reporting), then the state could include a measure of participation in college and career readiness courses, as well as measure the number of art, music, and physical education courses a school offers its students. However, MDE acknowledged that it is very unlikely the catalogue will be completed by all districts in time to include in the plan.
Other Measures? Given the newly extended timeline, the Advisory Committee also expressed interest in reopening the discussion about building a new survey and/or using the existing Minnesota Student Survey to measure school climate and student engagement. The Advisory Committee voted to continue to meet during the first half of 2017 to discuss potential short- and long-term measures to use for the SQ/SS indicator.
Weighting the Five Required Accountability Indicators
ESSA requires that the first four indicators have “much greater weight” than the fifth SQ/SS indicator in a school’s overall score under the state accountability plan. However, in the final regulations, USDE did not provide any guidance or specific numbers about what “much greater weight” means. The final regulations did solidify that no school can be identified for improvement, or exit identification status, based only on the SQ/SS indicator.
On September 29th, the Advisory Committee indicated—based on a compilation of surveys filled out by members during the meeting—its preferred weights for the five indicators:

1. Academic Achievement (Proficiency)
2. Academic Progress (Growth)
3. Graduation Rates
4. Progress in Achieving EL Proficiency
5. School Quality/Student Success
However, on December 6th, MDE indicated that, after some trial data runs, weights of fifteen and twenty percent proved too high for the SQ/SS indicator: at those levels, the indicator caused some schools to be identified for support, or to escape identification, which the SQ/SS indicator cannot do under the final regulations. To remedy the problem, MDE assigned the indicator ten percent and redistributed the remaining five and ten percentage points to the proficiency and growth indicators. At the same meeting, the Advisory Committee indicated that it would like to revisit the weights after the measures used for the SQ/SS indicator are finalized.
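The mechanics of such a weighting scheme can be sketched as a simple weighted average. All indicator names, weight values, and the 0–100 score scale below are assumptions for illustration only; the post does not report MDE’s final weights, other than SQ/SS being set at ten percent:

```python
# Illustrative weights only; SQ/SS is capped at 10 percent per the discussion above,
# but the other four values are assumptions, not MDE's actual assignments.
WEIGHTS = {
    "proficiency": 0.30,
    "growth": 0.30,
    "graduation": 0.20,
    "el_progress": 0.10,
    "sqss": 0.10,  # the "fifth indicator"
}

def composite_score(indicator_scores):
    """Weighted average of five indicator scores, each on a 0-100 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 100 percent
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS)

example = {"proficiency": 60, "growth": 70, "graduation": 85,
           "el_progress": 50, "sqss": 90}
print(round(composite_score(example), 1))
```

The trial-run problem MDE described is visible in this framing: the larger the SQ/SS weight, the more that single indicator can move a school across an identification threshold on its own.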
Other Recommendations Coming Out of the Accountability Committees
While discussions about the SQ/SS indicator and the weights of the various measures have dominated meetings of the Advisory and Technical Committees, the two committees have made several other recommendations over the last several months:
Inclusion of Districts and All Schools: On September 15th, the Advisory Committee voted overwhelmingly to include all schools, both Title I and non-Title I, in the accountability system. On September 23rd, the Technical Committee also voted in favor of that recommendation.
High School Dropout Assignment: The Advisory Committee has recommended that students who drop out be attributed to the school where they spent the majority of their time (rather than the school they most recently attended), with a majority (69 percent) of committee members either in favor or strongly in favor.
Academic Growth: Under ESSA, public elementary and middle schools are required to measure academic growth, but high schools are not. The Advisory Committee recommended that high schools not have an indicator that measures academic growth; the rationale was that students take only one MCA test in high school, so measuring growth is not practical.
Graduation Rates: At the December 6th Advisory Committee meeting, after deliberation at several Advisory and Technical meetings, MDE informed the Committee that the graduation rate indicator will be measured using a blend of four and seven year graduation rates, with four-year graduation rates receiving a higher weight.
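The blended graduation rate MDE described is, mechanically, a weighted average of the two cohort rates. The 70/30 split below is purely an assumed example; the post notes only that the four-year rate receives the higher weight:

```python
# Assumed weights for illustration: the only constraint reported is that the
# 4-year rate is weighted more heavily than the 7-year rate (w4 > w7).
def blended_grad_rate(rate_4yr, rate_7yr, w4=0.7, w7=0.3):
    """Blend 4- and 7-year graduation rates (percentages) into one indicator value."""
    assert w4 > w7 and abs(w4 + w7 - 1.0) < 1e-9
    return w4 * rate_4yr + w7 * rate_7yr

print(round(blended_grad_rate(80.0, 90.0), 1))
```

Including the seven-year rate gives schools credit for students who take longer than four years to finish, without letting that rate dominate the indicator.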
Science Inclusion: On August 24th, over half (55 percent) of the Advisory Committee opposed or strongly opposed including science proficiency as an academic indicator, while a third of members supported or strongly supported including it. On September 8th, MDE brought this information to the Technical Committee. Some members expressed interest in reconsidering the inclusion of science, but in the end the recommendation made to the Commissioner was not to include science as a subject.
Setting Long-Term State Goals: ESSA requires that the State set long-term goals and interim measures of progress for each of the first three indicators (see table above). On October 27th, the Advisory Committee voted overwhelmingly to set statewide goals that are uniform for all student groups. The Committee also voted, by majority, to set the goals on a short timeline, with a preference for 2020, though a vocal minority preferred a longer timeline of 2023. Both of these dates align with World’s Best Workforce review cycles.
Growth Model Selection: At several of its meetings, the Technical Committee discussed potential growth models. Under ESSA, states are required to use an indicator of academic progress, such as growth, for elementary and middle school students. The Committee recommended use of a “z-score” model, as is currently in use. However, at the December 6th Advisory Committee meeting, the Commissioner expressed her interest in revisiting the possibility of using a “transition matrix” model. This gets pretty technical; read more here.
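For readers curious what a z-score growth model looks like in rough outline: one common version standardizes a student’s current score against peers who started from the same prior score, so a positive z means more growth than is typical for students with that starting point. This is a simplified sketch of the general idea, not MDE’s actual model:

```python
import statistics

def growth_z_scores(records):
    """records: list of (prior_score, current_score) pairs.
    Returns one growth z-score per student, standardized against
    the current scores of peers with the same prior score."""
    # Group current scores by prior score to form peer groups.
    by_prior = {}
    for prior, current in records:
        by_prior.setdefault(prior, []).append(current)
    z_scores = []
    for prior, current in records:
        peers = by_prior[prior]
        mean = statistics.mean(peers)
        sd = statistics.pstdev(peers) or 1.0  # guard against zero spread
        z_scores.append((current - mean) / sd)
    return z_scores

# Two students who started at 50: one grew more than the other.
print(growth_z_scores([(50, 55), (50, 45)]))  # [1.0, -1.0]
```

A transition matrix model, by contrast, tracks how many students move between achievement levels (e.g., from “partially meets” to “meets”) year over year, rather than standardizing scale scores.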
Proficiency Model Selection: Under ESSA, a state’s accountability system has to include an indicator that is based on student proficiency as measured by the state’s mathematics and reading assessments. The two main options are “proficiency rate” and “proficiency index”. A proficiency rate would be measured as the percentage of students who are either meeting or exceeding standards in each student group at a given school. A proficiency index would measure the same thing, but would also provide partial credit for students who partially met the standards. On October 18th, the Technical Committee voted in the majority to recommend using a proficiency index.
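The difference between the two options is easy to see in a small example. The four achievement levels mirror the MCA’s categories, but the 0.5 partial credit is an assumed value for illustration; the Committee’s recommendation did not fix a specific credit scheme in this post:

```python
# Assumed credit scheme: full credit for meets/exceeds, half credit for
# partially meets. The 0.5 value is illustrative, not an MDE specification.
CREDIT = {"does_not_meet": 0.0, "partially_meets": 0.5, "meets": 1.0, "exceeds": 1.0}

def proficiency_rate(levels):
    """Percent of students meeting or exceeding standards."""
    meeting = sum(1 for lv in levels if lv in ("meets", "exceeds"))
    return 100.0 * meeting / len(levels)

def proficiency_index(levels):
    """Same, but with partial credit for partially-meeting students."""
    return 100.0 * sum(CREDIT[lv] for lv in levels) / len(levels)

students = ["meets", "exceeds", "partially_meets", "partially_meets", "does_not_meet"]
print(proficiency_rate(students))   # 40.0
print(proficiency_index(students))  # 60.0
```

The index rewards schools for moving students from “does not meet” to “partially meets,” progress that a pure rate ignores entirely.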
There are some big decisions ahead about how to handle schools that do not meet ESSA’s 95 percent assessment participation rate requirement, and about the goals and targets for the five indicators discussed above. Overall, a lot is up in the air right now with the pending change in administration, and possible future rulemaking by Betsy DeVos, the potential incoming Secretary of Education.
Education Evolving will continue to follow and report on ESSA planning in the coming months. For more information about MDE’s ESSA implementation plan, visit their website.
Lars Esdal contributed to this post.
Found this useful? Sign up to receive Education Evolving blog posts by email.