By: Chris Woolard
Recently, the State Board of Education unanimously approved Ohio’s Every Student Succeeds Act (ESSA) consolidated plan application, and the Ohio Department of Education submitted it to the U.S. Department of Education, which will have four months to review the plan, comment and possibly ask for additional information. Many observers had thought that the feds would take a hands-off approach to state applications, but early indications suggest that is not the case. Based on feedback that other states have received, the federal peer review process has been technical and critical, and reviewers have been stringently interpreting the ESSA law.
It is important to point out that the ESSA application is a technical document with several prescriptive and complicated requirements. The ESSA “state consolidated plan” is the application that all states must complete in order to receive hundreds of millions of federal dollars in education support. The vast majority of this money is then sent to schools and districts with a focus on supporting disadvantaged students, funding activities such as English language arts and math supports, after-school programs, teacher professional development, additional resources for homeless students and a host of other programs. At its heart, ESSA is a bill about equity that truly embraces ensuring success for “every student.” Once the plan is approved by the U.S. Department of Education, Ohio is tasked with implementing the technical requirements of the federal law.
It has been more than 18 months since Congress passed the law, and now that Ohio has submitted its plan, it is a good time to reflect on how stakeholders played a major role in shaping Ohio’s ESSA submission.
Ohio originally planned to submit its application for the first deadline in April 2017 and hosted an extensive series of statewide stakeholder opportunities. After 10 regional meetings with 1,500 participants, 11 webinars with 3,100 participants, an online survey that received 11,200 responses and the initial posting of the draft, stakeholders asked for more time to dig into the draft. In response, Ohio delayed its submission and conducted a thorough review of the draft with the State Board of Education and major education associations.
After the first draft of the application was published, stakeholders expressed concerns about many issues that fall outside what the ESSA template specifically requires. For example, stakeholders were very clear that testing concerns were on the minds of educators. Since then, the state superintendent has convened an advisory group to make recommendations, and the General Assembly has removed the requirement for the fourth and sixth grade social studies tests. Likewise, many educators expressed concerns about the educator evaluation system. ESSA removed the requirement that teacher evaluations be linked to student growth, so how to define effective teaching is now a state decision. The Educator Standards Board was convened and made a series of recommendations to improve the evaluation system. Both the testing and evaluation system concerns were brought to the Department’s attention; however, neither was directly related to the ESSA application. Stakeholders have provided, and will continue to provide, major input on these issues, which are Ohio policies rather than ESSA policies. The work continues, even though it is not directly reflected in the ESSA application.
Outside of these larger issues, stakeholders played a major role in developing the technical details of the ESSA plan. Stakeholders don’t agree on everything, and on many topics, the Department received competing feedback on all sides of an issue. The Department’s role was to synthesize the feedback, align it with Ohio-built policies already in law and build a plan that meets the federal requirements.
There were several ESSA flexibilities that stakeholders strongly supported. For example, ESSA allows advanced eighth grade students who are taking algebra I in middle school to take the corresponding algebra I test rather than also taking the eighth grade math test, thus avoiding double testing. Ohio has been a national leader in this area based on a previous waiver, and nearly one-third of eighth graders are enrolled in algebra I. In addition, Ohio previously received an expanded waiver to allow this same flexibility with other end-of-course exams (English language arts I, biology, etc.). This represents a major reduction in the number of tests taken, and Ohio is proposing to continue this policy.
Many stakeholders have expressed concerns that school report cards focus too narrowly on state test results. While ESSA continues to have rigid requirements on using information from state tests to ensure that all students are succeeding, it does provide additional flexibilities that paint a larger picture of what is happening in schools. Ohio is proposing to use chronic absenteeism (an area where some districts are already doing great work) as the ESSA-required measure of school quality and student success, while piloting school climate surveys and other measures that may be included on future report cards when technically feasible and data are available. Many school administrators asked for the opportunity to share more about the good things happening across their districts in a structured way through the report cards. Several districts have quality profiles that describe accomplishments and other important details (see example). Ohio is addressing this feedback and, in fact, will include links on the upcoming report cards for district profiles and narratives.
Another major change in ESSA is the federal government walking away from prescriptive models on how to improve our most struggling schools. Instead, districts and schools will have much more discretion in designing local, evidence-based improvement plans based on the needs of their students. During the feedback process, stakeholders asked for more information and details on this process to ensure their ability to have local plans and produce locally driven evidence of strategies. Additionally, the Department has committed to developing a local engagement toolkit to assist schools and districts in collaborating with their communities to determine priorities for Title funds and setting goals for continuous improvement.
These are just a few examples, but throughout the ESSA template, there are areas where stakeholders directly impacted the application, including phasing in the N-size adjustment, using parent surveys to improve the report cards, focusing on connecting 21st century grants to local school improvement processes, exempting English learners from accountability measures during their first two years, exploring military readiness as a college and career readiness measure, updating and refining several report card measures (Value-Added, Gap Closing, high school indicators), and providing support for disadvantaged students to participate in advanced coursework such as Advanced Placement courses.
The Department is encouraged by the thousands of Ohioans who dedicated their time and expertise to improving our plan for supporting districts, schools and students across the Buckeye State. So…a giant THANK YOU to all the educators and stakeholders who have provided feedback in this process. The process doesn’t end here, though. The main work of ESSA occurs with the development and implementation of local improvement plans, and stakeholder engagement will be a crucial element of those local plans as well.
Chris Woolard is senior executive director for Accountability and Continuous Improvement for the Ohio Department of Education.
By: Virginia Ressa
In 2007, Hattie and Timperley found from their meta-analysis of almost 8,000 studies that feedback is nearly seven times as effective in improving student learning as reducing class size. They found that feedback is “the most powerful single modification that enhances achievement.”2
In my work in researching, planning and leading professional learning around Formative Instructional Practices (FIP), I have become a strong believer in the power of effective feedback. For the past few years, educators have been talking about the highly effective practices that John Hattie identified. He found that feedback is one of the most effective practices for accelerating student learning; but not just any feedback can have the profound impact that Hattie found — it needs to be effective feedback.1
So, what makes feedback effective? The general answer is that feedback is effective when it results in increased student learning. The more specific answer is that feedback is most effective when it is specific, timely, accurate and actionable. When feedback is missing any one of these attributes, it can be confusing and may fail to move learning forward.
In my experience as a teacher, I can recall using phrases like “Good job!” or “Great work!” to praise students and encourage them to continue their success. I used to tell students to “Check your work again” or “Try harder next time” to help them focus and correct their errors. In hindsight, I’m not sure my feedback was all that helpful. I spent a great deal of time providing written feedback on my students’ work with very good intentions, but unfortunately, I’m now seeing that my feedback likely didn’t lead to increased student learning.
When providing learners with feedback on their successes, we need to be more specific than “Good job!” Students don’t always know what they did well or how to do it again, and vague praise doesn’t challenge them to move forward in their learning or to keep improving. Instead, success feedback should identify what a student has done correctly in relation to the learning target and point the student toward the next steps in his or her learning.
“Try harder” tells a student very little about what procedural mistake may have been made or what requirement a student may have missed. It is vague, doesn’t connect the learner back to the learning target and provides little direction for what action needs to be taken next. Think about how this example of effective feedback helps move learning forward:
“Read the prompt and rubric again. Your response partially addresses the prompt, but you are missing some important facts to back up your argument.”
The teacher has pointed the student back to the learning target via the rubric, identified the problem with the response and provided a suggestion that the student can act upon. You’re probably thinking that it is going to take more time to provide such specific feedback, and you’re right, it will. However, it is time well spent because the impact on student learning can be so high.1
[Graphic: Is Your Feedback Effective? Source: Pearson & Battelle for Kids. (2012). Foundations of Formative Instructional Practices Module 3: Analyzing evidence and providing feedback. Columbus, OH: Battelle for Kids.]
Ultimately, feedback is only effective if it moves student learning forward. Take some time to reflect on your feedback practices and how students are using the feedback you provide. How could your feedback be more effective? Do you provide both success and intervention feedback that helps your students move forward in their learning?
Effective feedback is one of the core practices of FIP because of its high impact on student learning. To learn more about effective feedback, you can complete module 4 of the Foundations of FIP learning path. This is a great module for teacher-based teams to work through together.
The FIP Video Library has examples of Ohio teachers and students using feedback to improve learning. Watching how other teachers make feedback part of their daily practice and involve students in providing feedback to each other may give you some ideas to try in your classroom. Here is an example of effective feedback provided by the teacher and students, along with some self-assessment feedback; together, these move the learning forward for everyone.
1Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London, England: Routledge.
2Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
Virginia Ressa is an education program specialist at the Ohio Department of Education, where she focuses on helping schools and educators meet the needs of diverse learners through professional learning.