Originally published in Island Ad-Vantages, August 9, 2012
Understanding test scores at DISES
School did not make AYP in math, reading or attendance
by Jessica Brophy
To a parent or community member, navigating the world of standardized education testing can be tricky. There are different tests used to measure different things, given at different times of year.
At a July school board meeting, Deer Isle-Stonington Elementary School Principal Mike Benjamin shared with the school board some of the standardized testing results from the 2011-12 school year.
Unfortunately, explained Benjamin, the school's small class sizes mean that one new student in the mix, or a student having a bad day, can skew the numbers from test to test. For example, the incoming eighth-grade class has only 14 students, so one student doing better or worse represents a 7-percent swing in the results.
The New England Common Assessment Program is a paper test given by the state. This means the tests arrive in sealed boxes that need to remain under lock and key until the test is administered. It is the test given in October and used by the state to determine Adequate Yearly Progress according to federal No Child Left Behind standards.
One of the problems with the NECAP, said Benjamin, is that it tests students on knowledge from the previous year and is administered in the fall. However, the test isn’t scored until January, and schools aren’t told if they have made AYP in math or reading until June, which does not leave very much time for adjusting teaching to address weaknesses.
Benjamin said the NECAP is due to be replaced within the next few years with a test geared toward the newly implemented “Common Core” standards, now adopted by more than 40 states. The new Common Core test will be administered in spring on computers and the results will be immediate.
The elementary school did not make AYP according to the state standards. In reading, the state's benchmark was 75 percent of students at grade-level proficiency; 64 percent of DISES students were proficient. The school's proficiency rate ticked up 6 percentage points from the year before, but the state's requirement went up 9.
In math, the state's benchmark was 70 percent; at DISES, only 61 percent of students were proficient. Again, the school's proficiency rate went up, this time 5 percentage points, but the state's requirement increased 10.
“We’re not keeping pace with state expectations,” said Benjamin. The school will be put on “monitor” status, meaning the state is monitoring the school’s progress toward improving proficiency. If the school fails to make AYP next year, it would be classified as a “Continuous Improvement Priority School” and would need to file a plan for improvement with the state. Eventually, if the state feels the school is not making enough progress, the law allows drastic measures, including replacing the administration or having the state run the school, said Benjamin.
Statewide, only 30 percent of schools made AYP in 2011-12, down from 45 percent in 2010-11.
The improving math and reading results, though not up to the state’s standards, “show me that the PLC [Professional Learning Communities] and RTI [Response to Intervention] processes are working,” said Benjamin. He said he plans to spend time with teachers looking at the test’s results in order to improve instruction.
The school did not make AYP in attendance, either, missing the mark by one percentage point. Benjamin feels this is due to the large number of families that take vacations outside of school vacation weeks. Truancy laws have recently been tightened, though, said Benjamin, and the school is going to be more proactive about truancy this year.
The DIBELS test—the Dynamic Indicators of Basic Early Literacy Skills—is administered three times per year to assess student literacy skills. Results are available immediately after students take the tests, which gives staff and teachers the opportunity to figure out which students need what kinds of help.
Students take the DIBELS in the fall, in December, and in May. Students are tested on several skills and categorized as “intensive,” “strategic” or “core,” referring to the type of teaching the student needs. A student who needs intensive teaching in an area requires one-on-one or very small group instruction. Students who need strategic support are struggling with certain concepts or ideas, and may need small group work either with a teacher or a reading specialist. Core students need regular classroom instruction and are otherwise on track with reading skills and abilities.
The information that comes from the DIBELS is used in PLCs, or weekly meetings of staff and teachers, to determine what interventions students need, if any. Benjamin compared the categorization of students to triage in a hospital, where a patient is listed as critical or stable.
Students in need of extra help might get small-group work on a specific problem—understanding how silent letters work in words, for instance—and then return to core instruction.
Other students need more comprehensive help. Often these students have special needs or learning disabilities, and test below grade level. Other students in the same class may excel and test at a high school reading level.
At times, it isn’t even as simple as Student A doing well and Student B doing poorly. A student on the autism spectrum, for instance, might be academically four or five grades ahead, but in terms of social skills be a grade or two behind. This is why testing for skills needs to be coupled with teacher-staff discussions of particular student needs.
Overall, the DIBELS show students are improving over the course of the year, which Benjamin attributes to the PLCs.
The Northwest Evaluation Association exam is administered three times per year. This is the first year the school has administered the NWEAs. The benefit of the test is that teachers receive immediate feedback on student performance, along with data that can be “drilled into” to find problem areas, said Benjamin. The NWEA is skills-focused and will help the staff identify “those essential skills or standards” for children, said Benjamin.
One of the challenges, as one might expect from looking at this list of tests, is helping students avoid “test fatigue” and understand the stakes of each test. For several of the grades, performance slipped in May, despite improvement from the fall to winter tests. Benjamin said the test may have been administered too late in the year.
What the NWEA has shown, said Benjamin, is that there’s room to make the RTI process more effective, as there hasn’t been much movement from the group of students who are below grade level into grade-level performance. That is, while the test clearly identifies who needs help, and help is provided, the help is not always leading to improved scores. RTI is a process of early, in-classroom intervention to address student learning outcomes.
Benjamin is planning to build more RTI time into the master schedule and to free up more of Kimberly Thomas’s time—she is the new middle-level literacy hire—to work on RTI.
Overall, said Benjamin, the school is working on curriculum mapping and clearly identifying the skills and standards students should have when they leave the elementary school. Once standards are identified, the goal will be to clearly communicate those standards to students and parents, and to “figure out different pathways for students to get there,” said Benjamin.