2011 SAT Results Disappointing, Frustrating

Post date: Sep 09, 2011 12:52:04 PM

The Maine Department of Education has released DISHS's results from the Maine High School Assessment, which consists of the May SAT and a science augmentation test taken earlier in the year.  The report format is probably unfamiliar to many of you, so let me walk you through what you will see:

Let me be clear: these results are extremely disappointing and frustrating to us. They are not good scores, and we are no happier with them than you are. We knew that this class of students had more ground to cover than the previous two classes, and we thought that we had done a better job helping them catch up than these results show. These results are not good enough, and we know we must do better.

What is most frustrating is that we have other results showing that our students are more proficient than these scores would indicate. For example, the Spring 2011 NWEA results not only show a much higher percentage of proficient students, but also show a higher percentage of students within the group that took the SAT who were at or significantly above grade level. The relationship between NWEA and SAT scores will be explored further in PLCs this fall.

Additionally, we have compared each student's individual growth on the 10th grade PSAT, 11th grade PSAT, 11th grade SAT, and Spring 2011 NWEA[2]. From the 10th grade PSAT to the 11th grade SAT, scores increased for 62% of students in reading, 79% in math, and 66% in writing. These results clearly indicate individual student growth, just not as much or as quickly as we would like. The College Board has released a document describing average growth between the 10th grade PSAT and 11th grade PSAT nationwide. While a general comparison is difficult because there are so many dimensions along which to compare, it is fair to say that the improvement of DISHS students is nearly the same as the rest of the nation's, especially given that all eligible DISHS students took the PSAT while the national average was made up of students who self-selected to take it.

These scores are particularly disappointing given the hard work we have been doing to try to address these issues. We've increased literacy and numeracy support across the curriculum, meaning that students get basic skills support in more than just their English and math classes. We've provided much more targeted and timely intervention to students whose NWEA, PSAT, and/or SAT scores indicate that their basic skills are not proficient. Finally, we used a Focused Study block last year to provide both basic skills support to students and SAT prep for juniors.

We plan to continue all of the above in the coming school year and will also undertake additional activities to help improve student achievement in these core skills. To better understand why some students scored poorly on the SAT, and why there was such a wide discrepancy between some students' SAT and NWEA scores, teachers will conduct an item analysis[3] of the SAT results in their PLCs this fall. Teachers have already begun the second phase of curriculum mapping, in which they are aligning their curricula to the rigorous Common Core standards and unpacking units to ensure that instructional methods and assessments of student learning “match” the rigor described in the standards. Finally, our Focused Study block will become part of our daily schedule, offering a new structure within which we can flexibly group students and teachers to provide basic skills support and P/SAT prep to students who need to continue their growth.

[1] Average scaled score is basically the school average.

[2] The list has been reordered and the names deleted so that I am the only one who can link scores to student names.

[3] A new feature of our SAT reports is the ability to look at a small sample of released questions from the test as well as how each of our students responded. An item analysis can help to identify which types of questions DISHS students missed (or got correct) more often than other test takers and can also help identify which misunderstandings might have contributed to selecting an incorrect answer.