Sustainability: Out-Live Out-Last Out-Reach Poster Hall

Poster: "Honor the Wisdom of the Teacher"

This message is in reply to:
assessment - Larry Cuban

Posted by: Jane Hazen Dessecker
Posted on: May 21, 2001 at 3:25 PM
Message:
Well, I see there is an error on my poster. Under the graph of teacher observation data, there are two contradictory remarks. This one is correct: "We were pleased that 94% of the teachers had moved from a text-based program to a hands-on science program. However, only 51% had accomplished the minds-on component."

In other words, 51% were at levels 4 and 5 on the Horizon classroom observation tool. An additional 43% were at level 3: they were doing the hands-on activities in the kits but not getting to the higher levels of thinking. Together, those two groups account for the 94% who moved to hands-on instruction. In the narrative I list the distinction we made between "hands-on only" and "hands-on, minds-on." This distinction came from the tool.

This year-long study was a wonderful opportunity to assess where we really were and to make plans for the next five years.
The seven teachers from the district leadership teams were on leave for one year to work in my office. They were all trained by our evaluator to use the Horizon Classroom Observation Tool. As Horizon itself only observes 10 teachers a year, we wanted a more in-depth analysis of what was going on in our 16 districts and 78 buildings. The team designed a plan to spend one to three weeks in each district to collect its data. They observed a random 25% of the active, experienced (more than two years teaching the program) classroom teachers; that amounted to a staggering 192 teachers! They wrote a 70-page report of their work for each district.
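
As a rough sketch of what that sampling plan amounts to, here is a minimal Python illustration. The district names, roster sizes, and the use of a simple random sample are my own assumptions; the post does not describe the team's actual selection mechanics.

    import random

    # Hypothetical rosters of "active experienced" teachers (more than 2 years
    # teaching the program). Names and counts are invented; the real study
    # covered 16 districts and 78 buildings.
    rosters = {
        "District A": [f"A-teacher-{i}" for i in range(48)],
        "District B": [f"B-teacher-{i}" for i in range(32)],
        # ... 14 more districts in the actual study
    }

    SAMPLE_RATE = 0.25  # observe a random 25% of eligible teachers per district

    for district, teachers in rosters.items():
        n = round(len(teachers) * SAMPLE_RATE)
        observed = random.sample(teachers, n)  # simple random sample, no replacement
        print(f"{district}: observing {n} of {len(teachers)} teachers")

Incidentally, a 25% rate yielding 192 observed teachers implies roughly 768 eligible teachers across the 16 districts.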

Chapter 1 focused on classroom observations; the team described what they saw and heard (the Horizon protocol includes a pre- and post-observation interview). They used graphs to present their results. They also had a Best Practices section where they listed really innovative and successful strategies. These were compiled and put in a large notebook given to each district.

Chapter 2 focused on self-reported data, using our CBAM Configuration Checklist. This is a rubric that we developed to "describe" our project; districts could use it to self-evaluate their programs. We compared their self-reported data to what was observed in Chapter 1 and identified strengths, weaknesses, and recommendations for improvement. The chapter also included a chart of how much time was reportedly spent teaching science at each grade level. There were also focus groups of teachers as well as interviews with administrators, each with specific questions that were to be asked.
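
A minimal sketch of that self-report-versus-observation comparison, assuming both instruments can be summarized on a shared 1-to-5 scale; the district names, scores, and gap threshold are invented for illustration, since the post gives neither the checklist's scale nor the data.

    # Invented per-district summaries: self-reported implementation level
    # (CBAM Configuration Checklist) vs. the level observed in Chapter 1.
    self_reported = {"District A": 4.2, "District B": 3.8}
    observed = {"District A": 3.5, "District B": 3.7}

    for district, reported in self_reported.items():
        gap = reported - observed[district]
        flag = "self-report runs high" if gap > 0.3 else "consistent"
        print(f"{district}: self={reported}, observed={observed[district]}, "
              f"gap={gap:+.1f} ({flag})")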

Chapter 3 was an analysis of each district's scope and sequence and a curriculum analysis (mapping) of the state outcomes covered at each grade level. It commented on the strengths and weaknesses of the curriculum alignment.

Chapter 4 covered student achievement data: an analysis of the district's 4th- and 6th-grade state proficiency outcomes as well as the district's off-grade proficiency testing. It analyzed test results by strand and gave strengths and weaknesses, relating the test results to the curriculum alignment (Chapter 3).

Chapter 5 covered teacher participation data in professional development - the 100-hours-per-teacher requirement. It compared the teacher data to the county rating, and included the average number of hours for teachers, lead teachers, and teachers on the leadership team. It also included data on the school-year action research component as well as the summer workshops. The most exciting aspect was comparing the ratings of the classroom teachers (Horizon protocol) to their hours of professional development. The positive relationship was shown in one of the graphs on the poster. We wanted to relate classroom teaching effectiveness to student achievement but ran into lots of problems (students not randomly assigned to teachers, etc.).
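
To make that last comparison concrete, here is a minimal sketch of the rating-versus-hours analysis. The numbers are invented, and the post does not say which statistic the project actually computed, so Pearson's r is an assumption on my part.

    from statistics import correlation  # requires Python 3.10+

    # Invented example data: one pair per observed teacher.
    pd_hours = [40, 60, 85, 100, 120, 140]  # professional development hours
    rating = [2, 3, 3, 4, 4, 5]             # Horizon observation level (scale assumed 1-5)

    r = correlation(pd_hours, rating)       # Pearson's r
    print(f"Pearson correlation: {r:.2f}")  # a positive r matches the poster's graph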

Needless to say, we have never done as much evaluation on any program before, and our districts really appreciated the information.

The entire NSF project has been a real learning experience about data collection and evaluation.
The classroom teachers loved the year and cried when they had to go back. The team got very close and became addicted to their research.

They summarized all the data for the 16 districts into one county report, which I am using for my final NSF report. I had great fun working with them.