Evaluate results of professional learning programs to determine the effectiveness on deepening teacher content knowledge, improving teacher pedagogical skills and/or increasing student learning
As I have discussed elsewhere in my bPortfolio, my school district has selected the SAMR model as a way to measure and record the role of technology in instruction and its impact on student learning. Using SAMR, we record what we see in classrooms during walkthroughs and while coaching teachers in order to track the level of technology integration across the district. This method gives us a way to track technology usage and offers a picture of how teachers use it with students. When my school district first began moving to a 1:1 model with a pilot in 2014, SAMR was likely the best tool available at the time to track this data. Recently, however, the instructional technology coaches have been discussing whether we should look at a different tool, one that is easier for teachers to understand, in order to better track this data. This is a healthy discussion that needs to happen as new tools are introduced; as coaches, we should use the best tool for the job even if it means making occasional changes. I begin to reflect on this in my post Is SAMR Enough? Teacher Practice and Technology Integration, where I write about how SAMR can be difficult to understand and can put teachers on the defensive. Teachers often view our walkthroughs as evaluative, believing we are reporting our findings back to the district office rather than using them to show what is happening at a macro level and to improve our coaching work. This could mean we need to do a better job of explaining the SAMR model to teachers, or it might be time to consider adopting another tool, such as TPACK, Trudacot (now called the 4 Shifts Protocol), or Triple E. I believe the ideal behind ISTE-C 4c is that if teachers do not understand the model a school or district uses, the technology coaches and their department should seek another tool.
In my program evaluation of instructional technology coaching and 1:1 PD, I did find a positive correlation between the data collected in our technology walkthroughs and continued work with an instructional technology coach. Teachers who worked with a coach regularly engaged students in work that rated higher on the SAMR scale. I don’t think SAMR itself is the reason this happened; rather, I think it is due to the support 1:1 coaching gives teachers and the way that support impacts learning. At the end of my program evaluation I acknowledge that a larger study is needed. When I discussed program evaluation with other leaders in my school district, it seemed there were many programs that would benefit from this kind of focused investigation, including the other PD programs that are part of the student learning department. To know whether professional development is working, more data needs to be collected. In addition to the evaluations submitted through the professional development portal, surveys could be sent out, and administrators and coaches could conduct observations and follow-up conversations asking teachers how professional development has impacted their teaching.
For more information, see my post Is SAMR Enough? Teacher Practice and Technology Integration.