ISTE Standard #6
Coaches model and support the use of qualitative and quantitative data to inform their own instruction and professional learning.
ISTE Indicators for Standard 6
6a. Assist educators and leaders in securely collecting and analyzing student data.
6b. Support educators to interpret qualitative and quantitative data to inform their decisions and support individual student learning.
6c. Partner with educators to empower students to use learning data to set their own goals and measure their progress.
Job Aid: Using Student Data
ISTE Indicators Addressed:
6a, 6b, & 6c
This job aid is designed to assist educators and leaders, including school board members, in thinking about education data in various ways. Not only does it give examples of different types of data that teachers can and do collect, but it also describes how teachers analyze that data and what purposes the data serve. Both quantitative and qualitative data regularly inform classroom instruction, and that instruction should inform and align with district-wide decisions. This job aid illustrates that interconnectedness across K-12 education.
I made this poster as I learned about qualitative and quantitative data and how both are essential to research in education. Making the poster helped me comprehend the differences between them and prepared me to conduct my own case study. That case study informed subsequent projects, such as designing and conducting a mentorship and multiple professional development sessions, so a direct line can be drawn from this poster to my practice as an instructional coach. Furthermore, in the year-long professional development curriculum that I developed with a partner, we included this poster as a resource to support the teacher participants in interpreting both types of data, informing their own decisions, and supporting individual student learning in their classrooms (6b). Through our PD surveys, which contained both quantitative and qualitative questions, we also modeled for our teacher participants how to collect and analyze student data (6a). Of course, in our case, we sought to measure their proficiency with the ISTE Standards for Educators, whereas in their classroom instruction, data collection would be geared toward K-12 content-area standards; the same basic principles apply, however. Ultimately, by sharing and discussing this “Using Student Data” poster, we partnered with educators at my partner’s school site to empower their students to use their own learning data to set goals and measure their progress (6c). Modeling that goal-setting and progress-monitoring in the professional development we conducted was vitally important.
The impact of sharing and discussing this poster with educators is evident in a few ways. Most obviously, it can be seen in the discussion boards, Padlets, and Jamboards that my partner (my collaborator through much of this program) and I designed and implemented for her colleagues. I cannot say that they have written a great deal specifically about data, but they have shared ideas for how to generate, collect, and analyze data from their students’ work. This poster also had an impact on the teacher I mentored. Months before I approached him about that mentorship, I showed him the poster, and we had an interesting discussion about data. He teaches high school Physics, has also taught Chemistry, and holds degrees in Chemistry, Physics, and Materials Science and Engineering, so he had a lot to say about data and how I explained it on my poster. That discussion made each of us consider the other’s points and deepened our understanding of the topic. Similarly, sharing the poster with another teacher and a former principal (both former colleagues of mine) was fruitful. The teacher reflected on her own instructional practices, whereas my former principal considered it in the context of his current position at an ed tech company, where he conducts site-based trainings for principals and teachers aimed at improving students’ Lexiles (reading levels).
Presentation: Action Research Case Study
ISTE Indicators Addressed:
6a & 6b
This presentation encapsulates the case study that I designed and conducted with a partner, using students at her school. It outlines our entire action research process and grounds our work in well-established education research. Our guiding question was “Which tool, Padlet or Jamboard, results in more collaboration, participation, and peer-to-peer discourse?”
In keeping with established research practices, this case study revolved around data from start to finish. Our surveys went through several rounds of feedback and edits to ensure that they were designed for both quantitative and qualitative data collection at multiple points throughout the case study, and we also planned for data collection through observations and interviews. Doing so ensured that we had a wealth of meaningful data to analyze (6a). I set up a spreadsheet for that analysis, which supported my partner (a fellow educator) in interpreting the data to inform her decisions and support individual student learning after our case study (6b). With her input, I designed that spreadsheet to track data during the pre-intervention, intervention, and post-intervention phases. Therefore, we had quite a bit of data from which to find trends and draw our conclusions, as is evident in this presentation. There were four data-driven takeaways from our case study:
1) The students preferred Jamboard over Padlet for group work and could identify advantages and disadvantages of each tool.
2) There was more buy-in when students felt included in the selection of tech tools and were asked to provide feedback.
3) Aside from the technology tools used, the students were more engaged when the topic was relevant to their interests.
4) Including prompts that connected the topic to student experiences and interests increased connectedness and engagement.
This case study presentation was shared with other educators in my master’s program. Through the discussion that followed as we reviewed one another’s case studies, we saw how similar themes in education can be researched in a wide variety of ways; there truly is no “one-size-fits-all” approach. We all deepened our understanding of concepts that tend to get thrown around as buzzwords, such as “collaboration,” “project-based learning,” “choice,” and “engagement.” Sharing the case studies showed us how those concepts can actually be measured with both quantitative and qualitative data in various contexts, and how multilayered they are: many variables and external factors really do need to be considered. Data truly cannot be divorced from its context. Both the data and the context of my case study have informed my subsequent work, in particular the year-long professional development curriculum. To be precise, my partner and I used similar data collection methods and utilized both Padlet and Jamboard in activities we designed for our participants. For more on that, please see my Standard 5 page, “Professional Learning Facilitator.”
Based on both the “Using Student Data” poster and the “Action Research Case Study” presentation, I have demonstrated full competency in all indicators of Standard 6, “Data-Driven Decision-Maker.” The poster on quantitative and qualitative data addresses indicators 6a, 6b, and 6c, whereas the case study presentation focuses on indicators 6a and 6b. It was quite rewarding to be able to pull from a resource that I created when it came time to design an entire professional development plan, as was the case with the poster. I went back and revised it months after thinking it was “done” because I wanted to better apply the visual design principles I had learned; the point was not to make it prettier, but to make it more functional and helpful to teachers and education leaders. The case study was similarly a valuable resource from which I drew well after completing it, as I knew I had valid and reliable data to back up the instructional decisions I made. As I move forward in my career, I plan to continue referring to both of these works and to refer others to them as appropriate. On the instructional design path that I am taking, there is a great deal of room for data to be collected and analyzed to ensure that every product I release is pedagogically sound as well as engaging. I look forward to developing those products with a team that enjoys collaboration and using data to drive our decision-making.