
ISTE Standard #3

Collaborator

Coaches establish productive relationships with educators in order to improve instructional practice and learning outcomes.

ISTE Indicators for Standard 3

3a. Establish trusting and respectful coaching relationships that encourage educators to explore new instructional strategies.

3b. Partner with educators to identify digital learning content that is culturally relevant, developmentally appropriate and aligned to content standards.

3c. Partner with educators to evaluate the efficacy of digital learning content and tools to inform procurement decisions and adoption.

3d. Personalize support for educators by planning and modeling the effective use of technology to improve student learning.

Work Samples

Video Presentation: Technology Integration Mentorship

ISTE Indicators Addressed: 
3a, 3c, & 3d

Artifact Description:

This video is an overview of a month-long mentorship that I developed in partnership with my mentee, who teaches high school physics. The purpose of the mentorship was to integrate technology into the teacher’s instructional practices in a meaningful way that supported student learning. The video encompasses the entire mentoring experience and includes input and reflections by both mentor and mentee.

Implementation:

To be an effective collaborator, it was important that the mentorship span more than one or two days. Lasting an entire month, it allowed my mentee and me to establish a trusting and respectful coaching relationship in which we could explore new instructional strategies together (3a). I prioritized listening, especially at the beginning but throughout the process, so that we could address his concerns and challenges with technology integration. It was vital that whatever we explored fit within what he already planned to teach; in other words, I did not try to change what he taught but rather helped him approach it in a different way, using technology. Along the way, we evaluated the efficacy of the digital learning tools the students used in order to determine whether he would continue to offer similar technology-integrated options for students to demonstrate their learning (3c). As stated, I placed a high priority on fitting the digital tools into the content and sequence he already had in mind so that he did not have additional stress to contend with. The goal was for him to consider and offer technology-integrated learning activities that were new, interesting, and motivating for students. By planning and modeling how to invite and act on student input in order to gauge that interest and motivation, I personalized support for the mentee and helped him see that surveying students can lead to improved learning (3d).

Impact:

Because this was a full mentorship experience from start to finish, I was able to see its impact. It was immensely helpful that my mentee and I reflected both along the way and extensively at the end. With respect to indicator 3a, we both viewed the mentorship as positive and successful. For me, it was a success in that I brought new instructional strategies to the table, which the mentee was willing to adopt, not just once but for the remainder of the school year. For my mentee, it was a success in that the digital tools we explored gave him new, easy-to-implement options for instruction. He came away with a template and a rubric for a student-centered learning activity that can be replicated at any time, even with a substitute. The survey and the analysis of its results gave us both insight into the role social-emotional learning plays in any innovation adoption (3c). With just five perception-based questions, we obtained a snapshot of how students felt about the new instructional strategy, and that snapshot was influential in the decision to continue using the digital learning tools we tested. Personalizing support for my mentee had a profound impact on me (3d). In many ways, it brought me back to my roots in education, when I tutored one-on-one, something I truly enjoy. It was empowering to realize that I could translate that skill to a more professional setting with a fellow adult. It bodes well for my budding career in instructional design as well, because I was able to respond to feedback and make adjustments that left all parties satisfied with the end result. Another fantastic aspect of the mentorship was that, through my work as a collaborator-coach, the experience also addressed ISTE Educator and ISTE Student standards! This opened my eyes to how multilayered coaching can be when it is thoroughly thought out, well organized, and approached by all parties with an open mind.

Presentation: Software Critique

ISTE Indicators Addressed: 
3b & 3c

Artifact Description:

I created this presentation with three colleagues in our master’s program. The purpose was to evaluate software at an instructional level to determine whether it would be appropriate to adopt for our schools. At the time, we all taught in secondary schools and therefore had similar considerations for our learners. I led the way in software selection and slide design.

Implementation:

Going through the evaluation process was quite interesting to me. In partnering with other secondary educators whose contexts also included a high percentage of Latinx students, I was able to easily take cultural relevance and developmental appropriateness into consideration (3b). Our content areas differed, so we took a more or less egalitarian approach by choosing two vastly different programs, one on typing skills and one on chemistry. I had used both in my classroom and suggested them out of familiarity. My partnering educators liked that one focused on life and technology skills while the other, a PhET simulation, dealt with STEM. Typing Jungle could easily be a “sponge activity” in any teaching context and arguably supports all content areas, as education is increasingly entwined with technology. PhET simulations are fantastic for math and science specifically, as they illustrate concepts that can be quite abstract, and they are interactive! Of the four of us in the group, another educator and I partnered to evaluate the PhET simulation “Balancing Chemical Equations,” since we had both taught STEM. In our evaluation, we examined every level of the simulation’s design to determine its ease of use and accessibility, as well as its efficacy in teaching how to balance chemical equations (3c). We found that the digital tool had many beneficial characteristics, especially for learners with strong visual-spatial skills and for language learners, since there was minimal text to read and an emphasis on color coding and animation. While we did identify a few areas for improvement (such as more sound options for engagement and an introductory video to explain the activity), we felt that the simulation would be a great way to provide practice opportunities and build on direct instruction. Therefore, we decided that it should be adopted by our schools (3c).

Impact:

Evaluating digital learning content and tools through this software critique made me much more aware of what to look for. Being systematic, by considering not only the basic information but also the instructional design of the software (title page, directions, user identification, learner control, presentation of information, options for getting help, and ending the program), ensures that adoption of the technology is purposeful and connected to both learning theory and the specific context, including the learner population (3b, 3c). Conducting that evaluation as part of a team also matters greatly, as each member considers the software through the lens of their unique perspective and experience. I was fortunate to collaborate with educators who understood this and, as a result, I grew a lot as a coach. I feel I am now ready for a leadership role within educational technology. I have shared this in discussions with educators outside my group and in my instructional design network as I prepare my portfolio and LinkedIn profile for my next career move. If I join a team, which is my goal, I could make an impact by structuring and organizing either the development or the evaluation of digital learning content and tools in much the same way I led my team for this software critique.

Reflection

Taking into account both the mentorship project and the software critique, I have demonstrated competency in all indicators of Standard 3, “Collaborator.” The mentorship covered indicators 3a, 3c, and 3d, while the software critique allowed me to put indicators 3b and 3c into practice. Thanks to the complexity, methodology, teamwork, and leadership that I embraced with gusto during these two experiences, I am a competent collaborator-coach as defined by ISTE. Not only am I able to mentor, but I sincerely enjoy it, even when confronted with challenges. I also find the process of evaluating digital learning tools to be fun, since doing so means working as part of a task force. I like seeing projects through from start to finish, which was the case for both the mentorship and the software critique. I could see myself serving in professional roles where the skills I honed as a collaborator-coach are utilized regularly. I also have a clearer idea of what features to include in my own instructional design work. As I continue to add to my portfolio and design products for clients, I will prioritize collaboration in order to improve instructional practice and learning outcomes, no matter the context.
