Monthly archive: February 2016

Teacher PD – Week 6 – Reading Notes

Little, J.W. (2004). ‘Looking at student work’ in the United States: a case of competing impulses in professional development. In C. Day & J. Sachs (Eds.) International Handbook on the Continuing Professional Development of Teachers (pp. 94-118). UK: Open University Press.

  • Looking at student work
    • Root teacher learning in and from practice
      • Deep understanding of how children learn
    • External control of teaching and teacher education
      • Standards
      • Controlling practice
      • Exercising sanctions
    • More attention to
      • School reform
      • Public accountability
  • Contradictory purposes
    • Stimulating and supporting teacher learning
    • Instructional decision making
    • Bolstering teacher community
    • Advancing whole-school reform
    • Satisfying demands for public accountability
  • Learning in and from practice
    • Not as many as we would wish for
  • Escalating accountability pressures
  • Looking at student work: profiles of purpose and practice
    • Principal rationale for looking at student work
    • Programmatic choices
    • Purposes
      • Deepen teacher knowledge
      • Strengthen teachers’ instructional practice in specific subject domains
      • Collective capacity for improvement in teaching and learning at the school level
      • Review of student work in the service of standards implementation and external accountability
  • Student work as a resource for deepening teacher knowledge
  • Student work as a catalyst for professional community and school reform
  • Student work as an instrument of external accountability
  • Multiple purposes – complementary or competing?
    • Tension
      • Teacher-defined inquiry and compliance with external standards
      • Research-based model and honoring teachers’ own interests and expertise
  • Teacher PD has many purposes, but hard to do them all in one
    • Depth of understanding in particular subject domains
    • Professional norms of mutual support and critique
    • Expectations for both internal and external accountability regarding students’ opportunity to learn
  • Teacher PD must account for local resources and knowledge to promote teacher community at the school site
  • Audit society
    • Leaves no room for experimentation
    • Teachers have little opportunity to reflect
    • No support from cohort
  • Contributions and limitations of research
    • ‘Value added’: contribution to teacher knowledge and practice
      • Triangle studies
        • Teacher development
        • Classroom practice
        • Student learning
      • Instructional triangle
        • Teacher
        • Student
        • Curriculum
    • Moderation
      • Make sure scores are being interpreted in the same way by all
    • Limited scope of research, expanding the scope of practice
      • “The result: a growing arena of practice that remains weakly positioned to capitalize on research, and research weakly attentive to expanding contexts of practice.” (Little, 2004)

Little, J.W., Gearhart, M., Curry, M., & Kafka, J. (2003). Looking at student work for teacher learning, teacher community, and school reform. Phi Delta Kappan, 85, 185-192.

  • Looking at student work usually occurs in isolation
  • Common elements of practice
    • Bringing teachers together to focus on student learning and teaching practice
    • Getting student work on the table and into the conversation
    • Structuring the conversation
  • What seems to work
    • Flexible, creative use of tools for local purposes
    • Ability to exploit subject expertise and examine subject issues
    • A balance between comfort and challenge
    • Facilitation to build a group and deepen a conversation
  • Three Dilemmas in making the most of looking at student work
    • Concern for personal comfort and collegial relationships
    • Scarce time, many interests
    • Uncertainty about what to highlight in “looking at student work”

Science Readings

Roth, K.J., Taylor, J.A., Wilson, C.D., & Landes, N.M. (April 2013). Scale-up study of a videocase-based lesson analysis PD program: Teacher and student science content learning. Paper presented at the 2013 NARST Annual International Conference, Puerto Rico.

  • The Problem: Science Content Knowledge for Elementary Teachers
    • “Content knowledge is insufficient to identify and address children’s misunderstandings (Roth, Anderson, & Smith, 1987); in fact, teachers sometimes hold the same misconceptions as their students.” (Roth, Taylor, Wilson, & Landes,  2013).
    • Science Teachers Learning from Lesson Analysis (STeLLA)
      • Look at videos of practice
      • Student Thinking Lens
        • Strategies to reveal, support, and challenge student thinking
          • Ask questions to elicit student ideas and predictions
          • Ask questions to probe student ideas and predictions
          • Ask questions to challenge student thinking
          • Engage students in interpreting and reasoning about data and observations
          • Engage students in using and applying new science ideas in a variety of ways and contexts
          • Engage students in making connections by synthesizing and summarizing key science ideas
          • Engage students in communicating in scientific ways
      • Science Content Storyline Lens
        • Identify one main learning goal
        • Set the purpose with a focus question and/or goal statement
        • Select activities that are matched to the learning goal
        • Select content representations matched to the learning goal and engage students in their use
        • Sequence key science ideas and activities appropriately
        • Make explicit links between science ideas and activities
        • Link science ideas to other science ideas
        • Highlight key science ideas and focus question throughout
        • Summarize key science ideas

Roth, K., et al. (in press). The Effect of an Analysis-of-Practice, Videocase-Based, Teacher Professional Development Program on Elementary Students’ Science Achievement. Journal of Research on Educational Effectiveness.

  • Few rigorous studies about the relation of PD and student outcomes
    • “…few studies have tested causal relationships between teacher PD programs and student outcomes (Roth et al., 2011; Yoon, Duncan, Lee, Scarloss, & Shapley, 2007; Sleeter, 2014) and even fewer have used rigorous research designs.” (Roth, 2016, p.1)
  • Elementary teachers have little science content knowledge
    • “These problems are especially prevalent for elementary teachers who have little training in science-specific pedagogy or in the science disciplines they are expected to teach (Dorph et al., 2007, 2011; Fulp, 2002; Smith & Neale, 1989; Stoddart, Connell, Stofflett, & Peck, 1993).” (Roth, 2016, p.2)
  • Growing consensus that professional development should:
    • Engage teachers actively in collaborative analyses of their practice;
    • Treat content as central and intertwined with pedagogical issues;
    • Enable teachers to see these issues as embedded in real classroom contexts;
    • Focus on the content and curriculum teachers are teaching;
    • Be guided by an articulated model of teacher learning that specifies what knowledge and skills teachers will gain, what activities will lead to this learning and how this new knowledge and skills will appear in their teaching practices (Ball & Cohen, 1999; Darling-Hammond & Sykes, 1999; Desimone, 2009; Elmore, 2002; Garet et al., 2001; Guskey & Yoon, 2009; Hawley & Valli, 2006).
  • STeLLA’s shortcomings
    • Internal validity – control group was the same group of teachers one year before
    • External validity – samples drawn from urban schools in a single geographic region
    • Scalability – program developers delivered the PD – how would a PD facilitator have implemented it?
  • STeLLA suggested improvements
    • Random assignment of schools in the study to the program or a comparison condition;
    • The inclusion of a diverse sample of schools in the study;
    • A specified comparison condition that matched the treatment condition in duration, intensity, and contact hours;
    • A treatment delivered entirely by PD providers who were not developers of STeLLA.

Teacher PD – Week 5 Reaction: The Role of Video in Teacher Learning

Prompt

Read the responses your colleagues wrote about the reading and react to them.

Reaction
(see responses below)

The reading responders did a clear job of relating Gaudin and Chaliès’s (2015) framework to van Es and Sherin’s (2009) “video club” and to Gröschner, Kiemer, and colleagues’ Dialogic Video-Cycle (DVC). While some focused on the methodology of the studies, others raised interesting questions about the research papers and the information they offered. Beyond their descriptive nature, some common themes emerged from the responses.

All reviewers, for both the video club and the DVC, seem to agree that these programs adhere to Gaudin and Chaliès’s framework and were successful in achieving their goals. In all cases, the effect of watching video in PD seems to be positive, especially on teachers’ motivation and, consequently, on students’ interest in the discipline. Teachers’ metacognitive abilities were also positively affected in both cases. The teachers’ reactions to the PD were positive as well, giving them a sense of competency and self-efficacy. In both cases teachers watched videos of their own practice and were scaffolded for “selective attention” and “learning to notice”.

Another common thread was that the responders all felt that there was room for more information in the papers. The decision process for selecting and editing the videos that best fit the PD experience was not fully delineated. For one responder the video selection process was clear, yet only in terms of its mechanics and procedures; little mention is made of the content choices involved in editing the videos. The responders also felt that more information was needed about the facilitator’s role and the methodologies applied during the workshops themselves. Focus was given to the video aspect of the PD, diminishing the level of detail presented about the interactions between the facilitator and the teachers.

Finally, some responders mentioned a desire for more information about how these video-based PDs compare to other, non-video-based PDs the teachers have already experienced. In general we can see that teachers are satisfied with the results of this kind of PD, but no information is given about what specifically they perceived as different and better about this methodology. One glimpse into this comparative notion is given by a teacher in the control group of the DVC study, who said that they would have liked more direct feedback. The teachers in the treatment group did not report this need, having watched videos of their own practice and reflected upon them.

I personally found this exercise to be of paramount importance for gaining new lenses on the papers we read. It is fascinating how each person notices, deems relevant, and draws different information from the same source material. Clearly, previous knowledge, experiences, and context transform the lenses through which each one of us sees the world. Being exposed to these different lenses enhances and expands our own understanding. In addition, reacting to these responses engages us even further with the topic and helps us see patterns we might not have noticed before. In this sense, I believe that video-based PD is an exemplary method for achieving the desired learning objectives with teachers. Videos present a concrete source of information and basis for reflection that is powerful, objective, direct, and close to one’s own practice. Obviously, well-guided facilitation of the reflection process is required, as in any learning experience, yet I believe that once a teacher becomes aware of how to analyze videos, self-guided learning may happen voluntarily.


Response 1/5

Gaudin and Chaliès (2015) conceptualize the process of video viewing in professional development (PD) in the following four broad categories:

    1. Teachers’ activity as they view a classroom video (e.g. view video with selective attention; and view video and knowledge based reasoning).
    2. Objectives of video viewing in the PD context (e.g. build knowledge on “how to interpret and reflect”; build knowledge on “what to do”; hybrid approach; and objectives based on learning goals).
    3. Types of videos viewed (e.g. unknown teacher activity, peer activity, own practice, selecting and organizing videos in line with learning goals and contexts).
    4. Effects of video viewing on teacher education and professional development (e.g. teacher motivation, cognition, classroom practice, recommendations for effective video viewing).

In the context of Gaudin and Chaliès’ (2015) work, van Es and Sherin’s (2009) “video club” is a case of viewing video with selective attention, with a primary focus of “learning to notice” student ideas in elementary math classrooms. This one-year video club PD program drew on both the participants’ own practice and the practice of their peers, as the PD / research team captured the classroom teaching of participants and created experiences for the teacher participants in meeting this teacher learning goal. The research team had set out to capture how this type of video club would influence teachers’ thinking and practice and was able to document changes in both over the course of the project. They were able to capture how teachers “made space for students’ thinking, … more frequently probed students’ thinking, and … took on the stance of the learner in the context of teaching” (p. 169).

As I read the van Es and Sherin (2009) work, I wanted to know more about how the facilitators designed the content and processes of the PD in the context of the elementary math knowledge, processes, and dispositions that were expected in each of these classrooms. There was little mention of how the videos were edited or of the types of questions that were used to facilitate teachers’ journeys toward the learning goals—which themselves are on the vague side. Noticing student ideas and providing opportunities for students to express them is certainly a step toward teachers understanding how students are approaching the mathematics, but it is unclear whether, through this video club participation, teachers were developing knowledge of how to move students along the learning continuum. I think about Schoenfeld and Floden’s “Teaching for Robust Understanding in Mathematics” dimensions and am searching for greater clarity as to how van Es and Sherin (2009) think about how teachers consider the mathematics, the cognitive demand, access to mathematical content, students’ agency, authority, and identity, and the use of assessment practices in shaping their practice and how students learn math. I’m also curious as to how this particular PD experience coheres or conflicts with other PD experiences that teachers have taken part in and how that knowledge shapes their beliefs and actions over the course of the project period.


Response 2/5

In the following paragraphs I offer an analysis of Gröschner, Kiemer and colleagues’ Dialogic Video-Cycle (DVC) through the lens of Gaudin et al.’s four-principle conceptualization of how video viewing is used in teacher PD. In each area, we can see close alignment between the DVC model and what Gaudin et al. point to in their research analysis as best PD practices regarding video use.

The nature of teachers’ activity as they view a classroom video

In their video viewing conceptualization, Gaudin et al. (2015) point to selective attention as a key component of productive teacher video viewing activity. The video viewing structure in Gröschner, Kiemer and colleagues’ DVC model very much supported selective attention by designating a specific focus during each analysis session. In the first video analysis and reflection session, for example, (workshop 2 of each DVC cycle) the facilitator selected clips and posed questions that guided the participants to focus on the ways in which the teacher in the video activated student engagement. In the second analysis and reflection session (workshop 3 of each DVC cycle) facilitators used video clips from the same lessons, but this time guided the participants’ attention to the ways in which the teacher scaffolded student ideas. This guiding of attention is very much in keeping with Gaudin et al.’s description of selective attention during video viewing.

Gaudin et al. (2015) explain that another important aspect of video viewing during PD is knowledge-based reasoning. The extent to which the teachers in the DVC study exercised knowledge based reasoning during their video viewing is less clear than the selective attention component due to the authors’ somewhat limited description of the actual analysis conversations themselves. We can, however, infer from the few example questions, and the authors’ mention of teachers posing solutions and alternatives, that teachers engaged in description, interpretation, and prediction while watching and discussing the videos. These are in keeping with Gaudin et al.’s description of “first level” reasoning. The structure of the DVCs also suggests that teachers had the opportunity to engage in “second level” reasoning (comparing visualized events with previous events). During the first cycle, they would have had the opportunity to connect and compare what they saw in the video with their own past practice. During the second DVC, the teachers would have had the, perhaps more explicit, opportunity to compare what they observed in the second videos with what they saw in the first videos. Whether, however, the facilitators deliberately capitalized on these opportunities for comparison remains unknown.

Objectives of video viewing in teacher education and professional development.

Gaudin et al. (2015) point to three major objectives for using videos during teacher PD. One is to model how to implement a practice (for example, were a video used in session 1 of the DVC process it quite likely would have been in this camp). A second objective is to teach participants how to interpret and reflect on practice. And the third is a hybrid of the two. In that they were used in more of a problem solving capacity – as examples, rather than exemplars – the videos used in the DVC reflect the objective of learning how to interpret and reflect. If, however, we consider the overarching goals of the DVC, along with the two-cycle structure, the objective of watching the videos becomes more of a hybrid. The teachers were asked to interpret and reflect on what occurred in the videos in the service of refining their practice for the second cycle (and beyond). Though no videos were specifically chosen as exemplars, it is conceivable that exemplary practice might have surfaced in the variety of clips observed. This suggests the possibility that the same videos could have been used to build capacity for best practice (the “normative” objective), as well as reflecting and interpreting (“developmentalist” objective) throughout the DVC process.

The nature of classroom videos viewed in teacher education and professional development  – and –
The effects of video viewing on teacher education and professional development

Gaudin et al. (2015) describe three types of videos that can be made available for viewing: videos that feature unknown teachers, peers, or one’s own activity. While they point to research that explores the advantages and disadvantages of each type, they emphasize that watching peer and self videos may be the most productive in that they encourage teachers to “ ‘know and recognize’ themselves” (Leblanc, 2012 as cited in Gaudin et al., 2015, p.51) and “‘move toward’ new and more satisfactory ways of teaching” (Gaudin et al. 2015, p.51). In the DVCs, the teachers watched videos of themselves and their peers conducting specific lessons and were guided to identify and reflect upon the effectiveness of the observed strategies for classroom discourse. The results of their approach mirror the affordances that Gaudin et al. describe, particularly in the area of teacher motivation. In Gröschner et al.’s final round table discussion, for example, teachers who had enrolled in the more traditional PD, in which videos were not viewed, expressed a desire for more direct feedback on their own teaching. In contrast, teachers who had participated in the DVC were satisfied with this element of their experience. In their final analysis, Gröschner et al. found teachers in the DVC group had stronger feelings of competence and satisfaction than those in the control group, which aligns with what Gaudin et al. describe in their analysis of similar research on this topic.


Response 3/5

Gaudin and Chaliès discuss four aspects of video viewing as a strategy for teacher professional development: “the nature of teachers’ activity as they view classroom videos,” “the objectives of video viewing in teacher education and professional development,” “the type of video viewed in teacher education and professional development,” and “the effects of video viewing on teacher education and professional development.” Here we want to consider the Kiemer and Gröschner, et al. study of a PD intervention constructed around video viewing through these four lenses.

With respect to the nature of teachers’ activity, Gaudin and Chaliès look for active, rather than passive, engagement with the video. In particular, they prioritize evidence of selective attention and knowledge-based reasoning. Kiemer and Gröschner do not provide much information about the interactions that took place in the course of their workshops, focusing instead on the evolution of teachers’ answers to questions on the pre-, mid- and post-questionnaires that surrounded the workshops themselves, as well as on the effects of the PD after its conclusion. They do, however, talk about wanting to engage the teachers in the same ways that they hope the teachers will go on to engage their students, so they have an activity designed to “activate students verbally and to clarify discourse rights” and one to “scaffold students’ ideas.” It is unclear what the content of discussions during those activities was, but from the later feedback sections, it seems that teachers felt they were being given “tips and suggestions about things you can change quickly,” which sounds less like a constructivist/cognitive dissonance approach (which might focus first on selective attention) and more like a knowledge-based reasoning focus (looking at what the teacher does in the video to reason through, and get feedback on, what he or she could do to be more effective).

The objectives of video viewing in the Kiemer and Gröschner study are more clear. They want to help teachers build productive classroom discourse, through open-ended questions and feedback, in order to promote student interest, and thus motivation and learning outcomes. This goal seems to fall into Gaudin and Chaliès’s “normative” bucket, helping teachers to reflect on and develop their practice not with the intent of promoting ongoing self-directed reflection, but rather with a focus on leading teachers to come away with intent and strategies for leading student discussions more “correctly.” On the other hand, they do look at teachers’ perceived autonomy, suggesting an interest in their self-guided learning, yet their desired outcomes are all stated in terms of changes in teacher practice and in student outcomes. Additionally, the videos are all certainly “examples, not exemplars” so they are not shown as “what to do” videos, but are nonetheless used as a jumping off point for discussions of “what to do.”

Next, the nature of the classroom videos is again quite clear. Kiemer and Gröschner use videos of the teacher participants themselves, so, presumably, the teachers see videos of themselves as a main focus but also video of their peers who are participating in the same workshops. The workshops provide the community of support recommended for viewing videos of one’s own teaching, as well as the atmosphere of productive discourse that is scaffolded by the facilitator. The facilitator also pre-selects the clips from the videos to be watched in the workshop, reflecting the need described by Gaudin and Chaliès for more preparation and scaffolding than when watching other teachers. As all of the participants are mid-career teachers, rather than student teachers, using videos of the teachers themselves also fits into Gaudin and Chaliès’s “continuum of teacher professionalization,” which suggests that they are ready for such introspection even while earlier-career teachers might not be.

Finally, Gaudin and Chaliès see common effects of video viewing as enhancing teacher motivation and teachers’ selective attention, and they make particular note of the fact that “little empirical evidence has been presented on how video use benefits actual classroom practice.” Yet Kiemer and Gröschner’s second article specifically explores the effect of the PD intervention on teacher practices and student interest and motivation, which they also acknowledge as being unique among research papers in this field. For the most part, they do find positive outcomes relative to their objectives. Teachers used feedback more effectively to promote student discourse in their classes, and students were found to have more interest, as well as a sense of autonomy and competence. Gaudin and Chaliès’s discussion of indirect evidence about teacher practices and student outcomes (as well as their direct mention of Kiemer and Gröschner’s study in this section) suggests that these findings are in line with Gaudin and Chaliès’s ideals for the effects of video viewing in a successful PD intervention.


Response 4/5

Gaudin and Chaliès (2015) analyzed and categorized 255 studies of the use of video in professional development along the following four dimensions: 1) the nature of the activity teachers engage in when viewing video during professional development, 2) the goals of having teachers view such video, 3) the types of video used, and 4) the effects of viewing video in professional development. Using this four-part conceptualization, one can analyze and summarize any program of professional development, including that known as the “Dialogic Video Cycle” or DVC (Gröschner et al., 2014; Kiemer et al., 2015).

The Dialogic Video Cycle is a professional development program that uses video to support teachers in shifting the nature of the discourse in their classrooms. The DVC consists of two cycles of professional development, each of which is comprised of three workshops. In the first workshop, teachers work in collaboration with one another in modifying a lesson plan that they then implement in their classrooms. Implementation of this modified lesson plan is filmed in each teacher’s classroom. Clips from these video records are then selected by the DVC facilitator and shown to teachers in the second and third workshops of the DVC. In workshop #2, teachers focus on the types of questions posed by teachers to students in the videos viewed, paying particular attention to whether the questions posed are either open (e.g., what do you think happens if we heat it up?) or closed (e.g., do we have any right angles here?). In workshop #3, on the other hand, teachers are asked to focus on the sorts of feedback provided by teachers to students in the videos, as well as share ideas for how to take up students’ correct and incorrect answers. The second cycle of the DVC consists of three similar such workshops that revolve around the teaching of a different lesson plan.

While teachers in the DVC do not select the video clips viewed in workshops #2 and #3, consistent with the core features of effective professional development (Desimone, 2009), they play an active role in this particular program. Throughout both workshops, teachers are asked a series of questions by the professional development facilitator and are generally encouraged to reflect on their experience delivering the lesson that was filmed. Additionally, teachers are encouraged to ask clarification questions of the teacher in the video being viewed, which that teacher can then respond to by providing necessary explanations or describing contextual factors in greater depth.

The objective of engaging teachers in the DVC professional development program was to support them in moving towards a more dialogic model of discourse, in which both teachers and students co-construct meaning together. This objective was pursued because dialogic classrooms are believed to do better at enhancing students’ interest in relevant subject matter than classrooms in which the teacher adopts a more didactic, uni-directional pattern of discourse (Kiemer et al., 2015). As such, the primary objective of this PD program, to encourage teachers to change their discursive practices, was pursued because success in meeting this primary objective was expected to result in success in meeting the second objective, enhancing students’ interest in their learning.

The video viewed by teachers in the DVC consists of records of the teachers themselves teaching a lesson they had modified previously in collaboration with one another. Specifically, teachers view video clips selected “on the basis of the criteria of productive classroom discourse” (Kiemer et al., 2015, p. 96). Stated differently, selected video clips are chosen because they will presumably engender rich conversation among teachers about both the questioning behaviour of teachers in the video viewed (workshop #2) and the nature of the feedback provided by teachers in response to student contributions (workshop #3).

According to Kiemer et al. (2015), as a result of the DVC professional development, teachers’ practice did, as hypothesized, change in notable ways. While teachers did not come to ask more open-ended questions as a result of having taken part in the DVC, they did demonstrate significant improvement with regard to the type of feedback provided to students. At the conclusion of the DVC, teachers who took part in this particular development program provided less feedback that simply told students whether an answer they had given to some question was right or wrong (i.e., simple feedback) and increasingly gave feedback that highlighted what was right or wrong about an answer, as well as how such an answer could be improved (i.e., constructive feedback). Additionally and as expected, students in the classrooms of teachers who participated in the DVC PD demonstrated an increased interest in the subjects that their teachers came to teach in a more dialogic manner (Kiemer et al., 2015).


Response 5/5

The implementation of what van Es and Sherin (2009) call “video clubs” has elements that can be critiqued through Gaudin and Chaliès’s (2015) four main conceptualizations of the use of video viewing in professional development. The video clubs are an example of using all four principles in varying degrees, but van Es and Sherin note that the focus was on analyzing student thinking rather than on implementation of new methods or changing of teachers’ beliefs (p. 159). This focus for the video clubs has both affordances and limitations when analyzed against the four principles framed by Gaudin and Chaliès.

First, the nature of the teacher activity while viewing the videos did have a specific focus on describing what teachers identified as student thinking, and it gave the teachers a structure for interpreting using evidence. Attention to this principle was analyzed and showed some of the greatest learning opportunities for teachers. In fact, Gaudin and Chaliès highlighted video clubs as a model for the elements that contribute to increasing a teacher’s capacity to reason (p. 46).

Second, video clubs had specific objectives in the professional development around student thinking. However, the paper did not report whether the researchers had in mind the objective of “best practices.” The clips were selected for their “potential to foster productive discussions of student mathematical thinking” (van Es & Sherin, 2009, p. 160), but it was not clear whether there was attention to the practices that may have contributed to higher levels of discourse around a problem. Although van Es and Sherin report an increase in teachers attending to student thinking in their analysis, it was not clear whether their learning goal of focusing on student thinking could have been better served by having teachers critically discuss the practices that are associated with student thinking. In this, it is not clear whether a hybrid objective could have better served their main focus.

Third, Gaudin and Chaliès note the limitations of having teachers analyze peers’ professional practice (p. 51). In video clubs, the analysis does show a shift in focus toward the student in discussions, but it is difficult to infer whether watching a peer could have dampened the depth of these discussions. It was also not clear how teachers might have been scaffolded into watching each other or whether there were practices that could have been critiqued to further increase teachers’ perceptions of student thinking. Gaudin and Chaliès feature studies that conclude that the first videos should be selected and organized around viewing an unknown teacher (p. 52). Notably, the video clubs go directly into viewing peer videos, and this could have had an effect on the shift in the conversations and the depth of teacher analysis that they report in their study.

Fourth, van Es and Sherin do show an association between video clubs and practices such as the following: attention to student thinking while teaching, knowledge of curriculum, changes in teachers’ instructional practices, and opportunities for student thinking. Although not all the aspects of the effects of video viewing that Gaudin and Chaliès discuss are explicitly addressed in video clubs, there does seem to be an increase in the teachers’ motivation and cognitive abilities. For example, some teachers report learning more about the curriculum and others “positioning themselves as learners in the classroom” (p. 171).

In all, although I may be looking at aspects proposed by the four conceptualizations that were not considered in the design of video clubs, overall the video clubs have a design and many outcomes that are sound when analyzed through Gaudin and Chaliès’s framework. I found video clubs to have a good base for future professional development programs using video viewing. Considering the four principles, fine-tuning video clubs could hold great promise for teacher learning.

Beyond Bits and Atoms – Week 5 – Makerspace Final Paper Assignment

Final revision of previous assignment.

Link to Google Docs with proper formatting


Makerspace for Classroom Teachers
Lucas Longo – 2016

Introduction

Makerspaces in schools provide a unique opportunity for teacher professional development in all disciplines. They may provide a situated learning experience in which teachers can recall the difficulties students have in the process of learning and thus reflect upon their own teaching practices. Given that the majority of teachers are not familiar with fabrication and electronics, they are put back into the beginner’s seat, which opens the possibility of reflecting metacognitively about learning and teaching. Accompanied by engaging discussions and activities grounded in the affordances the space provides, teachers learn about best practices of teaching through modeling and engagement in practice.

The activities in makerspaces range from exploring and designing to building and asking questions – all traits considered desirable in today’s education research. What if we could apply these features to an English Poetry class? How can we promote transfer from the teachers’ experiences in the makerspace into their everyday classroom activities? I propose a PD curriculum in which, through engaging in the makerspace, teachers are given the opportunity to reflect on how learning happens, how to transfer these ideas into their own practice, and ultimately how to affect the learning outcomes of their students in their own disciplines.

“In fact, the richness of makerspaces comes not from the fact that the abstract is left out, but that it is brought in together with new ways to build relationships with and between objects and concepts.” (Blikstein & Worsley, 2001, p. 5)

The Space

To explore this idea of using makerspaces as a learning environment for teachers, I went to a public elementary school in Palo Alto where I interviewed the lab coordinator, whom I shall call Jane. She walked us through the stations, described the activities students engage with in the space, and explained how she actively helps classroom teachers use the space as a learning environment in their disciplines. She used to be a science teacher and ran computer programming workshops after school before she created the makerspace at the school. Her digital literacy and technical knowledge were paramount in her appointment for the task and, as I see it, essential for designing the space, choosing the tools, and using them for didactic purposes.

The space itself was a regular classroom converted into an open space, with the working stations along the walls and low tables covered with paper where the students plan and work on their iPads. An outside area is also used for larger projects and is where all the Lego bricks are stored. Jane transformed it into a makerspace by getting rid of all the closed cabinets along the walls, cutting the tables’ legs to make them more accessible to the students, and installing shelves to store materials and students’ work. This was the first makerspace in the Palo Alto district, created around a year and a half ago, and it is now being used as a model for other schools. An interesting concept that arose from the conversation was that some schools that do not have a full classroom for a makerspace are using carts with equipment and materials that circulate among the teachers, allowing them to use the tools in their classrooms.

Jane also mentioned that she has worked at the richest school in Palo Alto, where they still do not have a makerspace. ‘They are still thinking about the color of the furniture that will go there’ and ‘the teachers are not aware that the administration is even thinking about or planning to create a makerspace.’ ‘Teachers are not bought into it yet – they have the money but nothing happens. Let the kids do it – figure out what is needed – put it in action – do it.’ Her maker mentality needs to be somehow transmitted to the other schools. Freire would appreciate her statements in the sense that she is providing an open space for dialogue and relinquishing control over the experiences the students have in the space. She embraces the notion that the students are responsible for their own projects and that the teachers learn as much from the students as the students might from the teacher.

This transformative approach to teaching is promoted by her not only at her school but also in the more formal PD sessions she holds at the space. At the district level she teaches “iPad in the Classroom”, “How to use Google Docs”, and “Schoology”. Every year she creates new courses to match the current software needs the teachers might have. Her drive and content knowledge, applied to ‘spreading the word’, seem to be the key factors in this space’s success and, ultimately, its sustainability. Any learning environment needs someone who will skilfully become a caretaker, curator, facilitator, and enthusiast.

The following pages contain some photos of the space and a floor plan to situate the tools. I also recorded the interview and transcribed it loosely, categorizing by topic what was said. Please refer to the Raw Data for the full transcript of the interview.

Documentation

General view of the maker space

3D printer, object scanner, Lego Mindstorms

Smartboard, document camera, 3D printer

Robots, Lego Mindstorm, and mini-drones

Makerspace floorplan

The arrangement of the space is divided into tool stations along the perimeter and working stations in the central area. Even though the space might look ‘messy’, it conveys a message of open exploration, where all the tools and supplies are readily available for use. There is no check-out sheet of any kind, nor locked cabinets to which students need to request access. The space itself is open to students during lunch time and after school, promoting the idea of free access and empowering the students to decide when to work on their projects.

Curriculum Integration

Jane uses the space as a learning environment for the classroom teachers at the school. She actively engages with the classroom teachers to create learning experiences in the makerspace. At grade-level meetings, she occasionally pops in and gives them ideas for activities and projects they could do in the makerspace that would enhance the learners’ experience with the subject matter. Understanding the affordances provided by the space and the teachers’ current topics of study makes for a rich collaboration and an even richer experience for the students. Students, for example, are encouraged to document their entire process, from design to final product. With these pictures, notes, and videos the students can ‘reflect back and see how they could do it differently next time.’

We could transfer this concept into PD by stressing the importance and value of formative assessment techniques the teachers could use in their own practice. Another example of this kind of transfer can be extracted from the fact that at lunch time all age groups are together in the space, where peer-to-peer teaching is evident. ‘They observe each other and learn from each other – even the older kids learn from the younger kids.’ A teacher PD designed in a makerspace would facilitate and demonstrate the value of collaboration and group work.

Before the space was built, the teachers were ‘apprehensive and did not know what to expect’, but they now see the students’ ‘excitement and learning’.

‘They all get on-board once they see this happening. Classroom teachers think that they have to learn it all themselves. The mindset has changed. Now they know that the kids know more about the apps and the space than they do so they are willing to relinquish control and let them figure it out.’

I found it particularly interesting to note the teachers’ fear of not knowing how to use the makerspace’s tools. Implicitly, this hints at a ‘banking model’ (Freire, 1970) of education, in which the teacher believes they are the holder of all knowledge that must be deposited into the students’ minds. In the makerspace, teachers have to dive into the tools and learn with the students in order to enable them to progress with their projects. Suddenly the students know more than the teacher about a particular tool and, more importantly, about their own project, intentions, and goals. The teacher becomes a facilitator instead of a fact presenter.

I am not suggesting that all of an English Poetry class should be taught in a makerspace or that we have to throw away the current method of teaching that class. The central idea is to use the makerspace as yet another affordance for engaging students with the content. The challenge is to demonstrate to classroom teachers that it is possible to learn not only from books, lectures, and in-class, in-control situations. It is also ‘more work’ to think about lesson plans, design activities, and ultimately integrate the traditional, familiar curricula with lesser-known methods, tools, and environments.

Conclusion

Creating a makerspace in a school must be accompanied by a proficient leader who is able to talk the talk and walk the walk. Not only must this person be fluent in the lingo of the technological tools, they must also see that sewing machines, cardboard, wood blocks, glue, and other low-tech materials provide excellent resources in a makerspace. In that sense, the makerspace can very well be thought of as an “arts and crafts” space with some new tools for creative expression and active learning. The driver must be to facilitate the guided creation of projects students engage in, related to the content and learning goals required in a classroom teacher’s discipline.

Along with understanding these core drivers that sustain the children’s interest in the space, this person must also engage classroom teachers in the experience, actively promoting experimentation. This role involves exposing teachers to the possibilities the space offers in terms of types of projects, activities, and processes. Initially, the teachers might need help coming up with ideas for integrating their discipline with these possibilities and designing learning experiences to achieve existing learning objectives.

Jane’s pragmatic personality, hands-on approach, and technological know-how are essential characteristics for creating a sustainable makerspace. She is continuously making efforts to spread the word and to help teachers figure out how to best use the space with integrated constructivist activities encouraged in makerspaces. It is a process that takes time, determination, and enthusiasm.

Even though a direct link between the makerspace experience and an improvement in academic performance may be hard to measure precisely, the effort seems to be rewarding and to promote growth beyond test results. It teaches both the children and the teachers to co-create, explore, investigate, and ‘make’ their own learning happen.

Raw Data:

Interview Transcript – Smita Kolhatkar – Barron Park Elementary School – Jan 2016

Physical environment

  • Transformed the classroom into a Makerspace by clearing cabinets, cutting the tables to be closer to the ground
  • Space is open during lunch – a free for all
  • Computers and iPads remain on the tables while the gluing station and making areas along the walls
  • This was the first makerspace (1.5 yrs old) and now being modeled – those who do not have the space have carts with equipment on them
  • Smallest and poorest school in Palo Alto – 30% “Free Reduced”(???) Lunch, 30% EL, a lot of special needs students
  • She worked at the richest school in Palo Alto but they still don’t have a Makerspace because they are still thinking about the furniture
  • The teachers are not aware that they are thinking about or planning to start a Makerspace
  • Teachers are not bought into it yet – they have the money but nothing happens
  • Let the kids do it – figure out what is needed – put it in action – do it
  • 50 kids every day at lunch
  • Lunch 12:25 to 1:00

The learning task

  • She and the classroom teachers talk about what they could do for their classrooms
  • Students document their entire process
  • Lego kits come with curriculum
  • Pictures offer a closure of the project
  • Students prefer the tangible affordances of the robots – having to connect them to a computer in order to program them is an obstacle
  • At the District Level she teaches iPad in the Classroom, how to use Google Docs, Schoology
  • Use the Makerspace as a PD environment
    • Low enrollment this year – teachers have a lot to do – held 1 or 2 only in the past year
  • Our teachers were apprehensive before the space opened up – they did not know what to expect – once it opened and they saw the students’ excitement and learning, they all got on board
  • At Grade level meetings, she pops in and gives them ideas of what they could do with their classroom
  • As students progress through the years, they come in with previous knowledge and are able to dive into making
  • All age groups get together during lunch time – lots of peer-to-peer teaching. They observe each other and learn from each other – even the older kids learn from the younger kids
  • They teach each other and it comes naturally to them
  • Minecraft – one boy simply observes a group of older students working on their project, learns from it, and makes suggestions about what they should do
  • Classroom teachers think that they have to learn it all themselves – the mindset has changed – now they know that the kids know more about the apps and the space than they do so they are willing to relinquish control and let them figure it out
  • Kids usually finish the projects – almost like an unsaid rule – when they get stuck you help them – but naturally invested in finishing the projects
  • There is no “I Can’t”
  • Exposure to all kinds of things is important
  • Everyone talks about letting kids follow their passions but they do not know what passion is!
  • They have an iPad Squad from 5th grade that does updates and maintenance
  • Teach them to document the process by taking photos so that when we have them reflect back they can see how they could do it differently next time
  • Digital etiquette that comes with it is great as well
    • What to do when you search Google and something inappropriate comes up?

Tools

  • They already had a 3D printer, LEGO Mindstorm, computers, and iPads
  • Dash&Dot gave them robots
  • iPad apps are really easy to use and very powerful
  • Parents donated quite a bit of material
  • Schools get a tech budget and spend them how they want – teachers were involved in the decision making process
  • Chromebooks – not as intuitive, apps are still coming out, no camera, cumbersome to carry – “no brainer to choose the iPad over Chromebooks” – not as intrusive in the classroom: no screen standing up on the desk
  • Computers are only used for programming, but programming capabilities are becoming more and more available on the iPads
  • Circuit kits are extremely popular
  • Dissection projects – tear down electronics
  • Laser cutters are too expensive and not very safe – require exhaust and all – ordered a Glow Forge
  • Minecraft – creation mode only
    • Since Microsoft bought it, they are promising lesson plans to be used in the classroom
  • Lego Mindstorms not so good because you need to program on the computer and newer versions come first for the PC – schools have Macs.
  • Make the most out of the resources we have
  • Kids need to be used to different devices – using the computer is good occasionally
  • Apps are easier but are still not quite there for 3D tools
  • Store pictures on Schoology

The students

  • Students are highly engaged – they never want to stop working
  • Kids come in and say “I want to make something today” – they look at the material and start making
  • Gender preferences start appearing in 4th and 5th graders where boys gravitate to Minecraft
  • 1st and 2nd graders – hard but possible to teach coding
  • Girls don’t enjoy sitting in front of the screen
  • All like the robots and the tangibles
  • Coding must be introduced early on for girls – otherwise they lose interest in it later on
  • Two girls started coding club at their middle school
  • Boys use the sewing machines – don’t even have to ask that
  • We love it because we can make anything we want
  • A lot of boys would even come after school to finish projects
  • They have phases – are into one tool at a time

The teacher

  • She was a classroom teacher and then given the task to integrate technology
  • She acts as the technology integrator
  • First EdCampSVMake focused on Making April 30th – Saturday – 9 to 3 at Barron Elementary – Aimed at educators – https://www.eventbrite.com/e/edcampsvmake-tickets-20901981389?aff=es2
  • Might help that she is a woman in reducing gender biases
  • Her blog – http://haystechblog.blogspot.com

Maker Studio Initial Equipment List by Smita Kolhatkar

  • Sewing station
    • Sewing machine
    • Loads of fabric
    • Sewing accessories
      • Thread
      • Bobbins
      • Needles
      • Buttons
      • Sewing pins
    • Yarn
    • Looms
  • Filming Station
    • Stands for Stop Motion
    • Props for Stop Motion from Plan Toys
  • Gluing Station
    • Glue
    • Glue sticks
    • Glue guns
    • Masking tape
    • Regular tape
  • Robots Station
    • Bee-bots
  • Supplies
    • Markers
    • Crayons
    • Color pencils
    • Pens
    • Scissors
  • General Materials for building
    • Corks
    • Popsicle sticks
    • CDs
    • Straws
    • Wood scraps
    • Filters
    • Empty cartons
    • Cereal boxes
    • Pegs
    • Pipe cleaners
    • Bottle caps
    • Lots of empty boxes
    • Stuffing
  • Keva Planks
  • Stuffed toys
  • Circuitry
    • Battery packs
    • LEDs
    • Wires
    • Battery cells
    • Snap Circuits
    • Dough for squishy circuits
    • Makey Makey
    • Building kits
  • Movable whiteboards
  • iPad mini
  • MacBook Air
  • Makerbot Replicator 2
  • Makerbot Digitizer
  • LEGO NXT class kit
  • LEGO Storystarter class kit
  • Arduino kits
  • Soldering Kits
  • Laptops
  • Make Wonder Dash and Dot robots
  • Furniture
    • Tables (Low)
    • Tables (High)
    • Chairs
    • Cupboard
    • Built in counter space
    • Wall shelving

References:

Ackermann, E. (2001). Piaget’s constructivism, Papert’s constructionism: What’s the difference? Future of Learning Group Publication, 5(3), 438.

Blikstein, P., & Worsley, M. (2014?). Children are not hackers.

Freire, P. (2000). Pedagogy of the oppressed. Bloomsbury Publishing.

Beyond Bits and Atoms – Dream Toy Assignment 1

The assignment is to interview a child and build a toy with educational purposes. I worked with Aaron Broder on this one.

Dream Toy Assignment
Aaron Broder & Lucas Longo – 2016

Interview report

José is a 14-year-old boy who is interested in architecture and in making a new house for his turtle. When we arrived at the lab to interview him, he had been there already for an hour. He has been coming for the past four Tuesdays for unguided play. In front of him was a sheet of paper with a small square labeled 30cm and 50cm on each side. The carefully handwritten title read “Foldable Turtle House”.

Our initial interview plan included talking about what he did after school for fun, whether he had any hobbies, and what games he played on his computer or mobile phone. We would then probe him about the tools and kits he had already played with in the lab and possible projects he might be interested in doing. His drawing, though, took over the conversation.

His 5-year-old turtle’s current house is old and he wants to build a new one. The problem is that the turtle is in Brazil, and therefore he must somehow be able to take this house on the plane. He seemed to be stuck in his effort to figure out how he could possibly build a foldable structure. We suggested that it did not necessarily have to be foldable, but could be disassembled, making it much easier to build.

To move things along, we started talking about what features he wanted in this house. He wanted a simple box that would contain the turtle at night. He thought it would be fun to have a ramp she could go up, onto a platform where she could see beyond the walls of the box. What else did his turtle need in this house beyond the structure? “Her food and water bowls are there and I put some soil on the bottom of the house because it’s softer for her to sleep on,” he said.

He got excited about our suggestion of putting a force sensor under the bowls to detect when they were getting empty (the simple threshold idea is sketched below). “And then we make an app that alerts me when that happens!” The problem was that this only happened two or three times a week. We talked about a door that would open when the turtle wanted to get into the house – which, he said, would also never happen. We were having a hard time trying to figure out how to transform this simple house into something that would provide him with a learning experience.
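
As an aside, the threshold logic behind that sensor idea is simple enough to sketch. The snippet below is only an illustration of the ‘empty bowl’ check we discussed, written in Python, with a hypothetical read_force_grams() function standing in for a real force-sensor driver; it is not something we actually built.

    # Illustrative sketch of the "empty bowl" alert idea (not a real implementation).
    # read_force_grams() is a hypothetical stand-in for a force-sensor driver;
    # here it returns a simulated value so the script can run on its own.

    import random
    import time

    EMPTY_THRESHOLD_GRAMS = 20  # assumed weight below which a bowl counts as nearly empty

    def read_force_grams():
        """Hypothetical sensor read; replace with the real driver call."""
        return random.uniform(0, 200)  # simulated reading, in grams

    def check_bowl():
        weight = read_force_grams()
        if weight < EMPTY_THRESHOLD_GRAMS:
            print("Alert: the bowl is nearly empty, time to refill!")
        else:
            print(f"Bowl OK ({weight:.0f} g)")

    for _ in range(5):    # poll a few times for demonstration
        check_bowl()
        time.sleep(1)     # in practice, checking a few times a day would be enough

Running a check like this on a small board under the bowls and pushing a notification to a phone is essentially all the ‘app that alerts me’ would need.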

We explained to him that the goal of this project was to create a ‘dream toy’ with a learning goal. He replied that he wanted to learn how to make a toy, rather than simply receive the toy itself. He has played with Lego, remote control cars and planes, Minecraft, and other toys during different phases, but nowadays it is mostly his computer and smartphone that supply him with entertainment. He then mentioned that he and his father had built the turtle’s current house. At this point we realized that this would be the most relevant project for him – to show his father how much he learned and made during his visit to the US, along with giving his beloved turtle a new house, of course.

Ideation and initial prototype

José would be glad if we built this house for him but would love it if he could learn how to do it himself. We immediately thought of teaching him how to do it, yet what would then be the deliverable for this assignment? If the course were “Curriculum Construction”, our product would be the course. We had to come up with a tangible object. Our initial brainstorm came up with a kit that he could assemble on his own. At his age, though, that would be too easy, with little, if any, learning involved. We wanted to stimulate his creativity with scaffolding, not with directions on how to build this house.

How about giving him modules that fit into each other allowing him to build any kind of structure he wants? Like Legos? Ok… next idea…

We then realized that instead of giving him Legos, we could give him a tool with which he could build his own Lego pieces. Instead of giving him the block, give him the mold that makes the bricks. This way he could size and shape the house to his liking, yet be scaffolded in the trickiest part of building – the joints and connectors. We therefore created a template that provides the shapes of these joints and connectors, with which José could trace the contours onto cardboard and cut out the pieces.

We named it the Template Realization Tinkering Lab (TRTL) Construction Kit. It is basically a piece of wood with cutouts that allow the basic shapes of the connectors to be traced and then cut out. TRTL therefore allows for infinite variations and true exploration of building, scaffolded by the template. It allowed us to shift from building a toy for José to providing a tool that will enable him to learn about size, scale, and structure while activating his own creativity and exploratory nature as a maker.

A future development of TRTL will include the GoGo Board as part of the template to further expand the creation possibilities of the tool. This would require a handbook or instruction manual on how to set up and use the GoGo Board, with examples of applications for the different sensors and of how to write the code to make it all work.

In general, we learned that creating a toy is not as challenging as creating a learning experience. Toys should be ‘fun’ for the sake of ‘fun’. A learning experience not only can be fun but must have learning objectives that take into consideration the developmental stage of the learner, their interests, motivations, previous knowledge, content relevance, cultural context, socio-economic status, and several other factors that will determine its effectiveness. In this sense, we believe that TRTL may suit several contexts due to its simplicity and malleability, insofar as it is a template that allows for open-ended creations. We now need to test the tool and observe whether it actually helps in the learning process of building structures.

Documentation of the prototyping process

Figure 1: Starting to sketch the construction kit

Figure 2: Playing with a destructible turtle house

Figure 3: Talking sensors with Engin

Figure 4: Moving from a blueprint to a template kit

Figure 5: Planning the joints and labels for the template kit

Figure 6: Starting to draw the prototype in CorelDRAW

Figure 7: A cardboard prototype

Figure 8: Iterating on the instructional diagrams for the next prototype

Beyond Bits and Atoms – Week 5 – Reading Notes

DiSessa, A. A. (2001). Changing minds: Computers, learning, and literacy. MIT Press.

  • Computer Literacy
    • Common term attributed to basic skills, such as being able to turn a computer on, insert a CD, or use the mouse
  • Material Intelligence – Literacy
    • Intelligence achieved cooperatively with external materials
  • Infrastructural Knowledge
    • Content that is widely adopted and used as basis for new content (e.g. Calculus)
  • Evolution of material intelligence
    • Galileo’s theorems took pages for him to describe
      • “Theorem 5 – If two particles are moved at a uniform rate, but with unequal speeds, through unequal distances, then the ratio of the time intervals occupied will be the product of the ratio of the distances by the inverse ratio of the speeds.”
      • t1/t2 = (d1/d2) × (r2/r1), where t = time, d = distance, r = rate – see the restatement after these notes
      • No Algebra at the time
        • Only in the 20th century did algebra become widely adopted
      • Lack of mathematical notation – or material intelligence
  • What are the possible future literacies?
  • Romance-novels being read in subways – a social niche – factors influencing its adoption
    • Require being able to read
    • Most readers are women
    • Romantic love as an accepted genre
    • No sanctions against it (e.g. Playboy magazine)
    • Price of printing and revenue share with authors
    • Printing press
    • Uncrowded trains
  • Definitions
    • “A literacy is the convergence of a large number of genres and social niches on a common, underlying representational form.” (DiSessa, 2001, p.24)
    • “Genre is to social niche as species is to ecological niche.” (DiSessa, 2001, p.24)
  • Perspectives on social niches
    • Values, interests, motivations
    • Skills and capabilities
    • Materials
    • Community and communal practices
    • Economics
    • History
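
As an aside on the Galileo example above, here is a restatement of Theorem 5 in modern notation (my own rendering from the definition t = d/r, not DiSessa's wording):

    % Galileo's Theorem 5 in modern algebraic notation (my own restatement).
    % Uniform motion means time = distance / rate, so t_1 = d_1/r_1 and t_2 = d_2/r_2.
    \[
      \frac{t_1}{t_2}
        = \frac{d_1/r_1}{d_2/r_2}
        = \frac{d_1}{d_2} \cdot \frac{r_2}{r_1}
    \]
    % One line of symbols versus a page of Galileo's prose -- the kind of
    % leverage DiSessa attributes to new representational infrastructure.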

Wilensky, U. (2010). Restructurations: Reformulating Knowledge Disciplines through New Representational Forms. Learning Sciences, Computer Science and Complex Systems, Northwestern University

  • Structuration and restructuration of a discipline
    • From Roman to Hindu-Arabic numerals
    • “the encoding of the knowledge in a domain as a function of the representational infrastructure used to express the knowledge.” (Wilensky, 2010, p.2)
  • Core properties of structurations
    • Power properties – must do what was done before but better
    • Cognitive properties – must be easier to learn
    • Affective properties – memes – ideas that spread in an evolutionary manner through society, social niche, or culture
    • Diversity properties – must attend to all ‘intelligences’ and people’s styles
  • Circle can be described in several ways
    • All points are at the same distance from a point called the center (Euclid)
    • The formula to plot a circle is x^2 + y^2 = K (Descartes)
    • Logo turtle – if constant linear and angular speed is maintained, a circle is drawn (see the turtle sketch after these notes)
    • Logo turtles – place many of them at a central point and have them all go straight for the same distance: their endpoints form a circle
  • Agent-based modeling
    • Contrasts with equational modeling, where one observes a phenomenon and tries to create an equation that fits the observed data
    • Agents have individual procedures which affect the larger population
      • Lynx-Hare Example
      • The Tick Model – Newtonian Physics and Beyond
      • GasLab – Statistical Mechanics and beyond
      • MaterialSim – Materials and Beyond
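
The Logo turtle description of a circle mentioned above is easy to try out. Below is a minimal sketch using Python's standard turtle module (my own illustration, not from the paper), assuming only that a constant forward step and a constant turn are repeated:

    # Minimal sketch of the 'turtle' definition of a circle:
    # constant linear speed (forward 2) + constant angular speed (right 1 degree).
    import turtle

    t = turtle.Turtle()
    for _ in range(360):   # 360 one-degree turns bring the turtle back to its start
        t.forward(2)       # constant step size
        t.right(1)         # constant turn
    turtle.done()          # keep the window open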

Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and Technology, 8(1), 3-19.

  • Agent-based models
    • Traffic-jam example on agent-based modeling
      • Cars are moving forward all the time but the traffic jam itself flows backwards (see the sketch after these notes)
    • Waves
      • Particles themselves move perpendicular to the direction of travel of the wave
  • Levels
    • NOT hierarchical levels
    • Individual level has its own behaviors and properties
      • Interactions of many of these individuals create emergent properties – a new level
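
To make the traffic-jam example above concrete, here is a tiny agent-based sketch in Python (my own toy model, not the NetLogo model discussed by Wilensky & Resnick). Each car follows only a local rule – move forward if there is room, otherwise stop – yet the printout shows a cluster of stopped cars (the jam) drifting backwards while every individual car only ever moves forward:

    import random

    ROAD = 60                                        # circular road with 60 cells
    cars = sorted(random.sample(range(ROAD), 20))    # positions of 20 cars

    for step in range(30):
        new_positions = []
        for i, pos in enumerate(cars):
            ahead = cars[(i + 1) % len(cars)]        # the next car around the ring
            gap = (ahead - pos) % ROAD
            speed = 1 if gap > 1 else 0              # local rule: stop if blocked
            new_positions.append((pos + speed) % ROAD)
        cars = sorted(new_positions)
        row = ['.'] * ROAD
        for p in cars:
            row[p] = 'C'
        print(''.join(row))                          # watch the jam move backwards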

Curriculum Construction – Week 5 – Curriculum Rationale Assignment

Prompt

Directions for Curriculum Rationale

Each group will submit a curriculum rationale that includes the following components:

  • Information about the Site

Give a brief description of the site for which you are constructing your curriculum.  What do you know about the context that will influence what you produce? What do you know (if anything) about the person/people who will be implementing your curriculum?

  • Ideology/Theory

What ideologies or curriculum theories undergird your curriculum?  How do these ideologies/theories influence the design of the curriculum? Be sure to cite particular theorists as appropriate.

  • The Learners

Who are they?  What do you know about them?  What assumptions are you making about how they learn and what is important for them to know?

  • Overall Rationale

Identify the overall why, what, and how of your curriculum and explain why you made these choices.  This section may include a rough outline of topics to be covered and possible scope of the unit. It should be clear how the rationale fits the setting and is appropriate for the learners that you’ve described.

Please bring hard or electronic copies of your rationale to class next week for peer review.  The instructors will also give you feedback about the rationale, so please send us an electronic copy as well.


 

Response

Curriculum Rationale
Celine Zhang, Lisa Jiang, Lucas Longo, Mohamad Haj Hasan

Information about the Site

The site we are working with is the Operations, Information & Technology (OIT) Department at the Stanford Graduate School of Business (GSB). The site is looking to add some online elements to its Base course in Data and Decisions, an introductory course in probability, statistics, and regression. The course is a mixture of theory, concepts, and practical application in a business context, and it has been taught in the same way for the past 15-20 years: a completely lecture-based format of mostly theory with limited hands-on application during class time. The culmination of the class is a practical project intended to model a real-world application of the concepts learned in class, in which students have the opportunity to work with real clients who have real data and decision needs.

The site would like to design and put all the theoretical and conceptual components of the course online, and utilize class time for more engaging practical applications, clarification of the online content and general discussion. The theoretical part of the course is almost perfectly suited for online consumption for the following reasons:

  1. Students with different backgrounds in the subject can learn at their own pace, repeating concepts and formulas as many times as they want.
  2. The content to be put online is very much “passive” in the sense that little is lost from a one-sided online lecture.

The site would also like to have some adaptive assessment solutions online that would act both as a feedback mechanism for students and as an observational tool for teachers into how students are learning and progressing in the class.

The person leading this initiative is Allison O’Hair. Allison has experience in designing and implementing online content for MIT Sloan and is very knowledgeable about the medium of online education. It is interesting to note that although the OIT Department is leading this initiative, the course is technically under the Economics Department at the GSB.

Ideology/Theory

Our curriculum would be undergirded primarily by Dewey’s concept of progressivism, in that we hope to design learning experiences that engage students’ innate desire to learn. According to Dewey, the educator’s role is to set up the right conditions for transfer, rather than to teach lessons in isolation. The sign of a mature learner is then someone capable of both identifying and solving their own problems. This focus on “transfer” to enable students to apply their learnings beyond the classroom would definitely be a key learning goal in our curriculum, with ample class time devoted to discussing practical applications of concepts and a real-world project for assessment.

In addition, Dewey placed great emphasis on the interaction between internal and objective conditions for curriculum design. He contended that curriculum construction was always contextual, and that “the trouble with traditional education was not that it emphasized the external conditions that enter into the control of the experiences but that it paid so little attention to the internal factors which also decide what kind of experience is had”. Personalizing the learning experiences for students based on their “internal” conditions, such as their backgrounds, prior knowledge, social-emotional skills and so forth, would be key to effective learning. Ideally, these learning experiences should help prepare students for later experiences and drive continued learning. Selecting and creating learning experiences – be it direct instruction, class discussions or assessments – that are tailored to students’ backgrounds will very much be a core consideration in our curriculum design. Our ultimate goal would be to equip our students with both the desire and skill to continue enhancing their understandings of probability, statistics and regression application in a business context.

In a similar vein to Dewey’s progressivism, Bruner’s emphasis on structure, transfer, and students’ readiness for learning would also underpin our curriculum design. We agree with Bruner that the ultimate goal of education is to help students “learn how to learn” and to facilitate transfer. The emphasis on structure, then, is critical because a deep understanding of structure is also a deep understanding of how things are related, which in turn permits transfer. The implication for us would be to delineate the key concepts to be covered in the course and to sequence them in such a way that “earlier learning renders later learning easier … by providing a general picture in terms of which the relations between things encountered earlier and later are made as clear as possible”. In terms of readiness to learn, we are cognizant that the learners in our course may come equipped with different levels of understanding and aptitude, and it is our hope to create a curriculum that provides satisfying learning experiences for students regardless of their existing preparedness for the course. We want our curriculum to result in a class that stimulates our students’ desire to learn.

The Learners

The learners are first-year MBA students at the GSB (MBA1s). The Base level of Data and Decisions targets students with little or no background in the subject; however, the class may include some students with a decent background who have chosen to take the Base level instead of the Intermediate or Advanced levels. The site is targeting a launch date of Winter 2017 to pilot the course, and it would be presented as an opt-in option for eligible Base students.

The learners bring a wide range of background knowledge and experience. While most learners were exposed to some of the content in high school or college, many of them may not have been exposed to the concepts at work or applied them there. We also assume that the learners have varying computer skills. This is important because the course uses some Excel and quite a bit of R, a programming language and software environment for statistical computing and graphics. R is especially important for the final project, where learners work with a real company and real data to help the company answer critical business questions. In addition, the learners are in the process of getting a broader business degree, and we assume they are more interested in the business applications of the content than in the details of the formulas and their derivation. For example, it may be more beneficial to know the idea behind variance, how it is calculated in Excel or R, and what it is used for, as opposed to knowing the exact formula.
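
As a small illustration of the variance point above (our own example, not course material), the “idea plus the tool call” is often all a student needs in practice; in Python this is a one-liner, and the equivalents in Excel (VAR.S) or R (var) look much the same:

    # Hypothetical monthly returns for a small portfolio.
    import statistics

    returns = [0.02, -0.01, 0.03, 0.00, 0.05]
    print(statistics.mean(returns))      # average return
    print(statistics.variance(returns))  # sample variance: how spread out (risky) the returns are
    print(statistics.stdev(returns))     # standard deviation, in the same units as the data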

It is fair to assume that the MBA1s are also busy with many social and academic events, which means that their attention and dedication to the course will be spread thin. MBA students, in particular, care deeply about ‘authentic’ learning experiences and will only devote their time and energy to topics that they perceive as having direct connections to their professional pursuits. We also assume that all MBA students have the appropriate technology affordances for online learning – high-quality internet access and up-to-date computers to stream videos and run statistical programs.

Overall Rationale

The Data and Decisions course was originally designed as a support course, or prerequisite, for other courses such as Finance and Accounting. It was intended to provide a basic overview of how to use data to extract information that supports decision-making. Since then, specific topics have been added to or removed from the curriculum to focus more on data analysis than on the calculation of probabilities and statistical procedures. Students will not only learn methods of using data but, more importantly, should be able to build models and critique them. The hope is that students will become intelligent consumers of data who can look at it and interpret it.

This shift towards decision-making based on data analysis is now central to the current redesign of the curriculum. The goal is to shift from the ‘teaching of formulas’ to problem sets, discussions, and the application of core concepts. Given this shift in focus, we believe that the teaching of formulas and procedures tends to be more linear and repetitive, and is thus a great candidate for online content rather than a use of valuable classroom time.

Online content also accommodates each student’s prior knowledge and pace. Problem sets can be personalized for each student’s level of understanding, thus ensuring everyone’s preparedness for the course’s learning progression. Discussion forums and peer-review mechanisms can also provide different learning opportunities for those who have different learning styles and prefer more collaboration or explanations presented in different ways. Online work can also serve as a great formative assessment, helping teachers identify common misconceptions and course-correct. The implementation of these features and the choice of technological platform remain undecided.
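
Since no platform has been chosen yet, the following is purely a hypothetical sketch (the function name and thresholds are our own) of what personalizing a problem set could mean in its simplest form: pick the next question's difficulty from the student's recent answers.

    # Hypothetical difficulty selector for a personalized problem set.
    def next_difficulty(recent_results, current=2, levels=(1, 2, 3)):
        """recent_results: booleans for the student's last few answers."""
        if not recent_results:
            return current                       # no data yet: stay put
        accuracy = sum(recent_results) / len(recent_results)
        if accuracy > 0.8 and current < max(levels):
            return current + 1                   # comfortable: step up
        if accuracy < 0.5 and current > min(levels):
            return current - 1                   # struggling: step down
        return current                           # otherwise keep the same level

    print(next_difficulty([True, True, True], current=2))    # -> 3
    print(next_difficulty([False, True, False], current=2))  # -> 1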

The original course content sequence is as follows:

  1. The first area, probability, provides a foundation for modeling uncertainties, such as the uncertainties faced by financial investors or insurers. We will study the mechanics of probability (manipulating some probabilities to get others) and the use of probability to make judgments about uncertain events.
  2. The second area, statistics, provides techniques for interpreting data, such as the data a marketing department might have on consumer purchases. Statistical methods permit managers to use small amounts of information (such as the number of people switching from Verizon to AT&T in an iPhone test marketing program) to answer larger questions (what would AT&T’s new market share be if the iPhone is launched nationally?)
  3. The third area, regression analysis, is the set of techniques that allow companies to build statistical models of different facets of their businesses. Examples include predicting which movies a customer may like based on her past movie ratings (e.g. Netflix), predicting the sales price of a house (e.g. Zillow), or predicting the sales response to a new ad (e.g. Google).
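
For a sense of what the regression techniques in item 3 boil down to, here is a minimal, made-up illustration (hypothetical numbers, not the course's R materials) of fitting a line to past data and using it to predict a new case:

    # Fit a least-squares line: house size (m^2) vs. sale price ($ thousands).
    import numpy as np

    size = np.array([50, 70, 90, 110, 130])       # hypothetical training data
    price = np.array([150, 200, 240, 290, 330])

    slope, intercept = np.polyfit(size, price, deg=1)
    predicted = slope * 100 + intercept           # predict the price of a 100 m^2 house
    print(round(float(predicted), 1))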

Original course grading

  • Class Participation Evaluation 10%
  • Mid-term Exam 20%
  • Homeworks 15%
  • Regression Project 20%
  • Final Exam 35%

The proposed course content sequence attempts to flip the sequence so that students have an end goal in mind and learn on a ‘need-to-know’ basis.

  1. Final project – Phase 1
    1. Show previous final projects as examples
    2. Explain what quality work looks like
    3. Show final project grading rubric
    4. Select a real company to obtain data from
  2. Regression analysis – Phase 1
    1. What is it
    2. Examples of how to use it
    3. Underlying concepts  
      1. Regression
      2. Statistics
      3. Probability
  3. Final project – Phase 2
    1. Data manipulation and clean up
    2. Desired data representations or key performance indicators
  4. Regression analysis – Phase 2
    1. How to do it with your own data
    2. Underlying concepts  
      1. Regression
      2. Statistics
      3. Probability
  5. Final project – Phase 3
    1. Data analysis
    2. Present project and results
    3. Peer-review sessions
  6. Conclusion
    1. Cases and further discussions
    2. Feedback from professor and company

Curriculum Construction – Week 5 – Reading Notes

McTighe, J., & Ferrara, S. (1998). Assessing Learning in the Classroom. Student Assessment Series. NEA Professional Library.

  • Assess teaching and learning, not the student and grades
    • “The primary purpose of classroom assessment is to inform teaching and improve learning, not to sort and select students or to justify a grade.” (McTighe & Ferrara, 1998, p.1)
  • Latin roots
    • “the term assessment is derived from the Latin root assidere meaning ‘to sit beside.’” (McTighe & Ferrara, 1998, p.2)
    • “Assidere suggests that, in addition to tests and projects, classroom assessments include informal methods of ‘sitting beside,’ observing, and conversing with students as a means of understanding and describing what they know and can do.” (McTighe & Ferrara, 1998, p.2)
  • Types of assessment
    • Tests
      • Rigid format: time limits, paper and pencil, silent
      • Limited set of responses: limited access to source material
    • Evaluation
      • Make judgements regarding quality, value, or worth
      • Pre-set criteria
    • Summative assessment
      • culminating assessment that provides a summary report
    • Formative assessment
      • Ongoing diagnostic
      • Helps teachers adjust instruction
      • Improve student performance
      • Determine previous knowledge
      • Determine ongoing understandings and misconceptions
  • Large scale assessment
    • Usually standardized tests
      • High-stakes
    • Educational accountability
    • Norm referenced
      • Easier interpretation
      • Comparison with others
      • Averages to determine your position
    • Criterion referenced
      • Compared to pre-established standards
  • Classroom assessments
    • Diagnose student
    • Inform parents
    • Improve practice
  • Effective Classroom Assessment
    • Inform teaching and improve learning
      • Performance-based assessments
        • Focus instruction and evaluation
        • Students understand criteria for quality
        • Students get feedback and revise their work
        • Peer- and self-evaluation
    • Multiple sources of information
      • Single test is like a single photograph
      • Frequent sampling
      • Use array of methods
        • Create a photo album instead of a single photo at the end
          • Different times
          • Different lenses
          • Different compositions
    • Valid, reliable, and fair measurements
      • Validity: How well it measures what it is intended to measure
      • Reliability: If repeated, would you get the same results?
      • Fairness: give students equal chances to show what they know and can do without biases or preconceptions
    • Ongoing
  • Content Standards
    • Declarative knowledge
      • what do students understand (facts, concepts, principles, generalizations)
    • Procedural knowledge
      • what do we want students to be able to do (skills, processes, strategies)
    • Attitudes, values, or habits of mind
      • how we would like students to be disposed to act (appreciate the arts, treat people with respect, avoid impulse behavior)
  • Purpose & Audience
    • Why are we assessing?
    • How will the assessment results be used?
    • Who are the results intended for?


  • Assessment Approaches and Methods
    • Approach – what do you want students to do?
      • Select a response
      • Construct a response
      • Create a product
      • Provide an observable performance
      • Describe their thinking/learning process
    • Selected-Response Format
      • Positive
        • Wide range of knowledge can be ‘tested’
        • Easy to implement
        • Easy to evaluate and compare
        • Fast
      • Negative
        • Assess knowledge and skills in isolation and out of context
        • Not able to assess critical thinking, creativity, oral communication, and social skills
        • Real-world does not have single correct answers
        • Focuses students on acquisition of facts rather than understanding and thoughtful application of knowledge
    • Constructed-Response Format
      • Brief Constructed Response
        • Short written answers
        • Visual representations
        • Positive
          • Students have a better opportunity to show what they know
          • Easier to construct and evaluate than other constructed responses
        • Negative
          • Does not assess attitudes, values, or habits of mind
          • Require judgement-based evaluation – low reliability and fairness
      • Performance-Based Assessment
        • Requires students to apply knowledge and skills rather than recalling and recognizing
        • Associated terminology:
          • Authentic assessment
          • Rubrics
          • Anchors
          • Standards
            • Content standards – what students should know
            • Performance standards – how well students should perform
            • Opportunity-to-learn standards – is the context right
        • Positive
          • Content-specific knowledge
          • Integration of knowledge across subject-areas
          • Life-long learning competencies
        • Negative
          • Do not yield a single correct answer or solution – allows for wide range of responses (also positive)
        • Types
          • Product
            • “Authentic” since it resembles work done outside of school
            • Portfolio to document, express individuality, reflect, observe progress, peer- and self-evaluation
            • Criteria must be identified and communicated with students
          • Performance
            • Can observe directly application of knowledge
            • Students are more motivated and put greater effort when presenting to ‘real’ audiences
            • Time- and labor-intensive
          • Process-focused assessment
            • Information on learning strategies and thinking processes
            • Gain insights into the underlying cognitive processes
            • Examples
              • “How are these two things alike and different?”
              • “Think out loud”
            • Continuous and formative


  • Evaluation Methods and Roles
    • Scoring Rubric (Rubrica – red earth used to mark something of significance)
      • Evaluative criteria
      • Fixed scales
      • Description of how to discriminate levels of understanding, quality, or proficiency
      • Holistic Rubric
        • Overall impression of quality and levels of performance
        • Used for summative purposes
      • Analytic Rubric
        • Level of performance along two or more separate traits
        • Used in day-to-day evaluations in classroom
      • Generic Rubric
        • General criteria for evaluating student’s performance
        • Applied to a variety of disciplines
      • Task-specific Rubric
        • Designed to be used in a specific assessment task
    • Anchors
      • Examples that accompany a scoring rubric
    • Rating scales
      • Bipolar rating scales – bad & good, relevant & irrelevant
    • Checklists
      • Good to ensure no element is forgotten or left unattended
    • Written and oral comments
      • Best level of feedback – communicates directly with student
      • Must not be only negative feedback
  • Communication and Feedback Methods
    • How to communicate results?
    • Numerical scores & Letter grades
      • Widely used but not descriptive
    • Developmental and Proficiency Scales
      • Contain description of quality and performance


    • Checklists
      • Careful with poorly defined categories like creativity – open to interpretations
    • Written comments, narrative reports, verbal reports, and conferences
      • Communicate directly with each student
      • Time-consuming
  • Assessment not only measures outcomes but also invokes the values, the how, and the what of learning.
  • Great glossary at the end of this paper.

Coffey, J. (2003). Involving Students in Assessment. In J. Atkin & J. Coffey (Eds.) Everyday Assessment in the Science Classroom. Arlington, VA: National Science Teachers Association. pp. 75-87.

  • Assessment is an opportunity for learning
    • “Whether it comes after teaching, while teaching, or by teaching, we often think of assessment as something done to students, not with them.” (Coffey, 2003, p.76)
  • Teachers
    • check assignments and interpret student responses
    • listen closely to students’ questions so that they can gain insight into their students’ understandings
    • seek to make explicit the assessment criteria so that all students know how they will be evaluated
    • try to use what they learn through assessment to inform teaching, plan future learning activities, and provide relevant feedback
    • constantly gauge trends in class engagement, interests, and understanding
    • strive to fairly assign grades that accurately reflect what a student knows and is able to do.
  • Everyday Assessment
    • “Everyday assessment is a dynamic classroom activity that includes the ongoing interactions among teachers and students as well as more scheduled events, such as weekly quizzes and unit tests.” (Coffey, 2003, p.76)
    • “One of the many purposes of everyday assessment is to facilitate student learning, not just measure what students have learned.” (Coffey, 2003, p.77)
  • Key Features of Assessment
    • explicating clear criteria (Butler and Neuman 1995)
    • improving regular questioning (Fairbrother, Dilln, & Gill 1995)
    • providing quality feedback (Kluger and DeNisi 1996; Bangert-Drowns et al. 1991)
    • encouraging student self-assessment (Sadler 1989; Wolf et al. 1991)
  • Responsibility for own learning
    • “When students play a key role in the assessment process they acquire the tools they need to take responsibility for their own learning.” (Coffey, 2003, p.77)
  • Low performing benefited the most
    • “Lower-performing students … showed the greatest improvement in performance when compared to the control class.” (Coffey, 2003, p.77)
  • Learning From Connections
    • “Through the students’ explicit participation in all aspects of assessment activity, they arrived at shared meaning of quality work. Teachers and students used assessment to construct the bigger picture of an area of study, concept, or subject matter area. Student participation in assessment also enabled students to take greater responsibility and direction for their own learning.” (Coffey, 2003, p.78)
  • Shared Meanings of Quality Work
    • Activities
      • students generating their own evaluation sheets
      • conversations in which students and teachers shared ideas about what constituted a salient scientific response, or a good presentation, lab investigation, or project
      • discussion of an actual piece of student work
      • students’ reflections on their own work or a community exemplar
      • students’ decision making as they completed a project
  • Assessment as a Means to Connect to a Bigger Picture
    • “Teachers and students leveraged test review as an opportunity to return to the bigger picture of what they had been studying. The class talked about what was going to be covered on the test or quiz so that all students knew what to expect.” (Coffey, 2003, p.84)
  • Assessment as a Vehicle to facilitate Lifelong Learning
    • “The test process also encompassed graded responses after the test, and students would often do test corrections after going over the test. On occasion students would write test questions and grade their own work.” (Coffey, 2003, p.84)
  • Creating Meaningful Opportunities for Assessment
    • Time
    • Use of Traditional Assessment
    • Public Displays of Work
    • Reflection
    • Revision
    • Goal Setting
  • Results
    • “Despite initial resistance, as students learned assessment-related skills, demarcations between roles and responsibilities with respect to assessment blurred. They learned to take on responsibilities and many even appropriated ongoing assessment into their regular habits and repertoires.” (Coffey, 2003, p.86)

Treagust, D., Jacobowitz, R., Gallagher, J., & Parker, J. (2003, March). Embed Assessment in Your Teaching. Science Scope, pp. 36-39.

  • Effective strategies for implementing embedded assessment
    • Use pretests
      • identify students’ personal conceptions
      • misconceptions
      • problems in understanding the topic
    • Ask questions to elicit students’ ideas and reasoning
      • “Acknowledge each student’s answers by recording them on the board or by asking other students to comment on their answers.” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 37)
    • Conduct experiments and activities
      • challenge their own ideas
      • write down their findings
      • share with their peers.
    • Use individual writing tasks
      • capture students’ understanding
      • teacher can assess their progress
    • Use group writing tasks
      • students work together to illustrate each other’s respective understanding
    • Have students draw diagrams or create models
  • Results
    • “25 percent of students in the class taught by one of the authors were rated “Proficient” on the MEAP Science Test compared to 8 percent of other eighth grade classes in the school” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 39)
    • “Moreover, students become more engaged in learning when their teacher gives attention to students’ ideas and learning, and adjusts teaching to nurture their development.” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 39)

Echevarria, J., Vogt, M., & Short, D. (2004). Making Content Comprehensible for English Learners: The SIOP Model (2nd edition). Boston: Allyn & Bacon. pp. 21-33.

  • Sheltered Instruction Observation Protocol (SIOP)
    • Content Objectives
    • Language Objectives
    • Content Concepts
    • Supplementary Materials
    • Adaptation of Content
    • Meaningful Activities

Teacher PD – Week 5 – Class Notes

PD Design of DVC (Dialogic Video Cycle)

  • 2 groups – intervention group & control group
  • Cycle of 3 events and then repeated
  • Did not see so much about the “Prediction”

Video Clubs (Van Es)

PD

  • noticing student thinking
  • confusion evidence to evidence it and engage in the videos
  • discourses of noticing – noticing framework
  • 1 year – 7 interventions – self selected
  • focused on student thinking rather than content
  • looked at comments while watching the videos
  • self-reports focused on how much they elicit student thinking
  • more developmental vs normative video use
  • looked at peer videos

Research Methods

  • research question was very open
  • how video clubs would influence:
    • teacher’s thinking about student’s learning
    • looking at teacher as learner
  • video taped PD itself – transcripts
    • fine-grained analysis
      • actor – object of focus
      • topic – what they talked about
      • stance – what type of discourse they engaged in
      • video based or non video based evidence
    • Results
      • teachers’ views shifted
        • actor – more about the student than the teacher
        • topic – from classroom management to mathematical thinking
        • stance – from evaluation to evidence based discussion
      • classroom instruction was also videotaped
        • coded for student and whole group discussions
        • changes in instruction
          • made space for student thinking
          • publicly demonstrated student learning
          • probed students for more evidence
          • learning about teaching
      • teacher exit interviews
        • changes in thinking and practice
          • look at student thinking
          • attending to student thinking
          • own school curriculum

Reading for Week 6 – Roth & NARST